Immersive Entertainment for Hospitalized Kids
  • Client: AT&T Foundry Palo Alto
  • When: 2017
  • Team: Product Manager, Virtual Reality Development Engineering team (Quantum Interface), and Non-profit with connections to musician and hospital (Melodic Caring Project)
  • My Role: Product Manager
Overview

As an Innovation Lead at the AT&T Foundry, I work on applying emerging technologies to make the world a better place. In 2017, I led a technology-for-good project to create immersive entertainment experiences for children in hospitals as a form of distraction therapy, in collaboration with Quantum Interface (a technology startup) and the Melodic Caring Project (a nonprofit).

AT&T is transforming into both an Entertainment and Telecommunications company. Projects such as this highlight a strong 5G use case, where high throughput and bandwidth are needed to support a quality experience for users. By applying emerging technologies to real-life problems, we are able to solve business needs, uncover new markets, and help change lives.

Immersive Entertainment for Hospitalized Kids: Amos Lee Concert in Virtual Reality on Google Daydream VR Goggles UX Mockup

Immersive Entertainment for Hospitalized Kids Poster

Background

In 2016, I initiated and spearheaded our exploration of the Augmented and Virtual Reality market to propose strategic business plays for AT&T and Ericsson. As I engaged with our ecosystem, I discovered two amazing companies, each experts in their own field — one a technology company specializing in virtual reality and custom interfaces (Quantum Interface) and the other a nonprofit that streams concerts to children in hospitals (Melodic Caring Project).

It occurred to me that if I connected Quantum Interface’s next-generation interactive interface for 3D experiences with a 360 virtual reality video of Melodic Caring Project’s live-streamed, personalized concerts for hospitalized kids, we could create something truly memorable. So, we did just that.

Live Streaming

After introducing everyone, the project took off immediately! Quantum Interface brought in SubVRsive to make the Virtual Reality capture a reality, and Melodic Caring Project brought in singer-songwriter Amos Lee and team as the content creators at the core of this experience. Together, the team used the magic of 360 live streaming to transport hospitalized children to an Amos Lee concert experience that they could share with their loved ones as they received encouragement from near and far!

Our partners were pivotal in making our live stream a success. Fresh off this achievement, we quickly moved onto the next step — building an interactive, immersive experience.

“Watching the concert as a VR show completely swept Maya away. She was feeling badly from recent chemo and having the glasses and the 360 experience took her away from the yuck that she had been feeling.”

– Maya’s father (Maya is in the photo below)

VR Virtual Reality 360 Video Amos Lee Concert Experience for Maya

Hands-free, Interactive Experience

After the live stream, our team created a post-produced 360 experience with an interactive layer on top of the recorded concert. Children are not only immersed in the concert but can also interact with the environment and control playback using only small head movements, completely hands-free. This is huge for patients who are not just confined to their care facilities, but may also be bedridden or tethered to medical equipment with limited mobility.
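The hands-free control described above can be sketched as dwell-based head-gaze selection: a target activates once the head stays pointed at it long enough. The thresholds, vectors, and sample data below are illustrative assumptions, not Quantum Interface's actual implementation:

```python
import math

# Assumed tuning constants (hypothetical, for illustration only).
ANGLE_THRESHOLD_DEG = 10.0   # tolerance cone around a target
DWELL_TIME_S = 1.0           # dwell time required before activation

def angle_between(v1, v2):
    """Angle in degrees between two 3D direction vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

class DwellSelector:
    """Activates a target after sustained head-gaze, with no hand input."""

    def __init__(self, target_direction):
        self.target = target_direction
        self.dwell = 0.0

    def update(self, head_direction, dt):
        """Feed one head-tracking sample; return True once the target activates."""
        if angle_between(head_direction, self.target) <= ANGLE_THRESHOLD_DEG:
            self.dwell += dt
        else:
            self.dwell = 0.0  # looking away resets the dwell timer
        return self.dwell >= DWELL_TIME_S

# Simulated head samples at 30 Hz: the viewer turns toward a hypothetical
# "pause" target after a third of a second and holds their gaze there.
selector = DwellSelector(target_direction=(0.0, 0.0, 1.0))
activated = False
for i in range(60):  # two seconds of samples
    head = (0.05, 0.0, 1.0) if i >= 10 else (1.0, 0.0, 0.0)
    if selector.update(head, dt=1.0 / 30.0):
        activated = True
        break
```

Dwell selection is a common trade-off in gaze interfaces: a longer dwell time avoids accidental triggers (the "Midas touch" problem), while a shorter one keeps the interface responsive for patients with limited mobility.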

We have been fortunate enough to share our interactive experience with countless viewers both inside and outside of hospitals, and the response has been overwhelmingly positive.

Instructions for Immersive Entertainment for Hospitalized Kids
AT&T Shape Conference attendees enjoying the Immersive Entertainment for Hospitalized Kids

Quantifying the Impact

We started by collecting anecdotal evidence that we are helping patients escape from the discomfort of their hospital rooms. Now, we are quantifying the impact of our immersive entertainment experience as distraction therapy through clinical trials.

By pairing quantitative biometric sensor data with qualitative surveys, we can correlate the two to assess how effective this experience is as distraction therapy and what factors are necessary to build an effective distraction therapy experience.
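The correlation step can be sketched as follows; the metric names and per-patient numbers are hypothetical, purely to illustrate pairing a biometric signal with survey responses:

```python
# Illustrative analysis sketch (not the actual clinical-trial protocol):
# correlate a biometric signal, e.g. a drop in heart rate during the
# experience, with a self-reported distraction score from a post-session survey.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-patient data: heart-rate drop (bpm) vs. survey score (1-10).
hr_drop = [2.0, 5.5, 8.0, 3.5, 7.0, 9.5]
survey = [3, 6, 8, 4, 7, 9]

r = pearson_r(hr_drop, survey)
```

A strong positive `r` on data like this would suggest that the physiological measure and the self-reported sense of distraction track each other; in practice, a real study would use a validated survey instrument and a proper statistical test rather than a single coefficient.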

Streaming a Catalog of Content

As we widen our audience reach, we are also expanding the catalog of content accessible within our video player. Using AWS, we are building a repository of content that users can draw upon. Patients are able to navigate hands-free through a library of options to suit their needs, whether it be an outdoor adventure or a relaxing meditation by the fire.

Amos Lee Concert being recorded backstage in 360 degrees for Immersive Entertainment for Hospitalized Kids
Amos Lee on stage during the concert for Immersive Entertainment for Hospitalized Kids

Press Release

Austin – March 7, 2017 – Philadelphia-based singer-songwriter Amos Lee played a sold-out concert at the world-renowned Austin City Limits Moody Theater on Saturday, February 25, 2017. This time, Amos’ sold-out show didn’t just reach the crowds of the Moody Theater. With the support of partners including Quantum Interface, SubVRsive and the Ericsson team at the AT&T Foundry, the Melodic Caring Project, a nonprofit that bridges the gap between music, technology, and patients battling serious illness by live streaming personalized concerts to kids and teens in the hospital, live streamed its first-ever multi-camera, virtual reality 360 video show directly to kids’ hospital/home care rooms.

Since Melodic Caring Project’s founders, Levi and Stephanie Ware, began this journey seven years ago, the nonprofit has broadcast approximately 400 concerts to nearly 5,000 children all over the world and has partnered with some of the world’s biggest artists, including The Black Eyed Peas, Jason Mraz, Andra Day, Alabama Shakes, Rachel Platten, Daughtry and The Head and the Heart. The nonprofit gives the performing artists a list of kids (aka rockSTARS) who will be watching the night of their show. Then, during the shows, the artists call each rockSTAR out by name, offering support and words of encouragement.

The project all started at the AT&T Foundry, a place where corporations, technologists, start-ups and content creators come together to incubate ideas and test concepts. With a mission to take the viewing experience to the next level, a three-way collaboration was created between The Melodic Caring Project (MCP), Quantum Interface and AT&T Foundry/Ericsson. Quantum Interface and MCP crafted a vision to live stream a concert in fully immersive 360 Video and, by drawing in other parties from their networks, were able to make that vision a reality. With the support of Ericsson, and thanks in large part to the AT&T Foundry’s network of partnerships, the leadership vision that pioneered this collaboration, and Amos Lee’s desire to get involved, that vision came to life in Austin’s famous Moody Theater.

On the night of the show, SubVRsive, which handled the end-to-end production and distribution of the stream, placed cameras in two spots on stage and streamed the entire concert straight to the VR headsets of the evening’s rockSTARS. As the kids watched the concert, they were able to stand on stage with Amos Lee and the band, look out into an audience full of supportive faces, and leave the sights and sounds of the hospital behind to truly feel present at the show. Saturday’s concert in Austin was the first Melodic Caring Project concert that used state-of-the-art VR technology to give kids an opportunity to virtually be on stage with Amos and his band.

“One of the rockSTARS watching the show in Austin was a girl named Maya,” said Evan Blackstone, VP of Melodic Caring. “Maya had previously taken to Amos after watching one of his concerts and afterward, Amos paid her a hospital visit in Seattle. On Saturday, Maya lay on her couch in her hospital room, yet was front row at the show getting the full live experience of hanging out with her buddy Amos. He spoke directly to her and the other rockSTARS throughout the night and, although he was miles away, they were all able to be right there at the show, no longer in the isolation of their hospital rooms.”

“It’s been said that virtual reality is the ultimate empathy machine because it allows users to not just look through a frame, but step into it,” said Austin Mace, CCO of SubVRsive. “Being able to let these kids step through the frame and be on stage with Amos was a really powerful thing and we’re grateful to have had the opportunity to leverage 360 live streaming for such a great cause.”

The final stage of this project will be to combine the video with a virtual interactive layer, which Quantum Interface will be contributing. In this way, the amazing 360 video will become an interactive experience, so the children will not only be immersed in the concert, but also able to interact with encouraging messages, the environment, and other virtual content using only gaze, completely hands-free. This is a significant, first-in-the-world experience that will bring the 360 video to life even more.

“Music is a celebration of life. Musicians have such a unique, magical ability that deeply moves people. We live in a time and place that needs this energy more than ever. Especially the kids that Melodic Caring Project works closely with. Levi, Stephanie [Ware] and crew are world class. We look forward to supporting their process and helping them tell that story.”

– Amos Lee

Learn More

Check out AT&T’s featured article on the project. AT&T’s LinkedIn post. AT&T’s Tweet. AT&T’s Innovator Series post.

Check us out on the Ericsson Tech For Good Blog.

Read SubVRsive’s perspective on our project.

Read our partner Melodic Caring Project’s perspective on our project.

We are also receiving international coverage.

Check out one of our demos at the following locations: AT&T Foundry in Palo Alto, AT&T Foundry in Houston, Ericsson Experience Center in Santa Clara, Ericsson Experience Center in Plano, Ericsson Experience Center in Sweden, AT&T Forum in D.C., Quantum Interface in Austin, and Melodic Caring Project in Seattle.

Hololens Augmented Reality Pre-Production Visualization Tool for Filmmakers
  • Client: Fox Studios
  • When: 2016
  • Team: Product Manager, User Experience Designer and Researcher, Technical Manager, 3 Augmented Reality Software Developers, Film Maker
  • My Role: Product Manager
Overview

While at the AT&T Foundry, I led an Augmented Reality project to create a pre-visualization tool for filmmakers, specifically camera crews and directors, in collaboration with Quantum Interface, Economist Media Lab, and teammates at Ericsson Research. Together, our team conducted user research, designed, and implemented a complete voice- and natural-gesture-controlled HoloLens solution to a prevalent need of film crews within three days at the Fox Studios Hackathon in June 2016.

Background

In June 2016, we were invited by the Fox Innovation Lab to build an augmented reality concept to improve the entertainment production process using the Microsoft HoloLens. In response, we created Spike, a tool that helps filmmakers streamline their pre- and post-production processes by tagging sets with virtual camera marker “spikes” and visualizing adjustments to camera settings. This same concept can be applied to other virtual objects in other contexts as described in the presentation video below.

The Presentation

The Concept Video

Impact

As the first Foundry AR/VR project, this success demonstrated the value of this technology in solving real use cases beyond the gaming industry. It has since sparked interest in looking deeper into Augmented and Virtual Reality, with further explorations into other industries, and has spurred discussions across the company.

Learn More

Quantum Interface’s video of our proof of concept showcasing their innovative interface technology.

Legacy Network Machine Decommissioning Tool
  • Client: AT&T Network Capacity Engineers
  • When: 2016
  • Team: User Experience Designer and Researcher, 2 Data Scientists, and Front-End Engineer
  • My Role: Lead User Experience Designer and Researcher
Overview

Just like many other large, established companies, AT&T must contend with a world of dramatically increasing scale and complexity. For decades, experienced company veterans have been able to address these challenges manually with the knowledge, skills, and intuition built over a career in their roles.

However, we are now reaching a scale where this is no longer feasible, and we are facing a host of problems that cannot be manually addressed. By building intelligent systems, we bring radical efficiency gains to how AT&T operates. These are order-of-magnitude improvements, where something that may have taken months manually may now take only an hour.

Applying my user experience research and design skills, I worked with my team to build one such intelligent system to manage this complexity for network capacity engineers as they decommission legacy machines on the network. Our tool was extremely well received by our colleagues in the field and is currently being used in decommissioning to save unnecessary hours of labor and costs.

Legacy Network Machine Decommissioning Tool UX Mockup on single screen
Legacy Network Machine Decommissioning Tool UX Mockup on three screens

The Opportunity

AT&T’s legacy telephone wire system is made of automated, interconnected switches on a nationwide scale and has been around for decades. Maintaining this vast network requires significant spending on power. Naturally, AT&T is interested in finding equipment that can be decommissioned or replaced with newer, more efficient equipment to reduce this large footprint, all while the active, live network must remain fully reliable throughout the process.

Currently, decommissioning a machine requires weeks or months of tedious, manual planning by highly experienced and trained engineers, each with their own personalized workflow.

The Solution

We created an assistive tool for network capacity engineers that reduces this time to hours by recommending better alternatives, freeing them for their other responsibilities. Our tool helps discover and recommend plans for removing equipment from the network.

Our project focused on two primary objectives:

1. Evaluating network equipment to establish a priority order for decommissioning.
2. Creating and recommending a valid circuit reassignment plan for operators to take into consideration.
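The two objectives above can be sketched as a greedy heuristic; the data model, scoring rule, and machine names below are illustrative assumptions, not AT&T's actual system:

```python
from dataclasses import dataclass, field

# Hedged sketch: (1) rank machines for decommissioning by how few circuits
# they carry relative to their power cost, then (2) propose a reassignment
# plan that moves each circuit to a surviving machine with spare capacity.

@dataclass
class Machine:
    name: str
    power_kw: float                      # ongoing power cost
    capacity: int                        # max circuits it can host
    circuits: list = field(default_factory=list)

def decommission_priority(machines):
    """Highest priority first: expensive machines carrying few circuits."""
    return sorted(machines,
                  key=lambda m: m.power_kw / (len(m.circuits) + 1),
                  reverse=True)

def reassignment_plan(target, survivors):
    """Greedy plan moving each of target's circuits to a survivor with room.

    Returns a list of (circuit, destination) moves, or None if the
    circuits cannot all be placed (the machine can't be retired yet)."""
    plan = []
    load = {m.name: len(m.circuits) for m in survivors}
    for circuit in target.circuits:
        dest = next((m for m in survivors if load[m.name] < m.capacity), None)
        if dest is None:
            return None
        plan.append((circuit, dest.name))
        load[dest.name] += 1
    return plan

# Hypothetical fleet: an expensive, lightly loaded switch is the best candidate.
machines = [
    Machine("old-switch-a", power_kw=12.0, capacity=10, circuits=["c1", "c2"]),
    Machine("switch-b", power_kw=4.0, capacity=10, circuits=["c3"]),
    Machine("switch-c", power_kw=5.0, capacity=4, circuits=["c4", "c5", "c6"]),
]

ranked = decommission_priority(machines)
target = ranked[0]  # best candidate to retire
survivors = [m for m in machines if m is not target]
plan = reassignment_plan(target, survivors)
```

A real planner would also need to respect circuit compatibility, geography, and live-traffic constraints; the value of even a simple heuristic like this is that it turns weeks of manual enumeration into a reviewable recommendation for the engineer.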

Design Process

While my colleagues were developing the machine learning aspects to this project, I conducted user interviews to better understand the needs of a network capacity engineer to inform our product decisions.

As themes in their workflows arose, I developed quick mockups to encapsulate the needs I was hearing.

Legacy Network Machine Decommissioning Tool early UX mockup 1
Legacy Network Machine Decommissioning Tool early UX mockup 2
Legacy Network Machine Decommissioning Tool early UX mockup 3

I continued to iterate on them as our understanding of the user needs evolved.

Hand drawn pen and paper mockup of AT&T Legacy Network Machine Decommissioning Tool

As we were on a short timeline, my colleague quickly began implementing a working web-interface prototype based on my designs, turning the command line tool into a more user-friendly experience. Within a few days, we had a fully functioning version of the tool that we could share with our users.

Through usability testing with network capacity engineers, we identified dozens of changes to our working prototype and continued to iterate.

Today, this internal tool is used by network capacity engineers across the company and is radically changing the way AT&T works. This product was so successful in creating an efficiency disruption within the company that more work of this nature is being requested by internal teams. I found this highly impactful project extremely gratifying. With just a few months of applied work, we were able to disrupt an internal process to save costs and significantly change how the company and employees operate day to day.

AT&T Foundry Innovation Strategy
  • Client: AT&T Foundry Palo Alto
  • When: 2015-2017
  • Team: Team of Product Managers, Business Analysts, Data Scientists, Software Engineers, User Experience Designers and Researchers, Marketers
  • My Roles: Head of User Experience Design and Research, Business Development and Partnerships Lead, Business Innovation Strategist, Innovation Lead, and Product Manager
Overview

I joined the Ericsson team at the AT&T Foundry in Fall 2015, and my experience has been extremely diverse and fulfilling. My work spans a number of disciplines and industries as we work to create innovative solutions to challenging problems.

The AT&T Foundries are a network of six innovation centers across the world: Palo Alto, Atlanta, Houston, Plano (two centers), and Israel, each sponsored by a different company or internal organization. In Palo Alto, we are sponsored by Ericsson and jointly innovate with team members from both AT&T and Ericsson. As a member of the Ericsson team at the AT&T Foundry, I serve as a bridge between the two companies.

The Foundries were originally created several years ago as an open, collaborative environment to inspire and promote the rapid invention and innovation of strategic ideas from concept to commercialization. Foundry team members drive their own projects and champion them to stakeholders, much like entrepreneurs and intrapreneurs. Projects emerge from personal passions, business unit needs, external partnership opportunities, and our employee crowdsourcing platform.

As one of AT&T’s top internal brands, we are highly regarded as a reliable source of all types of innovation, with a special focus on efficiency and disruptive innovations.

AT&T Foundry Innovation Center Overview

Design Leadership

I was hired to lead design at the Foundry, and since then I’ve led numerous design projects, trained team members, and worked on initiatives to bring design thinking to all of AT&T. I am frequently consulted on how to apply design to products, teams, processes, and business models. My work ranges from exploratory research to ideation to prototyping and implementation.

Projects

For instance, since 2016, I have collaborated with my teammate on the design of a highly trafficked enterprise product that provides a complete, seamless, self-service experience for incubating and validating any virtual network function against AT&T’s Domain 2.0 Architecture, as part of the release of ECOMP (Enhanced Control, Orchestration, Management and Policy). Check out some press on this project and how we are in the process of open sourcing it in 2017:

Has AT&T ICE’ed VNF Onboarding?
AT&T ICEs Vendors of Virtual Network Functions

In 2015, I embedded myself with our key enterprise customers to understand their needs and expectations around our new Network on Demand product. My research uncovered key product strategies and features that our team was able to champion within both companies.

In 2016, I led design for yet another engaging, enterprise design project, though for an internal tool this time. Our efficiency innovation reduced the time for network capacity engineers to plan out how to decommission legacy network machines from several weeks to hours.

Leadership

Over the years as the head of design at the Foundry, I have trained my teammates in design thinking through collaborative projects, coaching, mentorship, and workshops. In turn, my colleagues are now applying design thinking to their work and even leading training sessions of their own.

One such instance occurred in 2016 when my colleague and I conducted interactive design thinking workshops for middle school and high school women interested in technology. Read more about our contribution here: AT&T hosts Girls in Future Technologies (GIFT) Day

Through my guidance, my design team has found success quickly identifying and solving difficult problems they could never have imagined. One team member spent several weeks understanding the needs of Uverse and DirecTV installation and maintenance technicians. His research uncovered so many high-impact opportunities that when we presented our work to the Senior Vice President responsible for these teams, the SVP immediately allocated millions of dollars in resources to addressing these findings.

Business Strategy and Partnerships in Emerging Technologies

While at the Foundry, I have had the pleasure to engage with hundreds of startups working in emerging technologies. In early 2016, I became especially curious to learn more about augmented and virtual reality and identified it as a high potential path to new revenue streams.

With my market research, I educated, championed, and began the dialogue within AT&T to apply these emerging technologies to our business strategy. As concrete demonstrations of strategic AR/VR plays, I sought out and developed strong partnerships with leaders in the industry to create cutting-edge projects.

In one such project, my partners and I created an augmented reality, spatial tagging tool for camera teams and film production crews using the HoloLens, in collaboration with FOX Studios.

In another, my partners and I developed hands-free, interactive, virtual reality entertainment experiences for hospitalized children and tested immersive entertainment as a form of distraction therapy.

Marketing

Along with our numerous technology and design projects, marketing, partnerships, and thought leadership are similarly top of mind. I have contributed to visual design, information architecture, content strategy, startup partnerships, and marketing and brand collateral for several key initiatives, some of which are elaborated below.

Specifically, six times a year, we host the Futurecast Series, where an honored guest participates in a deconstructed panel and the curated audience is invited to join the conversation between the guest and moderator. Before the discussion, we also invite relevant startups to demo their latest and greatest to attendees. All our past and future events can be found on our website.

In 2016, we also began the Futurist Reports. In this series, we dig into technologies and trends while highlighting key insights that are reshaping entire industries and our world-at-large. Each report includes an industry-wide view from a diverse array of leading experts and features select startups at the forefront of technology. We delve into broader business implications of these technologies and explore indicators such as collaborations, investments, market demands, and technology advancements. Check out our latest reports on the Future of Drones and the Future of Entertainment.

Barnes & Noble Nook Media Tablets
  • Client: Barnes & Noble Nook Media
  • When: 2011-2014
  • Team: 3 Mechanical Product Design Engineers, Electrical Engineer, and external Contract Manufacturing team
  • My Role: Product Design Engineer
Overview

I joined Barnes and Noble’s Nook Media team in Fall 2011. Though the company headquarters are in New York, Nook headquarters were in Palo Alto. Nook functioned much like a startup within the larger company, where we had the advantages of agility and resources.

Products

During my two and a half years at BN, I worked on the design of about ten tablets and accessories. Some were products I joined during the later stages of the development process and most were early concepts that we took through various stages of the development cycle.

Nook HD+ (shipped)
Nook HD (shipped)

Barnes & Noble Nook Media Tablets: Nook HD+ and Nook HD

Product Design

As a Nook Product Designer, I was offered the opportunity to dive even deeper into Hardware Product Design through the mentorship of my team. Over the years, I worked on almost every subsystem in a tablet, including custom batteries, cosmetic housing and structural parts, displays and touch screens, audio systems, antennas, buttons, connectors, PCBs and flexes, media and camera systems, and high-level system architecture.

In teams of three, we not only designed the overall system architecture and the detailed mechanical parts, but also collaborated with internal Engineering, Marketing, and Operations teams and external ODMs, consultants, partners, vendors, and suppliers.

Design Strategy and Research

During my final months at BN, our team did a massive re-evaluation of our roadmap. We worked side by side with multiple design consultancies to brainstorm and strategize what direction we ought to take our products. Over the course of a few months, we used our previous user research data to settle on some unique experiences that would be enabled by innovative product form factors and materials. It was very exciting to push the bounds of what we think of as a typical reader, and to design not just one unique experience but a whole family of products that interplayed with each other.

Patents

While at BN, we invented a number of unique assembly techniques for consumer electronic device product design.

US 20140201997 A1: Method for split wire routing in a cavity for a device
US 20140204547 A1: Apparatus for split wire routing in a bracket for a device
US 20140201996 A1: Techniques for split wire routing for a bracket in a device

US 20140026411 A1: Techniques for efficient wire routing in a device
US 20140027166 A1: Techniques for efficient wire routing in electronic devices
US 20140029218 A1: Apparatus for efficient wire routing in a device

HP Palm Smartphones
  • Client: Hewlett Packard Palm
  • When: 2010-2011
  • Team: 4 Mechanical Product Design Engineers, Electrical Engineer, and external Contract Manufacturing team
  • My Role: Product Design Engineer
Overview

Hewlett Packard acquired Palm soon after I joined in the summer of 2010. During my time at HP Palm, I was a member of the team that shipped the Palm Pre 2 and Palm Pre 3 phones. I learned and experienced a lot in my first year out of Stanford!

Products

During my year and a half at Palm, I worked on the design of about ten smartphones. Some were products I jumped onto during the later stages of the development process and most were early concepts that we took through various stages of the development cycle.

Palm Pre 2 (shipped)
Palm Pre 3 (shipped)
Windsor Not (never released)

HP Palm Smartphones: Palm Pre 2 and Palm Pre 3

Product Design

As an HP Palm Product Designer, I put my Product Design and Mechatronics degrees to good use. In teams of three to four Product Designers, we designed and developed entire smartphones from concept to shipping. Along with designing the overall system architecture and the detailed mechanical parts, I coordinated with and negotiated between various internal Palm Engineering, Marketing, and Industrial Design teams and external ODMs, vendors, and suppliers. During the product development process, I regularly analyzed and developed solutions for reliability and manufacturing failures.

Design Strategy and Research

While working on product development, I explored and prototyped concepts for future devices and physical interactions and experiences. I used market research and competitive analysis to inform my initial design directions. For instance, I primarily focused on what a dual screen experience might be as well as what a biometric (fingerprint sensor) experience could look like.

During this research, I realized that there was a huge need in the market for a candy-bar smartphone with a streamlined OS, and what better system than webOS? In our downtime, my colleague and I began designing the concept for a new keyboardless phone. We pitched it to our team lead and manager, and within weeks our grassroots design gained traction from all over the company. Soon, our design was not only a product on the roadmap, but the flagship product, drumming up support, excitement, and hope throughout the organization. We proceeded to develop our design through nearly the final stages of development. Though our product was ultimately canceled, I am extremely proud to have been a part of that experience.