IBM Immersive Insights

Augmented Reality application for data analysis.

Role: Global Design Lead

I started and led a global design program for AR product development at IBM Analytics. My team creates the next generation of enterprise solutions, exploring the intersection of data analysis and cognitive technologies with Augmented Reality and Mixed Reality. We established a design process for AR development that spans generative research, insight analysis, user testing, ideation, UX design, design strategy, and cross-team collaboration. I also generated and owned the project vision, execution, quality, and client relationships.


IBM Immersive Insights was inspired by the way we think. What began as a daydream for some and curiosity for others, has become a stepping stone to a new reality. We have real-time data streaming to a headset and displayed in 3D, fully manipulable and scalable. It is also our way to begin exploring, creating and refining the next generation of enterprise applications for data analysis.

This tool helps users explore their data and communicate findings. It brings the power of data science tools to Augmented Reality (AR) visualizations, improving the user experience, data exploration, and analysis. The project was initially designed with data scientists as the primary users; however, Immersive Insights has broad potential in data analysis and presentation, and can benefit business leaders, executives, analysts, and many other users.

With Immersive Insights, we aim to enhance workflows, create a shared collaborative space, and overall enhance the power of data science.


Design Challenge

For designers, the challenge with Immersive Insights was unique: we explored the new field of three-dimensional user experience design while learning how to apply it to data science. Augmented Reality is a relatively new technology, and there were very few design resources for us to rely on when beginning the project. We had to create our own design guidelines, working mostly through trial and error to establish a new set of best practices for this type of interface. Among the questions we had to answer: what are the ideal dimensions and measurements for the interface, how do we respect the user's personal space, what is the right distance and placement for interaction, and how do we create a holistic sensory experience for the user? We made, tested, and implemented these core user comfort guidelines without any precedent. We faced these design challenges while also considering the complexity of data science and how our design could support users' workflows.

From low- to mid- and high-fidelity conceptualization to explore UI/UX.

User Experience

User testing is integral to our design process. To create the best possible interface design, we worked from low- and mid-fidelity interactions up to high-fidelity photorealistic 3D renderings, and conducted user testing with the AR headset.

Understanding how data scientists work was a challenging task. We performed user research that included many hours of conversations with different data scientists, along with many rounds of iteration and testing with them, to help us build something valuable.

We found that our users value direct collaboration and constant communication in our product. We built our server to allow multiple users worldwide to view the same information at the same time, allowing for efficient and constant collaboration. Users can explore the data and share insights in real time, all in the same virtual space as each other.

Because AR blends the physical with the digital, one of the big improvements we made to our design process was integrating industrial design principles into the creation of the digital elements of our experience.

To design our AR UI elements, we explored physical objects and materials.

Information panel exploration.

Immersive Insights “War Room” concept

Virtual Assistant

One of the most interesting features we integrated into Immersive Insights was a virtual assistant built with Watson cognitive technologies, such as speech recognition, to help our users navigate and explore their data in a more natural and efficient way. We designed the assistant's personality, look and feel, and voice user interface (VUI).

Virtual Assistant exploration and final concept.

Activations & Recognitions

Our project has been presented at many conferences and trade shows around the world. We were one of the main IBM activations at SXSW 2017, where I got the opportunity to meet IBM's CEO Ginni Rometty and talk with her about Immersive Insights. We were also part of the Strata + Hadoop Conference, VRDC, Spark, and Fast Track Your Data, to name a few.

In 2018, the team won a Spark Award and was a finalist in the Information is Beautiful Awards and the NEXT Design Award.

In 2019, we won the Indigo Design Award: Gold in UX, Interface and Navigation, and Silver in Interactive Design.

Immersive Insights was recognized with several design awards.

A constant line to try our product. Demoing to Phil Gilbert, GM of Design. Meeting IBM's CEO Ginni Rometty.

Rob Thomas, GM of IBM Analytics, featuring our demo at the T-Mobile Arena in Las Vegas, NV.