Behind the Scenes of the World’s Best XR Designers
Interview with Morgan Fritz from ODG
At Halo Labs we are on a mission to reveal the ultimate workflow for creating XR digital products, and we believe that this mission can only be achieved by working together with the community, sharing knowledge and exchanging thoughts.
Behind the Scenes of the World’s Best XR Designers is a series of interviews with the best and brightest talents from top companies in the immersive space, sharing with the community their failures, best practices and lessons learnt on the way to building outstanding XR products.
Beside the (important) fact that Morgan Fritz, Head of UI/UX Design at ODG, is an amazing human being, she is leading the design efforts for one of the market leaders in the XR space. Morgan contributes to the XR community a lot (see links below) and was kind enough to share some insights into her workflow in this interview.
Osterhout Design Group (ODG) is a leader in Augmented Reality Smartglasses and extended reality technologies, and has a long history in the evolution of mobile computing. ODG’s current work aspires to transform how people engage with one another and the world around them in exciting new ways.
ODG has in-house hardware and software development here in San Francisco. The hardware team has developed a series of stellar AR Smartglasses for users across consumer, industrial, enterprise, and business mobility, and the software team is currently developing our SDK for R-8 and R-9 glasses.
Can you tell us about your background and how did you end up working in the AR space?
A bit of luck, actually! A progression of influential experiences has led up to where I am now.
I studied Industrial Design at Carnegie Mellon University, and minored in mechanical engineering to understand the full process from design to production. This same desire for holistic understanding fed perfectly into User Experience design; different medium, same thinking methods. I began as a UX designer at IBM, where I developed invaluable research, ideation, and production skills, and was part of a stellar team that took a product from ideation through to development.
Shortly thereafter, an exciting new opportunity arose to lead software design at ODG. Relying upon spatial thinking methods from my Industrial Design background and the systematic thinking of UX, I was able to merge the aspects I enjoy most from each.
Not until I actually began work, however, did I realize just how exciting the innovation at ODG would be. Augmented Reality feels like giving a painter a new set of colors that have yet to ever touch a canvas. We can certainly build from conventions tried and true for mediums like print, web, and mobile, but there’s so much more to question, break, and create anew.
Tell us about your design team
I am part of our core design team, which sets standards and best practices. I work closely with our creative director, VP of headworn, software engineers, and our designers specializing in experimental work, 3D modeling, graphic design, and UX/UI. I sit with the software engineers, and can literally lean over to ask a question or discuss an idea. Additionally, I work with sales and marketing to create demo content and share resources ranging from “cool demos” to relevant research findings or customer feedback. I really enjoy this casual collaboration that happens organically within our team.
What is the biggest challenge you face around your workflow and how do you solve it?
By far, the greatest challenge is how much of my process is still a bit of guess and check. We have no prior model for how to do this, and at the same time, no limitation on what we can do. User testing and prototypes are heavily relied upon, but it is still challenging to carry over discoveries from a VR prototype in one tool to an AR prototype in another. At the moment, there just is no “one stop shop” tool for rapid prototyping in AR, although there are some that are coming very close, including Halo Labs. But this challenge is also what makes AR so exciting.
The best way I’ve found to remedy this is to just keep play testing. I’ve explored a large number of tools to find which are most compatible with one another and work for my process. The challenge has been a fruitful and fun exploration.
Can you walk us through your design workflow?
One example is that I have been working on a long-term project creating ODG’s headtracking experience. I help with demos at times, but otherwise my process is specific to the user-orbiting content built for consumer hands-free use.
I began the project with a known persona and quite a bit of market research. We began by discussing ideas, sketching out general directions on paper, and then building mockups in Sketch and After Effects. Although it’s flat, Sketch remains the fastest way to ‘pump out’ user flows and ideation concepts. However, I’ve found a few tools to view the flat content in VR and on our glasses as I work to start visualizing early on.
I will say my AR design process consists mostly of informed play. We repeat this simple prototyping process multiple times, and frequently user test both externally and around the office, sometimes formally and sometimes by pulling in coworkers just walking by.
The internal user tests have been the most beneficial part of my process. Every person I work with brings a different perspective to the table, not only in terms of typical user testing, but also their own expertise at ODG. They bring up concerns of their own that are specific to our glasses. This range of focuses helps us see more angles of a problem than the user testing questions alone, which I find to be the best tool for creating a unified experience!
From here, I start working on more refined designs and prototypes in Halo, and begin exporting UI for developers. I’m fortunate to sit a desk away from our software developers, so our process is relatively informal. I do create final files to hand off; however, I’ve found that we’ve had the most success in huddling up and discussing everything from questions to problems as they arise. Quick clarification, or leaning over and asking for a modal UI, gets things done when people need them and has proven very helpful.
Next, I am learning Unity so that I can start using it regularly to create more robust prototypes for final user testing before development. I’m especially hoping to begin using the EditorVR tool and see how it can assist my process as well!
If you are interested in more details, I have been compiling my findings into an ODG Process Article and Prototyping Article to share, which I hope to continue to build upon with the help of others!
What are the UI/UX best practices you always keep?
Organization is key! Naming shared content consistently and organizing strategically saves time and cuts out a LOT of frustration later on.
My best practices revolve around consistency. The options are limitless when designing for AR/VR; however, there is still a learning curve for the average user. The best way to address this is through consistency. If the user has a starting point of knowing what they can select, how to make selections, and how to navigate through content, they can begin to build familiarity. Core familiarity then allows users to move on to more complex operations once the basics are understood. I also recommend initial walkthrough tutorials for clarity.
The Process Part I article above goes more in depth on my organization best practices, and we also have an ODG Best Practices Article that you can check out for designing on our glasses!
What surprised you the most about designing for AR?
How much unexplored terrain is out there! The seemingly limitless possibilities are both intimidating and liberating. I’ve learned to try and let everything feel open and flexible until user testing tells me otherwise. Even in these early days of exploration, I’ve seen such powerful ideas and applications of AR, and this is just the start!
What were the biggest mistakes you made when designing for AR at first?
Staying inside the box. Literally. Especially when prototyping with Sketch, I had to force myself to forget about the edges of the artboard and remember that content can live all around the user. It took a few “oh duh” moments to move away from my formerly ingrained process and change my set of constraints.
There’s a bit of a learning curve on the technical side as well. Looking back, I wish I had spent more time upfront just playing. Not making anything in particular, but figuratively pressing all of the buttons to see what they do and discover what I don’t know. It takes that hands-on experience to move from concept to comprehension.
What advice would you give to a designer or a team starting to work in the AR space?
Don’t lose track of your design process! Remember that you’re designing for a user and not for the technology, so be mindful as to why you’re harnessing the capabilities of AR. Is AR applied for the sake of novelty or is it adding value?
Don’t forget that you still have a target user, and they’ll still have specific needs that drive your design. User testing is key. Rethinking your tools and process to user test for AR is challenging at the start, but is well worth its outcome!
How different is designing for AR glasses compared to mobile AR?
There are similarities between glasses and mobile, such as designing spatially for various environments, and knowing how to inform users where to face. However, I would say a big difference is input methods.
I don’t have a screen. Selecting content works very differently in AR glasses. A mobile phone has the option to tap to select, and to keep buttons always available on the screen. Additionally, users have relative familiarity with a phone, which makes for a smaller learning curve, but this will likely change as HMD devices become more widespread and affordable.
When it comes to AR glasses, there is quite a bit to teach the user. If my content is headtracking based, how do they know they can select one element by hovering, or that another scrolls? Additionally, we have multiple means of input, such as headtracking, buttons on the glasses, a Bluetooth keyboard, and a Bluetooth remote. How does a user switch between these, and how is that consistent across various apps? “Expected interactions” are being established, such as HoloLens’ airtap, but the majority have yet to gain the same familiarity we have with our smartphones.
What makes you excited working at ODG and what is the most exciting project you are working on these days?
We’ve been creating an experience that is entirely headtracking based. It began feeling very futuristic as a concept. However, as we build and test, it is so exciting to see that it truly WORKS! The tangibility of each tiny successful step feels so big. I sometimes think to myself, “what if no one’s thought of this before..?” It’s this constant stream of discovery and excitement that drives me each step of the way.
Additionally, the community of the AR/VR world has really changed my perspective on innovation. I’ve learned something from just about every conversation I’ve had with people in the space, and I’ve never experienced a field so filled with passionate people. Everyone’s excited to share and discuss ideas, the future of AR, and their process. I’ll be the first to admit I still have a lot to learn and explore, and I appreciate working with people who support and encourage that, not only within ODG, but in the greater community of AR/VR.
Anything else you would like to share?
Thank you for taking the time to listen to my thoughts and explorations. I will say that what truly makes the AR/VR space so exciting is how accessible it is. From tinkerers to those working in the field, anyone can contribute, resources are open sourced, and there’s no rulebook! It’s a limitless expanse for creativity, and I’m extremely excited to see where it will take us!
Feel free to connect with us: Halolabs.io | Twitter | LinkedIn