Key Discovery: You always need to remember who the product is for and whose problem you are solving. After pointed feedback from Y Combinator’s Michael Seibel, I realized a general solution is not enough to turn an idea into a product. This insight has influenced all my design work since.
Outcome: A functional prototype available on TestFlight and on-site interviews at Y Combinator and Boost VC.
The idea for Relay was born from a UX class project I built to see how a mobile app might improve Tesla’s onboarding experience for new Model 3 owners.
Augmented Reality (AR) provides a compelling way to teach new Tesla customers about their Model 3.
How can a mobile app help Tesla deliver more cars while still giving customers a great experience?
"The delivery of the cars is where the investment is needed. We need to deliver three or four times as many cars."
My goal here was to properly investigate the problem before jumping to any solution. I started by gathering all kinds of information from different sources and then used affinity mapping to organize and highlight key insights from the data.
I talked with a Tesla engineer to understand the company’s goals and pressing needs.
I visited Tesla retail locations to talk with sales employees and prospective customers to see what the experience was currently like.
I talked with Model 3 reservation holders and owners as well as read dozens of internet forums and articles to better understand users’ expectations and experiences.
The informal conversations and research enabled me to draft and finalize a questionnaire for more precise research and analysis.
I sent out a survey to the network I had established and received 30 responses.
From those results I conducted 4 phone interviews.
"People need time with the car at their own pace."
Augmented Reality (AR) allows users to learn at their own pace in 2 unique ways: projecting a virtual vehicle and annotating physical vehicles with virtual instructions.
Creating mockups for an AR app exposes unique challenges for prototyping and user testing, but I won’t describe them here. Many early iterations were also tested on paper and aren’t shown. Instead, I want to focus on the key results that answer user concerns from my research.
AR projection gives users a show-room experience in their own driveway.
With the virtual car, we can explain the quality of the construction and battery maintenance in an exciting and interactive way.
Walk in and explore the virtual car as if it were real. We can guide them to common topics and questions.
After an initial deadline was met, I returned to the project with a fresh perspective. As I considered user feedback and development constraints, I was convinced a more streamlined interface was needed.
The new design removes all of the annotated cards and instead has a universal menu which adapts depending on what is in the viewfinder.
These changes reduce visual clutter and improve clarity.
Here we see a 3D overlay showing the user how to activate Autopilot.
This design adapts better to portrait orientation, which is how most people hold their phones.
A secondary menu could be added along the top to direct users to specific topics.
How do you turn a prototype into a functional app? How do you turn an app into a product and build a business?
A business partner and I set out to generalize the Tesla concept to other learning applications. Our initial topics of interest were automotive guides (like Tesla), repair for older vehicles (like how to change your oil), and furniture assembly (think Ikea).
There’s a long story here, but most of it is out of scope as a UX project, so here is a quick summary.
In order to build something like this you need machine learning to detect physical objects, and augmented reality to annotate those objects or project virtual ones in physical space.
There are 5 general approaches to this available directly from Apple, and several other libraries from Google and Amazon.
All of them are built on similar principles and therefore have similar shortcomings. Most noticeable for us was difficulty with reflective surfaces, which made identifying cars like a Tesla nearly impossible across uncontrolled lighting environments.
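To make the two-stage pipeline concrete, here is a minimal sketch in Swift of how detection and annotation fit together. The type and function names are illustrative, not from the actual Relay codebase, and the detector is a stub standing in for a real Core ML model:

```swift
/// A detected object: a label plus a confidence score from the model.
struct Detection {
    let label: String
    let confidence: Double
}

/// Stage 1: object detection (a trained Core ML model in practice).
protocol ObjectDetector {
    func detect(inFrame frame: String) -> [Detection]
}

/// Stage 2: map a detected label to the AR overlay content to display.
protocol Annotator {
    func annotation(for detection: Detection) -> String?
}

/// Stub detector standing in for a trained model. A real model runs on
/// pixel data; reflective surfaces (like a Tesla's paint) are where its
/// confidence tends to collapse, as described above.
struct StubDetector: ObjectDetector {
    func detect(inFrame frame: String) -> [Detection] {
        frame == "driveway" ? [Detection(label: "model3", confidence: 0.4)] : []
    }
}

/// Only show an overlay when the model is confident enough.
struct GuideAnnotator: Annotator {
    let minimumConfidence = 0.6
    let guides = ["model3": "Tap the door handle to begin the tour"]

    func annotation(for detection: Detection) -> String? {
        guard detection.confidence >= minimumConfidence else { return nil }
        return guides[detection.label]
    }
}
```

The low-confidence stub result mirrors the reflectivity problem: the pipeline simply refuses to annotate when detection is unreliable, which in practice meant no guide at all.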
Technical limitations forced us to shift away from our initial MVP idea of a Tesla companion app.
The concept could be applied to many different situations, but we needed a place to start that was both feasible for me to build by myself and would be useful enough to gain some initial traction.
As such we investigated many different options, including home repair, furniture assembly, and cooking. We visited stores such as Ikea to talk with customers about their assembly experience.
We kept experimenting and built a few demos of possible use cases. We had to find funding quickly, as our runway was limited.
We applied and were accepted into Startup School. We then applied and interviewed in person at Y Combinator in Mountain View but ultimately were not accepted into the Winter 2019 batch.
Ultimately we struggled to find an initial market to really go after, and to find the machine learning expertise to build a proper MVP rather than just a tech demo.
By using machine learning to recognize specific components, the app can provide guided AR instructions projected over the real object.
I experimented with several different libraries before settling on Apple’s ARKit because it allowed for much more rapid experimentation.
On the Model S you can adjust the stop point of the motorized hatch. I often would forget how to do this.
So I built a quick guide which recognizes the hatch handle and tells you what to do.
Adjust the hatch where you want, then push and hold the button until you hear a beep.
The Tesla support page is also presented as an example for quick access.
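The hatch guide above boils down to a tiny step-through state machine driving the AR overlay: show one instruction at a time and advance when the user taps next. A minimal Swift sketch, with illustrative names rather than the actual Relay implementation:

```swift
/// Steps through the hatch guide one instruction at a time.
struct HatchGuide {
    private let steps = [
        "Adjust the hatch to the height you want.",
        "Push and hold the button until you hear a beep."
    ]
    private(set) var index = 0

    /// The instruction currently shown in the AR overlay.
    var currentStep: String { steps[index] }

    /// True once the last instruction is displayed.
    var isFinished: Bool { index == steps.count - 1 }

    /// Advance when the user taps "Next"; no-op past the last step.
    mutating func advance() {
        if !isFinished { index += 1 }
    }
}
```

In the demo, recognizing the hatch handle instantiated a guide like this and anchored its current step next to the real object.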
Another experiment I did was to give quick tips for everyday tech items.
Here I highlight a few Mac keyboard shortcuts such as how to take a screenshot.
A common theme of this project was to take the frustration out of learning and make it fun. As such another industry I investigated was board game instructions.
Here is a simple tech concept showing that we can recognize different cards in a player’s hand. This would let us build interactive guides for a new player.
As it stands at the beginning of 2019, the only tangible result is a tech demo available on Apple’s TestFlight. And while I’m disappointed at not getting funded by YC and building the next great thing, I’m really proud of the work here.
The journey and what I learned along the way have proved to be the real value here. This project stretched all my muscles, from design to engineering, and pushed me to learn new techniques for prototyping and building in a brand-new medium like AR.
Furthermore, this project really tested and forged my perseverance to complete a working demo despite many technical and business-related challenges. Going through the ups and downs of being invited to YC, ultimately not making it, and still continuing was also a true character builder.
For now Relay remains on hold, but when I am able I will return to it; I think there is still potential here.