It was with great excitement and anticipation that we waited for last week's unveiling of the new Apple augmented reality headset, or spatial computer, as they're calling it. Whatever the title ends up being, the vision for this new product is pretty clear: blend the real world with the digital. Pretty exciting stuff alright.
Now, the concept of augmented reality isn't a new one. In fact, I can dust off a few blue-sky thinking projects from back in my college days, when I explored the use of augmented reality and nanobots to help people working in agriculture make optimal use of their land based on certain geological factors and predictive analytics. At the time (around 10 years ago), the thinking was that these types of solutions were just a few years away from being a reality, if only the technology would catch up.
Well, here we are.
I have to say, at first glance, Apple seems to have a lot of the bases covered. The device itself looks semi-comfortable, if not that transportable (I don't bring my MacBook with me when I'm out and about). But in my opinion, where the device comes into its own is the user interface. Apple products are renowned for their ease of navigation and responsiveness, and I expect nothing less from this new offering. Personally, I was really excited to see the new methods for navigation:
Eyes - You use your eyes to focus on an element and click
Voice - More voice search; mmmm, okay I guess
Flick - To scroll; I love smaller hand gestures
If Apple can bake immersive control into the user interface of visionOS, then I do think they will be onto a winner. Good UX is more than just making an interface look nice; subtle nuances such as transitions and haptic feedback go a long way towards providing a better overall user experience.
A subtle hint at what designers should be looking out for in the introduction video was the mention of scenes. This is almost Apple's way of telling us that the best experiences on visionOS will be highly curated and well-thought-out user flows.
As I look up at my computer screens while writing this, I currently have 5 different windows open, 3 of which are browsers, and each browser has at least 10 tabs open. Apple will do everything they can to steer users away from cluttered and overwhelming visionOS environments. As someone who loves nothing better than shutting down all my tabs and clearing my desktop of icons, I tend to agree with this approach and think it will provide a better overall user experience.
A spectrum of immersion
One of the coolest things I think we will see when designing environments for visionOS will be the use of fluid 3D experiences to grab attention and focus our users on intended user flows. By giving our UI windows volume in the space, it should be possible to nudge users in the right direction and keep them focused on the job at hand.
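As a rough sketch of what "giving a window volume" could look like in code, here's a minimal SwiftUI example using visionOS's volumetric window style. The app name, window identifiers, and placeholder views are all hypothetical, just for illustration; the `WindowGroup`, `.windowStyle(.volumetric)`, and `.defaultSize` APIs are the visionOS mechanisms for declaring a flat window versus one with real depth in the room:

```swift
import SwiftUI

@main
struct TourPlannerApp: App {  // hypothetical app for illustration
    var body: some Scene {
        // A standard 2D window for the main itinerary list.
        WindowGroup(id: "itinerary") {
            Text("My Itinerary")
        }

        // A volumetric window: the same scene type, but rendered
        // with depth so a 3D tour preview can sit in the user's space.
        WindowGroup(id: "tour-preview") {
            Text("3D tour preview goes here")
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.6, height: 0.6, depth: 0.6, in: .meters)
    }
}
```

The idea is that the flat window carries the dense, text-heavy flow, while the volumetric one is the attention anchor that nudges the user towards the intended next step.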
How will we use this at TripAdmit?
We have done extensive research into the different types of people who book tours and activities online. Some people book when they're at their destination; others, the people we think would benefit most from a visionOS solution, are the 'planners'. Planners will assemble their tour itinerary months in advance of their travel date, they will watch influencers to get their recommendations, and they really like having all their details stored centrally. With Apple's mention of device detection, we think there will be a good crossover between planning a trip and then taking all the relevant details of your itinerary with you.
It's still early days here at the TripAdmit skunk works, and we probably won't get our hands on one of these devices until later this year. But with only 8 third-party companies worldwide currently working on a visionOS solution, we're happy with our initial dipping of toes for now.
Link: Here is the college project, for anyone interested: https://www.behance.net/gallery/11801217/Fresh-Farming-Solution