Tomorrow will hopefully see the arrival of the Apple Watch’s first serious competitor, in the form of Samsung hardware running Google software. The smartwatch space is in serious need of competition, but it’s unfortunate that the two competing efforts are fundamentally the same. This sameness has profound ramifications for how we will experience technology for the foreseeable future.
The big complaint about Google products is that many arrive before their time. As the years go by and successors are announced, it’s clear to me that this was the case with a trio of related efforts between 2012 and 2015. Had they been better executed and – more importantly – met with greater commitment, the technological landscape today could be significantly different. Namely, it wouldn’t be two ecosystems run by Apple and Google doing basically the same things, just slightly differently.
Google Now: the basis for something different
It could all have started with Google Now, a proactive feed that showed weather, upcoming calendar events, birthdays, commute and travel information, package alerts, movies and concerts you might like, nearby events, places, and restaurants, news, and much more, including information from third-party apps.
It was announced in July 2012 with Android 4.1 Jelly Bean, made its way to iOS a year later, and came to Chrome and Chrome OS in 2014.
All of this was displayed via a powerful card metaphor that only showed relevant information. Users had a feed – accessible to the left of Android’s main home screen or quickly launched by swiping up from the home button – to track their day and see what was coming next.
You didn’t need to go into different apps to see the details of an upcoming flight, check your email to see when a package was arriving, or open a plethora of first- and third-party apps to find your information. In those apps, you’re subjected to different layouts and have to learn different behaviors to access what is fundamentally your information.
Google Now broke down those data silos and integrated everything into a cohesive, familiar interface.
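To make that silo-breaking concrete, here is a minimal, hypothetical sketch – not Google’s actual API, just an illustration of the architecture – in which heterogeneous sources all reduce to a single card type and are ranked globally by relevance instead of living behind per-app screens:

```kotlin
import java.time.Instant

// Hypothetical model: each source (calendar, mail parsing, weather, etc.)
// reduces its data to the same Card shape, so the feed never needs to know
// which app the information came from.
data class Card(
    val title: String,     // e.g. "Flight UA 123 departs at 14:05"
    val source: String,    // which silo produced the card
    val relevance: Double, // 0.0..1.0, higher = more useful right now
    val showAt: Instant    // when the card becomes relevant
)

// Every silo exposes its data through the same tiny interface.
fun interface CardSource {
    fun cards(now: Instant): List<Card>
}

// The feed: ask every source, keep what is currently relevant, rank globally.
// One layout and one interaction model, regardless of where the data lives.
fun buildFeed(sources: List<CardSource>, now: Instant): List<Card> =
    sources
        .flatMap { it.cards(now) }
        .filter { card -> card.showAt <= now }
        .sortedByDescending { it.relevance }

fun main() {
    val now = Instant.now()
    val sources = listOf(
        CardSource { listOf(Card("Leave by 13:20 to make your flight", "calendar", 0.9, it)) },
        CardSource { listOf(Card("Package arriving today", "mail", 0.6, it)) },
        CardSource { listOf(Card("Rain expected at 18:00", "weather", 0.4, it)) }
    )
    buildFeed(sources, now).forEach { println("[${it.source}] ${it.title}") }
}
```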
A year after the launch of Google Now, Google Glass was unveiled, and Android Wear followed in 2014. Though very different hardware form factors, they shared a common goal: presenting Now cards.
On Glass, you swiped horizontally on the side trackpad to move through cards on the small glass prism. Cards to the left of the home/clock screen were relevant now or upcoming: weather, an upcoming flight, or calendar events. Cards to the right of the home screen were things from the past, like messages and photos/videos.
Meanwhile, swiping up from the watch face on the original Android Wear took you through full-screen cards for message notifications (Hangouts), weather, your Google Fit step count, commute times home or to work, upcoming flight information, music controls, and calendar events.
If you take a step back, it’s wild to realize that from 2012 to 2015, a consistent, glanceable feed was the primary way you navigated two new form factors and a central part of the phone and tablet experience on Android.
As a user of Google products, this three-year period was exciting for the consistency it brought. It seemed like the company had a clear view of your data, freed from app-based silos and accessible through Google Now. The parallel development of Glass and Android Wear seemed to indicate that something was coming after the app-driven smartphone.
In the short term, you would primarily access these cards on Android phones and tablets, while Wear gave you that information on the go, with the added benefit of quick music controls and message replies. Meanwhile, Google foresaw a future of face-mounted wearables offering instant access to a camera and the promise of augmented reality, where information is superimposed on the world. As display technology matured, the prevailing theory was that this would replace the smartphone – rather than the smartwatch – because your next display wouldn’t be physically constrained by a panel. It could be projected into the world at effectively infinite size and even used for office work when paired with a keyboard and mouse.
Of course, none of this saw the light of day.
A typical retreat
Google Now was phased out over the course of 2016. It became less focused on you and was converted into a “feed” that showed web articles and videos matching your interests. Discover, as it’s called today, surfaces interesting content, but not the personal information relevant to your day.
Google Assistant was the intended replacement for proactive assistance, but the company was quite adamant that users get that assistance by voice rather than from a glanceable list of cards. Google’s worldview here was heavily informed by Amazon Alexa and the Star Trek computer vision of the future, where people interact with their technology through two-way conversation.
For those who don’t like speaking out loud, however, that’s a big ask, and Assistant did eventually end up surfacing information visually through the Assistant Snapshot feed and on Smart Displays. But it’s clear that the visual presentation of information in Assistant never had the same level of vision, the same central placement on the phone, or the broad support that Google Now did.
Elsewhere, Android Wear got much more complex with version 2.0 and its stated goal of making your watch a standalone device. While Google wanted to make Wear less dependent on the phone, it ironically ended up replicating the entire smartphone paradigm: the feed you swiped to access became primarily a home for phone notifications, while apps took center stage with an on-device Play Store.
In a way, you can’t fault Google for going back to the familiar idea of apps. If users want something – to check the weather or see sports scores – they’ll look for the appropriate app. That behavior is ingrained.
However, you can fault Google for chasing Apple. The key context for this period in which Android Wear grew much more complex was, of course, the Apple Watch. The initial version of watchOS tried to do everything, to the point that it had to be pared back before it made much sense to people.
But before that watchOS simplification happened, Google apparently took everything the Apple Watch could do as a roadmap for Android Wear to follow.
This two-pronged abandonment of a unified, non-siloed feed across phones and wearables has entrenched the app paradigm for today and the foreseeable future.
Of course, Google didn’t operate in a vacuum. While it can pull a great deal of information from Gmail and your browsing interests, third-party apps are what made mobile successful, and developers wouldn’t want to give up control of their end-to-end experience.
None of this is to take away from the phone. The smartphone – and the tablet, which is an extension of it, if a neglected one until very recently – is possibly the most important form factor ever invented and will remain so for the foreseeable future. Case in point: five to six years after winding down Now, Glass, and Wear, Google is doubling down on the smartphone with custom silicon, is about to reboot its smartwatch effort, and the display technology for next-generation wearables remains years away.
However, everything is far from ideal.
Waiting for the next form factor
The app paradigm has to end. It’s restrictive and forces people to learn and abide by a dozen different ideas of how to access their information every day.
In the early days of the smartwatch, I assumed that a grid of icons couldn’t be crammed onto a wrist-sized screen. I was clearly wrong, but I sincerely believe the existing paradigm cannot carry over to smart glasses, due to a myriad of technical limitations:
- At first, the field of view on face-mounted wearables will be limited, so the information that gets displayed has to be prioritized. There needs to be much more curation and filtering of what’s important – see the sketch just after this list.
- Controls will likely be wrist-mounted – Jacquard, anyone? – and work like a trackpad. Unlike direct touch interaction, swiping is not as precise, and again, you can’t display as many touch targets on screen.
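Reusing the hypothetical `Card`, `CardSource`, and `buildFeed` definitions from the earlier sketch, that field-of-view constraint boils down to a hard display budget – the globally ranked feed truncated to however few cards the glasses can legibly render:

```kotlin
// Hypothetical: a narrow field of view fits only a couple of cards at once,
// so curation is simply the ranked feed cut down to that display budget.
fun glanceableFeed(sources: List<CardSource>, now: Instant, budget: Int = 2): List<Card> =
    buildFeed(sources, now).take(budget)
```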
The other thing to consider is that Google Lens-style visual search and augmented overlays will be the main UI and home screen. In the former case, you simply capture a scene and Google will hopefully surface the most interesting thing you’re looking at. The latter will mostly surface information as you navigate the world.
That said, it’s somewhat unfortunate that breaking the app paradigm and returning to a proactive, user-centric feed requires a whole new hardware form factor that’s still a few years away. The great shame is that this shift didn’t start years ago with smartwatches, when all the pieces were in place.
If Google had stuck to the basic concept of Now across all form factors, it could have familiarized people with a feed-based paradigm for using personal technology, and that would have been a key differentiator from the iPhone – or at the very least would have forced Apple to compete – all to the detriment of siloed apps.
The launch of the first Wear OS 3 watch tomorrow is a momentous occasion for competition, but it’s also a reminder of how much remains the same. Wrist-worn wearables could have been a chance to free us from the phone’s app paradigm, but instead, we continue to embrace it fully.