What interface? How Apple’s iOS 7 will change the way you ‘think’
“We believe technology is at its best, at its most empowering, when it simply disappears.”
— Jonathan Ive, Senior Vice President, Design, Apple Inc.
Apple predicts that, within a matter of hours, hundreds of millions of people will download its new iOS 7 operating system for iPhone, iPad and iPod touch devices. The company anticipates shipping its 700 millionth iOS device within weeks. The hardware features of the new iOS 7-based devices have received the most consumer buzz this past week, from the vivid case colors of the iPhone 5c to the enhanced camera system, more powerful chipset and integrated fingerprint sensor of the iPhone 5s. But what is truly revolutionary about iOS 7 is what has all but disappeared: the interface.
Death of the Click
Apple calls iOS 7 a “pure representation of simplicity.” Essentially, it’s an interface that isn’t meant to be noticed, a shift in how Apple’s mobile interfaces are designed and deployed. Here’s what fascinates me: before June 2013, every iOS release drew attention for how the interface looked, from the glossy buttons, subtle textures and drop shadows to the reassuring, almost-like-real renderings of wood, plastic, fabric and metal. Today, all of those design treatments are beside the point. With iOS 7, Apple has stripped away nearly every trace of texture, gloss and almost-like-real rendering in an effort to bring content to the forefront.
This follows a logical progression for Apple. When Apple introduced the iPhone and iOS in 2007, it leapt into a new user interface paradigm: multi-touch. For the first time, Apple’s users could touch a screen and move elements around without the aid of a mouse. This was a tremendous step forward, removing layers of abstraction between the user and the content.
Consider a multi-touch ‘swipe’ versus a ‘mouse click.’ In Pavlovian terms, we’ve been conditioned to push (or click) a button on a mouse to make something happen. Moving the mouse, which is tethered to our computer by a cable, moves a cursor icon on our monitor’s screen; clicking initiates an action that represents our intention. This has been the case since 1984, when Apple popularized the graphical user interface and the computer mouse. But the future of user interfaces steps away from this antiquated notion: the goal now is for user intention itself to become the interface. In 2007, Apple accomplished this by removing the indirect steps (like clicking a mouse) between the user wanting something to happen and it actually happening. Suddenly, you could touch your content right on the surface of the screen.
Intentionally Intuitive
Today’s iOS 7 brings users even closer to their content, with an interface carefully engineered to infer the user’s intention wherever possible, almost as if it could read minds. For instance:
- On-board sensors determine whether the user is stationary, walking, running or in a moving vehicle, and the interface adjusts accordingly: the maps app switches automatically from driving directions to walking directions once it detects that you are walking down the street rather than driving (see the first sketch after this list).
- The calendar app calculates the distance between your current location and the location of your next scheduled meeting, automatically blocking out travel time before the event and alerting you when you need to leave in order to arrive on time.
- The multitasking feature learns when you like to use each app and refreshes its content before you launch it, so there’s no lag time (see the second sketch after this list).
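Apple exposes a slice of this inference machinery to developers. The activity detection behind the first bullet surfaces in the CoreMotion framework as CMMotionActivityManager. Here is a minimal sketch (shown in modern Swift for readability; the TravelMode enum and switchDirections function are hypothetical stand-ins for an app’s own routing logic):

```swift
import CoreMotion

// Hypothetical stand-ins for an app's own routing logic.
enum TravelMode { case driving, walking }

func switchDirections(to mode: TravelMode) {
    print("Switching directions to \(mode)")
}

let activityManager = CMMotionActivityManager()

func startInferringIntent() {
    // Activity classification requires motion-coprocessor hardware (M7 or later).
    guard CMMotionActivityManager.isActivityAvailable() else { return }

    activityManager.startActivityUpdates(to: .main) { activity in
        // Ignore low-confidence classifications.
        guard let activity = activity, activity.confidence != .low else { return }

        if activity.automotive {
            switchDirections(to: .driving)
        } else if activity.walking || activity.running {
            switchDirections(to: .walking)
        }
        // Stationary or unknown: leave the current mode alone.
    }
}
```

Because the classification runs on the iPhone 5s’s dedicated M7 motion coprocessor, sampling the sensors costs the main CPU, and the battery, almost nothing.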
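The app refreshing in the last bullet is likewise available to developers as iOS 7’s Background Fetch API: the system watches when an app is typically used and wakes it shortly beforehand. A minimal sketch, again in Swift, with refreshContent standing in as a hypothetical placeholder for an app’s real networking code:

```swift
import UIKit

// Sketch of iOS 7's Background Fetch API. The app must also declare
// the "fetch" background mode in its Info.plist.
class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Let the system schedule fetches based on observed usage patterns.
        application.setMinimumBackgroundFetchInterval(UIApplication.backgroundFetchIntervalMinimum)
        return true
    }

    // Called by the system at a moment it predicts is just before launch.
    func application(_ application: UIApplication,
                     performFetchWithCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
        refreshContent { gotNewData in
            completionHandler(gotNewData ? .newData : .noData)
        }
    }

    // Hypothetical: an app's real networking code goes here.
    private func refreshContent(completion: @escaping (Bool) -> Void) {
        completion(true)
    }
}
```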
In its developer transition guide for iOS 7, Apple outlines three themes it deems critical for developing within the new operating system: deference, clarity, and depth. Borderless buttons, dynamic type, translucent controls, and realistic motion are just some of the elements Apple has incorporated to make this happen.
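Dynamic type is a concrete example of that deference: rather than hard-coding a point size, an app asks the system for a semantic text style and inherits whatever reading size the user has chosen in Settings. A minimal Swift sketch (assuming the label lives in some on-screen view):

```swift
import UIKit

// Ask for a semantic text style instead of a fixed point size; the
// returned font tracks the user's preferred reading size.
let label = UILabel()
label.font = UIFont.preferredFont(forTextStyle: .body)

// Re-apply the style whenever the user changes the size in Settings.
NotificationCenter.default.addObserver(
    forName: UIContentSizeCategory.didChangeNotification,
    object: nil,
    queue: .main
) { _ in
    label.font = UIFont.preferredFont(forTextStyle: .body)
}
```

Realistic motion gets similar first-class support through UIKit Dynamics, the physics-based animation engine introduced alongside iOS 7.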
Apple’s secret here is that iOS 7 doesn’t APPEAR to be groundbreaking. Biometric technology like the iPhone 5s’s new fingerprint sensor has been around for years. But, as Apple states boldly in its iPhone 5s marketing materials, “It’s not just [about] what’s technologically possible. But what’s technologically useful. It’s not just what’s next. But what should be next.”
Thought: The Next UI Frontier?
Imagine what would need to happen for the interface to disappear completely. We’ve gone from clicks to swipes, and on to voice commands with Apple’s Siri and Google Now. Is the next frontier of user interfaces eye tracking, blinks and nods (Google Glass)? Is it gestural motion (Microsoft Kinect for Xbox and Windows)? What about thought? If Apple’s goal is to remove abstraction between users and their content, surely thought is the most direct route. As far-fetched as it may sound, this isn’t the stuff of science fiction.
In this potential future, tech won’t just appear to read your mind through predictive design. It will literally receive messages from your brain, as if you were virtually clicking that beloved, proverbial mouse button. How? Perhaps the most promising indication comes from the campus of the University of Washington, where, just last month, a researcher sent a brain signal via the Internet to control the hand motion of a fellow researcher at a separate location.
The groundbreaking hand motion that was achieved? It was the push of a button. Putting aside the obvious poetic irony, this moment will serve as a major step in the virtual disappearance of the interface as we know it.
As for iOS 7, consider it an exercise in inferring your intention without the aids we’ve come to know and expect. For current iPhone and iPad users, this will be a major adjustment. But make no mistake: the adjustment Apple is asking of us today lays the groundwork for future generations (likely, our own grandchildren) to wonder, “What does ‘click here’ mean?”