Unlocking iOS 10: the importance of continuity and context
Small gestures, big change. Apple announces sweeping updates to better align interaction and visual design across its four software platforms.
by Michael Parker, Interaction Designer, Wilson Fletcher
Apple kicked off its annual Worldwide Developers Conference (WWDC) yesterday with a two-hour keynote packed with announcements of new software arriving in the coming months. This year, they announced major enhancements to watchOS, macOS (formerly OS X), tvOS and iOS, releasing beta versions to developers for testing. Interestingly, the sentiment this year has shifted from last year’s feature catch-up with Android to a focus on improving the end-user experience. There was little focus on the APIs and technical stacks the audience of developers would be interested in, and more of a holistic view of Apple’s platforms and their ability to create an omnichannel experience for customers.
Although a plethora of new features was announced, I found the most interesting takeaway to be the changes to the iPhone lock screen. The next iOS makes the biggest interaction departure in nine years of iPhone. From iOS 10, users will no longer slide to unlock their iPhone. Think about that for a second. This is an interaction iPhone owners have performed around 80 times a day for up to nine years: certainly one of the most prolific digital behaviours. Yet even after only a few hours with the beta, I understand why it’s been changed and what it means more broadly for interaction design and product evolution.
It all started a few years ago with the introduction of Touch ID, the fingerprint recognition sensor built into the iPhone’s home button. Even Apple was pretty blunt about it: the technology is so good and so quick that users waking their device with the home button were missing their notifications on the lock screen. It’s a good problem to have, but when you’ve spent nine years training people to use their device in this manner, it isn’t an easy one to solve.
The first solution is a motion-based wake, much like the Apple Watch’s. iPhone 6s users can expect their iPhone to light up when raised, without having to push any buttons: perfect for checking the time, weather or SMSs without accidentally jumping to the home screen. Swiping left to right now reveals a set of widgets (rather than a PIN pad): useful glances of information like your next meeting or the battery level of your headphones. The camera also moves to a swipe action (right to left), bringing the interaction in line with Android and freeing up the top and bottom swipes for the notification and control centre drawers. The iPhone lock screen now has a swipe gesture in all four directions, and each direction has a specific function, which is a massive win for simplicity and for building cognitive recognition of the interface.
The second solution comes as a consequence of the new lock screen layout. Apple has taken the step of treating ‘unlock’ and ‘open’ as separate states of the lock screen. An iPhone can now be on the lock screen and unlocked; it happens through a set of specific interaction intentions from the user. Pushing the home button ‘opens’ the phone, taking the user to the home screen or their last app regardless of whether the screen is on or off (much like Touch ID does now). When the screen is on, the behaviour is a little different: a small lock symbol at the top indicates whether the device is locked. This fades out when a finger is placed on the Touch ID sensor, the text at the bottom changes from “push home to unlock” to “push home to open”, and that’s exactly how it works — you can just push the home button to open, or to unlock and open the device.
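To make the unlock/open distinction concrete, here’s a minimal Swift sketch of the lock screen as a small state machine, as I understand it from the beta. The type and function names here are my own assumptions for illustration, not Apple’s API:

```swift
// A sketch (not Apple's API) of the iOS 10 lock-screen states described above,
// with 'unlocked' and 'open' treated as separate states.
enum LockScreenState {
    case locked     // screen on, lock icon shown: "push home to unlock"
    case unlocked   // Touch ID matched, still on the lock screen: "push home to open"
    case open       // home screen or last-used app
}

enum UserAction {
    case restFingerOnTouchID  // authenticates without a press
    case pushHome             // opens, authenticating first if needed
}

// Hypothetical transition function illustrating the flow.
func transition(from state: LockScreenState, on action: UserAction) -> LockScreenState {
    switch (state, action) {
    case (.locked, .restFingerOnTouchID):
        return .unlocked  // lock icon fades; prompt changes to "push home to open"
    case (.locked, .pushHome), (.unlocked, .pushHome):
        return .open      // a single press goes to the home screen or last app
    default:
        return state      // other actions leave the state unchanged
    }
}
```

The point of the model is that resting a finger on Touch ID and pushing the home button are distinct intentions: one authenticates while keeping the lock screen (and its notifications) visible, the other opens the device.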
There’s an interesting affordance here in treating the home button universally as the button to access the home screen, both from within an app and from the lock screen.
This comes at massive risk for Apple: they’re taking arguably the most iconic interaction of the iPhone and replacing it, potentially leaving millions of people frustrated, angry or confused about the simplest function of their familiar phone. There’s currently no onboarding process for the change, and even though the software’s in beta, I don’t expect there to be one in the final release. The trouble with onboarding is that it admits fault in the product’s ability to provide an intuitive experience at introduction, a dilemma every digital product owner has faced. Do we risk alienating users at first use, or hold their hands through the delightful interactions we’ve spent so long crafting? The answer is usually somewhere in between, but the general rule is that if you need a manual for your app, you’re doing it wrong (*cough* VSCO).
The difference here is that Apple has actually recognised that the behaviour of its users has evolved beyond the functional interactions originally designed. It’s a case of product feature displacement: Touch ID performs so well that it is actually breaking the simplicity of ‘slide to unlock’. Interaction designers, myself included, love to keep patterns the same. Patterns test well, they’re often simpler, they almost always align with the client’s human interface guidelines, and stakeholders like their familiarity. Apple is probably one of the few companies with the luxury of defining its own user interaction behaviours and pushing them on its customers; they did invent the whole damn app economy. However, changing the unlock behaviour is an excellent case study in user-first design: the company notorious for pushing its own design agenda has recognised that people are using its products as intended, but now with great frustration. It’s refreshing to see designers’ egos set aside and a digital product updated to reflect the way in which it’s actually used.
I’m dwelling on this small but rather significant change because it’s a timely reminder to never set and forget. No matter its significance or convenience, each interaction must support its own contextual function. Even at the great risk of alienating users, simplifying and better supporting the context of users is what gives digital products and services longevity. If Apple can be brave enough to do something genuinely hard and drastic for long-term benefit, so can any product or service. I’m excited to see how this plays out: to watch and learn how the change process is managed, who struggles, and how long it takes before #swipegate starts trending on Twitter.
People generally hate change, so if Apple can somehow pull this off seamlessly, pay close attention to what you don’t hear about when it’s released in September.