2 Min Read / Blog / 3.2.2020
The new iPhone 7 features a redesigned Taptic Engine—Apple’s marketing name for the phone’s unconventional vibration motor—that powers a wide range of haptic effects across the device. The iPhone 7 home button, most notably, isn’t a button at all, but instead simulates a button click with targeted vibrations from the Taptic Engine. Apple relied on the Taptic Engine in iPhone 6s to give some gravitas to its 3D Touch effect, physically letting users know when they’d activated a Peek or Pop. But with iOS 10 and iPhone 7, Apple is opening access to the Taptic Engine to third-party app developers, allowing myriad new fine-tuned vibration effects to populate iOS apps from the App Store.
Apple has already deployed Taptic Engine effects across many of its system apps on iPhone 7, including adding subtle taps indicating a switch has been flipped in Settings or a series of pulses to accent a scrolling date picker in Calendar. And now, iOS app developers can begin experimenting with haptic effects in their own apps, as well, adding a physicality and playfulness to various parts of their interfaces. There’s a wide range of possible haptic effects available to iOS app developers, ranging from scarcely perceptible taps to strong and lasting shakes, which can each lend a different feeling to an iOS app user experience.
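The range of effects described above maps onto the `UIFeedbackGenerator` classes that iOS 10 introduced. A minimal sketch of the three generator types—impact, selection, and notification—might look like this (the surrounding setup is illustrative; in a real app these calls would live inside view controller or gesture-handler code):

```swift
import UIKit

// Impact feedback: a physical "thud" in three intensities,
// from scarcely perceptible to strong.
let lightImpact = UIImpactFeedbackGenerator(style: .light)
lightImpact.impactOccurred()

let heavyImpact = UIImpactFeedbackGenerator(style: .heavy)
heavyImpact.impactOccurred()

// Selection feedback: the subtle tick that accents scrolling
// pickers, like the date picker in Calendar.
let selection = UISelectionFeedbackGenerator()
selection.selectionChanged()

// Notification feedback: distinct pulses for success, warning,
// and error outcomes.
let notification = UINotificationFeedbackGenerator()
notification.notificationOccurred(.success)
```

Each generator produces a different class of Taptic Engine effect, which is what lets developers match the weight of the vibration to the weight of the interaction.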
iOS app developers can begin experimenting with haptic effects that add physicality or playfulness to parts of their apps.
These new haptic effects make natural sense for the booming industry of iOS games—new games have control over device vibrations just like an Xbox or PlayStation controller does. But like Force Touch on Apple Watch, these Taptic Engine APIs offer iOS app developers an opportunity to bring a new dimension to their iOS apps’ user experiences, even beyond the context of games. Actions as simple as opening a menu or swiping to a new screen might not be significant or rare enough to warrant their own Taptic Engine effects. But these subtle vibrations can lend gravity and clarity to the actions app developers want to emphasize, like deleting a photo or reaching the end of a long form.
Like first-generation haptic effects on Android devices (or on more adventurous and odd devices like the oft-forgotten BlackBerry Storm), there is a risk that these haptic effects will be abused or come across as a gimmick. As with any user experience tool in the iOS app development arsenal, it’s important to remember the actual value that these effects can deliver to end users. Haptics have the potential to aid users’ understanding of in-app actions, provide confirmation that a task has been performed, and ultimately add to the experience of interacting with software. Deployed correctly, they can fundamentally transform a UX from something expected and ordinary into something novel and delightful.
Far more than being a needless gimmick, haptics have the potential to help users better understand the actions they’re performing within iOS apps.
As iPhone hardware continues to evolve, Apple seems intent on removing end-user–facing hardware complexity—like physical buttons and ports—and instead relying on more sophisticated, multi-purpose, and miniaturized solutions like wireless connectivity and haptics. It’s clear that Apple’s vision of its hardware future features zero moving parts, zero ports, and a multitude of internal complexity that offers users an approximation of the same features. There exists some future iPhone that looks something akin to the monoliths from 2001: A Space Odyssey: a featureless slab of polished glass that suggests no immediate interactivity and communicates through some inscrutable combination of wireless signals and haptic wiggles. Thankfully, it seems certain that Apple will provide iOS app developers with the APIs to match.