The crowd in Apple’s auditorium would have cheered loudly during the WWDC 2021 keynote segment dedicated to macOS Monterey, when Craig Federighi placed an iPad next to a MacBook to show how the Mac’s keyboard and mouse interaction can be extended to the tablet. But the keynote was an online-only event, so people had to cheer privately, or in small groups of friends and colleagues following Apple’s newest software announcements live. Either way, there’s no question that Universal Control was the most impressive demo of WWDC 2021.
It seemed like magic: Apple’s “just works” mantra taken to an incredible extreme. There was no wired connection between the MacBook and the iPad. It all happened instantly and wirelessly, with no setup required. Add an iMac to the mix, and the Universal Control demo gets even better. This might have been a well-choreographed routine, so real-life performance may vary. But Universal Control is an incredible feature that seems out of this world.
It turns out that Apple didn’t have to invent any new technology to get Universal Control working. It all builds on steps Apple took in previous macOS and iOS releases. By the same logic, Universal Control might be one of the major building blocks that helps Apple introduce the future of computing — the Minority Report-style interaction we’ve been dreaming about.
When Chief John Anderton (Tom Cruise) starts pulling the precogs’ visions apart in the film, looking for clues that could help him fight crime, the audience immediately falls in love with the user interface. Anderton uses his hands to zoom in, turn and twist, and project images all around him to get a better look. That computer UI concept, created in 2002, comes up every time real-world tech gets closer to Minority Report territory — we made the same parallel when Google unveiled the Project Soli radar in the Pixel 4 a few years ago. But even the film’s technology isn’t that user-friendly, as it involves several people interacting with the computer, not just Anderton. And it relies on physical devices to move data from one place to another. Here’s a reminder of that amazing Minority Report computing tech:
Universal Control might be how Apple starts doing something similar with its interconnected world of smart devices. It may only move the cursor between computers now, but Universal Control could eventually power UI modes that involve various devices, including a product Apple has yet to launch: augmented reality glasses. As it happens, Apple also announced an Apple Watch technology a few weeks ago that could work hand in hand with Universal Control in the future.
Before we get there, we have to understand that Universal Control relies on existing “magic” in macOS and iOS. The Verge learned precisely how the technology works: the Continuity and Handoff features that let users continue activities as they switch devices are at the core of Universal Control.
As long as the user is signed in to the same iCloud account on all devices, Universal Control is possible. Bluetooth will tell the gadgets they’re in close proximity, and the operating system will then gauge intent. It’ll realize the user wants to move the mouse cursor beyond the edge of the Mac’s screen, and that’s when it’ll move the cursor onto the iPad’s display.
A Wi-Fi Direct connection will then allow data to move back and forth between the two devices, making features like drag and drop possible — here’s Apple’s Universal Control demo again:
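Putting those pieces together, here’s a minimal Swift sketch of that handoff logic, assuming hypothetical types and names throughout; Apple doesn’t expose Universal Control as a public API, so this only illustrates the same-account, Bluetooth-proximity, and edge-of-screen intent checks described above.

```swift
import Foundation

// A minimal sketch of the handoff logic described above. All names,
// types, and thresholds are illustrative assumptions; Apple does not
// expose Universal Control as a public API.

struct NearbyDevice {
    let name: String
    let iCloudAccountID: String
    let isNearbyViaBluetooth: Bool
}

enum HandoffDecision {
    case stay
    case moveCursor(to: NearbyDevice)
}

/// Decides whether the cursor should jump to a nearby device once the
/// user pushes it past the edge of the current screen.
func evaluateHandoff(cursorX: Double,
                     screenWidth: Double,
                     currentAccountID: String,
                     candidates: [NearbyDevice]) -> HandoffDecision {
    // Intent signal: the cursor keeps moving beyond the screen edge.
    guard cursorX >= screenWidth else { return .stay }

    // Universal Control requires the same iCloud account on both devices;
    // Bluetooth proximity narrows the list to devices that are close by.
    guard let target = candidates.first(where: {
        $0.iCloudAccountID == currentAccountID && $0.isNearbyViaBluetooth
    }) else { return .stay }

    // The real system would now open a Wi-Fi Direct link so that
    // drag-and-drop payloads can follow the cursor across devices.
    return .moveCursor(to: target)
}

// Example: the cursor is pushed past the right edge of a 1440-point display.
let iPad = NearbyDevice(name: "iPad Pro",
                        iCloudAccountID: "user@icloud.com",
                        isNearbyViaBluetooth: true)
let decision = evaluateHandoff(cursorX: 1441,
                               screenWidth: 1440,
                               currentAccountID: "user@icloud.com",
                               candidates: [iPad])
print(decision) // prints something like: moveCursor(to: NearbyDevice(...))
```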
In the not-so-distant future, Apple might build on Universal Control’s protocols and algorithms to let users beam information back and forth between devices that don’t necessarily have to sit close to one another.
The functionality might be all the more critical for Apple’s smart glasses whenever they arrive, as the new product will likely need to work well with the entire ecosystem. And the sleek AR glasses Apple is reportedly working on will have to offer advanced features without necessarily having much processing power or storage of their own.
That’s where a combination of eye tracking, gesture tracking, and Universal Control might work together to beam and manipulate data across various devices.
The Apple Watch and other wearables might play a significant role in all of this, thanks to a brilliant innovation Apple quietly introduced a few weeks ago. Called AssistiveTouch, the feature translates hand gestures like pinches and clenches into actions on the Apple Watch display, using the watch’s built-in motion and heart rate sensors to detect subtle muscle movement and tendon activity. It “reads” the user’s intent, if you will. The feature is meant to let people who can’t use the touchscreen interact with the device.
But the same technology could easily be used in more complex interactions, detecting gestures and translating that intent into actions on a screen — here’s that Apple Watch AssistiveTouch demo from a few days ago:
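To picture how such a gesture-to-action layer might look, here’s a short Swift sketch. The gesture names mirror the ones Apple demoed for AssistiveTouch, but the mapping, types, and everything else are assumptions; Apple hasn’t published this pipeline as a developer API.

```swift
import Foundation

// Illustrative sketch of translating detected hand gestures into on-screen
// actions, in the spirit of AssistiveTouch. The mapping below is an
// assumption; Apple does not expose this pipeline to developers.

enum HandGesture {
    case pinch        // index finger and thumb tap together
    case doublePinch
    case clench       // full-hand squeeze
    case doubleClench
}

enum WatchAction {
    case moveFocusForward   // highlight the next on-screen element
    case moveFocusBackward
    case tapFocusedElement
    case showActionMenu
}

func action(for gesture: HandGesture) -> WatchAction {
    switch gesture {
    case .pinch:        return .moveFocusForward
    case .doublePinch:  return .moveFocusBackward
    case .clench:       return .tapFocusedElement
    case .doubleClench: return .showActionMenu
    }
}

// Example: a clench "taps" whatever element currently has focus.
print(action(for: .clench)) // tapFocusedElement
```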
For example, an Apple Watch near a MacBook might interpret a flick of the wrist, made while a movie is playing on the smaller MacBook or iPhone display, as an intent to project the same film onto a much bigger, virtual display that only the user sees through the AR glasses. The video would AirPlay to the glasses, and Universal Control would handle the interaction and data movement between all these connected Apple devices.
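Here’s what that flow might look like as a purely speculative Swift sketch; every type, device name, and function is hypothetical, standing in for whatever orchestration layer Apple might actually build.

```swift
import Foundation

// Purely speculative sketch of the wrist-flick scenario above.
// Every type, device name, and function here is hypothetical.

struct PlaybackSession {
    let title: String
    var targetDevice: String
}

/// Reacts to a wrist flick detected by the Watch while media is playing:
/// if AR glasses are among the nearby displays, hand the stream to them.
func handleWristFlick(session: inout PlaybackSession, nearbyDisplays: [String]) {
    guard let glasses = nearbyDisplays.first(where: { $0.contains("Glasses") }) else {
        return // No bigger virtual display around; keep playing where we are.
    }
    // A Universal Control-style transport would move the stream and the
    // interaction state; here we only record the new playback target.
    session.targetDevice = glasses
    print("Beaming \"\(session.title)\" to \(glasses)")
}

var session = PlaybackSession(title: "Minority Report", targetDevice: "iPhone")
handleWristFlick(session: &session, nearbyDisplays: ["iPad Pro", "Apple Glasses"])
// Prints: Beaming "Minority Report" to Apple Glasses
```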
Voice control via Siri will likely be part of any computing experience that involves advanced eye and gesture recognition, and it would make the scenario above even simpler to execute. Rather than relying on gestures and intent detection, telling Siri to beam the movie to a different device might do the trick.
Ultimately, touching a screen or a mouse to beam that video to a different Apple gadget would get the same result, but the combination of gestures and voice commands would reduce the need to touch screens, mice, and keyboards to get the job done.
In time, use cases would evolve toward a Minority Report-like experience, where multiple users interact with multiple elements across multiple devices via a combination of gestures and voice commands. But if we ever get there, we probably won’t have to handle physical objects to actually transfer data from one gadget to another, as happens during that iconic scene.