Eye Tracking + Voice + AI: A Former Apple/Meta Builder Demos Gaze-Driven Agent Interaction
A SwiftUI and ARKit prototype uses eye tracking as a "micro-intent" signal, letting users direct an AI agent simply by looking at objects and speaking naturally.