AURI X1
See What You Missed
AI Camera Earbuds — Extending Human Senses
When you run with your favorite music, you feel like the hero of Mission: Impossible. But if someone captures that moment on their phone, you see a sweaty, struggling middle-aged person.
Current devices can't bridge this gap:
"Billions of people move every day. None of them can effortlessly film themselves like a Nike commercial."
AURI X1 captures egocentric wide-angle video from the most stable point on your body: your ear. AI transforms it into cinematic third-person footage that looks as if a drone were following you.
The ear is the body's most stable mounting point: unlike glasses (head movement) or a chest mount (breathing motion), it acts as the human body's natural gimbal.
A single fisheye lens simultaneously captures the forward environment, the peripheral surroundings, and the wearer's own body (shoulders, arms, legs).
500M+ people already wear earbuds daily. No new behavior needed. None of the social stigma attached to camera glasses. Invisible data collection.
• Third-person cinematic video (follow-cam, orbit, hero shot)
• AI-triggered photo capture of meaningful moments
• 3D body scan + virtual try-on from ear-level capture
• Privacy mode: audio only, camera off
Ear-clip form factor. Comfortable all day. Non-invasive.
• Skeleton extraction from partial body visibility + IMU
• Volumetric scene reconstruction via 3D Gaussian Splatting
• Autonomous cinematography: follow-cam, orbit, hero shot, music sync
• Cinematic output with style transfer + generative enhancement (pipeline sketched below)
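For illustration only, here is a minimal sketch of how these four stages could chain together. Every name in it (Frame, extract_skeleton, fit_gaussian_splats, plan_camera_path, render_stylized) is a hypothetical placeholder, not AURI's actual implementation.

```python
# Hypothetical sketch of the four-stage ego-to-exo pipeline.
# All names and data shapes are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Frame:
    """One fisheye frame plus the IMU sample captured with it."""
    pixels: bytes           # raw fisheye image (stub)
    imu: tuple[float, ...]  # accel + gyro reading at capture time


@dataclass
class Skeleton:
    """3D joint positions recovered for one frame."""
    joints: list[tuple[float, float, float]]


def extract_skeleton(frame: Frame) -> Skeleton:
    # Stage 1: infer full-body pose from the partial self-view
    # (shoulders, arms, legs in frame) fused with IMU motion cues.
    return Skeleton(joints=[(0.0, 0.0, 0.0)])  # stub


def fit_gaussian_splats(frames: list[Frame]) -> dict:
    # Stage 2: reconstruct a volumetric scene with 3D Gaussian
    # Splatting so novel viewpoints can be rendered later.
    return {"num_input_frames": len(frames)}  # stub scene model


def plan_camera_path(scene: dict, skeletons: list[Skeleton],
                     style: str = "follow-cam") -> list[tuple[float, float, float]]:
    # Stage 3: autonomous cinematography. Choose a virtual camera
    # trajectory (follow-cam, orbit, hero shot) around the subject.
    return [(0.0, 1.5, -2.0)] * len(skeletons)  # stub path


def render_stylized(scene: dict, path: list) -> list[bytes]:
    # Stage 4: render the virtual camera path from the splat scene,
    # then apply style transfer / generative enhancement.
    return [b"<rendered frame>" for _ in path]  # stub output frames


def ego_to_exo(frames: list[Frame]) -> list[bytes]:
    """Chain the four stages: ego-view frames in, exo-view video out."""
    skeletons = [extract_skeleton(f) for f in frames]
    scene = fit_gaussian_splats(frames)
    path = plan_camera_path(scene, skeletons, style="orbit")
    return render_stylized(scene, path)


if __name__ == "__main__":
    clip = [Frame(pixels=b"", imu=(0.0,) * 6) for _ in range(3)]
    print(f"rendered {len(ego_to_exo(clip))} third-person frames")
```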
85 claims across 16 subsystems
Including: ego-to-exo synthesis, autonomous cinematography, hardware-rooted reality proof, predictive safety, multi-person collaborative sensing, embodied AI data engine, and more.
Opportunity at the Intersection of Three Markets
AURI sits at the center: a wearable that is BOTH a consumer product AND an enterprise data collection platform.
Tesla's genius wasn't the car — it was turning millions of drivers into free data labelers. Every mile driven improves FSD for everyone.
AURI does the same: Every minute a user wears our earbuds generates egocentric video data that trains world models and robot AI.
20,854 hours of egocentric video → 54% improvement in robot dexterity. Log-linear scaling law with R²=0.9983. NO saturation.
The more data, the better the AI. There is no ceiling.
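The deck cites only the fit quality, not the fitted curve itself. For readers unfamiliar with the term, a log-linear scaling law has the generic form below; the coefficients a and b are assumptions for illustration, not numbers from the cited study.

```latex
% Generic log-linear scaling law (illustrative form only; the source
% reports R^2 = 0.9983 for the fit but not the coefficients).
\[
  \mathrm{performance}(h) = a + b \log h
\]
% h = hours of egocentric training video; a, b = regression constants.
% Since log h is unbounded, the fitted curve never flattens out,
% which is what "no saturation" means here.
```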
• Build AI: $22M raised, 100K hours of factory egocentric video
• KLED AI: ~$100M valuation, Solana-based human data marketplace
• Meta Ego-Exo4D: 1,422 hours of paired ego+exo data, 15 universities
Current cost to collect egocentric data: $50+/hour (hired actors). AURI users generate it for free.
Kickstarter $299, retail $399. BOM target <$80.
AI rendering: $9.99/month. Cloud processing for ego-to-exo video.
B2B: anonymized egocentric data licensed to robotics and world-model companies. Per-hour pricing.
Hardware is the launchpad. Subscription is the booster. Data licensing is orbit.
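A back-of-envelope implication of these price points (my arithmetic, not a figure stated in the deck): with BOM under $80, and before logistics, assembly, and channel costs, hardware gross margin works out to roughly

```latex
\[
  \frac{399 - 80}{399} \approx 80\% \ \text{at retail}, \qquad
  \frac{299 - 80}{299} \approx 73\% \ \text{on Kickstarter}.
\]
```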
Avin Wang (Meta Superintelligence Labs) · Yilin Zhu (Apple ML, Stanford CS PhD) · Growing network of Stanford, Tsinghua, and Silicon Valley talent
Shawn Gong | CEO/Founder
LUCKEY GLOBAL LLC | Delaware, USA + Singapore