
AURI X1

See What You Missed

AI Camera Earbuds — Extending Human Senses

[ Hero Image ]

The Gap Between Who You Are and What the Camera Shows

When you run with your favorite music, you feel like a hero from Mission Impossible. But if someone captures that moment on their phone? You see a sweaty, struggling middle-aged person.

Current devices can't bridge this gap:

  • Smart glasses: Shake violently during motion, slide with sweat, compress your nose
  • Selfie sticks & drones: Occupy your hands, disrupt your activity, attract attention
  • GoPro/action cams: Shaky first-person footage no one wants to watch

"Billions of people move every day. None of them can effortlessly film themselves like a Nike commercial."

[ Comparison Image ]

Wear Earbuds. Get Cinematic Video.

AURI X1 captures egocentric wide-angle video from the most stable point on your body — your ear. AI transforms this into cinematic third-person footage that looks like a drone is following you.

1
Wear
2
Live your life
3
Get cinematic video
[ Demo Video Coming Soon ]

The Golden Point of Human Biomechanics

01

Stability

The ear is the body's most stable mounting point. Unlike glasses (jolted by head movement) or a chest mount (disturbed by breathing), the ear is the human body's natural gimbal.

02

170° Field of View

A single fisheye lens captures forward environment, peripheral vision, AND the wearer's own body (shoulders, arms, legs) simultaneously.

03

Zero Friction

500M+ people already wear earbuds daily. No new behavior needed. No social stigma like camera glasses. Invisible data collection.

[ FOV Diagram ]

Four Modes. One Device.

🎬

Motion

Third-person cinematic video (follow-cam, orbit, hero shot)

📸

Moment

AI-triggered photo capture of meaningful moments

👤

Model

3D body scan + virtual try-on from ear-level capture

🔇

Muted

Privacy mode: audio only, camera off

Ear-clip form factor. All-day comfort. Non-invasive.

[ Product Render ]

From First-Person to Third-Person: The Pipeline

STEP 01

EgoPose

Skeleton extraction from partial body visibility + IMU

STEP 02

3D Reconstruction

Volumetric scene via 3D Gaussian Splatting

STEP 03

AI Director

Autonomous cinematography: follow-cam, orbit, hero shot, music sync

STEP 04

Neural Render

Cinematic output with style transfer + generative enhancement
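The four steps above can be sketched end to end as a simple pipeline. This is an illustrative stub, not AURI's actual implementation — every class, function name, and data shape here is a hypothetical stand-in for the stages named on this slide.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    """One egocentric fisheye frame plus its synchronized IMU sample."""
    pixels: bytes
    imu: tuple  # (ax, ay, az, gx, gy, gz) — hypothetical layout

def extract_ego_pose(frames: List[Frame]) -> list:
    """STEP 01 — EgoPose: fuse partial body visibility with IMU
    readings into a per-frame skeleton estimate (stubbed)."""
    return [{"frame": i, "skeleton": "stub"} for i, _ in enumerate(frames)]

def reconstruct_scene(frames: List[Frame], poses: list) -> dict:
    """STEP 02 — 3D Reconstruction: build a volumetric scene,
    e.g. via 3D Gaussian Splatting (stubbed)."""
    return {"gaussians": len(frames), "poses": poses}

def plan_camera_path(scene: dict, style: str = "follow-cam") -> list:
    """STEP 03 — AI Director: choose a virtual camera trajectory
    (follow-cam, orbit, hero shot) through the reconstructed scene."""
    return [{"t": p["frame"], "shot": style} for p in scene["poses"]]

def neural_render(scene: dict, path: list) -> list:
    """STEP 04 — Neural Render: render the virtual camera path into
    cinematic third-person frames (stubbed)."""
    return [f"frame_{kf['t']}_{kf['shot']}" for kf in path]

def ego_to_exo(frames: List[Frame], style: str = "follow-cam") -> list:
    """First-person frames in, third-person cinematic frames out."""
    poses = extract_ego_pose(frames)
    scene = reconstruct_scene(frames, poses)
    path = plan_camera_path(scene, style)
    return neural_render(scene, path)
```

The point of the sketch is the data flow: raw ego footage never reaches the user directly — it is re-shot by a virtual camera inside a reconstructed scene.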

85

Protected by PPA (63/999,137)

85 claims across 16 subsystems

Including: ego-to-exo synthesis, autonomous cinematography, hardware-rooted reality proof, predictive safety, multi-person collaborative sensing, embodied AI data engine, and more.

$50B+

Opportunity at the Intersection of Three Markets

$40B
TWS Earbuds
by 2028
$5.2B
Action Cameras
by 2028
$4.2B
AI Training Data
by 2028

AURI sits at the center: a wearable that is BOTH a consumer product AND an enterprise data collection platform.

The Tesla FSD Playbook, Applied to Human Movement

Tesla's genius wasn't the car — it was turning millions of drivers into free data labelers. Every mile driven improves FSD for everyone.

AURI does the same: Every minute a user wears our earbuds generates egocentric video data that trains world models and robot AI.

Evidence: Log-Linear Scaling With No Ceiling

NVIDIA EgoScale (Feb 2026)

20,854 hours of egocentric video → 54% improvement in robot dexterity. Log-linear scaling law with R²=0.9983. NO saturation.

The more data, the better the AI. There is no ceiling.
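A log-linear scaling law can be made concrete in a few lines. The data points below are synthetic, not NVIDIA's EgoScale measurements — they only demonstrate the shape of the claimed relationship: each 10x increase in data hours adds a constant performance gain, so the curve never flattens.

```python
import numpy as np

# Synthetic "dexterity score" following score = a + b * log10(hours).
# These numbers are illustrative placeholders, not EgoScale data.
hours = np.array([100, 500, 2_000, 8_000, 20_854], dtype=float)
score = 10.0 + 12.0 * np.log10(hours)

# Fit the log-linear law: regress score on log10(hours).
b, a = np.polyfit(np.log10(hours), score, deg=1)  # slope, intercept

def predict(h: float) -> float:
    """Predicted score at h hours of egocentric video."""
    return a + b * np.log10(h)

# No saturation: every additional decade of data adds exactly +b.
gain_per_decade = predict(10_000) - predict(1_000)
```

Under such a law, whoever collects data cheapest at scale compounds an advantage indefinitely — which is the strategic claim of this slide.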

Market Validation

Build AI: $22M raised, 100K hours factory egocentric video

KLED AI: ~$100M valuation, Solana-based human data marketplace

Meta Ego-Exo4D: 1,422 hours ego+exo paired data, 15 universities

Competitive Advantage

Current cost to collect egocentric data: $50+/hour (hired actors). AURI users generate it for free.

[ Flywheel Diagram ]

Early but Moving Fast

$650K
Angel funding secured
(Chen Danian, Xiao Lianfeng)
85
Patent claims filed
(PPA 63/999,137)
1
B2B customer in pipeline
(Orbifold AI, $5M seed)
5+
Suppliers evaluated
(Shenzhen, Huaqin ODM)

Roadmap

Q1 2026
Demo complete, PPA filed
Q2 2026
Kickstarter launch ($299 consumer, B2B pilot)
Q3 2026
First batch production
Q4 2026
Data platform beta, Series A

Three Revenue Layers

60%

Hardware

Kickstarter $299, retail $399. BOM target <$80.

25%

Subscription

AI rendering: $9.99/month. Cloud processing for ego-to-exo video.

15%

Data Licensing

B2B: anonymized egocentric data sold to robot/world model companies. Per-hour pricing.

Hardware is the launchpad. Subscription is the booster. Data licensing is orbit.

Built to Ship Hardware + AI

👨‍💼
Shawn Gong
CEO / Founder
Forbes 30 Under 30 (Manufacturing). BMW #NEXTGen 2040 Global Champion. 33 granted patents. Tsinghua University. Previous: Nums (smart trackpad, global distribution, Unbox Therapy 1.7M+ views).
👨‍💼
Mingmin She
Head of Product Ops
Ex-Google 7 years (PM). UCLA Anderson MBA. Kickstarter experience. Deep Shenzhen supply chain network.

Advisory Board

Avin Wang (Meta Superintelligence Labs) · Yilin Zhu (Apple ML, Stanford CS PhD) · Growing network of Stanford, Tsinghua, and Silicon Valley talent

Seed Round: $3-5M

Use of Funds

40% Hardware: prototype → mass production tooling
25% AI/Software: ego-to-exo pipeline, data platform MVP
20% Go-to-Market: Kickstarter campaign, creator partnerships
15% Operations: team, legal, IP

Milestones After Funding

Month 1-3
Final prototype, Kickstarter pre-launch
Month 4-6
Kickstarter live, 1000+ units ordered
Month 7-9
Production, first shipments
Month 10-12
Data platform beta, Series A prep

Let's Build the Future of Human-Centered AI

Shawn Gong | CEO/Founder

shawn@luckey.to

wearluckey.com

LUCKEY GLOBAL LLC | Delaware, USA + Singapore