CONFIDENTIAL — FOR INTENDED RECIPIENTS ONLY
LUCKEY — Seed Round 2026

The Data Infrastructure
for Physical AI

Every robotics and world model company is hitting the same wall: not enough real-world data. LUCKEY turns billions of earbud users into the world's largest egocentric data collection network, while giving consumers cinematic video they actually want. Current collection costs $75+/hr. Ours approaches zero.

AURI X1 ear-mounted camera earbud
$650K Angel funded
85 Patent claims filed
B2B LOI Orbifold AI (Physical AI data)
The Problem

Physical AI is starving for data

Foundation models for robotics, world simulation, and embodied intelligence require massive amounts of real-world egocentric data. Current collection methods can't scale.

01

Hired actors cost $75+/hr

Research labs hire people to perform tasks while wearing cameras. Expensive, limited diversity, artificial behavior. It doesn't scale beyond a few thousand hours.

02

Robot teleoperation is narrow

Teleoperated data only covers lab environments — kitchens, tabletops, warehouses. It misses the vast diversity of real human life: commuting, cooking, exercising, socializing.

$75+/hr Current egocentric data collection cost
~100K hrs Total egocentric data in existence
No ceiling Log-linear scaling (NVIDIA EgoScale)
The Solution

Hardware collects.
AI pipeline distills.
Robots learn.

AURI X1 is a pair of camera earbuds — fisheye lenses mounted at the ear, the most biomechanically stable point on the human body. Users wear them to exercise, commute, and live their lives while listening to music. The cameras capture continuous wide-angle egocentric video.

Our AI pipeline transforms this raw footage into structured embodied data: skeleton trajectories, action semantics, object interactions, 3D scene reconstructions. Data that robotics companies can directly use for training.
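To make "structured embodied data" concrete, here is a minimal sketch of what one training record might look like. The field names and schema are hypothetical illustrations, not LUCKEY's actual customer format.

```python
from dataclasses import dataclass, field

# Hypothetical record for one timestep of distilled egocentric data.
@dataclass
class EmbodiedSample:
    timestamp_s: float            # seconds since session start
    skeleton: list                # [(joint_name, x, y, z), ...]
    action_label: str             # semantic action, e.g. "pour_water"
    objects: list = field(default_factory=list)  # interacted object ids

def to_training_record(sample: EmbodiedSample) -> dict:
    """Flatten a sample into the kind of dict a customer schema might expect."""
    return {
        "t": sample.timestamp_s,
        "pose": {name: (x, y, z) for name, x, y, z in sample.skeleton},
        "action": sample.action_label,
        "objects": sample.objects,
    }

sample = EmbodiedSample(
    timestamp_s=12.4,
    skeleton=[("wrist_r", 0.31, 1.02, 0.45)],
    action_label="pour_water",
    objects=["kettle", "cup"],
)
record = to_training_record(sample)
```

The point of the sketch: raw video compresses into a compact, queryable record per timestep, which is what makes per-customer schema mapping and API delivery tractable.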

Users get something they want — AI-generated cinematic third-person video of their activities. We get something the industry needs — diverse, real-world egocentric data at near-zero marginal cost.

Virtual drone follow-cam while running
VIRTUAL DRONE CAPTURE
AI-generated cinematic social media post
SOCIAL MEDIA OUTPUT
1

Wear & Capture

Ear-mounted fisheye cameras capture 170° egocentric video. Built-in IMU tracks head motion. High-quality earbuds make it worth wearing all day.

2

Distill & Structure

Our pipeline extracts skeleton pose, hand-object interactions, action segmentation, camera ego-motion, and scene geometry from raw fisheye video.

3

Deliver & Monetize

Structured data is packaged per customer spec and delivered via API. Users receive AI-rendered cinematic videos. Data buyers get training-ready datasets.

Two Products, One Device

Consumer magnet. Enterprise engine.

For Consumers

Cinematic video from your life

AI-generated cinematic social media content

Wear AURI X1 while running, cycling, cooking, or traveling. AI transforms egocentric footage into cinematic third-person video — follow-cam, orbit shots, hero moments.

  • Motion mode: third-person cinematic video
  • Moment mode: AI-triggered photo capture
  • Model mode: 3D body scan & virtual try-on
  • Muted mode: audio only, cameras off
For Enterprise

Structured egocentric data at scale

Every minute of user activity generates training data for robotics, world models, and embodied AI. Our distillation pipeline outputs structured data mapped to customer schemas.

  • Skeleton trajectories & hand-object interaction
  • Action semantic segmentation & task graphs
  • 3D scene reconstruction (Gaussian Splatting)
  • Anonymized, consented, privacy-compliant
Why the Ear

The golden point of human biomechanics

Maximum Stability

The ear is the body's most stable mounting point during locomotion. Unlike the forehead (glasses shake), chest (breathing), or wrist (arm swing), the ear sits at the natural pivot of head movement — the body's biological gimbal.

170° Ego + Body

A single fisheye lens at the ear captures forward environment, peripheral context, AND the wearer's own shoulders, arms, and legs — simultaneously. This dual visibility is what makes ego-to-exo reconstruction possible.

Zero Adoption Friction

500M+ people already wear earbuds daily. No new behavior required. No social stigma of camera glasses. Invisible, natural, all-day wearable. The best data collection device is one people already want to use.

Technology Pipeline

From raw fisheye to robot-ready data

STEP 01

EgoPose

Skeleton extraction from partial body visibility + IMU fusion. Fisheye distortion-aware models trained on our proprietary ego-view data.

STEP 02

Scene Reconstruction

Volumetric 3D scene via Gaussian Splatting. Camera ego-motion estimation. Object detection and tracking across frames.

STEP 03

Semantic Distillation

VLM-powered action segmentation, hand-object interaction graphs, task phase annotation. Structured output mapped to customer schema.

STEP 04

Cinematic Render

Ego-to-exo view synthesis. Autonomous cinematography: follow-cam, orbit, hero shot. Neural rendering with style transfer for consumer output.
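The four steps above can be sketched as a simple staged pipeline. The stage internals are stubbed placeholders (the real models are proprietary); only the data flow mirrors the steps described.

```python
# Hypothetical sketch of the four-stage distillation pipeline.
# Each stage enriches a shared dict; internals are stand-ins.

def ego_pose(data):
    # STEP 01: skeleton extraction from partial body visibility + IMU fusion
    data["skeleton"] = "pose_estimate"
    return data

def scene_reconstruction(data):
    # STEP 02: Gaussian-Splatting scene, ego-motion, object tracking
    data["scene"] = "3d_gaussians"
    return data

def semantic_distillation(data):
    # STEP 03: action segmentation, hand-object interaction graphs
    data["actions"] = ["action_segment"]
    return data

def cinematic_render(data):
    # STEP 04: ego-to-exo view synthesis for the consumer product
    data["render"] = "exo_video"
    return data

PIPELINE = [ego_pose, scene_reconstruction, semantic_distillation, cinematic_render]

def run_pipeline(raw_clip):
    data = {"raw_fisheye": raw_clip}
    for stage in PIPELINE:
        data = stage(data)
    return data

result = run_pipeline("fisheye_clip.mp4")
```

Note the design choice the structure implies: the enterprise product (steps 01–03) and the consumer product (step 04) share one upstream pipeline, so every consumer render also yields enterprise-ready structure.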

4D Gaussian Splatting scene reconstruction
Intellectual Property

85 patent claims filed under PPA 63/999,137 covering 16 subsystems, including:

Ego-to-exo video synthesis
Autonomous cinematography
Reality Anchor — hardware-rooted authenticity proof
Predictive safety alerting
Cross-morphology data engine
Multi-person collaborative sensing
Privacy-first distributed computing (5-tier)
Lifelong memory & digital twin
The Moat

Tesla's playbook, applied to human movement

Tesla's insight wasn't the car — it was turning millions of drivers into free data labelers. Every mile driven improves FSD for everyone. LUCKEY applies the same logic: every minute a user wears our earbuds generates egocentric data that trains world models and robot AI.

"20,854 hours of egocentric video → 54% improvement in robot dexterity. Log-linear scaling with R²=0.9983. No saturation observed."
— NVIDIA EgoScale, Feb 2026
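What "log-linear scaling, no saturation" means in practice: if capability grows as a + b·ln(hours), every doubling of data buys the same fixed gain, at any scale. The coefficients below are hypothetical, chosen only to illustrate the shape of the law the EgoScale quote describes; they are not fitted to real data.

```python
import math

# Illustrative log-linear scaling curve: improvement = A + B * ln(hours).
# A and B are hypothetical placeholders, NOT EgoScale's fitted values.
A, B = -20.0, 7.45

def improvement(hours: float) -> float:
    """Modeled capability gain as a function of egocentric data hours."""
    return A + B * math.log(hours)

# Key property: each doubling of data adds the same fixed increment,
# B * ln(2), regardless of how much data you already have.
delta_per_doubling = B * math.log(2)
```

This is why near-zero marginal collection cost matters: under a log-linear law, the winner is whoever can keep doubling the dataset cheaply.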

Data flywheel

More users → more diverse data → better AI output → better consumer experience → more users. The flywheel that makes egocentric data abundant and cheap while competitors pay $75+/hr.

Distillation moat

Raw ego video is a commodity. Our proprietary pipeline that extracts structured embodied data — skeleton trajectories, action semantics, interaction graphs — from fisheye ego-view is the real barrier. Optimized for our hardware's specific optical characteristics.

Privacy-first architecture

5-tier distributed computing: data processed locally first (device → phone → charging station). B2B customers only receive anonymized, structured data. Users own their raw data. Hardware-rooted Reality Anchor provides cryptographic proof of authenticity in the deepfake era.

Market Validation
Build AI: $22M raised; 100K hrs of factory ego video.
KLED AI: ~$100M valuation; human data marketplace.
Meta Ego-Exo4D: 1,422 hrs of paired data; 15 universities.
Traction

Early but moving fast

$650K
Angel funding secured from Chen Danian (co-founder, Shanda Group / WiFi Master Key) and Xiao Lianfeng.
Orbifold AI
B2B LOI signed. $5M seed (Bonfire/Fusion Fund). Multimodal data platform serving Fortune 500 and robotics labs — integrating AURI X1 as primary egocentric data capture device for Physical AI customers.
Scale AI
Active relationship with robotics data sales team. Distribution channel for structured egocentric data to world model and robotics customers.
85 Claims
PPA 63/999,137 filed. 16 subsystems protected including ego-to-exo synthesis, embodied AI data engine, and autonomous cinematography.
Business Model

Three revenue layers

Hardware is the entry point. Subscription is retention. Data licensing is where the real value compounds.

Layer 1 — Entry

Hardware

Kickstarter at $299, retail $399. BOM target under $80. The device consumers want to wear. The data collection infrastructure enterprises need deployed.

B2B: custom form factors for enterprise deployment
Layer 2 — Retention

AI Subscription

$9.99/mo for ego-to-exo cinematic rendering, cloud processing, AI-triggered highlights, and 3D reconstruction features.

Ongoing engagement that keeps devices active
Layer 3 — Core Value

Data Licensing

Anonymized, consented egocentric data sold to robotics and world model companies. Per-hour pricing: structured data at a premium, raw video at standard rates. Custom schemas for enterprise.

Near-zero marginal cost. Infinite scaling.
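A back-of-envelope view of how the three layers stack per user, using figures from this deck (Kickstarter $299, BOM target $80, subscription $9.99/mo). The data-licensing price per hour is not stated anywhere in the deck; $0.50/hr below is purely a hypothetical placeholder to show the shape of the model.

```python
# Per-user first-year revenue stack. Hardware and subscription figures
# come from the deck; DATA_PRICE_PER_HOUR is a hypothetical assumption.
HARDWARE_PRICE = 299.0        # Kickstarter price
BOM_COST = 80.0               # BOM target (upper bound)
SUB_MONTHLY = 9.99            # AI subscription
DATA_PRICE_PER_HOUR = 0.50    # HYPOTHETICAL placeholder

def first_year_revenue(wear_hours_per_day: float, sub_months: int = 12) -> float:
    hardware_margin = HARDWARE_PRICE - BOM_COST           # Layer 1, one-time
    subscription = SUB_MONTHLY * sub_months               # Layer 2, recurring
    data = DATA_PRICE_PER_HOUR * wear_hours_per_day * 365 # Layer 3, compounding
    return hardware_margin + subscription + data
```

The structural point survives any choice of placeholder price: Layers 1 and 2 are bounded per user, while Layer 3 scales with wear time at near-zero marginal cost.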
Team

Built to ship hardware + AI

Shawn Gong
CEO / Founder

Forbes 30 Under 30 (Consumer Technology). Youngest-ever Red Dot Design Award: Best of the Best recipient (age 19). 33 granted patents. Tsinghua University. Founded Nums, a smart trackpad with global distribution; featured by Unbox Therapy (1.7M+ views). Deep Shenzhen hardware supply chain expertise.

Julian Quevedo
World Model Lead

Stanford CS. Co-created Oasis, the first real-time playable world model. Former researcher at World Labs (Fei-Fei Li). First-author ICLR 2026 paper (with Percy Liang). Chose to join LUCKEY over MIT, Berkeley, and Stanford PhD offers.

Mingmin She
Head of Product Ops

Ex-Google 7 years (PM, Mountain View). UCLA Anderson MBA. Kickstarter launch experience. Deep Shenzhen supply chain network. Manages hardware production and vendor relationships.

Advisory Board

Avin Wang (Meta Superintelligence Labs) · Yilin Zhu (Apple ML, Stanford CS PhD) · Growing network across Stanford, Tsinghua, and Silicon Valley.

Roadmap

Path to market

Q1 2026 PPA filed (85 claims). Julian Quevedo joins as World Model Lead. Orbifold AI partnership initiated. Ego-to-exo demo in progress.
Q2 2026 Ego-to-exo demo complete. B2B POC with Orbifold — first structured data delivered to Physical AI customer. Kickstarter pre-launch ($299). Data distillation pipeline v1.
Q3 2026 Kickstarter live. 1,000+ consumer units ordered. B2B data pipeline in production. Scale AI distribution channel active. Multiple hardware form factors for enterprise (headband, clip, hat mount).
Q4 2026 First batch shipments. Data platform beta with API access. 3+ enterprise data customers. ARR trajectory visible. Series A preparation.
The Ask

Seed Round: $3–5M

Use of Funds

AI / Data Pipeline (core moat): 35%
Hardware (prototype → production): 30%
Go-to-Market (Kickstarter + B2B pilots): 20%
Operations (team, legal, IP): 15%

Milestones After Funding

Mo 1–3 Final prototype. Kickstarter pre-launch. First B2B data delivery.
Mo 4–6 Kickstarter live. 1,000+ units ordered. Data platform MVP.
Mo 7–9 Production shipments begin. 3+ enterprise data customers.
Mo 10–12 Data platform beta. ARR trajectory visible. Series A prep.

Let's build the data layer
for Physical AI

The world's robots will learn from the world's people.
We're building the infrastructure to make that happen.

Shawn Gong — CEO / Founder

shawn@luckey.to · wearluckey.com

LUCKEY — Palo Alto · Singapore · Shenzhen