ACL Hacks 26

Control the car. Watch your ghost race.

Drive a 1/18 scale race car with hand gestures. A neural network learns your style. Then your AI ghost takes the wheel and races another driver's ghost on a real track.

Latency: < 50 ms
Track: physical 1/18 scale
How it works

From hand gesture to autonomous lap

A short pipeline: capture human driving, train a model, hand the car back to the model. Three stages, one car.

01

Drive

Hand position controls steering and throttle. Pinch to brake. The system records every input alongside synced video frames.

X · steering
Y · throttle
Pinch · brake
Z · gear
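The mapping above can be sketched as a small pure function. The landmark inputs are assumed to be normalized hand coordinates (0..1 for x/y, as MediaPipe Hands reports them, with z roughly depth-from-camera); the function name, thresholds, and gear cutoff are illustrative assumptions, not the project's actual values:

```python
def hand_to_controls(wrist_x, wrist_y, wrist_z, pinch_dist,
                     pinch_threshold=0.05):
    """Map a tracked hand pose to car controls.

    All thresholds and ranges here are illustrative; the real
    system would calibrate them per driver.
    """
    steering = (wrist_x - 0.5) * 2.0      # X -> steering in [-1, 1]
    throttle = max(0.0, 1.0 - wrist_y)    # Y -> throttle in [0, 1], hand up = faster
    brake = pinch_dist < pinch_threshold  # thumb-index pinch -> brake
    gear = 1 if wrist_z > -0.1 else 2     # Z (depth) -> coarse gear select
    return {"steering": steering, "throttle": throttle,
            "brake": brake, "gear": gear}
```

Recording each of these control dicts alongside the synced camera frame is what produces the training data for the next stage.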
02

Clone

A behavioral cloning CNN trains on your session. It learns the way you take corners, where you brake, how aggressively you hold the line.

frames → CNN → controls
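The cloning step can be sketched in PyTorch as supervised regression from camera frames to logged controls. The architecture, layer sizes, and hyperparameters below are illustrative assumptions, not the project's published network:

```python
import torch
import torch.nn as nn

class DriverNet(nn.Module):
    """Behavioral-cloning CNN: camera frame in, [steering, throttle] out."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 24, 5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, 5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),       # collapse to one vector per frame
        )
        self.head = nn.Sequential(
            nn.Flatten(), nn.Linear(48, 64), nn.ReLU(), nn.Linear(64, 2)
        )

    def forward(self, x):                  # x: (N, 3, H, W) in [0, 1]
        return self.head(self.features(x))

# One training step: regress the recorded human controls from synced frames.
model = DriverNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
frames = torch.rand(8, 3, 120, 160)       # stand-in batch of camera frames
targets = torch.rand(8, 2)                # stand-in logged [steering, throttle]
loss = nn.functional.mse_loss(model(frames), targets)
opt.zero_grad(); loss.backward(); opt.step()
```

Because the labels are just the driver's own inputs, the model inherits that driver's cornering and braking habits: two people produce two different policies from the same track.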
03

Race

Your trained model drives the car autonomously. Two ghosts — trained by different humans — face off lap-for-lap on the same track.
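A single tick of the autonomous race loop can be sketched as: frame in, model out, publish the controls to the car. The function name, topic string, and payload shape are illustrative assumptions; `publish` stands in for an MQTT client's publish call:

```python
import json

def autonomous_step(frame, model, publish, topic="car/a/controls"):
    """One control tick: frame -> trained model -> control message.

    `model` maps a frame to (steering, throttle); `publish(topic, payload)`
    is whatever transport the car listens on (MQTT in this project).
    """
    steering, throttle = model(frame)
    payload = json.dumps({"steering": round(steering, 3),
                          "throttle": round(throttle, 3)})
    publish(topic, payload)
    return payload
```

Running this loop per camera frame for each car, with each car subscribed to its own topic, is what lets two independently trained ghosts share the same physical track.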

Tech stack

What's under the hood

Off-the-shelf hardware, open-source ML. Nothing custom-fabricated — the trick is the pipeline.

Hardware

  • AWS DeepRacer
    1/18 scale physical race car
  • AWS DeepLens
    Onboard camera + inference
  • ESP32
    Wireless control bridge
  • Leap Motion
    Hand tracking sensor
  • Webcam fallback
    MediaPipe-based hand input

Software

  • PyTorch
    CNN training and inference
  • MediaPipe
    Real-time landmark detection
  • OpenCV
    Frame capture and preprocessing
  • MQTT
    Low-latency control transport
  • Next.js
    Race dashboard and visuals
  • NumPy
    Data pipeline and telemetry
  • ChipKit
    Microcontroller firmware
Race status

Status & leaderboard

Standby

Car A (human_a · v0)
Laps: 0/5 · Best lap: --:--.- · Progress: 0%

Car B (human_b · v0)
Laps: 0/5 · Best lap: --:--.- · Progress: 0%

Leaderboard

Today
Rank  Policy         Laps  Best lap
#1    human_a · v0   0     --:--.-
#2    human_b · v0   0     --:--.-

Hand cam

Waiting for hand input
Run laptop/hand_drive.py to feed live hand commands.