DoorDash has launched a new gig app called Tasks that pays workers to strap smartphones to their chests and record themselves doing everyday activities like laundry, cooking eggs, and walking around parks. The app is currently blocked in California, New York City, Seattle, and Colorado, likely due to labor regulations, but is available elsewhere, where workers earn undisclosed amounts for filming their hands performing specific tasks that will train AI models and humanoid robots.

This represents the next evolution of data harvesting for AI training, moving beyond text scraping to systematic collection of human movement data. Companies building robotics and computer vision models need thousands of videos showing hands manipulating objects to teach AI systems how to navigate the physical world. What's striking is how DoorDash is essentially turning gig workers into unwitting participants in AI development — the same people whose jobs these models might eventually replace.

The geographic restrictions tell the real story here. The blocked states and cities have stronger worker protection laws, suggesting DoorDash knows this arrangement sits in legal gray areas around worker classification and data rights. That workers receive a "free" body mount after completing an onboarding task shows how systematically the company is equipping people to act as human data collection devices. Most concerning: workers likely have no idea which AI models they are training or how their recorded movements will be used.

For AI developers, this signals a shift toward more sophisticated training data collection, but it raises serious questions about consent and worker exploitation. If you're building AI models, consider whether your training data comes from workers who understand what they're contributing to, and whether they'll benefit from the systems they're helping create.