The comma body runs openpilot! From openpilot’s perspective, the comma body is a car. It runs the same camera and logging software, so we’ll be able to learn from the fleet of bodies in the same way we learn from our fleet of cars.
At launch, the software will balance the body (using the same localizer as the car!) and let you drive it around via a web interface. Soon after, we'll add video streaming to that web interface, allowing you to drive your body outside line of sight. We'd also like to add a VR app for driving the body from a first-person view; that will be needed by the time we ship the arms.
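To make the driving part concrete, here's a minimal sketch of how joystick-style commands from a web interface could map to wheel speeds on a two-wheeled body. The function name and command convention are illustrative assumptions, not openpilot's actual control API:

```python
# Hypothetical mapping from joystick commands to differential-drive wheel
# speeds. forward and yaw are each in [-1, 1], as a web joystick might send.
def joystick_to_wheel_speeds(forward, yaw, max_speed=1.0):
    """Return (left, right) wheel speeds for forward/yaw commands."""
    left = forward + yaw
    right = forward - yaw
    # If a command saturates, scale both wheels together so the
    # turning ratio is preserved and neither exceeds max_speed.
    peak = max(abs(left), abs(right), 1.0)
    return (left / peak * max_speed, right / peak * max_speed)

print(joystick_to_wheel_speeds(1.0, 0.0))  # straight ahead: (1.0, 1.0)
print(joystick_to_wheel_speeds(0.0, 1.0))  # spin in place: (1.0, -1.0)
```

Full forward plus full yaw saturates, so the scaling step turns (2.0, 0.0) into (1.0, 0.0): a pivot around the right wheel rather than a clipped, straighter arc.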
Once we start getting data back from our body fleet, we can train models. The first model we'll train (jointly with the driving model!) is a posenet, enabling inside-out position tracking. When that model ships, the localizer will go from 3-DoF to 6-DoF. And once the dead reckoning matches the quality we have on the cars, adding a SLAM system on top will be simple, giving you drift-free localization in your environment.
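The dead reckoning idea can be sketched in a few lines: the posenet predicts the relative 6-DoF transform between consecutive camera frames, and composing those transforms tracks the body's global pose. This is a hedged illustration of the math, not openpilot's localizer; for brevity it shows only yaw, where a real posenet would predict all three rotation axes:

```python
# Illustrative inside-out dead reckoning: compose per-frame relative
# transforms (as a posenet might predict) into a global 6-DoF pose.
import math

def matmul4(a, b):
    # Multiply two 4x4 matrices stored as lists of lists.
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def transform(translation, yaw):
    # 4x4 homogeneous transform. Only yaw is shown; a full model would
    # also predict roll and pitch (hence 6-DoF, not 3).
    c, s = math.cos(yaw), math.sin(yaw)
    tx, ty, tz = translation
    return [[c, -s, 0, tx],
            [s,  c, 0, ty],
            [0,  0, 1, tz],
            [0,  0, 0, 1]]

pose = transform((0, 0, 0), 0.0)  # start at the origin
# Pretend frame-to-frame motions from the posenet: forward, forward
# while turning 90 degrees left, forward again.
for step in [((1, 0, 0), 0.0), ((1, 0, 0), math.pi / 2), ((1, 0, 0), 0.0)]:
    pose = matmul4(pose, transform(*step))

print(round(pose[0][3], 3), round(pose[1][3], 3))  # x, y position: 2.0 1.0
```

Each per-frame transform carries a small error, and composing thousands of them drifts; that accumulated drift is exactly what a SLAM layer on top would correct by recognizing previously seen places.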