From the moment you put your hand on the steering wheel, the computer in your car has already started working out what to do.

In a typical day, the car’s computer will decide when to accelerate, where to stop, how to turn, and so on.

But it’s the human driver who decides what to do and how, says Daniel Barenboim, an associate professor at Stanford University and co-author of the new book “Why Your Brain Works: Understanding and Improving Human-Driven Autonomous Systems”.

Even when you do something with your hand, you’re still paying attention to the computer, say Barenboim’s co-authors Michael Sussman and Andrew Daley.

But as you move your hand around, your brain hands off the low-level motor control of your hands and muscles, allowing it to operate more autonomously.

Barenboim’s computer also keeps a log of the actions you’re about to perform, along with your name and your date and place of birth.

This data can then be used to predict how the car will drive next.
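The article does not say how the prediction is made; as a minimal illustration only, one simple approach is to count which action most often follows each logged action and predict the most frequent successor. The function and variable names below are hypothetical, not from the book.

```python
from collections import Counter, defaultdict

def build_model(action_log):
    """Count, for each action, which action followed it in the log."""
    transitions = defaultdict(Counter)
    for prev, nxt in zip(action_log, action_log[1:]):
        transitions[prev][nxt] += 1
    return transitions

def predict_next(transitions, current_action):
    """Return the most frequently observed follow-up action, or None."""
    followers = transitions.get(current_action)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Hypothetical driving log for illustration.
log = ["accelerate", "brake", "turn_left", "accelerate", "brake", "stop"]
model = build_model(log)
print(predict_next(model, "accelerate"))  # prints "brake"
```

A real system would use far richer sensor features than a flat action list, but the same idea applies: the more history accumulates, the sharper the transition counts become.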

The more of this data that accumulates, the better the computer can do its job.

Barely five years ago, driverless cars were still limited by their computer vision systems, but as those systems have improved, so have the cars’ capabilities.

They can do things like stop for pedestrians, navigate roads, and even drive themselves.

Now, Barenboim’s computer vision system is able to drive at least one kilometre per hour, and it can even drive for longer distances than a human.

“It’s incredible how the human motor cortex is able, by looking at the data that’s available to it, to perform tasks that the computer could not do,” says Barenboim.

“That’s a huge advance in the last 50 years.”

And this isn’t just happening in cars.

In one of the book’s most intriguing experiments, the authors asked a human to drive a robot equipped with a camera and a microphone, which could respond to commands issued through a smartphone app.

The human had to drive the robot on a two-lane city road through traffic, so they had to make real decisions.

“This is the biggest challenge for people who are working on driverless cars,” says Sussman’s co-lead author, Andrew Daley.

The researchers had two tasks.

One was to drive the robot at the minimum speed necessary to avoid hitting people.

“The other was to use the camera to capture the entire trajectory of the robot.

And this is the part that really surprised us,” says Daley, who worked on the project with Barenboim’s team.

“We found that the human was really good at this task.

They could drive with very little input from the robot, but the robot could drive extremely fast.”

“This study provides a roadmap for the future of driverless car technology,” says Steven R. Levitt, a professor of applied physics at the University of California, San Diego.

“I think that it will be very helpful for people like us who are trying to get into this field.”

A new generation of self-driving cars, including those coming from companies like Volvo, will have sensors that can track people as they walk, as well as the road ahead.

These systems are also more advanced, and can identify and avoid obstacles in real time.

The technology can also help the robot keep up with human-controlled vehicles, but it’s still in its early days.

“Our research is still in the developmental phase,” says Levitt.

The authors are now looking at whether these sensors could be used for other tasks.

“There are a number of potential applications for this type of system,” says Michael Sommers, a robotics professor at the Massachusetts Institute of Technology.

“For example, the human hand might one day be used as a kind of remote monitoring system, to get a feel for how the robots are doing and whether they need more reinforcement.”

The authors will present their work at the Society of Automotive Engineers (SAE) International conference in Montreal next month.

The work was funded through the DARPA Robotics Challenge.