MIT Looks at How Humans Sorta Drive in Sorta Self-Driving Cars

Ashesh Shah
December 8, 2017

ALMOST HALF OF Americans will hop in their cars for a Thanksgiving trip this year. But if you were being very precise—if you were a team of Massachusetts Institute of Technology researchers who study human-machine interactions—you wouldn’t say that all those Americans are “driving,” exactly. The new driver assistance systems on the market—like Tesla’s Autopilot, Volvo’s Pilot Assist, and Jaguar Land Rover’s InControl Driver Assistance—mean that some of those travelers are doing an entirely new thing, participating in a novel, fluid dance. The human handles the wheel in some situations, and the machine handles it in others: changing lanes, parking, monitoring blind spots, warning when the car is about to crash. Call it…piloting? Shepherding? Conducting? We might need a new word.
Fully autonomous cars won’t swarm the roads en masse for decades, and in the meantime, we’ll have these semiautonomous systems. And scientists need to figure out how humans interact with them. Well, actually, the first thing to know is that most humans don’t: Preliminary research by the Insurance Institute for Highway Safety noted that, of nearly 1,000 semiautonomous vehicles studied, 49 percent had their systems turned off. The warnings were annoying, owners said.
If you could actually watch those drivers—sit inside the car and eyeball them while they drive—you might get a better understanding of how these systems are helpful and how they’re not. Maybe drivers find one kind of warning sound frustrating, but another (a bloop instead of a bleep?) helpful. Maybe they get more comfortable with the system over time, or stay mystified even as the odometer rolls over. That spying would be really helpful for people who build and design semiautonomous systems; for those who want to regulate them; and for those expected to evaluate the risks of using these systems, like insurers.
That’s why MIT researchers are announcing this week a gigantic effort to collect data on how human drivers work with their driver assistance systems. They outfitted the cars of Boston-area Tesla, Volvo, and Range Rover drivers with cameras and sensors to capture how humans cooperate with the new technology. They want to understand what parts of these systems are actually helping people—keeping them from crashing, for example—and what parts aren’t.
Read the source article at Wired.
Source: AI Trends
