MIT Looks at How Humans Sorta Drive in Sorta Self-Driving Cars

Ashesh Shah
December 8, 2017

Almost half of Americans will hop in their cars for a Thanksgiving trip this year. But if you were being very precise—if you were a team of Massachusetts Institute of Technology researchers who study human-machine interactions—you wouldn’t say that all those Americans are “driving,” exactly. The new driver assistance systems on the market—like Tesla’s Autopilot, Volvo’s Pilot Assist, and Jaguar Land Rover’s InControl Driver Assistance—mean that some of those travelers are doing an entirely new thing, participating in a novel, fluid dance. The human handles the wheel in some situations, and the machine handles it in others: changing lanes, parking, monitoring blind spots, warning when the car is about to crash. Call it…piloting? Shepherding? Conducting? We might need a new word.
Fully autonomous cars won’t swarm the roads en masse for decades, and in the meantime, we’ll have these semiautonomous systems. And scientists need to figure out how humans interact with them. Well, actually, the first thing to know is that most humans don’t: Preliminary research by the Insurance Institute for Highway Safety noted that, of nearly 1,000 semiautonomous vehicles studied, 49 percent had their systems turned off. The warnings were annoying, owners said.
If you could actually watch those drivers—sit inside the car and eyeball them while they drive—you might get a better understanding of how these systems are helpful and how they’re not. Maybe drivers find one kind of warning sound frustrating, but another (a bloop instead of a bleep?) helpful. Maybe they get more comfortable with the system over time, or stay mystified even as the odometer rolls over. That spying would be really helpful for people who build and design semiautonomous systems; for those who want to regulate them; and for those expected to evaluate the risks of using these systems, like insurers.
That’s why MIT researchers are announcing this week a gigantic effort to collect data on how human drivers work with their driver assistance systems. They outfitted the cars of Boston-area Tesla, Volvo, and Range Rover drivers with cameras and sensors to capture how humans cooperate with the new technology. They want to understand what parts of these systems are actually helping people—keeping them from crashing, for example—and what parts aren’t.
Read the source article at Wired.
Source: AI Trends
