When Your Autopilot Fails

Cars have always fascinated me. This likely dates back to what I can recall of my earliest memories—sitting next to my father, riding with him in his 1962 Ford Fairlane, and watching him manually shift on the column. He’d even let me grab the shifter and after he depressed the clutch, I got to throw the Ford into third gear.

My earliest driving lessons were in a 1962 Ford Fairlane.

The 1962 Ford Fairlane: Back when men were men, and cars were meant to be driven.

I’ve just spent much of the past week trying to get JBE1 back to where he was pre-breakdown. For some reason, the electrical system failure that came with losing the serpentine belt also threw off my air conditioning. All seems to be right in the world, or at least with my car, at the moment.

The automotive world, like much of the rest of our lives, has been increasingly altered by technology. Techno-utopians always consider technology’s upside, while minimizing and often whitewashing the negatives of computers controlling most of our lives—and now, our cars.

For a year now, I’ve been writing for trade magazines about cars. For a freelance writer who loves cars, it doesn’t get much better than getting paid to write about a subject you’re interested in and have a passion for. Even better, I’ve been getting assigned some articles of late that touch on the intersection between our vehicles and said technology. Here’s just one example—this one covering hackers and today’s hyper-networked cars.

In the course of researching these articles and speaking to knowledgeable car and computer people, I’ve picked up a host of new things that I didn’t know before. I’ve also become less critical about some of the technological innovation than I might have been before. I’m still not ready to join the evangelical wing of the techno-utopian movement yet, especially when it comes to the autonomy of my car.

Associated Press writer Joan Lowy’s article the other day about self-driving cars and the human element piqued my interest. Lowy touched on a previous story about autonomous cars that didn’t have a happy ending. Her piece also highlighted one of the major pitfalls and concerns that prevent Google from ruling the highways and byways of America—at least for another week or so.

Joshua Brown was a 40-year-old tech company owner from Ohio who, I’m guessing, was in the techno-evangelist camp when it came to cars and his Tesla Model S. Brown had ceded driving to his Tesla’s Autopilot near Gainesville, Florida, when the system failed to recognize the danger directly ahead: it didn’t brake when a tractor trailer made a left turn, and his car drove into the side of the trailer. After hitting the trailer, Brown’s Tesla went under it, then veered off the road, hitting two fences and a power pole, killing Brown.

What I noticed about Lowy’s article, and the accompanying NY Times piece about Brown’s accident, is that both journalists take a “distant” approach to the outcome. What I mean is that they are careful not to assign blame to the Autopilot (which obviously failed to recognize the clear and present danger of a tractor trailer directly ahead), or to offer any obvious critique of technology moving at a pace that exceeds the human capacity to adapt to it. Instead, at least in Lowy’s article, it’s humans who are at fault—our brains are just too unreliable to step in when our Autopilot decides to check out. I’m sure that the way our brains function in these situations is due to how we’re hard-wired, shaped by the previous 300,000 years as hunter-gatherers, prior to the recent Happy Motoring epoch, with cars covering the earth. Perhaps we’ll adapt, or maybe we’ll simply cede our driver’s perch to a robot.

I don’t want to be too critical of Lowy or any other person writing about technology. It’s the landscape that anyone who hopes to get paid to write has to tread, whether they’re covering progress, politics, or automobiles. Being overly critical of technology only gets you labeled a crank—and who wants that?