I’ve been negligent in chronicling my automotive rants lately, having
gotten sucked into that swirling vortex that is daily living.
Additionally, there is a new phenomenon in play that baffles me and
commands my attention. In a country fraught with racial tensions, economic woes
and (ugh) political campaigns we now seem to market shootings as if they were
reality TV. Not to be outshined by these horrible news-events turned
entertainment, our political leadership, incumbent and prospective, use the opportunity
of human tragedy to posture and push their agenda. I find the rancor repugnant.
I still want to believe America is better than all this, that she has value
other than what self-advancement we can squeeze out of divisiveness…. Maybe
that day is done, so let’s just have Hillary and The Donald duke it out on
American Ninja Warrior. It’s what we want anyway.
On May 7 Joshua Brown lost his life as his Tesla, at full highway
speed, drove under a tractor semi-trailer pulling across the road. Brown was
operating the vehicle in “autopilot” mode, which is a technological pit-stop
before we hit full car automation. This technology allows the car to maintain
lane, speed and brake as required during highway cruising. NHTSA and Tesla both
still have this incident on the half-shell, but it has been assumed the car
didn't brake itself because it didn't recognize the light-colored trailer against
the bright sky. Too little contrast for it to differentiate the threat.
Tesla reiterates that the driver is ultimately responsible for the
car’s (or technology’s) usage. Yes, and as a manager I am responsible for the
performance of my assigned employees. This doesn’t mean I think out every move
they make or control every event of their day. I “trust” that they understand
their tasks and that they will involve me if they don’t. I put someone in place
to perform a task, so I let them perform it. Regardless of whatever legal
releases Mr. Brown likely signed at the dealership, this was his understanding of his
car's technology as well. He trusted it would do what it was designed to do.
Back to the cost: So here we have a situation where an individual mistakenly
entrusted a technology with a critical task to the loss of his own life. By his
reasoning and past experience, the car would do what it had to do, go down the
road and leave him out of the mix. NHTSA's most recent published records are
from 2014; in that year 32,675 people lost their lives in automotive accidents.
About 90 funerals a day. By way of contrast, in 2013 (CDC’s latest stats),
11,208 people lost their lives due to firearms in our 50 states; this figure
includes malicious, accidental, and defensive loss of life.
And here's the rub: automotive or firearm, these people lost their lives, often
through no fault of their own, sometimes due to poor choices… sometimes just being in the wrong
place at the wrong time. I suspect a husband
or wife, mother or father, son or daughter feels no less grief regardless of
the cause of their loved one's death. The tears don't suddenly evaporate as
they exclaim, “Oh, it was only a car!” Automotive deaths don't drive TV ratings
or make viable campaign fodder, I guess.
Automated cars will clean up their act one day, getting more proactive,
less prone to failure and “safer”. But regardless of how automated and “safe” they
get, there will still be deaths. People will use them, with all the poorly
informed decisions that go along with personhood. We may not care for the
answer, but ultimately Tesla is right, we alone are responsible for how we use
the technology. So regardless of the technology, let’s use it responsibly.
© 2016 D.W. Williams