Monday, May 29, 2017

Reflections on Technology in France


My wife and I recently had the privilege of spending a week in southern France, at a conference in the small town of Aurillac (pronounced "AW-ree-ack").  I say small—27,000 people is about the size of Cleburne, Texas, a town I'm somewhat familiar with.  Based on my admittedly limited and undoubtedly biased observations, I'd like to make some comparisons between the ways that the French citizens we encountered and Texans have dealt with technology, broadly defined.

First, transportation.  In Texas, if you don't have access to a car, you are automatically placed in a category that is inhabited largely by very poor people, eccentrics, and the homeless.  There are some folks who don't drive and who also don't meet any of those descriptions, but the great majority of able-bodied adults in Texas drive nearly everywhere.

Not in Aurillac.  We flew into town and landed at the single airport, which is basically one building in a field, by a parking lot.  And we took a taxi into town, about a seven-minute ride.  But from that point onward for the next week we didn't set foot in any motorized transport, and frankly didn't miss it a bit.  At the end of our visit, we walked twenty minutes or so to the train station and rode the train to Paris.

In Aurillac, a lot of people appear either to walk to work or to ride bicycles.  There are cars, but the main parking in the center of town is an underground garage.  This allows the Aurillacans (Aurillacois?—I don't know enough French to say) to avoid cluttering up their thousand-year-old town with ugly parking garages, or knocking down a 15th-century church to pave the land over for Renaults or Audis.  I can't imagine how much it cost to excavate the garage without disturbing the quaint 19th-century plaza park above it—many millions, I suppose.  But it was done, somehow, and consequently, much of downtown Aurillac would still be familiar to a peasant who knew the town as it was in 1600 A. D.

In Cleburne, they have old stuff too—the county courthouse, for instance, which dates all the way back to 1913 A. D. and was recently restored.  But for parking, you just have to find a lot somewhere or park on the street.  There is no commercial airport, and although there are train yards and an Amtrak station, getting anywhere by train is really complicated and inconvenient.  Nearly everybody who wants to go to Cleburne drives there along U. S. 67, takes the new tollway that connects it to downtown Fort Worth, or, for those just passing through, uses the highway loop around the city.

Next, the pattern of daily life.  When my sister lived in Cleburne, she would get up early, get in her car at maybe 7:30, and drive 45 minutes or so to her job in Fort Worth, where she runs a nursing department that uses a lot of high-tech equipment (computers and so on).  Then she'd drive back in the evening around 5 or 6 and have supper.  And while she lived in Cleburne for close to a decade, I'm not aware that she developed any serious connections with other people in the town.

In doing this routine, my sister follows a pattern laid down by the Industrial Revolution, which requires the close scheduling of large numbers of people doing coordinated things in institutions such as factories, schools, and hospitals. 

Things are different in Aurillac.  Yes, the little tobacco and newspaper shop across the street from our hotel opened up every day about 6 AM.  But for the next three hours, there wasn't much else going on in the way of business.  Around 9 or 10, most places were open, but at noon, a lot of them closed for two hours—lunch, you see.  Then at 2, they would open up again, sometimes, and then again maybe not.  The Museum of Volcanoes we visited had such hours, and stayed open till 7 PM. 

Then, and only then, does the typical Aurillac resident start thinking about supper.  The restaurants we went to typically didn't even open in the evening until 7.  In the afternoons and evenings especially, the outdoor cafes would fill with people of all ages, sitting around talking about—well, I mostly couldn't tell what they were talking about, because I don't understand French.  But they seemed content to jaw for hours on end, either in person or on their mobile phones.  We did see a lot of people using mobile phones there, and I suppose that's one way in which the French and the Americans are pretty much alike:  the near-universality of the smartphone.  But the French folks we saw haven't allowed it to put an end to the practice of polite conversation at the supper table, which smartphones have nearly succeeded in doing in many U. S. households and public places.

There were bars in Aurillac, but they weren't crammed with people seemingly desperate to unwind from a tense day.  People there seemed content to sit at a table with a glass of beer and just look around, or think, and not have a phone or a paper in their hand.  You don't see that much in Cleburne.

As I say, this is a completely unscientific sample of life in France.  I'm aware of many of the negatives some Americans cite about life there:  the excessive government regulation and intervention in the economy, the high taxes, the paucity of religious influence.  But somehow, the citizens of Aurillac have made it to 2017 without letting modern technological society homogenize their town into looking like any mid-size town in the U. S., with multinational-corporation logos plastered everywhere.  They do have a McDonald's in Aurillac, but they also have butcher shops that have been in the same place, with the same tile on the floor, since 1925.  And that isn't unusual there.

I liked Aurillac a lot, and our week there was a sample of life in a slower, more meditative lane that I hope to keep with me, at least in thought, now that I'm back in Texas.  It wasn't better or worse than Cleburne, it was just different.  But different in some ways that were very appealing.

Monday, May 22, 2017

Your Money Or Your Data: The WannaCry Ransomware Attack


On May 12, thousands of users of Windows computers around the globe suddenly saw a red screen with a big padlock image and a headline that read, "Ooops, your files have been encrypted!"  It turned out to be a ransom note generated by an Internet worm called WannaCry.  The ransom demanded was comparatively small—about US $300—but the attack itself was not.  The most critical damage was done in Great Britain, where many National Health Service computers locked up, delaying surgeries and blocking access to files containing critical patient data.  Fortunately, a researcher found a kill switch for the worm, halting its spread, but not before over 200,000 computers were affected in over 100 countries, according to Wikipedia.
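
For readers curious how a "kill switch" can live inside a worm at all, here is a minimal sketch of the idea in Python.  This is my own illustration, not the actual WannaCry code, and the domain below is a placeholder.  The real worm tried to contact a hardcoded, unregistered web address before doing anything else; once a researcher registered that address, every new copy found it reachable and shut itself down.

    import urllib.request

    # Placeholder address -- the real worm used a long gibberish domain
    # that nobody had registered.
    KILL_SWITCH_URL = "http://www.example-unregistered-domain.test"

    def kill_switch_engaged():
        """Return True if the kill-switch domain answers."""
        try:
            urllib.request.urlopen(KILL_SWITCH_URL, timeout=5)
            return True    # domain is live: stand down
        except Exception:
            return False   # unreachable: the worm would carry on

    if kill_switch_engaged():
        print("Kill-switch domain is live; exiting quietly.")
    else:
        print("No kill switch found; a real worm would start spreading here.")

The elegance, and the fragility, of this arrangement is that halting a global epidemic took nothing more than a cheap domain registration.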

No one knows for sure who carried out this attack, although we do know the source of the software that was used:  the U. S. National Security Agency, which developed something called the EternalBlue exploit to spy on computers.  Somehow it got into the wild and was weaponized, possibly by a group in North Korea, but no one is sure.

At this writing, the attack is mostly over except for the cleanup, which is costing millions as files are restored from backups or re-created from scratch, where possible.  Experts recommended not paying the ransom, and it's estimated that the perpetrators didn't make much money on the deal.  The ransom was payable only in bitcoin, the digital currency that is hard to trace.

Writing in the New York Times, Zeynep Tufekci of the School of Information and Library Science at the University of North Carolina put the blame for the attack on software companies.  She argued that the way upgrades and security patches are handled is itself exploitative and does a disservice to customers, who may have good reasons not to upgrade a system.  This was painfully obvious in Great Britain, where the National Health Service was running lots of old Windows XP systems, although the vast majority of the computers affected were running the more recent Windows 7.  Her point was that life-critical systems such as MRI machines and surgery-related instruments are sold as a package, and incautious upgrading can upset the delicate balance that is struck when a Windows system is embedded in a larger piece of technology.  She suggested that companies like Microsoft take some of the $100 billion in cash they are sitting on and spend it on free upgrades for customers who would normally have to pay for the privilege.

There is plenty of blame to go around in this situation:  the NSA, the NHS, Microsoft, and ordinary citizens who were too lazy to install patches that they had even paid for.  But such a large-scale failure of what has become by now an essential part of modern technological society raises questions that we have been able to ignore, for the most part, up to now.

When I described a much smaller-scale ransomware attack in this space back in March, I likened it to a foreign military invasion.  That analogy doesn't seem to be too popular right now, but I still think it's valid.  What keeps us from viewing the two cases similarly has to do with the way we've been trained to look at software, and the way software companies have managed to use their substantial monopolistic powers to set up conditions in their favor.

Historically, such monopolistic abuse has come to an end only through vigorous government action to call the monopoly to account.  The U. S. National Highway Traffic Safety Administration, for instance, can conduct investigations and levy penalties on auto companies that violate the rules or behave negligently.  So far, software firms have almost completely avoided any form of government regulation, and the free-marketers among us have pointed to them as an example of how non-intervention by government can benefit an industry.

Well, yes and no.  People have made a lot of money in the software and related industries—a few people, anyway, because the field is notorious for the huge returns it can give the few dozen employees and entrepreneurs who happen to get a good idea first, implement it, and dominate a new field (think Facebook).  But the same companies charge customers over and over again for the ever-required upgrades and security patches (which are often bundled together, so you can't keep the software you like without having it get hacked sooner or later).  When you realize that, it becomes hard in some ways to distinguish a software company from an old-fashioned protection racket, where a guy flipping a blackjack in his hand comes into your candy store, looks around, and says, "Nice place you got here—a shame if anything should happen to it."

Software performs a valuable service to billions of people, and I'm not calling for a massive takeover of software firms by the government.  And users of software have some responsibility for doing maintenance, assuming that the maintenance is of reasonable cost, isn't impossibly hard to do, and doesn't lead to situations that make the software less useful.  But when a major disaster like WannaCry can cause such global havoc, it's time to rethink the fundamentals of how software is designed, sold (technically, it's leased, not sold), and maintained.  And like it or not, the U. S. market has a huge influence on these things.

Even the threat of regulation can have a most salutary effect on monopolistic firms, which to avoid government oversight often enter voluntarily into industry-wide agreements to implement reforms rather than let the government take over the job.  It's unlikely that the current chaos going on in Washington is a good environment in which to undertake this task, but there needs to be a coordinated, technically savvy, but also ethically deep conversation among the principals—software firms, major customers, and government regulators—to find a different way of doing security and upgrades, which are inextricably tied together. 

I don't know what the answer is, but companies like Microsoft may have to accept some form of restraint on their activities in exchange for remaining free of the heavy hand of government regulation.  The alternative is that we continue muddling along as we have been while the growth of the Internet of Things (IoT) spreads highly vulnerable gizmos all across the globe, setting us up for a tragedy that will make WannaCry look like a minor hiccup.  And nobody wants that to happen.

Sources:  Zeynep Tufekci's op-ed piece "The World Is Getting Hacked.  Why Don't We Do More to Stop It?" appeared on the website of the New York Times on May 13, 2017, at https://www.nytimes.com/2017/05/13/opinion/the-world-is-getting-hacked-why-dont-we-do-more-to-stop-it.html.  I also referred to the Wikipedia article "WannaCry ransomware attack."  My blog "Ransomware Comes to the Heartland" appeared on Mar. 27, 2017.

Monday, May 15, 2017

India's Energy Future and Climate Change


In an article that appeared in the May issue of Scientific American, Council on Foreign Relations Fellow Varun Sivaram shows that India's path of energy development could have a large impact on future greenhouse-gas emissions.  Unlike China, which currently pumps out about twice as much carbon as the U. S., India has yet to build most of its infrastructure.  And in that fact lies both a challenge and an opportunity.

It will help to get things in proportion if we compare greenhouse emissions and populations for China, the U. S., and India.  According to the U. S. Environmental Protection Agency, the world leader in carbon dioxide emissions in 2014 was China, contributing about 30% of the global total.  Next in line was the U. S., with 15%, and third was India, with 7%.  The much-ballyhooed Paris accords of 2015 committed India to a limit so loose as to be almost meaningless.  As Sivaram says, "its overall commitment to curb emissions was underwhelming.  If the government just sat on its hands, emissions would rise rapidly yet stay within the sky-high limits the country set for itself in Paris."
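
Raw totals, though, hide how different the per-person picture is.  A quick back-of-the-envelope calculation (mine, not Sivaram's, using the EPA shares above and rough 2014 populations) makes the point:

    # Share of global CO2 emissions (EPA, 2014) and approximate populations.
    shares = {"China": 0.30, "U.S.": 0.15, "India": 0.07}
    population_billions = {"China": 1.36, "U.S.": 0.32, "India": 1.29}

    for country in shares:
        index = shares[country] / population_billions[country]
        print(f"{country}: {index:.2f} share-points per billion people")

    # Prints roughly: China 0.22, U.S. 0.47, India 0.05 -- the average
    # American accounts for several times the emissions of the average
    # Indian, which is one reason India's Paris limits could be so loose.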

By many measures, most citizens of India are still living in the same energy environment their ancestors occupied:  using dried cow dung, straw, charcoal, and firewood for domestic heating and cooking.  The lucky third or so who have access to more advanced fuel sources use either coal or oil.  The nation's electric grid is something of a joke by Western standards, reaching less than a fourth of the population.  And those who get electricity can't count on it:  outages (both planned and accidental) are common, and government-inspired policies to keep rates low have resulted in chronic underinvestment, further contributing to the grid's rickety status.

Unlike China, India has something approaching a democratic government, although with a heavy dose of socialist-style traditions left over from the Nehru years of the 1950s and 60s.  While the economy has improved greatly under the more recent governments that, since the 1990s, have favored private enterprise and privatized formerly government-owned enterprises, Sivaram points out that investment money is hard to come by.

Examining the two extremes of how things could go from here, suppose that India follows the easier path already trod by China, exploiting readily accessible fossil fuels and building coal-fired power plants to supply its growing population of about 1.3 billion, which is due to outstrip China's in a few years.  If that happens, the U. S. will no longer be the world's No. 2 carbon-dioxide emitter—India will be, and it might even surpass China to become No. 1.

Of course, this is a competition that no government wants to win.  But zooming down to the micro view of individual citizens, the meaning of drastic restrictions on future fossil-fuel use becomes more problematic.  Most Indian citizens do not drive cars, and the vast majority of motorized vehicles sold even today are motorbikes or three-wheel jitneys.  Mobility is something everyone wants, and as more Indians get better jobs and are able to save money for larger purchases, the market for automobiles could grow tremendously.  But that development would only increase carbon-dioxide emissions.  The same people who want to drive would like plentiful, reliable electricity, both for domestic uses and for things like agriculture and manufacturing.  But if that power is generated with coal or oil, there goes more CO2.

In his article, Sivaram holds out an alternative energy future that could become reality, given enough willingness on the part of national and state governments and citizens generally.  Solar energy is abundant in the countryside, and the government is already deploying solar panels to power irrigation pumps, though on a small scale.  Given enough investment, the desperately needed expansion of the electric grid could include the latest smart-grid technologies, enabling it to take advantage of wind and solar power, which otherwise would not fit easily into an old-fashioned grid designed for 24-hour-a-day power sources.  And the nice thing is that little retrofitting will be required, because most of the needed grid does not yet exist.

While coal and oil will be a large part of India's energy mix in the near future, another hope Sivaram has is that conservation measures will limit the increase in demand to less than it would be otherwise.  Rapid deployment of electric vehicles powered by renewable energy sources could help here, as well as an emphasis on energy-efficient appliances and buildings. 

The fly in this sweet-smelling ointment of the future, Sivaram admits, is the crying need for investment money.  And here is where things get murky.  In common with many other countries in Asia, India's regulatory environment is marred by complexity, delays, and corruption.  Even major infrastructure projects such as hydroelectric dams and grid improvements have been torpedoed by high interest rates, permit delays, and poor fiscal planning, resulting in abandoned projects and even bankruptcies.  These are not engineering problems.  These are social and government-policy problems, and it will take political courage and intelligence to make much progress in these areas.

With India halfway around the world, it's easy to ignore internal problems like these.  But in the academic semester just ending, I taught a graduate class for the first time in many years, and most of the students in it were from the Indian subcontinent.  Thirty years ago, most of them would have been from China, but there are now plenty of Chinese universities as good as or better than your average state school in the U. S., and so the new-graduate-student pool for middle-ranked U. S. universities has shifted south over the years.

If these students are like most foreign grad students, many of them will try to stay in the U. S.  But some will return to their native lands.  I hope that what they learn here about the social and political structure of the U. S. will help them realize that in many ways, India has a chance to avoid mistakes others have made before.  Whatever your views on global warming, I think we can agree that it is a hard problem to allow millions of people in India to enjoy some of the benefits of advanced technology that we in the U. S. have enjoyed for three generations, while avoiding preventable harm to the planet we all live on.  I hope the citizens of India can take advantage of their opportunities to work out this problem in the best way possible.

Sources:  The article "The Global Warming Wild Card" by Varun Sivaram appeared on pp. 48-53 of the May 2017 issue of Scientific American.  The EPA website from which I obtained 2014 data on carbon-dioxide emissions is at https://www.epa.gov/ghgemissions/global-greenhouse-gas-emissions-data.  I also referred to the Wikipedia articles on the demographies of China and India and the history of the Indian republic. 

Monday, May 08, 2017

The False Promise of Digital Storage for Posterity


Now that almost every book, photograph, artwork, article, news item, story, drama, or film is published digitally, we are supposed to rejoice that the old-fashioned, imperfect, corruptible analog forms of these media—paper that ages, film that deteriorates—have been superseded by the ubiquitous bit, which preserves data flawlessly—that is, until it doesn't.  A recent article in the engineering magazine IEEE Spectrum highlights the problems Hollywood is having in simply keeping usable digital copies of its old films around.  And "old" in this sense can mean only three or four years ago.

It's not like there isn't a standard way of preserving digital copies of motion pictures.  About twenty years ago, a consortium of companies got together and agreed on an open standard for magnetic-tape storage of movies and other large-volume digital material called "linear tape-open," or LTO.  If you've never heard of it, welcome to the club.  An LTO-7 cartridge is a plastic box about four inches (10 cm) on a side and a little less than an inch thick.  Inside is a reel of half-inch-wide (12-mm) tape about three thousand feet (960 m) long, and it can hold up to 6 terabytes (6 × 10¹² bytes) of uncompressed data.  Costing a little more than a hundred bucks, each cartridge is guaranteed to last at least 30 years—physically.

The trouble is, the same companies that came up with the LTO standard are part of the universal high-tech digital conspiracy to reinvent the world every two years.  Keeping something the same out of respect for the simple idea that permanence is a virtue is an entirely foreign concept to them.  Accordingly, over the last twenty years there have been seven generations of LTO tapes, and each one has been backward-compatible with no more than one or two generations before it.

What this means for movie production companies that simply want to preserve their works digitally is this:  every three or four years at the outside, they have to copy everything they've got onto the new generation of LTO tapes.  And these tapes don't run very fast—it's not like burning a new flash drive.  Transferring an entire archive can take months and cost millions of dollars, and the customers are at the mercy of an LTO standard that keeps changing.
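
Some back-of-the-envelope arithmetic shows why.  This is my own figuring, not the article's, using the published LTO-7 numbers and assuming a single drive running flat-out at its nominal uncompressed rate of about 300 megabytes per second:

    TB = 1e12                   # bytes in a terabyte
    CARTRIDGE = 6 * TB          # LTO-7 uncompressed capacity
    DRIVE_SPEED = 300e6         # bytes per second, nominal uncompressed rate

    hours_per_cartridge = CARTRIDGE / DRIVE_SPEED / 3600
    print(f"One cartridge: {hours_per_cartridge:.1f} hours")   # about 5.6

    # Suppose a studio archive of 2 petabytes (a made-up but plausible figure).
    cartridges = 2000 * TB / CARTRIDGE
    days = cartridges * hours_per_cartridge / 24
    print(f"{cartridges:.0f} cartridges, about {days:.0f} days on one drive")

Even under those generous assumptions, one drive grinds away for the better part of three months, and that is before verification passes, handling, and the price of the new cartridges themselves.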

According to the Spectrum article, Warner Brothers Studios has turned over the job of preserving its films to specialist film archivists at the University of Southern California, which already had a well-funded operation to preserve video interviews with Holocaust survivors.  But USC faces the same digital-obsolescence issues the studios are dealing with, and one USC archivist calls LTO tape "archive heroin"—a thrill compared to the old analog archive methods, but an expensive habit after a while.

And that gets us to a more fundamental question:  given limited resources, what should each generation preserve, in terms of intellectual output, for the next one?  And how should preservation happen?

For most of recorded history, preservation of old documents was left mostly to chance.  Now and then a forward-looking monarch would establish a library, such as the famous one in Alexandria founded by Ptolemy I Soter, the successor of Alexander the Great, about 300 B. C.  It held anywhere from 40,000 to 400,000 scrolls, and lasted until the Romans conquered Egypt around 30 B. C., when it suffered the first of a series of fires that destroyed most of its contents.

One can argue that the entire course of Western history would be different if all the works of the Greek philosopher Aristotle (384 B. C. - 322 B. C.) had been lost.  The way we came to possess what works of his we have is hair-raising.  After Aristotle died, Theophrastus, his successor at the Lyceum, the school where Aristotle taught, inherited a large set of what we would today call lecture notes.  After Theophrastus died, he left them to Neleus of Scepsis, who took them from Athens, where the Lyceum was, back home to Scepsis, and stuck them in his cellar.  Then he died.  Evidently Greek families held on to real estate back then, and it's a good thing too, because it wasn't until about 100 B. C., more than two centuries after Aristotle's passing, that Neleus's descendants had a garage sale or something, and a fellow named Apellicon of Teos found the manuscripts and bought them.  He took them back to Athens, where the conquering Romans confiscated his library in 86 B. C.  Finally, some Roman philosophers realized what they had in Aristotle's works and started making copies of them around 60 B. C.

I won't even go into how most of Aristotle's works were lost again to everyone except Arabic scholars up to about 1200 A. D., but we've had enough ancient history for one blog.  The point is that historic preservation was left largely to chance until people began to realize the value of the past to the present in an organized way. 

While the movie industry deserves credit for laying out lots of money to preserve chunks of our visual cultural history, one must admit that its interests are mostly financial.  Once the people who saw a movie in their twenties die out, the only folks interested in such films are the occasional oddball historian and fans of specialty outlets such as the Turner Classic Movies channel.

The real problem with digital archives is not so much the fact that the technology advances so fast, although that could be alleviated.  It's the question that often has no answer until it's too late:  what is worth preserving?

If you're a well-heeled library like the one at Harvard University, the answer is simple:  everything you can get your hands on.  But most places are not that well off, so it's a judgment call as to what to toss and what to keep, using the always-limited resources at hand.

Despite the best intentions of well-funded film archivists, my suspicion is that a few centuries hence, we will find that many of the works of most importance to the future, whatever they are, were preserved not on purpose, but by hair-raising combinations of fortunate accidents like the ones that brought us the works of Aristotle.  And if I'm wrong, well, chances are this blog won't be one of those things that are preserved.  So nobody will know.

Sources:  The article "The Lost Picture Show:  Hollywood Archivists Can't Outpace Obsolescence" by Marty Perlmutter appeared in the May 2017 issue of IEEE Spectrum and online at http://spectrum.ieee.org/computing/it/the-lost-picture-show-hollywood-archivists-cant-outpace-obsolescence?.  The story of how Aristotle's works came down to us is reported independently by at least two ancient sources, and so is probably pretty close to the truth, according to the Wikipedia article on Aristotle.  I also referred to Wikipedia articles on the Library of Alexandria and the Ptolemaic dynasty. 

Monday, May 01, 2017

New Cars Ain't What They Used to Be


A friend of ours whose age is somewhere north of seventy recently bought a new pickup truck.  Soon afterwards, in text messages she started calling herself "Keyfob."  When we asked why, she said, "Well, that's what my truck calls me.  When I get out of it, it says, 'Keyfob has left the vehicle.'"

She has a new truck because she totaled her previous truck in a collision that she survived largely because of safety features that newer models have.  So no one should think I'm opposed to innovative technology in the automotive industry in general, especially when it contributes to safety.  But as Chicago Tribune reporter Robert Duffer recently pointed out, some of the innovations that carmakers have inflicted on new-car buyers recently can be annoying, confusing, or downright dangerous.

Duffer cites a J. D. Power survey of new-car-owner complaints that showed the category broadly described as "infotainment" was responsible for more complaints than anything else.  This includes things like touch screens, voice-activated commands, and touch-sensitive controls for radios and music players.  It turns out that by 2018, safety rules will mandate that every new car have a backup camera, and consequently a display screen will have to be somewhere in the driver's view.  Carmakers eager to get a competitive advantage are not going to leave such an opportunity alone, and you can expect they will pile more and more features into that screen in addition to simply displaying the backup camera output. 

Some of the problems with new cars stem from the fact that they are almost completely "fly-by-wire," in the sense that many driver controls—accelerator, gearshift, and so on—don't do anything mechanical directly.  Instead, they feed electronic sensors, and the car's CPU turns those sensor readings into commands; much of the instrumentation the driver reads is likewise computer-generated.  Airline pilots, with their sophisticated and recurring training, managed the transition from mechanical airplane controls to fly-by-wire technology pretty well, but there were some glitches along the way even in that highly specialized realm.
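
To make the idea concrete, here is a toy sketch in Python (my own illustration, not any automaker's code) of what it means for the pedal to be just a sensor:

    def read_pedal_sensor():
        """Pedal position as a fraction 0.0-1.0 (stubbed for illustration)."""
        return 0.42

    def command_throttle(pedal):
        # Plausibility check: an out-of-range reading means a faulty sensor,
        # so fail safe by closing the throttle.
        if not 0.0 <= pedal <= 1.0:
            return 0.0
        # Map pedal position to throttle opening; real controllers add
        # filtering, torque management, traction control, and much more.
        return pedal ** 1.5   # a gentle tip-in curve, purely illustrative

    print(f"Throttle command: {command_throttle(read_pedal_sensor()):.2f}")

The point is that nothing physical connects your foot to the engine anymore; a software decision sits in between, which is exactly where both the safety features and the confusion come from.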

Duffer provides evidence that when you take the average driver, whose total training may consist of a few sketchy lessons under the reluctant tutelage of a parent decades ago, and plop him or her into a cockpit with literally dozens of new control surfaces, menus, options, and ways of doing things that automakers used to do basically the same way for decades but are now completely different, you're going to have problems.

Perhaps the most striking issue is the way some manufacturers have misused their newfound freedom to make the gearshift lever absolutely any way they want.  Duffer reminds us that back in the days of the column-mounted automatic gearshift lever, the sequence "PRNDL" for "park-reverse-neutral-drive-low" was pretty standard.  Anybody back then could get into any car and at least know how to shift it.  But BMW and Fiat-Chrysler both went on the market in the last few years with gearshifts that defaulted to neutral, so the driver could turn off the engine and get out with the vehicle still in neutral.

For drivers who had developed the bad but understandable habit of relying on a car's transmission parking lock to keep the car from rolling, rather than setting the parking brake, this new feature was an accident waiting to happen.  And it did happen, to a number of people, the most famous of whom was the Star Trek actor Anton Yelchin, who was crushed to death when his Jeep Grand Cherokee rolled and pinned him against a brick column.  Most of those cars have now been recalled to fix this issue, which never should have shown up in the first place.

With freedom comes responsibility, and the new freedom that automakers enjoy to reinvent the driving experience comes with a responsibility to make sure that the average driver is not inconvenienced or worse by innovations that look attractive at first, but turn out to be annoying or dangerous. 

A lesson can be drawn from the early days of automobiles, prior to 1925 or so, when there were literally dozens of carmakers vying for what promised to be a huge and growing market.  Henry Ford's Model T, produced in some form from 1908 all the way to 1927, is not a machine your average driver today could get going without some lessons.  Even after an electric starter was added in 1919, the operator had to manipulate two steering-column-mounted levers (one was the throttle and the other the spark-timing advance) and manage three foot pedals, two of which dealt with a mysterious planetary transmission that was part manual and part automatic.  By the mid-1920s, however, the accelerator had moved to the floor and the brake and clutch pedal positions had stabilized in most newer makes, and there the matter stood until automatic transmissions came along.

Then the question arose of where to put the automatic transmission controls.  It started out as a lever on the steering column, but even as early as the 1950s designers started experimenting.  The ill-fated Edsel, for example, had a series of push buttons in the hub of the steering wheel to control the transmission, which probably led to problems like putting the car into reverse on the freeway when all you wanted to do was honk the horn.  Eventually, with the advent of front bucket seats, the between-the-seats gearshift lever showed up, but even that standard has been tinkered with, to the endangerment of the public, as the story of the Star Trek star shows.

Maybe it's too much to hope for, but a movement among automakers to standardize on a few basic features, placed in the same spot and working the same way in every new car, would be welcome, at least to drivers who are no longer young enough to learn a completely different operating system each time they buy a new car.  At the very least, the car companies should view all software and hardware innovations with a mind to safety first, lest we have more potentially fatal problems such as the default-to-neutral gearshift.

As for me, I'm going to hang on to my old vehicles till the wheels fall off, or maybe just before.

Sources:  Robert Duffer's article entitled "Five worst new car features reinvent the wheel for no reason" appeared on the Chicago Tribune website on Apr. 17, 2017 at http://www.chicagotribune.com/classified/automotive/sc-worst-new-car-features-autocover-0413-story.html.  I also referred to the Wikipedia article on the Model T.