Monday, August 26, 2019

This Business of Engineering


Early Sunday morning, Aug. 5, 1888, a 39-year-old woman named Bertha Benz set off for her mother's house in Pforzheim, some sixty-six miles (106 km) from Mannheim, Germany.  She lived in Mannheim with her husband Karl and their two teenage sons, and she took the boys along for the ride.  Visiting her mother was not unusual.  But the way she planned to get there was. 

For the previous several years, Karl had been developing what he called a Patent-Motorwagen—what today we would call an automobile.  Its one-cylinder engine burned an obscure solvent called ligroin, obtainable only at pharmacies.  It had wooden brakes and only two gears, low and high.  Bertha came from a wealthy family, and she had put a considerable amount of money into her husband's invention.  But like many inventors, Karl was content to make incremental improvements to his machine and treated it gingerly, never taking it more than a few miles from home on short test drives.  Besides, there were laws regulating such machines, and to drive it a long distance legally, he would have had to get permission from various local authorities along the way.  It was much easier just to tinker with it in his shop and drive it only around town.

But Bertha had had enough of this.  She knew Karl's invention was good, but people had to know what it was capable of.  Without telling her husband, she and her two boys left Mannheim on the rutted wagon roads leading to Pforzheim.  On steep hills, the boys had to get out and push the underpowered vehicle uphill.  At one point the fuel line clogged, and Bertha unclogged it with a hatpin.  A chain broke, and she managed to find a blacksmith willing to work on a Sunday to fix it.  The brakes proved inadequate, so she stopped at a cobbler's shop and had him cut leather strips to fit onto the brakes, thus inventing the world's first brake pads.  A little after sunset the same day, she and her boys arrived in Pforzheim, no doubt to the great surprise of her mother.  She telegraphed her husband news of her successful trip, and by the time she drove back several days later, reports of her exploit were in newspapers all over the country.  Which was exactly what she wanted.  Benz's invention, and Bertha's exploit, were foundational steps in the worldwide automotive industry.

Somehow I had gotten to my present advanced age without learning about Bertha Benz's first-ever auto trip.  But this was just the most interesting of many such anecdotes about engineering and business that I encountered in a new book by Matt Loos:  The Business of Engineering. 

Loos is a practicing civil engineer in Fort Worth who realized, after a few years in the working world, that many of the most important skills he was using every day had little or nothing to do with what he had learned in engineering school. 

This is not to disparage engineering education (which is how I make my living), but simply reflects the fact that the technical content of engineering is so voluminous these days that there isn't much room in a nominal four-year curriculum for what are (perhaps unfortunately) called "softer" subjects such as management techniques, ethical issues, and coping with the dynamics of rapidly changing technical fields. 

A newly-graduated engineer could do a lot worse than to pick up The Business of Engineering and read it to find out what the late radio commentator Paul Harvey called "the rest of the story."

If a person is going to claim to be able to use specialized technical knowledge to do something of value, they must have mastered that technical knowledge.  That fundamental requirement is the reason behind the extensive and challenging technical content of engineering undergraduate courses.  But as Loos points out in numerous ways—through anecdotes like Bertha Benz's story, through recent statistics and facts drawn from a variety of technical fields, and through his own personal experience—knowing your technical stuff will not by itself make you a successful engineer.  And even the definition of success depends on what sort of business you are in and how your own personal goals fit with the direction the industry is moving. 

I have to say that if I had read and taken to heart what Loos says in his book when I was, say, twenty-four, my career might have been very different.  At the time, I had a very simplistic and immature notion that all an engineer had to do was to come up with brilliant technical stuff, and the world would beat a path to his door.  But in thinking that, I was acting like Karl Benz, happily tinkering away in his shop but afraid to try his pet invention out in the real world.  The lesson I needed to learn was that if nobody but you cares about what you're doing, nothing much good will come of it.  Working engineers need to be engaged in the world around them, not only on a purely technical level, but also at the levels of economics, social relations, and ethics, to mention only a few.

This is Loos's first book, and as with most things, one's first efforts occasionally lack the polish that long experience can give.  But it is still highly readable, even if you don't read it for anything but the stories.  One of the strengths of the book is that Loos is realistic about how an engineer's personal habits can make the difference between success and something considerably below success:  things like attention to detail, the ability to organize one's time, problem-solving skills, and so on.  Now and then I come across a student who has more than adequate brain power to do engineering problems.  But when he confronts a problem he's not familiar with, he will simply sit there and appear to wait for inspiration.  And if inspiration doesn't come, well, it's just too bad.  The better way is to follow the advice of G. K. Chesterton (this isn't in Loos's book), who said that anything worth doing is worth doing badly.  Even trying something that doesn't work will probably tell you something about what will work, and it's better than just passively waiting for something to happen.  Engineers make things happen—not always the best thing, but something that moves the process along.

Loos's book is now available on Amazon, and I recommend it especially to graduating engineers who can benefit from the experience and the stories that The Business of Engineering collects.

Sources:  The Business of Engineering by Matthew K. Loos, P. E. is available on Amazon at
https://www.amazon.com/gp/product/0998998788?pf_rd_p=183f5289-9dc0-416f-942e-e8f213ef368b&pf_rd_r=2V03WN39CX0ASA8E4VGJ.  Mr. Loos kindly provided me with a free review copy.  I enhanced the Bertha Benz story with some details from the Wikipedia page on her.

Monday, August 19, 2019

Should Social Media Be Regulated?


Last month, the youngest U. S. Senator, Josh Hawley, a freshman Republican from Missouri, filed a bill called the Social Media Addiction Reduction Technology (SMART) Act.  The purpose of the act is to do something about the harmful effects of addiction to social media.

What would the bill do?  I haven't read it, but according to media reports it would change the ways companies like Facebook, Twitter, and Google deal with their customers.  The open secret of social media is that they are designed quite consciously and intentionally to be habit-forming.  So-called "free" media make their money by selling advertising, and advertising is worthless unless someone looks at it.  So their bottom line depends on how firmly and how long they glue your eyeballs to their sites.  And they have scads of specialists—psychologists, media experts, and software engineers—whose full-time job is to squeeze an extra minute or two of attention from you every day, regardless of whatever else is going on in your life. 

Writing on the website of the religion and public life journal First Things, Jon Schweppe says the SMART Act may not be the one-stop cure-all for our social media problems, but it's a step in the right direction.  It would prohibit certain practices that are currently commonplace, apparently including one that has always reminded me of what life might be like in Hell:  the infinite webpage.

It used to be that once people figured out how to make a web page scroll, a page was still only so long.  You could always get to the bottom of it, where you might find useful things like who wrote it or other masthead-and-boilerplate information.  Well, that doesn't always happen anymore.  The infinite webpage pits the pitiful finite mortal human against the practically unlimited resources of the machine, which can serve up as much eye candy as you want.  If you keep scrolling, it will keep showing you new stuff. 
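
For the curious, the mechanics behind an "infinite" page are simple enough to sketch.  Here is a toy illustration in Python (my own sketch with made-up function names, not any real site's code) of the basic bargain:  the server can always generate another batch of items whenever your browser asks for more.

    # Toy sketch of why the page never ends:  the server simply generates
    # another batch of filler items every time the browser asks for more.
    import itertools

    def endless_feed(batch_size: int = 10):
        """Yield batch after batch of filler posts, forever."""
        for start in itertools.count(0, batch_size):
            yield [f"post #{n}" for n in range(start, start + batch_size)]

    if __name__ == "__main__":
        feed = endless_feed()
        for _ in range(3):                # pretend the user scrolled three times
            print(next(feed)[:2], "...")  # there is always another batch waiting

Real sites dress this up with recommendation algorithms, but the underlying fact is the same:  there is no bottom to reach.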

This particular feature reminds me of a passage from C. S. Lewis's The Lion, the Witch, and the Wardrobe featuring the candy called Turkish Delight.  The wicked witch who was Queen of Narnia offered the boy Edmund his favorite candy to convince him to betray his brother and sisters.  The candy she offered him was enchanted so that whoever ate it always wanted more, and "would even, if they were allowed, go on eating it till they killed themselves."  No matter how much time you waste on an infinite website, there's always more.

The SMART bill would also give users a realistic option to limit their own use of social media voluntarily with daily timers, would prohibit "badge" systems (evidently a kind of special-privilege feature that rewards heavy users and encourages them to use the site even more), and would prohibit or modify other addictive features. 

The Federalist's John Thomas sees the SMART bill as the first step in what may be a turning point in the history of social media.  He likens it to the Parisian reaction to brightly-colored advertising posters enabled by the then-new lithography process in the 1860s.  Pretty soon, a good percentage of all available vertical flat surfaces were covered with posters, and the town fathers decided to regulate how and where posters could be displayed. 

This may be the point at which the U. S. citizenry stops merely wringing its hands and saying nothing can be done in the face of rising teen depression and other ill effects of social media, and starts to take action.  As Thomas points out, though, there are few grass-roots organizations taking up the control-social-media banner. 

This may be because the dangers social media pose for mental health are insidious and gradual rather than abrupt and catastrophic.  Suppose that every person had an intrinsic social-media limit:  say after viewing X hours of social media (and X would be different for each person), your brain would literally explode and you'd die.  Well, you can bet that after two or three of these incidents, governments would come down on Facebook, Google, and company like a ton of bricks with all sorts of restrictions, up to and including an outright ban.

But nobody's brain literally explodes from doing too much Facebook.  The negative consequences of social-media use are much less obvious than that, but are nonetheless real.  Even the most tragic cases of teen suicides that result from peer persecution over social media can be blamed not just on the media, but on the cruelty of other teens.  Nevertheless, the nominal anonymity and ease of use that social media offer can turn what might be fairly well-behaved peers in person into abominable monsters on Facebook. 

Some writers oppose the SMART Act and similar legislation on the free-market principle that government is more likely to make things worse with legislation than otherwise.  While that can happen, it is foolish to take the hyper-libertarian position that if a good or service is bad, people just shouldn't use it.  Back when ordinary glass was used for automobile windshields, it would shatter into long, razor-sharp shards that decapitated numerous drivers, and Congress invited Henry Ford to testify about a proposed law that would require the use of the more expensive safety glass in windshields.  Reportedly (and this is from memory), Ford said, "I'm in the business of making cars, I'm not in the business of saving lives."

When Mark Zuckerberg testified before Congress not too long ago, he was self-controlled enough not to say anything that harsh.  But if the day has at last arrived when our elected officials are finally going to do something about the harmful effects of social media, one of two things (or perhaps a combination) is going to happen.  Either the social-media companies will have to get ahead of the proposed legislation and enact real, quantifiable reforms of their own and prove that they work, or they will have to change their ways in accordance with regulatory laws that they brought upon themselves. 

My own hope is that the companies will figure out a transparent and effective way to self-regulate.  But the choice is theirs, and if they brush off the SMART Act and think they have the raw power to squash such regulation, they may be in for a painful surprise.

Sources:  Jon Schweppe's article "Big Addiction" appeared on the First Things website on Aug. 13, 2019 at https://www.firstthings.com/web-exclusives/2019/08/big-addiction.  John Thomas's article "Hawley's SMART Act Is the Beginning Of the Revolt Against Big Tech" is on the Federalist website at https://thefederalist.com/2019/08/13/hawleys-smart-act-beginning-revolt-big-tech/. 

Monday, August 12, 2019

USB-Crummy


About a year ago, my old Mac laptop died and I had to buy a new one.  I was pleased overall with my new machine, as far as the software and operating characteristics went.  And at first I wasn't too concerned that the only physical ports it had were one 3.5-mm phone jack for headphones and four USB-C jacks, two on each side. 

I wasn't familiar with USB-C, although like anybody else who's dealt with computers, I knew about USB (Universal Serial Bus) in general.  I had nothing in my possession that would work with a USB-C connector—no printer cables, no external hard drive cables, no headphones.  So I went to Best Buy and bought a docking station that promised to solve all my interface problems.

It has a single USB-C plug on a short cable that goes to a flat aluminum box with nearly every kind of jack you can think of:  an old-fashioned VGA (Video Graphics Array) connector for your ancient video projector, large and small HDMI (High-Definition Multimedia Interface) connectors that work with our medium-screen TV, an Ethernet port for hard-wired networking, an SD/MMC slot for flash-memory cards, a micro-SD slot for the teeny cards, three USB-2 (regular size) jacks, and another USB-C jack in case you need it.  And for most of a year after I bought the new laptop, I used these ports on the docking station whenever I wanted to connect anything to the laptop other than the USB-C power supply that came with it.

Then a month or so ago, I began to have problems.  First it was an issue with downloading data to a flash drive.  I would be downloading something and then all of a sudden I'd get a message on my Mac criticizing me for removing the flash drive before ejecting it.  Only I hadn't touched a thing.  I could take it out and put it back in and it would start working—sometimes.  And sometimes it would drop out again without warning.

That made me wonder if something was wrong with the docking station.  So I bought a simple adapter cable with a USB-C connector on one end and a regular-size USB-2 jack on the other, and used another adapter to get to a flash drive.  That worked for a while, but then I began to have the same problem. 

The worst thing was when I would do a backup to an external hard drive.  That takes a while, and in the middle of the transfer I'd get an error saying I'd removed the drive without ejecting it.  Only I hadn't.

Finally, I went online to see if other people were having these problems.  Turns out they were.  And amid all this wonderful stuff that USB-C is supposedly capable of doing (it has six modes of operation, covering everything from external display support to power delivery and 20-gigabit-per-second data transfer), there is a smelly fly in the ointment:  mechanical unreliability.

If you've never looked closely at a USB-C connector, get a magnifying glass and do so.  Inside that tiny rigid plug there are twenty-four pins, twelve on each side.  And apparently, for high-speed data transfer to work, a good number of them (either four or eight, as best I can tell from online information) have to make perfect contact with their mating members in the socket you plug it into.  Otherwise it looks to the hardware like you've jerked out the connector, and you get an error.

From various online forums I read, it appears that the mechanical design of the consumer-grade USB-C jack is unreliable.  I saw tales of people with Macs like mine who had to take theirs in to get the USB-C jacks replaced because they simply wouldn't work for high-speed data transfer anymore.  I also use them (of necessity) for low-speed stuff like connecting to my keyboard and my printer, and I've never had any problems with those functions, because evidently they use different contacts than the high-speed ones.  But even if you get new jacks, they're just as unreliable as the old ones, and you're likely to have the same problem show up again in a few months.

As one online forum writer commented, every connector engineer knows that other things being equal, the more contacts you put in a connector, the less reliable it becomes.  Squeezing twenty-four pins into a tiny USB-C connector and expecting all of them to work all the time was one of the dumbest standards decisions I've come across in a long time.
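
A back-of-the-envelope calculation (my own rough figuring, not anything from the USB specification) shows why.  If each contact independently makes good connection with some probability p, then all n critical contacts are good only with probability p multiplied by itself n times, and that number drops quickly as n grows:

    # Rough illustration only:  assume each contact independently makes good
    # connection with probability p; then all n critical contacts are good
    # with probability p**n.

    def all_contacts_ok(p_single: float, n_contacts: int) -> float:
        """Probability that every one of n independent contacts is good."""
        return p_single ** n_contacts

    if __name__ == "__main__":
        p = 0.995  # assumed per-contact reliability, purely for illustration
        for n in (4, 8, 24):
            print(f"{n:2d} contacts: {all_contacts_ok(p, n):.3f}")
        # With p = 0.995, four contacts all work about 98% of the time,
        # eight about 96%, and all twenty-four only about 89%.

Even with very good individual contacts, requiring eight or twenty-four of them to work at once eats noticeably into the margin.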

The proverb that a chain is only as strong as its weakest link applies in spades to connectors, and I am now stuck with a system that has a known weak link:  the USB-C connectors, which are the only practical physical way I have to get data in and out of my Mac. 

Now that I know it's a mechanical problem, I can do things like reserving one of the four USB-C ports for nothing but the occasional backup, and tiptoeing around any time a long data transfer is happening, for fear I will set up vibrations that break one of the eight vital connections and ruin the whole process.  This is not progress.

It may be too much to hope for, but maybe whoever devises the next standard after USB-C will come up with a fail-safe approach, or at least one as reliable as the larger USB-2 standard was.  The old Bell System, which for much of the twentieth century relied on electromechanical relays for all of its network switching functions, found that the only way to make a relay reliable was to duplicate every one of its contacts, so that if a piece of dust got into one contact the other one would still work.  Whoever designed the USB-C connector evidently forgot that hard-earned piece of wisdom.  I hope the next standards committee working on whatever comes after USB-C will not forget, but it looks like we may have to wait a while before that happens.
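
A similarly rough calculation (again my own, purely illustrative) shows what the Bell System was buying with those duplicated contacts.  If a single contact is bad with probability q, a redundant pair fails only when both are bad, which under independent failures happens with probability q squared:

    # Illustrative only:  the payoff of duplicating a contact.  If one contact
    # fails with probability q, two redundant contacts in parallel fail only
    # when both do (assuming independent failures), i.e. with probability q**2.

    def pair_failure(q_single: float) -> float:
        """Failure probability of a duplicated (parallel) contact pair."""
        return q_single ** 2

    if __name__ == "__main__":
        q = 0.01  # assume a 1% chance that any single contact is bad
        print(f"single contact fails:  {q:.4f}")                # 0.0100
        print(f"duplicated pair fails: {pair_failure(q):.4f}")  # 0.0001

Duplication turns a one-in-a-hundred failure into a one-in-ten-thousand one, which is roughly the sort of margin a telephone network, or a data connector, needs.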

Sources:  I referred to the Wikipedia article on USB-C connectors and the website of the USB Implementers Forum (usb.org), as well as several online discussion boards about the unreliability of USB connectors. 

Monday, August 05, 2019

Should Engineers Get Their Fingernails Dirty?


Up the road from me in Austin, Texas, it turns out that Apple has been building the Mac Pro, a desktop model, in a Flextronics manufacturing plant since at least 2013.  But a recent news item in the Austin American-Statesman says that the company will soon shut down its Mac Pro manufacturing here in Texas and move it to China.  However, Apple is moving ahead with plans to open a new billion-dollar campus in Austin, which will increase the number of Apple workers there from around the current 7,000 to as many as 15,000.  But you can be pretty sure that most of those workers won't be holding soldering irons or screwdrivers—they'll be sitting at computers typing code.

The image of the engineer has changed radically over the years since the profession first attained significant public recognition, which came about in the late 1800s.  In 1900, artistic portrayals of an engineer would show a rugged, muscular man who might be holding an engineer's hammer in one hand (something halfway between a regular hammer and a sledgehammer), and adjusting a surveyor's transit as he squints through the eyepiece.  Most engineers prior to about 1920 were civil engineers, engaged in laying out railroad tracks or roads, installing water and sewer systems, and making sure bridges and buildings didn't fall down after they were built.

Then the era of scientific engineering came along.  The slide rule replaced the hammer, and now engineers were depicted in hiring ads in the 1950s as lab-coated intellectuals, wearing horn-rimmed glasses and looking through microscopes or fiddling with flasks of chemicals.  The habitat of the 1950s engineer was inside, not outside, but he (always a "he" back then in images of the time) was still engaged in working with exotic equipment or machinery for which special training and even clothing was required.

Then the computer came along, banishing the slide rule.  Companies quickly learned it was cheaper to let engineers try new designs in software models rather than actually building prototypes and finding out that most of them didn't work.  So as much of the engineering knowledge that formerly resided in engineers' brains moved into computer programs, the typical engineer wound up sitting at a desk in front of a computer monitor.  Sometimes she just writes code, and sometimes she works with drawings of actual stuff.  But whatever subject is displayed on the screen, that is often as close as the engineer gets to the actual thing that is built. 

The end product, whether it is a microchip, a car, or an airplane, is often made far away from where the designer sits, possibly in another country.  The people who actually get their hands dirty to make the products know no more about them than they have to in order to do their jobs right.  There is nothing intrinsically wrong with this—it is one more application of the great economist Adam Smith's principle that each person, organization, or nation should specialize in what they are best at, and swap their products with those who are best at other things.  If engineers are best at designing with software, why, that's what they should do, and not waste their expensive time on a workbench actually building things that lower-paid and less-educated technicians can build. 

I grew up and learned engineering at a time when math and models could get you only so far, and there were imponderable and incalculable factors that had to be worked out on the workbench.  In my specialization of RF engineering, that meant that any engineer worth his salt had to know how to solder and troubleshoot actual hardware, and we did.  And things got built—maybe not the absolutely most optimized designs, but good enough to go out the door and make money for the company (most of the time, anyway). 

But these days, things that had to be figured out on the workbench thirty-five years ago can be modeled in the much more sophisticated software that is available today.  And so almost every kind of engineer these days, whether chemical, electrical, mechanical, civil, or environmental, ends up spending most of their working time in front of a computer.  There are exceptions, of course, but it is in the interests of most firms to see that their engineers spend as little time as possible fiddling with hardware and as much time as possible doing what they are paid the big bucks to do. 

Engineering is an intensely practical business, and if most engineering firms succeed in satisfying their customers with the services of engineers who never touch hardware, I can't see anything to criticize in that.  The lingering suspicion I have that something is missing that may cause trouble down the road may be nothing more than an old guy's prejudice in favor of the way things used to be. 

In my own teaching, I try to make students deal with hardware when it's practical to do so.  This fall I will be teaching a course in analog design.  Software is available that lets you build the whole circuit on the computer screen and test it with software "scopes," getting results that are much more precise than anything you can do in the lab.  But after the students have done that, I will require them to go get real parts, read the values themselves, put them into a real prototype "breadboard" circuit, and show me that it really works. 
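
To give a flavor of the simulate-first half of that exercise, here is a minimal sketch in Python (my own illustration, not the course software, with made-up component values) of the kind of number a simulation hands you:  the step response of a simple RC low-pass filter, which a student can then go verify on a breadboard with a real oscilloscope.

    # Minimal simulate-before-you-build example:  step response of an RC
    # low-pass filter, computed by simple Euler integration of
    # dVc/dt = (Vin - Vc) / (R*C).

    R = 10e3      # 10 kilohm resistor (assumed values, illustrative only)
    C = 100e-9    # 100 nF capacitor
    V_IN = 5.0    # 5 V step input
    DT = 1e-5     # 10 microsecond time step

    def rc_step_response(t_end: float):
        """Return a list of (time, capacitor voltage) samples up to t_end."""
        vc, t, samples = 0.0, 0.0, []
        while t <= t_end:
            samples.append((t, vc))
            vc += (V_IN - vc) / (R * C) * DT
            t += DT
        return samples

    if __name__ == "__main__":
        trace = rc_step_response(5e-3)
        # After one time constant (R*C = 1 ms) the capacitor should sit near
        # 63% of 5 V, about 3.2 V -- a prediction the student can then check
        # against the real circuit on the bench.
        closest = min(trace, key=lambda s: abs(s[0] - R * C))
        print(f"Vc at t = RC: {closest[1]:.2f} V")

The simulation's answer comes out clean and precise; the breadboard is where students discover component tolerances, loose wires, and the other imponderables that never show up on the screen.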

Most students don't complain about this.  In fact, over the years I have had positive comments along the lines of, "I never knew what that oscilloscope was for until I had to use it in this class," and so on.  I'm sure some of them feel that shoving little wires into just the right holes is beneath them, but a little humbling is good for the soul.

Next spring, if all goes according to plan, I will teach the first course in power electronics ever taught on this campus.  Power electronics involves things like controlling giant 1,000-horsepower motors in steel mills.  Much as I would like to have a lab that used a 1,000-horsepower motor, all the labs in this course will use software.  It's much cheaper and safer to have a software 750-kW motor blow up than a real one.  And that's perhaps the way it should be. 

All the same, engineers should never forget that no matter how nice things look on paper or the computer screen, physical reality reigns.  And sometimes, it bites back.

Sources:  The report on Apple's moving production out of Austin appeared at https://www.statesman.com/news/20190628/report-apple-moving-mac-pro-production-out-of-austin.