Monday, July 27, 2020

Will Online Learning End College As We Know It?

This topic comes close to home, as I am a college professor preparing to teach most of my classes online during the coming fall semester, just as thousands of other college classes will be taught during the COVID-19 pandemic.  While I am not privy to my university's decisions about what tuition to charge, I tend to agree with a Forbes reporter named Stephen McBride that if students are asked to pay the usual tuition and fees for a vastly reduced online experience, many will feel shortchanged.  And with good reason.

As Denise Trauth, president of Texas State University, has emphasized in her messages to faculty and staff, the in-person part of education is an essential aspect of what it means to go to college.  And that is why she has tried to arrange to offer as many in-person classes as possible this fall, consistent with social distancing and limiting classroom occupancy to 50% of normal.  But such measures are temporary at best, and what McBride is saying is that the transition to online classes is going to expose conventional providers of higher education to extreme competition from startups that will deliver the same degree in the same way for much lower costs.  And that will cause the overpriced higher-education bubble to pop at last.

McBride points out that over the last few decades, the cost of a college education has vastly outstripped inflation.  One factor that he doesn't address directly is the widespread availability of federally backed student loans.  When the student-loan money spigot was turned on, colleges figured out ways to vacuum it up, and the result was higher costs.  The goal of making college available to more people was met, but at the price of tuition inflation. 

And middle-of-the-road state universities such as Texas State rely ever more heavily on tuition and fees as the states steadily reduce the fraction of university income that comes from taxes.  Data from 2017 show that only 28% of Texas State's income came from state sources, while a little more than half (52%) came from tuition and fees. 

If anything comes along to upset the applecart of delicately balanced enrollment and expenditures, universities are ill-equipped to do significant budget cutting.  Few other enterprises have a large fraction of their core employees (read:  tenured professors) holding what amount to lifetime-guaranteed jobs until they feel like retiring.  So to make a significant budget cut at a large university, administrators have found that the most effective means is to create an atmosphere of impending doom with threats of closing entire departments.  I know this from experience at my previous institution, which shall remain nameless here but was in Massachusetts, and wasn't Harvard or MIT.  In such a poisonous environment, many of the good, expensive people leave for better places, and the ones left behind are glad to keep their jobs and will take whatever cuts you give them.

Texas universities have already made the budget cuts recently requested by the state government, which is going through its own fiscal woes as tax revenue declines.  All of which is to say that despite precautions taken against the threat of declining enrollment, the ability of state schools to absorb a significant sudden drop is strictly limited, since funding formulas include enrollment and state support would decline precipitously if fewer students attended.

The kind of education McBride envisions would be all-online and much, much cheaper than a conventional college experience.  His back-of-the-envelope estimate (which conveniently ignores things like labs and accreditation requirements) is that you could deliver college for as little as $3000 a year per student.  I think that is low, but not very low.  If all you want to do is teach people online and you want to be a University of Walmart, I'm sure the thing can be done.  Tenure, research, football, buildings, traditions, commencements, and mascots would go out the window.  But people would still get educated, though mainly by computer, as it would still cost too much to hire qualified humans to interact in any meaningful way with the vast numbers of students each system would have to support to deliver costs that low.
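For what it's worth, McBride-style arithmetic is easy to sketch.  The figures below are purely my own illustrative assumptions (instructor pay, section size, per-student overhead), not numbers from his article:

```python
# Back-of-the-envelope cost of an online-only degree, per student per year.
# Every input is a hypothetical assumption for illustration only.

def cost_per_student(instructor_salary=80_000,    # assumed annual instructor pay
                     students_per_instructor=200, # huge online sections, mostly automated
                     platform_cost=500,           # assumed hosting/software cost per student
                     admin_overhead=1_000):       # assumed registrar/support cost per student
    teaching = instructor_salary / students_per_instructor
    return teaching + platform_cost + admin_overhead

print(round(cost_per_student()))  # 1900 -- in the ballpark of McBride's $3,000 figure
```

Add labs, accreditation, advising, or any meaningful human contact at all, and the number climbs quickly, which is why I suspect the $3,000 estimate is on the low side.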

In our rude introduction to Zoom and all the other technologies that kept universities running after last March, we instructors learned that online teaching is possible, and sometimes even effective.  By most reports, the students didn't like it compared to being in class.  But they recognized the crisis nature of the situation and cooperated as well as they could. 

If online competition similar to what McBride is talking about materializes in a significant way, he may be right that college will never come back, at least in the form we had up till last spring.  But throwing out the baby with the bathwater is not good for the baby, and I hope that young people trying to decide on their educational future will consider more than cost.

I think it's terrible that so many college students wind up with so much student-loan debt, even though that system has given me the job I have.  As a result of the student-loan windfall, colleges have built up a lot of administrative superstructure that is of questionable use, shall we say.  And maybe some competition from online-only outfits will produce some salutary trimming of what is truly not necessary.  But I think there is genuine value in the experience of living in a community dedicated at least nominally to learning, keg parties notwithstanding.  And it's only fair that people should pay something extra for that experience, but how much extra is an open question.

Every new technology brings with it the question, "Now that we can do it, should we do it?"  COVID-19 has given rise to the possibility of converting college massively to online-only at a much lower price.  I do not know how the public at large will respond if given that opportunity.  I suspect that McBride is right in that higher education is going to see more rapid changes in the next few years than we experienced perhaps in the last few decades.  But I hope at the end of it all, we end up with something that's better for students, better for universities, and better for the country as a whole.

Sources:   I thank my wife for bringing my attention to Stephen McBride's article "Why College Is Never Coming Back" at  I obtained my statistics on Texas State University's 2017 budget from a pie chart at

Monday, July 20, 2020

Twitter Hack Revelation: People Are Still Human

Last Wednesday, followers of the Twitter postings of famous people such as Joe Biden, Elon Musk, and Kim Kardashian all received some variant of the following message, which came from the Apple Twitter feed:  "We are giving back to our community. We support Bitcoin and we believe you should too!  All Bitcoin sent to our address below will be sent back to you doubled!"  This incident has brought to my mind a series of hoary epigrams, and the fact that enough people actually responded to this transparent scam to enrich the hackers by an estimated $110,000 reminds me of the first one:  There's a sucker born every minute. 

Twitter staff responded quickly, first by blocking the accounts on which the fraudulent tweets appeared, and then by briefly freezing the ability of all registered users to tweet anything.  (So for a few minutes on July 15, 2020, we had a Twitter-free world again, but not for long.)  Eventually, Twitter got things straightened out and life went back to what passes these days for normal.

How was this done?  Details are still scarce at this point, but apparently it began when the hackers mounted what Twitter calls a "coordinated social engineering attack" on the organization.  That's techspeak for a trick like the following:  a bunch of emails or other messages purporting to be from someone in authority and asking the victim to do something they normally wouldn't do.  I partly fell for something like this myself one Saturday, when I received an email allegedly from the dean of my college at the university, asking me to contact her.  I emailed back, and the hacker then said she was in need of some gift cards for a meeting, and would I please go buy some and email them to her?  Only then did I realize I was dealing with a scam.

So by some similar means, the hackers were able to access internal Twitter administrative tools.  In other words, they were in the driver's seat and they proceeded to push the pedal to the metal.  First, they located all the famous Twitter names they wished to hack.  (Republican politicians, strangely enough, were apparently immune from this attack, for reasons that remain to be determined—maybe the hackers didn't think anybody would believe Republicans would give away money.)  Then they changed the accounts' email addresses so the real account owners couldn't access their own accounts.  And then the hackers did something really stupid, which was to ask victims to send money to a Bitcoin account.

According to one authority at a law firm that specializes in cryptocurrency matters, U. S. law enforcement authorities can trace Bitcoin transactions pretty well, so the chances that the hackers will get away with their ill-gotten gains for good are not high.  On the other hand, Bitcoin and similar cryptocurrencies are well known for the shady and illegal transactions people use them for, so it's hard to say how easily the hackers can actually be caught.  Overall, though, people involved with Bitcoin thought the net fallout from this incident would be favorable for cryptocurrency, because as one spokesman said to a Slate reporter, "Can you imagine if an advertiser wanted to ask all of these people to post about their company in one fell swoop? It would be an impossible purchase; you couldn’t even buy that much media."  Which brings to mind the second hoary epigram:  There's no such thing as bad publicity.  That is to say, just getting your name or product before the public matters more than what causes the publicity in the first place, whether it reflects on you favorably or otherwise.

The next epigram I will bring to your attention sums up what this incident tells us about human nature:  Plus ça change, plus c'est la même chose. ("The more things change, the more they stay the same.")  While the technology in this incident may be new, the aspects of human nature it exploited are as old as humanity itself. 

The hackers, who are simply criminals with some tech savvy, used their knowledge of human nature to get into the Twitter controls in the first place.  No matter how many seminars on computer security you make employees sit through, if your organization is large enough and the hackers are clever enough, at least one person is likely to have a lapse of judgment when a hacker mimics an authority figure and asks the victim to do something that would otherwise be against their better judgment.  And one is sometimes all it takes.

And beyond that, the fact that enough Twitter users were gullible enough to send thousands of dollars' worth of Bitcoin to Joe Biden or Apple or whoever, without stopping to wonder why the object of their admiration would want them to send cash first before returning twice the amount—well, it's people like that who keep con artists in business.  And of course, the millions of followers each famous person or organization has increased the chances that the hackers would find those few very special folks who both had the money and couldn't resist the fear of missing out.

A story in Physics Today, of all places, confirms that even people who are brilliant in one department can nevertheless be duped like anybody else.  Late in life, Sir Isaac Newton was a well-off government official (he ran England's mint) whom others sought out for advice about financial investments.  In the spring of 1720, a government-chartered outfit called the South Sea Company (sort of like the British East India Company that profited from colonial trade, but less successful) began issuing stock.  Joint-stock companies were a new thing back then, and Newton first bought some South Sea shares, but then decided there was something fishy about the setup and sold his stock, albeit at a handsome profit.  The South Sea Company operators were basically running a Ponzi scheme, but as they were among the first to hit on the idea of paying off early investors, who had been promised high returns, with the money from sales to later investors, few people other than Newton smelled a rat. 

All through the summer of 1720, South Sea stock soared, and the psychological pressure of seeing other people apparently getting rich from their purchases proved too much for Newton, who turned around and put almost all his free cash into the stock again in June and July.  In August, the bubble began to burst, and by the end of September Newton had lost his proverbial shirt, along with everybody else who hadn't gotten out in time.  So even the most brilliant scientific mind of the eighteenth century was taken in by a stock scam.

That may not make anybody who sent a thousand bucks to Kim Kardashian in hopes of financial gain feel much better.  But it confirms the fact that human nature hasn't changed that much in three hundred years, and whether the means are goose-quill pens or Twitter accounts, this final epigram is still true:  If it looks too good to be true, it probably is.

Sources:  I referred to articles on the Twitter hack and scam that appeared in Slate at and the website at  I also referred to the Wikipedia article on Twitter.  Andrew Odlyzko's article "Isaac Newton and the perils of the financial South Sea" appeared in the July 2020 issue of Physics Today, pp. 30-36.

Monday, July 13, 2020

The COVID-19 Crisis: Not Enough Technology to Go Around?

In a modern industrialized society like the U. S., we tend to take certain things for granted.  One of these things is that if someone needs emergency medical care, that care will always be available.  The COVID-19 pandemic is calling that assumption into question.

For a time in late spring, many hospitals in New York State were overwhelmed by COVID-19 patients who needed ventilators to keep from dying.  Even with ventilators, many died anyway, and it took weeks for the healthcare system there to recover to the point that it could handle its normal emergency traffic along with the extraordinary COVID-19 patient load.  In Texas, where I'm writing this on July 12, we are currently being warned that if the COVID-19 infection rate continues to rise as it has in the last few weeks, we may be in a similar situation, with maxed-out hospitals and the need to set up emergency wards in convention centers. 

Broadly speaking, a modern healthcare system is a technology in the same sense that a postal system is a technology.  It involves machinery, to be sure, but it also involves complex human relationships, states of training, and command structures that are just as essential as MRI machines and ventilators.  It takes a huge amount of resources in money, time, and investments of lifetimes of training and practice to develop the capabilities represented in a modern hospital.  So it's not surprising that when demands are placed on it that it wasn't designed for, you run into problems.  But the problems you run into aren't just failures of equipment.  It's things like what happened to Michael Hickson at St. David's South Austin Medical Center in Texas.

Until three years ago, Mr. Hickson was a reasonably healthy husband and father of five children.  In 2017, he had a heart attack while driving his wife to work, and suffered permanent brain damage from lack of oxygen before he received emergency treatment.  The injury left him a quadriplegic and in need of continuous medical care, which he was receiving at an Austin nursing and rehabilitation center when he tested positive for COVID-19 on May 15.  He ended up in St. David's ICU on June 3, and on June 5 the hospital informed Mrs. Hickson that he wasn't doing well. 

That day at the hospital, she had a conversation with an ICU doctor regarding her husband's care.  The situation was complicated by the fact that she had temporarily lost medical power of attorney to a court-appointed agency called Family Eldercare.  Someone recorded this conversation, and it makes for chilling listening and reading (the YouTube version is captioned).

When Mrs. Hickson asks why her husband isn't receiving a medication that can alleviate symptoms of COVID-19, and why he isn't being considered for intubation, the doctor explains that her husband "doesn't meet certain criteria." 

The doctor explains that doing these things probably wouldn't change his quality of life and wouldn't change the outcome.  When she asks him why the hospital decided these things, the doctor replies, " 'Cause as of right now, his quality of life . . . he doesn't have much of one." 

Mrs. Hickson asks who gets to make the decision whether another person's quality of life is not good.  The doctor says it's definitely not him, but the answer to the question about whether more treatment would improve his quality of life was no.

She asks, "Being able to live isn't improving the quality of life?"  He counters with the picture of Mr. Hickson being intubated with a bunch of lines and tubes and living that way for more than two weeks, but Mrs. Hickson gets him to admit that he knows of three people who went through that ordeal and survived.  She tells him that her 90-year-old uncle with cancer got COVID-19 and survived. 

His response? "Well, I'm going to go with the data, I don't go with stories, because stories don't help me, OK?"  Toward the end of the conversation, he says, ". . . we are going to do what we feel is best for him along with the state and this is what we decided." 

The next day, Mr. Hickson was moved to hospice care.  According to Mrs. Hickson, there they "withdrew food, fluid, and any type of medical treatment" from him, and he died on June 11, despite his wife's attempts to regain medical power of attorney from the court-appointed agency.

There are at least two sides to this story, and in recounting this tragedy I am not saying that the Hicksons were completely in the right in all regards, nor that the hospital, its doctors, or Family Eldercare was completely in the wrong.  But clearly, the hospital was under pressure to allocate its limited resources to those who would benefit from them the most.  And it fell to the unhappy ICU doctor to explain to Mrs. Hickson that her quadriplegic, brain-damaged (and maybe I shouldn't mention this, but he was also African-American) husband was going to be left behind in the hospital's efforts to help others who had what it and the state determined were higher qualities of life.

It isn't often that conflicting philosophies clash in a way that gets crystallized in a conversation, but that happened when the doctor said, "I'm going to go with the data, I don't go with stories."  In going with the data, he declared his loyalty to the science of medicine and its supposed objective viewpoint that reduces society to statistics and optimized outcomes.  In refusing to go with stories, he rejected the world of subjectivity, in which each of us is the main character in our own mysterious story that comes from we know not where and ends—well, indications are that the Hicksons are Christians, so their conviction is that their stories end in the Beatific Vision of the face of God.

But Mrs. Hickson would have been willing to look into the face of her beloved husband for a little longer.  Unfortunately, the ICU doctor and the state had other ideas.  Mr. Hickson might have died even if he had received the best care St. David's could offer.  But the lesson for engineers in this sad tale is that the best designs at the lowest price mean nothing if the human systems built around medical technology fail those they are intended to help. 

Sources:  I read about this incident at the website of National Review at and  The recorded conversation between Mrs. Hickson, the ICU doctor, and a friend of Mrs. Hickson's can be heard at

Monday, July 06, 2020

Is The Computer World Getting Beyond Repair?

Louis Rossman runs the Rossman Group, a team of about a dozen computer repair people in Manhattan.  Recently he was interviewed on the topic of the "right to repair," a concept of intense interest both to technicians employed in the repair industry and to anyone whose computer or phone goes on the blink and who is not prepared to chuck it and buy a new one. 

In the interview, Rossman describes the many ways that companies like Apple make it hard for anyone outside of Apple (and often inside too) to repair their ubiquitous devices.  For starters, a blanket of proprietary exclusion conceals useful information such as schematic diagrams, part numbers and identification, and diagnostic software.  Rossman has developed back-channel connections with engineers who work for the companies whose products he tries to fix, and sometimes can get the information he needs that way.  But he says that repair organizations shouldn't have to resort to legal gray areas such as under-the-table schematics to fix things.

Another obstacle arises when companies intentionally make their designs hard to fix.  Rossman said that some firms go to the trouble of pairing certain hardware chips with the particular computer they are installed in.  The machine would run just as well without this feature, which from the repair viewpoint is a bug.  All it does is prevent anyone from fixing the machine if that chip breaks:  even putting in a new chip won't work, because the replacement isn't paired to the machine, and so it won't run.  Designs like this are aimed specifically at making the unit harder to repair.
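A toy sketch of the kind of pairing Rossman describes might look like the following.  The class names and the serial-number check are my own invention for illustration; real implementations do this in firmware, often with cryptographic handshakes rather than a simple comparison:

```python
# Illustrative model of hardware "pairing": the board remembers the serial
# number of the chip it shipped with and refuses to run with any other chip,
# even a fully functional replacement part.

class Chip:
    def __init__(self, serial):
        self.serial = serial

class LogicBoard:
    def __init__(self, paired_serial):
        self.paired_serial = paired_serial  # fixed at the factory

    def boot(self, chip):
        # The replacement chip may be electrically identical; the check
        # tests identity, not function.
        return chip.serial == self.paired_serial

board = LogicBoard(paired_serial="CHIP-0001")
print(board.boot(Chip("CHIP-0001")))  # True:  original part boots
print(board.boot(Chip("CHIP-0002")))  # False: identical replacement is refused
```

Nothing about the board's operation requires the check; removing it would change nothing except that repairs with donor parts would start working again.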

Rossman devotes a small amount of time to promoting legislation and awareness publicity concerning the right to repair, but he isn't optimistic that huge changes will occur.  In an era when large corporations have skewed the intellectual-property field steeply in their favor, it's hard for small, independent operators like Rossman to gain a hearing.  And Rossman himself doesn't want too much government interference, as he is personally inclined toward libertarianism. 

If consumers were simply more aware of what is being done to make the products they buy harder to fix, they might make better choices, and repairability might become a cultural value like sustainability.  And the connection between the two is closer than you might think.  For every computer (or phone, or car) that is fixed and stays in service, there is a new device that doesn't have to be made and sold, and the day is put off when that device ends up in a dump somewhere, as most electronics do despite increasing efforts at recycling. 

The simplistic view of this from the manufacturer's standpoint is that repair, especially by someone other than the company that made the product, simply cuts into sales of new units, and is to be discouraged as a pernicious legacy habit that consumers can eventually be trained to break.  But this is not the only way to go, as the automotive industry has abundantly demonstrated.  The average internal-combustion-engine car needs more maintenance and repair as it ages, and if you get attached to your old car (as I do), the inside of a car repair shop becomes very familiar. 

Perhaps if computers and phones had visible odometers like cars (and buried somewhere in the software, they probably do), it could become a point of pride to show how many hours your machine has racked up. 

But fixing things and keeping them for a longer time than absolutely necessary is to buck a trend that has been going mostly the other way since the beginning of the Industrial Revolution, when cheap machine-made products helped to bring millions up from poverty into the middle classes, both as employees of factories and as consumers of the products they made.  I'm not in favor of poverty, nor am I calling for a return to the bad old days when a watch was built mostly by hand, and a man expected it to last most of his life for the simple reason that most people couldn't afford to buy more than one watch in a lifetime. 

With Rossman, I'm in favor of reasonable accommodations for small repair firms and even individuals who are willing to crack open a computer or phone and see if they can fix it rather than simply getting a new one.  Such behavior is good stewardship of the built environment, which includes computers and phones.  Waste is not one of the seven deadly sins, but it's not a good thing either.  And when a fiendishly complex thing like a computer, with all its billions of coordinated parts that do amazing things, gets tossed in the trash simply because one of those parts falls down on its job, the world in general takes a hit that is not that noticeable, perhaps, but is significant nonetheless. 

Years ago I saw a TV episode of The Twilight Zone, I think it was, in which the writers wanted to show a man who lived a deplorably extravagant lifestyle.  So five minutes after the man picked up a new car, he pulled back into the dealership and said, "Hey, I want to trade this in on a new model."

When the salesman asked him why, the man said scornfully, "The ashtrays are full!"  (For those below a certain age, most cars used to have ashtrays, and most drivers used to smoke.)

Throwing away a complex piece of electronics simply because one part breaks that could be easily repaired (given enough information from the manufacturer) isn't much better than trading in a car because the windshield is dirty.  While there are plenty of other things to think about these days, perhaps some people will use the extra time on their hands to take up a new occupation such as electronics repair.  And maybe the rest of us could look into how easy or hard our next piece of gear is to fix, and let repairability be a factor in our choices.  It would make Louis Rossman's job easier if companies recognized the value of repairability and started doing something about it.  To let things go the way they're headed, toward a world where repairing stuff is unheard of, would be a waste in more ways than one. 

Sources:  The article "A Computer-Repair Expert Takes On Big Tech" appeared on the website of National Review on July 1, 2020 at  I most recently blogged on the right to repair on Nov. 6, 2017 at