Monday, January 26, 2026

Imagine There's No Libraries

  

The old John Lennon song "Imagine" asks the listener to imagine what life would be like without Heaven.  Libraries are not Heaven, but they perform a little-considered yet essential role in the cultures of many nations.  And some recent trends in library science and the way academic libraries are operated make me wonder if we're heading toward the extinction of libraries as we know them.

 

With the invention of writing around 3000 B. C. (which took place in several locations around the world), it first became possible to store thoughts in physical form.  This had a profound effect on the way governments and individuals operated.  For the first time, a disputed question could be referred not simply to the oldest and wisest person around, but to a written record that doesn't change. 

           

Until the invention of printing, books and libraries were expensive, and access was limited mostly to the ruling and most prosperous classes.  The establishment of universities around 1200 A. D. led to the founding of research libraries, where scholars could go to learn the best of what had been taught in the past.  Both the Scientific and Industrial Revolutions depended vitally on libraries and the written word to propagate new knowledge and build upon what was learned.  Western civilization itself is unthinkable without widespread access to standard works of philosophy, government, science, and the arts, which can be found, as a last resort, at a library.

 

Then came the digital revolution.  Some interesting statistics from a 2023 report of the Association of College and Research Libraries show that while research universities—those which grant doctoral degrees—still have an average of 71% of their collections in physical form, only about 1.6% of all circulation in university libraries represents physical objects.  All the rest is in digital form.  And for all other types of colleges and universities—four-year bachelor-granting, master's degree only, and two-year and community colleges—the vast majority of their holdings are digital.  The bottom line is, physical books and journals are largely a thing of the past as far as university libraries are concerned.  And the research libraries which still hold physical books and journals are moving them as fast as they can to offsite storage facilities, which are much cheaper to maintain than open stacks of real books.

 

Okay, you say, so what?  Libraries used to house big awkward scrolls, then hand-written tomes, and now it's mostly digital.  The information is still there, and now it can be accessed by a wider variety of people.  What's to complain about?

 

Two things come to mind.

 

The first is what I would call the curatorial function of libraries and librarians.  Even at the level of a small-town library, before the digital revolution its physical collection represented a hierarchy of importance, as assessed by educated professionals who knew what kinds of things a library should have in it for the particular place and time.  If a wacko walked in from the street trying to hand out brochures warning of an imminent zombie apocalypse, the librarians would listen to him, maybe, but after he left they'd dump his brochures in the trash.

 

No more.  Mr. Zombie Apocalypse has his own website, X account, Facebook page, and probably a Substack blog too.  And while the reference librarian isn't going to point you at his page, there's nothing to stop you from going there.  Even the little Guinness-Book-of-Records type questions that people dream up are no longer asked of librarians.  Everyone carries around a little oracle in their pocket, and all you have to do is ask the oracle, and you'll get some kind of answer. 

 

Librarians are aware that their former core function—to curate what they regarded as the most important information for a locality, whether a small town or a big university—is being steadily usurped by Google, Facebook, and now AI bots.  So they are busy cultivating other activities to host:  meetings, social events, makerspaces, and other doings that serve the communities for which they feel responsible.  So far, it has worked, but sometimes I wonder whether people will forget what libraries are for in the welter of auxiliary activities and services that librarians are coming up with.

 

The second and scarier downside to all this is one that may seem unlikely, but in certain places it has already become a reality, at least for a while.  In the recent turmoil in Iran set off by severe inflation, the government essentially shut down that nation's internet in late December.  As of this writing, the shutdown continues.

 

The ostensible reason was to quell political protests, but the effect has been much wider than simply to keep people from organizing anti-government demonstrations.  I'm sure that to the extent Iranian libraries have embraced the digital revolution and dispensed with their physical collections, they are currently regretting any progress they have made in that area. 

 

It can't happen here, you say, "here" being North America, Europe, or most other places with so-called enlightened governments.  Well, if you got in a time machine, went back to 1966, and told people there that in 2026 we'd have seen riots at the U. S. Capitol aimed at disrupting the results of a presidential election, unilateral action by the executive branch to impose tariffs and trade restrictions, to kidnap heads of other governments, and to intimidate leaders of the legislative and judicial branches with spurious lawsuits, they might not have believed you.

 

I'm not trying to criticize any particular administration, but to point out that an all-electronic library that relies on the internet has certain vulnerabilities that physical documents don't have.  This is not to say that we should reverse the trend toward digital archiving of information, though that is a sore subject on its own.  But in going the internet-based route by exchanging physical stuff kept in one place for digital stuff in the cloud, libraries have vaporized the main reason that people go to them.  If you can go straight to the cloud for information, guided by AI companions that are smarter than a hundred librarians, why bother with libraries?

 

That's a question librarians will have to struggle with in the future.  For their sake and for the sake of all the cultures that have benefited from libraries in the past, I hope they come up with some good answers.

 

Sources:  The 2023 report by the Association of College and Research Libraries "The State of U. S. Academic Libraries" is available at https://www.ala.org/sites/default/files/2024-10/2023%20State%20of%20Academic%20Libraries%20Report.pdf.  I also referred to the Wikipedia article "2026 Internet blackout in Iran."

Monday, January 19, 2026

When Is a Tube Amp Not a Tube Amp?

 

For those who may not be familiar with the term, a "tube amp" is short for a vacuum-tube amplifier.  Many people know that before the transistor was invented in 1947, the only way to do electronics was with vacuum tubes.  By the 1960s, most electronics used transistors or integrated circuits, but tubes are still used to this day in a few niche applications, notably the music industry.  For reasons that have to do as much with esthetics and culture as with technical factors, vacuum-tube guitar amplifiers play a role in rock music similar to what Stradivarius violins do in classical music.  And some people of a certain age who recall the days of the 1950s when a home stereo system required a big, heavy, hot vacuum-tube amplifier still think tubes sound better than transistors, or at least look better, with their filaments giving off a warm orange glow.

 

So there is a market today not only for vacuum-tube guitar amplifiers and vacuum-tube microphones, but for vacuum-tube home stereo amplifiers as well. 

 

Over a decade ago, for both professional and personal reasons, I built a 1950s-style vacuum-tube stereo amplifier from scratch, using plans I found online.  The parts alone cost on the order of $200 or more, and if I had charged for the fifteen or twenty hours of labor I put into it at my going consulting rate, anyone who wanted to buy it would have paid a price in the mid-four-figure range.  And specialty high-end custom manufacturers still sell what I'll call genuine vacuum-tube stereo amplifiers similar to the old McIntosh or Harman-Kardon models, but they do cost several thousand dollars and up.

 

Well, along comes a company I'll call Stonetown, advertising a "vacuum tube amplifier" for the low price of $160.  And a friend of mine I see frequently buys one, and installs it in his office, where we have a sack lunch together.  He hooks it up to an FM receiver and a couple of speakers, and we enjoy listening to the local classical station while we eat.  It's a cute little thing:  brushed-aluminum front panel, a little analog VU meter that jumps up and down with the music, and four miniature-style tubes on top inside vaguely Art-Deco, mostly symbolic metal ring shields that mainly keep you from knocking the tubes over.  And behind the tubes is a big square plastic box that I naturally assume encloses the necessary output transformers.  Tubes are inherently high-impedance devices (high voltages at low currents), and because speakers have low impedance (they need lots of current at a low voltage), you have to use big, heavy iron-and-copper output transformers in vacuum-tube amplifiers that drive speakers.
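
For anyone curious about the arithmetic behind that last statement, here is a minimal sketch in Python of the impedance-matching calculation, assuming a typical 5,000-ohm plate-to-plate load and an 8-ohm speaker (illustrative figures, not measurements from any particular amplifier):

    import math

    # Reflected impedance through a transformer:  Z_primary = (Np/Ns)**2 * Z_secondary,
    # so the turns ratio needed is the square root of the impedance ratio.
    def turns_ratio(plate_load_ohms, speaker_ohms):
        return math.sqrt(plate_load_ohms / speaker_ohms)

    # Illustrative figures:  a 5,000-ohm plate-to-plate load driving an 8-ohm speaker.
    print(turns_ratio(5000, 8))   # 25.0, i.e. roughly a 25:1 output transformer

That large turns ratio, carrying real audio power, is why a genuine tube amplifier needs substantial iron and copper in its output transformers.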

 

My friend has used this amplifier for the last couple of years.  It worked fine until last Tuesday, when the left channel went out.  We swapped speakers and input cables to make sure it was the amplifier and not the receiver or the speaker.  Then I said to him, "Look, we're electrical engineers.  Let's take the thing into the lab and see what we can do about troubleshooting it."

 

He was game, so he unhooked the thing and I picked it up to take it down the hall to his lab.  When I lifted it, it seemed rather light for a product that was supposed to have a lot of iron and copper in the transformers.  But as I set it upside-down on the workbench, I reminded myself that at least part of it was solid-state, because it had a USB input for digital audio, and I don't think anybody has ever built an all-vacuum-tube USB decoder.  So there had to be some solid-state electronics inside, but maybe just enough to create the analog signal for the tube-amp part. 

 

We were in for a surprise.

 

When we got the thing apart, here's what we saw.  There was a transformer, but it was inside the main chassis, and it was a quite ordinary inexpensive power transformer of the kind you would use for a solid-state amplifier, not a tube amp.  That was because nestled close to the front panel was a little circuit board about four by six inches with a complete solid-state stereo amp on it, all the way to the power transistors on heat sinks inside the chassis. 

 

What about the vacuum tubes on top?  Well, we could see a small cable that apparently did nothing but light up the tube filaments.  There were not enough connections visible for the designers to have used the tubes for anything more than window-dressing, or should I say amp-dressing.

 

And that big black alleged transformer on top?  Empty.  Just a hollow plastic box with air slots on back to let air circulate to the heat sink below it.  You could call it a chimney, maybe, but it wasn't enclosing anything but air. 

 

The whole thing reminded me of a scene from Mark Twain's The Adventures of Huckleberry Finn.  Huck travels for a while with a couple of fraudsters who put on a mildly obscene show for a hick town in Arkansas, which is nothing like what they advertised it to be.  Just before the audience rises up to pelt the performer with tomatoes, they shout, "Sold!"  Meaning roughly, "You sold us a bill of goods." 

 

My friend is capable of getting mad, but when we realized what the deal was, he just grinned ruefully and said, "Well, I should have known better than to think I was getting a real tube amp for $160, when the next cheapest one is $2500."  What made it worse was that an officemate had seen his and bought one for herself.  We went back and told her the story, and she just laughed.

 

So here's the ethical issue:  is Stonetown guilty of false advertising?  I went back and looked up some of their ads.  They call it a "tube amplifier," and say it's a "[c]lassic tube amplifier, modern features," and so on.  I suppose you could say that it's up to the designers as to how much they use the tubes in the amplifier.  But lying can be by implication as well as by flat statement. 

 

This experience can be summed up in the Latin motto Caveat emptor—let the buyer beware.  If it's too cheap to be believed, don't believe it.  And if we hadn't opened the thing up last week, we might still be enjoying the illusion of having a genuine vacuum-tube amplifier at a solid-state price.  As it is, we're a little sadder, but a lot wiser.    

 

Sources:  As I'd rather not be sued for libel, the real name of the company does not appear in this blog.  But from online comments I've found, they are not the only firm that engages in this type of thing in the so-called tube amp market. 

Monday, January 12, 2026

The Best Investment for Engineering—and Everything Else

 

Engineering is inseparable from economics.  Doing engineering without regard for costs or return on investment (ROI) is doomed to failure.  So here's a question to ponder:  what investment can produce a return over time of $60.04 for every dollar invested?  Is it something like Tesla, or SpaceX, or a particular stock? 

 

Not according to Ray Perryman, a consulting economist who runs the Perryman Group and writes a newspaper column.  The investment he has identified that returns sixty-for-one is surprising in its nature and simplicity. 

 

It's early childhood education, conventionally defined as any formal education that takes place from birth up to the third grade in elementary school. 

 

The returns break down roughly as follows.  Every dollar invested in early childhood education returns about $15 in future earnings of the child educated.  It returns almost as much ($14) in parental earnings, presumably because the parents benefit from having a better-educated child.  Improved earnings from future generations amount to $6.  The largest single return is reduced social costs from decreased need for crimefighting and social services:  $21.  And because the child will be healthier and live longer, those effects are quantified at about $2.50.
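
Just as a sanity check on the arithmetic, the rounded figures above can be added up; here is a minimal sketch in Python, where the category labels are my own paraphrases of Perryman's breakdown:

    # Rounded per-dollar returns quoted above; labels are paraphrases, not Perryman's exact terms.
    components = {
        "future earnings of the child": 15.0,
        "parental earnings": 14.0,
        "earnings of future generations": 6.0,
        "reduced crime and social-service costs": 21.0,
        "health and longevity benefits": 2.5,
    }
    print(sum(components.values()))   # 58.5, close to the $60.04 headline; the gap is rounding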

 

Economists are not used to treating people as anything other than utility-maximizing consumers.  But as both popular writer John C. Médaille and academic John D. Mueller have pointed out, modern economics subscribes to what Mueller calls the "stork theory" of human development.  In the economists' mathematical models, people simply appear fully grown and ready to consume.  The effort of parents in raising children, all educational work, and the process of procreation itself are simply regarded as consumption, the same as if the money were all blown on trips to Aruba.  Critically, modern economics ignores everything in the nature of gift:  the giving of parents to children, the giving of charity to the needy, and so on.

 

Fortunately, some economists are beginning to remedy these glaring omissions in the way economics models the human world.  And Perryman's analysis of the profound effects of early childhood education is a good step in this direction. 

 

Engineering education is as blind as modern economics in this regard.  We set up engineering schools and just assume that somehow, qualified students will show up and be ready to learn how to be engineers.  And so far, it's worked.  But declining birthrates worldwide and massive educational disruptions such as the ones COVID caused may show up as a severe shortage of qualified students, a shortage that is already causing problems for some institutions of higher education.

 

In his book Redeeming Economics, Mueller shows how these blind spots in modern economics have caused profound distortions in the way economic predictions and advice turn out.  Perryman's calculation of the stupendous ROI of early childhood education shows that, systemically speaking, we should be pouring more money into Head-Start-like programs, preschools, and grades 1 through 3 than we spend on highways, the Internet, or AI.  But especially in the U. S., education is an unglamorous and often politically contentious field which enjoys little in the way of prestige or a united opinion as to what should be done.  This is one aspect of the larger anti-child tone of current culture, but that's a discussion for another day.

 

While money can help improve early childhood education, what helps most of all is interested and engaged parents.  And here we go way beyond economics, at least the modern kind.  I was once invited to sit in on a focus-group panel convened by some STEM (science-technology-engineering-math) educators who had recently received a National Science Foundation grant to improve engineering education.  The details of what they were trying to do have faded from my memory.  But at one point, the panel leaders asked us members what we thought were important factors in making students ready for engineering education.

 

I said that a stable family environment of two biological different-sex parents was one of the most important factors in fostering children who could grow up to be engineers.  What I won't forget are the expressions of strained indifference that met my statement.  Family structure and stability were clearly beyond NSF's scope.  And this is not to say that children from broken homes or abusive parents can't make it as engineers.  They can, but it's harder.

 

Back in the Early Pleistocene when I was a child, my mother, an intelligent woman educated as a teacher who went on to get her master's degree, looked upon me as a kind of project, and challenged both me and herself to see how much I could learn about the alphabet and reading before I entered first grade.  I was too young to appreciate it at the time, but she blessed me (an act of giving) with what Perryman would value at many dollars of equivalent early childhood education.  And that blessing enabled me to go on to have a moderately successful career first in industry, and then in academia. 

She put aside her teaching career to raise her children until her youngest was in school, and then went back to teaching.  But the dollars she deferred while she was raising us paid off handsomely, as Perryman's analysis indicates.

 

If it ever happens, it will probably take a generation before economists as a group learn how to quantify giving as well as consuming.  But if they do, we will get a vastly different picture of the economy than what we see now, which is focused on stuff and organizations, and neglects the vast amount of giving among people that is fundamentally vital to any economy that isn't going to die out in one generation.  My metaphorical hat is off to Ray Perryman and his colleagues, who have made one small step in a much-needed direction of acknowledging that investment in people is not only morally right—it pays much better than almost any other kind. 

 

Sources:  Ray Perryman's column "The Economist:  Essential Early Education" appeared in the Jan. 2, 2026 edition of the San Marcos Daily Record.  The study on which the column was based can be found at https://www.perrymangroup.com/media/uploads/brief/perryman-essential-early-education-12-09-25.pdf.  For more on the blind spots of modern economics, see John C. Médaille's Toward a Truly Free Market (ISI Books, 2010) and John D. Mueller's Redeeming Economics (ISI Books, 2010). 


Monday, January 05, 2026

What Will Happen to the AI Bubble?

 

In an online commentary on The New Yorker website, writer Joshua Rothman tackles the question of the artificial-intelligence (AI) bubble.  On this first week of the new year, that seems like an appropriate question to ask.  It's pretty clear that AI is not going away.  Too many systems have embedded it in their productive processes for that to happen.  But Rothman raises two related questions that only time will answer for sure.

 

The first question is whether the money spent on AI is going to be worth it.  "Worth it" can mean a variety of things.  The most obvious (and frightening, to some) application of AI is direct replacement of workers:  think a roomful of draftsmen replaced by three engineers at computer workstations.  Accountants can most easily justify this way of leveraging AI by showing their managers how much the firm is saving in salaries, offset by whatever the AI system cost.  And assuming the tasks, whatever they were, are being done just as well by AI as they were by people before, the difference is the net savings AI can effect.
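
The bookkeeping behind that justification is a simple subtraction; here is a toy sketch with entirely made-up numbers, just to make the logic concrete:

    # Hypothetical figures, not drawn from any real firm.
    salaries_saved = 12 * 60_000     # a roomful of drafting positions, say
    ai_system_cost = 300_000         # annualized cost of licenses, hardware, and integration
    net_savings = salaries_saved - ai_system_cost
    print(net_savings)               # 420000, the figure the accountants show management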

 

But as Rothman points out, that approach is overly simplistic and doesn't reflect how AI is typically being used most effectively.  The most powerful use mode he has found in his own life is to use AI as a mind-augmenting tool.  He gives the example of helping his seven-year-old son write better code.  (I will overlook the implications of what the future will be like with a world full of people who were coding when they were seven.)  ChatGPT helped Rothman find several applications that his son was able to master and enjoyed as well.

 

And in general, the most fruitful way AI is used seems to be as a quasi-intelligent assistant to a human being, not a wholesale replacement.  The problem for businesses is that this sort of employee augmentation is much harder to account for.

 

He points out that if an employee uses AI to become better educated and more capable, that fact does not show up on the firm's balance sheet.  Yet it is a form of capital, capital being broadly defined as anything that enables a firm to be productive.  Rothman cites economist Theodore Schultz as the originator of the term "human capital," which captures the concept that an employee has value for his or her abilities, which can depreciate or be improved just as physical capital such as factory buildings or machinery can be. 

 

In a book I read recently called Redeeming Economics, John D. Mueller points out that modern economic theory simply cannot account for human capital in a logically consistent way.  This constitutes a basic flaw that is still in the process of being remedied.  The usual metrics of economics such as GNP (gross national product) treat investments in human capital such as education and training as consumption, the same as if you took your college tuition and blew it on a vacation to Aruba. 

 

So it's no surprise that businesses are unsure about how to justify spending billions on AI if they can't point to their balance sheets and say, "Here's how we made more money by buying all those AI resources." 

 

Something similar happened with CAD software.  When companies discovered how much more effective their designers were when they began using computer-aided design programs such as AutoCAD, and their competitors began underbidding them as a result, they had to get with the program and spend what it took to keep up. 

 

It's not clear that the results of widespread use of AI will be quite as obvious as that.  Some bubbles are just that:  illusory things that pop and leave no significant remnants.  Rothman cites a rather cynical writer named Cory Doctorow who believes the AI bubble will pop soon, leaving scrap data servers and unhappy accountants all over the world. 

 

But other bubbles turn out to be merely the youthful exuberance of an industry that was just getting established.  A good example of that kind of bubble was the automotive industry in the years 1910 to 1925.  There were literally dozens of automakers that popped up like mushrooms after a rain.  Most of them failed in a few years, but that didn't take us back to riding horses. 

 

Both Rothman and I suspect that the AI boom, or bubble, will be more like what happened with automobiles and CAD software.  The feverish pace of expansion will slow down, because anything that can't go on forever has to stop sometime.  But the long-term future of AI depends on the answer to Rothman's second question:  how good will AI get?

 

It's clearly not equal in any general sense to human intelligence today.  As Rothman puts it, AI assistants are "disembodied, forgetful, unnatural, and sometimes glaringly stupid."  These characteristics may simply be the defects that further research will iron out in ways that aren't obvious.

 

While I'm not in the market for a job right now, I nevertheless receive lists of possible jobs from my LinkedIn subscription.  A surprising number of them lately have been what I'd call "AI checking" jobs:  companies seeking a subject-matter expert to make queries of AI systems and critique the results.  Clearly, the purpose of that is to fix problems that show up so the mistakes aren't made the next time.

 

It's entirely possible that some negative world event will trigger an AI panic and rush to the exits.  But even if the short-term spending on AI does crash, we still have come a long way in the last five years, and that progress isn't going to go away.  As Rothman says, AI is a weird field to try and make forecasts for, because it involves human-like capabilities that are not well defined, let alone well understood.  My guess is that things will slow down, but it's unlikely that humanity will abandon AI altogether, unless some terrifying doomsday-sci-fi tragedy involving it scares us away.  And that hasn't happened so far.

 

Sources:  Joshua Rothman's article "Is A. I. Actually a Bubble?" appeared on Dec. 12 on The New Yorker website at https://www.newyorker.com/culture/open-questions/is-ai-actually-a-bubble?.  I also referred to John D. Mueller's Redeeming Economics (ISI Books, 2010), pp. 84-86.