Dynamist Blog

Shakespeare Grew Up in a Changing Economy, and People Felt Stress Then Too

This post went out to subscribers to my Substack newsletter on May 1, 2023. Check out all the posts, including ones from the archives not reposted here, and subscribe here.

Early popular music products in the form of 16th-century ballads, for sale for a penny or two. Images courtesy of the Huntington Library, whose collection includes about 600 16th- and 17th-century broadside ballads. To see a larger version, click here.

In his 1998 book The Economy of Obligation, historian Craig Muldrew examines the expansion of what he calls “marketing” in Elizabethan England. By this term he means the specialized commercial relations we now take for granted: “the way in which goods were bought and sold, and moved around by traders, wholesalers and other middlemen, and how credit was used to facilitate such exchange and create wealth generated through profit.” Over a relatively short period, England went from an economy dominated by household production and direct selling by local farmers and craftsmen to a more complex and extended commercial order.

Goods like coal, soap, iron, and textiles traveled around the country, becoming much more readily available to ordinary people outside London. In the Norfolk town of King’s Lynn, for example, the amount of soap imported from London more than tripled from 1566 (when Shakespeare was a two-year-old) to 1586. The economic expansion included entertainment as well. Alehouses boomed, as did consumption of beer, formerly a luxury. A popular music industry began, with 3 million to 4 million printed ballads sold for a penny or two each in the late 16th century. Foreign goods like sugar, currants, lemons, and peppercorns became more common. In 1581, Muldrew reports, “21,000 oranges and lemons reached Norwich in time for [London’s] Bartholomew Fair.”

In what has been called the “great rebuilding” of rural England, homes installed glass windows, plaster ceilings, and, most important, fireplaces with chimneys. Many added rooms, along with more and better furniture and kitchen tools. “Such improvements,” Muldrew observes, “mean there must have been a concurrent growth in the market for the services of carpenters, glaziers and bricklayers, and in the sale of material manufactures for such rebuilding and furnishing.”

Muldrew combs through probate inventories, counting the number of items listed to find out just how much stuff people owned. Here’s an example, showing the average numbers of goods (not including clothing) per household in Chesterfield. The numbers are small, but the percentage increase is significant.

Even poor people had more goods than their ancestors. Farmers exchanged straw for feather beds and wooden plates for pewter. William Barat, a mariner, died with goods worth a paltry £2 4s, while owing £6 9s. “He had just one hearth in the kitchen and most of his possessions consisted of old things of little value,” Muldrew writes, “but he still possessed a number of inexpensive ‘luxury’ goods including satin towels, hangings in the hall, pewter flower pots, valences for his bed and painted cloth.” The better-off Richard Rastryck, a Southampton porter,1 left £10 worth of household goods in 1575, suggesting a modest standard of living. Nevertheless, writes Muldrew, “there were over 225 items in a five-room house with two hearths. These included three flock beds, six feather pillows, four spice plates, a number of pieces of pewter, five silver rings and a number of painted cloths.” Many of these goods were old or cheap, but that’s the point. Thanks to the expansion of trade, transportation, and specialized production, for the first time, ordinary people had access to goods once considered luxuries.

The transformation increased the country’s prosperity but was also disruptive in ways that resonate today. Rents rose, and labor was no longer as short as it had been in the previous era. “After 1540 consumption expanded, but the previous security disappeared, as families had to compete for work,” writes Muldrew. “Some did well, while others failed.” More goods were available, raising the standard of living, but attaining it felt less certain.

16th-century market scene. The Elizabethans weren’t big on depicting ordinary people and activities in art, so I’m going with a Netherlandish drawing from the Metropolitan Museum of Art. Most of the paintings you’ll see online claiming to illustrate Elizabethan food or markets are actually from the Low Countries.

What struck me the most about Muldrew’s findings (and I’m still reading the book) is the way the gains were skewed. Some ordinary farmers and tradesmen benefited from the expansion of markets for their goods and services. Others did not. So within the same original social class, there were winners and losers.

The losers didn’t have to be objectively worse off to feel that way. Muldrew cites mason John Clark, whose estate included only 17 items worth a mere £1 6s. “But even he owned better quality goods than someone in a similar situation might have 30 years earlier, for he had a half feather, half flock bed and a painted hanging, but he was very much poorer than many of his neighbours.” Of such contrasts is economic nostalgia born.

Muldrew observes:

What was much more important than any absolute rise or decline in the living standards of poor families was the fact that many of their neighbours had become much wealthier over the course of the century, and it was in comparison to their improved standard of living that poorer households seemed worse off. Also, because wealthier households had bettered themselves, they consequently interpreted the lack of mobility, or downward mobility, of poorer households as competitive failure.

Similarly, if in the mid-20th century an American family was solidly middle class (say, the third quintile of income) and its children went to college, they’re likely now in the top two quintiles of income. If they didn’t go to college, they’re much less likely to have risen and may be relatively worse off than their parents. Either way, however, they have a lot more stuff, including goods and services that were unimaginable 50 years ago. But, like the Elizabethans, contemporary Americans who feel poorer than their former economic peers resent their relative decline, while the upwardly mobile mistake their rise for personal superiority. Both groups tend to forget what the recent past was really like. And commentators decry the terrible state of things.


What I'm Reading

This post went out to subscribers to my Substack newsletter on April 23, 2023. Check out all the posts, including ones from the archives not reposted here, and subscribe here.

"Reading" to my dolls circa age 2.

What I’ve been reading:

The Man Who Loved China by Simon Winchester (audio, listened to twice, should buy the Kindle edition for future reference): Beginning during World War II, Joseph Needham collected and read countless historic manuscripts and books from China and, by pulling together the information they contained into his massive, multivolume Science and Civilisation in China, fundamentally altered our understanding of the history of science and technology. Although it’s hardly the most important fruit of this work, I can’t imagine The Fabric of Civilization without it.

The Perfectionists: How Precision Engineers Created the Modern World by Simon Winchester (audio, then bought Kindle). I found the earlier chapters of this book more interesting than the later ones, which cover more recent technologies. But that’s probably more a reflection of my interests than of the book’s strengths.

Wild Swans by Jung Chang (audio, then bought paper). Massive, best-selling book telling the story of modern China through the stories of the author’s grandmother, her parents (officially her mother, but her father is more than a background figure), and the author herself. It’s very good at showing the complexities of the Chinese Civil War and resistance to Japan, demonstrating why someone would join the early Maoists, and capturing the devastation of the Great Leap Forward famine and the Cultural Revolution.

China’s Good War by Rana Mitter (audio, then bought paper). Fascinating book on how China has revised the scholarly and official understandings of World War II, reincorporating the Nationalist resistance to Japan. The book discusses both serious scholarship and propaganda purposes.

Greetings from Bury Park by Sarfraz Manzoor (audio, after watching the movie, Blinded by the Light, which was based on it). Mostly just fun, especially if you like Bruce Springsteen. Also a window into Pakistani immigrants in the UK.

A Lost Lady by Willa Cather (audio, after reading that Cather hated the movie made from it, which does sound like a travesty). I’m slowly becoming a Cather completist. I have a Kindle collection of all her works so I’ll probably read that version as well. If you haven’t read it, check out my post on her.

The Jewel House: Elizabethan London and the Scientific Revolution (Kindle). I’m still in the midst of this book on the specimen collectors and experimenters centered on Lime Street in London. They’re distinguished from later natural philosophers by both their methodology—mostly collecting and exchanging specimens—and their economic positions. These were people who worked at regular jobs rather than academics or aristocrats. Not riveting but solid research on a little-explored aspect of early modern science.

I’m also working through a stack of books on early retailing, credit, and consumption. Interlibrary Loan is one of the great perks of being at Chapman.

The Most Misnamed Generation

This post went out to subscribers to my Substack newsletter on April 2, 2023. Check out all the posts, including ones from the archives not reposted here, and subscribe here.

The death of Gordon Moore, the Intel co-founder and semiconductor pioneer famous for “Moore’s Law,” is the latest reminder that the so-called Silent Generation is insanely misnamed. The generation born between about 1926 and 1945 is the one that changed the world in politics, science, art, sports, and business. The so-called Silents led the civil rights and second-wave feminist movements, built Silicon Valley and the New Hollywood, developed Pop Art and rock music. Silents walked on the moon.

The 1951 Time article credited with popularizing the term Silent Generation declared that “Today’s generation, either through fear, passivity or conviction, is ready to conform.” But it also reported signs of discontent. “This generation suffers from lack of worlds to conquer. Its fathers, in a sense, did too well,” said one young man. A young woman lamented, “The individual is almost dead today, but the young people are unaware of it. They think of themselves as individuals, but really they are not.”

In a relatively small generation, a few creative dissidents could catalyze change. Individual Silents faced less competition for attention and resources than members of larger generations. They also enjoyed the possibilities opened up after World War II, whether American prosperity or the weakening of class strictures in Britain. Silents became the leaders that the Baby Boomers, arguably a more conformist bunch, followed.

Although there are more famous Silents, Stewart Brand may be their quintessential representative. He’s always slightly ahead of the curve. Back in the 1990s, he observed to me that his generation didn’t think business was bad, even when they were rebelling against corporate conformity. So they were naturally placed to lead an entrepreneurial revolution. (I’m remembering a very old conversation, so take the details with a grain of salt.) Stewart is the person who told me about Amazon.com when it was a brand-new company, leading to this article. His then-new book How Buildings Learn was one of the first titles I ordered online, and it had a significant influence on The Future and Its Enemies. My decidedly un-hippie father, born in 1934, was an early buyer of The Whole Earth Catalog. Its “access to tools” philosophy anticipated our technology-empowered DIY culture (see my 2007 take on the trend, which has only gotten stronger).

And here’s another Silent recently in the news.

About those aviator shades

My Foreign Policy article on Silent Generation President Joe Biden’s aviator sunglasses is now out and available for comparison to the ChatGPT version. I don’t love the headline, which conflates appearance and reality (or A&R, as we called it when discussing Renaissance literature in college). I encourage you to read the whole thing, using an incognito window if you hit the paywall. Here’s an excerpt:

In Vanity Fair in August 2020, Erin Vanderhoof skewered Biden as insufficiently radical, writing that the glasses “stand in as a symbol for why so many young people feel disillusioned by the candidate. Six decades ago, Biden picked an accessory and he has stuck with it ever since … . It seems to reflect his approach to ideas like bipartisanship and respect for norms.”

But that continuity—including the promise of respect for norms—appealed to much of the electorate, which wasn’t ready to write off the United States as an irredeemably awful country or make a virtue of demonizing their fellow citizens. Like Trump’s MAGA hats, Biden’s sunglasses hark back to the triumphs of the 20th century but without the sense of loss. Aviators suggest an America that is feisty, nonconformist, powerful, competent, and ultimately good. Like the classic lenses, that vision of the country goes in and out of fashion but never disappears.

I'm Still Better than a Robot

This post went out to subscribers to my Substack newsletter on March 15, 2023. Check out all the posts, including ones from the archives not reposted here, and subscribe here.

My most recently completed article, which won’t be published for a while, is a column for Foreign Policy magazine on Joe Biden’s aviator sunglasses.1 It’s part of a package about specific objects in the news. (One of the others, for instance, is on tanks.) So I decided to check out the new edition of ChatGPT by giving it the assignment: Write a column about Biden's aviator sunglasses in the style of Virginia Postrel.

The result isn’t a bad generic column. It could fill any number of content maws. But it isn’t in the style of Virginia Postrel. Without scooping FP’s publication, it’s safe to say that my actual column is both more substantive—more research was involved—and more interesting. Here’s what the chatbot turned out:

Joe Biden's aviator sunglasses have become a symbol of his presidential style, reflecting not only his personal fashion taste but also his political persona. The aviators are a classic choice for any leader, projecting an image of strength, confidence, and authority. But for Biden, they also convey a sense of authenticity and a connection to a bygone era of American politics.

The aviator sunglasses were first introduced in the 1930s as a functional tool for pilots to reduce glare and protect their eyes during flight. They quickly became popular among military pilots and then civilians, and have since become a timeless style statement. The classic shape features large, teardrop-shaped lenses with a thin metal frame that curves around the ears.

Biden's affinity for aviators dates back to his early political career, when he was known for his classic, preppy style. He has been spotted wearing aviators at various public events over the years, including during his time as Vice President under Barack Obama. But it wasn't until he took office as President in 2021 that the aviators became a staple of his daily attire.

The aviators are a fitting choice for Biden, who has often been praised for his down-to-earth demeanor and his ability to connect with people from all walks of life. They also serve as a nod to his long career in public service, spanning several decades of American history. The sunglasses reflect a sense of nostalgia for a simpler time, when America was seen as a global superpower and the world looked to its leaders for guidance and inspiration.

But the aviators are more than just a fashion statement for Biden. They are a visual reminder of the responsibilities that come with leadership, and the need to remain focused and vigilant at all times. The sunglasses shield his eyes from the harsh glare of the political spotlight, while also conveying a sense of calm and composure under pressure.

In many ways, Biden's aviator sunglasses embody the qualities that have made him such a popular and effective leader. They are timeless, classic, and unassuming, yet also powerful and authoritative. They speak to a sense of American pride and patriotism, while also acknowledging the challenges that lie ahead.

As Biden continues to navigate the complex political landscape of the 21st century, his aviator sunglasses will likely remain a fixture of his public persona. They are a simple yet powerful symbol of his leadership style, reflecting his commitment to duty, honor, and service to his country.

When my article is published, I will link to it.

Assorted Links

This post by Robert Graboyes is great fun and a powerful reminder of the ways in which everyday technologies have changed over the past few decades. It reminded me how, during my college years (1978-82), I used to have to stand in line at the bank whenever I needed cash or, when the bank was closed (i.e., on weekends), I’d have to pay the university store a 50-cent fee to cash a $25 check.

Could plastic be the future of road construction? Zach Caceres looks into the possibilities for Works in Progress (which you should all be reading).

When applied to complex systems that rely on local knowledge, “design thinking”—surprise!—is, at best, a big waste of time and money. What works for inventing the Swiffer turns out to be pretty dumb about education and health care. Writing in Technology Review, Rebecca Ackerman looks at where the designers went wrong.

Economist David Rose proposes an interesting reform to give voters primary results they’re more likely to find satisfactory: allow them to vote not only for candidates but explicitly against them. Then subtract the negative votes from the total. Along the way, he explains Harold Hotelling’s median voter theorem and why it doesn’t apply in primary fields with lots of candidates. I’d definitely vote for his proposed reform. Read the whole thing.
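Rose’s mechanism reduces to simple arithmetic: a candidate’s total is for-votes minus against-votes, and the highest net total wins. A minimal sketch in Python (the candidate names and ballot format here are my own illustration, not taken from his proposal):

```python
from collections import defaultdict

def negative_vote_tally(ballots):
    """Each ballot is a (candidate, vote) pair: +1 for a vote FOR
    the candidate, -1 for a vote AGAINST. A candidate's score is
    for-votes minus against-votes."""
    scores = defaultdict(int)
    for candidate, vote in ballots:
        scores[candidate] += vote
    return dict(scores)

# Hypothetical seven-ballot primary: B leads on positive votes
# alone (3 to 2), but two against-votes flip the outcome to A.
ballots = [
    ("A", +1), ("A", +1),
    ("B", +1), ("B", +1), ("B", +1), ("B", -1), ("B", -1),
]
scores = negative_vote_tally(ballots)
winner = max(scores, key=scores.get)
print(scores, winner)  # {'A': 2, 'B': 1} A
```

The toy example shows the point of the reform: a polarizing candidate who leads a crowded field on enthusiasm alone can lose once the voters who actively oppose him get to say so.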

In one of those “content maw” articles soon to be replaced by ChatGPT, CNBC reports that graphic design topped Upwork’s list of in-demand skills. “This skill is in ‘giant’ demand—and can pay up to $145 an hour as a side hustle” is the headline. This result isn’t what the “robots are taking our jobs” crowd might have thought back when I was writing The Substance of Style. Skills once reserved for trained designers were being embedded in everyday software. But it turns out that just raised the ante—much as TSOS readers might have expected.

Jeff Maurer’s I Might Be Wrong offers a smart take about why liberal kids seem to be more depressed than conservative ones. It's the most plausible explanation I've seen. Plus jokes.

Japan's Old Age Crisis and Ours to Come

This post went out to subscribers to my Substack newsletter on March 5, 2023. Check out all the posts, including ones from the archives not reposted here, and subscribe here.

The days of our years are threescore years and ten; and if by reason of strength they be fourscore years, yet is their strength labour and sorrow; for it is soon cut off, and we fly away. —Psalms 90:10, King James Version

Most of the coverage of Japan’s aging population focuses on the current low birth rate and its implications for the future. In January, Prime Minister Fumio Kishida told legislators that the country is “on the brink of not being able to maintain social functions” because of its falling birth rate. “In thinking of the sustainability and inclusiveness of our nation’s economy and society, we place child-rearing support as our most important policy,” he said.

But even if the government succeeds in goosing the birth rate, the effects will be felt decades from now. Japan has an immediate problem that dates back to policies adopted in 1948. People over 75 now make up 15 percent of the population, and they don’t have a lot of kids to take care of them. Japan’s postwar baby boom lasted only about two years. By contrast, the U.S. experienced high birth rates from 1946 to 1964.

In 1948, the Diet passed the Eugenic Protection Law. It made abortions legal and cheap, about $10. “Critics assert that it is easier for a woman to avoid an unwanted child in this way than to have her tonsils removed,” The New York Times reported in 1964. “One result of the practice has been the virtual elimination of illegitimate births.”

The bill also promoted contraception, establishing “eugenic protection consultation offices” throughout the country. They provided marriage counseling and gave couples “guidance in adequate methods of contraception.” Local governments trained midwives and nurses to encourage family planning. Employers, unions, and nonprofits pushed the idea of smaller families and helped spread information about how to achieve them.

The Times reported:

At the Tokyo Steel Company plant at Kawasaki, near Tokyo, a pilot project was set up among families of the 20,000 workers. Housewives were given the opportunity to receive family-planning guidance as an alternative to such other instruction courses as cookery or household economics. A leader met with groups of about 10 women and explained not only birth control techniques but ways in which family life would be improved if there were fewer children.
Initially the labor union charged that the program was a plot by management to reduce the outlay for family allowances—wage supplements paid to workers according to the number of their children. But the women accepted the courses eagerly and as word spread, enrollment in family-planning instruction rose to almost 100 per cent. The birth rate among company families dropped markedly, the rate of pregnancies and abortions declined and the number of those practicing contraception rose quickly by more than 25 per cent….
Since the eugenic program began, surveys have discovered that there has been not only an increasing acceptance of the idea of family limitation but a shift in the reasons given for it. In the immediate postwar years simple economic factors were dominant. Now, in a more prosperous and competitive society in which education is the key to advancement, 43 per cent of mothers say that their chief purpose is to provide better education for a smaller number of children. Twenty-nine per cent mentioned safeguarding of the mother’s health and 9 per cent the desire for a more pleasant life, while only 12 per cent mentioned financial strain.
The result of the eugenic program has been that Japan’s birth rate has been cut in half.

Unlike China’s notorious One Child policy, the program wasn’t coercive. But it made smaller families a norm in Japan a generation earlier than in the U.S. The results are a preview of what American baby boomers will face as they get old.

For the first time in human history, large numbers of people are living to advanced ages. A life span of only “three score and ten” is a tragedy for us, not an expectation. (When the mother of my oldest friend died last year at 91, I recalled overhearing a conversation she’d had with my mother after she turned 35, which seemed impossibly old at the time. “Half a lifetime,” she’d said.) Seven percent of the U.S. population is now over 85. In 1950, the number was so low it shows up as zero percent in United Nations figures. Fourteen percent of the population is over 80, compared with 1 percent in 1950.1

These numbers will only grow in the coming decades. The oldest baby boomers turn 80 in 2026, the youngest in 2044.

Thanks to the baby boom, today’s very old Americans tend to have multiple children to take care of them. Even for those living on their own or in institutions, eldercare is time-consuming, emotionally draining, and often extremely expensive. There are forms to fill out, bills to pay, errands to run, and health care to arrange—not to mention emotional needs. It’s easier if, like my own parents, the very old have a diversified portfolio of kids, preferably including at least one medical professional, to take care of the things they can’t handle.2

The alternative is what Japan is experiencing: a rapidly growing population of very old people without much family support. In some cases, the unshared burden of taking care of parents simply becomes too much, especially when the parent is a difficult character. In others, middle-aged children—including increasing numbers of men—are quitting their jobs to take care of their parents. At the extreme are “lonely deaths,” or kodokushi, when people die alone and go unnoticed for days. (In some cases, the deceased elderly person was not alone but living with a person with dementia.)

As a childless baby boomer, I’m afraid I don’t have a good solution.3

But we’ve been warned.

1 Using ChatGPT, I got a bunch of potentially useful data on the percentage of various national populations over 80 from 1920 to the present. But I can’t be sure the AI isn’t making stuff up and by the time I find the numbers on the cited sources I haven’t saved any time. ChatGPT has an enormous advantage at rummaging through databases but that isn’t any good if I can’t trust it. And it looks like it was wrong.

2 I live a continent away from my parents. My very capable brother who lives in the same town has taken on day-to-day responsibilities. Another brother, who lives across the state and visits several times a month, is an M.D. and talks with medical staff. I mostly handle finances, which can be done at a distance.

3 No, I don’t regret not having kids. You’re reading my work because I didn’t.

The Ethics of Higher Education

Highland Park United Methodist Church as seen from the adjacent SMU campus.

This post went out to subscribers to my Substack newsletter on February 16, 2023. Check out all the posts, including ones from the archives not reposted here, and subscribe here.

Earlier this week I had dinner with a small group of MIT professors from a variety of scientific disciplines. Among other topics, they shared their concerns about threats to the culture of free inquiry and the intellectual playfulness and audacity on which it depends. Whatever the form of threat—and they vary—these scientists worry that the institute is letting its concern for protecting its brand and pleasing government funders trump its dedication to scientific inquiry. In response, I recalled this talk I gave at a FIRE conference in, I believe, 2017. I’ve long thought I’d expand it into a “real article,” backed by more research, but never have. Until that day comes, I’m posting it here. (For more on FIRE, now the Foundation for Individual Rights and Expression, visit their website here.)

I am speaking this afternoon (Thursday, February 16) at Brown. Details here.

Two stories to start, one about academic ethics and intellectual safety, and the other about how strange an American university seems to a foreigner.

First story: When I was a senior in college, I took a graduate class in Elizabethan drama. When we got to the final paper, I had a big problem. The Christopher Marlowe plays that I found most interesting were already the subject of my senior thesis. I wasn’t inspired by the Shakespeare comedies that made up most of the other good stuff in the course and, while I liked Richard II, I had nothing interesting to say about it. The only play I found thought-provoking enough for a paper was The Merchant of Venice. That presented another problem: The professor had written a whole book about it. To make matters worse, I disagreed with his thesis, and even though it wasn’t exactly what I set out to write, once I’d read his book that disagreement inevitably became the subject of my paper.

I wasn’t trying to be obnoxious. I just didn’t have anything to say about the other plays.1

There are two problems with writing a paper disagreeing with your professor’s book. The first is that he has spent years, not weeks, thinking about the subject. He’s the expert and you are not. He will find every flaw in your argument and you won’t find every flaw in his. Plus he has a whole book to make his case and you have only a few pages. The second, of course, is that he could get mad and give you a bad grade just for disagreeing with him.

I worried a little about the first but not at all about the second.

Legally, the professor was free to give me whatever grade he believed appropriate. But I knew he would grade me fairly because I could count on his ethics as a teacher and scholar. I knew that his classroom was an intellectually safe place—not a comfortable place, not an undemanding place, but a place where I was free to disagree without punishment merely for dissenting. We would all put our feelings aside and make—and respond to—the best arguments we could.

“This puts me in a difficult position,” he wrote on my paper, before going on to comment on its substance. I had indeed put him in a difficult position, and he did still disagree, but he and his criticisms were reasonable and fair. He gave me an A-.

Second story: In 2005, someone at The Atlantic had the idea of sending the French intellectual Bernard-Henri Lévy around the United States in imitation of Tocqueville. One of his stops was Dallas, where I was living at the time, and I was his host for much of a day. At one point, we drove past Southern Methodist University, where my husband was teaching. BHL was puzzled by the idea that SMU would employ a Jewish atheist even for a secular subject like business strategy. “Why would the Methodists do that?” he asked.

Character building, circa 1907.

The American university is a strange institution. First of all, it isn’t just one thing. There are nearly 5,000 institutions of higher education in this very large country. When we talk about “the American university,” we’re really discussing an ideal type: a place that combines teaching, research, personal development, career preparation, and social life.

That ideal evolved through the combination of three quite different models.

The earliest American colleges were devoted to civic and religious character development. They emphasized liberal arts, training future ministers and giving the Wall Street- or law school-bound children of the wealthy a classical education. They also included that weird American institution, college sports, to inculcate self-discipline, leadership, and teamwork.

As the country grew, this model of higher education spread beyond the upper class. Many Christian denominations founded liberal-arts colleges, including schools for women. They believed that higher education improved individual character, made for better parents, and prepared Americans for citizenship.

The second wave of colleges consisted of practical institutions, exemplified by the land-grant colleges funded by federal land sales in the mid-19th century. They trained farmers, engineers, and teachers. They also offered extension classes for local citizens who weren’t enrolled as students. They did research on things like crop rotation and hybrid corn. Unlike the private, character-building schools, they were largely government-supported and promised benefits to the citizenry at large, not just their student bodies.

The third model was a German import: the research university, devoted first and foremost to pushing the frontiers of knowledge and only secondarily to training students. Undergraduate education in particular was an afterthought. Johns Hopkins and the University of Chicago were early examples. While the practical colleges were rooted in the needs of their locales and the character-development schools were sectarian or clubby, research universities were cosmopolitan. They belonged to a worldwide community of scholars. The concept of academic freedom emerged from the research university.

Professor G.F. Sprague of the Bureau of Plant Industry at Iowa State counting seed samples of Iowa hybrid corn, 1942. Library of Congress.

Most of today’s American colleges incorporate elements of all three of these models, which exist somewhat uneasily together. They suggest different purposes for the university and stress different ethical obligations on the part of students and faculty. All three are threatened, in different ways, by forces that manifest themselves in part through restrictions on freedom of speech.

One of these threats is the consumerist model of higher education. Another is a new form of character building that seeks to displace the liberal-arts ideal.

The consumerist model of higher education sounds perfectly reasonable at first, because it works so well for so many other things: I pay money and you make me happy.

This model treats higher education as a packaged experience, like a resort vacation, that includes an educational component and awards a certificate of completion. It encourages schools to invest more in recreational facilities, new dorms, and student-affairs administrators than in new faculty. It leads to much better food, with choices for every type of diet, and less taxing classes. It ignores the ethical obligations of students to study and learn and professors to set standards and reward excellence.2

And, to get to FIRE’s mission, the consumerist model inevitably encourages restrictions on free speech. After all, if the customer is always right, and students are the customers, then speakers and ideas that upset even a few students constitute bad customer service. Allowing Ann Coulter to speak on campus is like having the police drag a paying customer off a United flight.

The consumerist model dominates American higher education today. But it isn’t alone. One of the reasons our debates over free speech on campus are often so confusing is that it co-exists with a seemingly contradictory model: a new version of the old character-building ideal. The consumerist model has no political agenda. It’s an equal-opportunity censor that simply wants to keep student-customers happy. But by eroding the university’s mission to pursue and transmit knowledge—and the scholarly and teaching ethics that support that ideal—the consumerist model destroys its resistance to a more overtly political threat.

The new character-building ideal is specifically left-wing. It seeks to develop students’ sensitivity to issues of social justice and environmental crisis. Its adherents’ sense of right and wrong trumps their devotion to the advancement and transmission of knowledge.

Although some of these adherents believe in—or believe they believe in—the older liberal-arts ideal of critical inquiry, they tend to direct that inquiry outward, at American society, rather than inward, at their own assumptions. At best, they are like the good liberal Presbyterians who ran my parents’ colleges. They were happy to probe the ramifications of Christian teachings in the modern world but didn’t challenge the truth of Christ’s resurrection or divinity.

This moderate strategy can work in a relatively homogeneous environment, where basic assumptions are shared. But it can only tolerate so much dissent without cracking. And when it cracks, the school must choose between its allegiance to critical inquiry and its devotion to “higher truths”—between the pursuit of knowledge and the enforcement of doctrine. As their faculties and administrations grow more intellectually homogeneous, today’s campuses risk turning fundamentalist: allowing no more dissent on political questions—in the classroom or out of it—than Bob Jones or Liberty University permits questioning the inerrancy of scripture or the creation of the world in six literal days. When you limit the range of debate and forbid certain questions, you stifle the creation of knowledge and, over time, erode both the purpose of the university and the character of its constituents.

I wish I could end with a simple five-point plan for reversing these trends. One reason FIRE spends so much time on legal issues is that, in the short run at least, the courts are friendly to free speech. So is much of the press. But, ultimately, protecting freedom of inquiry and the free expression essential to it depends on hearts and minds.

We, too, are in the character-building business. We are asking people to commit themselves to a vision of the university as more than a place to party or get your ticket punched—to treat it as a precious institution for the advancement and transmission of knowledge. That demands a lot more than a willingness to provoke the easily offended. It means trying and sometimes failing, challenging your beliefs, facing attacks, not knowing the answers, or even the questions, in advance. The process is satisfying in the long run but not always pleasant at the moment. It requires ethical commitments and the self-discipline to stick to them, even when you’re put in a difficult position.

1 In the classes I’ve taught at Chapman University, we’ve always had to develop prompts for student papers—something I don’t remember having either in high school or college. Here’s the final assignment for “Ambition and the Meanings of Success”:

Formulate a thesis inspired by one of the following topics. You may focus on a single work or draw on multiple sources to develop a broader pattern.

Your thesis must be something that could be wrong—that someone could argue against as well as for—not a factual statement. Saying, for instance, that Jiro’s success comes from constantly trying to improve his sushi is not a thesis. It is something the movie tells us is true. A good thesis will often answer the question why, or it will establish a pattern out of disparate examples.

You may rely entirely on material we’ve covered in class or, after discussion with the professors, delve into other examples. If you would like to explore another topic, you may do so with permission. In all of the following, the questions are simply examples of avenues you might explore. There are many other possibilities.

  1. Ambition over time: Most of the ambitious people we’ve discussed are in the early stages of pursuing their ambitions. There are three exceptions: Jiro, Norma Desmond, and Tennyson’s Ulysses. What are the challenges of aging for an ambitious person? How does ambition change with time? As a young person, what might you learn about ambition from someone significantly older?
  2. Friends and partners: Few ambitious people succeed alone—or even try to. We’ve seen examples of productive partnerships and also of conflicts. In some cases, ambitious individuals collaborate as equals. In others, one person is clearly the lead and another, voluntarily or not, the supporter. We’ve looked at evidence that friendships with wealthier people can lead to upward economic mobility. Why might that be? When does collaboration succeed? When does it break down? What challenges does ambition pose to interpersonal relationships? How can those relationships further ambition? What might explain the connection between friendship and upward mobility?
  3. Transcendent ambitions: Ambition often includes goals that go beyond fame or money. What are the pitfalls of “big” or “noble” ambitions? What are the advantages? If two people pursue the same ambitions in the same way but for different reasons, one transcendent and one mundane, should we evaluate their actions differently?
  4. Finding your place: “Finding your place in the world” is usually a metaphor, but pursuing one’s ambitions often requires literally moving to a new location or environment. What kinds of places foster success? What does it mean for an ambitious person to find their place?

2 You might assume, as some conservatives do, that the “practical” model of higher education as job training is compatible with the consumerist impulse. But it is even more threatened by it. Keeping students happy erodes the demand to master material, leading to less course content and more generous grading. (Talk to anyone who has taught MBAs for more than a decade and you’ll get an earful.) When practical credentials are at stake, the consumerist model is especially corrosive.

Taking Shopping Seriously

“Venetian Fair” shop with two figures, Ludwigsburg Porcelain Manufactory, ca. 1765. Metropolitan Museum of Art.

This post went out to subscribers to my Substack newsletter on February 3, 2023. Check out all the posts, including ones from the archives not reposted here, and subscribe here.

This is a speech I gave at the Atlas Network’s annual conference in 2012. They asked me to give an after-dinner keynote and I agreed, on the condition that I could talk about whatever I wanted to. The Atlas Network is an international organization of free-market policy-oriented think tanks. So the topic was, as the talk acknowledges, a departure. I’ve resurrected it because the popularity of my WSJ article on shopping and equality has me thinking about possible books and, given funding, perhaps even a [Your Name Here] Center for the Study of Commerce and Culture to take up these and other questions.

Now that you’ve had a good conference learning about taxation, regulation, and monetary policy, I want to talk about something really important.


Specifically, two big questions:

Why do people buy things?

Not why do they buy things instead of producing them personally, but why do they buy the specific things they buy? And particularly...

Why do people buy things they “don’t need”?

Americans, at least, buy so many things that The Container Store does a good business selling us things to put them in.

Why do people buy things they “don’t need”?

Through most of human history, that wasn’t a question that came up very often, at least not when you were talking about the general public. It still puzzles people who think about it. It’s a hard question.

But scholars get paid to think about hard questions, and they’ve come up with some explanations.

It’s a fetish.

This is from Marx and Freud and their various offspring. Take your pick. The word “fetish” takes on slightly different meanings depending on the theorist. But it’s definitely derogatory. Primitive, superstitious, and perverted.

The basic point is that the consumer is tricked by an outward appearance and manipulative marketing into ignoring the real “use value” of the product that comes from the legitimate values of material and labor that go into producing it.

Anything subjective—anything about you rather than the way the thing was produced — is an illusion.

Consumers buy things they don’t need because they are hypnotized.

People buy things they don’t need because they’re dupes.

This is the theory that most cultural-studies scholars start with. It’s what they teach their students. It’s even what students enrolled in the Branding master’s program at the School of Visual Arts (where I teach a short seminar) are assigned. Not because the program is some kind of Marxian indoctrination project; to the contrary, it’s professional training for people who’ll work in ad agencies and marketing departments. Those are simply the books available to tell the story of consumer culture.

Library of Congress

You’ll be happy to know that Marx and Freud don’t have a monopoly on explanations. Social scientists, including some economists, offer an alternative to pure trickery and delusion.

It’s status competition.

“Conspicuous consumption” is what Thorstein Veblen called it. We buy things we don’t need because we want to keep up with the Joneses. We aren’t foolish. We’re envious and insecure.

This explanation has three virtues:

1) It offers an intangible source of value that doesn’t require that we attribute supernatural powers of manipulation to capitalist producers.

2) It fits neatly into various mathematical models.

3) And it’s sometimes even true.

Free-market supporters and most economists have two answers of their own to the question of why people buy things they “don’t need.”

We don’t know.

We don’t care.

There’s no accounting for tastes and no clear distinction between “need” and “want.” People decide for themselves what they like, and they make the best choices they can, given their incomes, among the goods and services available to them. It doesn’t matter where these choices come from. All that matters is that people are free to make them—and, of course, what goods are available and what their incomes are.

So let’s fight about entrepreneurship, free trade, and redistribution and forget all this silly stuff about shopping. If we admit the question of why people buy things—and particularly why they buy things they “don’t need”—might be important, it will only encourage laws to restrict what people can buy.

That may sound logical, but it’s a dangerous and foolish attitude.

Now, I agree with the concepts of consumer sovereignty and subjective value. And I’m all in favor of specialization. You don’t have to be interested in why people buy what they buy. You can be a political philosopher and argue about the legitimacy of the welfare state. You can be an economist and argue about tax incidence or the importance of international trade in development. You can be Ayn Rand and celebrate the heroic entrepreneur.

But we don’t apply the same don’t know/don’t care standard to the supply side of the market. We do care about what motivates entrepreneurs. We do care where innovation comes from. We do care what production and work mean in people’s lives. When we talk about production, we do think culture and psychology are interesting for their own sakes as well as how they bear on public policy.

The same should be true of consumption.

Do we really want to leave thinking about demand—the half of the market that accounts for most of our everyday economic experience, and certainly for most of the variety of our everyday economic experience—to the Marxists, the Freudians, and the status-obsessed? To people who have contempt for markets and for what Deirdre McCloskey in her important recent book calls “bourgeois dignity”?

Just because thinking about why people buy what they buy means thinking about culture and psychology rather than the role of the state doesn’t mean it’s irrelevant to freedom.

The good news is that there is a significant group of scholars who do understand that shopping has something to do with freedom.

They are feminists. Not all feminist scholars take this view, of course. Plenty subscribe to Marxian or Freudian or status explanations. But nearly all the scholars who write about consumer culture in a way that appreciates its relation to freedom are feminists.

What they teach us is that the growth of what is sometimes called the “consumer society” was good for women.

Shop Girls by Elizabeth Sparhawk-Jones c. 1912, Art Institute of Chicago

Poor girls could become shop clerks instead of servants. They could go shopping themselves and forge careers as buyers and even store detectives.

Middle-class ladies could get out of the house into a new and respectable public sphere. They could meet friends for conversation in department-store tea rooms.

Magazines supported by ads for cosmetics and fashion could argue in favor of women’s rights and give readers new images of female achievement.

Businesses that wanted to sell things to women had to pay attention to what they wanted. That meant goods and services, but it also meant the environments, institutions, and behaviors that surrounded those things.

The consumer society made women public and independent in new and powerful ways—not through politics, at least not at first, but through the marketplace.

Feminist scholars study these subjects, in the past and in the present. And, believe it or not, they don’t start with the assumption that people—that women—buy because they’re hypnotized. They respect consumers, because they respect women. And that’s a good start.

But the serious study of consumer culture should not be left to the feminists, as valuable as their work may be. For one thing, they aren’t too interested in the other half of the population. Men do buy things, and their experiences are important too. Plus, a lot of these feminist scholars tend in a socialist direction.

We need other voices, asking other questions.

Let me make it clear that I am not arguing for libertarian fables. The last thing the public discussion of culture and consumption needs is more ideological nonsense.

Human behavior, including market behavior, is complex, and it includes wrongdoing and mistakes. If you look honestly at why people buy what they buy, you’ll find some things you won’t like. And you’ll find some things that are just strange.

If we are correct, however, that freedom is good and that free markets are part of that good, taking shopping seriously will only enhance our understanding of why that is so.

There are three reasons people who support free markets should care about consumption in particular and the intersection of commerce and culture more generally—why it’s not enough to pay attention only to politics and policy.

The first is political.

By discrediting nearly every purchase decision, the fetish/status explanations undercut the legitimacy of nearly all market transactions and, thus, the legitimacy of markets as a whole.

On a practical policy note, if you can’t explain why people’s decisions about buying shoes make sense, you’ll never be able to explain why they should be allowed to decide about really complicated goods like health care or education.

You’ll never be able to explain why this or that “unnecessary” purchase shouldn’t be banned, regulated, or taxed—or why taxing away people’s unnecessary income is a bad idea.

Thinking about why people buy what they buy means thinking about the origins and nature of economic value, which is critical to thinking about and defending markets. It’s great to celebrate and defend entrepreneurship and production, but neither works without customers. And we are all customers.

That brings us to the second reason, which is practical.

If you’re a business person, you need to understand why people buy whatever it is you’re selling. What value are you creating? Is it simply utilitarian? Or is there more to it than that? And, if so, what?

The final reason, the one I really care about, is intellectual.

Understanding how the world actually works, what the past was really like, how human beings actually think and act is valuable in and of itself.

Knowledge is good.

That may have been a joke slogan in Animal House, but it’s the truth.

And knowledge has practical consequences: What we know determines how we understand ourselves and our society. And how we understand ourselves and our society determines what actions we take, individually and collectively.

So I hope that while you’re thinking about how to promote freedom and improve policy, you will devote at least a little attention to the questions of why people buy what they buy and how markets evolve to serve not just practical but intangible needs and wants. Document what you see in your country and share it with the world—and with the future.

Successful Bangladeshi peddlers Asma and Jasmeen (on bike)

I would like to end with a story from Linda Scott, a [now-emerita] marketing professor at Oxford’s Saïd Business School and one of those feminist scholars I mentioned earlier—but not a socialist. She wrote an excellent book called Fresh Lipstick about feminism and consumer culture in 19th- and 20th-century America, and now she works on projects in developing countries.1

One of the projects she has studied is the CARE Bangladesh Rural Sales Programme, which helps women, mostly widows, sell a wide range of products—food, seeds, toiletries, medicine—door-to-door to other women in rural Bangladesh. The program provides distribution both for products from multinational companies like Unilever and for locally produced goods.

These female peddlers bring the chance to shop to rural women who, following local custom, never leave their homes. The successful ones, like Asma and Jasmeen, understand what their customers want beyond mere survival. Asma, for instance, sells “everything they need to take care of their bodies.” And she sells beautiful saris, blouses, and petticoats.

One of the most remarkable things about the program is that it has changed the colors of saris women wear. Before, men bought for their wives and mothers, and most saris were blues and greens. Now women buy for themselves, and they choose reds and pinks. “Imagine,” writes Scott, “the articulation of control that comes from having to wear someone else’s choice of clothes—and the small but very important liberation that comes from choosing your own.”

Anyone who has ever been on one side of a parent-child struggle over clothing knows what she’s talking about. The chance to choose our own clothes is one of the first freedoms we crave as children. It is one of the freedoms we hold dearest and take for granted most often. The freedom to choose our own clothes isn’t made possible merely by the absence of a legally mandated dress code. An economic revolution first had to happen to allow us to buy clothes we “don’t need.” To understand the liberating power of markets, we have to understand culture as well as politics, and consumption as well as production. We have to take shopping seriously.

1 The information in the book is fantastic. The tone is angrier than would be most persuasive, reflecting the author’s infuriation at the narrow historical narrative and patronizing attitudes of some of her fellow feminists.

On the Effectiveness of Charitable Giving

Michelangelo’s David, photo by Flickr user Brian Dooley, public domain

This post went out to subscribers to my Substack newsletter on January 22. Check out all the posts, including ones from the archives not reposted here, and subscribe here.

The 400 florins that Florence paid Michelangelo to carve the David could have provided a whole year’s subsistence living for about 100 Florentine residents. Was the commission an immoral use of funds? Should the money instead have gone to the poor, some of whom surely starved without it? If justified for reasons of civic cohesion, would the sculpture have been unjustified as a private purchase like much other Renaissance art? And what about its value to the future? If Florentine authorities in 1501 had known Michelangelo’s masterpiece would become a major tourist attraction centuries later, bringing untold wealth and admiration to their city, should that future have entered their calculations?

Such were the thoughts that crossed my mind last month, when I read Shruti Rajagopalan’s excellent essay “Altruism and Development - It’s complicated…” In it, she examines how to assess the value of philanthropy if we want to “do the most good” in the world—the challenge posed by the movement known as Effective Altruism. Effective Altruism takes various forms, from the extreme utilitarianism of philosopher Peter Singer to practical cost-benefit assessments of charitable ventures. (Here’s a useful blog post delving into the philosophical distinctions.)

Shruti starts with a personal dilemma. Should she give to ameliorate the problem of Delhi’s terrible air pollution or support the anti-malaria causes dear to GiveWell and other EA advocates?1

Shruti, who works at the Mercatus Center at George Mason University, is from Delhi. She is under doctor’s orders not to return to visit her family because the pollution will aggravate her long Covid problems. Given her personal stake in the city’s air pollution, she worries that her charitable giving impulses are too emotional.

With air pollution dominating my thoughts and nudges for charitable giving in my inbox, my first instinct is to give to causes that help mitigate pollution in Delhi. But I am also aware of the literature on emotional giving or ineffective altruism. In their 2021 paper, Caviola, Schubert and Greene explain why both effective and ineffective causes may attract dollars. People often give emotionally to a cause that has personally impacted them in some way.
A US$100 donation can save a person in the developing world from trachoma, a disease that causes blindness. By contrast, it costs US$50,000 to train a guide dog to help a blind person in the developed world. This large difference in impact per dollar is not unusual. According to expert estimates, the most effective charities are often 100 times more effective than typical charities.
This paper resonated with me because I am exactly the sort of irrational dog lover likely to support the best training programs for guide dogs.

Working through the numbers, she finds that air pollution in Delhi is, in fact, just the sort of massive public health problem that EA types elevate as worthy of philanthropy. (It generates what my husband, in high school debate jargon, would call “bodies on the flow.”) But air pollution lacks an essential characteristic of EA-favored causes: an easy way of measuring interventions and their effectiveness. Shruti writes:

There are many reasons air pollution mitigation doesn’t make it to the top of these lists despite a ten times higher death toll. It cannot be avoided by distributing a $5 net. The costs and the benefits from air pollution in Delhi cannot be easily quantified. Nor can the benefits from the interventions to mitigate pollution be easily measured. Simply put, air pollution in Delhi is complex, while malaria death and malaria nets in Africa are legible. We can only evaluate impact of interventions and projects that are legible. And only studying complex phenomena narrowly can make them legible.

But that’s not the end of it. The more she digs, the tougher the problem becomes. I recommend reading (or at least skimming) the entire analysis. Eventually she starts to think about why malaria declined in India, what malaria and Delhi air pollution have in common, and why we don't see similar problems in places like DC or London. ("Outside of camping equipment stores, I don’t think I have seen any mosquito nets bought or sold in the U.S.") Ultimately the problem isn’t tightwad westerners, but dysfunctional institutions and insufficient economic growth.2

Her conclusion:

If you want to make the greatest impact in the long term, nothing can beat contributing to institutions working toward increasing economic growth and prosperity in poor regions like Africa and India. Increasing economic growth will help solve both malaria and air pollution. It will be your least attributable contribution, but the one with the highest impact.
Beggars at a Doorway, Master of the Béguins (possibly Abraham Willemsens, Flemish, active by 1627, died 1672), active 1650–60, Metropolitan Museum of Art

Effective Altruism suffers from the blind spots that are characteristic of highly intelligent, self-described rationalists: hubris and a fixation on counting things. It assumes that it’s easy to tell what will do good and that the only way of “doing good” is directly extending life expectancy. (You can count those “bodies on the flow.”) But, as Shruti points out, economic growth is the most effective avenue to saving lives.

And you don’t get economic growth from a philosophy that tells people they are morally culpable for countless deaths if they consume anything more than absolutely necessary. The bourgeois fellow in the painting above may be doing his Christian duty by giving to beggars, but it’s his business enterprise and spending on frivolous things like paintings that raised living standards in the Low Countries. Bernard Mandeville was on to something in The Fable of the Bees, when he scandalously suggested that the selfish pursuit of luxuries could make everyone better off.

Historians may argue about the exact connections between the consumer revolution of the 17th and 18th centuries, the industrial revolution, and the long-term great enrichment. But these three phenomena were definitely intertwined. As an organizing principle, self-sacrifice is a prescription not just for personal misery but for global impoverishment.

I’m all for generosity. I’m glad people give to cure river blindness or prevent malaria. I’ve been known to take GiveWell’s advice and to send money to GiveDirectly, whose philosophy of giving cash to poor individuals in poor countries makes a lot of sense to me. On the margin, Effective Altruism does more good than harm. It’s the philosophy of a small elite who might not be especially generous without it.

But the “ineffectiveness” of sponsoring guide dogs to help blind Americans or donating to keep research libraries stocked with obscure titles isn’t a bug. It’s a feature. The diverse enthusiasms of generous people make for a richer cultural environment.

Walmart heiress Alice Walton thought the Ozarks should have a first-class art museum. “My mama always said, ‘Give the thing you love the most.’ And other than family, I decided that had to be art,” she says. Some people love ballet, some animals, some free speech, some amateur astronomy. Love of all sorts motivates a wide range of giving.

Andrew Carnegie funded hundreds of public libraries because he himself had used a generous man’s library to educate himself as a poor working boy. Like many benefactors, he used his money to create a better environment for people like his younger self.

Whatever its motivations, diversified giving helps correct for the limitations of our knowledge. We don’t know today what will matter tomorrow, how, or to whom. The human enterprise depends on many different, often incommensurable values, not all of which can be plugged into a spreadsheet. Life is fundamental but also more than mere existence. The David was worth the money—yesterday, tomorrow, and forever.

Synthetic Meat: The Reaction

This post went out to subscribers to my Substack newsletter on December 26. Check out all the posts, including ones from the archives not reposted here, and subscribe here.

The reaction to my WSJ article on cultivated meat has been fascinating and disturbing. Some people in the business have lectured me not to use the terms synthetic, as in “synthetic biology,” or lab-grown, lest I scare off customers. (Technically, meat is only lab-grown in the research stage, since scaling up requires something more like a brewery.) They are, in other words, squeamish about acknowledging the artifice involved in their own products—exactly what interests me!

Then there’s the knee-jerk right-wing reaction, represented by the comments on the WSJ site. When the WSJ accepted my article but said they wanted me to write the shopping feature first, I considered sending the synbio essay to another paper. But rereading the piece, which I’d written with the WSJ in mind, I decided it was implicitly tilted right and would need revising to get into a left-of-center outlet. Since I didn’t have much time for revisions, I left the piece at the Journal.

The core of the article consists of these paragraphs:

A century ago, “a chicken in every pot” was an ambitious political slogan. It has long since become an everyday reality. Americans will consume nearly 100 pounds of chicken per capita this year, according to the National Chicken Council, up from around 67 pounds in 1992, when chicken first surpassed beef.

Behind chicken abundance is the efficient production that critics call factory farming. Bred for maximum meat in minimum time, confined to crowded sheds, and subjected to assembly line slaughter and disassembly, chickens destined for mass consumption endure short, unhappy lives. Cheap chicken also exacts a human toll. Although automation is improving conditions, chicken processing may be the country’s worst job: smelly, noisy, bloody, cold and injury-prone from slippery floors and repetitive motions. Plus the pay is low.

Most Americans aren’t about to give up chicken, but we’d rather not dwell on where it comes from. In the not-too-distant future, however, the trade-off between conscience—or ick factors—and appetite may no longer be relevant. Instead of slaughtering animals, we’ll get our meat from cells grown in brewery-like vats, with no blood and guts….

Synbio executives talk like animal lovers and environmental activists. But synbio is still a form of engineering, a science of the artificial. As such, its ethical appeal represents a significant cultural shift. Since the first Earth Day in 1970, businesses large and small have emerged from the conviction that “natural” foods, fibers, cosmetics, and other products are better for people and the planet. It’s an attitude that harks back to the 18th- and 19th-century Romantics: The natural is safe and pure, authentic and virtuous. The artificial is tainted and deceptive, a dangerous fake. Gory details aside, the “factory” in factory farming makes it sound inherently bad.

Synthetic biology upends those assumptions, raising environmental and ethical standards by making them easier and more enjoyable to achieve. It could help reverse what the writer Brink Lindsey has dubbed “the anti-Promethean backlash” that began in the late 1960s, defined as “the broad-based cultural turn away from those forms of technological progress that extend and amplify human mastery over the physical world.” Synthetic biologists are manipulating atoms, not merely bits.

Anti-Promethean attitudes are still culturally potent, of course, with their own intellectual ecosystem of publications and advocacy groups. “Cell-cultured meats are imitation foods synthesized from animal cells, not meat or poultry that consumers know,” pronounces Jaydee Hanson, the policy director for the Center for Food Safety. The activist group is lobbying the U.S. government to require that lab-grown meat carry off-putting labels like “synthetic protein product made from beef cells.” A neutral term like “cultivated meat” should satisfy most people, however; or the industry could push for the tendentious “cruelty-free” favored by cosmetics makers.

This is a story about market-driven progress! Abundance is good!! The anti-Promethean backlash is bad! “Cruelty-free” is tendentious and the Center for Food Safety is the bad guy. Those are all right-of-center tells.

Or they used to be. I was naively stuck in the 20th century.

Back then, when I hung out with ideologues more than I do today, people on the American right liked technological innovation and market competition. They celebrated ingenuity and entrepreneurship. They might predict that a given product would fail or choose not to buy it—that’s the system, after all—but they weren’t affronted by the mere existence of for-profit approaches to social or environmental issues. They weren’t insulted by the idea that technology might alter attitudes by changing costs.

Now, everything is personal and I, who write as a meat eater who likes human ingenuity and technological progress, am read as a woke propagandist.

Take the comment from one Alan Kelman. It’s my favorite because he is so, so deluded about both my household income and my eating habits: “I won’t consider eating this stuff until Ms. Postrel, Bill Gates, John Kerry, and their fellow Davis/WEF dirrrtbags give up their super prime filet mignon, primo lobsters, and free range poultry. Apres vous Alphonse.”

As my husband pointed out to me, it is true that I am more likely than this guy to be invited to Davos. But I’m also more likely to win an Oscar—in the sense that a probability of 0.0000001 percent is greater than a probability of 0.000000001 percent.

The best argument against the development of cell-grown meat is that technocrats believe that anything good must be mandatory, especially if the good thing claims to help the environment. So if someone invents cell-grown meat, government mandates will soon follow. We therefore shouldn’t encourage alternatives to the status quo lest we be forced to adopt them. It’s the same argument we hear from people who believe that saying cities should allow property owners more flexibility about what they build on their land is tantamount to banning single-family homes. This culture-war form of the precautionary principle is as bad as every other form. It’s a prescription for stasis.

The other thing that seems to worry the right-wing critics is the argument that ethical standards will change. Mike Wickerham comments:

So, I'm unethical for eating meat? Who gets to decide this? Is there some committee somewhere that determines what is ethical because it is what they think? Who are these supreme beings, these god-like entities greater than man, greater that 10,000 years of human civilization that suddenly determine that eating an animal is unethical.? I'm not sure how the rest of the readers feel about these self-righteous peddlers of their own determination of what is good and right: But for myself, I would be banned from commenting on WSJ again if I suggested what these people should go and do to themselves. Merry Xmas.

Contrary to this view, ethical standards evolve in a bottom-up way. They aren’t simply imposed, by me or anyone else. Their evolution is influenced not only by cultural ideas but by economics and technology. Authorities may try to impose ethical standards or to hold them still over time (see: the Taliban and women’s education), but standards do shift. To take a simple example, for 10,000 years of human civilization, child labor was completely normal; it still is in some places. It gradually vanished as people got richer and investing in children’s education made more economic sense than putting them to work. We now generally view child labor as unethical—not because that’s an eternal truth but because we can afford to. As noted in my article, even infanticide, which Christians always considered wrong, declined in Europe when preventing pregnancy became easier and raising children more affordable.

A Somewhat Self-Serving but Useful List of Last-Minute Gift Ideas

This post went out to subscribers to my Substack newsletter on December 21. Check out all the posts, including ones from the archives not reposted here, and subscribe here.

One of my books (surprise!)

The Fabric of Civilization: For the textile lover, of course, but for anyone interested in history, technology, or why our world is the way it is. More than one reader has told me it reminds them of James Burke’s Connections series, and it should appeal to fans of Matt Ridley (who gave it a nice endorsement blurb) and Steven Johnson.

The Power of Glamour: It’s a beautiful artifact, so you can buy it for non-readers who want something for the coffee table. The images serve as a visual soundtrack to a work of intellectual theory, including two chapters limning the history of glamour in its pre-modern and modern forms. This book decodes glamour as a form of communication and visual persuasion. Better for René Girard or Martin Gurri fans than for the typical fashionista.

The Substance of Style: An oldie but a goodie. What’s the value of making things look good? Status isn’t the only reason people care about how things look and feel. The trend that inspired the book is a background phenomenon now, but the analysis still holds—all the more so as we enter an era of AI images on demand. More accessible than The Power of Glamour, this is the book for anyone interested in design, branding, or why they buy cool-looking things they don’t need. Features a famous paragraph on toilet brush holders.

The Future and Its Enemies: The classic is enjoying a resurgence of interest, some of which I discussed in this early Substack post. It puts both the rise of illiberalism and the burgeoning “progress studies” movement in a useful framework—the conflict between ideals of bottom-up dynamism and top-down stasis—that crosses traditional political and cultural lines. (Just substitute “Donald Trump” where it says “Pat Buchanan,” keeping in mind that Buchanan was a more serious thinker.) For your loved ones who like arguing on the internet or listening to political podcasts.

An Audible gift subscription: Pick one, three, six, or 12 months at $15 a month. Members get one book a month plus access to a catalog of freebies.

Harry Potter books on Audible, starting with The Sorcerer’s Stone. I just finished listening to the series, which I’d never read. (I have an Audible subscription, but I got them from the public library, which did require a degree of patience.) The narrator Jim Dale, who does all the voices, is amazing and the books deserve their success.

A coffee gift subscription: See this CNET guide for advice.

More ideas from my friend and Works in Progress editor Sam Bowman here, including his “desert island” cookbooks and a microwave rice steamer I bought for myself. A microplane is a must!

You can also find Megan McArdle’s famous (but not updated) Kitchen Gift Guide on her Substack Cookery Monster.

I get a commission on Amazon links.
