It’s August, and along with the bronzing rays of sun come the inevitable schmoozing estival get-togethers. It’s a ritual that has come back in force after two years of small-group pandemic huddles in yards and on beaches, and this conviviality comes coupled with that most modern of games, what might be dubbed Cultural Minesweeper.
“Have you read Tomorrow, and Tomorrow, and Tomorrow?” “No, but have you read The Overstory?” “Oh, I’ve heard that’s good. Did you watch the Game of Thrones prequel?” “I gave up on HBO Max last year, but have you seen The Crown, what did you think?” “It won a lot of awards, it’s on my list. What did you think of Jordan Peele’s new movie Nope?” “I loved Get Out, I should see it.”
In a shattered cultural milieu, the ricochets can bounce back and forth for a strained stretch: my record is about 20 exchanges before I surrendered to the sweeping currents of ephemeral but quality content. Perhaps one of this century’s greatest delights is happening upon a party guest who has read, listened to, watched, or thought about anything in common with oneself.
One cause is that we have transitioned more of our consumption patterns to social media, hampering conversational citations. Even in 2022 (and let’s deduct the Trump Twitter years), no one references tweets, Facebook posts, or TikTok videos as part of a dinner party conversation. Occasionally, you will get asked about some “insanely funny” video, but its ephemerality means that it almost never lands well, nor does it offer status to the interrogator.
This pattern continues, with the United Kingdom’s media regulator noting:
Ofcom’s News consumption in the UK 2021/22 report shows that, for the first time, Instagram is the most popular news source among teenagers, used by nearly three in ten in 2022 (29%). TikTok and YouTube follow closely behind, used by 28% of youngsters to follow news. BBC One and BBC Two – historically the most popular news sources among teens – have been knocked off top spot down to fifth place. Around a quarter of teens (24%) use these channels for news in 2022, compared to nearly half (45%) just five years ago.
The more obvious cause is the deluge of content (see “Easternization of media” a few months ago for more), and to put it into precise terms, the inability of any cultural artifact to generate an audience. House of the Dragon, the new Game of Thrones prequel (a description I have to include for the precise reason that audiences are shattered!) garnered nearly 10 million domestic viewers for its debut last weekend. That’s the best series launch in HBO history, but a reminder of just how small audiences have become: even a lavishly funded production, building on the legacy of one of the most popular shows in history, barely receives glances from a percentage point or two of the viewing public.
Now, as a life-long pseud, I learned years ago how to float in an implied familiarity with an author or their work, without telling an outright lie. I have a shell like a Galapagos Tortoise when challenged on this sort of thing. But I have noticed that such libertine manners are fast becoming normalised.
His sardonic advice is to lean into that libertine manner and enjoy the permissiveness of modern society:
Don’t feel ashamed. The sad truth is that we are becoming an illiterate society. As a pastime, books are as dead as theatre was fifty years ago. What little reading we do now is rarely in longer form than you are currently struggling through — I wonder if you have already scrolled down to see how much more of this you need endure? Even when we do buy an actual book, the data is in on our chance of finishing it, and it’s not pretty.
I don’t agree that we are becoming an illiterate society, but we are dissolving a collective literacy into an atomistic model of independent thinking. The irony is that some of the most interesting people who do think and ponder many of the sublime works that our culture produces are stuck not talking with others. Perhaps religious sects have it figured out: an intentional canon read by all that forms the basis for all further discourse. Criticism is layered on a bedrock of culture, so there’s at least an agreed set of material being brought into contention.
A few months ago in “Consensus functions,” I wrote about the convergence and divergence of society when it comes to facts, theories and politics:
Society, meanwhile, doesn’t have all those layers of consensus to build upon for new decisions. There’s no algorithmic blockchain ensuring that the basic facts of reality are cross-validated, or the scientific method to ensure that evidence is considered with appropriate context. Consensus is recursive, and without better consensus functions around values and tradeoffs, it’s impossible for a nation to make decisions.
Culture is a form of consensus, collections of intellectual intangibles designed to connect each of us together. Our divergence of taste ultimately means a divergence of culture, and the increasing need to translate across impermeable barriers of media. There’s no translator for my taste I can bring to a dinner party, and alas, that means the game of Cultural Minesweeper begins again. Thank god everyone in Brooklyn over 30 watched The Wire.
Full copy of Lux’s Q2 LP Letter
Josh Wolfe has posted a full copy of our Q2 LP Letter (that’s a Google Drive link to the PDF), which sets out how Lux sees the current macroeconomic environment, our areas of investment interest, and our overarching belief that 2022 and 2023 are going to be extraordinarily enriching for founders and investors who are aggressively honest about the markets and can adjust their now-dead plans.
I’ll highlight a longer explanation in the letter of a new theme we have been incubating over the past few quarters, which we’ve dubbed “extensionalism” (starting on page 9 of the full letter). A brief intro:
Seventy-five years ago, a philosophy of ‘existentialism’ was born that emphasized the existence of the individual acting with free will. But increasingly, we depend not just on ourselves, but on extensions through the technology we make––technology, both observable and invisible. We call this theme “Extensionalism”: the extension and embodiment of our senses and cognition aided and offloaded onto technological systems with their own requisite nervous systems (sensors, solid-state memory and logic processors, along with actuators and end-effectors that act on computation and programs) and metabolic systems (energy processing and giving off waste-heat).
For those daunted by the whole letter, Josh has a lengthy tweet thread filled with highlights.
The sun is rising (again) on elemental energy
What a difference a year makes. In “Fission today, fusion tomorrow,” I talked about the resurgence of political interest in nuclear power and what’s blocking the transition to a more “elemental energy” approach. Since then, I briefly noted in “Intel’s Malaise” one of the most important about-faces on the issue from Germany’s Greens, which now endorse reopening soon-to-be-shuttered nuclear power plants in the heart of Europe.
This week, an even larger about-face — let’s call it a volte-face — was announced.
Japan, which turned its back on nuclear power in the wake of the devastating 2011 Fukushima Daiichi plant accident, is looking to revive its nuclear energy program. Secure in (political) power for the time being following the upper House of Councillors elections in July, Prime Minister Fumio Kishida announced that the LDP-led government would begin reopening shuttered plants and actually begin construction on new ones.
Japan, like many countries this summer, has been buffeted by huge swings in oil and gas prices, placing acute attention on the country’s high demand for energy imports. Elemental energy isn’t an immediate salve, but it is a critical component of a long-term, sustainable, and clean energy strategy.
Flowing from one person’s idea into Adam Neumann’s head
Last week, I talked about WeWork founder and former CEO Adam Neumann’s new company Flow in “Truth and reputations,” asking:
How much should a VC wager on an entrepreneur whose company once hit a valuation of $47 billion? And that even today is still worth $3.69 billion on the public markets? My guess is that the popular reputation and the underlying truth are actually quite divergent, and the typical investor undervalues the fallen entrepreneur. There’s potential alpha sitting right there on the table.
That last line triggered a minor tidal wave of email responses from readers, who complained that there was no world in which Neumann should be backed again. One reader cited Warren Buffett’s aphorism “You can’t make a good deal with a bad person.”
Fair enough. Well, it didn’t take long for Neumann’s truth and perception to get rocked again. The Forbes trio of Iain Martin, Alex Konrad and Cyrus Farivar reported this week that Neumann might have filched the idea for Flow from Alfred, a startup he had backed as an angel investor and where his family office had two representatives on the board. The evidence is fairly damning, and raises the question anew: how much is that alpha sitting right there on the table really worth?
Lux Recommends
Our scientist-in-residence Sam Arbesman recommends James Vincent’s interview with David Holz, the founder of AI image generator Midjourney, on the future of art in “‘An Engine for the Imagination’: The Rise of AI Image Generators.” Holz on whether machines collectively think: “Well, there isn’t really a machine collective. Every time you ask the AI to make a picture, it doesn’t really remember or know anything else it’s ever made. It has no will, it has no goals, it has no intention, no storytelling ability. All the ego and will and stories — that’s us. It’s just like an engine. An engine has nowhere to go, but people have places to go. It’s kind of like a hive mind of people, super-powered with technology.”
Peter Hébert previously recommended a Bloomberg piece on all the fossils and other interesting artifacts in Europe becoming visible during the continent’s drought. Now the WSJ has a similar piece on America’s reappearing past, which includes 113-million-year-old dinosaur footprints.
Sam also recommended Cian Maher’s retrospective on Hnefatafl (don’t ask me to pronounce it — I just write a newsletter), the Viking board game that was eventually supplanted by chess. “‘[Chess] developed a reputation as a game where men and women could flirt, maybe because it did take quite a lot of time to play,’ says [Martha Bayless]. ‘By contrast, tafl was maybe too quick for flirtation. Flirtation was a lot of [chess’] appeal.’” Chess as romance — try that at a dinner party and let me know how it goes.
That’s it, folks. Have questions, comments, or ideas? This newsletter is sent from my email, so you can just click reply.
Forcing China’s AI researchers to strive for chip efficiency will ultimately shave America’s lead
Right now, pathbreaking AI foundation models follow an inverse Moore’s law (sometimes quipped “Eroom’s Law”). Each new generation is becoming more and more expensive to train as researchers exponentially increase the number of parameters used and overall model complexity. Sam Altman of OpenAI said that the cost of training GPT-4 was over $100 million, and some AI computation specialists believe the first $1 billion model is either being developed now or will be shortly.
As semiconductor chips rise in complexity, costs come down because transistors are packed more densely on silicon, cutting the cost per transistor during fabrication as well as lowering operational costs for energy and heat dissipation. That miracle of performance is the inverse with AI today. To increase the complexity (and therefore hopefully quality) of an AI model, researchers have attempted to pack in more and more parameters, each one of which demands more computation both for training and for usage. A 1 million parameter model can be trained for a few bucks and run on a $15 Raspberry Pi Zero 2 W, but Google’s PaLM with 540 billion parameters requires full-scale data centers to operate and is estimated to have cost millions of dollars to train.
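To make the cost asymmetry concrete, a common back-of-the-envelope heuristic (my illustration, not a figure from this newsletter) estimates dense-transformer training compute at roughly 6 FLOPs per parameter per training token; the dollar figures are only as good as the assumed hardware price-performance, which is a stated assumption here:

```python
# Rough training-cost estimator using the widely cited ~6 * params * tokens
# FLOPs heuristic for dense transformers. The flops_per_dollar figure is an
# illustrative assumption, not a reported benchmark.

def training_flops(params: float, tokens: float) -> float:
    """Approximate total training compute (FLOPs) for a dense transformer."""
    return 6.0 * params * tokens

def training_cost_usd(params: float, tokens: float,
                      flops_per_dollar: float = 3e17) -> float:
    """Convert a FLOP budget into dollars given assumed price-performance."""
    return training_flops(params, tokens) / flops_per_dollar

# A 1M-parameter toy model on 20M tokens vs. a PaLM-scale model
# (540B parameters, ~780B training tokens, per the published paper).
small = training_cost_usd(1e6, 20e6)
large = training_cost_usd(540e9, 780e9)
print(f"toy model: ~${small:,.4f} of compute; PaLM-scale: ~${large:,.0f}")
```

Even with generous assumptions, the sketch reproduces the article’s point: compute cost for the toy model is effectively zero, while the PaLM-scale run lands in the millions of dollars, before any data center, engineering, or experimentation overhead.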
Admittedly, simply having more parameters isn’t a magic recipe for better AI end performance. One recalls Steve Jobs’s marketing of the so-called “Megahertz Myth” to attempt to persuade the public that headline megahertz numbers weren't the right way to judge the performance of a personal computer. Performance in most fields is a complicated problem to judge, and just adding more inputs doesn't necessarily translate into a better output.
And indeed, there is an efficiency curve underway in AI outside of the leading-edge foundation models from OpenAI and Google. Researchers over the past two years have discovered better training techniques (as well as recipes to bundle these techniques together), developed best practices for spending on reinforcement learning from human feedback (RLHF), and curated better training data to improve model quality even while shaving parameter counts. Far from surpassing $1 billion, training new models that are equally performant might well cost only tens or hundreds of thousands of dollars.
This AI performance envelope between dollars invested and quality of model trained is a huge area of debate for the trajectory of the field (and was the most important theme to emanate from our AI Summit). And it’s absolutely vital to understand, since where the efficiency story ends up will determine the sustained market structure of the AI industry.
If foundation models cost billions of dollars to train, all the value and leverage of AI will accrue and centralize to the big tech companies like Microsoft (through OpenAI), Google and others who have the means and teams to lavish. But if the performance envelope reaches a significantly better dollar-to-quality ratio in the future, that means the whole field opens up to startups and novel experiments, while the leverage of the big tech companies would be much reduced.
The U.S. right now is parallelizing both approaches toward AI. Big tech is hurling billions of dollars on the field, while startups are exploring and developing more efficient models given their relatively meagre resources and limited access to Nvidia’s flagship chip, the H100. Talent — on balance — is heading as it typically does to big tech. Why work on efficiency when a big tech behemoth has money to burn on theoretical ideas emanating from university AI labs?
Without access to the highest-performance chips, China is limited in the work it can do on the cutting-edge frontiers of AI development. Without more chips (and in the future, the next generations of GPUs), it won’t have the competitive compute power to push the AI field to its limits like American companies. That leaves China with the only other path available, which is to follow the parallel course for improving AI through efficiency.
For those looking to prevent the decline of American economic power, this is an alarming development. Model efficiency is what will ultimately allow foundation models to be preloaded onto our devices and open up the consumer market to cheap and rapid AI interactions. Whoever builds an advantage in model efficiency will open up a range of applications that remain impractical or too expensive for the most complex AI models.
Given U.S. export controls, China is now (by assumption, and yes, it’s a big assumption) putting its entire weight behind building the AI models it can, which are focused on efficiency. Which means that its resources are arrayed for building the platforms to capture end-user applications — the exact opposite goal of American policymakers. It’s a classic result: restricting access to technology forces engineers to be more creative in building their products, the exact intensified creativity that typically leads to the next great startup or scientific breakthrough.
If America were serious about slowing the growth of China’s still-nascent semiconductor market, it really should have taken a page from the Chinese industrial policy handbook and simply dumped chips on the market, just as China has done for years in industries from solar panel manufacturing to electronics. Cheaper chips, faster chips, chips so competitive that no domestic manufacturer — even under Beijing’s direction — could have effectively competed. Instead, we are attempting to decouple from the second-largest chip market in the world, turning a competitive field where America is the clear leader into a bountiful green field of opportunity for domestic national champions to usurp market share and profits.
There were, of course, goals beyond economic growth for restricting China’s access to chips. America is deeply concerned about AI integration into the Chinese military, and it wants to slow the evolution of autonomous weaponry and intelligence gathering. Export controls do that, but they are likely to come at an exorbitant long-term cost: the loss of leadership in the most important technological development of the decade so far. It’s not a trade-off I would have built trade policy on.
The life and death of air conditioning
Across six years of working at TechCrunch, no article triggered an avalanche of readership or inbox vitriol quite like “Air conditioning is one of the greatest inventions of the 20th Century. It’s also killing the 21st.” It was an interview with Eric Dean Wilson, the author of After Cooling, about the complex feedback loops between global climate disruption and the increasing need for air conditioning to sustain life on Earth. The article was read by millions and millions of people, and hundreds of people wrote in with hot air about the importance of their cold air.
Demand for air conditioners is surging in markets where both incomes and temperatures are rising, populous places like India, China, Indonesia and the Philippines. By one estimate, the world will add 1 billion ACs before the end of the decade, and the market is projected to keep expanding through 2040. That’s good for measures of public health and economic productivity; it’s unquestionably bad for the climate, and a global agreement to phase out the most harmful coolants could keep the appliances out of reach of many of the people who need them most.
This is a classic feedback loop, where the increasing temperatures of the planet, particularly in South Asia, lead to increased demand for climate resilience tools like air conditioning and climate-adapted housing, leading to further climate change ad infinitum.
Josh Wolfe gave a talk at Stanford this week as part of the school’s long-running Entrepreneurial Thought Leaders series, talking all things Lux, defense tech and scientific innovation.
Lux Recommends
As Henry Kissinger turns 100, Grace Isford recommends “Henry Kissinger explains how to avoid world war three.” “In his view, the fate of humanity depends on whether America and China can get along. He believes the rapid progress of AI, in particular, leaves them only five-to-ten years to find a way.”
Our scientist-in-residence Sam Arbesman recommends Blindsight by Peter Watts, a first contact, hard science fiction novel that made quite a splash when it was published back in 2006.
Another recommendation: an investigation into Mohammed bin Rashid Al Maktoum, and just how far he has been willing to go to keep his daughter tranquilized and imprisoned. “When the yacht was located, off the Goa coast, Sheikh Mohammed spoke with the Indian Prime Minister, Narendra Modi, and agreed to extradite a Dubai-based arms dealer in exchange for his daughter’s capture. The Indian government deployed boats, helicopters, and a team of armed commandos to storm Nostromo and carry Latifa away.”
Sam recommends Ada Palmer’s article for Microsoft’s AI Anthology, “We are an information revolution species.” “If we pour a precious new elixir into a leaky cup and it leaks, we need to fix the cup, not fear the elixir.”
I love complex international security stories, and few areas are as complex or wild as the international trade in exotic animals. Tad Friend, who generally covers Silicon Valley for The New Yorker, has a great story about an NGO focused on infiltrating and exposing the networks that allow the trade to continue in “Earth League International Hunts the Hunters.” "At times, rhino horn has been worth more than gold—so South African rhinos are often killed with Czech-made rifles sold by Portuguese arms dealers to poachers from Mozambique, who send the horns by courier to Qatar or Vietnam, or have them bundled with elephant ivory in Maputo or Mombasa or Lagos or Luanda and delivered to China via Malaysia or Hong Kong.”