This is a newsletter about “Securities” — national security, economic security, health security and how that myriad of securities aggregates into human wellbeing. It’s been a slog covering this beat in 2022, with an economic recession driving up hunger in the developing world, Russia’s land war against Ukraine, and the ongoing and growing toll of Covid-19 and now monkeypox. Positivity is elusive when political leaders are resigning (UK), getting shot (Japan), sort-of-kind-of resigning (Italy) and having their official homes ransacked by protesters (Sri Lanka).
But damn, sometimes we can just leave the bleak and sinister reality of this blue jewel for the vivid celestial cosmos that abounds.
By now, I hope most “Securities” readers have taken the opportunity to peer deep into the furthest reaches of space and time offered by the first photos taken by the James Webb Space Telescope. I covered the telescope’s intrepid movements a few times in the newsletter, and our anticipation has now been rewarded with some of the most stellar (hardy-har-har) photography of galaxies, nebulae, and the vast blackness of space that our species has ever produced.
If the response on Twitter, on news channels, among commentators and among friends is any indication, this feat has placed much of humanity in a pure state of awe. Why is that, and is that magic reproducible in other contexts?
First, the photos touch on something we don’t get to experience all that often anymore — the sublime, which the New Oxford American Dictionary defines as “of such excellence, grandeur, or beauty as to inspire great admiration or awe” and Wikipedia condenses to the “quality of greatness.”
Witnessing these photos is a transcendent experience, but one deeply embedded in the fabric of humanity. In an era mostly bereft of spirituality, the Webb’s photos recontextualize each of us into the grand narrative of our species, our planet, and the universe. They’re a reminder of how small we are, how tribal, how riven with hatreds and divisions we can be, when really, do those twinkling stars give a single iota of care at all?
As modern artists increasingly trivialize their crafts to satisfy the whims of tasteless consumers, it’s downright refreshing to undergo an aesthetic experience that’s both authentic and awe-inspiring. This week was a chance to glimpse humanity’s grace and ingenuity at their acme.
Second, the Webb telescope is nothing short of an incredible feat of engineering and a definitive achievement of the human intellect. Reams of articles and documents have been written about all facets of the telescope, and it’s unfair to try to summarize that copious work in just a few lines. But let’s just say that the constraints the Webb team faced, from building the telescope and launching it into space to positioning it at precisely the right Lagrange point in our solar system and now keeping it maintained, are legion and exacting.
We need these achievements, if only to remind ourselves that there is unlimited potential and ingenuity stocked up among H. sapiens. As I wrote last week in “Dissonant loops”:
What we have seen so far in this destructive phase is a completely limp and listless response to challenges that are hardly the toughest we’ve ever confronted. We’re talking supply chain disruptions, a phenomenon that’s modelable, controllable, fixable. If America can build the arsenal of democracy and fight fascist dictatorships in two theaters during World War II, we can deliver infant formula and put a consistent number of planes in the sky. We can get ships to port, unload said ships, and propel those vessels back to where they came from. We know how to build factories, and we know how to build them fast. That’s as true in America as it is in England as it is in France.
None of our problems on Earth require the ingenuity demonstrated by the Webb engineering team. Food, water, housing, health, and employment crises are all human-made, with off-the-shelf solutions available whenever we want to use them.
Third, there is a universal joy that the sublime and this achievement offer us. Distant galaxies were splashed across front pages all around the world, with newsstands becoming their own universes of potential. No one is naive enough to think that the Webb’s first success is being ignored by national leaders — the space race is very much “on” between America and China, as well as Russia, Japan, South Korea, India, Brazil and others. But the possession of these images feels open, liberal, something shared collectively as a whole rather than owned individually by any one faction or government.
There is something anti-tribal about this form of scientific progress, a reminder that our daily zero-sum international politics and economic battles can actually be positive sum in the asymptote. Progress doesn’t need to be owned and wielded against others; sometimes, it can really just offer everyone a better life, or at minimum, aspirations for a better future. It’s a nostalgic throwback to the progress witnessed since the 1850s, as living standards were transformed for hundreds of millions and now increasingly billions of people.
Fourth and finally, the Webb telescope is a triumph of globalization and cooperation. By NASA’s estimate in 2020, “Thousands of scientists, engineers, and technicians from 14 countries, 29 U.S. states, and Washington, D.C. contributed to build, test, and integrate Webb. In total, 258 distinct companies, agencies, and universities participated – 142 from the United States, 104 from 12 European nations, and 12 from Canada.”
Those statistics understate the globalization at work, though. The openness of U.S. borders and America’s magnetism for top scientific and technical talent is the linchpin that supports this work. The universal curiosity sparked by the Webb’s photos is matched only by the universal talent available to us when we keep the drawbridges lowered and allow every person the full support to pursue excellence in their work.
Webb is the world — and America — at its finest.
We can’t forget the immense challenges that face us right here on Earth, but we all need the reminder that the grand fusion of billions of human minds can offer us so much more. Those distant galaxies are worth the effort to peer into, since in focusing on those distant stars, we reflect a light back on ourselves.
The future of defense tech with Anduril
Like an asteroid hitting Earth, it’s now time to come down from space to reality. The hatreds on our planet aren’t going to be transcended quite so easily, and that means if America and the democratic world want to protect themselves from authoritarian adversaries, they’re going to need the best tech to do so.
On the “Securities” podcast this week, we had a full house with the leadership of Anduril Industries (a Lux portfolio company) joining us to talk all things defense tech. Founder Palmer Luckey, co-founder and CEO Brian Schimpf and co-founder and Executive Chairman Trae Stephens joined Josh Wolfe and myself as we discussed the genesis of Anduril, how Luckey’s hardware experience at Oculus influenced the company’s approach to building new products, the urgency for new technology at the Pentagon, and what messages should be taken from the recent success of Top Gun: Maverick starring Tom Cruise.
🔊 Take a Listen to “The United States has never won a conflict with the hardware that it had going into it.”
Last week, we also published a short episode with our scientist-in-residence Sam Arbesman and our producer Chris Gates talking about the meaning of art in the age of DALL-E. Is AI-generated art, art? Do humans have to be involved? How do we acknowledge that DALL-E’s neural network model is built from the grist of human art and photography? It’s a fun and philosophical episode.
🔊 Take a Listen to “How will AI art generators affect human creativity?”
BMW, USB-C and the opposite of building
At the opposite pole from the sublime experience of the James Webb Space Telescope lie two stories of such sheer lunacy that one wonders whether the future should just be canceled.
First, James Vincent has the ridiculous tech headline of the year in The Verge with “BMW starts selling heated seat subscriptions for $18 a month,” following earlier stories that have percolated from Rob Stumpf at The Drive and Lawrence Hodge at Jalopnik. The company is pursuing recurring revenue opportunities around vehicle ownership, apparently settling for microtransactions on premium features like heated seats. Per Vincent:
A monthly subscription to heat your BMW’s front seats costs roughly $18, with options to subscribe for a year ($180), three years ($300), or pay for “unlimited” access for $415.
They say that the road to hell is paved with good intentions, but sometimes it’s just paved through corporate stupidity and greed.
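For the curious, a quick back-of-envelope on those quoted tiers (a minimal sketch in Python; the prices come from the Verge piece above, everything else is simple arithmetic):

```python
# Break-even arithmetic on BMW's quoted heated-seat pricing tiers (USD, per The Verge).
MONTHLY, YEARLY, THREE_YEAR, UNLIMITED = 18, 180, 300, 415

print(f"Unlimited vs. monthly: pays off after ~{UNLIMITED / MONTHLY:.0f} months")        # ~23 months
print(f"Unlimited vs. yearly: pays off after ~{UNLIMITED / YEARLY:.1f} years")           # ~2.3 years
print(f"Three-year plan vs. monthly: saves ${36 * MONTHLY - THREE_YEAR} over 36 months") # $348
```

In other words, the “unlimited” tier only pays for itself if you expect to keep both the car and the subscription habit for a couple of years.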
The second story is the push by lawmakers, first in Brussels and now in Congress, to mandate USB-C as a universal charging standard. There are reasonable concerns about device waste, lock-in, and intellectual property that together have driven momentum for standardized ports. Nonetheless, it’s gobsmacking to think that a phone feature which has seen significant innovation over the past three decades should be frozen in amber for time immemorial. Charging speeds, data transfer bandwidth, reliability, and design have all improved with each iteration of these ports. I want whatever comes after USB-C, and I look forward to getting it, if Congress doesn’t stand in the way.
Lux Recommends
Peter Hébert recommends this great animated chart of the top 10 websites on the U.S. internet over the past three decades.
As the world suffers another record hot summer, our summer associate Ben Jachim-Gallagher recommends The New York Times’ “Postcards from A World on Fire” from last year showing the scale and diversity of climate devastation.
Sam Arbesman recommends CLIPasso, a Best Paper awardee at SIGGRAPH 2022, which uses machine learning to abstract complex photography into simpler sketches. As the authors note, “Abstraction entails identifying the essential visual properties of an object or scene, which requires semantic understanding and prior knowledge of high-level concepts. Abstract depictions are therefore challenging for artists, and even more so for machines.” Sam also recommends an article in Nature by Davide Castelvecchi on how DeepMind’s AI platform learns physics like a baby.
Finally, with all this space talk, it’s time to think about alien contact again. Sam recommends Daniel Oberhaus’s book Extraterrestrial Languages, which asks two provocative questions, “If we send a message into space, will extraterrestrial beings receive it? Will they understand?”
That’s it, folks. Have questions, comments, or ideas? This newsletter is sent from my email, so you can just click reply.
Forcing China’s AI researchers to strive for chip efficiency will ultimately shave America’s lead
Right now, pathbreaking AI foundation models follow an inverse Moore’s law (sometimes quipped as “Eroom’s law,” Moore spelled backwards). Each new generation is becoming more and more expensive to train as researchers exponentially increase the number of parameters used and overall model complexity. Sam Altman of OpenAI said that the cost of training GPT-4 was over $100 million, and some AI computational specialists believe that the first $1 billion model is either in development now or soon will be.
As semiconductor chips rise in complexity, costs come down because transistors are packed more densely on silicon, cutting the cost per transistor during fabrication as well as lowering operational costs for energy and heat dissipation. With AI today, that miracle runs in reverse. To increase the complexity (and therefore, hopefully, the quality) of an AI model, researchers have attempted to pack in more and more parameters, each of which demands more computation both for training and for usage. A 1-million-parameter model can be trained for a few bucks and run on a $15 Raspberry Pi Zero 2 W, but Google’s PaLM, with 540 billion parameters, requires full-scale data centers to operate and is estimated to have cost millions of dollars to train.
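To make that scaling concrete, here is a minimal back-of-envelope sketch in Python. It uses the common heuristic that training compute is roughly 6 × parameters × training tokens; the GPU throughput, the $2-per-GPU-hour price and the token counts are illustrative assumptions, not reported figures.

```python
# Back-of-envelope: why parameter count (times training data) drives training cost.
# All numbers are illustrative assumptions, not reported figures:
#   - training compute ~= 6 * parameters * training_tokens (a common heuristic)
#   - an A100-class GPU sustains ~150 teraFLOP/s on this workload
#   - cloud pricing ~= $2 per GPU-hour

def training_cost_usd(params: float, tokens: float,
                      gpu_flops_per_s: float = 150e12,
                      usd_per_gpu_hour: float = 2.0) -> float:
    total_flops = 6 * params * tokens                 # heuristic total training compute
    gpu_hours = total_flops / gpu_flops_per_s / 3600  # seconds -> hours
    return gpu_hours * usd_per_gpu_hour

print(f"1M params, 1B tokens:     ~${training_cost_usd(1e6, 1e9):,.0f}")
print(f"540B params, 780B tokens: ~${training_cost_usd(540e9, 780e9):,.0f}")
```

Under those assumptions, the toy model costs a couple dozen dollars to train while the PaLM-scale run lands in the millions, which is the gap the paragraph above is gesturing at.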
Admittedly, simply having more parameters isn’t a magic recipe for better AI end performance. One recalls Steve Jobs’s “Megahertz Myth” campaign, which tried to persuade the public that headline megahertz numbers weren’t the right way to judge the performance of a personal computer. Performance in most fields is a complicated thing to judge, and just adding more inputs doesn’t necessarily translate into a better output.
And indeed, there is an efficiency curve underway in AI outside of the leading-edge foundation models from OpenAI and Google. Researchers over the past two years have discovered better training techniques (as well as recipes to bundle these techniques together), developed best practices for spending on reinforcement learning from human feedback (RLHF), and curated better training data to improve model quality even while shaving parameter counts. Far from surpassing $1 billion, training new models that are equally performant might well cost only tens or hundreds of thousands of dollars.
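To put rough numbers on that efficiency curve, here is the same illustrative cost heuristic applied to a smaller model trained longer on better-curated data (a Chinchilla/LLaMA-style recipe); again, every figure is an assumption for illustration, not a reported training bill.

```python
# Same illustrative heuristic as the sketch above:
# compute ~= 6 * params * tokens, ~150 teraFLOP/s per GPU, ~$2 per GPU-hour.
def training_cost_usd(params: float, tokens: float,
                      gpu_flops_per_s: float = 150e12,
                      usd_per_gpu_hour: float = 2.0) -> float:
    return 6 * params * tokens / gpu_flops_per_s / 3600 * usd_per_gpu_hour

# A 13B-parameter model trained on ~1 trillion curated tokens vs. a 540B-parameter giant.
print(f"13B params, 1T tokens:    ~${training_cost_usd(13e9, 1e12):,.0f}")
print(f"540B params, 780B tokens: ~${training_cost_usd(540e9, 780e9):,.0f}")
```

Hundreds of thousands of dollars versus many millions: a gap wide enough, if it holds, to change who can afford to play.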
This AI performance envelope between dollars invested and quality of model trained is a huge area of debate for the trajectory of the field (and was the most important theme to emanate from our AI Summit). And it’s absolutely vital to understand, since where the efficiency story ends up will determine the sustained market structure of the AI industry.
If foundation models cost billions of dollars to train, all the value and leverage of AI will accrue and centralize to the big tech companies like Microsoft (through OpenAI), Google and others with the means and the teams to spend lavishly. But if the performance envelope reaches a significantly better dollar-to-quality ratio in the future, the whole field opens up to startups and novel experiments, while the leverage of the big tech companies would be much reduced.
The U.S. right now is parallelizing both approaches toward AI. Big tech is hurling billions of dollars at the field, while startups are exploring and developing more efficient models given their relatively meagre resources and limited access to Nvidia’s flagship chip, the H100. Talent — on balance — is heading, as it typically does, to big tech. Why work on efficiency when a big tech behemoth has money to burn on theoretical ideas emanating from university AI labs?
Without access to the highest-performance chips, China is limited in the work it can do on the cutting-edge frontiers of AI development. Without more chips (and, in the future, the next generations of GPUs), it won’t have the competitive compute power to push the AI field to its limits the way American companies can. That leaves China with the only other path available: to follow the parallel course of improving AI through efficiency.
For those looking to prevent the decline of American economic power, this is an alarming development. Model efficiency is what will ultimately allow foundation models to be preloaded onto our devices and open up the consumer market to cheap and rapid AI interactions. Whoever builds an advantage in model efficiency will open up a range of applications that remain impractical or too expensive for the most complex AI models.
Given U.S. export controls, China is now (by assumption, and yes, it’s a big assumption) putting its entire weight behind building the AI models it can, which are focused on efficiency. Which means that its resources are arrayed for building the platforms to capture end-user applications — the exact opposite goal of American policymakers. It’s a classic result: restricting access to technology forces engineers to be more creative in building their products, the exact intensified creativity that typically leads to the next great startup or scientific breakthrough.
If America were serious about slowing the growth of China’s still-nascent semiconductor market, it really should have taken a page from the Chinese industrial policy handbook and simply dumped chips on the market, just as China has done for years in industries from solar panels to electronics. Cheaper chips, faster chips, chips so competitive that no domestic manufacturer — even under Beijing’s direction — could have effectively competed. Instead, we are attempting to decouple from the second-largest chip market in the world, turning a competitive field where America is the clear leader into a bountiful green field of opportunity for domestic national champions to usurp market share and profits.
There were, of course, other goals beyond economic growth for restricting China’s access to chips. America is deeply concerned about the country’s integration of AI into its military, and it wants to slow the evolution of its autonomous weaponry and intelligence gathering. Export controls do that, but they are likely to come at an exorbitant long-term cost: the loss of leadership in the most important technological development of the decade so far. It’s not a trade-off I would have built trade policy on.
The life and death of air conditioning
Across six years of working at TechCrunch, no article triggered an avalanche of readership or inbox vitriol quite like “Air conditioning is one of the greatest inventions of the 20th Century. It’s also killing the 21st.” It was an interview with Eric Dean Wilson, the author of After Cooling, about the complex feedback loops between global climate disruption and the increasing need for air conditioning to sustain life on Earth. The article was read by millions and millions of people, and hundreds of people wrote in with hot air about the importance of their cold air.
Demand for air conditioners is surging in markets where both incomes and temperatures are rising, populous places like India, China, Indonesia and the Philippines. By one estimate, the world will add 1 billion ACs before the end of the decade, and the market is projected to keep expanding through 2040. That’s good for measures of public health and economic productivity; it’s unquestionably bad for the climate, and a global agreement to phase out the most harmful coolants could keep the appliances out of reach of many of the people who need them most.
This is a classic feedback loop, where the increasing temperatures of the planet, particularly in South Asia, lead to increased demand for climate resilience tools like air conditioning and climate-adapted housing, leading to further climate change ad infinitum.
Josh Wolfe gave a talk at Stanford this week as part of the school’s long-running Entrepreneurial Thought Leaders series, covering all things Lux, defense tech and scientific innovation.
Lux Recommends
As Henry Kissinger turns 100, Grace Isford recommends “Henry Kissinger explains how to avoid world war three.” “In his view, the fate of humanity depends on whether America and China can get along. He believes the rapid progress of AI, in particular, leaves them only five-to-ten years to find a way.”
Our scientist-in-residence Sam Arbesman recommends Blindsight by Peter Watts, a first contact, hard science fiction novel that made quite a splash when it was published back in 2006.
Another recommendation: a deeply reported investigation into Mohammed bin Rashid Al Maktoum, the ruler of Dubai, and just how far he has been willing to go to keep his daughter tranquilized and imprisoned. “When the yacht was located, off the Goa coast, Sheikh Mohammed spoke with the Indian Prime Minister, Narendra Modi, and agreed to extradite a Dubai-based arms dealer in exchange for his daughter’s capture. The Indian government deployed boats, helicopters, and a team of armed commandos to storm Nostromo and carry Latifa away.”
Sam recommends Ada Palmer’s article for Microsoft’s AI Anthology, “We are an information revolution species.” “If we pour a precious new elixir into a leaky cup and it leaks, we need to fix the cup, not fear the elixir.”
I love complex international security stories, and few areas are as complex or wild as the international trade in exotic animals. Tad Friend, who generally covers Silicon Valley for The New Yorker, has a great story about an NGO focused on infiltrating and exposing the networks that allow the trade to continue in “Earth League International Hunts the Hunters.” “At times, rhino horn has been worth more than gold—so South African rhinos are often killed with Czech-made rifles sold by Portuguese arms dealers to poachers from Mozambique, who send the horns by courier to Qatar or Vietnam, or have them bundled with elephant ivory in Maputo or Mombasa or Lagos or Luanda and delivered to China via Malaysia or Hong Kong.”