Securities

Thankful Holidays

Photo by Sergei Solo on Unsplash

Thanks for all the fish, err, turkey

With slight homage to Douglas Adams, we’re here this week with a briefer newsletter and off next week entirely for the Thanksgiving holiday.

It’s a tough time out there in a world riven with wars, conflicts, divides and binaries. So whether you believe in the reality, the myth or perhaps the nihilism of Thanksgiving, I encourage every “Securities” reader to seek out what’s positive and what’s worth affirming in their families, friends, communities and world. Fire is destructive, but it is also the fuel for the wick that lights our dinner tables and illuminates humanity’s best path forward. Cherish the present and aspire toward the future — and eat way more than you ever thought possible (the golden formula of progress = Thanksgiving + Ozempic).

Podcast: Thanks for nothing, AI regulators

Photo by Chris Gates via DALL-E

I wrote two weeks ago in “Reckless Regulators” that politicians and government bureaucrats seem to have unified around limiting artificial intelligence technology before it even gets started. As I wrote about the situation:

For such a nascent and unproven technology, it behooves us to tread much slower on regulation. We need to encourage widespread experimentation and openness in the development of the bleeding-edge of AI performance and capabilities. We should be encouraging the distribution of open-source AI models to as many scientists and institutions and users as possible. Everyone should have access to the best AI models humanity has ever crafted.

This week’s podcast picks up the theme with a handful of experts on the AI landscape and how the regulatory regime interacts with it. Joining me are Techmeme Ride Home podcast host Brian McCullough, Supervised newsletter founder and writer Matthew Lynley, and Lux’s own general partner Shahin Farshchi.

We talk about the latest regulatory announcements from governments all around the world, and then get into the meat of the debate: will open or proprietary AI models win in the market? It’s the hot topic du jour, and a perfect complement to an extra helping of stuffing during the holidays.

🔊 Listen to the podcast here

Lux Recommends

  • Our scientist-in-residence Sam Arbesman has a good column on his personal Substack around “The Kitchen Sink Conundrum and Simulation’s Balancing Act.” “This tradeoff between complexity and accuracy is a humble realization. Do not add complexity in the hopes of greater verisimilitude, as not only can there be a diminishing return to this effort, but it could even be entirely counter-productive.”
  • Thomas Johnson at the New Civil Engineer discusses why high-speed rail in the United Kingdom just can’t be built. The answer? Overengineering. “[Former Rail magazine editor Nigel Harris] gave the example of the electrification of the Great Western line. He said that, when the East Coast Main Line was electrified in 1989-90, the masts only went 2.5m into the ground and they have never been known to fall over. However, when it came to electrifying the Great Western in recent years, the masts were overengineered to go 10m into the ground to ensure stability to an even greater - perhaps unnecessary - degree.”
  • Sam forwards a Hackaday write-up of a fun clock made of marbles. “Here’s how it works: black and white marbles feed into a big elevator. This elevator lifts marbles to the top of the curved runs that make up the biggest part of the device. The horizontal area at the bottom is where the time is shown, with white and black marbles making up the numerical display. But how to make sure the white marbles and black marbles go in the right order?”
  • Our venture associate Alex Marley highlights Microsoft’s announcement this week of its first custom silicon for AI processing applications. “Microsoft is the last of the Big Three cloud vendors to offer custom silicon for cloud and AI. Google pioneered the race to custom silicon with its Tensor Processing Unit, or TPU, in 2016. Amazon followed suit with a slew of chips including Graviton, Trainium, and Inferentia.”
  • Shaq Vayda highlighted a milestone in pharma: the first approval in the UK of a CRISPR-Cas9 therapeutic. The therapy, jointly developed by Vertex and CRISPR Therapeutics to treat sickle cell anemia, was described by Time Magazine: “Patients first receive a course of chemotherapy, before doctors take stem cells from the patient's bone marrow and use genetic editing techniques in a laboratory to fix the gene. The cells are then infused back into the patient for a permanent treatment. Patients must be hospitalized at least twice — once for the collection of the stem cells and then to receive the altered cells.”
  • Sam and I both read the exciting new research and model out of Google’s DeepMind called GraphCast which can predict weather with more accuracy and forewarning than any other current system. The research was written up in Science: “GraphCast significantly outperforms the most accurate operational deterministic systems on 90% of 1380 verification targets, and its forecasts support better severe event prediction, including tropical cyclones tracking, atmospheric rivers, and extreme temperatures. GraphCast is a key advance in accurate and efficient weather forecasting, and helps realize the promise of machine learning for modeling complex dynamical systems.”
  • Finally, Sam enjoyed James Somers’s thoughtful essay in The New Yorker on “A Coder Considers the Waning Days of the Craft” (a headline which doesn’t do the piece real justice). “Computing is not yet overcome. GPT-4 is impressive, but a layperson can’t wield it the way a programmer can. I still feel secure in my profession. In fact, I feel somewhat more secure than before. As software gets easier to make, it’ll proliferate; programmers will be tasked with its design, its configuration, and its maintenance.”

That’s it, folks. Have questions, comments, or ideas? This newsletter is sent from my email, so you can just click reply.
