Securities

Software Ephemerality

Photo by Matt Mets via Flickr / Creative Commons

How permanent should software be?

We take for granted just how effortlessly consistent our daily life is. Our stoves burn with the swoosh of a knob, our stop signs (at least in the U.S.) are a stable hue of red, our wall outlets haven’t changed prongs in decades. The stairs, the shower head, the garage door — hardware devices that may not be the pinnacle of design, but at least tirelessly function as one would expect.

That stable world is being shaken though. Hardware is getting smart, and thus dumb. Smart fridges crash after a software update and can’t be opened. Smart light bulbs can’t find a Wi-Fi signal and fail to illuminate. Autonomous cars don’t stop properly, or they just drive away when the cops pull them over. As anyone who has ever tried to trick out their home with smart devices knows, “just plug it in” is rarely the experience they actually get.

C.P. Snow once coined “two cultures” to describe the epistemological differences between science and the humanities, but a modern version within engineering would separate the “hard” fields of electrical, civil, and hardware engineering from the “soft” engineering of, well, software. When the two worlds fuse, the kludge of software tends to dominate over the stability of hardware.

Our scientist in residence Sam Arbesman wrote a short post recently on the lack of long-term thinking in the world of software. He notes, “But here’s the problem: tech, and especially software, is inherently transient. Code, in many ways, is fragile and delicate: it can easily fail or become buggy or be rendered obsolete. An online service rarely operates the way it should even just a few years after it is created.” In his view, that culture creates a focus on rapid but ephemeral action: if you try to code a stable product, watch as others race ahead.

Indeed, the half-life of software viability can sometimes be as short as hours for those in quantitative financial trading. Meanwhile, there are roads built by the Romans that are still functional today.

This is a choice. And it’s an important one to critique as software continues to dominate not just the economy but also the future of labor. In March in “We’re going to try to stop him,” I noted that:

The pressure will only intensify. As I wrote in TechCrunch in late 2020, the no-code generation is arriving. Empowered by platforms like Roblox and Minecraft, kids growing up today are learning digital skills and coding at earlier ages and with greater facility than ever before. If I am optimistic about anything, it is that the next generation is going to have an incredible wealth of talent to use technology to change the world.

There is certainly bountiful software talent maturing, but what happens when that acumen and ambition mixes with the current ephemerality of our software culture?

Nothing short of a generational waste.

Software is about abstractions, with each layer of code building on the layers below it. When a lower layer changes, it forces changes on every layer that depends on it. That’s why the kernel of an operating system tends to be extremely stable while the software libraries built on top of it are mostly stable but evolving. No application could be written if that infrastructure were any less permanent.
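To make that layering dynamic concrete, here is a minimal, hypothetical Python sketch (the library and function names are invented for illustration, not drawn from any real project): a lower-layer library changes its interface between releases, and downstream code that was never touched breaks as a result.

    # Hypothetical illustration: a lower layer changes its interface,
    # and unchanged downstream code breaks as a result.

    # --- storage library, version 1.x (the layer applications depend on) ---
    def save(record: dict) -> None:
        """Persist a record; callers pass a single dict."""
        print(f"saved {record}")

    # --- storage library, version 2.0 (the maintainers "improve" the API) ---
    # (the redefinition below stands in for upgrading to the new release)
    def save(record: dict, *, namespace: str) -> None:
        """Now requires a namespace that 1.x callers never supplied."""
        print(f"saved {record} in namespace {namespace!r}")

    # --- downstream application code, written against 1.x and never updated ---
    save({"id": 1})  # worked on 1.x; raises TypeError under 2.0

The more stable the lower layer’s contract, the less of this churn propagates upward — which is exactly the property operating system kernels guard jealously and much of today’s application stack does not.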

Unfortunately, far too many software layers in our world today are unstable. Every application feels like it’s built on a fault line, just waiting for the earthquake to shake all the code asunder. Apple and Google tweak their mobile operating systems, and tens of thousands of developers have to assess the damage.

That chaos and rapid adaptation made sense over the past few decades as software engineers explored just what was possible with computing and code. There are still plenty of areas where that hyper-iteration should remain the rule, but simultaneously, there are more and more areas where the expected functions of software are well defined and stability should reign supreme. We need to transition more of our software from the former zone into the latter.

Interestingly, the mostly immutable nature of crypto and blockchain technologies is one bright light in the search for software stability. The Bitcoin protocol and its reference implementation have had updates applied to them over the years, but fundamentally, the protocol remains essentially indistinguishable from the technology described in the Satoshi white paper published in 2008. Fourteen years is a very long time in software.

The immutability of crypto protocols has been mocked relentlessly with each new bug and hack discovered. But that mocking misses a key observation: software engineers today are not at all comfortable with immutability. The very notion that software can never be patched or updated is anathema to all but a handful of software engineers working in high-availability technologies like phone switching or air traffic control.

That’s precisely the cultural change we need to start inculcating though. We need to lengthen the half-life of software, and in order to do that, we need to reset the expectations developers have for the quality of their work. Coding will need to be slower and more deliberate. More peer reviews will be required, and some level of that dreaded “bureaucracy” is all but inevitable to raise the quality bar.

With new, higher standards in place though, something miraculous will happen: software can and will just work. Cybersecurity issues that plague hastily written software connected to the internet will be minimized, if not eliminated entirely. Perhaps most importantly, users will enjoy a consistent experience and a sense of relaxation when their favorite app doesn’t suddenly fail or change at a moment’s notice.

There are a couple of accelerants that can move the industry forward. From a decentralized perspective, software engineers can determine the culture of their own teams and the stability of their products. They have the power to change the trajectory of software for the better by instituting their own higher standards.

The key accelerant in my mind, though, is insurance and warranties. Cyberhacks remain a measly cost for most companies, which face limited consequences for poorly written software. In addition, companies offer only short guarantees for their software interfaces outside of long-term enterprise support contracts.

We can change the balance here. Cyber risk insurance can be made mandatory, with much more variance in premiums between companies with strong, stable coding practices and companies with churn-and-burn approaches. Warranties on software could be required and extended, forcing companies to consider how to support their software beyond the short-term, throwaway timeframes they use today.

Such a standard would change not just the artifact of software itself, but would also instill a mode of thinking for the next generation of software engineers rising through school and into the workforce. No child needs to provide a warranty for their Roblox games, but a transition to professionalization will naturally happen if the software industry actually has high professional standards around its work.

Our expectations around software are so low that it doesn’t even faze us when the simplest features fail. We need to demand better. No designer changes the color of stop signs every year, and no software engineer should have to change their software on such a rapid cadence. It’s time to do what Snow asked with “two cultures” and what E.O. Wilson discussed in his book “Consilience” — we need to bridge the systems of thinking that divide our engineering disciplines and bring them together so that the smart devices of the future are actually smart.

Science funding and the growing specter of disasters

Glenn Beltz / Flickr / Creative Commons

Speaking of two cultures, this week also saw the publication of two mammoth reports.

Sam highlighted New Science’s new report on the National Institutes of Health (NIH), written by Matt Faherty, who interviewed dozens of intramural and extramural NIH research scientists to explore what’s working and what’s failing at America’s premier and nonpareil biomedical science funding organization.

There is much to love about the NIH — America dominates in biomedical research for a reason — but the institution has also become sclerotic and unwilling to adapt, according to Faherty, who believes that adding more money to the NIH’s budget could actually complicate the organization more than help it:

Giving the NIH more money could lead to more of the same. More low-margin research will be funded by a hopelessly broken study section system. The universities will take more money for their giant, new laboratories and colossal administrative staff. The entire bioscience industry will have an even stronger incentive to base all of its standards on the NIH. Bioscience as a whole could slow in the long run.

Perhaps it’s not surprising given New Science’s mission, but the conclusion is that the NIH should exert less monopolistic control over the future of biomedical research and pluralize the research funding system, encouraging more experimental work and less of the conservative but expensive research that has become a mainstay in recent years.

The NIH has massively shifted its budget over the past two years in response to the Covid-19 pandemic, and it’s precisely the growing scale of disasters that the United Nations Office for Disaster Risk Reduction’s new 2022 report on the state of disasters takes stock of. The trend is simple: disasters are up, and so are the costs of preventing and responding to them. From the summary of the report:

Risk creation is outstripping risk reduction. Disasters, economic loss and the underlying vulnerabilities that drive risk, such as poverty and inequality, are increasing just as ecosystems and biospheres are at risk of collapse. Global systems are becoming more connected and therefore more vulnerable in an uncertain risk landscape. Local risks, like a new virus in Wuhan, China, can become global; global risks like climate change are having major impacts in every locality. Indirect, cascading impacts can be significant.

As I write this, a heat wave in India and Pakistan is putting millions at risk as wet-bulb temperatures approach the limit of human survival.

Clocking in at 256 pages, the report’s not a short read. But what it lacks in brevity it more than makes up for in dynamic range. It has everything from the human psychology of decision making to complex systems analysis to a panoply of actions, from the small to the systemic, for compensating for the growing risks that plague our planet today. Risk reduction isn’t just a duty of governments — it applies to every investment fund and corporate executive as well. So skim the report, and check in with your risk management team too (they do exist, right?).

“Securities” Podcast: The future of biotech is moving from bench to beach

The "Securities" podcast. Artwork by Chris Gates

While a huge amount of attention is being directed at crypto and media these days, one of the most important wild card investment trends of the 2020s is the coming expansion of biotech. Democratized science tools, improved research networking, and lab automation will revolutionize the practice of biotech, and that means there are huge opportunities for intrepid founders. But there’s a catch: biotech stock performance has been abysmal the past year, and many investors are walking away from the market.

Josh Wolfe joins me to talk about what the gyrations in the biotech markets mean for startups, some developments around the Human Genome Project, and what strategies existing biotech firms can take to weather the coming consolidation and reinvention of the industry.

Listen here on Anchor (which has direct links to the episode on Apple Podcasts and Spotify as well). And as always, you can subscribe to the podcast on Apple Podcasts and Spotify.

Lux Recommends

  • Sam Arbesman recommends Arthur Brooks’s new book “From Strength to Strength: Finding Success, Happiness, and Deep Purpose in the Second Half of Life.” He says it’s “great and has hints of a modern-day Ecclesiastes.”
  • Peter Hébert recommends Liz Sly’s article in the Washington Post on how citizen action in Belarus slowed the Russian advance in Ukraine. “Starting in the earliest days of the invasion in February, a clandestine network of railway workers, hackers and dissident security forces went into action to disable or disrupt the railway links connecting Russia to Ukraine through Belarus, wreaking havoc on Russian supply lines.”
  • Shaq Vayda recommends a new research paper from a team of University of Illinois and Google researchers who built a supervised learning model to translate molecules into text-based language descriptions. Natural language processing has seen incredible advances in the past few years, and new models like MolT5 could allow that progress to be applied to molecular science.
  • Sam recommends a new report from Lee Drutman, published through the Niskanen Center, on “How Democracies Revive.” His solution involves institutional changes in the United States to add proportional and multi-party elements to our political system.
  • I recommend Elizabeth Pennisi’s article in Science on the psychobiome. “A growing number of researchers see a promising alternative in microbe-based treatments, or ‘psychobiotics,’ a term coined by neuropharmacologist John Cryan and psychiatrist Ted Dinan, both at University College Cork.”

That’s it, folks. Have questions, comments, or ideas? This newsletter is sent from my email, so you can just click reply.
