Notes: grey goo, permanence or otherwise, perspectives on Ross Anderson

The metaphors we use change how we think about things. Benjamin Santos Genta writes for Aeon about this, and in particular how many of our metaphors relate to war. Might an argument be better as a dance than a battle? (I've been trying to demilitarise my work language for a while; it's not always easy. You can, though, feed two birds with one scone, but that works less well with international colleagues.)

ReliefWeb reports on research into the value for money of recent humanitarian innovations. It was good to see my old colleagues at Field Ready featured.

Field Ready piloted a new model of localized medical product manufacturing in Syria, involving healthcare worker training, enhanced digital technology, and partnership-building between local medical device suppliers and medical facilities. Their pilot significantly reduced the time, costs, and carbon emissions of medical device production and repair without compromising product quality. TripleLine found that the aggregate cost of Field Ready repairs was 34% of the price that traditional methods would have incurred. In comparison, the aggregate number of days taken for these repairs was 56% of what would be required. Meanwhile, the evaluation found that Field Ready’s maintenance and repairs in 70% of medical centres were more precise and of higher quality than those of traditional maintenance suppliers.

Ian Betteridge writes about the "information grey goo." It's not that long ago that the nanotech apocalypse was on the agenda, and it was amusing to be reminded of it, although this version is looming and hard to counter:

This is the AI Grey Goo scenario: an internet choked with low-quality content, which never improves, where it is almost impossible to locate public reliable sources for information because the tools we have been able to rely on in the past – Google, social media – can never keep up with the scale of new content being created. Where the volume of content created overwhelms human or algorithmic abilities to sift through it quickly and find high-quality stuff.

The social and political consequences of this are huge. We have grown so used to information abundance, the greatest gift of the internet, that having that disrupted would be a major upheaval for the whole of society.

It would be a challenge for civic participation and democracy for citizens and activists, who would no longer be able to access online information, opinions, debates, or campaigns about social and political issues. 

With reliable information locked behind paywalls, anyone unwilling or unable to pay will be faced with picking through a rubbish heap of disinformation, scams, and low-quality nonsense.

Patricia Hernandez writes for Polygon about MrBeast and his influence on YouTube. 

At first, YouTube prized authenticity, and the prototypical creator was likely recording out of their bedroom on an iPhone. Once real money came into the picture, the same creators who built the platform became brands who made a living off of their popularity. ...

MrBeast embodies this ostentatious era of YouTube so fully... You see his influence everywhere, in the types of boisterous content people make, the brisk editing styles populating all of YouTube’s trending content, and even in the way YouTubers style their video thumbnails. ... 

So far, much of what’s been heralded in YouTube’s era of MrBeast sounds bleak, but the next era might already be upon us. If the old YouTube was Instagram, the new YouTube will be more like TikTok...

The internet, in other words, is hungry for authenticity — or at least a person they can detect as human to deliver their content. It’s the very thing YouTube once did best, once the internet moved past the supremacy of blogs.

Google is no longer caching web pages.

But:

[Screenshot of a toot about YouTube holding much knowledge, and the risk of that going away: https://aus.social/@ajsadauskas/111873470462906049]

Tantek Çelik writes about the spectrum from permanent to ephemeral on the open web:

There is the publicly viewable #OpenWeb that many of us take for granted, meaning the web that is persistent, that lasts over time, and thanks to being #curlable, that the Internet Archive archives, and that a plurality of search engines see and index (robots.txt allowing). The HTML + CSS + media files declarative web.

Then there are the https APIs that return JSON "web", the thing that I’ve started calling the ephemeral web, the set of things that are here today, briefly, gone tomorrow.

...  Nearly the entirety of every Mastodon server, every post, every reply, is ephemeral.

When a Mastodon server shuts down, all its posts disappear from the surface of the web, forever.
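To make the distinction concrete, here is a minimal sketch in Python of the two kinds of fetch Çelik is contrasting: a plain HTML page that curl, crawlers, and the Internet Archive can capture as-is, and a JSON API response that exists only while the server does. The URLs are illustrative examples, not part of his post, and some Mastodon instances require authentication for their public timeline.

```python
# A rough sketch, not a recipe: fetch one page from the "curlable" HTML web
# and one response from a JSON API. The URLs are illustrative examples only.
import json
import urllib.request

# The declarative, persistent web: HTML + CSS files that crawlers and the
# Internet Archive can fetch and store exactly as served.
with urllib.request.urlopen("https://tantek.com/") as resp:
    html = resp.read().decode("utf-8", errors="replace")
print(html[:120])  # archivable markup

# The ephemeral web: an API returning JSON, here Mastodon's public timeline.
# (Assumes the instance allows unauthenticated access; many do, some don't.)
api_url = "https://aus.social/api/v1/timelines/public?limit=1"
with urllib.request.urlopen(api_url) as resp:
    posts = json.load(resp)
print(posts[0]["created_at"], posts[0]["url"])  # gone if the server goes
```

The HTML response can be saved and replayed byte for byte; the JSON one reflects only what that server knows at that moment, which is the fragility both Çelik and the toot above are pointing at.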


It turns out that "if you can't measure it, you can't manage it" isn't the famous quotation it's thought to be, nor even a correct one:

"It is wrong to suppose that if you can’t measure it, you can’t manage it – a costly myth.”

W. Edwards Deming, The New Economics.

Drucker didn't say it either.


Boris Müller writes (on Medium - maybe paywalled?) about the complicated relationship between a person and a computer:

My problem with the term ‘user’ is not that it is inaccurate. My problem is that the term ‘user’ is suggestive. It indicates that a person is still fully in control of all the things that happen on their computer or their mobile phone. And I don’t think that is the case.

Our relationship with digital technology is constantly changing. We have been programmers, operators and users. Now, we are participants, contributors, consumers, clients. We are both actors and audience. But we are no longer users.

It is about time to come up with a new term that reflects the symbiotic relationship human beings have with computers. ....

Even if we stick with ‘user’, I think it is necessary to re-evaluate the term and realise that its meaning has changed. The moment we interact with a computer, we are deeply embedded in a socio-technological setting that is way beyond our direct control.

User Experience Design and User Interface Design should reflect this. As designers, we have to acknowledge that the autonomous user is an illusion. We have to think more about systems, relationships and interdependencies.


Ross Anderson died. A titan of practical security engineering, deeply aware of the human context (social, economic, and political) around technology, who shared wisdom and drove change. He helped me with security advice when I was developing the AlertMe connected home platform, including some of the most startling facts that changed how I thought about things and that stay with me still, nearly 20 years later. What struck me in all the many obituaries and reflections on his life was the variety of perspectives. People encountered him in such different contexts - as a researcher and educator, as an author, as a provocateur in governance structures, as an activist, as a catalyst for investment and change. Open Rights Group and Danny O'Brien saw him as a fighter. The Cambridge Department of Computer Science and Technology and Frank Stajano on the Light Blue Touchpaper blog saw him as an academic. Bruce Schneier saw him as a pioneer. Wendy Grossman saw him as a communicator. What a loss. His advice, his book Security Engineering, and his approach to thoughtful change-making will stick with me.



You can make almost anything correlate with something. Here are many fun spurious correlations, thanks to Tyler Vigen, complete with generated explanations.
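For a sense of why that is so easy, here is a small illustrative sketch (not from Vigen's site; the series names in the comments are made up): two independent random walks, standing in for any pair of quantities that drift over time, come out "strongly" correlated surprisingly often.

```python
# Simulate pairs of unrelated random walks and count how often they look
# "strongly" correlated. Trend alone is enough to manufacture correlation.
import numpy as np

rng = np.random.default_rng(2024)  # arbitrary seed
n_pairs, n_years = 2000, 50

strong = 0
for _ in range(n_pairs):
    a = np.cumsum(rng.normal(size=n_years))  # e.g. "cheese consumption"
    b = np.cumsum(rng.normal(size=n_years))  # e.g. "doctorates awarded"
    r = np.corrcoef(a, b)[0, 1]
    if abs(r) > 0.5:
        strong += 1

print(f"{strong / n_pairs:.0%} of unrelated random-walk pairs had |r| > 0.5")
```

Run it and a sizeable share of the pairs clears the |r| > 0.5 bar despite there being no relationship at all, which is exactly the trap trending real-world data lays for us.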

Vaughn Tan writes about uncertainty; this piece includes a useful taxonomy of different kinds of uncertainty, risk, and not-knowing.


Take the long view, and imagine 10 billion years from now (by John Michael Greer, from Adbusters, in Utne Reader).