Sunday, April 26, 2026

fencing the non-rivalrous data commons, and the sovereign internet that never really was

Lovely idea, but the declarations of independence of cyberspace notwithstanding, the vision of a new politics is not standing up well to the great firewalls and random total isolation of states (yes, old-fashioned states, defined by geographic boundaries, including space and maritime, like China, Iran, North Korea, but now lots of little old places in Europe)... it isn't that hard to find all the ingress/egress points (including satellite up/downlinks etc.) and plug them, after all...


Meanwhile, the data commons that was the World Wide Web is being rivalled to death by hyperscalers, starting way back with Google (search and click-through), but now with AI.

Economists love to talk about data as non-rivalrous, because of their naive model of the zero copy cost of bits: anyone can "take" a copy, but the source is also left behind, so there's no "loss" of value to the source.

However, attention (via metadata, provenance, attribution, even accounting/payment)... is very rivalrous - I only have two eyeballs, one brain and so many hours each day, so it matters which copy I look at.

When the hyperscalers "add value" to the data, they subtract it from the originators. Worse, by trying to multiply the value (by combining it), without recompense to the content creators, they undermine people's motives to contribute any content. The consequence is the death of Metcalfe's law (the value of the network is the square of the number of nodes, or at least super-linear in the number of nodes, since any node added is both a consumer and a producer, so adding one more node to a network of n adds n new links to the net value, not just 1). This means the hyperscalers' long-term business plan is doubly dead. Their value cannot actually multiply once they have trained on the common crawl - it will increase (at best) sub-linearly. In fact, as it de-motivates people from contributing any new content, the hyperscalers' value will fall. We don't want to be data serfs to the AI overlords.
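The super-linear argument can be sketched in a few lines of Python, taking the pairwise-link count as a simple stand-in for Metcalfe value (my illustration, not from the post):

```python
# Sketch: pairwise-link value of a network, V(n) = n*(n-1)/2,
# and the marginal value of adding one more node.

def metcalfe_value(n: int) -> int:
    """Number of distinct pairwise links among n nodes."""
    return n * (n - 1) // 2

def marginal_value(n: int) -> int:
    """Extra links gained when one node joins a network of n nodes."""
    return metcalfe_value(n + 1) - metcalfe_value(n)

# Adding a node to a 10-node network creates 10 new links, not 1:
assert marginal_value(10) == 10
```

The point being: each newcomer adds value proportional to the size of the existing network, which is exactly what evaporates if newcomers stop contributing.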

So AI wasn't the existential threat to humans after all, but it is an existential threat to knowledge.

Wednesday, April 22, 2026

Computer Science Fiction

 I was randomly thinking about stories that had no actual computing technology dimension, but illustrated a fundamental idea from Computer Science in some accessible way - so here are a few examples to be getting on with...

naming

bootstrapping

indirection

off-by-one-errors 

caching

recursion

encryption


Saturday, March 14, 2026

some of my random acts of digital and literary london history

Bits:-

I was recently trawling through some bits of my past and recalled that I was taken to program a DEC PDP-8 (in an outbuilding of the Royal Free hospital) in the 1960s by an enlightened maths teacher - we had to learn machine code (actual machine code, because we had to programme the machine on paper tape, and so we learned the binary for the instructions)...

A bit later, in the 1970s, I did some work for my cousin at the London Hospital on a database he was building for a surgeon there who did knee & hip replacements and wanted to do some stats on how long each prosthetic type lasted... this was programmed in Algol-68 and involved punch cards and use of a remote ICL 2980 computer down the Mile End Road at Queen Mary College.

Not long after that, I got a real job at North London Polytechnic (up on Holloway Rd) in the maths & CS department, where we ran a DEC-10 and I got to do some Fortran and Cobol (as well as Algol-60), but with the glory of a real glass tty (terminal) and a screen editor (SOS)... I also used to go to their teacher-training outfit on Prince of Wales Road to set up a modem link for people learning a bit about computers there to use the DEC-10 remotely...

Moving to the "modern" era, in 1981 I was writing C code on a PDP-11 (using the DED screen/picture editor) and cross-loading code to a DEC LSI-11/23... at UCL down on Gower St...

Words:-

Through the same three decades, my family (based mainly in Parliament Hill Fields, then later in Camden Town) had a sequence of literary/political friends, so as a kid I was playing in Stella Gibbons's house (with her grandson), or else with Benji Webb on Highgate Hill - at some point we had a holiday with ponies and caravans near their grandmother's (the famous Beatrice Webb of the Fabians etc.). Then I recall my father asking me if I knew who Ivor Cutler was, as my dad went drinking with him in local pubs in Mornington Crescent. We also knew Beryl Bainbridge (we were at this point living in Arlington Road and she was round the corner in Albert Street, surrounded by eccentrics and a large cloud of cigarette smoke).

Saturday, February 07, 2026

from cyberspace to necropolis

The Internet was dismantled as a space for humanity, first by the loss of community binding through person-to-person trust and group dynamics - 

blame blockchain/cryptocurrencies, but they are just one symptom of the alienation that built environments (cities, then suburbs, then online social media and the web) finally perfected.

As with crypto-currencies, so with AI - but this was foreseen by Mbembe in the analysis in his great book Necropolitics.

I was the Son of Sam, I contain multitudes, Now  I am become Death, the destroyer of worlds.

As with the commons: land was enclosed, by Barons, by Kings, by Nation States - to keep people out, but camps and camp firewalls to keep people and ideas in - and so with trade agreements, and the flow of information and knowledge, and so with the Interweb.

The Zombie Apocalypse is here and it is AI Voodoo, just as Neuromancer presaged. And the zombies are the camp police. And we are interned, stoned, maculate, and soon not even dead.

p.s. a little machine learning is a dangerous thing...


Tuesday, January 27, 2026

fabless industries aren't exactly new

I'm reading Apple in China (see), which is interesting at the detail level (basically how they trained up, with Taiwan's help, literally millions of skilled people to produce all their tech - it is now hardly surprising that China doesn't need the US's help any more to make new stuff...)

But the author seems to think that outsourcing tech manufacturing was something terribly new and clever.

Don't get me started on clothing and the east india company and the british empire...

Closer to home for the USA, however, is a much more instructive story - the electric guitar:

(Also lots of other instruments - remember Yamaha flutes were jolly good and way cheaper than Gemeinhardt.)

but Gibson and Fender both started manufacturing in Mexico and Japan, then later in Indonesia - sometimes trying to "brand" things differently, so that western prejudice about "lower build quality" from those foreigners was offset by calling things Epiphone or Squier (actually it's a bit more complex than that, but you get the idea)...

But in reality the outsourced products weren't just a whole lot cheaper. I own a Japanese-made custom shop Fender strat and a 1982 Squier tele (possibly/probably made in Korea) - both are very, very good - better than instruments I've tried at 5-10 times the price (the tele was off a friend, but at the time they were £150; the strat was used, so it's hard to know what it cost new, but probably £1500).

And nowadays, I'd still buy a Squier or Epiphone (I had a beautiful 335 for a while), or just bite the bullet and buy an Ibanez (actually, just did - £300 - fantastic quality) - though I did recently buy a G&L Tribute (fretless) bass which was made in the USA and was very sensibly priced (under £500).

Interestingly, I'm just reading Innovation in Real Places too, which tells a similar story about bike manufacturing moving from the US (Schwinn et al.) to Taiwan (Giant) for exactly the same set of reasons....

Anyhow, what Apple did (hollowing themselves out, bleeding to death) was absolutely nothing new. It wasn't even rock'n'roll.

Sunday, January 18, 2026

Mind your Ps and Qs, and the K, M, G, and Ts will look after themselves....

  • M I remember back in the early 1980s Sun Microsystems (old unix workstation maker of fine machines) said that the breakthrough was the 3Ms - a Megabyte of memory, MegaHz processor speed and Megabits of networking (I seem to recall 4MB, 4MHz and 3Mbps). Prior to this, of course, we'd had 64 Kbytes of RAM in PDP-11s, 64 kbps of network speed, and processor clocks in the kHz range.
  • G A decade later, Craig Partridge published a great book on Gigabit networking, and storage and processor speeds were, indeed, at least heading for Gbytes and GHz.
  • T It took a bit longer, but it is certainly possible to get Tbps or close (800 Gbps) of networking, and Tbytes of storage (RAM is still a bit pricey for that to be common, but in cluster compute it's a thing, and on my laptop a Tbyte SSD is totally affordable). While Moore's law ran out of steam fairly recently, so an individual core might still be GHz, I can have a lot of cores, so total processing throughput (including CPUs and GPUs) might easily look like THz.
  • P It's interesting to speculate what tech will look like for Petabits a second, but the optical fibre photonics folks definitely can do that, and there are other transmission technologies which are fast and affordable. In storage, there seems to be no obvious barrier either - in fact, if either glass or synthetic DNA storage gets to market (and that is not science fiction), then Petabyte storage (at least write-once/append) could easily be affordable soon too. Then there's processing speed - I think this will probably not happen in a simple way, partly because it probably isn't necessary in personal devices, but I could be wrong - I don't think it would come from neuromorphic hardware, because that doesn't have to be fast; it is just massively more efficient than using tensor processors to do operations on neural network graph data structures.
  • Q - so what about quantum computing? While they don't really work yet, and are not exactly going to fit in your new rayban specs or wrist-borne device, they might help speed things up (in some, possibly rather limited, cases), however...
Many of those hyperscale companies led by personalities are guilty of massive hubris. Because they are hugely successful in one domain, their leaders assume they can repeat that in arbitrary other areas, and I think this has a massively negative effect on anyone else making progress (or even just getting funding). Four examples of over-claiming that have happened multiple times in recent years...
  • Quantum Supremacy - multiple times there have been announcements that embarrassingly jumped the gun on anyone actually making a working quantum computer that a) has enough qubits to be useful and b) has qubits that don't get overwhelmed by noise or decoherence so fast they are not even a ghost in the machine.
  • Neural Interfaces - there have been beautiful experiments with bio-feedback systems that are now being used to treat various neurological conditions (and problems with physical consequences, like Parkinson's) - but having an everyday mind-computer interface is still science fiction really, despite various people jumping the gun.
  • Metaverse - we've (myself included) been doing decades of work in immersive virtual reality. It was a staple of cyberpunk SF books 30+ years back (and more). But it is still a mess outside of games. While augmented reality as a tool (e.g. for repairing things, or even surgery) will totally be a thing, I think the metaverse (as per Snow Crash, at least, let alone Neuromancer or Altered Carbon, or even just the old Star Trek Holodeck) is not with us yet affordably. I think this is more a failure of use case/compelling application than actual technology, however.
  • AGI - all the money in the world is being spent on better predictive text, but the same people getting the money promise AGI. Hah. Bunch of Marketroids (that's not an insult - they are very good at it given the levels of investment they are getting compared to the total paucity of actual usefulness of their tech beyond prompting governments to worry about their energy grids).
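The K-through-P prefix ladder earlier in this post is just powers of ten, and the "lots of cores looks like THz" claim is simple arithmetic; a tiny sketch (my own numbers, purely illustrative):

```python
# Illustration: SI prefixes, and how many GHz-class cores it takes
# for aggregate clock throughput to "look like" a terahertz.

PREFIXES = {"K": 1e3, "M": 1e6, "G": 1e9, "T": 1e12, "P": 1e15}

core_clock_hz = 3 * PREFIXES["G"]              # a single ~3 GHz core
cores_needed = PREFIXES["T"] / core_clock_hz   # cores for ~1 THz total
print(f"cores for ~1 THz aggregate: {cores_needed:.0f}")  # prints 333
```

A few hundred cores is well within a single rack (or a big GPU) today, which is why aggregate THz is unremarkable even while a single core stays stuck at GHz.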


Thursday, December 04, 2025

UK government introducing digital id by stealth - without proper inclusion or safeguards as far as I can see...

 If you are a director of a company in England, you will recently have received a letter requiring you to register for this: https://www.gov.uk/using-your-gov-uk-one-login to be able to continue to do some things (it is mandatory for company directors). https://www.gov.uk/guidance/verifying-your-identity-for-companies-house

When I did this, the website said that you could do this at a post office if you only had paper documentation (e.g. birth certificates, paper passports etc.) and could use notaries to verify some of the documents.

However, it seems that this alternative may never have worked properly, since now, if you look at the Companies House website guidance, you have to use the digital service first to input your document details online.

Talking to someone else more recently, they said that this meant they had to step down as a company director, since they did not wish (or perhaps were not able) to use a digital id upload.

Is this legal? The government is effectively excluding people from some things in society if they cannot use the digital service - and it appears that Companies House forces this, when you look at the link for verifying your id via the Post Office.

This seems inequitable at the least, and possibly a massive security danger, given the risk of leakage of photo id, which many online organisations have so publicly experienced in recent years....

Also, the behavioural/liveness checks on the photo-id look naive, when many biometric systems are moving to multi-modal checks (e.g. face + movement + speech/speaker recognition when saying a randomly chosen phrase, or face + movement + fingerprint or iris, etc.)

So is the UK government a) introducing digital id by stealth, and worse, b) introducing a centralised, insecure system, c) ignoring any requirements for inclusivity, and d) doing all this without any public debate?
