Ric Amurrio
18 min read · Apr 2, 2019



We have more and more problems that are getting more and more tangled. We’ve got to be able to solve them at scale.

Douglas Engelbart


To the best of my knowledge, the first rock musical was performed fifty years ago today. No, you probably haven’t heard of it; it ran as scheduled for two nights at Swarthmore College, near Philadelphia. Ted Nelson wrote it and directed it in his Junior year, when he was twenty. There is even an LP.


Thin, lanky, with a sharp chin and always a smile, Ted Nelson came from Hollywood parents and was determined to be an outsider, because only the outsiders were “where it’s at.” He was always more writer than hacker, and didn’t always fit into the nerd milieu. Ted began his work years before actual networking existed, so he had to conceive of the whole damned digital world. He called it Xanadu.

I consider myself a filmmaker. That was that — that was what I was going to do. And in graduate school, I took a computer course. Screens! I’m a movie maker! I know what to do. So it’s my job to design the documents of the future, because if I don’t do it, the techies will screw it up, and that’s exactly what happened.

Ted coined the word “hypertext” to describe non-sequential writing — but clickable links were only the start of his expansive vision for digital technology.

So to me, hypertext — by which I mean not just the World Wide Web with one-way jump links, but with visible connections between pages — this was going to be the document system of the future for all mankind.


If you like painting with oil or playing instruments from around the world you already know how far computers have to go to be expressive human interfaces. You could see computer history so far as this gigantic fall from grace that’s the mirror image of Moore’s law.

There are some things we’ve figured out, but at the same time we have gradually dismantled, for no good reason, a lot of the much better early ideas that seem to me to have had a kind of holistic sensibility. Finding the best future for digital networking in an analog world may turn out to be finding our way back to approximately where Ted Nelson was at the start.


First Thought, Best Thought

The foundational idea of humanistic computing is that provenance is valuable. Information is people in disguise, and people ought to be paid for value they contribute that can be sent or stored on a digital network. Ted’s earliest idea was that instead of reading a text as given originally by the author, a more complex path might be created using portions of text to create a new sequence without expunging or losing the original.

But there’s another thing he also proposed right at the start: if you remembered the provenance of where bits on a network came from, you could use that knowledge as the basis for a universal micropayment system, and that, in turn, was the first new idea about economics in a long time.


Way back in the late 1950s, shortly after the term artificial intelligence was coined, Marvin Minsky assigned some students a summer project: use a dictionary and an algorithm to translate between languages. It seemed to make sense; Noam Chomsky had been promoting the idea that there is some compact, well-describable structure in a language, so why couldn’t you do it?

It wasn’t a terrible hypothesis; it just turned out not to work, and it didn’t work for years and years. What finally did work was discovered at IBM’s research labs much later, and it was big data and statistics. You take in a very large number of pre-existing translations made by real people, isolate the turns of phrase that exist in a new document you want to translate, and mash up the most similar occurrences of previous turns of phrase with the new one, and that mashup is legible.
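The statistical approach described above can be caricatured in a few lines: translation becomes a lookup over phrases that real people have already translated. This is only an illustrative toy, assuming a tiny hand-made phrase table standing in for statistics mined from a huge parallel corpus; real systems weigh many candidate phrases probabilistically.

```python
# Toy sketch of phrase-based statistical translation: reuse turns of
# phrase that humans have already translated. The phrase table below is
# an invented stand-in for statistics mined from real translations.

PHRASE_TABLE = {
    ("good", "morning"): "buenos días",
    ("thank", "you"): "gracias",
    ("my", "friend"): "mi amigo",
    ("good",): "bueno",
}

def translate(words):
    """Greedily match the longest known phrase, then move on."""
    out, i = [], 0
    while i < len(words):
        for size in range(len(words) - i, 0, -1):  # longest match first
            phrase = tuple(words[i:i + size])
            if phrase in PHRASE_TABLE:
                out.append(PHRASE_TABLE[phrase])
                i += size
                break
        else:
            out.append(words[i])  # no human translation found; pass through
            i += 1
    return " ".join(out)

print(translate(["good", "morning", "my", "friend"]))  # buenos días mi amigo
```

The point Lanier is making survives even in the toy: every output phrase is traceable to a human translator, which is exactly the provenance that production systems discard.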

Jaron Lanier


We currently lack a scientific understanding of how the brain represents information or meaning. I repeat: we don’t have a theory of semantics in the brain. Therefore every instance of operational AI is in fact a way of disempowering people in order to concentrate money.

We have a population whom we pretend don’t exist in order to create the illusion of a free-standing AI. We have to disenfranchise, on a regular basis, all the people who contribute the data that makes these systems actually function. This applies to a whole host of other professions and industries, and I think it will eventually apply to manufacturing as well, because the designs for manufactured goods have to come from somewhere. (3D printers have so far been distributed on the Linux model, where we forget where the design came from.)


Ted offered the first alternative way through this Manichaean left-versus-right dilemma: if you had a universal micropayment system that kept track of what people had contributed to the digital ecosystem, then even when the machines get really, really good, good enough that you can think of them as autonomous, they would still depend on information that comes from people.

Information systems need constant revamping and constant maintenance, so what you would get is a society where people are paid for actually doing what needs to be done. Such a feature might suddenly create a new kind of fair, emergent, distributed society; in a way it pushes capitalism into something much more interesting than what it started out as.


The right to copy files on the Internet is held up as a form of free speech in the digital rights community. The Internet has even been described as a giant copying machine. But copying on a network is actually rather odd, and at the very least an extraneous, retro idea, if you think about it from first principles. After all, in a network the original is still there; better than just being there, it is there with its context and its history. You preserve some of its provenance, and you get more meaning than you would have if you just had it in isolation.

Everything at PARC had to be framed in terms of documents and copying. So there was this bizarre way in which they started off the art of the usable user interface by copying the previous paper-based world of information processing, which itself was based on copying onto new copies of paper.

The early computers built at PARC looked remarkably like modern PCs and Macs, and the concept prototypes and sketches foresaw modern phones and tablets.

The Macintosh is how Steve interpreted what Xerox PARC had done. And once again, that’s not what I would have done. So what we have now are documents which imitate paper — which to me is like putting an imitation horse on a car.

They all have basically one column, because originally there was something called Bravo, which became Microsoft Word, and everybody was so hypnotized by fonts — which are fun — that they didn’t realize that connection is much more important.

I learned to write on a typewriter, okay? That’s how you should learn to write, so that you can’t put things into screwy italics and funny arrangements. Because then you learn to write as distinct from arrange visibly.

So how should writings be connected? My idea — my simplification of it — is parallel pages with visible connections.

The primary distinguishing feature is therefore two-way linking, just as networking and hypermedia might have possessed anyway, had the original ideas from Ted Nelson and other early pioneers prevailed. If two-way linking had been in place, a homeowner would have known who had leveraged the mortgage, and a musician would have known who had copied his music.

Two-Way Links

In a network with two-way links, each node knows what other nodes are linked to it. That would mean you’d know all the websites that point to yours. It would mean you’d know all the financiers who had leveraged your mortgage. It would mean you’d know all the videos that used your music.

Two-way linking would preserve context. It’s a small, simple change in how online information is stored, yet one with vast implications for culture and the economy. Two-way links are a bit of a technical hassle: you have to keep them up to date, which means there is some initial difficulty in getting a two-way system going as compared to a one-way system. This is part of why HTML spread so fast.

But it is one of those cases where getting something easy up front just makes the price worse later on. If everything on the Web were two-way linked, it would be an easy matter to sort out which nodes were the most important for a given topic: you’d just see where most of the links led. Since that information wasn’t present, Google was needed to scrape the entire Web all the time to recalculate all the links that should have existed anyway, keep them in a dungeon, and present the results to lure so-called advertisers.
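The mechanics of two-way linking are simple enough to sketch: every link is registered at both ends, so any node can enumerate its backlinks locally, without a crawler. This is a minimal illustration (all node names are invented), and the two dictionaries that must be kept in sync are precisely the maintenance hassle the text mentions.

```python
# Minimal sketch of a two-way link registry: each link is recorded on
# both the source and the destination, so backlinks need no crawling.

from collections import defaultdict

class TwoWayWeb:
    def __init__(self):
        self.outgoing = defaultdict(set)  # node -> nodes it links to
        self.incoming = defaultdict(set)  # node -> nodes linking to it

    def link(self, src, dst):
        # The "hassle": both records must be updated together and kept
        # consistent, unlike HTML's fire-and-forget one-way link.
        self.outgoing[src].add(dst)
        self.incoming[dst].add(src)

    def backlinks(self, node):
        return self.incoming[node]

web = TwoWayWeb()
web.link("blog.example", "mysite.example")
web.link("news.example", "mysite.example")
print(web.backlinks("mysite.example"))  # every linker is known locally
```

With one-way links, computing `incoming` requires visiting every node on the network, which is essentially what a search engine's crawl reconstructs after the fact.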


Best Thought

The first principle is that each file, or whatever unit of information the thing is built of, exists only once. Nothing is ever copied.

“We’ll have interactive screens, and we’ll be able to have documents on those screens, and they’ll be able to jump from one to the other. And we’ll show visible bands of connection between them, and you’ll have automatic royalty and automatic backup, and it’ll all be available online.”

A panoramic view of postulated hypertext from Ted’s 1965 paper on the subject.

It’s a combination of a document management system and an operating system, meaning that it manages a federation of similar discs containing source content and links. There is no embedded markup; all markup is done by what we call links. So a document is delivered as a list of pointers to the content and pointers to the links, all of which are assembled on the user’s machine.


“Transclusion means that part of one thing is included in another and brought from the original.” An iframe, for example, is transclusion: video from YouTube embedded in another document is transclusion. But in the Xanadu method, the transcluded portion has a path back to the original, which you can follow to see its original context.

There are links and there are transclusions — it’s a different method. A link is a specific object separate from the content. A transclusion is simply following the path from a part to its original context. So it doesn’t require a link. It’s just: “Here’s the transcluded part, and the edit decision list says where it comes from.” The edit decision list can show you where it is on the original.
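The edit-decision-list idea can be made concrete in a few lines. In this sketch, a document is not a lump of text but a list of spans pointing into original sources, so rendering assembles the text and every transcluded part keeps its path back to the original context. The source IDs, texts, and function names here are invented for illustration; they are not Xanadu's actual data model.

```python
# Sketch of Xanadu-style transclusion: a document is an edit decision
# list (EDL) of spans into original sources, never a copied lump.

SOURCES = {
    "nelson1965": "Hypertext means non-sequential writing with free user movement.",
    "mydraft":    "Links alone were only the start of the vision.",
}

# Each entry: (source id, start offset, end offset) -- a pointer, not a copy.
EDL = [
    ("nelson1965", 0, 38),
    ("mydraft", 0, 46),
]

def render(edl):
    """Assemble the document on the user's machine from its spans."""
    return " ".join(SOURCES[src][a:b] for src, a, b in edl)

def provenance(edl, span_index):
    """Follow a transcluded part back to its full original context."""
    src, a, b = edl[span_index]
    return src, SOURCES[src]

print(render(EDL))
print(provenance(EDL, 0))  # the span's source id and its original context
```

Because the EDL stores pointers rather than copies, `provenance` is a trivial lookup; in a copy-based system the same question requires a search over everything ever published.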

If we had had these bilateral transclusions and links, people could go back and forth, see the common source, and see where something is pointing.


HTML appeared at a tired moment for Silicon Valley. There was a trace of panic in the early 1990s about whether anyone would come up with new “killer apps” for personal computers. Would there ever be another idea like the spreadsheet? HTML was so easy to spread. Each node had no accountability (without backlinks, linking is one-directional: you can link to something, but it doesn’t know it has been linked to), so nodes could accumulate in a “friction-free” way, even though there is no such thing as a free lunch, and the friction would surely appear later on in some fashion. Everyone was impatient and bored and leapt at the thrill of quick adoption.

HIPPIES AND REDNECKS: how we lost provenance


To be clear, the key technical insight that allowed networking to become decentralized and scale was packet switching, and it arose from the very different world of elite universities, government labs, and military research funding. However, at least the functionality of something like packet switching was foreseen in Ted’s early thinking.

The early culture of networking was created by young men of all stripes who wanted to hide, and both camps had a very, very strong bias toward destroying provenance, even though they knew it was technically the wrong decision.

  1. Hippies: For the hippies it was all about Vietnam and the draft. Putting yourself on a list so the government knows where to find you is just dangerous.
  2. Rednecks/libertarians: The military guys who were also involved in the early ARPANET. Remember, CB radio was just little radios for talking to people, and its main purpose was to let people share where the police were so you could get away with speeding.


The system was not perfect, but of course it spread fast and took over, and this led to the “don’t worry, be crappy” phase of Silicon Valley. The path to riches was to go for people before you go for quality: just get a bunch of people locked in to whatever the hell it is, then worry about whether it’s any good later. That has been the formula ever since: “just get people roped in, get them locked into your thing.” The first, second, third, and fourth priority is something that will spread fast, that will spread easily. A lot of those things have good qualities, but they’re only half good; there is a price you pay. It’s a short-term way of thinking: if you prioritize rapid spread first, then later on you have to pay for it.


If you ship designs that are only halfway done in order to get very rapid diffusion, the effect on society is ultimately extreme privatization, because somebody has to come along to fix them later, and that somebody is no longer society but just some company. So it becomes an engine of income inequality: incomplete digital designs create income inequality.


A lot of the Silicon Valley world that has arisen since the turn of the century has essentially created gigantic fortunes by trying to reconstruct the backlinks that were lost at the advent of the Web. If you look at the big Silicon Valley success stories, they are essentially privatizing the gap between what Tim Berners-Lee did and what Ted Nelson knew, decades earlier, needed to be done. That particular gap-filling has created the fastest, biggest fortunes in history. Isn’t that astonishing?

But because all that was thrown away, Google had to scrape the whole Web every night, calculate the backlinks that might have been there, and count up where most of them pointed, in order to sell the result as a service to advertisers.

Similarly, if two-way links had existed, you’d immediately be able to see who was linking to your website or online creations. You’d meet people who shared your interests as a matter of course. A business would naturally become acquainted with potential customers. “Social networks” like Facebook were brought into existence in part to recapture those kinds of connections that were jettisoned when they need not have been, when the Web was born.

Everything is anonymized, everything is half-assed, because it takes the huge structures of a company, inside a proprietary shell, to calculate the backlinks; that calculation is essentially their service.

The real sophistication of Ted’s idea is how it would bring about a balance of rights and responsibility while at the same time reducing friction. Traces of Ted’s idea for balance are reflected in some of today’s designs. For instance, each Wikipedia page has a history. But the economic angle is what concerns us the most here. If the system remembers where information originally came from, then the people who are the sources of information can be paid for it.


In all these cases, in a humanistic information economy, when new data is uploaded from a local device into a server or cloud computer, its provenance is remembered. That means a record of origin is connected to the data. This record is protected from error and fraud by redundancy between local devices and servers in the cloud, so faking or erasing provenance would at the very least require taking on nontrivial effort and risk.

In humanistic information economics, provenance is treated as a basic right, similar to the way civil rights and property rights were given a universal stature in order to make democracy and market capitalism viable. It can now lead to a balanced future, where a middle class can thrive with proportional political clout, and where individuals can invent their own lives without being unduly manipulated by unseen operators. He saw the issues more clearly than we do today.


When too many layers of access to culture are privatized, as has happened online, you eventually end up with a few giant players. What is more, universal retention of provenance without commensurate universal commercial rights leads to a police/surveillance state.

In Ted’s conception, each person would be a free agent in a universal online store, and everyone would be a first-class citizen, both buyer and seller. You wouldn’t have to keep separate passwords and accounts for different online stores. The way we’re doing things now re-creates unneeded limitations that shouldn’t be inherited from brick-and-mortar commerce.


For instance, Netflix does not allow its customers to download a video file that is identical to the master file on its servers. Instead, it provides software that delivers a video experience by accessing that master file in real time over a network, and displaying it to the customer. While Netflix might employ cached data mirrors to back up their data, or to speed up transmittal, that is not the same as creating multiple logical copies — as users on a BitTorrent sharing site do.


There’s also only one “logical copy” of each app on the Apple store. You can buy a local cache of it for your phone, and Apple undoubtedly keeps a backup, but there’s just one master instance that drives all the others. When the master version of an app is updated in the store, it’s eventually updated on all the phones as a matter of course. The existence of the app in your phone is more a mirror of the original than a copy.

INTERMEDIATORS AND FILTERS: GOOGLE: What’s wrong with making copies?

If you copy a file, you don’t know where it came from or whether it has been altered. The context is lost, and meaning depends on context. Bloggers (partisan or otherwise) notice when a candidate is quoted out of context in a campaign commercial.

Because context is lost, corrections and context stay trapped within online “filter bubbles.” If you doubt the importance of that small change, just look at Google’s revenues, which are almost entirely based on putting links immediately in front of people.

In a Xanadu-like system, the link back to the original would always be right there. It would become much harder to make the illusions of misleading mash-ups stick.


My version had a complete copyright system, which is being universally ignored. Creative Commons is just a way that people can give up the hope of being paid in a polite and elegant form. “I wanted to create a system of commerce that would create viable ways for people to be paid with a sensible and completely fair method.”

This is the usual dilemma that divides people. On one side are intellectual property advocates who struggle to shut down share sites. On the other are the Pirate Parties, wiki enthusiasts, Linux types, and so on. The contest between the two sides sparks endless debates, but both are inadequate and inferior to the original idea for digital media.

Anyone in a Nelsonian system can reuse material to make playlists, mash-ups, or other new structures, with even more fluidity than in today’s “open” system, where the all-or-nothing, ad hoc system of intellectual property intervenes unpredictably. At the same time, people are paid, and information isn’t made free, but is affordable. A Nelsonian solution provides a simple, predictable way to share without limit and yet doesn’t destroy middle classes in the long term.

So in a Xanadu document, you don’t deliver a packaged lump — you deliver a list of portions to bring in and how to put them together. We’re talking about someone creating a document using a piece from here, using a piece from there, using a piece from there — and that piece can have a pay wall.

Anyone is free to include anything from anywhere without negotiation, and anyone is free to buy it or not buy it. The permission doctrine for this is called transcopyright, meaning: “I hereby give permission for this content to be used in any new context, provided it’s bought from my server,” which could imply the micro-paywall.
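Because a document in this scheme is a list of transcluded portions, settling royalties is just bookkeeping over that list. Here is a hedged sketch of the idea: each portion carries an originator and a price, and assembling the document totals the micropayment owed to each contributor. The authors, sizes, and rates are invented for illustration; Xanadu never specified pricing this way.

```python
# Sketch of transcopyright settlement: a document's portion list doubles
# as a royalty ledger, since every portion names its originator.

from collections import defaultdict

# Each portion: (author, number of characters used, price per character).
# All values are illustrative assumptions, not a real pricing scheme.
PORTIONS = [
    ("alice", 120, 0.0001),
    ("bob",    40, 0.0005),
    ("alice",  10, 0.0001),
]

def settle(portions):
    """Total the micropayment owed to each contributor."""
    owed = defaultdict(float)
    for author, chars, rate in portions:
        owed[author] += chars * rate
    return dict(owed)

print(settle(PORTIONS))
```

Note that the scheme scales to remashes of remashes automatically: a portion reused from a reuse still names its original author, so the chain of payees never has to be renegotiated.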


That means if a snippet of your video were reused in someone else’s video, you would automatically get a micropayment. Furthermore, a Nelsonian system “scales”: a remash of a remash of a remash is facilitated just as easily as the first remash, preserving a balance of commercial and expressive rights for everyone in the chain, no matter how long the chain becomes.


No one wants to pay for healthcare if they can possibly avoid it. In an immediate sense, the ability to not pay seems to increase wealth and freedom. But then later, when health problems come up, illusion turns out to cost more both in money and in lost freedom than up-front realism. When a system is in place for everyone to share risk in advance, life doesn’t become perfect, but dealing with hard times at least becomes cheaper and more flexible.

To put that in a macroeconomic perspective you can’t have a democracy without a middle class and this is something that I think Ted understood from the very beginning. Hypothetical access to information is not the same thing as having empowered access to information, so as an example right now any of us can have access to most of the information that a Google can have access to but we don’t have city-sized computer facilities cooled by rivers to process statistics so we’re not equally empowered to make use of the information. Mere openness isn’t enough. There has to be a distribution of the ability to use the information or you end up with a hyper extreme amount of income inequality which then will undermine democracy.


A bell curve is the most common type of distribution for a variable, which is why it is known as a normal distribution; the name comes from the bell-shaped line used to depict it. If you look at the history of the middle class, you see it rising in fits and starts, never more so than in the post-war period in the developed countries, where you see an incredible, unprecedented rise; then, starting around the ’80s, it began to backslide.

Now, no innovation is required to have an extreme concentration of power in a society; that has been the historical norm through the ages. Apparently, innovation is required to sustain a strong middle class. What bothers me is the rhetoric of online activists: all this open-source stuff, all this open information based on lost provenance, has been promoted as something that will make people rise up and create all this wealth.

So as an empiricist you have to say: either what we’re doing is irrelevant to preserving the middle class, or it might be useful but we’re not doing enough, or we’re part of the problem. Those are the only options I can see, and it’s crucial to figure out which is more true. My feeling is that in area after area, by decontextualizing network information, we’re concentrating the wealth on whoever can process it best instead of on the people who originated it.


A social contract must take hold for any orderly economy to be possible. Any functioning, authentic economy has to by definition be sustained more by voluntary participation than by enforcement. In the physical world it’s not all that hard to break into someone’s house or car, or to shoplift, and there aren’t all that many police. The police have a crucial role, but the main reason people don’t go around stealing in the physical world is that they want to live in a world where stealing isn’t commonplace.

A Linux always makes a Google.

The notion that bottom-up change is the only kind of change is dishonest. It is never true that there is no top-down component to power and influence. Every attempt to create a pure bottom-up, emergent network to coordinate human affairs also facilitates some new hub that inevitably becomes a center of power, even if that was not the intent; the Communist Party, for instance. If everything is open, anonymous, and copyable, then a search/analysis company with a bigger computer than normal people have access to will come along to measure and model everything, and then sell the resulting ability to influence events to third parties.

The whole supposedly open system will contort itself around that game, creating a new form of centralized power. Putting oneself into a childlike position is only an invitation for someone else to play the parent. The only way to create a distribution of clout on a digital network that isn’t overly centralized, so that middle classes and a maximally competitive marketplace can exist, is to be honest about the existence of top-down dynamics from the start. While accounting can happen locally between individuals, finance relies on some rather boring agreements about conventions on a global, top-down basis, and we should be making sure that the information that should be monetized is monetized. I defy you to show another way to get there other than Ted’s ideas.


As of today there’s a romance in our current path, especially for hackers, and it seems to be the future most envisioned in techie culture. But what a crummy world that would be, where screwing up something online is the last chance at being human and free.