Perhaps 14 years ago, late-to-the-game technology pundits became excited about a particular model of growth: user-created content. With today's vast content farms, such as Facebook or Twitter, everyone is intimately familiar with, and contributes freely to, businesses using this model. It was presented as a counterpoint to the production and consolidation of professional content. Both models are intended to fulfill monopolistic dreams. But earlier-to-the-game technologists pointed out that even the earliest web successes depended upon user-generated content.
All this misses something much more fundamental: the computer industry has always made money from user-generated content. The reality couldn't possibly be different. Before I explain why, I need to provide an example.
You could start anywhere. In the late 1970s, normal people happily produced handwritten and typewritten letters and memos. Suddenly, five years later, they were all using their personal computers to send blurry-inked dot-matrix printouts to each other.
All of this required user-generated content. The benefits of the technology were massively and illogically overstated, with seductive advertising evoking riches and robots.
In a sense, people were being sold to themselves. "Imagine what you could do with this," says the suggestive advertiser. "Yes, I can imagine that!" says the imaginative consumer, who then buys the expensive computer so they can 'make things'. They also bought expensive professional entertainment products, so they could 'relax'. This is exactly the same situation consumers find themselves in today: applications for generating your content, and others for kicking back and letting the professionals distract you.
These are just innate qualities of people. We want to watch, and we want to build. People who want power and influence will always take advantage of innate qualities ... in a sense, this actually defines their profession.
Consumer-producers aren't totally unaware of this manipulation. But they are natural optimists. They assume that some potential utility underpins these techno-cultural shifts 'towards a better future'. They assume that 'early adopters' need to be watched carefully, in the hope of catching a ride to heights of leisure and leverage.
People are also natural builders, which leads us to a deeper reason behind the pervasiveness of user-generated content in computing:
Every tool is intended to aid the generation of user content.
But there is a deeper reason still:
Tools are within us.
I mean that quite literally.
Without a human brain to notice that another human brain has produced a tool, and without more human brains writing and speaking about the ideas behind the tool, the chunk of stuff that 'is' the tool is just a human by-product.
It is a chunk of stuff that human brains might find exciting, stimulating, beautiful, useful, useless, ugly, complex, simple ... but like the chunk of stuff, all those words have meaning only to the human brain (with the understanding that we evolved from animals that may have related ideas, also within their brains; that is, ideas that make sense in their umwelt, or worldview).
So, if a tool is something that only our brain understands, then the very idea of using a tool is 'user-generated' content. Some other human may have said something to you to spark the idea, to help you to construct the idea, but it's your idea, or you wouldn't have it. It is, of course, essentially the same idea, now in more than one brain. The word 'screwdriver' produces the same recognizable structure in the brain no matter which language is spoken, a recent result from studies at Carnegie Mellon, long anticipated by some: Aristotle pointed out, in the opening paragraphs of On Interpretation, that this must be true, or we wouldn't be able to translate anything.
So, novel or not, any tool is inside the brain. We reconstructed the idea of the tool. What we do with it next is totally the product of our brains.
The same is true with the idea of computation itself. The computer is a tool that is in our brain. There is no 'physics of computation' ... computation is not some basic 'natural force' ... computation is a human mental construct. And every piece of a computer, every line of code, only has meaning to the human brain. Of course it's doing something in the outside world, and the consequences can be beyond the scope of human understanding. But we cannot begin to understand a computer program, or a computer system, without first taking this stuff we've created and giving it our intellectual effort, an intellectual effort we do not understand, but which we make use of. Our brains are tools, which we use sometimes consciously, but mostly unconsciously, in a foggy groping towards an internally-defined 'understanding' or 'awareness' of what we do, and an even foggier understanding of what is going on outside of our brains.
Our world is user-generated. Not the world, of course. Just our world. The intrinsic one. The one in our heads. The one we experience. Mostly it is the same as other people's, because we are the same species. That's why we can all construct the idea of 'screwdriver' or 'computer' in our brains. We are not conscious of how we do it, in the same way that we are not conscious of our digestive system, or our ability to walk. But the world is still constructed by us, or else how could we experience it the way we do? We know that flies have difficulty seeing some things that we can see, and that our pets can hear things that we cannot. They are constructing different worlds because they have a different biological endowment.
Much of what the human brain generates, whether 'ideas', which we might 'communicate', or which may 'produce something', is new in some aspects ... but much is the same, in other aspects. The stuff that's the same is similar because, like the fly, we Homo sapiens have our limits, our habits, and our strengths. 'History repeats' because we're all human. Bonobos also repeat themselves, with some differences, including some repetitions and differences that we can never know, because we cannot become Bonobos.
That's why so many 'revolutions' in human culture seem similar. The human brain is still the place where culture itself resides, so the differences between the old and the new are going to be, well, less than 'revolutionary'. We can easily exaggerate or cartoonify anything and call it a 'revolution', or 'progress', or a 'paradigm shift', et cetera. That's what people do naturally, and it's a hard habit to break, because we're all chimps and we get very excited when we suddenly see something in our minds that seems new or helpful.
But those things are in our brains. When we generate content inspired by those things in our brains, it's genuinely gratuitous to tell someone "well, we've seen that before". Who cares? Everything is somewhat new and somewhat something that anyone can do.
Time for the best example.
Think of language: very few of my sentences, above, were ever spoken or written before. That's something that all of us do, every day. Complete novelty. The smaller ideas expressed are a little less new, but still pretty new because they were combined in new ways, using new examples, et cetera. The larger ideas and themes are frankly far less new ... in fact even I have been harping on about them for decades ... but maybe you've never heard them before!
But whether you have heard them or not -- if we agree on some of them, and remember them, we can begin to form a culture, a better future, around these agreements. If these become working assumptions for, say, a new approach to computing, based on natural science and an awareness that 'this is all in our heads', we might actually make some 'progress' towards extracting computing from the pervasive misunderstandings and exaggerations which distract us from doing good with these tools. Our work will still be user-generated, but we'll be mindful of the brain's place in what we produce, and that should help us to grow a more thoughtful computing community.
Maybe I'm just looking for more quiet, rationality, and feeling, in the very noisy world of computing.
Monday, February 29, 2016