A Crisis of Imagination

January 14th, 2003  |  Published in Uncategorized

This might be the most hostile interpretation of the “copyleft world view” I’ve ever read:

“The Commons enthusiasts believe that content publishers earn their profits by using copyright law to steal content from its creators and charge extortionary prices to consumers.”
(Arnold Kling, Content Is Crap, Tech Central Station)

This Commons enthusiast thinks that content publishers can do whatever they want with content they buy from musicians, artists, and writers (me included), but when it comes to stuff I produce for myself, I want to make it unequivocally clear what the conditions on its use are, and I want a way to convey that those conditions include some conditional sharing. Fuzzy-headed Gnutella enthusiasts who confuse theft with contractual agreements are something Mr. Kling can wrestle with, but in the end they're just a straw man for what are probably deeper cultural issues he has with why people share and whether it's good to do so in the first place.

I’m not worried about the big media companies coming along and stealing my Charles Mingus outline, but I’m a little irked at ‘net parasites who cuddle right up to the line of claiming something I spent time on is really their work, their idea, and their inspiration. To the extent any of them have a conscience and will observe things like a random license they encounter on a Web page with stuff they want to take, the CC license I chose makes it clear what their rights are, and encourages them to think about admitting that they borrowed.

Mr. Kling’s essay concludes with this:

“Creative Commons is based on a naive ideology that believes that raw content is gold, which then gets stolen by the evil media companies. In reality, the economics of content are that most of the value-added comes from the filtering process, not the creation process. If you want to overthrow incumbent publishers with Internet-based alternatives, you are better off starting from the assumption that Content is Crap.”

Spoken like a true working columnist.

His ideas about applying Bayesian filters to content aren't so terrible: I'd love to see the sort of deep, data-gobbling mechanisms that could plow through the 70+ Web sites I grant at least a cursory headline skim daily and get me to the ones with something I can use on a particular day. But once I'm there, I'd like to know what I can do with the content I find, not because I hope to produce a paying newspaper or Web site stitched together from other people's work, but because I'd like to know what I can use and what I can't, what I can pass along to friends and what work the person running the site would prefer to keep for herself. The sort of metadata a site fully decked out in Creative Commons licensing would provide would give me a way to learn that easily, and the simple act of including a license badge would tell me plenty on its own.
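The sort of Bayesian filtering Mr. Kling has in mind is simple enough to sketch. Here's a toy naive Bayes scorer that ranks headlines by how closely their vocabulary matches headlines a reader previously found useful; the training headlines and labels are invented for illustration, not drawn from any real feed:

```python
# Toy naive Bayes headline filter: scores unseen headlines by log-odds
# of "useful" vs. "skip", based on word counts from labeled examples.
import math
import re
from collections import Counter

def tokenize(text):
    return re.findall(r"[a-z']+", text.lower())

class HeadlineFilter:
    def __init__(self):
        self.counts = {"useful": Counter(), "skip": Counter()}
        self.totals = {"useful": 0, "skip": 0}

    def train(self, headline, label):
        words = tokenize(headline)
        self.counts[label].update(words)
        self.totals[label] += len(words)

    def score(self, headline):
        # Sum per-word log-odds with add-one smoothing; higher means
        # the headline looks more like the "useful" training examples.
        vocab = len(set(self.counts["useful"]) | set(self.counts["skip"]))
        log_odds = 0.0
        for w in tokenize(headline):
            p_u = (self.counts["useful"][w] + 1) / (self.totals["useful"] + vocab)
            p_s = (self.counts["skip"][w] + 1) / (self.totals["skip"] + vocab)
            log_odds += math.log(p_u / p_s)
        return log_odds

f = HeadlineFilter()
f.train("new creative commons license for weblogs", "useful")
f.train("union organizers share strike reports online", "useful")
f.train("celebrity gossip roundup of the week", "skip")
print(f.score("creative commons license news") > f.score("weekly celebrity gossip"))
```

A real version would need far more training data and smarter tokenization, but the mechanism — learn word statistics from what the reader keeps, then rank the morning's pile by those statistics — is the whole idea.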

If anything, advocating the use of technology to filter content is even more of an argument against big aggregators. If we can accept the idea that machines should be allowed a greater hand in helping us get to the bottom of the information pile that is the Web (I’m not convinced, and have said so a few times), we can certainly accept the idea that human editors and aggregators are in serious trouble when confronted with the twin threat of adequate search engines and competent, predictive filtering. And at that point, license issues become critical: freed of the intermediary editorial aggregator, we have to know what to make of the content we find. We have to know if the report from a strike or a demonstration is ours to use for our own collection of union news, if the really good mp3 is suitable for adding to our streaming radio station, or whether something as simple as a poem is ok to add to our online literary journal.
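That license discovery can itself be mechanical. As one sketch, assuming a page marks its badge with the rel="license" convention Creative Commons badges typically carry, a filter could pull the license URL straight out of the markup (the HTML below is a made-up example, not any real site's page):

```python
# Sketch: extract rel="license" URLs from a page, assuming the page
# uses the Creative Commons badge convention of rel="license" links.
from html.parser import HTMLParser

class LicenseFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.licenses = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        rels = (attrs.get("rel") or "").split()
        if tag in ("a", "link") and "license" in rels and "href" in attrs:
            self.licenses.append(attrs["href"])

page = """
<html><body>
<p>Some weblog entry about Charles Mingus.</p>
<a rel="license" href="http://creativecommons.org/licenses/by-sa/1.0/">
  Some Rights Reserved</a>
</body></html>
"""

finder = LicenseFinder()
finder.feed(page)
print(finder.licenses)
```

With that one hook, the same machinery that filters the content could also tell you whether the strike report, the mp3, or the poem is yours to reuse.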

The more I think about it, the more I'm convinced Mr. Kling doesn't suffer so much from hostility to sharing as from a poverty of imagination… he's able to get to the point where filtering technology makes it easier to drill down to the good stuff, but he runs out of creative energy before taking the next step: using that technology as a lever that extends the capacity each of us has to do without the people who act as critical filters on our behalf.


© Michael Hall, licensed under a Creative Commons Attribution-ShareAlike 3.0 United States license.