Friday, November 24, 2006

Whuffie, Social Networks and Metadata Semantics

I was reading Ambient Findability when I saw a reference to a concept called "whuffie". After checking out the Wikipedia article on whuffie, I discovered that whuffie is the reputation-based currency used in Cory Doctorow's sci-fi novel, Down and Out in the Magic Kingdom. This future-history book describes a post-scarcity economy where people purchase goods and services based on their reputation, not just cash.

I was interested in this topic because I think that in the future, new keywords with precise semantic mappings will be introduced into declarative systems through social networks. Right now it simply takes too long for new tags to appear in an XML standard. Just think of how long it takes the W3C to create a new XML tag for something like a menu tab: up to ten years, judging by the rate at which the XForms standard is being adopted.

Anyway, the book was a great mind-opener: one of the first science fiction books that takes nanotechnology and social networking into account. I have always loved cyberpunk, Neal Stephenson's Diamond Age being one of my favorites. No time travel or spaceships: just highly imaginative extrapolations of the net and Moore's Law.

I like the concept of the post-economic society. When all of our economic needs are met, what do we strive for? Why don't they ever complain about their raises in Star Trek? The book also discusses the concept of an adhocracy: groups that quickly band together to solve problems. Doctorow also looks at the concepts of death, backing up your brain to a persistent store (the immortality SAN), and filtering restores to change your values.

What does this matter to the metadata architect? It matters because our march away from procedural programming toward declarative systems is totally dependent on the creation of shared semantics. When you need to pick an XML tag, how do you choose the one most likely to hold its semantic precision in the future? Social networks will help us here, and when we build systems to vote on which tags get approved, not all votes should count the same: votes by people with high whuffie should carry much more weight. The concept of one person, one vote will be modified in the post-economic society.
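Just to make that concrete, here is a purely hypothetical sketch of what a reputation-weighted vote on a proposed tag might look like; the element names, voters, and numbers are all made up:

  <tag-vote proposed-tag="menu-tab" namespace="http://example.org/ui">
    <ballot voter="alice" whuffie="8200" choice="approve"/>
    <ballot voter="bob"   whuffie="150"  choice="reject"/>
    <ballot voter="carol" whuffie="4100" choice="approve"/>
    <!-- weighted score = approve whuffie minus reject whuffie
         = (8200 + 4100) - 150 = 12150, so the tag is accepted,
         and the high-whuffie voters count for far more than a
         simple head count of 2 to 1 would suggest -->
  </tag-vote>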

2 comments:

Unknown said...

Dear Doctor....

Interesting article, but the work that we're currently doing on the HTML Working Group is very different to what you are suggesting; instead of focusing on adding lots of new tags to HTML and XHTML that meet the needs of different domains (a process that is, as you rightly say, painfully slow), we are instead devoting our energies to improving the mechanisms by which authors can add metadata to their documents. This puts control of the vocabularies back into the hands of the experts, rather than us having to either think up terms that might be useful or decide which of multiple requests to favour.

The main extension points are RDFa and the new role attribute--both features that we devised for XHTML 2, but both of which are being made available as modules for XHTML 1.
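To give a flavour of what I mean, here is a rough fragment I have sketched for illustration only; the URIs and content are made up, and it is not an excerpt from any specification:

  <body xmlns:dc="http://purl.org/dc/elements/1.1/">
    <!-- @role lets an author say what a section of the page is for -->
    <div role="navigation">
      <ul>
        <li><a href="/products">Products</a></li>
      </ul>
    </div>
    <!-- RDFa lets an author attach statements from any vocabulary,
         here a Dublin Core title for a resource -->
    <p about="http://www.example.org/books/magic-kingdom"
       property="dc:title">Down and Out in the Magic Kingdom</p>
  </body>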

Some of the ideas behind @role are explained in a blog post of mine, "Using the role attribute to extend XHTML". A couple of other related posts are "The XHTML role attribute: small and perfectly formed" and "XHTML 2 and the Semantic Web: Are we nearly there yet?"

Best regards,

Mark

Mark Birbeck
CEO, x-port.net

Dan McCreary said...

Hi Mark,

Thank you for commenting on my blog posting! I hope to continue to write on topics that interest you. I am a BIG fan of XForms; see:

http://en.wikibooks.org/wiki/XForms

I am very familiar with RDF but had not yet heard of the role attribute. I will look into that.

I am glad to hear that people at the W3C are working on speeding up the pace of adding new, semantically precise tags to XHTML. I am very impressed by how sites like Wikipedia have used the power of social networking to speed the development of authoritative works. They seem to have built a solid "Architecture of Participation" and they continue to improve it.

My fear is that the W3C still does not have a state-of-the-art Architecture of Participation and is not taking advantage of modern social networking technology. Even companies like Google are ignoring good standards like XForms; see:

http://groups.google.com/group/Google-Web-Toolkit/browse_frm/thread/e22790257f8adc17/1c6df50c213fbcf9?lnk=gst&q=XForms&rnum=1#1c6df50c213fbcf9

The author (from Google) predicts that XForms will not take off for another 10 years, so instead they are re-inventing their own UI abstractions. :-( The good news is that GWT could work side-by-side with XForms in the future.

Here are some questions that I would like to consider:

If we want to extend XForms, where could we go to check whether someone has already created a similar extension? For example, binding images in query results to an output element seems to be something that many people need to do (I sketch what I mean at the end of this comment).

How do new extensions that are used by many organizations become candidates for the next specification?

How can we statistically analyze how web sites are using JavaScript, and how could we use those statistics to propose new XForms elements that minimize the amount of JavaScript we have to write and debug?
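And here is the image-binding sketch I mentioned above. It is only a rough illustration: the instance structure is made up, and I believe the mediatype attribute on output comes from the XForms 1.1 drafts, with implementations differing on whether the bound node should hold a URL or base64 data.

  <!-- made-up instance data returned from a query -->
  <results xmlns="">
    <item>
      <title>Down and Out in the Magic Kingdom</title>
      <cover>http://www.example.org/covers/magic-kingdom.jpg</cover>
    </item>
  </results>

  <!-- roughly the binding I have in mind: render the bound node
       as an image instead of as text -->
  <xf:output ref="item/cover" mediatype="image/*"
             xmlns:xf="http://www.w3.org/2002/xforms"/>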