Web 2.0 and user filtered content

By Ross Dawson on September 9, 2006 | Permalink

Tomorrow I’m heading off to the Influence conference run by Phil Sim’s Mediaconnect. The event brings together media and other influencers (I believe I’m labelled a “new media influencer” there) and corporates to discuss current trends in key technology sectors. I’m on the Web 2.0 panel tomorrow, so I thought I’d briefly capture my introductory comments here, on my chosen topic of User Filtered Content.
The user filtering landscape
+ The primary focus recently has been on the explosion of user generated content, with Wikipedia, MySpace, YouTube and many others just the vanguard of an immense wave of content creation, unleashed by accessible tools of production and sharing. We are moving towards a world of infinite content, further unleashed by the vast scope of content remixing and mashups.
+ With massively more content available, we need the means to filter it, to make the gems visible in the vastness of the long tail. Fortunately, Web 2.0 is in fact just as much about user filtered content as about user generated content.
+ As far more people participate in the web, as technologies such as blogging, social networking, photo sharing and more become easier to use, the collective ability of the web to filter content is swiftly growing, and will more than keep pace with the growth in content.
User filtering mechanisms
Clicks indicate popularity of specific content within a site (with many caveats).
Links are stronger, more deliberate votes on the value of content.
Ratings provide explicit opinions on quality.
Tags describe content with words, locations etc.
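These four mechanisms can be combined into a single relevance score. As a minimal sketch (the weights, field names, and log-dampening here are purely illustrative, not any particular site's formula):

```python
import math

# Hypothetical sketch: combine click, link, rating, and tag signals
# into one relevance score. Weights are illustrative assumptions.
def relevance_score(item, query_tags, w_clicks=0.2, w_links=0.4,
                    w_rating=0.3, w_tags=0.1):
    """Score a content item from four user-filtering signals."""
    click_signal = math.log1p(item["clicks"])         # dampen raw popularity
    link_signal = math.log1p(item["inbound_links"])   # links as stronger votes
    rating_signal = item["avg_rating"] / 5.0          # explicit quality, 0..1
    tag_overlap = (len(set(item["tags"]) & set(query_tags))
                   / max(len(query_tags), 1))         # tag match, 0..1
    return (w_clicks * click_signal + w_links * link_signal
            + w_rating * rating_signal + w_tags * tag_overlap)

item = {"clicks": 1200, "inbound_links": 45, "avg_rating": 4.2,
        "tags": ["web2.0", "filtering"]}
score = relevance_score(item, query_tags=["filtering", "search"])
```

Note that links carry the largest weight in this sketch, reflecting the point above that a link is a stronger vote than a click.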
Web-wide and site-specific filtering
There are two primary ways of implementing user filtering: taking data from across the web, and from within one site.
+ Google’s PageRank is a seminal example of web-wide user filtering, where people’s aggregated linking behavior helps users find relevant content. Technorati more explicitly shows how many blogs link to other blogs or blog posts, to indicate their authority. Techmeme draws on the timing and relationship of new links to uncover current conversations.
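The core idea behind PageRank, that a page is important if important pages link to it, can be sketched with a few lines of power iteration. The tiny link graph below is made up for illustration, and this omits the refinements of any production search engine:

```python
# Illustrative PageRank sketch: iterate until each page's rank
# reflects the rank of the pages linking to it.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:  # a page shares its rank among the pages it links to
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:         # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank(graph)
# "c" receives the most inbound links, so it earns the highest rank
```

Each link acts as a weighted vote: a link from a highly ranked page counts for more than one from an obscure page, which is exactly the aggregated filtering behavior described above.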
+ Amazon.com’s book recommendations kicked off site-specific user filtering, notably by identifying related titles. Slashdot was for several years the primary site that enabled communities to select stories and rate each other’s commentary.
+ In two years Digg.com has reached over 1 million daily visitors with its core model of user filtering of content. Copycat and similar sites such as Reddit, Meneame, and Shoutwire have abounded. Finally, AOL-owned Netscape launched a Digg copy, lending mainstream media endorsement to the model.
+ Content sites such as YouTube, Flickr, MySpace, and Odeo all embed user filtering as core features of their services.
What’s next for user filtering
+ Effective user filtering will have increasing value, and there will be more plays in this space. Network effects will apply strongly to site-specific filtering; however, this will not preclude new players with better models from gaining traction quickly. Netscape’s move to hire active raters away from Digg is an attempt to accelerate such shifts.
+ Social search engines such as Eurekster and Yahoo!’s Search Builder indicate the next level of sophistication in search, aggregating the filtering of specific communities rather than the web at large.
+ Tools such as Last.FM and Yahoo!’s Launchcast will, with permission, use extremely detailed personal taste profiles to provide content filtering for individuals.
+ New mechanisms will emerge that draw on people’s web activities, tagging, and specific communities, and combine these perspectives in various ways to create more refined user filtering. This filtering will increasingly be designed to be relevant both to groups with particular interest profiles and to individuals.
