Web 2.0 was first a noun, then a conference. Next it was an adjective. Perhaps soon it will be a verb, adverb, or expletive. Regardless of its grammatical status, it already has an entry in the Devil's Dictionary:
Web 2.0 Proper noun. The name given to the social and technical sophistication and maturity that mark the— Oh, screw it. Money! Money money money! Money! The money’s back! Ha ha! Money!
But its time is already past: Web 3.0 was being discussed as early as 2005, and a recent New York Times front-page article (by the sometimes overly breathless but always interesting John Markoff) argued that Web 3.0 will:
provide the foundation for systems that can reason in a human fashion ... In its current state, the Web is often described as being in the Lego phase, with all of its different parts capable of connecting to one another. Those who envision the next phase, Web 3.0, see it as an era when machines will start to do seemingly intelligent things.
Rather like the Second Coming, Web 3.0 is apparently the Semantic Web, AI, and the Singularity. A curious mix of concepts: today's Web does feature systems (like Google) that mine massive amounts of data to yield vaguely "intelligent" behaviors (like useful search results), but this certainly isn't because of any semantic tagging. On the contrary ...
Should we be concerned about this acceleration of generations? Nicholas Carr demurs:
the arrival of 3.0 kind of justifies the whole 2.0 ethos. After all, 2.0 was about escaping the old, slow upgrade cycle and moving into an age of quick, seamless rollouts of new feature sets. If we can speed up software generations, why not speed up entire web generations? It doesn't matter if 3.0 is still in beta - that makes it all the better, in fact.
So we are clearly ready for Web 4.0. Perhaps its time is past too. But in case it isn't (and ignoring, of course, the many previous mentions), I'd like to think of this entry as the place you heard about it first. I guess that means we need a definition. Well, here goes:
Web 4.0: When "Data is computation, and computation data: that is all ye know on earth, and all ye need to know."
While not a serious definition (and Keats wouldn't approve), the point is that we are still focused on people reading (and writing) data. Yet even with an ever-growing number of people, such approaches don't scale. We have to get to the point where data is processed primarily by computers. Then, as with Google, we may get some "intelligent" behavior at acceptable cost.
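To make the slogan a little more concrete, here is a minimal sketch (in Python, my choice; the post names no language or mechanism) of one reading of "data is computation": a computation shipped as a plain, machine-readable data structure, which another machine evaluates with no human reading it along the way.

```python
# Illustrative sketch only: an arithmetic expression encoded as nested
# lists (pure data), and an evaluator that treats that data as a program.
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(expr):
    """Treat data like ["+", 1, ["*", 2, 3]] as computation."""
    if isinstance(expr, list):
        op, *args = expr
        return OPS[op](*(evaluate(a) for a in args))
    return expr  # a bare number is already a value

# Data produced by one machine, consumed by another; no person reads it.
program = ["+", 1, ["*", 2, 3]]
print(evaluate(program))  # 7
```

The design point is that nothing here is addressed to a human reader: the "document" is the program, and scale comes from machines exchanging and executing such data directly.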