Everyone in the blogosphere seems to have an opinion on what Web 2.0 means to them, so I will add my own here. Web 2.0 is the new software tradition that we have all but transitioned to. What differentiates this new tradition from the old one is that products focus more on design than on capability and features. Take, for example, products like Flickr, Firefox, Mac OS X, Office 12, and the slew of AJAX service-based microapps.

A good example to illustrate this paradigm shift is MS Word. In earlier versions of Word (1-7), each release added more capability to the editor, evolving it from something like Notepad into the current Word 2003, while keeping the interface essentially consistent and familiar. If you look at Word 12, however, the first thing you will notice is a new interface focused on making tasks more efficient and discoverable. Even the data is stored in open formats, with the goal of making it easy for third parties to consume and access.
Why did this shift occur, and does it mean that software companies will eventually evolve into pure design companies? It's conceivable that, with technology becoming ever more accessible and powerful tools more widely available, the software company of the future may be staffed almost entirely with artists, psychologists, anthropologists, and designers, with maybe a few technical school graduates to write the tools.
And why are features and capabilities less emphasized now? Aren't they the cornerstone of the computer revolution--being empowered by technology?
The truth is, the software industry is stuck in a rut. You can see it across all specialties--in office productivity software, on the web, in gaming--no new features, no new types of websites, no new gameplay; just more efficiency, more polygons, more pseudo-chrome. Since we have no new features to add, we have kept busy making things pretty and usable, to stay employed. Why this rut? Some might say it is because the industry has entered a stage of evolution rather than revolution: we've reached a critical mass where improvements now come in small increments. I agree and disagree. I agree that this is the state of the industry, but I think the cause of the feature drought is simply that we have no new technology.
Technology as a whole, even outside of IT, has actually slowed down. Where are the Bell Labs of today? PARC is a shell of its former self. Where are the new information theories, the new quantum theories, the new internets? Technological innovation flatlined once it was no longer a wartime necessity. The internet itself is a wartime child.
Okay, I've gotten a little carried away and started rambling, but I think the solution to this technological rut is clear: we need more fundamental research. We've reached the limit of how far we can milk the results of past research. Wartime necessity or not, we need to do this basic research to improve the capabilities of our systems, so that we can claim things are still getting better.
So, what kind of new capabilities should be developed? Computers today are used almost solely to input, output, store, and transmit human data. But instead of just being repositories and pipes for that data, I believe computers can consume and reason with it, much like a human can. How this can be realized in the current market, I'll talk about in a later post.
1 comment:
You have come over to the dark side. It is a good thing that you realized all this fancy talk of strong AI, with the ability to reason and automate human tasks, is nothing but a dream for the next half century. We will all die in an ocean of our own filth thanks to the degradation of our environment.