The Web Doesn’t Suck

The web world has been awash with the sounding trumpets of the next round of battles in the long war toward web perfection (whatever that may be). With the battle over web standards ending and the dust barely settled, new combatants and their champions are lining up and taking aim. Of late we have had HTML 5 vs. Flash, CSS3 vs. its naysayers, and H.264 vs. Ogg Theora vs. VP8. Of course, these are all proxy wars being fought so that the real competitors, the browser makers themselves, don’t actually have to quantitatively prove whose browser is better. Honestly, what average user cares about Javascript execution time and page render speed when the differences are measured in milliseconds, barely registered by normal human perception?

Two recent blog posts come to mind as I think about these proxy wars and how they distract us from the bigger problems. In the first post (please read it), Sachin claims that browsers need to innovate and become more like “Apps” (referring to iPhone Apps). His deeply flawed argument aside, he has one point: sometimes it takes one organization forsaking standards and compatibility to drive innovation.

The second post, on Ajaxian, simply announces that the Google team behind the O3D web 3D rendering engine is adopting the WebGL standard for displaying graphics instead of their own solution, and will reimplement O3D as a Javascript library. A great win for standards, but what struck me was the statement that part of the shift was due to the fact that Javascript is now, in the team’s opinion, fast enough to wrap around the WebGL API. When they started the O3D project, this was not the case.

I would like to present an alternative to Sachin’s walled-garden proposal, in which one browser maker adds features beyond the standards, features other browsers lack, in order to provide a better user experience. My proposal is a less destructive walled garden: instead of adding new features, simply improve a pre-existing element of the web development stack, Javascript. If Javascript performance is getting fast enough, why not start creating websites that require certain Javascript capabilities? In this way you are not mutating standards or introducing extraneous features to the web; you are simply setting a benchmark: “This website requires real man’s Javascript, not your puny Mozilla engine; come back with a real browser.” Google may already have this plan in mind, given the lightning-fast execution times of recent Chrome nightly builds.

Javascript performance can easily be quantified and improved upon. Just as modern computer games and other “Apps” require certain performance specs to be met, why not apply the same approach to modern websites? Computer hardware power is abundant, but browser makers have always been a bit conservative about hardware requirements, perhaps because the web should be accessible to the masses and their under-performing machines, or perhaps because web browsing was long seen as a secondary activity rather than a primary means of productivity. Now is the time for this to change. Instead of screwing up markup standards for yet another decade, we should simply start applying performance requirements to modern web apps. Improve what we have before we go about adding bloat and confusion.
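The performance-requirement idea can be sketched in a few lines of Javascript: time a small workload when the page loads and gate the site on the result. This is a rough illustration only; the `benchmarkMs` helper, the 200 ms threshold, and the workload size are all my own assumptions, not part of any standard or existing site.

```javascript
// A minimal sketch of gating a site on a Javascript benchmark.
// The threshold (200 ms) and workload (one million sqrt calls)
// are illustrative assumptions, not measured recommendations.

function benchmarkMs(iterations) {
  const start = Date.now();
  let sum = 0;
  for (let i = 0; i < iterations; i++) {
    sum += Math.sqrt(i); // simple numeric work the engine must actually run
  }
  const elapsed = Date.now() - start;
  return { elapsed, sum }; // return sum so the loop can't be optimized away
}

function meetsRequirement(maxMs, iterations) {
  return benchmarkMs(iterations).elapsed <= maxMs;
}

// Fast engines get the full app; slow ones get the
// "come back with a real browser" notice.
const mode = meetsRequirement(200, 1000000) ? "full-app" : "upgrade-notice";
```

In a real deployment you would want a workload that resembles your app’s actual hot path, and a threshold chosen from measurement rather than guesswork; a one-off microbenchmark like this is noisy and easy for an engine to game.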
