More Fuel

Malware-installing “ads” appearing on major websites.

Yet more fuel for the case not only for using ad-blocking software like Adblock Plus, but also for installing NoScript and leaving scripting and embedded objects disabled by default.

At which point my argument that relying on Flash or JavaScript is a bad idea for your website becomes even more important. If a first-time visitor can’t access your website because it hasn’t yet earned enough trust to be allowed to execute scripts or embedded objects, how will it ever earn that trust? Furthermore, you lose that critical first impression.

Improving Online Security: A Usability Tool

Many years ago I didn’t care much about JavaScript, Java applets, Flash, and the like being run inside my browser. Nor did I care what was being done with the cookies sent to and returned by my browser.

I’ve changed that.

I’ve been using NoScript for some time now. This alone gave me pretty good insight into how embedded scripts and objects drive the user experience on the web, and into how it’s quite possible to develop an interactive and interesting experience without any of them. It also put me through the experience of not being able to access sites that rely heavily on such things, which gave me a much greater appreciation for the need to develop websites with usability and accessibility in mind, beyond the simple “use alt attributes on your images” advice.

Recently I’ve extended this blockade to cookies. I’ve configured Firefox to not accept any cookies at all.
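For anyone wanting to reproduce this, the relevant hidden preference (as I recall; verify it in your own about:config, since names and values can change between Firefox versions) is:

```
network.cookie.cookieBehavior = 2
// 0 = accept all cookies, 1 = block third-party cookies, 2 = block all cookies
```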

How quickly things break down.

Almost immediately you (re)discover all the tricks.

Sites that, as soon as you hit them, immediately try to assign a cookie to identify you. With cookies and scripts disabled this usually gets you a blank page, because the script-driven redirect no longer works. Other times the entire page loads, only to immediately redirect (via meta refresh) to another page claiming the site won’t work without cookies (despite it obviously having worked just fine).
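A sketch of that pattern (hypothetical markup, not any particular site’s code): the page sets a cookie with script and redirects if the cookie sticks; with scripting disabled, only the meta refresh ever fires, landing you on the “cookies required” complaint page even though the content itself would have worked.

```html
<!-- Hypothetical landing page illustrating the redirect trick described above. -->
<html>
  <head>
    <!-- Fallback: if the script below never redirects, send the visitor to a
         complaint page after 2 seconds, whether or not cookies actually work. -->
    <meta http-equiv="refresh" content="2;url=/cookies-required.html">
    <script type="text/javascript">
      // Try to set a tracking cookie, then read it back.
      document.cookie = "visitor_id=abc123";
      if (document.cookie.indexOf("visitor_id=") !== -1) {
        // Cookie stuck: proceed to the real content.
        window.location = "/home.html";
      }
      // Cookies blocked but scripts enabled: fall through to the meta refresh.
    </script>
  </head>
  <body>Loading&hellip;</body>
</html>
```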

Other sites rely on interstitials to force users to view advertisements. With cookies disabled you can bypass the ads completely: the site can no longer tell whether you’ve seen the ad, and defaults to assuming you have.
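The logic behind such an interstitial gate might look something like this (a hypothetical sketch, not any real site’s code). Suppose the landing page sets a `seen_ad=no` cookie, the ad page flips it to `yes`, and the content page shows the ad only when it sees the explicit `no`. With cookies disabled, the cookie is never set at all, so the gate never fires: exactly the “default to yes you have” behavior described above.

```javascript
// Hypothetical interstitial gate. Cookie name and logic are assumptions
// for illustration, not taken from any real site.
function shouldShowInterstitial(cookieHeader) {
  // Parse a "name=value; name2=value2" cookie string into an object.
  const cookies = Object.fromEntries(
    cookieHeader
      .split(";")
      .map((pair) => pair.trim().split("="))
      .filter((parts) => parts.length === 2)
  );
  // The flaw: the ad is shown only on a positive "not seen yet" signal.
  // A visitor who rejects all cookies never carries the cookie at all,
  // which looks identical to "already seen", so the ad is skipped.
  return cookies["seen_ad"] === "no";
}
```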

Then I went about setting exceptions to the cookie block: specific sites only, a whitelist much like NoScript’s. This way I could keep the experiment going for an extended time without disrupting my experience at my usual hangouts.

This led to some very interesting situations with online payment systems. One particular company I do business with redirects you through four different servers with completely different domain names just during the login process, each one requiring its own cookies to be set. As you progress through making a payment you hit each server through embedded objects or by visiting it directly. At times it was quite frustrating trying to identify each server, but it also gave me a lot of insight into how the company conducts its online business and how it’s structured: information normally invisible to a user who has scripts and cookies globally enabled.

It is, at the very least, an interesting exercise, and one I think every web developer ought to subject themselves to, just to fully appreciate the situation and to consider how the lessons learned might apply to their own work.

Stay Away From In-Browser Scripting

We could very well have a split user community within a few years, and the basis for the split is in-browser scripting.

I keep saying it and I keep seeing more evidence to support my case.

We, as web developers, need to get away from using JavaScript, ActionScript, or any other sort of in-browser scripting.

At the very least, the core functionality of a website should not rely on it: a user should not be required to enable JavaScript to use your website.
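One way to honor that rule is progressive enhancement: build the feature in plain HTML first, then let a script improve it if it happens to run. A minimal sketch (the URLs and ids here are hypothetical):

```html
<!-- A search form that works with scripting disabled: submitting it does a
     normal page load against the server-side handler at /search. -->
<form id="search" action="/search" method="get">
  <label for="q">Search:</label>
  <input type="text" name="q" id="q">
  <input type="submit" value="Go">
</form>
<script type="text/javascript">
  // Optional enhancement: if scripts run, intercept the submit and update
  // the page in place instead of reloading it. If they don't run, the
  // form above still works entirely on its own.
  document.getElementById("search").onsubmit = function () {
    // ...fetch and render results here (omitted)...
    return false; // cancel the normal form submission
  };
</script>
```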

Microsoft is starting to make a stink over the future of JavaScript. They are opposed to any major updates to the ECMAScript standard (ECMAScript being the standardized name for JavaScript).

Some think this might be about Silverlight and Microsoft’s wish to gain more control over the web.

Stuck in the middle of this battle is any developer who relies on JavaScript for their website. What happens if Microsoft sticks with its current version of JavaScript and never supports future changes to the standard? Pages built to the newer standard start to break in IE, while other pages start using Microsoft’s solution instead; those pages will work great in IE and fail in other browsers.

The solution?

Don’t use any in-browser scripting. Then your pages work for everyone. Good old HTML and CSS 2.1. (CSS 3 too, sure, but count on about three years after the first working draft before it’s widely usable.)

Or you simply decide that you don’t need to support all user bases. That you’re quite accepting of developing a website only usable through IE7. That’s a very real, very valid choice to make, no different from the choices billion-dollar companies make when they release a game title exclusive to one gaming platform. And these companies are successful at it too (otherwise they wouldn’t be billion-dollar companies to begin with).

I have to accept that as a legitimate alternative.

However, my personal feeling is that we should be striving for compatibility and usability. Our job, as web developers, is to make access to information as easy as possible; we should open the doors to information, not close them. Some will claim that a given script or platform is required because there is no alternative way to deliver their specific type of information.

In very specific cases this might be true.

But I’d say 90% of the time it’s laziness, or some other hidden agenda, that drives development in this proprietary direction (like Microsoft trying to force people to use their products).

If you let yourself be swept along, web development will become a confused and sticky place to be three to five years from now.

But if you free yourself of these added burdens (JavaScript, Flash, Silverlight, Java, AJAX, web OSes, etc.) and stick to what works for everyone (HTML+CSS), you’ll be well off.