We’ve had the web for nearly 30 years, and it’s still garbage

6 min read · Jun 3, 2023


Anyone who told you there was a “Web 2.0” was lying. We’re perpetually running version 0.9.6b. Is that b for beta? b for “release before c”? We don’t know. And if that sounds like a joke, that’s a version of OpenSSL a lot of us ran in the years before Heartbleed happened: something like 20 years of endless beta. And Google calling Gmail “beta” for something like 10 years didn’t help the cause.

I’ve had a webpage for twenty-something years over at https://www.gushi.org. It was constructed in Dreamweaver, in the year 2000, by a friend. The graphics are all JPEGs because “GIFs were bad back then.” Why? A hideous software patent (Unisys’s patent on LZW compression). PNGs and SVGs weren’t standard either. Maybe they’d work in this browser but not that one. Who knows?

The structure of the thing was state-of-the-art at the time: some Macromedia-standard JavaScript to preload images, and nice image mouseovers. To make the page look the way it did, you had to bastardize tables and frames. Laying a page out with <div>s and CSS wasn’t a practical option in those days.

I’ve made a couple of minor changes to it, and I really do mean minor:

I added SSL when Let’s Encrypt made certificates free (because browsers had started warning that your static HTML-and-frames site wasn’t secure!!!).

At one point, I fixed all the HTML so it had an XHTML Strict doctype and ran cleanly through the W3C validator. (Now, ironically, those doctypes are obsolete, and you just say “<!DOCTYPE html>”, with the implication being that there’s no real standard to adhere to.)

But because I’m a server guy more than a design guy, I played with something new called Content Security Policy (CSP). I covered this here.

Basically, on big “Web 2.0 websites,” this is a policy you can deliver in either a META tag or an HTTP header that specifies what can be done with your site. What other sites can load mine in a frame? Can elements on my site ask for the camera? Where can my site load scripts from? The idea being: if someone compromises your WordPress install and tricks it into letting them put a <script> tag in a comment, they can’t remote-load a script from somewhere else and run it in your victims’ browsers.
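For flavor, a minimal policy might look like this (the directive names are real CSP; the cdn.example.com origin is just a made-up example, not anything from my site):

```
Content-Security-Policy: default-src 'self';
    script-src 'self' https://cdn.example.com;
    frame-ancestors 'none'
```

Here script-src restricts where scripts may load from, and frame-ancestors 'none' says nobody gets to put your site in a frame. (In a real response the whole thing goes on one line.)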

(Spoiler: If you want a secure site, don’t run WordPress).

Now, this is a largely static site — there aren’t a lot of opportunities for cross-site scripting here — but it was still a chance to play with the headers and see how the site fared against various validators.

Cut to now. I’m playing around with a new, all-static, no-JavaScript site design, and I wondered how difficult it would be to get the previous version running with all the knobs tightened, producing no errors, before I load it onto a flaming barge and cast it off to Valhalla.

Answer: impossible. I can either set the 'unsafe-inline' source (basically a “rocks ahead, I don’t care” flag for anything loaded as part of the page) OR I can try applying a hash to every piece of JavaScript on the page.

Some background: if you don’t want a browser running arbitrary scripts on your page, you can send a header that says “only run scripts matching this SHA-256 hash.” Okay, fine. I go into the Chrome developer console, which tells me the hash I need to add, and it stops warning about loading that script.

And I add it to my CSP:

script-src 'sha256-Q5CSukvieOuoG2ouforwyknjnSRmQgWqfT1CyCRawBQ='

…to an already-long header. It works. The script loads. But as soon as I try my mouseover graphics, my JavaScript console fills with failures.

And let me make a long story short: there’s no way around this. You can declare a script safe with hashes, but you cannot do the same for an event handler like onClick or onMouseOver.

According to some docs (here and here), there’s a source expression called 'unsafe-hashes' with which this *should* work, but I just spent an hour copy-pasting every one of my page’s event handlers into invocations of:

echo -n "MM_swapImgRestore()" | openssl dgst -sha256 -binary | openssl base64
echo -n "MM_swapImage('Image2','','minitop1mo.jpg',1)"| openssl dgst -sha256 -binary | openssl base64
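(For what it’s worth, you don’t need openssl for this. A quick sketch of the same computation in Python, standard library only:)

```python
import base64
import hashlib

def csp_hash(source: str) -> str:
    """Build a CSP 'sha256-...' source expression for an inline script
    or event-handler string. The string must match the attribute value
    byte-for-byte, whitespace and all."""
    digest = hashlib.sha256(source.encode("utf-8")).digest()
    return "sha256-" + base64.b64encode(digest).decode("ascii")

print(csp_hash("MM_swapImgRestore()"))
```

Same output as the openssl pipeline, minus the copy-pasting.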

So I wound up with a hideously long CSP header containing a bunch of hashes. It did nothing in Chrome. Nothing in Safari. Even though it should have worked, according to these Mozilla docs.
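To give a flavor of it (this is a reconstruction, not my literal header, and in a real response it’s all one line):

```
Content-Security-Policy: script-src 'self' 'unsafe-hashes'
    'sha256-Q5CSukvieOuoG2ouforwyknjnSRmQgWqfT1CyCRawBQ='
    'sha256-<one entry per distinct handler string>'
```

One hash entry for every distinct onClick/onMouseOver string on the page, plus 'unsafe-hashes' to (supposedly) opt event-handler attributes into hash matching.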

It’s a lie.

The solution is to set unsafe-inline, or not use this design at all. This page (on content-security-policy.com) claims:

“The unsafe-hashes directive was added to CSP Level 3. It is currently supported in Chrome 69+ or Chromium Based Edge 79+. Safari 15.4 also supports unsafe-hashes.”

Chrome 69 came out in 2018. Safari 15.4 came out in 2022. This site has a copyright date ending in 2021.

That said, CSP Level 3 is still a working draft as of May 2023 (a month ago, as I write this), and it notes specifically that as of that version:

The 'unsafe-hashes' source expression will now allow event handlers, style attributes and javascript:navigation targets to match hashes. Details in § 8.3 Usage of "'unsafe-hashes'".

So…surprise. Everyone says something different, and nobody knows what they’re talking about.

In the end, what we find is what I found while reworking my site from frames to CSS: there’s this period when “the new thing” comes out and it’s utter garbage. As was mentioned in this flexbox tutorial:

The modern Flexbox standard comes to us from around the year 2012.

I knew the ins and outs of CSS layouts with floats and positioning as support evolved in the 2000–2010 era. We made CSS do things it was NOT designed to do. It was nasty, gritty stuff.

I was so used to the IE6 Web stagnation that I did not bother learning flexbox when it first came out because I couldn’t use it “yet”. Then I didn’t do much web design for about a decade.

When I finally learned flexbox a couple years ago, I couldn’t believe how good it was. This is truly the layout tool we would have killed for during the Table Layout Era and the CSS Float and Position Era after that. Seriously, those were brutal knife-fighting days. To accomplish what you can do now with display: flex; flex-wrap: wrap; back in those days would have taken you 16 hours to create and then some client with an old copy of IE would have come and taken a pint of your blood.

My site? It’s going to be fully static Hugo and pure CSS very soon (and yes, I’ll still implement those security policies, without any event handlers at all). Until someone comes out with another helpful standard that screws everything up.

But if you think this is what “working” looks like, you’re wrong.

The browser wars are far from over. Things still suck.




Gushi/Dan Mahoney is a sysadmin/network operator in Northern Washington, working for a global non-profit, as well as individually.