Soon, machine code begat assembly language, assembly language begat ALGOL, ALGOL begat CPL, then BCPL, B, C, and finally, C++. Thus equipped, in a single day, the ambitious geek could achieve what would’ve taken years of diddling ones and zeros. Life was good.
Around that time, the Internet made its debut, and then, OMFG…
The World Wide Web! Click a link, and your browser fetched the HTML from some far-flung server and rendered it up all nice and pretty. Like magic! Except that after a page loaded, it didn’t do much beyond maybe animating a GIF or blinking a tag.
Why so static? Because the dominant C/C++ paradigm excelled at producing giant, star-shaped pegs for the Web’s round, dialup-sized holes. Sensing the need, Sun gave us Java, which promised safe, bite-sized applets that you’d Write Once and Run Anywhere.
In practice, everything didn’t run quite everywhere. However, by 1999, both major Web browsers shipped with a high-quality Java virtual machine that executed integer code nearly as fast as the equivalent C. By happenstance, I’d just founded an agency with a crackerjack engineering team, and we used Java’s “good parts” to create the first rich media banner ads, video-game style, by smashing bits, precomputing tables, and unrolling loops in ways that would make the Google Doodle blush.
Then, Macromedia’s Flash burst onto the scene, hypnotized designers with its whizzy tweens, and killed Java on the client. Overnight, the “interactive” Web degenerated into a stew of gratuitous transitions, dirt-slow ActionScript, and strange bugs that somehow survived each new Player release. Times were dark.
And, of course, feel free to show it to your friends, so they can too.
Thanks, and ¡Viva la vida algorítmica!