You don't have to be a villain to say Flash must die

Flash's time has been over for years, but inertia has kept flawed technology alive and exploitable. It's time to kill it off.

I won’t pretend to be Steve Jobs—I don’t even own a mock turtleneck—but I have to repeat his words from April 2010: “Flash is no longer necessary to watch video or consume any kind of web content.” Flash is a constantly exploited, superannuated bit of technology that was useful in the early days of multimedia in web browsers, and it now deserves to die.

When Jobs wrote “Thoughts on Flash” over five years ago, it was in response to the notion that Flash should be available on iOS. At the time, I asked repeatedly for Adobe to stage private demonstrations of Flash running under iOS development tools. They never took me up on it, nor, as far as I know, did they for any other writer, even though they had the ability. Flash for Android, when it appeared, was terrible. Within two years, it was dead.

Yesterday, Mozilla set Firefox and Google set Chrome to block all versions of Flash except those released after Monday. (Update: Apple joined the fray on July 15, blocking Flash in all versions of OS X that support its remote web plug-in blocking mechanism.) Adobe patched Flash and released yet another update today that Firefox will accept, provisionally, as it’s possible not all known exploits are fixed. This was Adobe’s third critical release in three weeks, driven largely by previously undisclosed, or “zero-day,” exploits in Flash discovered in the data breach at security firm Hacking Team. But Adobe routinely releases fixes for both privately reported and zero-day flaws.
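The blocklist mechanism the browsers used boils down to a minimum-version gate: any installed Flash Player older than the patched release is refused. Here’s a minimal sketch of that comparison in TypeScript; the version numbers are illustrative, not the browsers’ actual blocklist entries.

```typescript
// Compare dotted version strings numerically, segment by segment.
// Missing segments count as zero, so "18.0.1" > "18.0.0.209".
function compareVersions(a: string, b: string): number {
  const pa = a.split(".").map(Number);
  const pb = b.split(".").map(Number);
  const len = Math.max(pa.length, pb.length);
  for (let i = 0; i < len; i++) {
    const da = pa[i] ?? 0;
    const db = pb[i] ?? 0;
    if (da !== db) return da < db ? -1 : 1;
  }
  return 0;
}

// Allow the plug-in only if it meets the patched minimum.
function pluginAllowed(installed: string, minimumPatched: string): boolean {
  return compareVersions(installed, minimumPatched) >= 0;
}

// Hypothetical example, treating 18.0.0.209 as the patched release:
console.log(pluginAllowed("18.0.0.203", "18.0.0.209")); // false: older build is blocked
console.log(pluginAllowed("18.0.0.209", "18.0.0.209")); // true: patched build is allowed
```

The point of doing it this way, rather than blocking specific bad versions, is that every build older than the fix is treated as compromised by default.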

Facebook’s chief security officer publicly called on Adobe on July 12 to set an end-of-life date for Flash so that browser makers could disable it permanently all at once. Mozilla’s Firefox chief concurred.

What would we lose if we lost Flash? At this point, not much. Jobs was slightly ahead of his time, as he was with all things. But the Internet of 2015 can go Flash-free overnight—in fact, it mostly has already.

A flashback with caffeine jitters

When Flash first appeared as a browser plug-in in 1996 from Macromedia (later acquired by Adobe), multimedia on the web or through native clients was exceedingly primitive. Netscape Navigator and Internet Explorer were our two main choices for browsing. And the blink tag was still cool.

Flash had its purpose: encode a multimedia presentation, video, or interactive experience once, and let it play across every browser and computer platform that supported the plug-in. Macromedia, and then Adobe, did the heavy lifting of writing the plug-in software and developing the architecture, so Flash designers and programmers could simply build on the platform. In that way, it had a lot in common with Java, which was ostensibly a write-once, run-everywhere development environment.

And Flash should suffer Java’s fate. While Java was also an integrated part of many browsers, and seemed briefly a critical part of having powerful client-side apps, that’s not how things shook out. It turns out that relying on a single company to develop code interpreters and compilers for a huge array of operating systems is problematic. You’re relying on their competence, speed of development, and concern for security.

The homogeneity of developing in Java or Flash still relied on a massive amount of heterogeneity: all the different versions of the Java Virtual Machine and the Flash Player required. Since Flash Player and client-side Java were not the core focus of Adobe and Oracle (which purchased Java’s creator, the once-mighty workstation company Sun Microsystems), respectively, neither had a sensible way to put operating-system-scale focus on a business sideline. Flash didn’t generate much revenue for Adobe per se, and browser-based Java had no financial impact, either.

In 2014, Cisco released a report that showed 91 percent of all exploits in 2013 ultimately relied on Java. Apple had already dumped Java as a bundled installation in 2012, and for a bit allowed software to essentially request a Java installation. Now, although Oracle continues to release versions, OS X users need to go to Oracle’s site to download it for installation.

Java found its purpose: a parallel version is the native language for Android apps, and Java remains in heavy use in server-side application programming. On the client side, while software continues to be available, it has to be installed on most platforms, isn’t available for iOS, and is essentially a niche product.

Java for use as an embedded web component died. Why can’t Flash?

Those who forget Flash are condemned to reload it

At first glance, Flash still appears to be in much wider use than Java ever was. If you install a Flash blocker and visit many sites that have video, interactive content, or rich-media advertising, it’ll seem like a good part of the site is broken. But that’s a matter of inertia, not requirement.

Most Flash remains in use as a generic, cross-platform wrapper around video. When Jobs unleashed his Flash thoughts in 2010, it was uncertain which video format would wind up dominant, or whether any single standard would. Apple had settled on H.264, as had Microsoft; it’s a standard covered by a pool of patents. Google, Opera, and Mozilla (makers of Firefox) pursued ostensibly patent-free standards, including the Google-backed VP8 (better known through WebM, the container format that pairs it with HTML5 video) and Ogg Theora.

As long as every major browser, mobile and desktop, couldn’t play H.264, this left Flash on the field of play: Adobe could encapsulate H.264 within Flash under its own licensing scheme, and play back other formats as well. But the competing formats were under clouds: they had no advantages over H.264 apart from patent status, and most sites weren’t encoding in them.

In 2013, Cisco removed the last obstacle: it created an open-source H.264 module and would eat any associated licensing costs. That led Firefox to drop its objections, and Google had never fully backed away. Now all major browsers, desktop and mobile, can play the same video.
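The practical upshot for site owners: instead of wrapping video in Flash, serve H.264 directly and let the browser report what it can decode. The HTML5 `canPlayType` method returns “probably”, “maybe”, or an empty string for a given MIME type. Here’s a hedged TypeScript sketch of that selection logic; the file names are made up, and the checker is stubbed so the example is self-contained outside a browser.

```typescript
// HTML5 video source selection: pick the first format the browser says
// it can decode. canPlayType returns "probably", "maybe", or "".
type CanPlay = (mimeType: string) => "" | "maybe" | "probably";

// Hypothetical encodings of the same clip, in order of preference.
const sources = [
  { url: "movie.mp4", mime: 'video/mp4; codecs="avc1.42E01E"' },   // H.264
  { url: "movie.webm", mime: 'video/webm; codecs="vp8, vorbis"' }, // WebM/VP8
  { url: "movie.ogv", mime: 'video/ogg; codecs="theora"' },        // Ogg Theora
];

function pickSource(canPlayType: CanPlay): string | null {
  // Prefer a confident "probably"; fall back to any "maybe".
  const probably = sources.find(s => canPlayType(s.mime) === "probably");
  if (probably) return probably.url;
  const maybe = sources.find(s => canPlayType(s.mime) !== "");
  return maybe ? maybe.url : null;
}

// In a browser you would pass the real method:
//   pickSource(t => document.createElement("video").canPlayType(t));
// Here, a stub standing in for a 2015-era browser that decodes H.264:
const h264Browser: CanPlay = mime =>
  mime.startsWith("video/mp4") ? "probably" : "";
console.log(pickSource(h264Browser)); // "movie.mp4"
```

This is exactly the negotiation the `<video>` element does natively when given multiple `<source>` children, which is why a universally playable H.264 file leaves nothing for Flash to do.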

Apple stopped including Flash with OS X beginning with Macs released in 2010. Google engineered a more secure way to run Flash within Chrome. iOS has never supported Flash, and some Android makers flirted with it only between about 2010 and 2012. Flash has had hundreds of vulnerabilities discovered, and an endless supply appears to remain. It’s a vector for criminals and governments to invade our privacy and compromise our stuff.

Firefox, Safari (for OS X), and Chrome disabled older versions of Flash this week uniformly—and remotely, I might add, through their process of checking for vulnerability updates in plug-ins—in newer versions of their browsers. We already live in a nearly post-Flash world, with no mobile users and ever fewer desktop users trusting Flash. There will be some disruption as software makers and website developers who have delayed the inevitable are faced with a reckoning—but they had a glimpse of that already this week.

Adobe has moved away from Flash in recent years, abandoning a misguided path that served neither its customers nor web users well once it became clear Flash couldn’t work on mobile. The company needs to take the next step and put a time clock on Flash of no more than several months. No one will mourn Flash. It’s already dead.
