We live in a technological world where ‘updates’ are an expectation of virtually anything connected to the internet, and I’m not convinced that’s a good thing, for a number of reasons.
I was discussing this with a friend of mine last week. We were talking about how previous generations of video games had no update mechanism. If the game had bugs on the day the cartridges were programmed or the CDs were pressed, players got the glitches, and the game’s reputation reflected them. This gave developers (of games and other software alike) an incentive to do a good amount of quality assurance testing prior to release. The ship-then-patch model has stripped away much of that QA testing, letting the first wave of buyers fill the gap with complaints and YouTube uploads. In other words, it has created an incentive for companies to use paying customers, rather than employees, as QA staff.
Mobile apps have gotten just as bad. With pervasive LTE, apps have less incentive to optimize their code, leading to situations like the Chipotle app requiring nearly 90MB of space. It could probably be done in 10MB of code (arguably less), but instead there’s a whole lot of unoptimized code, which reduces available storage for users, increases the likelihood of bugs, and perpetuates the interminable cycle of storage management for folks with 16GB devices – an amount that seemed near-infinite just 20 years ago. Moreover, update logs used to describe new features, optimizations, and specific bug fixes. The majority of change logs have devolved into saying little more than “bug fixes and improvements”. Is the bug I’m experiencing one of the ones that got fixed? The fact that the Netflix app will no longer run on a rooted phone isn’t something that made it into a change log. Yet, with essentially no information to go on, many people allow desktop applications and mobile apps to update themselves, with little accountability for the result.
The fact that both Windows and many Linux distributions perform major updates at a semi-annual cadence is itself telling. The PC market has been fragmented for most of its existence. Even after it became the Windows/Mac battle (RIP Commodore, Amiga, and OS/2), there was a span when a person’s computer could be running Windows 98SE, 2000, NT, ME, or XP. Yet somehow, in a world before most software was just a website – where users might have dial-up or broadband, a 233MHz Intel Pentium II (introduced in May 1997) or a 1.2GHz P4 (introduced in January 2002), 64MB of RAM or 320MB, a hardware GPU or not, and screens anywhere from 640×480 to 1280×1024 – it was possible for software developers to exist and make money in that far more fragmented computing landscape. There was little outcry from end users expecting “timely updates” from Dell or IBM or Microsoft, and no expectation that software developers or hardware OEMs would list “timely updates” as a feature they were aiming to achieve. The updates that did come out were primarily bug fixes and security patches.
So, why do I call it the ‘update virus’? Because the major OS vendors (Apple, Google, Microsoft) have all gotten to the point where constant updates are an expectation, and they’re not just ‘security updates’ but ‘feature upgrades’. Many end users and fellow technicians I interact with have a condescending mindset towards those who would rather not play along. At first glance, I can’t blame people for being okay with getting new features for free, but my concern is how monolithic these updates are. It is not possible to get only security updates on any of the three major platforms; UI changes come part and parcel with the security fixes. All three operating systems have gotten to the point where there is no “go away” button; Android’s OS release options are “update now” or “remind me in four hours”. Really? Six nags a day? No user choice in the matter? There was a massive outcry when Lollipop came out; the sweeping UI changes were difficult for many to deal with, and there were more than a few reports of measurably decreased battery life. I recently bought a tablet for my mother, and an update made it impossible to access the Play Store; since the stock firmware ran fine, my only recourse was to root the tablet and disable the update. Is this truly an improvement?
Now, most people will argue “but security! but malware!”, and to an extent, they are right. Expecting users to manually disable SMBv1 to stop the infamous WannaCry ransomware from spreading is certainly the epitome of ‘wishful thinking’. By contrast, the laptop I use to control my lighting rig when I DJ recently had a Windows Update fail immediately before an event, leaving it stuck in an unbootable state and impossible to use for its intended purpose. On what basis is that behavior not precisely the sort of thing typically associated with the very malware Windows Update is purported to protect against?
Ultimately, I like “good updates”. Whether it is because they fix security holes or because they optimize a feature, I am very much in favor of good updates. I do not like “bad updates” – the ones that break an OS installation, or install at the worst possible time, or massively revamp the UI without a “classic mode”, or otherwise prevent my devices from performing their intended function. With no way to tell the good ones from the bad ahead of time, updating has gone from a best practice to a gamble.
And if I wanted to gamble, I’d go to a casino.