Windows Vista is approaching its "end of life" date and support is due to expire on Tuesday 11th April 2017. I think it's fair to say this operating system divides opinion.
I love Windows Vista. I think it's great. But you do need some serious horsepower behind it, and my laptop doesn't have it.
That quote from me in The Tide Turns more or less sums up what the 16-year-old Bobby Moss thought of Windows Vista. More surprising perhaps are the first four words, given how much of a panning Vista has had over the years and how often I hold it up as one of Microsoft's greatest mistakes!
But in the interest of fairness, it is worth remembering some of the useful Windows features we still benefit from today that owe their existence to this ill-fated OS:
...and yet Vista fell flat on its face, and most users didn't experience this new functionality until Windows 7.
It also wasn't marketed particularly honestly, because despite all this promise Vista was a massive resource hog compared to Windows XP.
To be classified as Vista Ready, a laptop just needed 512MB of RAM, 64MB of graphics memory and a 1GHz clock speed. To be Premium Ready you needed 1GB of RAM, 128MB of graphics memory and at least 1.5GHz. But, as I put it in Vista: My Thoughts...
I agree, it can run Vista; that much is true. Now try to run Microsoft Office 2007.
Microsoft ended up on the wrong end of a class action lawsuit because these specs were much too optimistic, and their claim Vista could run on "any machine on the market today" was just flat-out wrong.
A big culprit of the performance problems was SuperFetch, which was designed to preload the applications you were most likely to need into memory and cache everything else. Unfortunately, this led to it maxing out the available RAM and slowing the entire system to a crawl every time you switched from one app to another.
Another issue that drew people's ire was the UAC (User Account Control) prompts popping up much too often. This could be caused by badly-coded XP applications that demanded strange levels of access to the system, but the issue was compounded by UAC not being particularly good at remembering what you'd previously authorised: as a result, it prompted users every single time admin access was needed. Given that these prompts would usually take over the entire screen and could crash underpowered systems, many users simply disabled UAC or granted access to any program that requested it, effectively rendering the entire mechanism pointless!
The final nail in the coffin was the crazy pricing structure for the operating system. There were six editions to choose between, each with different levels of functionality, and in the UK we were paying roughly double what Americans were for the same download. Even then, there was no guarantee Vista could run your existing applications, and there were nasty rumours it would downgrade the quality of any videos you wanted to watch, because your existing hardware was unlikely to support the heavy-handed new DRM and copy-protection technologies Microsoft was introducing.
However, there is a silver lining. I squarely attribute my interest in Linux to Windows Vista disappointing me on modern (as of 2007) hardware.
I had just surprised my parents by getting 3A*s, 5As and 2Bs in my GCSEs ahead of starting my BTEC National Diploma. As a reward they bought me a Toshiba Satellite L20 laptop. It was supposedly "Vista Ready" and had exactly the minimum required specifications. While I was very grateful, unfortunately it ran like an absolute dog. As I noted in Vista: My Thoughts:
As and when it does load up, you will (if you try anything ambitious like centering text or making text bold – stuff you can do in Wordstar…) hang the app or get a ‘Microsoft Word 2007 is not responding’ error message.
The only way to make the machine usable and preserve my sanity was to downgrade it to Windows XP, because I didn't have the funds to upgrade the hardware.
I did eventually upgrade the RAM, but by then the damage was already done. When you're a teenager who's been given a shiny new laptop for the first time, discovering the cool new operating system it comes with has to be uninstalled for it to be useful does colour your opinion of Windows a tad!
For a couple of years before that, I had been running Linux in Live CD form on my parents' desktop out of curiosity. This mostly involved running Mandriva off a magazine cover disc that had previously been stuck to the front of Personal Computer World.
Ubuntu 7.04 "Feisty Fawn" was the first version I took seriously and dual-booted with XP, and 7.10 "Gutsy Gibbon" (released six months later) was the first version I ever ran properly in solo-boot, as I noticed I was running Ubuntu most of the time anyway and could do Windows-specific college work on their desktop machines.
Ubuntu enabled me to do everything I wanted to on a home machine (I was a console gamer), and instead of having to manually download, install and update a silly number of third-party drivers from Toshiba's website, I just had to copy a few lines into an ALSA configuration file after each major upgrade to get my sound card working.
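From memory, the fix was something like the snippet below: a one-line hint in a modprobe configuration file telling the Intel HDA driver which quirk table to use for the laptop's audio codec. The file path and the `model=` value here are illustrative; the right value depends on the specific codec in your machine, and the ALSA HD-Audio-Models documentation lists the valid options.

```
# /etc/modprobe.d/alsa-base.conf (path varied between Ubuntu releases)
# Tell the snd-hda-intel driver to apply Toshiba-specific quirks.
# "toshiba" is an illustrative value - check the model list for your codec.
options snd-hda-intel model=toshiba
```

After saving the file, reloading the `snd-hda-intel` module (or rebooting) would pick up the new option.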
So I guess in a way I should be grateful for Vista's unreasonable hardware requirements, because ultimately it's what led to me becoming the programmer I am today and also resulted in the creation of all those articles I wrote for Linux Format magazine. The moral of the story? When the world gives you lemons...
Ironically, ten years on, I don't actually run Linux as the primary operating system on any of my main home computers anymore. My laptop (an HP Spectre) and the desktop gaming rig I've built both run Windows 10 in solo-boot with BitLocker-encrypted drives.
This is partly because running Live Linux CDs/installers over USB3 and USB-C seems to be broken, but mostly because I've become more of a PC gamer in recent years. Even though Linux has advanced a lot on this front thanks to Valve, Windows still gets the new shinies first and Linux driver support for my hardware is pretty hit and miss.
In the past I would have gone out of my way to make sure the hardware I was buying was a "known good" machine for running Linux, or been motivated to make the time to fix those problems out of sheer Linux enthusiast fanboyism. However, not only do I now have a 9-5 job with a 2-hour commute either side, I spend most of that job hunting through forums, wikis and other documentation to figure out workarounds for applications I'm developing for Linux servers - I don't want to do more of that when I get home!
Besides, being able to quickly spin up Vagrant VMs, Docker containers and the Ubuntu subsystem means I get most of the functionality I was after in the first place. I'm sure there are die-hard enthusiasts screaming "Nooooo! Micro$haft have suckered him", but if you honestly still believe "the year of the Linux desktop" is coming at this point, I can't really help you! (There was a brief window of opportunity when Netbooks were a thing, but it's over now. And Windows 10 actually isn't a bad operating system).
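To illustrate how little ceremony that workflow involves, a throwaway Ubuntu VM needs nothing more than a minimal Vagrantfile along these lines (the box name and memory figure are illustrative; any current official Ubuntu box works the same way):

```ruby
# Vagrantfile - minimal sketch of a disposable Ubuntu VM.
# "ubuntu/xenial64" was a common official box around this time;
# substitute whatever Ubuntu release you actually want.
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/xenial64"
  config.vm.provider "virtualbox" do |vb|
    vb.memory = 1024   # keep the VM lightweight
  end
end
```

With that in place, `vagrant up` builds the VM, `vagrant ssh` drops you into a shell, and `vagrant destroy` throws the whole thing away when you're done.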
If it's any consolation to my enthusiast friends though, I still develop web/server-side Java applications that run on the OS for a living, this site is powered by open source tools and I've lost count of the number of Raspberry Pis I own. I still have two ancient machines knocking around for hobby tinkering projects: an old Asus Eee PC 900 that runs Debian and a Dell Latitude X200 I've just installed Xubuntu on (pictured below).
I also seem to have become primarily an Android user, the Umi Max and Kindle Fire HD being my tools of choice. The only evidence I once owned Apple products at this point is an Apple TV unit I still use to stream boxsets I bought 5 years ago (I missed English-language TV while I was living in Belgium, and they didn't have Netflix yet!).
So, in short: Linux is still an important part of my life. And the cool retro space game I've been developing on the quiet is going to be released for free under open source/creative commons licenses. My conscience is clear and I guess after all this time I'm still benefiting from Microsoft's past mistakes... :)