Yesterday I upgraded the graphics in my project box with a second-hand graphics card I picked up from Facebook Marketplace. Aside from writing, I'll admit my other great passion is gaming — and my kids love Minecraft. While the consoles are great for most games, I still prefer the PC for real-time strategy and role-playing games — games where the precise control of a keyboard and mouse is superior to the vague approximations of a controller.
Backing up a little, you might be wondering what a self-professed rusted-on Mac user is doing talking about gaming — after all, Apple Macs suck for gaming.
As I noted last year, I'd been given a hand-me-down PC by a family member. He was looking to offload the box in favour of adopting the laptop lifestyle, complete with a docking station. For me, the computer's purpose is straightforward: a PC on which to play games and run the Adobe Creative Suite.
So far its gaming abilities have been severely curtailed by the integrated Intel HD graphics. While it could play most titles pre-dating 2007, it had no hope of playing anything remotely new, including the Witcher series, which I've owned for a while (thanks to gog.com discounts) but never played. In fact, it barely managed Starcraft 2 at anything approaching a decent resolution.
The 'new' graphics card, an AMD Radeon HD 7850 OC, cost me the modest sum of $60 from a guy living less than 10km from my home in Ringwood. It's an older card, but as I've discovered it's still capable of playing many titles at 1080p at frame rates more than acceptable to me. Installation was straightforward (though I did have to re-route a few SATA cables to make room) and downloading and installing the drivers from AMD was easy. Even though the manufacturer (Gigabyte) stopped supporting the card in 2012, AMD (the chipset manufacturer) still makes a driver available for Windows 10.
Under my initial tests, I can now play Starcraft 2 at 1080p on Ultra settings without missing a beat. It also handles the Witcher 2 at 900p on High settings. Under the integrated graphics, the Witcher 2 was rendered like a slideshow — even at 800x600 pixels. A quick browse of YouTube suggests the card will even play the Witcher 3 (though with more modest settings), along with a bunch of titles I had planned on buying for the PS4.
To say I'm happy with the upgrade is an understatement. It's also led me to reassess my feelings about the PC in general, not to mention the value and benefits that can be derived from giving old hardware a new lease of life and keeping it out of landfill.
For years, I've been a Mac user – a platform that's notoriously difficult to upgrade and maintain. The only Mac with a user-serviceable graphics card is the classic Mac Pro, a computer no longer in production. Modern Mac laptops have no user-serviceable parts at all, not even memory or storage. When a component on a Mac dies and you're out of warranty, you're looking at a new computer or paying Apple's obscene repair prices.
I'm not about to jump ship. Touch wood, my 2015 MacBook Air will last many more years – I don't tax it with anything more than writing, coding, browsing the web and occasionally playing movies. These activities, I might add, could equally be done on an iPad, and there's a case to be made for doing just that. If Apple continue to make laptops with shit keyboards, then I'll certainly look elsewhere once my MacBook Air dies.
For a while, I gave serious consideration to a 5K iMac as our family computer – mostly out of desire for the retina screen, the all-in-one enclosure and the necessity of keeping the family in Apple's ecosystem. The cost, though, is not something I'm prepared to swallow, not when there are known issues with screen burn-in and yellowing. Certainly not when upgrading components is a near impossibility.
The PC is a different animal. A traditional desktop is a robust, general-purpose powerhouse, even one as old as my project/gaming box. I don't care what I throw at it: gaming, video and photo editing, media transcoding. It's astonishing what you can do with a desktop-class CPU, hefty GPU, a 600-watt power supply and a bunch of fans to keep everything cool. Having fried my Mac mini's motherboard, I'm extremely leery of Apple's thermal management in their ridiculously small enclosures.
There are other reasons why I like desktops:
- Tonnes of cheap, high capacity storage: check.
- Optical drives galore: check.
- 10 USB ports: check.
- PCI slots for adding more functionality: check.
- No dongles or expensive docking stations.
- Burn out a component? Simply replace it.
- Reach a performance limit? Upgrade the component that's the bottleneck.
For a family computer serving as a workstation and gaming box, these are desirable qualities. It doesn't have to be thin or quiet; it just has to do the job reliably and be cost-effective.
I said I wasn't about to jump ship and that's true, yet that doesn't mean I'm going to stick my head in the sand, drinking the Cupertino Kool-Aid. I'm getting used to Windows 10 and I'm liking its new Linux subsystem. Although I prefer macOS, I could get by without it if I had to. The only piece of software I truly feel dependent upon is Scrivener, and that runs on iOS and Windows too – Scrivener 3 for Windows is promising feature parity with its Mac counterpart. I'm jealous of people who can draw and write on their computers with a stylus, and I find the idea of a device like the Microsoft Surface highly appealing.
In thinking about getting the 5K iMac to share with the family, I could see a world where I ditched the laptop in favour of an iPad Pro for the bulk of my personal computing needs. Realistically though, there's no reason why that 5K iMac couldn't be a powerful Windows PC instead.