Wow, it's been years since I've done one of these!
So, as some of you will have read in the last blog post (since deleted) I had a pretty tough couple of weeks recently. I'm glad to say that my world is no longer on fire, and I will be returning to work tomorrow to try and regain some level of normality. Thanks again to everyone who reached out and offered suggestions.
Nope, not version 2! What I said in that section of the other blog post still stands, but thankfully for more positive reasons than "Oh god, I can't do this anymore and I hate everything!"
For most of my 4 years in Software Development (roughly 2 of which were spent primarily as a Java middleware developer) I've been able to say I'm in the uniquely privileged position of doing what I love for a living. However, it is an extremely high-pressure job: you have to make hard sacrifices to be successful, and the nature of the role is evolving in a way that (to me at least) makes it a lot less fun than it used to be. (See the "Old man yells at cloud" section at the bottom of this post for what I mean by this.) I still recommend it as a vocation, but I think it's fair to say it's not for everyone!
It's something I've thought about for some months now, and I don't think there's anything inherently wrong with saying "this career path, where I have to live a long way from where I grew up and wrestle with the corporate machine to be successful, has an expiration date" if the pros and cons don't line up anymore. Admitting there's an issue is letting me focus on forging a better path and planning how to make that happen.
I'm tempted to get at least another year of full-time software development experience under my belt so there's a nice round "5 years" I can quote on my CV, but there's nothing to say the job won't extend beyond that if I land in an interesting enough project/business and succeed in striking a much better work-life balance.
As I see it, my choices in the longer term are to find a new way of being a software developer, switch to the more operational/security side of things, or turn my passion for writing into a full-time job. It could even be that a combination of those is possible!
To facilitate all 3 possible paths, I think my professional training next year will focus more on learning about cloud technologies and networking. I'll also be looking to expand my writing side-gig and (if I can find time) to come up with new hobbyist projects at home that excite me.
If you're in a newsagent's any time this month, check out issue 228 of Linux Format magazine! My first tutorial in over 5 years (on Vagrant, no less) has been published and released to the public. You'll see more from me in a couple of months' time in issue 230.
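For anyone who hasn't played with Vagrant before, here's a flavour of what it looks like. This is a generic minimal sketch rather than an excerpt from the tutorial, and it assumes you have Vagrant and a provider like VirtualBox installed:

```ruby
# Minimal Vagrantfile: describes a reproducible VM you can bring up
# with `vagrant up` and throw away with `vagrant destroy`.
Vagrant.configure("2") do |config|
  # Base box image to download and boot (an Ubuntu 16.04 image here)
  config.vm.box = "ubuntu/xenial64"

  # Forward a guest port to the host so you can reach services inside the VM
  config.vm.network "forwarded_port", guest: 80, host: 8080

  # Shell provisioner: runs on first boot to set the machine up
  config.vm.provision "shell", inline: <<-SHELL
    apt-get update
    apt-get install -y nginx
  SHELL
end
```

The appeal is that the whole development environment lives in one small file you can check into version control, so "works on my machine" becomes "works on everyone's VM".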
Also towards the end of this month I'll be cycling from Buckingham Palace to Windsor Castle to raise money for The Prince's Trust, an awesome charity that helps young people and those from disadvantaged backgrounds find a positive role in society. If you have a few pennies lying spare here's my sponsorship page.
Update: I'm pleased to say I'll also be attending FOSDEM 2018. I'll be arriving a couple of days before to do the tourist thing, but hopefully it'll also help me attend some of the sessions this time! If you see me wandering around the stalls feel free to give me a prod. :)
Old man yells at cloud
When I was a 15-year-old kid trying to copy VBA macros from the back of Personal Computer World magazine and struggling to make Game Maker run on horribly under-powered desktop PCs, the technology industry was a very different place. Windows ran on everything, you did your homework with Encarta, flip phones were a fashion statement, everyone had an MP3 player, and if you wanted to use the internet you went to school, the library or an internet café.
Since then it's changed radically. We've moved from one shared desktop per home to one laptop per person. Home broadband is almost ubiquitous and you can connect to WiFi or cellular internet almost anywhere. Smartphones are more powerful than the machines that ran Windows Vista. Almost any application that runs on the desktop has been duplicated in the web browser, Microsoft is forced to take other platforms seriously, and we now have so much data floating around in cyberspace that society is being forced to discuss how to police it and put it to best use. In every way I've just described, society has (in my humble opinion) been enriched by those developments.
But that isn't the only thing that's changed. The very nature of software development has transformed since I originally decided this was a career I wanted to pursue. Change comes with the territory in our industry, but unfortunately I don't think the latest trends we've adopted have necessarily been for the better.
Back in 2007 I wrote that I wanted to be a software developer "to model real world problems, create helpful utilities and make games". That later evolved in my mind to "have fun, make cool stuff and break things".
Unfortunately, that's really not where the money is any more, and the balkanisation of each constituent part of the program into discrete disciplines hasn't helped. Creating friendly user interfaces, for example, has become a discipline in itself, with graphic designers producing stunningly attractive interface paradigms I can't possibly compete with as an art muggle. Documentation falls by the wayside or gets rushed at the last minute because each of us only knows a subset of the system and always expects someone else to fill in the blanks. If someone asks how a system is constructed, we usually have to refer them to a systems designer, because they'll be more aware of the over-arching business reasoning behind it.
More specifically, we rarely model problems on a single machine with a single program anymore. Instead we appoint "architects" who are pressured from all sides to design highly scalable systems built from tiny constituent parts spread across multiple machines, all sending messages to each other asynchronously. At any given time we'll only be developing one small cog that sits behind a load balancer within that larger system. Sure, we reap the benefits of big data and the cloud to deliver ever-grander capacity for computation. But did we actually solve the problem any better? Could we have over-engineered the solution? We'll never know, because we rarely take the time to compare a system with multiple points of failure that we must keep under constant surveillance against one that is its own single point of failure with a failover mechanism.
Finally, in most enterprise settings we can no longer create programs as an exploration of new ideas. Instead, to be "good" programmers who avoid big up-front designs, we now have to pin down all of an application's functionality and behaviour up front anyway, in a giant test suite and JIRA task list, before we can even begin. One option is to do this piecemeal (say, once per task or sprint) and spend half your life refactoring and redesigning. The other is to reuse frameworks other people have written and clip them together with configuration so we don't have to write so many tests. Both options (or a combination of the two) are about as fun as gnawing your own leg off, but in the grand software factories we've built they are absolutely essential to meeting negotiated timelines and quality gates.
It feels like professional software developers have stopped trusting each other to be good programmers and keep trying to rescue each other with ever more elaborate methodologies, Jenkins jobs and "look what I made earlier" frameworks and libraries. You're actively discouraged from re-inventing the wheel, not because the existing wheel is particularly efficient or even useful for your use-case, but because you're using up valuable time you could be spending telling another technology like Swagger, Cucumber or Apache Camel to do the coding for you. Everything is stable and quick to release but alas, the soul is empty, the joy is gone, and if something breaks your only hope is Stack Overflow.
In summary, the things that excite me about programming feel like they're slipping away from the profession. This saddens me, because hacker culture should be all about having fun, breaking things and discovering innovative ways of improving a system as you fix it. Instead, as an industry we've cultivated a collective fear of breaking anything outside pre-determined tests, and we've even started studying car plants so we can "increase efficiency" and "automatically generate paperwork", for fear we aren't being taken seriously enough by other parts of the business.
It's great to see hackathons, the free software community and start-ups trying to keep the hacker-culture flame alive and stop this fear in its tracks, and I applaud my current employer for doing its best to embrace it. To be honest, I don't know what the real solution to this overall trend is, but I think if we keep relentlessly pursuing automation at the expense of fun, in fear of a mythical band of "bad programmers", we're going to worsen the burnout culture and fail to address the industry's skills shortage.
Perhaps, then, a little trust is the first step to rediscovering joy in our work and rekindling the fun that's been lost. All it takes is faith that the programmers you'll meet around the world all want to write good code, even if their definition of "good code" doesn't match yours.