Tuesday, December 19, 2017

On a Y2k18 Bug - Schoolboy error or sadistic API?

Being an older developer, I remember all the fuss about the Y2K bug.  For the younger readers, shortly before the millennium, there was huge panic that mankind would suffer enormous catastrophe as we moved from the year 1999 to 2000.  The cause of this concern was the short-sighted way some computer programs had been storing dates.  We had been using just two digits to store the year - like 77 for 1977 and 85 for 1985.  Developers of that early software had assumed their programs would not still be running by 1999, so handling years beyond 99 (1999) never needed a thought.
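To illustrate the point (a hypothetical sketch, not taken from any real system), a naive comparison of two-digit years looks perfectly fine right up until the rollover, when the year 2000 suddenly appears to come before 1999:

    // Hypothetical Java sketch of the two-digit year problem described above
    public class TwoDigitYear {
        // Years stored as two digits: 77 for 1977, 99 for 1999, 00 for 2000
        static boolean isLater(int yearA, int yearB) {
            return yearA > yearB;            // naive comparison on the stored digits
        }

        public static void main(String[] args) {
            System.out.println(isLater(85, 77)); // true  - 1985 is after 1977
            System.out.println(isLater(0, 99));  // false - yet 2000 IS after 1999
        }
    }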

Friday, November 10, 2017

On the extremes of radical new management (part 2)

Over the last 20 years, I've seen a few managerial masterstrokes, particularly at senior level, and without being privy to the discussions that led to them, I have on occasion tried to rationalise these oddities with possible explanations - here are some of my thoughts.

Being Radical
Often, when a new person is brought into a senior managerial role, the company is looking for change - typically a big change - because they believe that something is fundamentally wrong.  Companies get stuck in a rut when the culture stagnates around existing or historical dogma.  They think they need a big culture shock / kick up the backside to get things back on track, and who better to hand this high-risk strategy to than....

Tuesday, October 3, 2017

On the extremes of radical new management (part 1)

The early days of my career were spent with a small, struggling software company competing in a challenging market mostly dominated by much larger players.  Our company had a different vision, some very talented people and a culture of taking raw talent and honing it through collaboration and mentoring.

We had a development lead - let's call him Dave to protect his true identity as everyone knows 3 or 4 Daves.  Dave was an unassuming genius who loved what he did and always had time for others. Within a couple of years of professional development, I had seen Dave perform some extraordinary feats of technical mastery and I aspired to be just like him.  I thought I was getting pretty good at what I did, but Dave was the guy that all the devs looked up to and relied on when things broke in ways that no one else could fix.

Then a strange thing happened...

Monday, July 24, 2017

...and just enough refactoring to not screw it all up

From my previous post, you might have learned that I'm working with a small company that has yet to reach the level of maturity where a more rigorous development process would not be viewed as over-egging. Disturbingly, we already have a legacy codebase, since the original concepts had been kicked around for a while in prototype form before morphing into the real application. Regardless of my personal opinions, any time I want to embark on a radical phase of improvement, I find it useful to ask myself whether I would want that right now if I were paying the bill, or whether there are still bigger fish to fry.

Monday, June 5, 2017

On the art of gradual improvement...

Like most people, I've worked with some pretty ropey codebases over the years - in fact I'm big enough to admit that I've created or contributed to some too. However, I'm no longer fazed by it in the way that I might have been 10 years ago. 'Legacy code' is merely a fact of life that, as professionals, we have to accept - and dare I say it - embrace.  It's not that I enjoy working with legacy code any more than the next guy, it's just that I have a good appreciation of how 'less than perfect' code comes about and a trusted bag of techniques for making it better.

Monday, May 8, 2017

On lessons in learning from a 6 year old...

I've been developing software for what sometimes feels like a lifetime. I mean, it's almost a quarter of a century, so for some, it is indeed a lifetime. To the hip youngsters who have only ever known the internet world, mobile devices and touch interfaces, I'm a dinosaur. However, as someone once said somewhere: "what other people think about you is none of your business". The thing that has been occupying my thoughts more recently is what I think about myself.

Tuesday, March 7, 2017

On the Road to Type A Nominalization

I studied computer science at university in the early nineties.  It's a long time ago now, and although it wasn't all mainframes, terminals and punch cards, it was still relatively early in terms of a real understanding of how to build complex software.  There was a lot of talk about object orientation, CASE tools, code generation and expert systems.

Prior to university, I'd done hobby programming in Z80 assembler, GW Basic, Turbo Pascal and 'C'. I couldn't imagine anything other than procedural programming because that's what I'd been doing up to that point, and getting some pretty good results with it too.

Learning how to use some crappy code generation tool that would create 30 almost empty Ada files with a marginal boilerplate benefit did not set my world on fire.  However, the object orientation stuff was much more appealing.