Posts

Requirements Change is not an excuse

When was the last time you developed a product and knew all, and I mean all, of the requirements before you started implementation? Never? That’s right: requirements change; they always change. If you’re developing a product for yourself, and you’ve created your own requirements, chances are you’ll change what it’s required to do while you’re actually developing it. This is only natural. As you spend more time on a project, you gain a better understanding of what you’re trying to achieve and how best to achieve it. The same holds true when the product requirements come from somebody else. Developers are often asked to develop a product based on a set of loosely thought-through requirements, only to find that the requirements change when more thought is given to the idea (often as a result of the developer asking for more information because the original requirements don’t make sense). Because requirements always change, it is not reasonable to use this as an excuse for late…

All new developers should read this book

I read this book a number of years ago, when I was a junior developer in my first job. At the time I had about one year’s experience developing web applications (Classic ASP, JavaScript, CSS, and HTML), using knowledge I’d gained at university and anything I could find on the internet. Keen to keep up with the latest technologies, our development department decided to use the then newly released Microsoft ASP.NET. To help smooth the transition from Classic ASP to ASP.NET, we were sent on a “suitable” week-long training course. Eager to employ our newly acquired skills, we returned from the course and went to work hacking together ASP.NET applications in the same, mostly non-object-oriented, fashion. Soon after, a new senior developer joined our team and, upon recognising my limited object-oriented approach and complete lack of design-pattern knowledge, suggested I read this book. I highly recommend this book if you have: little or no object-oriented, or design-pattern, knowledge.

Drag and drop. PLEASE STOP!

Give a man a fish and he’ll eat for a day; give him a jQuery UI toolkit and he’ll try to use every bl@*dy effect in every bl@*dy web page. I’ve seen it happen. Don’t get me wrong, I’m a fan of jQuery, and I think some of the jQuery UI widgets and effects can really enhance the look and usability of a website. My issue is that whilst appropriate use can enhance looks and usability, inappropriate and excessive use can have the exact opposite effect. I should make it clear that I use jQuery merely as an example here; I could have used any other toolkit, in any technology, to illustrate my point. A common mistake web developers make, once they’ve discovered a new technology, is to attempt to design their user interface around that technology, instead of designing the interface based on the users’ needs. You will, of course, need to understand what you can achieve with the technologies available to you, but you should strive to design the interface first and then select the best…

Manage your manager

When was the last time you asked your manager for something? They’re not just there to give you orders and make your life a misery, you know. OK, so there are lots of good managers out there. I’ve been lucky enough to work with managers who have, for the most part, been supportive of my personal development needs. I shouldn’t really refer to it as luck, although, of course, there has been an element of luck involved. I prefer to think that I’ve selected my managers, and in a way, I have. I might not have recruited them to their positions, but I certainly vetted their suitability during my own interview. I’ve turned down a number of roles because I thought it would be difficult to work with the person interviewing me. But what if your manager wasn’t at your interview? That is when you must take responsibility to, as an ex-colleague of mine used to say, ‘manage your manager’. Managing your manager involves making sure they know what you expect of them, in order to make your job easier and…

Make your Prototypes look bad

Prototyping your software is a great way to prove to yourself, and others, that you can get something to work. One of the first goals when developing software is to make it work, and herein lies the problem. As soon as you demonstrate your software to the user, your manager, or any other interested party, they will assume that it is ready to use. It doesn’t matter to them that it’s a ‘prototype’; what they can see is a working product, and they will want to use it straight away (provided it does what it was intended to do). What if you deliberately made it look bad? What if you deliberately made it error? They certainly wouldn’t be so keen to roll out the red carpet and put the champagne on ice. Perhaps introducing faults into your prototype is slightly OTT, but it is important NOT to give the impression that the product is near completion, and thus create false delivery expectations that you may be unable to meet.

Not all bugs are equal

The term ‘bug’ is widely used in software development circles to describe a fault, or defect, in software. Your average dictionary will confirm that the word itself has many meanings and usages in the English language. It is because of this wide usage that I find it strange that the word has been embraced so readily by software professionals. A developer claiming that their code is “bug free” gives the wrong impression that the software is perfect, regardless of the many other criteria that need to be met before ‘perfection’ can be achieved. Suppose that, during testing, Product A was found to have significantly fewer bugs than Product B, yet Product A poses the far higher risk due to the nature of the bugs found. Perhaps we should introduce the concept of ‘nasty bugs’ and ‘cute bugs’? Maybe not, but we should definitely consider using different terminology to describe software faults and, crucially, understand the impact that different types of ‘bugs’ will have on the quality of our software…
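The fewer-but-nastier point can be made concrete with a small sketch. The severity labels and weights below are invented purely for illustration; the idea is simply that a severity-weighted score, rather than a raw bug count, should drive any comparison between products.

```javascript
// Illustrative only: severity labels and weights are assumptions,
// not a real classification scheme.
const SEVERITY = { cosmetic: 1, minor: 3, major: 8, critical: 20 };

// Sum the severity weights of a product's known bugs.
function riskScore(bugs) {
  return bugs.reduce((total, s) => total + (SEVERITY[s] ?? 0), 0);
}

// Product A: few bugs, but nasty ones.
const productA = ["critical", "major"];
// Product B: more bugs, all cute.
const productB = ["cosmetic", "cosmetic", "minor", "cosmetic", "minor"];

console.log(riskScore(productA)); // 28
console.log(riskScore(productB)); // 9
```

Counting alone would favour Product A’s two bugs over Product B’s five, yet the weighted score tells the opposite, and more useful, story.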

Is more testing the answer?

It’s not long after you’ve released a new product, or feature, that the call comes in that something is not right. You should have done more testing, yes? Well, maybe. What if you couldn’t do more testing because you just didn’t have the time? In hindsight, should you have factored more time for testing into the planning phase? In order to answer these questions, we need to understand what we are trying to achieve when we test our software. We also need to consider the nature of the projects we are working on, the composition of the development team, and the cost of testing. What are the goals? I opened this post by saying “…the call comes in that something is not right”. I chose these words carefully. By saying that something is not right, what we are actually saying is that the product is not behaving, or does not look, as was intended. It is not great software. When we test our software, we need to go beyond simply ensuring that it doesn’t throw errors, to making sure that it…
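That last point deserves a concrete sketch. The formatPrice function and its intended behaviour below are invented for illustration: the buggy version never throws, so a check that only confirms “no errors” passes it, and only an assertion on the intended behaviour reveals the fault.

```javascript
// Hypothetical example: intended behaviour is 1050 pence -> "£10.50".
function formatPrice(pence) {
  // Bug: drops the fractional pence component entirely.
  return "£" + Math.floor(pence / 100);
}

// A weak check: only verifies that the code runs without throwing.
function runsWithoutError(fn) {
  try { fn(); return true; } catch { return false; }
}

// The weak check passes despite the bug...
console.log(runsWithoutError(() => formatPrice(1050))); // true
// ...while a behavioural assertion catches it.
console.log(formatPrice(1050) === "£10.50"); // false (it returns "£10")
```

“It works” in the sense of “it doesn’t crash” is a far lower bar than “it behaves as intended”, and only tests that assert the latter tell you whether you have great software.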