21 July 2008

Best Practices?

Joel recently wrote about the use of menus in applications, saying that developers shouldn't hide or disable menu items. He argues that we should leave menu items enabled whether or not their actions are currently available.

I have to say, I couldn't disagree more.

Maybe Joel wasn't around when standards for user interface behaviours were being established. I am all for innovation, but changing the behaviour of such a basic interface element this late in the game, without a compelling reason, makes no sense and will only frustrate users.
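For anyone who hasn't wired up a desktop menu, here is a minimal sketch of the established convention I'm defending: grey out an item while its action can't be performed, and re-enable it when it can. I've used Python and Tkinter purely for illustration, and the "Paste" item with its clipboard check is a hypothetical example, not something from Joel's article.

```python
# Minimal sketch of the conventional menu behaviour: the "Paste" item stays
# visible but is disabled whenever the clipboard has nothing to paste.
import tkinter as tk

def update_edit_menu():
    # Check the clipboard each time the menu is about to be shown.
    try:
        clipboard_has_text = bool(root.clipboard_get())
    except tk.TclError:  # raised when the clipboard is empty or non-text
        clipboard_has_text = False
    state = "normal" if clipboard_has_text else "disabled"
    edit_menu.entryconfig("Paste", state=state)

root = tk.Tk()
menubar = tk.Menu(root)
edit_menu = tk.Menu(menubar, tearoff=0)
edit_menu.add_command(label="Paste", command=lambda: None)  # placeholder action
menubar.add_cascade(label="Edit", menu=edit_menu)
root.config(menu=menubar)

# Re-evaluate the item's state just before the menu is posted.
edit_menu.config(postcommand=update_edit_menu)
root.mainloop()
```

The point is that the item never disappears and never silently fails; the user can always see that "Paste" exists, and its greyed-out state tells them it isn't applicable right now.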

It reminds me of one project I worked on where the developer had come up with an innovative interface design. Rather than having the user press and release the left mouse button on a menu item to select it, his application had users left-click to pull down the menu and then right-click on an item to select it. The idea was that this would prevent users from selecting the wrong item if they accidentally released the left mouse button over the wrong one. True story.

Realistically, I think most experienced developers will disregard this particular post from Joel and instantly forgive him, because he makes such a huge contribution in other ways. This is not meant as a personal attack on Joel.

I am writing this post because Joel's article points at something else. There is a tendency within IT organizations to use the words "Standards" or "Best Practices" when they want to enforce conformity and, by definition, rule out any innovation in the area being addressed.

In my business I have been very interested in the use of these words, because my work is about innovation and the words "Standards" and "Best Practices" are by definition the antithesis of innovation. They basically mean "this is how we do it" and "the matter is closed for discussion".

To be clear, I am a big fan of standards, and even of some "best practices", but many of the organizations that use these words also say they want to "encourage innovation". If that's true, then where we set the boundary of what counts as a standard, or what can be called a best practice, becomes critically important.

For example, I am a big fan of having standards around say... the character representations we use. I know that sometimes it's hard to tell a "1" from an "l", and to solve this problem we could come up with an innovative solution. We could choose a different shape for lower case "L" so it doesn't look so much like the number one. Perhaps a small square? That's not being used. So, from now on in all of my programs I could just use a small square instead of a lower case "L". I would explain in the documentation that I have implemented this innovative new feature and before too long my users would get used to it.

Innovation in the area of the characters we use, or the network protocols we use on the internet, or what a dial tone sounds like, or how a basic menu functions, doesn't work. I think the reason it doesn't work is that these are areas where the problem is already solved well enough that the benefit of any innovation is outweighed by the cost of implementing it. Basically, the benefit of the innovation is not high enough to recoup the high cost of switching to it.

As an alternative example, some companies have standards around the operating system or application frameworks they use. Switching from, say, Windows and .NET with an Oracle back end to something like a LAMP stack would be against the "best practices" rules of many IT shops. But when you really think about it, the cost of making such a switch can be very small in a world where applications are delivered via the web and the user is, for the most part, unaware of the delivery platform.

The benefit of moving to an open source server platform can be phenomenal, and for large numbers of users the costs are comparatively small. Ruling out an entire genre of technology simply because of a "best practice" makes little sense when the user essentially can't tell the difference, except, perhaps, for improved performance and reliability.

Here's my best practice: Use standards and best practices only where they foster innovation and generate real measurable value.