Code Renaissance is about building great teams and great software. By exploring best practices, team interactions, design, testing and related skills Code Renaissance strives to help you create the team and codebase that you've always wanted.

Audio: Kent Beck on The Future of Developer Testing

(Recommended Audio - MP3)

This is an excellent pep talk by Kent Beck on Developer Testing (unit testing), how it affects the health of software over time, and how it leads to Healthy Software.

I like the distinction he makes between Quality Software and Healthy Software. Quality Software is software that currently works as designed and lacks bugs. In contrast, Healthy Software is software that responds well to changes over time. It is possible to have Quality Software without developer testing, but Healthy Software relies on it. I've had to maintain a lot of unhealthy software that's falling apart at the seams. Some of it was likely considered quality software at its first deployment, but lack of documentation and lack of unit testing doomed it to failure over time. This is a great introduction if you are interested in getting into unit testing or want to get others you work with on board.

Audio: Here comes another bubble

(Recommended Audio - MP3)

This is a little off topic, but I ran across the song "Here comes another bubble" on a TWIT podcast a few days ago and I think it's hilarious. It's an I.T. parody by the Richter Scales of the song "We didn't start the fire". I finally found a copy of the music video on ValleyWag (it apparently disappeared from YouTube on a Take-Down order). The video makes the song even funnier; I can't imagine anyone in I.T. not getting a chuckle out of this. If you haven't seen it yet then you should. Enjoy.

Well-Factored Code

What precisely is well-factored code? The phrase grabbed my attention in a recent post by Jeff Atwood. It's a familiar concept; at a gut level it shouts at me: clean code! The self-explanatory definition would be: code that has had an acceptable number of refactorings or, perhaps, code that has had all obvious refactorings performed on it. I googled the term to see how it is being used. From that search and, in particular, the discussions in one wiki entry, I have inferred my own definition.

Well-factored code: code that is inherently readable, scalable, and maintainable, but that may not be performance-optimal in certain contexts, platforms, and environments.

To optimize the performance of well-factored code, certain defactorings, such as inlining small methods, combining classes, or rearranging data structures, may be required. In general, though, the consensus is that well-factored code is easier to optimize because bottlenecks are easier to identify and the code is easier to change.
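As a hypothetical illustration (the function and names here are my own invention, not from any of the sources above), here is a well-factored routine next to a defactored version of the same logic, with the small helper inlined into the hot loop:

```python
import math

# Well-factored: the small helper is readable and testable on its own.
def distance(p, q):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def nearest(origin, points):
    """Return the point closest to origin."""
    return min(points, key=lambda p: distance(origin, p))

# Defactored for a hot path: the helper is inlined and the square root
# dropped, since comparing squared distances preserves the ordering.
# Harder to read, but it avoids per-call overhead in a tight loop.
def nearest_fast(origin, points):
    ox, oy = origin
    best, best_d2 = None, float("inf")
    for x, y in points:
        d2 = (x - ox) ** 2 + (y - oy) ** 2
        if d2 < best_d2:
            best, best_d2 = (x, y), d2
    return best
```

The point of the consensus view is that you would profile the well-factored version first, and only perform this kind of inlining where a bottleneck is actually measured.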

Don't think kitchen, think cooking

I was listening to a talk by Robert Kalin from the 2006 IDEA (Information: Design, Experience, Access) conference in Seattle and I came across a saying from architectural design: "Don't think kitchen, think cooking". This is the type of thing I love to latch onto. To restate it: avoid preconceptions by keeping your conceptual domain as large as possible, but no larger. The 'no larger' part is very important, and it is driven home rather well by a long-winded joke my boss pointed me to: "A toaster is not a breakfast food cooker".

I love to cook, though I seldom get the chance, mainly because my wife says I make too much of a mess when I cook. I consider this point irrelevant; since I always clean up my 'mess', what difference does it make how big it is? In any case, when I think kitchen, my mind scans every kitchen I have ever cooked in, takes in the good points, throws out the bad, and arrives at a pretty good idea of what a kitchen should look like. These are preconceptions. When I think cooking, my mind jumps into creative analysis mode and I begin to consider the different aspects of cooking: what is convenient and inconvenient, and what needs to exist to allow me to cook easily.

I once saw a picture in a magazine of an extendable faucet over a stove. This is a great idea: when you need to add water to a pot, you simply extend the faucet rather than carrying the pot to the sink or using a pitcher to carry water to the pot. This little detail is a major convenience that could only have developed by thinking cooking, not kitchen.

There's a shortage of good material on good design, but I just found an excellent book on web usability and design called "Don't Make Me Think" by Steve Krug. I also found the transcript of an interview he gave on usability, if you'd like to see what he's about.

Another excellent book (not web/programming specific) on usability and general good design practices is "The Design of Everyday Things" by Donald A. Norman. The concepts he discusses carry over to programming.

Audio: Discussion Panel on Domain Specific Languages

(Recommended Audio - MP3)

This is an expert discussion panel on Domain Specific Languages (DSLs). The concept of DSLs has been around a long time, and some think it may be close to going mainstream. If you've heard the term and need clarification, or just want an overview of a maturing technology, you may want to give it a listen.

DSLs are not yet mainstream, and I have yet to form a cohesive opinion on them, but the concept interests me and I continue to look into it periodically.
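To give a flavor of the idea, one common form is the internal DSL: domain vocabulary embedded in a general-purpose host language. This is purely an invented sketch, not something from the panel, showing a tiny rule "language" for order discounts built from chained methods:

```python
class Rule:
    """A tiny internal DSL: each rule reads like a domain statement."""

    def __init__(self, name):
        self.name = name
        self.predicate = lambda order: True
        self.discount = 0.0

    def when(self, predicate):
        self.predicate = predicate
        return self  # returning self enables the fluent, sentence-like chaining

    def give_discount(self, percent):
        self.discount = percent
        return self

def apply_rules(order, rules):
    """Total discount percent from every rule that matches the order."""
    return sum(r.discount for r in rules if r.predicate(order))

# The "program" in the DSL reads close to the business requirement:
rules = [
    Rule("bulk").when(lambda o: o["quantity"] >= 10).give_discount(5.0),
    Rule("loyal").when(lambda o: o["customer_years"] > 2).give_discount(2.5),
]

order = {"quantity": 12, "customer_years": 1}
```

Here only the "bulk" rule matches, so the order earns a 5% discount. External DSLs take the idea further with their own parsers and syntax, which is where most of the tooling debate in the panel lives.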

If you'd like additional resources here are some written commentaries on the topic:

Cognitive errors in I.T.

Cognitive errors are starting to be seriously evaluated in the medical profession, where lives depend on proper analysis and effective diagnosis. I see a great need to apply these lessons in the I.T. arena as well. One common error, known as Anchoring, occurs when the answer that immediately comes to mind is the one that is focused on (often to the exclusion of all other possibilities). This often seems to be tied to availability bias, where the likelihood of an answer being correct is judged by how easily it can be brought to mind.

Common causes of Anchoring in I.T. include:

  1. Recent exposure to a solution e.g. "We'll use the umptyfart design pattern. I just read an article that said it is an excellent solution to this problem." Or "I heard Joe was having a similar problem yesterday and he reformatted his hard drive and reinstalled everything and it worked fine. Let's try that."
  2. An unduly strong reliance on prior history e.g. "We're having a problem with the release of the XYZ application. It must be the config file again."
  3. Confusing correlation with causation e.g. "We pushed out the ABC web application last night and now the LMNOP service is down; the new release must have broken it." Or "After (some candidate) was elected the economy got (better/worse); clearly they caused the turn in the economy."

While the above reasoning may increase the likelihood of an answer being correct, it does not prove it to be true. Problems arise when other possibilities are not considered or evidence is not evenly evaluated. As a result, the correct answer may not be found, critical time may be wasted, or unnecessary and even harmful changes may be made. It's good to ask "If it's not this, what else could it be?" and briefly explore each possible cause mentally before deciding on a course of action. It can often pay to quickly eliminate several less likely but plausible causes before investigating a more likely but very time-consuming one. Sometimes you luck out.

Be wary of any possibility that you automatically eliminate, as this can be rooted in a denial bias. As an example, I recently cleaned my wife's car just before we left the house on a date, and discovered shortly afterward that I could not find my phone. It occurred to me (repeatedly) that my phone could be in the garbage bag, but I ignored this suspicion because I knew I would never throw my phone away. Thirty minutes later, after prompting from my wife, I was standing outside by the garbage dialing my cell phone with hers; its characteristic chirp was clearly audible. In my defense, I suspect the phone fell from my shirt pocket into the bag when I bent to get garbage from the floor-board, but my wife insists that I actually threw it away.

Two other common, interrelated cognitive errors are Cherry-picking and Confirmation bias. Cherry-picking is the process of choosing only examples or data that support your position, while confirmation bias is selective thinking in which only items that confirm your conclusion are considered.

An interesting book on cognitive errors in medicine is "How Doctors Think" by Jerome Groopman.