Code Renaissance is about building great teams and great software. By exploring best practices, team interactions, design, testing, and related skills, Code Renaissance strives to help you create the team and codebase that you've always wanted.

Audio: Scott L. Bain on Emergent Design

(Recommended Audio - MP3)

In this webinar Scott L. Bain talks about the concept of Emergent Design and about best practices that reduce risk and waste in software development. This is a solid review of software design principles relayed from a new perspective... I highly recommend that you give it a listen.


  • 00:00 Speaker & company plug
  • 02:04 Speaker's background
  • 03:08 Some questions about design
  • 07:20 Overview of topics to be covered
  • 08:28 Natural flow of software development
  • 28:23 The open/closed principle and design patterns
  • 43:00 Principles and Practices

Adventures in Ajax

My current project has me doing some hand-coded Ajax (by which I mean that I'm not using an Ajax library). There are a couple of interesting findings I'd like to mention to anyone out there who might be doing the same.

As I watched the HTTP requests (courtesy of "Charles") I noticed that IE6 allowed only 2 concurrent Ajax requests and that Firefox allowed a maximum of 3. The impact of this is that if you have several XMLHttpRequest objects processing requests (each object can only process one request at a time), you could have one or more Ajax calls waiting in queue. This might be important if you have slow response times or several different pieces of functionality making Ajax calls.
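Since the browser itself caps how many requests run at once, anything past that limit just sits and waits. If that queueing matters to your app, you can at least make it explicit and observable. Here's a minimal sketch of a client-side queue; the `RequestQueue` name and API are my own invention for illustration, not a library:

```javascript
// Minimal client-side request queue (hypothetical helper): holds Ajax calls
// so no more than maxConcurrent run at once, making the browser's own 2-3
// connection cap explicit instead of silent.
function RequestQueue(maxConcurrent) {
  this.max = maxConcurrent;
  this.active = 0;
  this.pending = [];
}

RequestQueue.prototype.enqueue = function (startRequest) {
  // startRequest is a function(done): it fires the Ajax call and
  // invokes done() from its completion handler.
  this.pending.push(startRequest);
  this._next();
};

RequestQueue.prototype._next = function () {
  if (this.active >= this.max || this.pending.length === 0) return;
  var self = this;
  this.active++;
  this.pending.shift()(function done() {
    self.active--;   // slot freed: start the next waiting request, if any
    self._next();
  });
};
```

Each queued function receives a done callback to call when its XMLHttpRequest completes, which releases the slot for the next waiting request.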

Another thing I discovered is that while the async parameter of the open method does not have to be specified, it actually defaults differently depending on the browser: in IE6 the default is true, but in Firefox the default is false. Not specifying this parameter can lead to Firefox locking up while waiting on requests.
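The safe habit is to always pass the third parameter to open explicitly. A sketch of what that looks like; the `sendAsync` wrapper is a hypothetical name of mine, while the `XMLHttpRequest`/`ActiveXObject` constructors are the standard ones:

```javascript
function sendAsync(url, callback) {
  var xhr = typeof XMLHttpRequest !== "undefined"
    ? new XMLHttpRequest()                      // Firefox, Safari, newer IE
    : new ActiveXObject("Microsoft.XMLHTTP");   // IE6
  // Pass true explicitly: never rely on the browser's default for async.
  xhr.open("GET", url, true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4) callback(xhr);    // 4 = request complete
  };
  xhr.send(null);
}
```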

Finally, make sure that you take slow response times into consideration. In your testing you'll probably have very fast response times, but in production your database server and web server are likely to come under heavy load from time to time. The result is that clicking a button, which previously gave instantaneous results via Ajax, might appear to have no effect at all. Make sure there is some sort of UI feedback telling the user that your application is processing their action/request. Also, slow response times, database timeouts, and other issues may lead to failed requests. Make sure you handle these or they may lock up your application.
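Putting those two concerns together: show the user something as soon as the request goes out, and handle the failure path in the completion handler. A rough sketch, with hypothetical function names of my own choosing:

```javascript
function sendWithFeedback(url, onSuccess, onError) {
  var xhr = typeof XMLHttpRequest !== "undefined"
    ? new XMLHttpRequest()
    : new ActiveXObject("Microsoft.XMLHTTP");
  xhr.open("GET", url, true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState !== 4) return;   // still in flight: leave the "working..." UI up
    if (xhr.status === 200) {
      onSuccess(xhr.responseText);      // normal path: hand the data to the caller
    } else {
      onError(xhr.status);              // timeout, 500, etc.: don't leave the app hung
    }
  };
  // Before sending, flip on whatever "working..." indicator your UI has.
  xhr.send(null);
}
```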

Hope this helps...

The beauty of JSON with Ajax

JSON is a subset of JavaScript that provides a fairly terse, dense data notation. When passed to the client as data it can be deserialized into a JavaScript object with one line of code. In contrast, XML is bulky and requires a fair bit of client-side code to parse and work with. I believe the denser notation and client-side ease of use make JSON an ideal data-transfer format for Ajax.

Let's take a quick look at some data in both XML and JSON formats...

<Table>
  <Rows>
    <Row><Item>1.1</Item><Item>1.2</Item></Row>
    <Row><Item>2.1</Item><Item>2.2</Item></Row>
  </Rows>
</Table>

And the same data as JSON:

{"Rows":[{"Item":["1.1","1.2"]},{"Item":["2.1","2.2"]}]}


The leaner format can cut the size of your data in half, and while you do lose a bit of human readability with JSON, you can always use a tool like Charles to break it out into a tree view when debugging. The real beauty of JSON, however, is the ease of working with it on the client. If you deserialize the data into a variable called DataTable, you can read the first item in the second row like this:

DataTable.Rows[1].Item[0];
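That one line of deserialization, for the record, is just an eval of the response text, since JSON is valid JavaScript. A sketch, assuming the server sends the table as the JSON string below (my own hypothetical encoding of the XML table above):

```javascript
// The JSON string as it might arrive in xhr.responseText.
var responseText = '{"Rows":[{"Item":["1.1","1.2"]},{"Item":["2.1","2.2"]}]}';

// One line: the parentheses make eval treat the text as an expression.
var DataTable = eval("(" + responseText + ")");

// Second row, first item.
var value = DataTable.Rows[1].Item[0];   // "2.1"
```

One caveat: eval runs whatever the server sends, so only do this with responses from a server you trust; a dedicated JSON parser is the safer choice where one is available.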

The trade-off is that you get cleaner, smaller client-side code in exchange for more work getting your data into JSON on the server. Personally, I'd rather have the extra code on the server, where it can be unit tested and easily debugged, than on the client, where it can't.

Some might say that if you aren't using XML then you aren't doing Ajax. I disagree. I think Ajax is actually a misnomer, and so is XMLHttpRequest. XMLHttpRequest doesn't require that your data be in XML and will accept any format you choose. Believe it or not, JavaScript isn't even required to do Ajax; I hear VBScript works with XMLHttpRequest just fine.

I think we've missed the big picture: JavaScript and XML aren't the reason Ajax is such a phenomenal tool... it's the asynchronous HTTP requests it provides that make it so powerful. Also, DOM scripting and DHTML are not Ajax, although they are used quite effectively in conjunction with it to provide the rich features everyone associates with Ajax. The bottom line is that any site that makes asynchronous HTTP requests is using Ajax, and any site that doesn't, isn't.

The irony is that the one technology that actually is required for Ajax, HTTP, isn't even in the name. I think Ajax would be better called Asynchronous HTTP-Request Scripting. Sadly, AHS will never have the marketing appeal of the cool-sounding Ajax, so we're stuck. As for the purists out there, I'm sure you'd prefer I say I'm recommending "asynchronous HTTP request scripting with JavaScript and JSON" rather than talk about using JSON with Ajax. Sorry, no such luck.

You're gonna love working with "Charles"

Charles isn't a person; it's software, and once you get past the goofy name I'm sure you'll like Charles as much as I do.

  • Works on:
    • Windows
    • Mac
    • Linux
  • With:
    • Internet Explorer
    • Firefox
    • Safari

Charles inserts itself between your browser and the internet. It's able to provide:

  • Bandwidth throttling - see how your website will behave at any connection speed slower than your own.
  • HTTP monitoring - monitor data passing between your browser and the Internet.
  • AJAX debugging - monitor Asynchronous requests and responses in XML, JSON, JSON-RPC, and SOAP formats in a simplified tree view.
  • DNS redirection - redirect calls to a website to point elsewhere.
  • Browser cache disabling.
  • Cookie disabling.

I've been using Charles for a few weeks now and have found it very useful. First I throttled the connection down to 56K and noticed that some images applied to tabs through CSS would take a moment to render when changing tabs (clicking changed the CSS class). It was clear that the image wasn't being cached. Checking Charles showed that the HTTP header didn't contain an expires date, so I set a far-future expires date in IIS, which caused IE6 to start caching the image.

I monitored the site's Ajax communications with Charles, which showed that IE6 was caching my Ajax responses. A quick Google search showed that putting a date-time stamp in a query-string parameter would make the URL unique and keep this from happening. Charles confirmed the fix.
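The cache-busting trick is just string work on the URL before the request goes out. A minimal sketch; the helper name is my own:

```javascript
// Append a timestamp so IE6 sees every Ajax URL as unique and
// can't serve a stale cached response.
function addCacheBuster(url) {
  var sep = url.indexOf("?") === -1 ? "?" : "&";   // respect any existing query string
  return url + sep + "ts=" + new Date().getTime();
}
```

For example, addCacheBuster("/getPrices?id=7") yields the original URL with an extra "&ts=" parameter holding the current time in milliseconds.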

As a test, I used the DNS redirection feature to set things up so that when I went to www.[ my website name].com the browser would think it was there, but Charles redirected the browser to localhost. This could be useful if you were working on a site that had URLs hard-coded to the domain, or calls to web services that you wanted redirected elsewhere.

I also disabled caching and throttled my bandwidth to see how the site would load for a first-time visitor on a slow connection.

I'm sure I've got a lot more to discover about Charles, but so far it is a solid, very useful piece of software. The bad news is that it's not free; the good news is that it's affordable: 50 dollars for a single-user license, and only 400 dollars for a corporate (single-site) license. In corporate terms that's a steal. I definitely recommend that you try it free for 30 days; the benefits should make it an easy sell to your boss.