Thursday, April 27, 2006

Test Harness Basics

   A short while back one of my readers asked what a test harness was.  I will answer that in a pair of posts.  This first post will describe the basics of a test harness.  The second post will talk about more advanced features that a test harness might have.

   When you are writing test cases, whether they are unit tests, tests for test-driven development, regression tests, or any other kind, there are some tasks you will find yourself repeating each time.  These include the basics of an application, a mechanism for launching tests, and a way to report results.  You could rebuild these pieces each time you start testing a new feature, or you could write them once and leverage them every time.  A test harness, at its simplest, is just that: a system to handle the elements you'll repeat each time you write a test.  Think of it as scaffolding upon which you will build your tests.

   A basic test harness will contain at least the following elements:  application basics, test launching, and result reporting.  It may also include a graphical user interface, logging, and test case scripting.  It should be noted that harnesses will generally be written for one language or runtime (C/C++, Java, .Net, etc.).  It is hard to write a good harness which will work across runtimes.

   To run a test, the first thing you need is an application.  Each operating system has a different way to write one.  For example, on Windows, if you want any GUI, you need things like a window procedure and a message loop.  This handles the interaction with the operating system.  The harness will include code to start the program, open any required files, select which cases are run, etc.

   The next thing you need is a way to actually launch the tests.  Most of the time a test case is just a function or method.  This function does all of the actual work of the test case.  It calls the API in question, verifies the results, and informs the framework whether the case passed or failed.  The test harness will provide a standardized way for test cases to advertise themselves to the system and an interface by which they will be called.  In the simplest system, test cases in a C/C++ harness might advertise themselves by adding function pointers to a table.  The harness may provide some extra services to the test cases, such as calling them in a specified order or on independent threads.

   The third basic pillar of a test harness is a way to inform the user of which cases pass and which fail.  The harness provides a way for test cases to output messages and to report pass/fail results.  In the most basic harness, this could be just a console window in which test cases print their own messages.  Better than that is a system which automatically displays each test case name and its result.  Usually there is a summary at the end of how many cases passed or failed.  This could be textual or even a big red or green bar.  More advanced systems have built-in logging facilities.  They provide a standardized way for the test cases to output trace messages informing the user of each important call as it is made.  The harness may simply log to a text file, but it may also provide a parsable format like XML or even interact directly with a database for result retention.

   At Microsoft, many groups write their own test harnesses which are specialized to support the unique needs of the organization.  For example, my team uses a harness called Shell98 which has, among other things, support for device enumeration.  Good examples of freely available test harnesses are the xUnit family, like cppUnit, nUnit, and jUnit.  These are designed for unit testing and are not very feature-rich.  I've used cppUnit, which is very basic, and nUnit, which is pretty slick.  The xUnit harnesses do not do any logging, and you do not have control over the order in which tests are run.  They are intended for a user to run and visually inspect the results.  The harness I use allows for scripting and outputs its results to a database.


(fixed the font issue--silly cut and paste from Word)

Tuesday, April 25, 2006

Monthly Podcast Update

It's nearing the end of the month so I thought I'd update you on what podcasts are filling my Zen Nano these days.  Here are the ones I'm listening to regularly.

Major Nelson - Covers the world of the XBox360.  News, reviews, interviews with insiders.

This Week in Tech - Leo Laporte hosts a roundtable discussion about the technology news of the day.  Guests this month included Robert Scoble.

Security Now - Very informative show about the world of computer security.  Most of this month covered the basics of cryptography.

The Dice Tower - Best of the board gaming podcasts.  Reviews, news, and top ten lists.  A good place to discover new games.  The hosts are good natured and fun to listen to.

FLOSS Weekly - Interviews in the world of Open Source Software.  Only 3 episodes old but all very interesting.  Guests so far include Cmdr Taco of /. and Ben Goodger of Firefox.

Honorable mentions (things I listened to but don't make my rotation yet):

This Week in Media - Roundtable discussion about digital media creation.  I've only listened once so far but found it intriguing.  I'll be back for more.

HDTV Podcast - Not a lot of depth but good coverage of the news in the world of HDTV.  The whole show is less than 1/2 hour so it's easy to squeeze in.

You'll notice that Engadget fell off of my list.  They were sporadic at the beginning of the month and there's only so much I can listen to about the Motorola Q.

As always, if you have suggestions, send them my way.  I'm looking for good podcasts on the subject of audio or video production and on programming.  I haven't found any really good development blogs.  Perhaps it is just something you can't do in an audible medium.

Thursday, April 20, 2006

Software Engineering - A Valid Aspiration?

I was listening to an interview with Alistair Cockburn tonight on my way home and thought he had some interesting insights into the subject of whether we should aspire to the concept of "Software Engineering."  He had three basic points which I'll relate:

First, engineering, as it is conceived today, is a rather new invention.  Before 1950 or so, engineering was less math-intensive and more exploratory.  It was about invention over theory.  As Alistair puts it, after World War 2, engineering got "physics envy" and began to move heavily into theory and math.  This isn't necessarily bad (although there are definite side effects), but aspiring to be an engineering trade doesn't necessarily mean the heavy-theory, plan-everything-in-advance model we seem to accept it as meaning.

Second, the term "Software Engineering" doesn't have the origin you probably think it does.  My assumption had always been that someone looked at software development and said, "you know, this looks a lot like engineering."  In fact, that's not what happened.  In 1968 there was a NATO conference on software development and, in order to be "deliberately provocative," the organizers decided to use the term "Software Engineering" to describe the industry.  This was done more to force the industry in a particular direction than to reflect where it really was.

His third point is that software development is not really engineering--especially by our modern view of that term.  It is rather like a cooperative game.  It is exploratory in nature.  It consists of two activities: invention and communication.  He expands on this idea in his book, "Agile Software Development."  The advantage of thinking of it as a game is that we can see that there are not right and wrong moves but rather better and poorer moves.  You don't win a game by following a rigid formula.  Instead, you react to the circumstances.  You need a strategic plan, but you also must remain tactical.  A good place where this becomes useful is the concept of documentation.  It isn't that documentation is intrinsically good (ISO 9001) or bad (XP); rather, we can decide for each project how much documentation is appropriate.  Weigh the costs and the benefits and decide what to do.

Alistair has some good points here.  Similar to my post about why building software isn't like building bridges, he focuses on the naturally exploratory nature of what we do in software.  It is more about making tradeoffs and reacting to the state of the game than about making a priori decisions which will be applied to the situation in a rigid manner.


Part of this interview is typed up on this blog.  The official transcript seems to be here.


Tuesday, April 18, 2006

HD-DVD Launches

The first of the next-generation formats officially launched today.  As of now, you can walk into a store like Best Buy and pick up an HD-DVD player and some discs.  I know at least one person who already has one.  I'm hoping to see what it looks like shortly.

There are two HD-DVD players available, the HD-A1 for about $500 and the HD-XA1 for about $800.  That is pretty cheap for a first-generation format.  If I recall correctly, that's about where DVD players came out.  It might even be a bit cheaper.  Early adopters always pay a lot.

There are a whopping 4 discs available right now:  Serenity, The Last Samurai, Phantom of the Opera, and Million Dollar Baby.  Considering a month ago there was talk of launching the player with no discs available, that's not too bad.  It looks like more discs will trickle out each week for a while.  Apparently the first discs and the first players won't have much of the advanced iHD menuing/interactivity yet.  That may have to wait for the next generation of players.

HD-DVD is shipping and BluRay is delayed.  The Samsung player will come out at the end of June now instead of the end of May.  Sony will apparently be launching movies on May 23, but no players will be available to watch them for a month.

I'm not sure quite what this means.  The videophiles are very excited.  I don't get the feeling that the next-gen-DVD war is slowing down their enthusiasm.  What do you think?  Does being out first help HD-DVD?  Will BluRay's larger cache of movies and studios turn the tide?  Does it matter?

I have a hunch that the winner of the next-gen-DVD war will be downloadable content.  More on that later.


Dangers of Test Automation Revisited

Here is a comment I received via e-mail in reference to my post on the Dangers of Test Automation.  I found it insightful.  The point is that oversights are easy to make and automation won't catch them.  Despite a few claims that this is just a deficiency in the test plan, assuming that a test plan is perfect is a dangerous assumption.  There will always be holes, just like there will always be bugs in the code we write.  Manual exploration gives us insurance that we will cover these holes.  Once again, if all of your testing is automated, you are done finding bugs after your first run.

Here is the comment:

I was reading your post:


... and I was immediately reminded of a scene from Jurassic Park (the book, not the movie, though that was good too)


The park administrator is showing off the high-tech computer systems.  He says "you can search for dinosaurs... I'll just search for all of them... and the big board will show you their current locations."  Sure enough, there they all are.


Later it is revealed that, contrary to design, the dinosaurs are breeding.  Someone asks why the extra dinosaurs didn't show up on the computer search.


Nedry answers (I'm paraphrasing) "The search allows you to enter an 'expected number' of dinosaurs, which helps it to run faster.  It's a feature."


Premature optimization strikes again...


Sunday, April 16, 2006

HDCP Primer

HDCP stands for High-bandwidth Digital Content Protection.  It is the encryption and handshaking protocol used by DVI and HDMI connections to allow transmission of high definition video content between devices.  In the future, this is the protocol that will be used to connect video sources like HD-DVD and BluRay players and set top boxes to your high definition television.  Ed Felten of Freedom to Tinker gives an excellent overview of the technology.  He describes it in easily-accessible language that doesn't require a Ph.D. in mathematics to understand.

For those who want full details, you can find the full specification and other documents here.

Thursday, April 13, 2006

Why I'm Not Buying an HDTV Yet

   Working with video all the time, I should be an obvious owner of an HDTV set.  Alas, I'm not.  I've considered purchasing one many times but just haven't been able to bring myself to pull the trigger yet.  Why not?  Let me explain.

   First, no technology seems quite ready yet.  Each has a pretty substantial downside.  Second, there are still some changes coming that may affect the utility of what we're buying today.

   Let me run through the technologies quickly and explain why I'm not enamored with each one:

  • Rear projection - Almost all of the rear projection units on the market today require expensive bulbs that must be replaced every few years.  I want something that can match my CRT and not require lots of maintenance.
  • DLP [Digital Light Processing] - Most DLPs on the market today are of the single-chip variety and use a spinning color wheel to generate the colors you see on the screen.  The problem is that, when things move quickly, you can sometimes spot a rainbow effect on the edges.  Like many video artifacts, once you see this, it's hard to stop noticing it.  DLP also has the bulb issue.
  • LCD projection [Liquid Crystal Display] - There is a fairly pronounced screen door effect unless you are back far enough.  What I mean by this is that you can pick out the individual pixels.  It is like watching TV on the other side of a screen door.  LCD projection has the bulb issue.
  • LCOS (SXRD/DILA) [Liquid Crystal on Silicon] - My favorite of the projection technologies.  It doesn't have any major shortcomings outside of the bulb issue.  It's still pretty pricey.
  • CRT [Cathode Ray Tube] - This is your traditional TV set.  Great technology but way too heavy in bigger screen sizes.
  • LCD - LCDs have very low contrast ratios and thus the dark areas of the screen all tend to blend together.  Trying to watch a night scene can be painful as all of the detail is lost.
  • Plasma - Plasma has one big drawback:  Burn In.  It is, by all reports, not as bad as it once was but it is still an issue.  Perhaps it isn't when watching TV but if you want to connect a computer or a game console, you have to be really careful.

   So nothing quite does what I want yet.  On top of that, there is talk of redesigning the HDMI connector.  HDCP (the encryption protocol for HDMI and DVI) is still unproven in my mind.  Each time I read an HDTV magazine, I hear about some cable box that won't talk to some TV.  Until this is rolled out on a bigger scale, I still worry that the connections will fail to work.  Finally, 1080p is a potentially interesting format.  Some screens (like SXRD) have 1080p native resolutions but they won't accept a 1080p signal yet.  They take a 1080i signal only.  I want to wait for them to start accepting the big signals.

For a whole lot of detail on the topic of HDTV, check out the AV Science Forum.

Thursday, April 6, 2006

Apple Stories

I've long been fascinated with the history of computing.  I've read many books and watched what few movies/tv shows exist on the subject.  It is for this reason that I found this week's "This Week In Tech" so interesting.  In honor of the 30th anniversary of Apple's founding, it is an interview with several of the people who were there when it all began.  Guests include Steve Wozniak, Andy Hertzfeld, Bill Fernandez, Daniel Kottke, and Randy Wigginton.

Along these same lines, there is an interview with Lee Felsenstein, who ran the Homebrew Computer Club and created the Sol.  I haven't had a chance to listen to this one yet, but it's queued up on my Zen Nano right now.

If this subject interests you, see my Amazon List on the subject.  It needs a little updating but includes most of the books I've found interesting.

Tuesday, April 4, 2006

Pranks at Microsoft

Microsoft, like many technology companies, has been the site for many pranks over the years.  In this video, Larry Osterman and Dave Norris relate many of the pranks that have been seen on the Microsoft campus during their tenure.  On his blog, Larry describes their latest prank involving 20,000 bouncy balls.

On a side note, this interview is filmed in the lounge area just down the hall from my office.