Friday, August 31, 2007

iWozn't Impressed

I just finished listening to the unabridged version of iWoz, the autobiography of Apple cofounder Steve Wozniak.  I was hoping to get an understanding of the early days of Apple.  I've read several books on the subject, but this one is directly from someone who was there.  Alas, I was disappointed.  Less than a quarter of the book covers his time at Apple.  This is a book about Steve Wozniak, not about the time he spent changing the world.  Steve seems more interested in telling us about his pranks and his high school science projects than about the period of his life that made him famous.  He goes into excruciating detail about his early days and then blows through his time at Apple in short order.  He also writes in a style that comes across as arrogant; I don't think he really is, but that's the way the book reads, and he clearly thinks very highly of himself.  Unless you really want to learn about Steve Wozniak the man, skip this book.  If you want to learn about Apple, grab Infinite Loop or Insanely Great (though the latter is more about the Mac than the Apple //).

Wednesday, August 29, 2007

An Explanation for the "Music Slows Down Networking" Issue

Last week there was a small storm on the internet when it was discovered that playing music on Windows Vista caused networking to slow down.  Mind you, the slowdown had little practical effect for most users.  It was most pronounced on gigabit ethernet connections under load.  Still, it was definitely an issue and the blogosphere was talking about it.  Yesterday Larry Osterman of the Windows Sound team and Mark Russinovich, formerly of SysInternals and now at Microsoft, posted about what happened.


The basic explanation is this.  Media playback is isochronous: it has to be fed data every few milliseconds or it will glitch.  Anyone playing sound on Windows XP has probably noticed the sound crackling when the system was doing something intensive.  The audio device isn't being fed for a period of time and runs out of audio to play.  Vista doesn't have that problem.  In Vista there is a component which gives priority to the multimedia threads every 10 milliseconds so they can wake up and provide more data to the audio (and video) devices.  Unfortunately, this scheduler is implemented in such a way that drivers can still pre-empt it.  Some networking drivers were doing too much work all at once, which caused audio to glitch.  To avoid this, the networking system was throttled.  This didn't cause much of an issue for ordinary (100 megabit) networking cards and no issue at all for internet usage (which is usually <= 5 Mbps), but it did cause a noticeable drop in performance for gigabit networking cards.  Mark says the throttling limited the network card to 15 MB/s, whereas a 100 megabit connection can only handle 12.5 MB/s anyway.
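To see why gigabit links suffer while ordinary ones don't, the arithmetic is simple.  Here's a quick sketch; the 10-packets-per-millisecond throttle is the figure commonly quoted from Mark's post, so treat the exact constants as assumptions and see his write-up for the real numbers:

using System;

class ThrottleMath
{
    static void Main()
    {
        // Assumed throttle: 10 packets per 1 ms multimedia timer interval,
        // each a full 1500-byte Ethernet frame.
        double throttledBytesPerSecond = 10 * 1000 * 1500.0;
        Console.WriteLine("Throttled rate: {0} MB/s", throttledBytesPerSecond / 1000000.0);   // 15 MB/s

        // A 100 megabit link tops out at 100,000,000 bits/s = 12.5 MB/s.
        // That's already below the 15 MB/s cap, so only gigabit links notice.
        double fastEthernetBytesPerSecond = 100000000 / 8.0;
        Console.WriteLine("100 Mbit line rate: {0} MB/s", fastEthernetBytesPerSecond / 1000000.0);   // 12.5 MB/s
    }
}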


Go read the posts from Larry and Mark if you want all the gory details.

Monday, August 27, 2007

Penny Arcade Expo

This past weekend Seattle hosted what is reportedly the biggest video game expo in the United States.  That honor used to belong to E3, but they changed their format this year.  Now the title belongs to Penny Arcade Expo (PAX).  Unlike E3, PAX is aimed at consumers, not journalists.  It reportedly had about 30,000 attendees throughout the weekend.  I didn't make it on Friday, but Wil Wheaton's keynote was hailed as amazing.  I went on Saturday with some friends and had a good time.  We attended a few panels: one on the subject of blogs and podcasts, the other an attempt to make a game in an hour.  The first was moderated by Major Nelson and had representatives from Joystiq, Slashdot, Ubisoft, and a gaming podcast I'm not familiar with.  Here is a photo of the panel.  There was some good discussion about the power of blogs and their relationship with the major media.  The second panel was someone from PopCap Games writing a Combat-esque game in about an hour using the PopCap framework.  Writing a game in C++ in an hour is no small feat, and while they fell short, they came pretty close.  It was fun to watch them try.

The remainder of the day was spent perusing the expo floor and hanging out in the tabletop games area.  The show floor contained demos of Mass Effect, Rock Band, SingStar, and numerous other video games.  There were also a good number of board game companies present.  Among them was Fantasy Flight.  They were demoing the new StarCraft board game.  Unfortunately, I couldn't get close enough to hear the description.

My favorite T-shirt of the event:  "Real Gamers Shower."

Here are some pictures if you want to see what it was like.

Saturday, August 25, 2007

Test Automation in Crackdown

An interesting report over at Gamasutra.  Jami Johns from Microsoft Game Studios gave a talk at Gamefest 2007 about how they tested Crackdown.  The thing that strikes me most from the article is the increasing need for automated testing, and thus for test developers who are true programmers.


A few excerpts:


"For Crackdown, the team changed over to a PC application that would rip the data from the game instead of making people report it manually. This generated maps with big red blotches -- making it easy to find the places where performance took a real hit. One block -- where the developers weren't expecting much traffic -- turned out to be a big performance drain. It turned out that the area offered a great vantage from which to look out over the city, so the testers were flocking there."


"The decision was made to create a new tool -- one that could make dealing with the bugs much, much faster. The team came up with a tool called SWARM. This allowed bugs to be tracked easily: each one had a text description and a screenshot, and it could track every bit of relevant data for each bug. Since it was easy to see the bugs, this stopped duplicate bugs and also made them easy to check. Metadata was stored in each bug's jpeg iamge, which meant when that data was dropped into the Crackdown build, they'd teleport to the bug and verify it."


It sounds like they automated not only the bug finding but also the reporting and the reproduction.  Quite a ways from the days of playing a game with a VCR attached and writing your findings on paper.
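The article doesn't say how SWARM serialized its data, but the trick of hiding metadata in a screenshot is easy to sketch.  A JPEG file is a sequence of tagged segments, and decoders ignore application segments they don't recognize, so you can tuck a payload into one.  Everything below (segment choice, key/value format) is hypothetical, not the real SWARM format; strictly speaking JFIF wants its APP0 segment first, so a more careful version would insert after that:

using System;
using System.IO;
using System.Text;

static class BugScreenshot
{
    // Embed bug metadata (say, the player's position) into a JPEG screenshot
    // by inserting a custom APP9 segment right after the SOI marker.
    public static void EmbedMetadata(string path, string metadata)
    {
        byte[] jpeg = File.ReadAllBytes(path);
        if (jpeg.Length < 2 || jpeg[0] != 0xFF || jpeg[1] != 0xD8)
            throw new InvalidDataException("Not a JPEG file.");

        byte[] payload = Encoding.ASCII.GetBytes(metadata);
        if (payload.Length > 65533)
            throw new ArgumentException("Segment payload too large.");
        int segmentLength = payload.Length + 2;           // length field counts itself

        using (FileStream output = new FileStream(path, FileMode.Create))
        {
            output.Write(jpeg, 0, 2);                     // SOI marker (FF D8)
            output.WriteByte(0xFF);                       // marker prefix
            output.WriteByte(0xE9);                       // APP9 segment
            output.WriteByte((byte)(segmentLength >> 8)); // big-endian length
            output.WriteByte((byte)(segmentLength & 0xFF));
            output.Write(payload, 0, payload.Length);
            output.Write(jpeg, 2, jpeg.Length - 2);       // rest of the original file
        }
    }
}

// Usage (hypothetical format):
// BugScreenshot.EmbedMetadata("bug1234.jpg", "pos=120.5,34.2,88.0");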

Friday, August 24, 2007

Why Algorithms Matter

New programmers often don't appreciate the power of algorithms.  They have one criterion for a piece of code:  does it calculate the right answer?  If it does, they go with it.  Every CS program has at least one class on algorithms, but if you are a self-taught programmer, you may not have been exposed to the ideas.  This article stems from a conversation with a friend who is trying to learn programming on his own.  It is a simple introduction to the concept of algorithms and an explanation of why you might care about them.  Most of the time the programs we write deal with small enough data sets and fast enough processors that our choice of algorithm doesn't make a material difference.  With a large enough number of items, however, a poor choice of algorithm can make a tremendous difference.

Here is an example.  When I was first programming, I wrote a Java servlet to access a database of all the DVDs we were testing.  This was very early on and there were only a few hundred DVDs in print.  The servlet had one page which would display each of the DVDs along with some vital information in a table format.  I used simple string functions to build up the HTML describing the table.  With 200 discs, everything worked fine.  We were buying all of the DVDs available at the time, and eventually our database grew to encompass over a thousand discs.  Somewhere along the line, I noticed that bringing up the "all discs" page was taking a really long time--something like 3 minutes to display the page.  I did some investigation and figured out the problem.  Each time I concatenated another piece of text or table markup, the entire string was copied to a new location along with the new information.  The same string was being copied tens of thousands of times.  The solution was to use a StringBuffer object instead of the built-in strings.  StringBuffer uses a better algorithm than String does: it grows a buffer in place instead of copying the whole string on every append.  The time to render the page dropped from 3 minutes to 30 seconds.
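The anecdote above was Java, but the trap exists everywhere.  Here is a rough C# equivalent you can run yourself; StringBuilder is the analogue of Java's StringBuffer, and the timings are illustrative rather than measurements from that old servlet:

using System;
using System.Diagnostics;
using System.Text;

class ConcatDemo
{
    static void Main()
    {
        const int rows = 10000;

        // Naive concatenation: every += copies the entire string so far, O(n^2) overall.
        Stopwatch timer = Stopwatch.StartNew();
        string html = "";
        for (int i = 0; i < rows; i++)
            html += "<tr><td>Disc " + i + "</td></tr>";
        timer.Stop();
        Console.WriteLine("String concatenation: {0} ms", timer.ElapsedMilliseconds);

        // StringBuilder grows a buffer in place: amortized O(1) per append, O(n) overall.
        timer = Stopwatch.StartNew();
        StringBuilder builder = new StringBuilder();
        for (int i = 0; i < rows; i++)
            builder.Append("<tr><td>Disc ").Append(i).Append("</td></tr>");
        string html2 = builder.ToString();
        timer.Stop();
        Console.WriteLine("StringBuilder: {0} ms", timer.ElapsedMilliseconds);

        Console.WriteLine("Same output? {0}", html == html2);
    }
}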

Before beginning, we should define what we mean by algorithm.  An algorithm is a set of instructions for accomplishing a task.  Everything we write in software is an algorithm of some sort, but we usually reserve the term for the parts of our programs that accomplish particularly interesting tasks.  More often than not, the reference is to reusable tasks like searching, sorting, etc.  These algorithms are language and platform agnostic; the concepts are equally valid in virtually all programming languages, on all operating systems and processors.

If we're going to compare the performance characteristics of different algorithms, we'll need some benchmark to measure them against.  We could just run the algorithms and see how long they take, but this only tells us how they behave on that particular data set.  Instead, we use something referred to as "Big-O" notation* to describe the runtime characteristics of an algorithm.  This describes how the behavior of the algorithm (measured in time in all cases in this article) is affected by the size of the data set.  For each value we add, how does the performance react?  If we know how long it takes to do something to a sample set of size 1, we can then compare the time it takes for bigger sample sets.  The notation looks like this:  O(1), O(n), O(n^2), etc.  O(1) indicates that the time is constant; that is, it is unrelated to the size of the input.  O(n) says the time it takes increases linearly with the size of the input.  O(n^2) indicates that the time grows as the square of the number of items.  An algorithm in the class O(n) will run faster than one in the class O(n^2).  For example, if it took 25 instructions to do the work for 1 item, an O(n^2) algorithm would take 250,000 instructions for 100 items.  An O(n) algorithm would take just 2,500.

Let's look at some example algorithms and how they affect running time.

Searching for a Phone Number

Think about looking up someone's phone number in a phone book.  How do you do that?  We could just start on page 1 and look at each entry to see if it is the person we are looking for.  This will take a while.  If we consider just pages, not actual names, we'll have to examine an average of half the pages before we find the number we're looking for.  In the worst case, we'll have to look at every page.  My phone book has 969 pages in the residential listings.  That means it will take us an average of about 485 pages to find the name we want.  If we're looking for Fred Zylstra, it's going to take us 969 pages.  This is an O(n) algorithm.  It is how many programmers search for an item in a list, but it isn't how anyone actually searches a phone book.  Instead, we open to a page in the middle; if what we're looking for is earlier in the alphabet, we turn halfway toward the front of the book, and if it is later, we turn halfway toward the end.  We keep repeating this process, halving the remaining pages each time, until we find the name.  This algorithm will find the name we're looking for by examining at most 10 pages (2^10 = 1024 > 969).  This is O(log2(n)).  Here is a graph of the two functions:

phonebook

The green line is the linear algorithm.  The red line represents the way most people actually search.
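In code, the two approaches look like this.  This is a minimal sketch over an array of names; note that the halving approach only works because the phone book is already sorted:

using System;

static class PhoneBook
{
    // O(n): examine each entry in turn until we find the target.
    public static int LinearSearch(string[] names, string target)
    {
        for (int i = 0; i < names.Length; i++)
            if (names[i] == target) return i;
        return -1;   // not found
    }

    // O(log2 n): halve the remaining range on every comparison.
    // Requires that names[] be sorted.
    public static int BinarySearch(string[] names, string target)
    {
        int low = 0;
        int high = names.Length - 1;
        while (low <= high)
        {
            int mid = (low + high) / 2;   // open the book in the middle
            int comparison = string.CompareOrdinal(names[mid], target);
            if (comparison == 0) return mid;
            if (comparison < 0) low = mid + 1;   // target is later in the alphabet
            else high = mid - 1;                 // target is earlier
        }
        return -1;   // not found
    }
}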

Calculating Fibonacci Numbers

This one is slightly less real-world, but it demonstrates the advantage of a good algorithm well.  Fibonacci numbers are a series of numbers where each number is the sum of the two previous numbers.  Formally, F(n) = F(n-1) + F(n-2) where F(0) = 0 and F(1) = 1.  These numbers get large really quickly.  The 10th number is 55.  The 30th is 832,040.  The 100th is 354,224,848,179,261,915,075.  There are two obvious ways to calculate a Fibonacci number.  The sequence is often used when teaching recursive programming, and the recursive algorithm looks something like this:

UInt64 CalcFibonacciRecursive(UInt64 number)
{
    // Base cases: F(0) = 0 and F(1) = 1.
    if (number == 0) return 0;
    if (number == 1) return 1;

    // Every other call spawns two more calls.
    return CalcFibonacciRecursive(number - 1) + CalcFibonacciRecursive(number - 2);
}

There is also a non-recursive algorithm.  It looks like this:

UInt64 CalcFibonacciIterative(UInt64 number)
{
    if (number == 0) return 0;

    UInt64 value = 1;   // current number, F(i); starts at F(1)
    UInt64 F2 = 0;      // F(i - 2)
    UInt64 F1 = 1;      // F(i - 1)
    for (UInt64 i = 1; i < number; i++)
    {
        value = F2 + F1;   // each number is the sum of the previous two
        F2 = F1;
        F1 = value;
    }
    return value;
}

The recursive solution looks a lot more elegant, but how does it perform?  Let's try calculating F(40).  (I tried higher numbers, but even F(50) takes minutes to calculate recursively.)

Algorithm    Elapsed Time (ms)
---------    -----------------
Recursive    5523
Iterative    <1


The race isn't even close.  Why the big difference?  Let's examine what is going on here.  To calculate the 5th Fibonacci number using the recursive algorithm, we have to calculate F4 and F3.  To calculate F4, we have to calculate F3 and F2.  But wait: we already calculated F3.  Unfortunately, we end up doing it again.  Here is the complete tree of calls for calculating F5.  Notice all of the redundant work.

fibonaccitree

If you look closely, you'll see that this is basically a binary tree.  Each time we increment the number by 1, we roughly double the amount of work we have to do.  The recursive algorithm is therefore O(2^n) (the true growth rate is closer to 1.6^n, but it is exponential either way), whereas the iterative algorithm is O(n).  Here is the graph:

fibonacci
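Incidentally, you don't have to abandon the recursive shape to fix this.  Cache each value the first time it is computed (a technique called memoization) and the redundant branches collapse, making the recursion O(n).  A quick sketch, not part of the timing comparison above:

using System;
using System.Collections.Generic;

static class Fibonacci
{
    static Dictionary<UInt64, UInt64> cache = new Dictionary<UInt64, UInt64>();

    public static UInt64 CalcFibonacciMemoized(UInt64 number)
    {
        if (number == 0) return 0;
        if (number == 1) return 1;

        UInt64 result;
        if (cache.TryGetValue(number, out result))   // seen it before? reuse it
            return result;

        result = CalcFibonacciMemoized(number - 1) + CalcFibonacciMemoized(number - 2);
        cache[number] = result;                      // remember it for next time
        return result;
    }
}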

Summary

As you can see, the choice of algorithm can have a drastic effect on the running time of your program.  This is why it is wise to have some familiarity with algorithms when you are programming.  Determining the Big-O value for an algorithm can be complex.  Luckily, any algorithm book will tell you the classification of an algorithm so you can look it up instead of having to dust off your math skills.  Spend some time studying an algorithm book or just do some web searches when you go to use one.  Pay attention to these values.  They will be important some day.

I tried to make this easy to understand but if something isn't clear, please ask.  I'll be happy to explain.


*Big-O notation as commonly used is actually a misnomer.  What we casually call Big-O is more properly Big-Theta notation.  Big-O describes an upper bound, but that bound doesn't have to be tight.  We could correctly call all of our algorithms O(2^n); while accurate, that doesn't help us much.  What we really want is the tightest bound that still describes the function, and that is Big-Theta.

Thursday, August 23, 2007

My Team Got Me

It's a tradition at Microsoft that when someone leaves for a while, their office is vandalized in one way or another.  Sometimes it's something big, like moving the office into the hall or covering it in Post-it notes.  Other times it's something more subtle.  For me, it was subtle.  I returned from a short trip to California and all seemed well in my office.  Then I heard it: a short, high-pitched squeal, almost as if one of my monitors was not quite in sync.  Seemingly every 15-20 minutes, the same noise, or something similar but shorter, pierced the room.  I tried to localize the sound, but it was difficult because it was so short and came unexpectedly.  After 2 days (during much of which I wasn't in my office), I had it narrowed down to one corner but hadn't found it yet.  I'd looked inside the computers, underneath the desk, in the legs of the desk.  Still nothing.  I knew it was something my team had done because of their reactions: when the sound went off during 1:1s, they ignored it.  Still, I hadn't been able to figure out quite what it was.  This evening they revealed it to me.  I think I would have found it within another day or so, but it was hidden well.  It turns out they had put a small circuit board called an Annoy-a-tron under the leg of my desk.  It puts out a 2 kHz or 12 kHz sound at random intervals.  Well done, guys.  My hat is off to you.

Creating Change

A few articles came across my browser recently.  All deal with the idea of changing an organization.  We all get the feeling now and then that our organization is suboptimal in some way or other.  Sometimes it's just a few things.  Other times it might be almost everything.  Either way, there are times when we say, "If I were the boss, I'd change X."  Most of us aren't the boss, and even when we are, we find that there's another boss above us.  So, if you aren't Bill Gates, how do you change an organization?


Joel gives us some advice including just doing the right thing yourself, getting involved in the process of making process, and making yourself invaluable (so they'll listen to you).


Max at Codesqueeze has some other ideas, including giving others credit, compromising, and educating others.


Jeff at Coding Horror talks about leading by example.  He suggests doing the right thing so that others will follow.


Joel and Max both suggest the idea of viral marketing or getting together a support group.  There are probably others in your organization who feel the same way you do.  Find them.  Then start just doing the right thing.  Others will follow.

Tuesday, August 21, 2007

HD-DVD Wins Over Dreamworks and Paramount

Recently people have begun to declare a victor in the next-gen DVD wars.  Because of the PS3, the number of BluRay players on the market is higher than the number of HD-DVD players, and thus sales of BluRay discs are higher than sales of HD-DVD discs.  From what I understand, the number of standalone HD-DVD players is actually higher than that of standalone BluRay players (i.e., factor the PS3 out of the equation).  Anyway, because of this I'd noticed people starting to declare BluRay the winner.  But now an interesting plot twist:  Paramount and Dreamworks have both just declared that they will exclusively support HD-DVD.  The HD-DVD camp appears to be trying to win the market on cost; their cheapest players are $200 cheaper than the cheapest BluRay players.  Perhaps the strategy is bearing fruit.  Neither format has had massive sales numbers yet and the majority of the market is holding back so far.  We're still in the early adopter phase.  It's still anyone's game.


For the record, I still think that downloadable content will probably be the real winner of this generation but we'll see.  The video content delivery system needs to become user friendly for that to happen.

Monday, August 20, 2007

Tacit Approval Often Isn’t

Most of us have found ourselves in situations where we need someone’s approval to get something done, but we can’t seem to get them to respond.  It would be okay if they said no.  It would be better if they said yes.  We just need an answer yet we can’t get one.  One tactic is to just go ahead and do what you wanted.  This has the tendency to come back and bite us if things go wrong though.  A slightly better tactic is to send mail that says something like this:



On the issue at hand, I recommend taking the following actions.  If I don’t hear from you by such and such a date, I’ll move ahead with my recommendations.


This has the benefit of a paper trail. When things go wrong, you'll be able to point to this mail and say, "You had a chance to voice your opinion and didn't."  I'll call this strategy getting tacit approval.  The approval is implied.


For some types of decisions, this is enough.  I’ve seen it employed well in situations where one party is being obstructionist via delay or where someone has authority but doesn’t really have a stake in the outcome.  It can work well when trying to get architectural approval for your design.  In the situations where tacit approval works well, you are always the active party and you merely need permission to move forward.


There is another situation where this is often employed and almost always to the tune of failure.  These are situations where you are not the active party.  Instead, you need someone else to do the work.  You’ll define what it is, but you are reliant upon their active participation for things to get done.  A good example would be if you are a release manager and need people to do certain work before the product releases.  You may send out mail explaining what is required and asking for comments.  If, however, you hear nothing back, you didn’t just receive approval.  This is true even if you say “If you disagree, you need to object by this date.” 


The problem is often that people are just too busy.  Too much e-mail is sent.  Too many requirements are put forward by too many disparate groups.  If you don't hear anything back, it more likely means the message wasn't received than that it was tacitly approved.  Assuming that silence means approval sets you up for failure.  I've seen this happen.  One group I worked with sent out instructions for how to interact with them; if we didn't like the instructions, we had to disagree by a certain date.  They did this at a time when everyone was busy doing something else. Then, months later, when it came time to finally pay attention to their part of the product, everyone came back with complaints.  No one had actually agreed to the plan; the group just thought we had.

The solution is to seek expressed approval instead.  Sometimes this can be hard.  The first requests for assent fall on seemingly deaf ears.  If you want to make sure your decision sticks, you need to persist.  If someone has not expressly stated that they agree with your proposal, the chances that they will take action to bring it to fruition are somewhere between slim and none.  It is worth the time and effort required to get explicit buy-in when you require the active participation of another party.

Friday, August 17, 2007

XP Machine Can't See Vista Computer - Solution

I'm at my brother-in-law's place and had to troubleshoot a network issue.  I didn't see quite this situation on the web so I'll post it here in case it can help someone else.


The issue was that his laptop (XP) couldn't see his desktop (Vista).  The easy guess was that it was a firewall issue but I looked and the Windows firewall was disabled.  To throw a wrench in the works, my laptop (Vista) could see the desktop.  I surmised then that it couldn't be a firewall issue.  A firewall would block both clients, wouldn't it? 


After some investigation, I noticed something.  When I pinged the desktop from the Vista laptop, it resolved to an IPv6 address.  When I did the same from the XP laptop, it was IPv4.  I then tried pinging the desktop's IPv4 address from the Vista machine, and that failed to get a response too.  Very strange.


The web didn't appear to be any help.  I tried several things, none of which worked, so I won't enumerate them here.  In the end, I found the culprit.  It *was* a firewall issue after all.  The desktop happened to have McAfee SecurityCenter (v7.2) installed, which has its own firewall.  Disabling that firewall fixed the issue.  The data at hand would seem to indicate that this particular firewall only protects the IPv4 stack, however.  Oops.


Hopefully this will help at least one of you solve a similar problem.

Monday, August 13, 2007

History of the Amiga

Ars Technica is running a series on the history of the Amiga.  This is the machine I grew up with.  It was way ahead of its time for graphics and sound; it took many years for the PC (and even Mac) worlds to catch up.  Unfortunately, it was marketed by a less than competent company, and the Amiga eventually died when Commodore went bankrupt, but it had a great run.  The series is ongoing; I'll try to update this post as more parts are posted.


Part 1 - Genesis


Part 2 - The birth of the Amiga


Part 3 - The first prototype


Part 4

Scrum Meetings for Test

A year and a half ago, I talked about how I was running scrum meetings with my team.  Since then, we've refined the process but have consistently held scrums on a regular basis.  Note that I'm not running a full Scrum system with sprints and product backlogs and such, but rather just adopting the scrum meetings from that system.  Currently we have a team of 8 test developers.  We meet twice a week for half an hour.  The format is simple: we go around the room and each person answers three questions:



  1. What did I do since last time?
  2. What will I be doing next?
  3. Is there anything blocking my progress?

Doing this helps me keep the pulse of the team and--more importantly--helps the team keep its own pulse.  It also encourages the team to act as a team.  It is easy in software to put yourself into a silo.  You have a task and you disappear into an office for a few weeks to get it done.  You might talk to your manager about it, but you don't talk to people outside your dependency list.  The disadvantage of this approach is that you don't get help from others.  In a scrum meeting, everyone learns what everyone else is doing.  If someone has experience in something someone else is struggling with, they offer their assistance.  In this way, the team starts supporting itself and the overall output increases.


Along the way, we did things wrong.  We learned from our mistakes.  Here is at least some of that knowledge:



  • Scrum is disruptive.  Programming is a matter of building up a mental map of the problem and then writing down the solution.  Once someone has this map built up, they can work efficiently.  Having to switch to something else is akin to swapping out the pages of the map.  Starting back up again requires paging everything back in, which is slow.  Unfortunately, the human backing store isn't always stable, and some paged-out data gets lost.

  • Don't run scrum too often.  During a time when you are burning down bugs, meeting daily can be useful.  The rest of the time, meeting daily is too often: there isn't enough new to report and, worse, it becomes disruptive.  Perhaps someone who has done daily scrum during the development phase can explain how this is avoided.

  • Scrum can't be seen as judgmental.  I found that without some calibration, team members felt that they were being judged by their progress.  If they didn't have something new to report, they felt it would be held against them.  Because of this, they didn't want to show up.  The solution was making it very clear that scrum meetings were all about status.  Being open is much more important than the level of productivity any individual was able to demonstrate.  The purpose isn't to take notes for the next review.  Being explicit about this helped.

  • Don't get bogged down in details.  The natural tendency of engineers when faced with a problem is to solve it.  This is good, but scrum isn't the place for solving problems.  It is the place for surfacing them.  Solutions should be derived outside the meeting.  Keep the meetings to their scheduled time limits.  Don't allow discussions to get into too many details.  Instead, take a note and have a followup discussion later.

Friday, August 10, 2007

Another DRM Domino Falls

First it was EMI announcing that they would sell non-DRM'd music on iTunes.  Later they announced they would sell through Amazon as well.  Now the next RIAA member appears to be taking the plunge.  Universal is going to be selling most of their catalog in an unprotected format through outlets like Amazon and WalMart.  Curiously, they won't be doing so via iTunes.  Universal is only doing this temporarily and may revert to their current behavior in January.  Still, it shows that they are not pleased with the sales of digital music and are looking for alternatives.  That's good news for us consumers.

Thursday, August 9, 2007

Summer Vacation Here I Come

I just turned in the project for my summer class.  If you want to take a look, you can find it here.  That means I'm now on summer vacation.  I liked the class but it's nice to have it over with.  Now I can get to some of that reading I didn't have time for.  First up is On the Edge which is the story of Commodore.  I cut my teeth on the C= 128 and grew up with the Amiga.  If it's a good read, I'll let you know.  Of course, my next class starts on August 22nd so it's a pretty short summer.  I better make the most of it.

Windows Live SkyDrive Is In Open Beta

Windows Live SkyDrive is now open to the public in beta form.  From what I can tell, it is basically 500 megs of drive space on the web that you can use to store personal documents or things you want to share.  This isn't the first time something like this has been done; I can think of older services like Xdrive.  But it is the first time I recall a major player doing it.  It also has very slick integration with the system: with the installation of one ActiveX control, you can drag and drop things between your computer and the web.  Quite cool.

Wednesday, August 8, 2007

Vista Performance Update

We just released two update packages which improve the performance and compatibility of Vista.  As far as I can tell, they aren't on Windows Update yet, so you'll have to go get them manually.  The most significant fix in my mind is the improvement to file copies from the Vista shell.  Anyone who has tried to copy even a small amount of data in Vista knows how painfully slow it can be.  These updates are reported to fix that.  I'm installing them now.  Here is Ars Technica's writeup on them.


The performance update is here.


The compatibility update is here.



Thursday, August 2, 2007

New Blade Runner Coming!

25 years after Ridley Scott's masterpiece was first released, another version is nearly upon us.  Blade Runner:  The Final Cut is coming December 18.  What I find most cool about this is not the new cut, but rather the original.  When it was released in 1982, the movie had a voiceover by Harrison Ford.  When it was re-cut in 1992 as the Director's Cut, the voiceover was left out.  Many disagree with me, but I actually liked the voiceover and would prefer that it be there.  Alas, the only version to have made it to DVD has been the Director's Cut.  Now, along with this new cut, there will be a collector's edition DVD set which includes the original cut as well.  Looks like I'll be buying this movie one more time.  I'm crossing my fingers that the new cut will be accompanied by some sort of limited theatrical release; it's been a long time since I saw Blade Runner on the big screen.  There are a few video clips from the new cut here.

Wednesday, August 1, 2007

Dilbert On Software Testing

Two recent Dilbert comic strips are a picture of how things sometimes work in our business.

In the first, Dilbert complains that funding for test automation has been eliminated.  The PHB responds that Dilbert should write some himself.  Because Dilbert is already paid, the work is "free."

In the second, Dilbert complains that he can't get the work done by the arbitrary deadline.  The PHB tells him to "Try working smarter, not harder, with a sense of urgency, and a bias for action."

The Comfort of Unit Tests

Working on my class project this summer, I decided to bite the bullet and do full unit testing.  I've done unit testing before, but only in small amounts.  This time I'm doing something akin to test-driven development: I'm writing the tests very early and for everything.  Not necessarily first, but in very close proximity to when I write the code.  At first it felt painful.  You can't code as fast.  It sucks having to stop and write tests.  Then I started making changes to the design.  This is when it pays to have unit tests.  It was great to make a change, press the "run tests" button, and look for the green bar to appear.  If it didn't appear, I could easily go fix the issues and continue on.  I didn't have to worry about breaking random things with my changes.  Instead, if the tests passed, I could be confident that everything was still working.  Unit tests are a little like a security blanket: just having them makes you feel comfortable.  In the end, I find myself more willing to make changes and to refactor the code.
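If you've never written one, a unit test is just a small method that sets up an object, exercises one behavior, and asserts on the result.  Here's a minimal sketch in NUnit style; the Cart class is hypothetical, and the exact framework doesn't matter since the shape is the same everywhere:

using NUnit.Framework;

// Hypothetical class under test -- stands in for whatever your project builds.
public class Cart
{
    private decimal total;
    public void Add(string name, decimal price) { total += price; }
    public decimal Total { get { return total; } }
}

[TestFixture]
public class CartTests
{
    [Test]
    public void Total_SumsLineItems()
    {
        Cart cart = new Cart();
        cart.Add("DVD", 14.99m);
        cart.Add("Book", 7.50m);

        // Green bar if this still holds after any refactoring.
        Assert.AreEqual(22.49m, cart.Total);
    }
}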

One of my partners in this project has written some code that I will end up helping out with.  He has no tests for his area, and it's a lot scarier going in there to make changes.  After doing some work, I'll have no peace of mind that I haven't subtly broken something.  This points out another benefit of unit tests: they make life a lot easier for the next person.  It's not too hard to keep all of your own code in your head and know what effects changes are likely to have.  It's very difficult to do the same with someone else's code.  I want the ability to make what looks like an obvious change without having to read every piece of someone else's code first.  Creating unit tests makes life a lot easier for whoever has to maintain the code later.

This brings up a use for unit tests that is mentioned in the literature but is not often considered in the real world.  When you are asked to change "scary" code that doesn't have any unit tests, consider spending the first part of your schedule writing those tests.  Not tests for the new code, but tests for the code as it originally exists.  Once you have a comprehensive suite of tests, you can make changes with peace of mind.
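Tests written this way are sometimes called characterization tests: they assert whatever the code actually does today, warts and all, so that any change you introduce shows up as a red bar.  A hypothetical example:

using NUnit.Framework;

// Hypothetical legacy code we need to modify but don't fully understand.
public static class LegacyFormatter
{
    public static string Format(string input)
    {
        // Imagine 200 lines of crusty string munging here.
        return input.Trim().ToUpper();
    }
}

[TestFixture]
public class LegacyFormatterCharacterizationTests
{
    [Test]
    public void Format_CurrentBehaviorIsPreserved()
    {
        // The expected value is whatever the code returned when we ran it,
        // not what a spec says it should return.
        Assert.AreEqual("HELLO", LegacyFormatter.Format("  hello "));
    }
}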