Saturday, December 29, 2007

Just a Geek

I just finished reading Wil Wheaton's "Just a Geek."  It recounts his struggles after leaving Star Trek.  Today Wil Wheaton is a prominent geek: he has written three books, runs a popular blog, and was the keynote speaker at PAX 2007.  However, for the fifteen years between leaving Star Trek: The Next Generation and today, he really struggled.  This book is a look into his mind during that tumultuous period.  He was a has-been who couldn't get work as an actor.  He was a husband and a dad who couldn't bring in enough money to pay all the bills.  He was struggling with who he was and with his decision to leave Star Trek before it was over.

The book is basically a collection of blog entries from WilWheaton.net and his previous site.  However, unlike some books that are mere collections of blog entries, there is a lot of additional context around the entries that you'll only find in this book. 

Wil holds nothing back in his descriptions of what he was going through.  He had it rough for a while.  His style and openness make the reader care about him as a person.  This isn't a book to read to get all the dirt on his life.  Rather, it is a book to read to understand Wil Wheaton the man.  It is an inspiring read: to see him overcome his doubts and fears, to watch him brought to his knees and admit defeat, only to renew himself for victory on a new front.  One cannot help but be inspired by his story.  I find myself looking forward to reading his other material.  I read his current blog, but only irregularly; I got tired of reading about his poker games.  After reading this book, it will go back on my regular reading list, though.  It looks like poker is taking a lesser role once again.

Has anyone read his newest book, "The Happiest Days of Our Lives" yet?  Is it any good?

Friday, December 28, 2007

On the Edge

I started On the Edge:  The Spectacular Rise and Fall of Commodore this summer but had to put it on hold as I went back to class.  Now that class is done, I have a few weeks to read what I want and finishing this was my first order of business.

On the Edge tells the tale of the rise and fall of one of the great American computer companies:  Commodore.  You've seen me refer many times on this blog to stories about Commodore and the Amiga.  My first computer was a Commodore 128 and I spent most of high school and college on an Amiga.  While the C= 128 was fun, the Amiga was just amazing.  It was so far ahead of its time that it took the PC nearly a decade to catch up.

This book recounts the brilliant engineering and terrible management that characterized Commodore throughout its history.  Apple had Steve Jobs.  Microsoft had Bill Gates.  Commodore never had that great leader.  It had leaders, but none of them were great.  They never understood the market.  It was Jack Tramiel's company during the Commodore 64 days.  He had something with the PET, the VIC-20, and the Commodore 64, but to him a computer was never more than a commodity to sell, and he failed to understand how to leverage his great hardware into something bigger.  He cut R&D funding.  He sacrificed too much on the altar of cost reduction.  He fired nearly everyone who did the best work.  Chuck Peddle created the 6502 and the PET.  He was the leader of the engineering group at Commodore and had great vision for computers.  Tramiel saw him as a threat and fired him.

I didn't realize the sales or the opportunities Commodore had.  Despite what we've been taught to believe, the Apple II didn't start off too well.  Commodore and Radio Shack both outsold it in substantial numbers.  The Commodore 64 not only used the 6502 processor, but Commodore created and owned it.  The same 6502 that was at the heart of the Atari and Apple computers and even the NES.  They had their destiny in their own hands.  They could have created a 16-bit version or at least made it faster.  Instead, they fired all the staff responsible for creating it and lost a great opportunity.

Around the time the Amiga was acquired, Tramiel himself was fired by Irving Gould, the financier of the company.  Gould wouldn't keep management in place long enough to let a real strategy be executed. He too felt threatened by those who were his biggest assets.

Brian Bagnall does an excellent job chronicling Commodore's history.  The book seems to be based largely on the recollections of people like Chuck Peddle and Dave Haynie but includes a myriad of other sources.  The book follows the personalities rather than the events.  In this way, the reader comes to know these men and can feel for them as they are jerked around by management.

As someone who grew up on Commodore's machines and who faithfully read every Dave Haynie post on FidoNet for years, I found it painful to watch the company I knew and loved die.  It was painful when the Amiga died in 1994.  It was painful to relive those days reading this book.  I enjoyed it though.  If you are one of those who drank the Amiga Kool-Aid during its decade-long run, grab this book.

The book is also insightful for those of us working in the technology industry today.  Commodore died not because it couldn't create competitive products, but because it made bad decisions.  Bad non-technical decisions.  The moral of the story:  it doesn't matter how cool the technology or how good the engineers.  If a company has poor management, it will fail in the long run.  Something to consider before your next job interview.

Thursday, December 27, 2007

First EMI, Then Universal, Now Warner...

Apparently Warner Music just announced that it is releasing all of its tracks DRM-free.  That makes three of the big four giving the heave-ho to DRM.  Sony is now the lone holdout against the future.  How long until they give in to the inevitable?  Next up, the movie industry?  We can only hope.

Wednesday, December 26, 2007

Encapsulate What Varies

It took a lot longer than I expected, but this is the first installment of my Design Principles To Live By series:  Encapsulate Variation.  The series is a quick tour through the principles behind the design patterns.  Following these will allow you to make the "right" choice in most situations.  As with everything, there are exceptions to the rules.  These are not set in stone.  Violating them is okay, but only if you understand why doing so is better than following them.

Encapsulate Variation

A test of good software design is how well it can deal with future change.  As the cliche truthfully claims, the only constant is change.  Inevitably, any piece of software that is in use will be asked to change.  Business needs will evolve, the problem space will be better understood, and so on.  Whatever the reason, the software will need to change.  A good design will allow for that change without too much work.  A bad design will be very hard to modify.

While it is nearly impossible to predict the specifics of future requirements, it is much easier to understand what is likely to change.  When designing software, look for the portions most likely to change and prepare them for future expansion by shielding the rest of the program from that change.  Hide the potential variation behind an interface.  Then, when the implementation changes, software written to the interface doesn't need to change.  This is called encapsulating variation.

Let's look at an example.  Say you are writing a paint program.  For your first version, you choose to handle only .bmp files because there is no compression and they are easy to load.  You know that if the program becomes popular, you'll want to load other formats like .jpg, .gif, .png, etc.  The naive way to implement the loading of a bitmap is to write some functions that do just that: load the bitmap file into your internal representation of an image.  If you are using an API to load them, you might even be tempted to put the API calls directly in the Open button handler.  Doing either will make life harder later.  Every place that has to load the files (the button handler, the file recovery routine, the recently-used menu selections, etc.) will have to change when you add support for JPEGs and portable network graphics.
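To make the pain concrete, here is a minimal C++ sketch of the naive approach.  Every name in it (Image, LoadBmpPixels, the two handlers) is illustrative, invented for this post rather than taken from any real paint program:

```cpp
#include <string>
#include <vector>

struct Image {
    int width = 0;
    int height = 0;
    std::vector<unsigned char> pixels;
};

// Illustrative stand-in for a real BMP decoder.
Image LoadBmpPixels(const std::string& path);

// The BMP-specific call is baked into every call site.
void OnOpenButtonClicked(const std::string& path) {
    Image img = LoadBmpPixels(path);  // format knowledge leaks in here...
    // ... hand img to the canvas ...
}

void ReloadRecentFile(const std::string& path) {
    Image img = LoadBmpPixels(path);  // ...and here, and everywhere else
    // ... hand img to the canvas ...
}
```

Adding JPEG support now means hunting down every one of these functions.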

A better solution would be to create an interface, IImageLoader, and derive BMPLoader from it.  Then all code that handles loading files will call methods on IImageLoader and won't care (or know) about the specifics of the type of image being loaded.  Adding JPEGLoader and PNGLoader will require changing much less code.  If done right, changes will be isolated to just one place.
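Here is what that might look like in C++, again as an illustrative sketch (an abstract base class plays the role of the interface, and the actual BMP parsing is elided):

```cpp
#include <stdexcept>
#include <string>
#include <vector>

struct Image {
    int width = 0;
    int height = 0;
    std::vector<unsigned char> pixels;
};

// The stable interface the rest of the program is written against.
class IImageLoader {
public:
    virtual ~IImageLoader() = default;
    virtual bool CanLoad(const std::string& path) const = 0;
    virtual Image Load(const std::string& path) = 0;
};

class BMPLoader : public IImageLoader {
public:
    bool CanLoad(const std::string& path) const override {
        return path.size() >= 4 &&
               path.compare(path.size() - 4, 4, ".bmp") == 0;
    }
    Image Load(const std::string& path) override {
        Image img;
        // ... parse BMP headers and pixel data into img here ...
        return img;
    }
};

// Every call site (Open button, file recovery, recent-files menu) goes
// through this; adding JPEGLoader or PNGLoader later never touches it.
Image OpenImage(IImageLoader& loader, const std::string& path) {
    if (!loader.CanLoad(path)) {
        throw std::runtime_error("unsupported format: " + path);
    }
    return loader.Load(path);
}
```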

The point of this principle is to look ahead a little.  See what is likely to vary, and plan for it.  Don't plan for it by writing the handlers for JPEG.  Maybe HDPhoto will have taken over the world by then.  Rather, ensure that those things most likely to vary are encapsulated and therefore hidden from the rest of the program.

Tuesday, December 25, 2007

Let It Snow!

Here's my white Christmas:

[Photo: snow_small]

Merry Christmas To All!

Merry Christmas everyone.  I hope you are all able to spend some good time with family and friends.  I'm off to see what Santa brought me.


[2 hours later]  We finished opening presents and it started to snow!  Not a little snow, but a lot of snow.  Large flakes fill the air.  A white Christmas in Washington.  Uncommon but very cool.

Wednesday, December 19, 2007

What Is Test Automation?

I talk about test automation a lot, but I don't know that I've ever defined it.  A reader recently wrote in and asked what exactly it was.  I suppose that means I should give a better explanation.

Long ago in a galaxy far, far away, testers were computer-savvy non-programmers.  Their job was to use the product before customers did.  In doing so, they could find the bugs, report them to the developers, and get them fixed.  This was a happy world, but it couldn't last.  Eventually companies started shipping SDKs (Software Development Kits), full of programming primitives for use by other programmers.  There were no buttons to click.  No input boxes to type the number 1 and the letter Q into.  How was a non-programmer supposed to test these?  Also, as companies shipped larger and larger products, and these products built upon previous products, the number of buttons that needed pushing and input boxes that needed input grew and grew.  Trying to test these was like running on a treadmill turned up to 11.  The cost of testing grew, as did the amount of time developers had to wait for the testers to give the green light to ship a product.  Something had to be done.

The solution:  Test Automation.

Test automation is simply an automated way of doing what testers were doing before:  a series of programs that call APIs or push buttons and then programmatically determine whether the right action took place.

In its simplest form, test automation is just unit tests.  Call an API, make sure you get the right return result or that no exception is thrown.  However, the real world requires much more complex testing than that.  A return result is insufficient to determine true success.  A function saying it succeeded just means it isn't aware that it failed.  That's a good first step, but it is sort of like the check engine light not being lit in the car.  If there is an awful knocking sound coming from under the hood, it still isn't a good idea to drive.  Likewise, it is important to use more than just the return value to verify success.  Most functions have a purpose.  That may be to sort a list, populate a database, or display a picture on the screen.  Good test automation will independently verify that this purpose was fulfilled.
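As a toy illustration (mine, not from any real test suite), here is what "verify the purpose" looks like for the sorting example in C++:

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Function under test: sorts in place, returns false on bad input.
bool SortAscending(std::vector<int>* values) {
    if (values == nullptr) {
        return false;
    }
    std::sort(values->begin(), values->end());
    return true;
}

int main() {
    std::vector<int> values = {5, 3, 9, 1};

    // The weak check: the function claims it succeeded.
    assert(SortAscending(&values));

    // The real check: independently verify the purpose was fulfilled.
    // The list is actually in order, and nothing was lost along the way.
    assert(std::is_sorted(values.begin(), values.end()));
    assert(values.size() == 4);

    return 0;
}
```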

Other advanced forms of test automation include measuring performance, stressing the system by executing functionality under load, and what we call "end to end" testing.  While unit tests and API tests treat methods and modules as discrete pieces and test them in isolation, end to end testing tries to simulate the experience of the user.  This means pressing the buttons in Windows Media Player to cause a DVD to play and then verifying that it is playing.  Sometimes this can be the most challenging part of testing.

Here's an example of something we had to automate.  Think about how you might approach this.  Windows Vista offers per-application volume.  It is possible to turn down the volume on your World of Warcraft game while leaving Windows Media Player playing loud.  To do this, right-click on the speaker icon in the lower-right-hand corner of your screen and select "Open Volume Mixer."  Moving the slider for an application down should cause its volume to attenuate (get quieter).  Testing this manually is easy.  Just play a sound, lower the volume, and listen.  Now try automating it.
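For the curious, here is a rough sketch of one way you could approach it using the Vista audio APIs (WASAPI): play a sound, read the endpoint's peak meter, attenuate the session, and read the meter again.  This is my simplified illustration, not our team's actual harness.  It assumes the test process itself is the application playing audio and that nothing else is rendering at the time; the "SystemExclamation" alias, the 0.1 volume level, and the 50% threshold are all arbitrary choices, and almost all error handling is omitted:

```cpp
#include <windows.h>
#include <mmsystem.h>       // PlaySound
#include <mmdeviceapi.h>    // IMMDeviceEnumerator, IMMDevice
#include <endpointvolume.h> // IAudioMeterInformation
#include <audiopolicy.h>    // IAudioSessionManager, ISimpleAudioVolume
#include <cassert>

#pragma comment(lib, "winmm.lib")
#pragma comment(lib, "ole32.lib")

int main() {
    CoInitialize(NULL);

    // Default render device...
    IMMDeviceEnumerator* enumerator = NULL;
    CoCreateInstance(__uuidof(MMDeviceEnumerator), NULL, CLSCTX_ALL,
                     __uuidof(IMMDeviceEnumerator), (void**)&enumerator);
    IMMDevice* device = NULL;
    enumerator->GetDefaultAudioEndpoint(eRender, eConsole, &device);

    // ...its peak meter, and this process's per-session volume control.
    IAudioMeterInformation* meter = NULL;
    device->Activate(__uuidof(IAudioMeterInformation), CLSCTX_ALL, NULL,
                     (void**)&meter);
    IAudioSessionManager* manager = NULL;
    device->Activate(__uuidof(IAudioSessionManager), CLSCTX_ALL, NULL,
                     (void**)&manager);
    ISimpleAudioVolume* volume = NULL;
    manager->GetSimpleAudioVolume(NULL, FALSE, &volume);

    // Play a known sound from this process so it owns an audio session.
    PlaySound(TEXT("SystemExclamation"), NULL,
              SND_ALIAS | SND_ASYNC | SND_LOOP);
    Sleep(500);
    float loudPeak = 0.0f;
    meter->GetPeakValue(&loudPeak);

    // Attenuate the session, just as the Volume Mixer slider would.
    volume->SetMasterVolume(0.1f, NULL);
    Sleep(500);
    float quietPeak = 0.0f;
    meter->GetPeakValue(&quietPeak);

    PlaySound(NULL, NULL, 0);  // stop playback

    // The attenuated peak should be well below the original.
    assert(quietPeak < loudPeak * 0.5f);

    volume->Release();
    manager->Release();
    meter->Release();
    device->Release();
    enumerator->Release();
    CoUninitialize();
    return 0;
}
```

In real test code you would want a deterministic test signal and tolerances measured in decibels rather than a raw peak comparison, but the shape is the same: drive the control programmatically, then verify the effect programmatically.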

Tuesday, December 18, 2007

Welcome Matthew van Eerde to the Blogosphere

One of my team members, Matthew van Eerde, just joined the blog world.  Check out his inaugural post.

Monday, December 17, 2007

Vista SP1 Release Candidate Available to the Public

Vista SP1 RC1 has just been released for public consumption.  If you want to try it out, you can do so here.  I'm running it on most of my machines without incident, including my Media Center at home.  So, from my few data points, it seems quite stable.  Don't expect any major changes in functionality.  This isn't XP Service Pack 2.  It's more like a traditional service pack:  lots of bug fixes, but not a vehicle for big new features.  If you want to read about it, check out Paul Thurrott's review.  Remember, this is the release candidate, not the final build.  If you install it, you'll have to uninstall it and upgrade to the real one when the final is released.

Thursday, December 6, 2007

Dynamic Range and Color Spaces

Bill Crow, best known for his work on HDPhoto/JPEG-XR, has a great post about dynamic range and color spaces.  If you are into photography or video, understanding this is important.  As we try to aggregate video from more and more sources onto varying display media, color science is becoming ever more important.  Bill gives a great introduction to the subject.  If you want to know more, there is a great book called Digital Video and HDTV: Algorithms and Interfaces by Charles Poynton that covers all of this in great depth.