Wednesday, February 28, 2007

Teaching My Son To Program

A few months back I read the article "Why Johnny can't code" by David Brin.  He talks about the trials he had teaching his son BASIC.  Just yesterday, I came across this article by Nat Torkington entitled "Why Johnny Can't Program."  It describes his adventures teaching programming to youngsters.  He didn't use BASIC but rather Logo and Lego Mindstorms.  The comments there are a good source of other suggestions.  Some suggested languages aimed at kids like KPL or Squeak.  Others suggested that Ruby or Python were the way to go.  I suspect this latter group doesn't have young children.

My son is almost seven and insists that he's going to work at Microsoft when he grows up.  Makes a father proud.  He enjoys math, and for a while I've thought about how to introduce him to the concept of programming.  I considered BASIC, but after reading Nat's article, I thought better of it.  "Hello World" probably doesn't appeal to the average 7-year-old.  Something with visuals might be more interesting to start with.  Many of the suggestions were too complex for a first language.

I decided to start small and settled on Logo as my introductory language of choice.  Logo also happens to be the first language I used back in 4th grade.  I recall it being taught without much context, but I enjoyed it anyway.  I thought I'd give it a shot.  After some looking around, I chose the free MSWLogo.

Tonight I introduced it to him.  He picked it up quickly and really enjoys it so far.  We'll see how long it holds his attention.  I suspect we'll have to move on from Logo before too long.  If things go well, I'll return to this topic in the future.

If you have suggestions on language or techniques that have worked for you, please send me mail or leave a comment.

Tuesday, February 27, 2007

Programming, Bridges, and ... the Halting Problem?

Jt Gleason contends that building software is not like building bridges because of the halting problem.  He describes a situation where you build bridges but random things can go wrong.  The bridge works fine for a VW but not for a Volvo.  Sometimes two cars cross the bridge and end up at different destinations.  Is this really what programming is like?  Sometimes it is.  The advent of memory protection helped a little bit but it is still possible for something like a malformed string copy to cause a program to crash minutes later in totally unrelated code.  From the perspective of a programmer, this can seem just about as mystifying as having a bridge work for one car but not two.  Add multiple threads to the mix and things can get very strange.


Jt seems to blame this apparent randomness, and the difficulty it causes programmers, on the halting problem.  The halting problem says that no program can determine, for every possible program, whether it will eventually finish or loop forever.  Jt says that because of this "we can never be certain of any result about any computation."  That's not really true.  The halting problem doesn't say we can never know whether a given program will end, but rather that we can't decide it for all programs.  In constrained situations, we can most certainly tell deterministically what the behavior will be.
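
To see why, here is a minimal sketch (my own illustration, in Python) of the classic argument: if a universal halts() checker existed, we could write a program that contradicts it.

    # Sketch only: assume someone hands us a universal checker
    #   halts(f, arg) -> True if f(arg) eventually stops, False if it loops forever.
    # 'halts' is the assumed, hypothetical function; it is never actually defined.

    def paradox(f):
        if halts(f, f):      # would f halt when fed its own code?
            while True:      # ...then deliberately loop forever
                pass
        else:
            return           # ...otherwise stop immediately

    # Now ask: does paradox(paradox) halt?
    # If halts(paradox, paradox) returns True, paradox loops forever -- contradiction.
    # If it returns False, paradox returns at once -- contradiction again.
    # So no universal halts() can exist.  For specific, constrained programs,
    # though, we can often reason out the answer just fine.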


There is some truth here though.  The more complex the program, the more nondeterministic it appears to the people working on it and the harder it is to predict the outcome.  When a program is small, we can keep it in our heads and walk down each path.  As it gets bigger, the number of paths expands exponentially and the length of those paths goes up; a routine with just twenty independent branches already has over a million possible paths.  In short, it becomes impossible to fully comprehend.


This has implications for unit testing.  Unit tests can verify that small pieces of functionality work as intended.  This is like proving the halting of a small program.  However, when we take a bunch of proven smaller pieces and put them together, the end result is not necessarily what we intended.  There can be emergent behaviors we would have thought impossible.  The halting problem begins to rear its ugly head.
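
Here is a toy example (mine, not Jt's) of what that looks like in practice: each piece below passes its own narrow unit test, yet the composition still fails.

    def average(values):
        # Unit-tested against non-empty lists: correct.
        return sum(values) / len(values)

    def recent_scores(scores, cutoff):
        # Unit-tested: correctly keeps only scores newer than the cutoff.
        return [score for (score, age) in scores if age <= cutoff]

    def report(scores, cutoff):
        # Both pieces are "proven," but if nothing is recent enough,
        # recent_scores() returns [] and average() divides by zero --
        # an emergent failure neither unit test ever exercised.
        return average(recent_scores(scores, cutoff))

    print(report([(90, 1), (75, 3)], cutoff=2))   # works: 90.0
    print(report([(90, 5), (75, 9)], cutoff=2))   # ZeroDivisionError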


This inability to fully comprehend or even really predict all the behaviors of a large program is part of what makes programming so different from bridge building.  I also contend that the unforgiving nature of computers plays a big part, as does the exploratory nature of most software development.  It's more akin to searching for the cure to a disease than to building yet another suspension bridge.


Update:  Here's some discussion of this post over at reddit.

Friday, February 23, 2007

In Defense of Logic Questions

Microsoft has a history of asking logic questions in its interviews.  Because of this, there are many web sites discussing the questions and giving answers to them.  There is even an entire book dedicated to the subject called How Would You Move Mount Fuji?  There are those who believe strongly in their usefulness.  There are others who think they have no value.  I fall into the first camp, but only up to a point.  There are two sorts of logic questions:  one is very useful, the other is not.  The first sort I will call complete.  These are the questions where everything needed to solve them is presented in the problem description.  The second sort I will call incomplete; they require knowledge beyond the scope of the problem to solve.

Logic questions are useful when hiring for computer-related jobs because they represent the same sort of thought process required to program or test software.  Computers are pure logic.  They will do exactly what they are told to do and nothing more.  They are unforgiving.  They do not draw inferences.  Close enough doesn't count.  To make them do what you want, you have to be explicit.  To understand why they break, you must comprehend where the logic broke down.  Logic questions tend to exercise the same pathways in the mind.  Being able to figure them out is a good indicator that someone can figure out software.

Logic questions get their poor reputation from the use of incomplete questions.  These questions require you to "think outside the box."  That may be good, but they don't really allow the interviewee to be creative.  Instead, they look for one particular piece of information outside the box.  Solving them is more about getting lucky than it is about applying logic.

Here is a good example of a bad logic question:  Assume there is a room with three light bulbs in it.  Outside the room is a panel with three switches on it labeled A, B, and C.  Once you enter the room, you will no longer be able to access the switches.  How can you, upon entering the room, tell me which of the three switches controls each of the light bulbs?

Think about this question for a moment.  Do you have the answer?  With merely the given information, it is not possible to formulate the answer.  Instead, one must start thinking about characteristics of the light bulb and switches.  Can I take apart the switch?  Can I somehow see into the room before opening the door?  Is there something about the bulbs themselves that I can use to my advantage?

The answer is the following:  Turn on switch A for 5 minutes or so.  Now turn it off and turn on switch B.  Enter the room.  Upon entering, the lit bulb is clearly connected to B.  Feel the other two bulbs.  The one which is warm is attached to A.  By process of elimination, the cool one is connected to C. 

Is it really fair to fail someone in an interview for not thinking about the properties of light bulbs?  Other questions require math, physics, geometry, etc. that, while the candidate may once have studied them, probably aren't fresh in their mind.  Assume that a candidate gets these questions wrong.  What do you know?  You know either that they are not a logical person or that they don't have a complete understanding of light bulbs.  Unless knowledge of light bulbs is critical to the job at hand, avoid the question.  It doesn't provide useful information.

There is a better sort of logic question to use.  These I call complete questions.  They contain within them all the information one needs to divine the answer.  The candidate need only apply the rules of logic to the problem statement and they will succeed.

Here is an example of a simple, yet complete question:  There is a river.  You start on one side with a wolf, a pig, and a carrot.  There is a raft which you can use to cross the river.  Unfortunately, the raft is too small to hold more than 2 things (you being one of them) at a time.  You need to get yourself and all three of the others to the opposite side to continue your journey.  Unfortunately, without you present, the pig will eat the carrot and the wolf will eat the pig.  How do you get all 3 to the other side?

If you are reading this, hopefully the right solution comes to mind fairly quickly.  This question is a bit simpler than I would normally ask, but it does demonstrate the point.  The candidate merely needs to find the right combination of items and trips across the river to succeed.  He doesn't need to understand the nature of pigs beyond the fact that they eat carrots.  He doesn't need to recall vector math to calculate the path across the flowing river.

The answer is as follows:  Take the pig across the river and leave it there.  Now come back and get the wolf.  Take him across.  On the return trip, bring the pig back to the original side of the river.  Next, take the carrot over and leave it safely with the wolf.  Finally, go back for the pig and bring it across.  All three are now on the other side of the river.  Journey on.
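
As an aside, the question also illustrates the connection to programming nicely: because the problem statement is complete, even a brute-force search can solve it.  Here is a rough sketch (my own, in Python) that enumerates the states and finds the crossing sequence.

    from collections import deque

    ITEMS = ("farmer", "wolf", "pig", "carrot")   # 0 = starting bank, 1 = far bank

    def unsafe(state):
        farmer, wolf, pig, carrot = state
        # Pig eats carrot, or wolf eats pig, whenever the farmer is on the other bank.
        return (pig == carrot != farmer) or (wolf == pig != farmer)

    def solve(start=(0, 0, 0, 0), goal=(1, 1, 1, 1)):
        queue, seen = deque([(start, [])]), {start}
        while queue:
            state, path = queue.popleft()
            if state == goal:
                return path
            farmer = state[0]
            for i in range(4):                    # i == 0 means crossing alone
                if i != 0 and state[i] != farmer:
                    continue                      # that item is on the other bank
                nxt = list(state)
                nxt[0] = 1 - farmer
                if i != 0:
                    nxt[i] = 1 - state[i]
                nxt = tuple(nxt)
                if nxt not in seen and not unsafe(nxt):
                    seen.add(nxt)
                    queue.append((nxt, path + [ITEMS[i]]))

    print(solve())
    # Prints something like: ['pig', 'farmer', 'wolf', 'pig', 'carrot', 'farmer', 'pig']
    # i.e., take the pig, return alone, take the wolf, bring the pig back,
    # take the carrot, return alone, take the pig -- the same answer as above.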

Hopefully you can see and understand the distinction between the light bulb question and the pig question.  When someone fails to answer the first, it is ambiguous why the failure happened.  Failure to answer the second can only be a failure to apply the rules of logic correctly.

One point that should be made is that you should never judge someone on the failure to answer a single logic question.  Each question requires a particular train of thought, and just because someone doesn't reach it doesn't mean that they are illogical.  Sometimes their mind is so busy going down other paths that it never reaches the right one to solve the problem.  In short, it is easy to get hung up on one approach to a problem.  Whenever you ask a question such as those I advocate here, you should always have a second question handy.  Only if someone fails both questions should you pass judgment.  Missing one solution is understandable.  Missing two is a sign that the individual in question may not be able to handle this sort of problem solving.

Much of the recent anti-logic-question backlash has, IMHO, been caused by failing to make the distinctions I make in this post.  Logic questions, if used correctly, are an invaluable arrow in the quiver of an interviewer.  Especially when interviewing someone for a non-coding position, they can provide valuable information that is very difficult to glean otherwise.

Thursday, February 22, 2007

Crossing the Uncanny Valley

The "uncanny valley" is the name for a phenomenon in computer graphics where the closer something looks to reality, the more the mind rejects it as being real.  When you see something like a cartoon--say, Finding Nemo--you don't think about it being real.  You don't notice all the little flaws.  However, when you watch something closer to real--like Final Fantasy:  Spirits Within or Lord of the Rings--your mind starts noticing all the little things wrong and rejects it.  It feels more "unreal" to watch something close to realistic than it does to watch something bearing little resemblance to reality.  The BBC recently had an article talking about this phenomenon and games.  Crossing the uncanny valley is the goal of creating something so lifelike that your mind stops rejecting it as unreal.


In the case of still images, renderings like this new one of Korean actress Song Hye Kyo seem to cross it.  Here is an article talking about the creation of the image in 3D Studio Max.



Creating a realistic picture is only part of the solution though.  Getting such images to animate fully realistically is the next challenge.  Still, this picture is pretty amazing.


Hat tip to Kotaku for the story.


Update:  Looks like the site was taken down.  Hopefully it will show up again once it falls off the pages of the big news sites.  In the meantime, the Kotaku article has both the rendering and the photograph.

Hiring Great Testers - Interviewing

A friend IM'd me yesterday saying he was being considered for a test manager position somewhere.  In that position, he would be responsible for building up his test team.  He solicited advice on what to look for and how best to interview for testing positions.  That's probably as good an incentive as I'll get to write the next piece of my "Hiring Great Testers" series. 

It's "easy" to interview a developer.  The required skillset is obvious and potential questions are posted all over the web.  Sometimes it seems that if you can reverse a string and parse a binary tree, you can be hired as a dev.*  Interviewing testers is a lot harder.  The skills are not as well defined.  Many test teams also hire from a different pool than development teams do.  There are a lot more people hired into testing positions without formal education or experience.  How do we go about sifting the wheat from the chaff in the testing pool?  What follows are some of my observations over nearly a decade of interviewing testers.

When interviewing testers, there are four main areas you want to probe:  motivation, technical skills, problem solving, and testing acumen.  The exact questions you ask in each of these areas will depend on the specific role you are looking to fill.

Motivation

The goal here is to understand why the person wants to work as a tester.  Do they see this as an entry point to a dev role later?  Do they enjoy testing?  Do they think it will make them rich?  The best way to get at this is to ask from two angles.  First, just ask straight up why they want this job.  Sometimes I'll use their previous experience to guide the question:  "You have a lot of experience with radio electronics.  Why do you want to work in computers instead of staying in that field?"  Second, ask about future plans.  Where does the person want to be in 2-5 years?  This will give you a hint about what they ultimately want to do and why they might want the job.  What you are looking for in both cases is a desire to do, at least for now, the job you have open.  If this is only a stepping stone, that might be trouble.  Being a stepping stone isn't bad, but it has to be one they will enjoy standing on for a while.  One other thing to look for here is passion for technology.  Do they play with computers on their own time?  If so, how?  Do they have a hobby relating to the problem your software solves?  It is not necessary that they do, but the more they care about the area you work in, the better motivated they will be.

Technical Skills

The specifics of what you are looking for will vary depending on the position.  If it is an SDET interview, you'll want to look for programming skills (can they code a linked list?).  If it is more of an STE interview, you'll want to focus on the relevant technologies.  Do they understand computer hardware?  Have they used your application or one like it before?  Can they troubleshoot computer problems?  If they are an experienced tester, you might ask about testing theory (equivalence classes, pairwise testing, model based testing, etc.) or their ability to write test plans.
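
To make the "code a linked list" bar concrete, here is a rough sketch (in Python for brevity; the actual interview would more likely be in C++ or C#) of the kind of warm-up I have in mind:  build a singly linked list and reverse it.

    class Node:
        def __init__(self, value, next=None):
            self.value = value
            self.next = next

    def reverse(head):
        # Walk the list once, flipping each node's next pointer as we go.
        prev = None
        while head:
            head.next, prev, head = prev, head, head.next
        return prev

    # Quick check: 1 -> 2 -> 3 becomes 3 -> 2 -> 1
    head = Node(1, Node(2, Node(3)))
    r = reverse(head)
    print(r.value, r.next.value, r.next.next.value)   # 3 2 1

A candidate who can write something like this, explain it, and reason about the empty and single-node cases has cleared the basic programming bar.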

Problem Solving

It is important for a tester to be able to think like a computer.  They have to be able to recognize not just that things went wrong, but why they went wrong.  It is critical to be able to see two problems and understand that they have the same underlying root cause.  This is not easy to get at directly in an interview, but I believe you can approximate it by seeing how well the candidate can work out logical puzzles.  I recommend using logic questions here.  Not all logic questions are good.  Some are more brain teasers than logical puzzles; by that I mean they require outside information not given in the framing of the question.  Avoid those.  I'll write another post on this subject soon.  You are not necessarily looking for correct answers here but rather for the way the candidate approaches the problem.  Are they capable of breaking it down into its constituent parts?  Do they approach it systematically?  If so, they are probably logical enough to be great testers.  If not, they'll often struggle to understand why the system is breaking.  Not understanding why means they won't know how to identify weak points that might exhibit similar erroneous behavior.

Testing Acumen

How good is the candidate at the act of testing?  If the position requires programming and you have asked them a programming question earlier in the interview, ask them to test their solution.  Pretend it is a black box:  what would they do to determine that it behaves correctly?  The best answers here examine not just the errors that could show up in their particular solution but also the errors that other solutions might have.  If you haven't asked a programming question, try asking them to test an object they are familiar with.  Sometimes it works to pick something in the room:  "Test my phone."  When I ask this question, I'm looking for two things.  First, are they creative?  Do they have a lot of ideas?  Someone who stops after 5 tests doesn't cut it.  Second, are they systematic?  Do they cover all the angles?  Are there major areas the person forgets or avoids?  Note that the second is not the same as "Did they organize their tests?"  I've seen many an interviewer say something like, "The candidate had a lot of creative ideas but they didn't group them into areas."  My response is always, "Did you ask the candidate to group them?  Are you convinced they would not have been capable had you asked?"  It's important to judge the candidate on the quality of their answer against what you asked for, not what you hoped for.  Remember, the candidate cannot read your mind.  An answer that is different is not necessarily wrong.
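
To illustrate, here is a rough sketch (mine, in Python) of what a decent answer might look like when the "solution" being tested is a simple string-reverse function treated as a black box.

    def reverse_string(s):
        return s[::-1]

    def test_reverse_string():
        assert reverse_string("") == ""                 # empty input
        assert reverse_string("a") == "a"               # single character
        assert reverse_string("abc") == "cba"           # the happy path
        assert reverse_string("ab cd") == "dc ba"       # embedded whitespace
        assert reverse_string("naïve") == "evïan"       # non-ASCII characters
        long_input = "x" * 10_000
        assert reverse_string(reverse_string(long_input)) == long_input   # round-trip

    test_reverse_string()

The point isn't these specific cases; it's that the candidate probes empty inputs, boundaries, odd characters, and large sizes rather than stopping at the happy path.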

Those are the major sorts of questions I tend to ask in testing interviews.  If the candidate is well motivated, has the right technical skills, is able to solve difficult problems, and has an ability to come up with good test cases, they'll likely make a good tester.  If they fall short on any of these areas, think twice before hiring them.  It is usually better to forego an acceptable candidate than to hire a poor one.  Hold out for a great candidate.  Settling for a mediocre one is usually not worth the risk.

 

Note that this information should be used as guidance--not prescription.  Take what you want, leave the rest.  There is no one perfect way to interview and no set of questions will always weed out the bad or even allow through all the good.

Have suggestions?  What questions do you find effective?  Share with others in the comments.

 

* I'm kidding here, but these sorts of questions do come up a lot.

Wednesday, February 21, 2007

Disneyland After Action Report

Met the mouse.  Lived to tell about it.  Disneyland is a lot of fun with young children.  I'm a fan of fast roller coasters but not a big fan of the drops involved in most coasters.  Disneyland has a lot of the merely fast coasters.  Space Mountain is the best, IMHO.  A few things we learned, should you be considering a trip there soon:

  1. Don't go over President's Day weekend if you can help it.  We were hoping for off-season and didn't consider that President's Day was a school holiday.  The park was quite busy.
  2. Fast passes are your friend.  In case you haven't been recently, Disneyland has this system called a "Fast Pass" which lets you basically get a ticket for later in the day.  At the appointed time, you can just walk up and get on the ride.  No need to wait in (most of) the line.  Very cool.  On busy days, they run out fast.  Get them early if you can.  You can always not use them later.
  3. If you have young kids, do lunch in Ariel's Grotto.  The food is good.  The price isn't terribly insane (considering a cheeseburger and a coke will run you nearly $12 in the main park).  You get to meet lots of the characters (currently the princesses) without having to wait in long lines.  Note that if things are at all busy, you need to make reservations ahead.  You can apparently call 60 days in advance.
  4. Honda has a really cool robot called ASIMO.  You can see ASIMO in action in the Innoventions exhibit.  While a lot of the show is mocked up to make ASIMO seem to have more AI capabilities than he really has, the ability to walk up/down stairs and kick a ball is very cool.
  5. On weekends, there is a fireworks show at 9:30.  Most people gather in front of the castle/Main Street to see it.  This is where it is best viewed because the music is broadcast there, etc.  The truth is, however, that 80% or more of the fireworks are actually shot from Toon Town.  If you stand right by the entrance to Toon Town, you'll get a better view of the fireworks and have a lot fewer people to fight with for a good seat.  We saw the show from Toon Town and from the end of Main Street.  It's a marginally better show from Main Street but probably not worth fighting the crowds for.  Next time I won't bother. 

I'll add more as I think of them.

Wednesday, February 14, 2007

I'm Going To Disneyland!

We're taking the munchkins and heading to southern Kalifornia.  Don't expect to see any new posts from me for about a week.  I'll be busy riding Star Tours and Pirates of the Caribbean.  Well, that and I don't think I'll have a network connection down there.  Silly tourist hotels.


If you're looking for something to read while I'm out, check out the Audio Fool.  He's on my team and knows a lot about audio.  He has a great collection of posts making it understandable for mere mortals.

Copy As Path

Here's a cool little Vista trick I just learned:

  1. Browse to a file. 
  2. Hold down shift and right-click on the file.
  3. Select "Copy as Path"

The path for the file, including the filename, is now on your clipboard.  You can paste it into any app.  This is useful when sending a network file's path in mail or when trying to execute a file with command-line parameters (just paste it into the Run prompt or a cmd window).

Tuesday, February 13, 2007

Display Adapters Demystified

The world of display adapter types is an alphabet soup of options today.  HDMI, DVI, VGA, UDI, DisplayPort?  Which will become the standard in the next few years?  HDMI looks like a sure bet in the consumer electronics space but it is failing to make inroads into the PC world yet.  DVI is the leader today and UDI and DisplayPort are both making a run for the crown.  ExtremeTech has a good article talking about each of the standards and their likelihood of becoming the overall leader.  Their quick take on the situation:  "With DVI so prevalent and UDI and DisplayPort on the verge of crashing the display party, it's unlikely that HDMI will become more than a footnote in the epic story of PC display technology."

Hiring Great Testers - How Important Is Testing Affinity?

When it comes to the increasingly important role of test developer, hiring managers have a choice to make.  They must decide what the controlling criterion for hiring is.  Which is more important, testing skills or development skills?  Sure, you want both to be strong, but often you don't get a perfect candidate and are forced to compromise.  If you have to choose one skill as the more important, which should it be?  There is no universal answer.  Each situation will vary.  However, it is my assessment that you are often better off hiring a good developer and teaching testing skills than hiring a good tester and trying to teach development skills.  The reason for this is largely one belief:  it is easier for the average developer to learn to test than for the average tester to learn to develop.


When I say developer, I don't just mean someone who knows the syntax of a programming language but rather someone who really understands programming.  That is, someone with computer science competencies.  This is not necessarily someone who has a CS degree.  The person can be self-taught, but they need to understand the sorts of things computer science students learn.  They need to have familiarity with data structures and algorithms.  They must understand complexity issues.  They should be familiar with operating system concepts including concurrency.  For a modern programmer, experience with object-oriented design principles is also necessary.


Two critical traits all testers must have are attention to detail and curiosity.  A tester needs to be thorough.  They cannot afford to leave any stones unturned in their quest to explore the product.  Good testers also need to be curious.  They need to have an instinct for where to look to find the issues.  They have to want to experiment.


Let us think about most developers.  Are they detail-oriented people?  The good ones are.  Code is unforgiving.  If you don't pay attention to details, your program will misbehave or even crash.  Are they curious?  More often than not, curiosity is what brings someone into programming.  It is the sense of exploration and, later, mastery that drives most of us forward.  It would seem then that good developers have the basic traits to be good testers.


What about the average tester?  Here I'm not talking about test developers but rather about someone who is less skilled in programming.  Do they have what it takes to make a good developer?  There is no way of knowing.  While curiosity and thoroughness are traits most developers have, they are not sufficient to make a good programmer.  To be a good programmer, one must be a good problem solver.  One must also be able to think in very abstract ways.  Most good testers are also good problem solvers.  However, I'm not convinced that there is a correlation between abstract thinking and testing.  Testing can be done in a very concrete manner.  Many who make good testers do not have the ability to become great programmers, and it is often this trait--abstract thinking--that is missing.


Now let us consider the specific skillsets required for testing and programming.  Some argue that testing is just as difficult as programming--that the skillset required is equally deep and differs in kind rather than in scope.  I reject this notion.


It's not that I think testing is easy.  It isn't.  I've interviewed many people who don't have what it takes to be a good tester.  They have the wrong mindset.  They aren't curious or thorough enough.  Sometimes they just don't really grok computers.  Often the higher-level skills in testing are enumerated to prove the difficulty.  These include things like understanding equivalence classes or pairwise testing.  The thing is, I've never found these to be too difficult to understand.  Equivalence analysis--despite its fancy name--is something any competent tester should just intuitively understand.  Pairwise testing is more complex but is something any CS student should be able to pick up quickly.  There is a lot of art in successful testing too.  This can take a long time to develop well.  The average developer isn't going to be a great tester overnight.
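
To show why I say pairwise testing is learnable quickly, here is a rough sketch (my own, in Python, with made-up configuration axes) of the core idea:  exhaustive combinations explode, while covering every pair of values takes far fewer cases.  The greedy loop below is only an illustration, not an optimal all-pairs generator.

    from itertools import combinations, product

    axes = {
        "os":      ["XP", "Vista"],
        "locale":  ["en-US", "ja-JP", "de-DE"],
        "bitness": ["x86", "x64"],
        "browser": ["IE6", "IE7", "Firefox"],
    }
    names = list(axes)
    all_cases = list(product(*axes.values()))

    def pairs_of(case):
        # The set of value pairs (across any two axes) this test case covers.
        vals = dict(zip(names, case))
        return {((a, vals[a]), (b, vals[b])) for a, b in combinations(names, 2)}

    needed = set().union(*(pairs_of(c) for c in all_cases))
    chosen = []
    while needed:
        # Greedily pick the case that covers the most still-uncovered pairs.
        best = max(all_cases, key=lambda c: len(needed & pairs_of(c)))
        chosen.append(best)
        needed -= pairs_of(best)

    print(len(all_cases), "exhaustive cases vs.", len(chosen), "pairwise cases")
    # Prints something like: 36 exhaustive cases vs. 9 pairwise cases

The usual rationale is that most configuration bugs involve at most two interacting settings, so covering all the pairs buys most of the benefit at a fraction of the cost.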


How about programming?  How easy is it for the average tester to pick it up?  As I argued above, good programming is not just knowing the syntax.  I like to use the concept of writing to prove my point.  Knowing the syntax of a programming language is like knowing grammar.  You can correctly form sentences and can most likely convey your point to the audience.  However, knowing grammar does not make you a good writer.  What sets Mark Twain apart from your typical math student is not just a superior knowledge of grammar and vocabulary.  It is understanding the larger process at work.  Likewise, knowing the syntax of C++ or Java doesn't make for a good programmer.  One merely needs to watch The Daily WTF for a short while to understand why such thinking can cause enormous trouble over time.  It takes a long time and a lot of hard work to go from being someone who understands syntax to someone who can truly weave code.


Thus I think the gap between a tester and a developer is harder to cross from the tester side than from the developer side.  A developer can become an adequate tester well before a tester can become an adequate developer.  Given the choice between someone who is a good developer and someone who is a good tester and each lacking much skill in the other discipline, I'll take the developer nearly every time.


As always, your mileage may vary.  There are some testers who go on to make amazing developers.  There are developers who cannot grok testing.  There are some jobs where you don't really need good development skills.  Sometimes a person who understands syntax is enough.


That's my take anyway.  Release the arrows.


 


A few related articles:  Chris McMahon discusses developer-testers vs. tester-developers, and Elisabeth Hendrickson has a great post about the convergence of testing and development.


 


Series Index

Saturday, February 10, 2007

Programmer Humor

A friend turned me on to the web comic XKCD.  Some parts are really funny.  Others, quite lame.  Anyway, here are some of the funnier ones from a programmer's viewpoint:



http://www.xkcd.com/c138.html
http://www.xkcd.com/c221.html
http://www.xkcd.com/c208.html
http://www.xkcd.com/c163.html
http://www.xkcd.com/c26.html
http://www.xkcd.com/c205.html


Thursday, February 8, 2007

February Netcast Update

It's been a while since I posted my list of netcasts.  I've had less time of late to listen, so my regularly scheduled netcasts have dwindled.  Here is a list of those I listen to on a regular basis:

This Week in Tech - Leo Laporte hosts a roundtable discussion of the news of the week.  Great one-stop-shopping for all noteworthy tech/geek happenings.

This Week in Media - Content creation and politics.  Everything from discussions of DRM to the latest news about the Red camera.

Windows Weekly - Really more like Microsoft Weekly.  Despite working here, there's a lot I don't know about.  It's also fun to hear someone else's perspective on what we do.

The HDTV Podcast - The latest in HDTV news and equipment reviews.  Only 1/2 hour. 

A few notable mentions that I listen to when I have extra time:

The Dice Tower - Best board gaming netcast I've found.

Security Now - The world of computer security in layman's terms.

What Is Your Greatest Weakness?

If you've interviewed much, you've probably been asked this question.  Art Vandalay examines some of the potential answers and their outcomes on his blog.  There's also an interesting conversation in the comments over there.  The consensus of those being interviewed seems to be that it is a no-win question.  The managers seem to find it useful.  Which is it?

I've spoken to people who use this question and asked them what they get out of it.  Their answer is usually "You'd be amazed what some people will say."  True, but if they're really that badly behaved and that intelligent, couldn't you find out some other way?  If you ask a question only to weed out the most incompetent, perhaps there are better uses for that time in the interview.

I see little value in this question.  As Art points out, there are four possible answers:

  1. Lie/Evade - Claim you have no weaknesses or attempt to change the subject.
  2. Answer with a weakness that is really a strength - Say something like "I am a perfectionist"
  3. Honesty - Answer with your truly biggest weakness.
  4. Wiggle out - Give a true weakness but only a very mild one.

Given these 4 choices, the best possible answer is #4.  The first will often upset an interviewer looking for "honesty."  Unfortunately, the question is practically designed to elicit dishonesty, which is why the interviewer should avoid it.  The second is lame and probably runs into the same problems as #1.  The third will probably end the interview.  Everyone has weaknesses, and without context, your biggest weakness will often dominate the mind of the interviewer.  Game over.  The fourth has some honesty to it but isn't enough to get you thrown out of the interview room.

As an interviewer, I don't want to put my future employees in a position where they are struggling to find the answer I want to hear rather than the truth.  Why should I put them in a position where lying by avoidance is the best answer?

What are your thoughts?  Is there value in this question? 

Sunday, February 4, 2007

The Sorry State of HD Television

I attended a Superbowl viewing party at a friend's house today.  No, the screen wasn't larger than 55" so the NFL doesn't need to worry.  I don't happen to have an HDTV at home yet, so this is one of the few times I've seen the programming.  What I saw doesn't make me want to run out, buy a set, and upgrade to HD cable anytime soon.  Sure, HDTV can be beautiful.  We've all seen the 1080i demos running in the stores.  It's gorgeous.  The XBox 360 also looks beautiful on an HD set.  HD-DVD (and to a lesser extent, BluRay) also looks beautiful.  Serenity in HD is a wonder to behold.  However, television--at least via Comcast cable--still isn't there.  The content is often over-compressed and there seems to be no real desire to get it right.


I saw two football games in HD this year.  The first was the NBC broadcast of the Seattle-Dallas playoff game.  The quality of the video was terrible.  When the picture was still, the quality was quite good.  However, as soon as it started moving a little (which it tends to do in football), the quality fell through the floor.  There were all kinds of compression artifacts to be seen.  The picture looked very muddy.  I can't quite tell what the problem was, but I suspect that someone was compressing the signal too far to save bandwidth.  The graphical cuts were, however, quite gorgeous.  Still, this was definitely not something I'd go out of my way to experience.


The second game was today's Superbowl.  This time they seemed to have good footage of the game.  The pictures of the field were full of detail.  No muddying here.  However, whenever they overlaid graphics, those were upscaled SD pictures.  Whether it was a cut scene, an overlay with a picture of a player, or even the score, the contrast in quality jumped right off the screen.  The edges were blurry and there were obvious scaling artifacts.  How anyone could let this happen in the Superbowl is beyond me.  How hard can it be to author the graphics in HD and downscale them for SD?  Several of the commercials, too, were upscaled 480i video and it showed.  Why would you spend $2.6 million for a Superbowl commercial and then make the video look bad for your most affluent audience?


If they can't get the NFL playoffs and especially the Superbowl right, what does that mean?  It means that it isn't worth their time.  If it were, they wouldn't let this kind of garbage show on the air.  HDTV is still very much in its infancy.  Too bad.  I was hoping it would be better by now.

Saturday, February 3, 2007

Why Writing Software Is Hard

Scott Rosenberg just published a new book called Dreaming in Code about a project to create a new personal information manager called Chandler.  Like many software projects, this one is late.  Rosenberg was recently interviewed about the subject of software projects falling behind.  He has some interesting thoughts on the issue.


Are projects like Vista just poorly run?  Were they late because they were undisciplined?  Or was there something else at work?  Rosenberg doesn't give a clear answer, but he does seem to indicate that it's impossible to estimate software schedules because the work has never been done before.  He says,


"[A]s long as you are not trying to do something new -- if you are doing something that has already been done -- then you have a frame of reference to estimate how long it is going to take, and to guess how many people are going to be required, and so on. And of course in other fields of engineering where we do the same thing over and over, the same is true. If we are going to build a house, there are always imponderables, and every site is different, but you have a rough idea: two-story house, and such and such a set of features, it is going to take a certain amount of time."


This is true.  If you've done it before, you can probably estimate it.  However, if you haven't done it before--and with software you probably wouldn't be writing it if you had--there is really no way to accurately know how long it will take.  I think this is largely true, as I have argued elsewhere.  Software is unforgiving (the reason for most security holes).  Writing a program is more like research than simple engineering.  No one blames Monsanto for being late developing a cure for cancer (that I know of anyway), but Microsoft gets killed for being late with Vista.  Part of that we, as programmers, bring on ourselves.  Being a good programmer means you are smart.  Mere mortals often have a difficult time understanding how to write even simple programs.  Because of that, we often overestimate our abilities and underestimate the size of the problem.  We don't have the information to make an accurate estimate, but we refuse to admit that.  Instead, we insist that everyone else trying to write software was just undisciplined and that we can do it better.  Unfortunately, we can't.


My takeaway from all this:  don't try to estimate the whole project up front.  Stay agile.  Come up with a list of work and churn through it at whatever rate is possible.  Have lots of checkpoints and don't be afraid to change your plans.