"The hardest thing is to go to sleep at night, when there are so many urgent things needing to be done. A huge gap exists between what we know is possible with today's machines and what we have so far been able to finish." - Donald Knuth

It has been the case for quite some time that there is a huge disparity between the quality of computer hardware and software. As a software guy myself, I find this frankly a little embarrassing. Every time I see a computer freeze up, I cringe a little inside, knowing that somebody somewhere, maybe somebody not so different from me, has screwed up. Somehow, though, you hardly ever see that with hardware.
Here's an example. Way back in the early days of Pentiums, Intel released a chip which had a minor bug in floating-point division; for some incredibly rare inputs, you would get results that were off by some fraction of a percent. This happened 14 years ago, and the Intel FDIV bug is still a running joke. Were this a software bug, it would have been forgotten within weeks, if not days. Bugs of that magnitude are just too common in software to care.
Why the disparity? I can think of a few reasons. The most obvious, of course, is that hardware is usually much, much harder to replace than software. To replace software, you generally just have to install a patch (and then reboot if your operating system is dumb); replacing a CPU, meanwhile, is a pain in the ass. This feeds a cycle: because hardware vendors simply can't afford the mistakes, hardware undergoes much more strenuous testing than even high-quality software. (Hell, it's still rare to see programmers who use unit tests on a regular basis.)
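For the uninitiated, a unit test is just a small check that one piece of code does what it claims. Here's a minimal sketch using Python's built-in unittest module, with a hypothetical divide helper standing in for whatever you'd actually be testing:

```python
import unittest

def divide(a, b):
    """Hypothetical helper: floating-point division with a guard."""
    if b == 0:
        raise ZeroDivisionError("divisor must be nonzero")
    return a / b

class TestDivide(unittest.TestCase):
    def test_basic_division(self):
        # assertAlmostEqual tolerates floating-point rounding
        self.assertAlmostEqual(divide(1.0, 3.0), 1.0 / 3.0)

    def test_zero_divisor_raises(self):
        # the guard should refuse a zero divisor
        with self.assertRaises(ZeroDivisionError):
            divide(1.0, 0.0)

if __name__ == "__main__":
    unittest.main()
```

Trivial, sure - but hardware teams run the moral equivalent of thousands of these against every design, which is exactly the discipline most software shops skip.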
Another reason is relative skill levels. People working on hardware design are usually pretty sharp out of necessity, while people who write software are often, not to put too fine a point on it, idiots. This isn't always the programmers' fault; the barrier to entry for software design is simply much lower than it could ever be for hardware design. Even setting that aside, it's difficult to become highly skilled in any one subfield of software design: the popular languages change rapidly, and behind them the underlying paradigms change too.
The hell of it, though, is that perfect software really is possible. Knuth's quote holds more than ever today - so much more is possible than what we've been able to accomplish so far. There is no fundamental reason that software should suck, other than human fallibility, and really, that's an unsatisfactory reason. At a minimum - an absolute minimum - programs should never, ever crash. Is that really so much to ask for?
The answer, from the last few decades, has been a resounding yes.
Software sucks. :(