Brute Force vs. Efficiency
As I was leaving my local Walmart today, I made my way through the parking lot along my usual escape route to the exit furthest from the main intersection the store sits on. On my way out, I noticed a few cars tearing through the lot toward the exit closest to the freeway, which is also the most congested exit. I putted along casually onto the road and managed to beat most of them to the main intersection. One of the cars barreled past me and cut in front of me before the traffic light. “Idiot,” I thought.

Ohio is a great state for road rage. It was once surprising to me that more people didn’t shoot each other on the roads here. Later, though, I found out that Ohio is also a great state for posers: people who affect a bad attitude and aggressive habits but are not truly prepared for a real confrontation. The fact that these people live consequence-free in their habits (driving habits as well as social habits in general) annoys me. It’s easy to get mad at someone who cuts you off, and it’s also easy to be depressed about the state of the human race when people don’t think a little more before they do ignorant things. If people took some time to think about effective routes around congestion, and realized that the difference between 75 mph and 65 mph over a 10-mile trip amounts to a savings of a meager 1 minute and 14 seconds, maybe they’d use their heads more and their gas pedals less, create less congestion, and reduce travel times for everyone.
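If that figure sounds too small to believe, the arithmetic is easy to check. Here’s a minimal C sketch using the numbers from the paragraph above:

```c
/* Check the claim: time saved driving 75 mph instead of 65 mph
   over a 10 mile trip. */
#include <stdio.h>

int main(void) {
    double distance = 10.0;           /* miles */
    double slow = 65.0, fast = 75.0;  /* mph */

    /* time = distance / speed; convert the difference to seconds */
    double saved = (distance / slow - distance / fast) * 3600.0;

    printf("Time saved: %.0f seconds\n", saved);  /* ~74 s, i.e. 1 min 14 s */
    return 0;
}
```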

I had a discussion with a woman once about car racing. The opinion I was trying to express was that it’s too safe to drive dangerously now. Air bags, antilock brakes, variable-ratio steering, automatic transmissions, and front-wheel drive, combined with the bigger engines now found in family cars, all make it too easy for the local hausfrau to pretend she’s Speed Racer. If you want to be a maniac behind the wheel (and I do), you should have to fight for it. That way you take it more seriously and don’t just casually drive 90 on the way home from work whenever you can. It should be you against the car. The car should fight you for control. A real man’s sports car is an old leaded-fuel-guzzling, 8-cylinder, 5-speed with rear-wheel drive and huge, bald back tires that skid all over the place. Not only is it more exhilarating to drive dangerously in that type of car, but it’s something you don’t do without some consideration of the consequences.

Now, though, brute force comes in safer packages, encouraging people to be less efficient, since they can make up for it with quicker acceleration and higher top speeds.

There is an analogy to the car-design example in computers. Why was there ever a concern about Y2K problems? Because computers were once short on memory and storage space, and programmers were in the habit of writing the tightest code possible so that it used fewer resources, including saving a few bytes on date fields by storing only the last two digits of the year. [This is actually only one side of the coin. The other side is human nature: We tend to group years in decades -- the 60s, the 70s, etc. It's natural to code with that principle in mind and neither be concerned with nor even contemplate what will happen several years hence with your code.]
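To make the byte-saving habit concrete, here’s a sketch of the classic failure mode. The variable names are my own hypothetical example, but the pattern -- two-digit years to save storage -- is the one described above:

```c
/* The classic Y2K trap: years stored as two digits to save space. */
#include <stdio.h>

int main(void) {
    int hired = 85;  /* stored as "85", meaning 1985 */
    int today = 0;   /* the year 2000 rolls over to "00" */

    /* Years of service comes out as 0 - 85 = -85 instead of 15. */
    printf("Years of service: %d\n", today - hired);
    return 0;
}
```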

Back in the day, programmers would spend a lot of their time tweaking their code to make it faster, since the machines the code would run on were very slow. The more efficient the algorithms in use were, the faster the machine would reach its goal. [Anecdote: Steve Jobs is famous in the Apple community for a speech he gave the designers of the original Macintosh. The boot time was much too long. He told them how many units they planned to sell and how many times each unit would be booted; multiply those together, and saving just a few seconds on boot time would add up to many human lifetimes. The speech was a little tongue-in-cheek, I'm sure, but the result was a significant savings in boot time -- at least for the original Mac.]
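The arithmetic behind that speech is worth spelling out. The figures below are assumptions of my own for illustration (the actual numbers Jobs used aren’t recorded here), but the conclusion holds at almost any plausible scale:

```c
/* Back-of-the-envelope version of the boot-time argument.
   All input figures are assumed for illustration only. */
#include <stdio.h>

int main(void) {
    double units   = 5e6;    /* machines sold (assumption) */
    double boots   = 365.0;  /* boots per machine per year (assumption) */
    double saved_s = 10.0;   /* seconds shaved off each boot (assumption) */

    double total_s  = units * boots * saved_s;  /* seconds saved per year */
    double lifetime = 70.0 * 365.25 * 86400.0;  /* one 70-year life, in seconds */

    printf("Human lifetimes saved per year: %.1f\n", total_s / lifetime);  /* ~8.3 */
    return 0;
}
```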

In today’s world, processor speed is making the transition from megahertz to gigahertz. I once downloaded an Apple II emulator from asimov.net and ran some games I used to play in the 80s on it. I turned off the speed governor and watched in amusement as the games moved too quickly to keep up with, similar to hitting fast forward on a VCR, but an order of magnitude faster. And still, it takes a long time to boot a 1 GHz computer running today’s modern operating systems. Why? Because with greater storage space, memory, and processor speed, it is less important to code efficiently. The art of tweaking your code to fit the machine it runs on has been lost, partly because it now seems an unnecessary step and partly because the modern business world is trying desperately to burn out its programmers before they turn 30.
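That “speed governor,” for the curious, is just an emulator throttle: run a slice of emulated CPU cycles, then sleep off the host’s speed advantage. Here’s a minimal sketch of the general technique -- the clock and cycle figures are the real Apple II’s, but the function names are my own, and real emulators are more careful about timing:

```c
/* Sketch of an emulator speed governor. With the governor off,
   the sleep is skipped and emulation runs as fast as the host CPU. */
#include <time.h>

#define TARGET_HZ    1023000LL  /* Apple II 6502 clock, ~1.023 MHz */
#define SLICE_CYCLES 17030LL    /* about one video frame's worth of cycles */

static void run_cycles(long long n) { (void)n; /* emulate n CPU cycles here */ }

void emulator_loop(int governor_on) {
    /* How long SLICE_CYCLES takes on the real machine, in nanoseconds. */
    long long slice_ns = SLICE_CYCLES * 1000000000LL / TARGET_HZ;
    struct timespec pause = { 0, (long)slice_ns };  /* ~16.6 ms per slice */

    for (;;) {
        run_cycles(SLICE_CYCLES);
        if (governor_on)
            nanosleep(&pause, NULL);  /* crude: ignores time spent emulating */
    }
}
```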

Windows, still the most common OS family, could run many times faster than it does, but it would take a complete rebuild from the ground up. Third-party device drivers could be rewritten to be many times faster. The GUI shell could trim many pointless bells and whistles that users never asked for (oooh! “show as web page” — how useful!). Core DLL files could be made to interact more effectively with the OS. Application programmers could spend less time giving status reports to their bosses and having projects canceled midway through, and more time tweaking their good products to make them great.

Computers have never needed a minute and a half to boot up; consumers have simply put up with it out of ignorance and fear. If the average consumer were more shrewd, the fastest and most efficient code would become the new standard each generation, and monopolies would be much harder to come by.

Efficiency, though, has a dark side: perfectionism. A friend of mine once coined a great truism: “Perfectionism is inefficient.” In any given system there is a point where more efficiency is still possible, but the effort to reach it is greater than the return. This “bar of perfectionism” is where effort to improve a system will logically stop.

Back in the 80s, the market for computer programs was small, so a lot of effort had to go into making products as bug-free, quick, and easy to use as possible. In addition, the machines in use were underpowered. This naturally resulted in tight code and many extra hours in the lab looking for more fat to trim. The bar of perfectionism was very high. In the late 90s, computer power started to skyrocket, and consumers got into the habit of accepting crappy programs as just part of the game. Because of that, the perfectionism bar has been substantially lowered. Now, extra hours spent getting rid of bugs or making programs faster would be inefficient, since they would not dramatically alter the money made on the product. The end result of the current computer market is insanely powerful machines with insanely bad programs. It’s insane.

This must be how high school teachers feel when they deal with brilliant kids who are apathetic about learning. Sure, kid, you’re getting by, but you have the ability to be truly great.

Changing the way the computer market works now will require a lot of dissatisfaction from users. Personally, I’m dissatisfied, and I have been for a long time. I don’t upgrade, and I rarely purchase software. I’m online almost every day, but I pretty much abstain from participating in the market. I cast my vote with my wallet, and lately my vote has been a simple “no.”

If Chrysler were run the same way many large computer corporations are today, not even the federal loan guarantee President Carter backed in 1979 would have kept them from going under. One day average consumers will realize this, and then maybe things will improve.
