Why were early computer companies so poorly run?
Regular commenter Clemens asks the question that titles this post. There is only one honest answer to this question: I don't know. I know squat about running a business.
The wonderful thing about a weblog, of course, is that you can make up all kinds of nonsense, pass it off as the truth, and people respect you for it, whereas telling the unvarnished truth dispassionately is considered at best ho-hum, but more often dishonest. Given that disclaimer, I will endeavor to make up an answer, based only on enough facts so as to ease what little conscience I have left after 36 years on this earth.
I would identify several problems.
(1) The guys who developed computers often weren't all that interested in running a business. Apple's founders are a good example: all of them got rich, but only Steve Jobs still runs the company, and if I recall correctly he isn't actually all that technically savvy. He isn't a particularly good manager either, from what I hear & read. What he does very well is pick excellent directions and lead the company in them.
A lot of early computer companies were basically started by engineers at a time when there was an opportunity to exploit a new technology, namely the transistor, and reap huge benefits. They made a splash because their innovations were just that good, but once they were established they didn't have enough business sense to keep things going. They didn't know how to market their products; they could sell to fellow engineers and hobbyists, but beyond that they were helpless. The ones who succeeded were the ones who managed to turn themselves over to competent managers, or get bought out by larger companies (as Amiga was, by Commodore).
A combination of engineering talent and business sense is not so common, but not that rare either. The guys who founded Microsoft are an example, but there were other successful companies too; those usually migrated toward the embedded world, where they made a lot of money.
(2) The reverse problem also existed: if a company managed to survive the typical engineers' mistakes, often by hiring competent managers or attaching themselves to a larger company, they subsequently ended up in the hands of what my father calls "financiers": people who don't give a rat's @$$ about the technology, only about whatever it takes to make money. Managers of this sort starve research and development and avoid risk like the plague, looking for "safe bets" and ways to break into established markets. Commodore Business Machines is an example of this: they starved the Amiga division of R&D and marketing funds, lavishing money instead on building a PC-compatible business to break into that supposedly lucrative market, the one where IBM, Compaq, Hewlett-Packard, Dell, and many, many other companies were already making huge amounts of money. Rather than build the Amiga market, CBM decided to try to out-muscle the giants. The consequences were foreseeable to anyone who didn't work in Commodore management: the PC division dragged the company down, taking the Amiga division with it.
Incidentally, one of the worst aspects of Commodore's bankruptcy is that while they had moved their headquarters to the Bahamas in order to avoid taxes (fantastic, from a financial point of view), the Bahamas apparently have some of the harshest bankruptcy laws in the world. That's what I read at the time in an Amiga magazine, anyway, so it must be true, right? Once Commodore started to collapse, there was no hope of recovery; the entire thing was auctioned off and sold for a song.
I would suggest that a more recent example of this is Palm. Ten years ago there was only one handheld computer worth mentioning: the Palm. The company took this bottom-line approach so aggressively, and engaged in so little research and development, that it was beaten badly in new technologies several times. First Handspring, founded by former Palm engineers, came onto the scene and developed the Treo while Palm sat around trying to increase its stock value. Eventually Palm bought Handspring, but again did nothing. Microsoft inevitably moved aggressively into the market with the grotesquely inferior Windows CE operating system. (AKA "WinCE," for how it makes you feel. :-)) Today almost all handhelds are based on WinCE rather than PalmOS, because Palm did zero serious software engineering over ten years, instead playing a financial game in which it spun off the software engineering side and eventually killed it by ceasing to buy its next-generation operating system. Even on the hardware end Palm allowed itself to be one-upped, by companies such as Nokia with its remarkable internet tablet. (Again, if only I had money…) Palm's most recent attempt at something new was the ill-conceived, ill-fated Foleo. It was several years too late, or several months too early, depending on whether you think (I don't) that it was a kind of "netbook" computer like the ASUS Eee PC.
It should be noted that Microsoft is one of the few tech giants to retain a research & development division, although given the apparent disaster that is Windows Vista, it isn't clear what practical use it has been. I know that they have some really bright guys working over there, though. Perhaps it's like AT&T and Bell Labs: I was told by someone whose husband once worked for Bell Labs that they developed all sorts of wonderful devices during the 60s and 70s, but AT&T management refused to market them.
(3) It took some computer companies a while to figure out that a successful business is built on providing the customer with what he wants, and not what the engineers want. To be fair, designing a computer is very, very complicated work, hard enough without having to idiot-proof it as well. (Murphy's laws are very popular with computer engineers, especially this one: "Design a system so simple that even an idiot can use it, and only an idiot will want to use it.")
However, the big companies making home computers never really understood what their product was about. Was it a game machine? a hobby machine? a business machine? an education machine? The exciting thing about home computing in the late 70s and early 80s was that home computers could do all of these things, but the user had to figure out how to do most of them by himself, unless he could find a User Group and/or a magazine dedicated to the computer.
Companies like Texas Instruments, Radio Shack, and Commodore already had other successful markets. With the IBM PC entering homes, it became harder to make enough profit to justify the high costs of developing home computers; these were publicly traded companies, after all! Eventually the home computing divisions started to make less money, and a decision had to be made: keep committing resources to a market that would be hard to maintain, or return those resources to the company's main focus?
Texas Instruments and Radio Shack obviously decided to focus on their traditional markets, although Radio Shack dabbled in the IBM PC market for a while. TI later made a splash with graphing calculators, changing the way algebra, or at least calculus, is taught in the high schools. Radio Shack is still a convenient electronic parts store.
Apple had nothing to fall back on; they were wholly a home computer company. I think that had a lot to do with their survival, at least for a while.
Unlike TI and Radio Shack, Commodore focused on home computing, so much so that by the mid-80s I doubt anyone remembered that it had once manufactured typewriters and calculators. Their Commodore 64 was the most popular computer in the world for a while.
(4) A lot of companies made bad decisions, thinking they were smart decisions. IBM designed the PC to be little better than a paperweight, and while no one knows exactly why, most of the speculation I've read is that they hoped people who were seriously interested in computing would abandon this toy for their "serious" computers. Wikipedia hints at it with the comment: "IBM's tradition of 'rationalizing' their product lines, deliberately restricting the performance of lower-priced models in order to prevent them from 'cannibalizing' profits from higher-priced models, worked against them."
Tandy/Radio Shack did the same with the Color Computer; they didn't want it to outshine their more serious, more expensive, and more profitable Model I, II, III, etc. series. My understanding is that a lot of home computers were deliberately mangled that way, with unpredictable results. In the case of the PC, software developers still managed to make it do things that the small business owner could use. Since it was an "IBM" machine and IBM was the company in business computing, it was game, set, and match from then on. In the case of the Color Computer, the lack of serious graphics hardware (in its highest graphics mode it could show only two colors, black and white, an ironic contradiction of its name) meant that the superior processor and interfaces were hobbled in ways that Commodore's and Atari's machines were not.
Another example is codified in Fred Brooks's book The Mythical Man-Month, which describes a project at IBM in the 1960s to develop a new operating system, OS/360, for its big computers. When the project fell behind, management figured they could simply assign more engineers to it, but that had the opposite of the intended effect. The new engineers had to be trained on what needed to be done, which took time and money. In addition, new lines of communication had to be opened, which created more opportunities for mistakes and required still more money. It was something of a debacle, but it succeeded anyway. Brooks distilled the lesson into a law: adding manpower to a late software project makes it later.
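If you want to see the arithmetic behind Brooks's law, here is a toy sketch in C (mine, not Brooks's): with n engineers on a project, there are n(n-1)/2 possible pairs of people who may need to talk to each other.

    /* Toy sketch of Brooks's arithmetic: with n people on a project,
       there are n * (n - 1) / 2 possible pairwise communication
       channels.  Doubling the team roughly quadruples the number of
       ways a misunderstanding can happen. */
    #include <stdio.h>

    int main(void)
    {
        int sizes[] = { 5, 10, 20, 50 };
        int i;

        for (i = 0; i < 4; i++) {
            int n = sizes[i];
            printf("team of %2d -> %4d possible channels\n",
                   n, n * (n - 1) / 2);
        }
        return 0;
    }

A team of five has 10 channels; a team of fifty has 1,225. No wonder the schedule slipped further when they added people.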
(5) A lot of what looks like good or bad management is actually luck. In the computer age, hardware becomes obsolete very quickly, and it takes effort to make old software work on new hardware. In the late 70s and 80s, programming was done in assembly language or machine code, which is quite hard and, worse, specific to each hardware platform. Even people who could program in a language that could be re-compiled on other platforms, such as C or Pascal, had to jimmy a few things to make it work, because there were no such things as standard APIs or cross-platform libraries.
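Something like the following sat near the top of many a would-be portable C program of the era. This is a minimal sketch of my own; the platform macro names and the escape code are hypothetical, purely for flavor.

    /* A sketch of the per-platform "jimmying" described above.  The
       platform macros (TRS80, CPM) and the escape sequence are
       hypothetical, for illustration only; real ports grew blocks
       like these for every machine they targeted. */
    #include <stdio.h>

    #if defined(TRS80)
    #define COLUMNS 64              /* narrower screen on this machine */
    #elif defined(CPM)
    #define COLUMNS 80
    #else
    #define COLUMNS 80              /* hope for the best elsewhere */
    #endif

    void clear_screen(void)
    {
    #if defined(CPM)
        printf("\033E");            /* clear code some terminals understood */
    #else
        int i;                      /* crude fallback: scroll it all away */
        for (i = 0; i < 24; i++)
            putchar('\n');
    #endif
    }

    int main(void)
    {
        clear_screen();
        printf("%d columns assumed\n", COLUMNS);
        return 0;
    }

Every new target meant another branch in every one of these blocks, which is why "portable" software of the era so often wasn't.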
Early home computers were 8-bit machines, based on the Z80, the 6502, or the 6809. None of these processors was compatible with the others. To make things worse, most 16-bit processors were incompatible with their 8-bit forebears. Motorola developed both the 6809 and the 68000: the one powered the Color Computer, the other powered the Amiga, but software built for one did not run on the other.
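To make the incompatibility concrete, here is a little table (a sketch of mine; I'm quoting the opcode bytes from memory, so treat them as illustrative): even the humblest operation, loading a constant into the accumulator, was encoded differently on each chip.

    /* The "same" operation -- load the accumulator with a constant --
       had a different opcode byte on each 8-bit CPU, so a binary for
       one machine was gibberish to the others.  (Opcode bytes quoted
       from memory; treat them as illustrative.) */
    #include <stdio.h>

    struct cpu {
        const char *name;
        const char *mnemonic;
        unsigned char opcode;
    };

    int main(void)
    {
        struct cpu cpus[] = {
            { "Intel 8080 / Zilog Z80", "MVI A,n / LD A,n", 0x3E },
            { "MOS 6502",               "LDA #n",           0xA9 },
            { "Motorola 6809",          "LDA #n",           0x86 },
        };
        int i;

        for (i = 0; i < 3; i++)
            printf("%-24s %-18s opcode 0x%02X\n",
                   cpus[i].name, cpus[i].mnemonic, cpus[i].opcode);
        return 0;
    }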
Intel, by contrast, was clever enough to ensure that its newer processors could run software for the older ones, or nearly so. Software written for the 8-bit 8080 was easily moved to the 16-bit 8088; the two weren't binary-compatible, but Intel laid out the 8088's registers and instructions so that 8080 assembly code could be translated almost mechanically. There was a lot of 8080 software running on an operating system called CP/M, which was very popular at the time. IBM adopted Intel's 8088 for the PC, and Microsoft's MS-DOS was essentially a dumbed-down CP/M. (MS-DOS was the predecessor to Windows; you probably remember it, but our students have never heard of it. Feel old yet? :-)) So a lot of CP/M software was easily modified to work with the IBM PC, making it easy to use a PC in the small-to-medium business market where minicomputers were unthinkably expensive. Intel continued this process with the 80286, the 80386, the 80486, etc. I'm pretty sure that even today's processors from Intel (Xeon, Core 2, etc.) and AMD (Opteron, etc.) still understand the old 8086 machine code, but don't quote me.
In order to do this, Intel had to maintain backward compatibility with several very, very bad design decisions on the 8080, and with other bad decisions on the 8088, etc. However, it meant that IBM and other companies could easily exploit the technologies that came along later, such as 16-bit and then 32-bit computing, despite those processors' defects. Genuinely advanced processors, such as the 68000 and later the PowerPC processors used by Apple for a long while, were far better designs and didn't suck as much electric power as the Intel machines. (To make up for technological inadequacies, Intel for a looooong time merely increased what is called clock speed, which makes the computer seem faster even when it isn't doing proportionally more work. This requires much more electricity, and consequently generates much more heat.) On the Intel side, upgrading was also less of an investment: when Apple moved from the Motorola 680x0 series to the PowerPC, they had to develop a very complex emulator to allow older programs to keep working on the new PowerPC machines. Microsoft never had to do that.
I hope this jumbled assembly of thoughts and memories helps. What shocks me most is that I typed a lot of these details from memory and only afterward looked some of them up online; every one I checked matched my memory. Why can't I remember everything this way?
1 comment:
Brings back memories of working for Control Data, once known as Engineering Research Associates. One of their original engineers was Seymour Cray. He was a weird genius dedicated to building the biggest and fastest computers anywhere. And he did it for years, always managing to save the company with a new "Super Computer" every few years. Once the company got to a certain size, the "managers" who took over running the company couldn't quite figure out what was so special about Seymour. So they began to mess with him, and to cut his budget. After all, where would an eccentric genius who was loyal to the company go?
Seymour just packed up and left. When one executive pleaded with him to stay he said he was leaving anyway. The guy wailed "But what are you going to do, Seymour?"
"Oh, I don't know. Wonder in the wilderness. Start a little company maybe."
In the fullness of time the little company became Cray Research, and Control Data never again dominated the supercomputer market.
That, and the story of how CDC beat IBM, were some of the more entertaining things I learned about the history of computing.
You did a pretty good job from memory, btw. Started getting me interested in computers again.