From PET to Nexus: 31 years in the personal computer market

Jeremy Reimer of Ars Technica published an article, From Altair to iPad: 35 years of personal computer market share. It is a great read both for those who have been involved in the industry for many years and for newcomers who may not know any of the history.

The interesting thing about history is that different people who were there experienced it differently. The interesting thing about statistics is that the details of what the methodology is counting are often more important than the numbers that come out. Much of this month's Ars article, like the one from 2005, was based on IDC numbers, which I have criticized for years as wildly inaccurate in how they report alternatives to Microsoft Windows running on Intel/AMD machines.

My first real experience with a computer was in grade 8 in 1981, when a mobile classroom of Commodore PET computers was at my school for a few months. The next summer I put all my money together, borrowed some more from my sister, and bought a Commodore VIC-20.

Since that time I have owned many more VIC-20s, multiple Commodore 64s, a KIM-1, an Apple II clone, multiple Amiga computers of a variety of models (including the CDTV), an XT computer running DR-DOS (for FidoNet), multiple Intel/AMD machines running GNU/Linux, and now various ARM/Snapdragon mobile devices running Android/Linux. I've used many other computers over the years; this is only the list of what I owned.

My experience of the 1980s was different from what was articulated in the Ars Technica article. While the early-1980s home computer market was filled with 8-bit computers like the VIC-20s and VIC-30s (er, Commodore 64s) that I owned, the second half of the decade was dominated by 16-bit machines like the Apple Macintosh (1984), Commodore Amiga (1985) and Atari ST (1985), all not coincidentally running Motorola 68K family processors.

While it is true that Intel computers running DOS, and later Microsoft Windows, existed, it wasn't until the release of Windows 3.0 in 1990 that the platform was a relevant contender in the home. DOS was big in the business computing market, but it confuses the statistics to include DOS computers in the home computer market while excluding the large UNIX workstation and mainframe markets, which were also largely outside the home.

In the late 1980s and early 1990s I worked as a hardware technician at a Commodore authorized dealer, focused on repair of Commodore-branded computers. The stiff competition for the home market was between Apple, Commodore and Atari. These were all closed, proprietary platforms: the operating system and hardware came from the same vendor, there was no software compatibility between platforms (other than via various emulators), and peripherals were more often than not also very platform-specific.

As much as I loved my Amiga, it became clear by 1990 that Apple was going to dominate that marketplace. I liked the Atari ST as a second choice to my Amiga, but I never liked the Mac (still don't), so this development saddened me.

Then something I couldn't have predicted happened. Microsoft offered a platform that was comparatively open next to the computing platforms that came before it. They were only trying to control one piece, the operating system, and worked hard with third-party hardware and application developers to create a marketplace around that operating system. Just as the three most important factors in buying a home are "location, location, location", the three most important factors in Microsoft building its operating system business were "developers, developers, developers".

While Apple didn't try to control third-party developers back in the Apple II days, they were very controlling with the Macintosh. If Apple didn't like the aesthetic of your hardware peripheral or your software, Apple used any control it had over the platform to stop that developer's product from interfacing with the Mac.

Just as Thomas Edison's attempt to leverage motion picture patents to control the motion picture industry (down to what movies could be made) drove all the innovation westward toward Hollywood, Apple's excessive control over developers drove home computing innovation to Microsoft. Even though Microsoft made a few missteps (Bill Gates thought the Internet was a fad, and had to scramble to get the Internet adequately integrated into Windows 95), by the mid-1990s the migration was over and Apple quickly went from dominating the newly growing home computer market to being a small player.

I believe documenting what happened in the past is critical if you are trying to understand the present or predict the future. My predictions for the future will be based on my experience of the past.

Microsoft made an even bigger mistake with the transition to mobile than they did with the Internet, and I have a hard time believing they will recover from this one. If I were to predict their future, it would be for them to do as IBM did and transition from a product-focused company into a services company. They have some talented people, and having them work on cloud and related services would be far more lucrative than trying to become relevant in the mobile space. Their days of being relevant in the operating system marketplace are nearly over, as is the era of desktop applications like Microsoft Office. They will still exist in the future, just as mainframe computers exist today and will in the future, but they will be largely invisible to the general public. I also think their current plan to rent rather than sell desktop-era software will not amount to anything in the longer term. What I see at Outlook.com has promise, if they make it fully neutral to the computing platform people use to access it.

Apple is still Apple. While yet again they dominate a very young marketplace, they are trying to exert a level of control which developers are only going to stomach as long as there isn't an obvious alternative. Apple's massive lawsuits against nearly every mobile hardware developer will eventually be understood as a similar threat to all software developers as well, and the mass exodus of all types of developers will be swift.

Google is providing that more open platform alternative, which a majority of mobile hardware manufacturers adopted right away, and software developers are gradually jumping on board. The growth of this comparatively open platform will be similar for Android to what it was for Microsoft Windows.

Android is essentially Linux with a different interface on top than what you see on desktop distributions. Like other Linux distributions, it enables developers to create their own products without being tied to any specific vendor. This makes Android/Linux quite different from Microsoft. While Microsoft has licensing agreements it leverages to close down competitors as it moves into new markets, Google couldn't do the same thing with Android. With companies like Amazon and its Android-based Kindle, we already see the ability for a company to go into direct "competition" with Google using Android.

While Microsoft made money charging royalties for Microsoft Windows and Office, Google is mostly an advertising company and doesn't charge hardware or software developers any royalties to build on top of Android.

While I can easily predict Apple returning to its previous distant-second position, I don't think we can simply map what happened with Microsoft onto Google, Android or Linux. It is also possible that Apple will stick around for a third computing wave, one it will again quickly give up as it tries to control developers. It is also likely that RIM/BlackBerry and Nokia/Microsoft on mobile will be remembered 20 years from now about as much as the Atari ST or Amiga is today.

I wanted everyone to switch to the Amiga in the late 1980s, and currently wish everyone would switch to desktop Linux. While my reasons were technical in the 1980s and are political today, I didn't get my wish then and won't get my wish now. Android-based mobile devices won't allow their owners as much control as a completely Free/libre Software GNU/Linux computer would, but they are orders of magnitude better for developers and computer owners than what Apple has twice tried to offer.

In a mature market, a practical yet open system that enables a competitive free market between developers will inevitably dominate over closed and/or impractical alternatives.

Well, almost inevitably -- don't get me started on the market distortions from harmful government-granted monopolies in the form of bogus patents and "technological measures" abused in copyright. Bad government policy is the largest wildcard in predicting the future of computing and innovation in this space.

Location, Location, Location, Developers, Developers, Developers, Politics, Politics, Politics....

(Update: A few people have suggested I should have used the term "home computer" market rather than "personal computer", given the latter wasn't used at the time and is vague. I'm essentially talking about computing used by individual citizens, rather than by businesses, which was/is a different market.)

[comment on Google Plus]


Microsoft’s Lost Decade

For an article focused on Microsoft's decline, check out "How Microsoft Lost Its Mojo: Steve Ballmer and Corporate America's Most Spectacular Decline" in Vanity Fair.

As a different data point, take a look at the government market. Governments, especially in North America, tend to be about a decade behind what is happening in the computing marketplace. With the Shared Services initiative in the federal government we are seeing the locking in (via contracts, APIs, etc.) of legacy Microsoft technologies such as Windows, Office, Exchange and SharePoint. This will lock the federal government into 1990s and early-2000s technologies for quite some time, essentially forcing Canadian taxpayers to keep Microsoft afloat while the computing market otherwise moves elsewhere.