It may be a difficult thing to imagine for some, but there are people alive today in America for whom the very concept of a “computer” seemed like pure science fiction for decades. Everyone knows – or knew – at least one: a grandparent, parent, or aunt who never had any use for a computer in the home. Many of these people saw computing evolve from 1950’s mainframes staffed by crisply “uniformed” IBM technicians in shirts and ties, to the Commodore 64 and Apple II, which combined to kickstart the digital golden age of computing in the 1980’s. When my parents learned of my childhood interest in computers, it was fairly easily shrugged aside, for a time.
Eventually I wound up with my first computer, a Texas Instruments 99/4a that never really struck my fancy as a thirteen year old in the throes of an ADHD diagnosis process. And then it was onward and upward, to the far more capable and expandable Commodore 64. Now this. This was a computer I could DO things with. I could play games, I could type in programs from the pages of Mad magazine which would display the face of Alfred E. Neuman on my TV set. I could learn to program, write my own software, become the next Bill Gates before anyone outside of the computer or financial industries ever knew what he’d mean to the world at large!
Of course, I never did.
Instead I spent years riding the wave of technological advance, moving from computer to computer, platform to platform on the thinnest of budgets. The Commodore 64 begat a Commodore 128D, which begat a ‘286, which begat a ‘486, and so on. I had become an enthusiast. The last stretch of time I would ever spend without a computer was my period in foster care. Once that ended, and I had my own apartment and my computer returned to me, I was hooked.
Forward to today.
I am sitting at a thoroughly middle-of-the-road machine for someone with an interest in video gaming. A six-core computer with the processing power of tens of thousands of those Commodore 64’s or Apple II’s. Twelve gigabytes of RAM, instead of the paltry-but-then-amazing 64 kilobytes available in the aptly named Commodore 64. My video card alone runs at a speed once considered so amazing that entire buildings were designed just to contain the hardware it would take to process the same amount of data. Yet, I don’t see the computer on my desk now as any kind of accomplishment for humankind. I don’t see it as something amazing that needs to be preserved. I see it as, fundamentally, a tool to be replaced when it no longer serves its purpose.
I don’t say the same about the Commodore 64.
And increasingly, contrary to what many people would call rational thought, I’m not the only one. The number of “retrocomputing” hobbyists and collectors has always been unknown, but today it seems that the concepts that drew so many of us into the age of computing during that “golden age” of the 1980’s are coming to the forefront with a new resurgence of interest.
But it isn’t the Commodore 64, the Apple II or even the BBC Micro that leads the way.
It’s a new generation of 80’s-era systems, built virtually within the confines of video games – primarily the popular title “Minecraft”, initially created by Markus “Notch” Persson from his home in Sweden. Available for download on the PC, Mac OS X, Linux, and Android platforms, Minecraft is an open-world sandbox game in which players gather resources and construct some truly amazing objects in the virtual space. Initially a largely single-player title programmed by Notch himself, it eventually became a sales juggernaut, leaving him in the enviable position of forming his own game company with a multimillion dollar bank account and a staff of talented, devoted employees. And the devotion doesn’t just go one way: Notch’s 2011 dividend on company holdings estimated at a hundred million dollars came to three million, all of which he divided among the employees of his company, Mojang.
With Minecraft, Notch unlocked the creativity in thousands of devoted players. There are numerous music videos, short films, and songs devoted to the game or that use the game as a framing device to tell a story. Yet, the most interesting use of Minecraft to date has been the addition of player-created modifications. Despite Minecraft lacking a proper development API for third-party tools, its nature as a Java program has made reverse-engineered hacks and modifications the true driving force behind the game’s adoption. One of these hacks is the aptly named “ComputerCraft”, which allows a player to build functional computers, disk drives, and “Turtles” which act much like the Logo accessory of old. These computers are designed to be used with the Lua scripting language, with players able to use a machine in-game and author programs which can trigger actions inside the game world or even network between systems and serve as GPS-like navigation systems.
Other pioneering projects accomplished in Minecraft include devices such as a full 16 bit arithmetic logic unit, built entirely from standard components Notch included with the game. Another project aims to add a fully 6502-capable computer system to the game. These are amazing learning tools, able to excite the people who play Minecraft in ways that were never considered remotely possible only a year earlier. Joel Levin, a teacher at a private school in New York City, has earned the moniker “The Minecraft Teacher” for his innovative use of Minecraft as a teaching tool. He even reached an agreement with Notch’s company Mojang to found MinecraftEdu, a company devoted to selling an educationally modded version of Minecraft specially tailored for classroom use.
Notch himself may never have envisioned this role, but it’s clear from his recent announcement of the next project he plans to personally helm at his company that he’s embracing it with gusto.
On March 28th, readers of Notch’s largely disused Tumblr site were treated to the following post, which began with this tantalizing tidbit:
TITLE: dcpu specs, classified, not final
TO: redacted
DATE: 20120328
VERSION: 4

16 bit architecture
0x10000 words of ram (128kb)
8 registers (A, B, C, X, Y, Z, I, J)
program counter (PC)
stack pointer (SP)
overflow (O)
On March 29th, Notch posted the image below to his Twitter feed.
On April 1st, he officially announced the game “Mars Effect”, a take on the popular – and recently controversial – Mass Effect series, in a doubly pointed joke referencing a recent trademark suit his company had endured over the title of another game they currently have in development, and a friendly jab at the controversial ending to the Mass Effect trilogy which had been in the news.
In his description for “Mars Effect”, Notch described a free roaming sci-fi space combat, trading and mining game.
“The computer in the game is a fully functioning emulated 16 bit CPU that can be used to control your entire ship, or just to play games on while waiting for a large mining operation to finish.”
Because it was announced on April Fool’s Day, many readers considered it a joke and wished that such a game truly were in development. We didn’t have to wait long. On April 2nd, Notch completed the necessary paperwork and acquired the domain name 0x10c.com, which became the new – if awkward – title for the very game originally announced as “Mars Effect”.
I firmly believe this is going to be a truly interesting time for people who love vintage computing. For the first time in decades, a whole new generation of technically minded people will be exposed to an environment built around the constraints of a limited system – one whose logic is so low-level that it can actually be understood by a human brain without requiring superhuman acuity. Gamers who play with these systems in their virtual spaceships will perhaps wonder, maybe for the first time, whether there are any vintage computers out there for them to experience. Alternatively, this may provide some of the industry’s older generation, who still understand the concepts long left for dead in mainstream commercial development, with a new outlet for their work. The possibilities are mind boggling.
Many questions remain, however. Will Notch’s “DCPU-16” include a ROM toolbox? How deep will it go? Will it have a robust series of buses and interfaces that players and programmers will be able to tap into, and possibly even exploit in ways similar to how early 8 bit systems were designed? What kind of graphics capabilities will the chip include?
Will players be able to develop ROM chips themselves? Will entire operating systems be possible? Will RAM be expanded, and more importantly, should it?
Notch’s design of DCPU-16 is an amazing start to what could eventually become an entire ecosystem of development potential. Who knows where he’ll stop, and where his work may enable other programmers in the future to go?
To fans of retrocomputing, I say: “Embrace this.”
This is quite possibly the beginning of the next wave. The next generation of interested hackers, tinkerers and users of vintage computing techniques and sensibilities. I predict that within a year of 0x10c’s release, the option of buying a physical DCPU-16 will exist. We will see the inversion of today’s 8 and 16 bit status quo – virtually emulated hardware coming to life in physical form.
Embrace it. This is the future, and it’s going to be just as much fun as the past.
Update: Shortly after the initial posting of this article, Notch released the first bit of “complete” documentation for the DCPU-16 design which will be used for 0x10c. The specification for programming in assembly on this virtual processor can be found here.
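To make the leaked register set concrete, here is a toy sketch of what executing a couple of operations against that machine state might look like. This is purely my own illustration in Python – not Notch’s emulator, and not the real DCPU-16 instruction encoding:

```python
# Toy model of the machine state from the leaked spec: eight general
# registers, PC, SP, and O, plus 0x10000 words of 16-bit RAM (128 KB).
# My own sketch for illustration, not the actual DCPU-16 design.

regs = dict.fromkeys(["A", "B", "C", "X", "Y", "Z", "I", "J",
                      "PC", "SP", "O"], 0)
ram = [0] * 0x10000  # 65,536 words of 16-bit memory

def set_reg(r, value):
    regs[r] = value & 0xFFFF  # everything is masked to 16 bits

def add(r, value):
    total = regs[r] + value
    regs[r] = total & 0xFFFF
    regs["O"] = 1 if total > 0xFFFF else 0  # overflow register catches the spill

set_reg("A", 0xFFFE)
add("A", 5)
print(hex(regs["A"]), regs["O"])  # 0x3 1
```

Every value wraps at 16 bits, and the overflow register records when an addition spills past 0xFFFF – exactly the kind of low-level behavior programmers of the 8 and 16 bit era had to think about constantly.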
One morning, during the summer of a year I can’t remember, I recall my father walking into the house with a cardboard box full of wires and plastic. It’s a vague memory, and surprisingly so considering the amount of change it brought to my life in both the short and long terms. My mother might argue the point, but I feel it’s safe to say it was probably the biggest influence my father ever had on my future. Contained in that box was a used, bare-bones Commodore 64 computer that he’d found at a local yard sale while searching for valuable antiques to be resold at the family’s flea market booth.
The Commodore 64 rapidly became a defining element in my everyday life. It began as a simple bare-bones computer connected to a black and white television my father let me keep in my room, and I gradually expanded its capabilities with a cassette tape unit for loading software. When tape fell out of favor, I managed to get a floppy drive of my very own. The eight-pound wonder of the Commodore 1541 felt like magic at the time, despite its lack of capacity or speed. But to me, it was the most high-tech toy I could have wanted… until I got my very first modem.
Designed in 1981 as a successor to the Commodore VIC-20, the Commodore 64 (colloquially, the C64) was released to the public in 1982 for the then shockingly low retail price of $595. Thanks to Commodore’s ownership of MOS Technology, manufacturer of the C64’s CPU, production costs were incredibly low compared to other computers at the time. Selling a remarkable and record-setting 17 million units over its lifetime from 1982 until 1994, the system was a powerful direct competitor to the Apple II, and thanks to an estimated $135 cost of production it was able to drive many competing companies from the home computer market.
Technically, the Commodore 64 was highly advanced in comparison to many early computers of the day. Its 1 MHz processor, while slower than those in some other computers such as the TI-99/4a, was still on par with the more common (and more expensive) Apple II series. Additionally, the C64 included advanced audio and graphics capabilities that could be displayed on a common black and white or color TV, as opposed to the specialized displays most commonly seen in use with Apple computers.
My Commodore 64 really came into its own with the purchase of my first modem, the aptly and accurately named “64modem”, a 300 baud device that utilized the household phone line whenever I was able to wrangle up permission to use it. With my modem, I was able to connect to local Bulletin Board System services and engage in messaging and gaming with other computer hobbyists in the community. Eventually, my 300 baud modem gave way to a more modern 1200 baud model, which promised (and delivered!) far faster access to the BBS systems I connected to for software and messaging.
I’d like to take a moment to compare rough estimates of the differences in the speeds I’ve been describing thus far, in particular the speeds of communication via a modem. A single “baud” (one signaling event per second) would eventually come to carry multiple bits of data, but in the early days the correlation was a direct 1:1, and this is how most consumer systems of the 300/1200/2400 baud era were rated.
Using this method, my current internet connection is 838,860 times faster than that original 300 baud link between my computer and a single, dedicated machine sitting across town. This is just one aspect of the progress that has been made in my 36 years.
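Under that 1:1 assumption, the comparison is a single division. A quick sketch – the 240 Mbit/s connection speed is my own assumption here, but the 838,860× figure falls right out of it:

```python
# Rough speed comparison between an early modem and a modern broadband link,
# assuming the early-modem-era 1:1 correlation of baud to bits per second.

OLD_MODEM_BPS = 300              # a 300 baud modem moved ~300 bits per second
MODERN_BPS = 240 * 1024 * 1024   # hypothetical 240 Mbit/s connection

speedup = MODERN_BPS // OLD_MODEM_BPS
print(f"{speedup:,}x faster")    # 838,860x faster
```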
I feel old.
This isn’t a new thing. I’ve felt old for a while now, and I’m sure anyone who reads this blog (Dave, and my mom most likely) has a better than even chance of reading that and thinking “You’re just a kid!”, but it still feels like a fairly powerful truth to me.
Most of the reason for this is the growth and expansion in the power of technology in just the years since I’ve been old enough to notice and pay attention. For fun, I’m going to post a few quick memories about what degree of change I’ve actually had beneath my fingertips: samples from multiple technological generations, comparing and contrasting, in rough terms, the way this power has grown.
The first “computer” I ever owned was a Texas Instruments machine called the TI 99/4a.
… I really never counted the TI-99 as a “real computer”, even as a kid. It didn’t feel right for some reason. It might have been roughly the same color as the Apples I played with in school at the time, but other than that the similarities were nonexistent. Without a disk drive or any cartridge games beyond “Chisholm Trail”, a cattle-herding game, and with only a tiny amount of memory available for playing about in BASIC, it was pretty much useless to me.
Unknown to me at the time – and indeed right up until I began researching my facts for this post – the 99 was actually the first commercially produced 16 bit home computer. This meant that, in comparison to other computers of the time like the Commodore 64 and Apple II series systems, it could move more data and do more work with each cycle of computing. In addition, it ran at a blisteringly fast speed compared to the other personal computer options of its day, from 1981 to 1983. Its processor, the TI TMS9900, ran at a full 3 MHz, meaning that for every second that passed it was capable of performing something in the range of 3 million simple mathematical calculations or actions. This might sound like a lot, and indeed, in 1981 it was a fairly incredible amount of power to work with. In fact, if one were to compare the 16 bit, 3 MHz processor in the TI 99/4a against its contemporaries on paper, the TI would win every time.
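As a rough illustration of that on-paper advantage, multiplying clock speed by word size gives a crude ceiling on the data each machine could touch per second. These are naive numbers – real chips of the era spent many cycles per instruction – so treat this as a sketch, not a benchmark:

```python
# Naive back-of-the-envelope comparison of the TI 99/4a and the Commodore 64.
# Assumes one operation per clock cycle, which no real chip of the era
# actually sustained; this is only the "on paper" ceiling.

ti_99_4a = {"clock_hz": 3_000_000, "word_bits": 16}  # TI TMS9900, 3 MHz
c64      = {"clock_hz": 1_000_000, "word_bits": 8}   # MOS 6510, ~1 MHz

def raw_bits_per_second(cpu):
    """Clock speed times word size: a crude ceiling on data moved per second."""
    return cpu["clock_hz"] * cpu["word_bits"]

ratio = raw_bits_per_second(ti_99_4a) / raw_bits_per_second(c64)
print(f"TI-99/4a raw ceiling is {ratio:g}x the C64's")  # 6x
```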
So why does everyone know the Apple II or Commodore 64, but not the powerful, capable TI-99/4a?
In short, Texas Instruments got caught in the middle of a vicious battle between Apple and Commodore. Apple, at the higher end of the price options at the time, had a growing lock on the educational computer market. Commodore, on the other hand, engaged Texas Instruments and other computer manufacturers in the United States in a vicious and bloody price war. Even pricing the 99 on a level comparable to the then-inexpensive Commodore VIC-20 failed to stem the tide of red ink. When sales remained low, the price dropped from $299 to a below-cost $150, and eventually to $99 with the release of a revised, cheaper-to-manufacture version. In addition, Commodore sold products with extensive documentation, allowing any VIC owner to develop his own hardware or software for the machine with full access to the schematics and other necessary references. Texas Instruments, attempting to keep a firm lock on the full market for TI 99/4a compatible equipment and software, revealed very little information at the time of the system’s launch; schematics were not released until after the cancellation of the product line in 1983.
By the time I had my TI 99/4a, the system was long “dead”, and nothing new was being produced. After a few weeks of trying to find fun things to do with my new first computer, I stopped playing with it. And eventually I came to what will make up Part 2.
I’ve given this much thought, as much as I can without being intimately aware of the financial politics between Diamond Distributors, publishers, and local comic shops. And I think I have a possible solution.
The complexities of the comic industry largely involve five groups of “People Who Must Be Happy” with any approaches to digital comics.
So far, publishers have been very cautious in approaching digital comic distribution. DC Comics, for example, offers only a few issues of “current” material, typically items that will sell well at retail and thus run the lowest risk of cannibalizing sales for retailers. The vast majority of items offered on DC’s iPad-based distribution platform are books several months old and unlikely to be available in popular trade paperback form. In addition, most of the offerings are only $1.00 less expensive than the actual printed material.
When you’re dealing with a six month old back issue of something most shops have already tossed into the bargain boxes, that $1.99 price might even be more expensive than the “dead tree” product – if you can get one.
To keep publishers happy, you need to decrease the volume of piracy while increasing the availability of product.
To keep creators happy, you need to increase the volume of product sales while increasing the amount of royalties.
To keep distributors happy, you need to keep the supply chain intact and expanding.
To keep retailers happy, you need to protect the supply of product that stocks shelves every Wednesday.
To keep customers happy, you need to offer them increased value for the money, and in ways they want it.
When we boil all of the above information down, we come to a single overriding need to keep all five of the groups I listed previously happy. And I believe that a solution exists, if publishers are bold enough to take it on together.
So this is my proposed solution:
I suggest a joint venture between publishers and Diamond Distribution (though Diamond could be bypassed with a bit more effort on the side of the publishers) to create a dual distribution model for physical comics with added digital downloads.
It’s a simple thing to say “Buy dead tree, get digital free”, but a very difficult thing to contemplate when you’re trying to balance the needs of all five aforementioned groups. Yet, that’s exactly what I’m going to attempt here today.
There are two pillars supporting the approach I suggest tonight. First and foremost is the retail market.
When a person purchases a comic at retail, the clerk should be able to scan each individual item purchased into a computer connected to a database holding records of an account previously created for that customer. The scan would be a simple one – denoting the issue of the comic being purchased – and this would be transmitted along with the date of sale to a centrally managed server. This server could be managed by a joint venture formed by publishers, or even by Diamond itself.
This server could then use the central records to verify that the retailer is only scanning in copies of issues legitimately sold. If the number is exceeded by the retailer… well, the customer trying to get his digital copy of a book no longer granted to the retailer as a valid item is going to be a pretty unhappy customer.
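As a sketch of how that server-side check might work – every name and structure here is my own invention, not a description of any real system – the core logic is just a pair of counters per retailer and issue:

```python
# Hypothetical sketch of the central server's sanity check: a retailer may
# not grant more digital unlocks for an issue than the copies it was shipped.
# All names and structures are illustrative inventions, not a real system.

from collections import defaultdict

class UnlockServer:
    def __init__(self):
        self.copies_shipped = defaultdict(int)   # (retailer, issue) -> count
        self.unlocks_granted = defaultdict(int)  # (retailer, issue) -> count
        self.library = defaultdict(set)          # customer -> owned issues

    def record_shipment(self, retailer, issue, count):
        """Distributor reports how many copies the retailer legitimately received."""
        self.copies_shipped[(retailer, issue)] += count

    def scan_sale(self, retailer, issue, customer):
        """Called from the register at the point of sale. Returns True if the
        digital copy was granted, False if the retailer is over its count."""
        key = (retailer, issue)
        if self.unlocks_granted[key] >= self.copies_shipped[key]:
            return False  # more scans than legitimate copies: refuse the unlock
        self.unlocks_granted[key] += 1
        self.library[customer].add(issue)
        return True

server = UnlockServer()
server.record_shipment("DKs", "Batman #700", 2)
print(server.scan_sale("DKs", "Batman #700", "alice"))  # True
print(server.scan_sale("DKs", "Batman #700", "bob"))    # True
print(server.scan_sale("DKs", "Batman #700", "carol"))  # False: only 2 shipped
```

The third scan fails because the retailer only received two copies – which is exactly the unhappy-customer scenario described above, and the incentive for retailers to keep their counts honest.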
The upside to this first scenario is that the customer gets his comics on day of release, and knows he will have a digital copy for later reading. This would appeal to both collectors who might never crack the spine on a weekly issue, and to the avid readers who don’t want to sort through ten longboxes to re-read a story that suddenly has greater relevance or interest. The publisher continues to make money on sales, creators make money on royalties, distributors make money distributing, and retailers make money with happier customers. The only downside is that this leaves no room for digital-only sales, which is where our second scenario comes in.
For our second scenario, we look at the idea of digital sales – transactions where a comic is primarily distributed online in digital form. This is where the real risk-taking comes in.
My suggestion here is that digital comics be made available for purchase one or two weeks after arriving at retail, in addition to the copies unlocked with a retail purchase. If a comic hits shops on the first of the month, it would be unlocked for digital-only purchase on the fourteenth, priced $.50 below the cover price for those who might be unable or unwilling to take advantage of a retail opportunity.
In the case of the $2.49 digital comic, I would suggest using the “premium” over today’s $1.99 price to subsidize the expenses incurred by retailers and the distributor in setting up the hybrid physical/digital infrastructure. Apply the money to the operation of the authentication and distribution servers, and to reduce or even eliminate the cost for affiliated retailers to sign up users and authenticate purchased comics. This would also enable publishers and creators to turn a profit largely comparable to – if not greater than – that of physical issues.
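The rules of this second scenario are simple enough to express in a few lines. The $2.49 and $1.99 prices and the roughly two-week delay come from the proposal above; the $2.99 cover price and all of the naming are my own assumptions for illustration:

```python
# Sketch of the digital release rules from the proposal above. The digital
# prices and the delay come from the text; the $2.99 cover price and the
# function names are my own assumptions.

from datetime import date, timedelta

RETAIL_PRICE = 2.99         # assumed cover price of a standard print issue
DIGITAL_PRICE = 2.49        # fifty cents below cover...
LEGACY_DIGITAL = 1.99       # ...but a premium over today's digital price
DELAY = timedelta(days=13)  # on shelves the 1st -> unlocked the 14th

def digital_unlock_date(retail_release):
    """When a digital-only copy goes on sale, roughly two weeks after print."""
    return retail_release + DELAY

def infrastructure_subsidy():
    """The premium over today's price funds the auth/distribution servers."""
    return round(DIGITAL_PRICE - LEGACY_DIGITAL, 2)

print(digital_unlock_date(date(2011, 6, 1)))  # 2011-06-14
print(infrastructure_subsidy())               # 0.5
```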
I believe that this kind of approach, or one slightly modified, would best serve the needs of all those involved in the creation and sale of comics.
One thing (among many) I have not covered, is the format these digital comics should take. As a consumer, and someone well acquainted with the technology of DRM and format obsolescence (ask me about the Commodore 1541!) I would obviously prefer unprotected “.cbr” style archives. Were I a publisher, I’m fairly certain my thoughts would tend to run more toward ideas of “lockboxes” and “embedded keys”. I can only suggest that anyone considering implementing this kind of scheme look to the music market, and see where growth and innovation has led to the rise of the unrestricted MP3.
My thoughts on this idea are many, and they range from the simple “at the counter” implementation right on up to graphic novel pricing and authentication server structuring. However, I’m going to end this little rant for the moment. If anyone happens to read this and has further ideas or concerns, please let me know. I’m always up for a good discussion.
As those who know me are fully aware, I spend far too much money on comic books. Not enough to put anyone’s kids through college, but definitely enough to make a small dent in my local shop’s rent every month. Usually, my monthly expenses at the shop – DK’s Sierra Mountain Comics in Carson City – run between $125 and $175 over the course of a full month’s pull list. I don’t smoke, don’t drink, and don’t gamble, so I figure I’m allowed one good vice.
My personal comic poison of choice is the DC Universe, which in the eyes of the vast majority of people is the “one with Batman and Superman in it”. Eighty or ninety percent of what I spend money on every month goes to keeping up with the storylines, continuity, and characters I enjoy reading about in DC’s output every week. Writers like Gail Simone, Greg Rucka, Eric Trautmann, Garth Ennis, Grant Morrison, and the unstoppable Geoff Johns to name only a few, have made comics an art form that this 36 year old “pathetically aspiring” writer can really appreciate.
(And Dave, that comment above was sarcasm. Don’t chew me out for low self esteem about my writing again!)
Because of those writers, and the characters and stories they get to work with, I have no problem shelling out a couple video games worth of money every month in order to support creators I enjoy. However, there is a dark side to comics that’s raised a lot of discussion and debate over the past few years. Something that upsets creators, scares publishers, and very likely terrifies small comic book shop owners:
Illegally downloaded comics.
I’m going to go right out on a limb here and admit that yes – I have downloaded comics from the Internet. In fact, downloading copies of Dan Jolley’s (and later, Stuart Moore’s) run on the rebooted “Firestorm” comic from a few years back is what got me buying any comics at all. The incredibly frustrating combination of impatience, lack of money, and distance (100 miles) from the nearest comic shop led me to download what I couldn’t buy – which at the time was quite a lot.
There are a number of factors that I believe need to be understood before tackling the issue of downloaded comics, and it is my hope that I understand enough of them to put a solid proposal out here for someone in the industry to consider.
Before I put my suggestion on the table, let’s take a brief moment to try and summarize what happened to the industry as a whole to get where it is today.
In the 1980’s, the comic reader started to get older. I’m not qualified to say exactly why, but it was likely a combination of decreased retail availability and broader entertainment options for the “traditional” comic book buyer. Additionally, comics began to openly feature more mature themes than they had previously. The “Comics Code Authority” stamp given to most mainstream weekly comics became less of a requirement for publication and more of a dividing line between the mature and the youth-oriented.
In the 1990’s, following this market contraction, the speculators arrived.
Suddenly, comic books were collectibles – not things to be purchased, read and tossed in the back seat of Grandma’s car on a road trip to Spokane. For some publishers, quality became less of a concern than novelty or shock value. Foil covers, alternate printings, special sleeves, variants galore. The bigger companies, Marvel and DC, got in on the act with a vengeance. Smaller ones struggled to get anything they could onto shelves, whether or not it was actually good. Some small publishers, like Image, made bigger waves than even Marvel or DC at the time. But when the speculators finally came to their senses, the market collapsed, right along with dozens of publishers and distributors.
“Moichandising, where the real money from the movie is made.”
- Yogurt (Mel Brooks), Spaceballs
In the wake of this crash, the industry was left in a very difficult spot. Sales of new comics no longer held a high place on the balance sheets, if they ever truly did. The real money came from merchandising the “big” characters such as Superman, Wolverine, Batman, Wonder Woman. Accountants for publishers increasingly saw the weekly comic as a fairly inexpensive way to generate buzz around movies and merchandise.
But the worry about digital comics and illegal downloads is real.
Some publishers have income from merchandising, but most creators don’t. The artist on last week’s issue of Batman doesn’t see a dime from The Dark Knight. The writer who shepherded Superman through the mid 80’s sees nothing from sales of the original Christopher Reeve films. And your local comic book shop owner sees absolutely zero from the sale of Superman T-shirts at Wal-Mart.
So how does the industry respond to the threat of pirated material?
Please continue on to Part 2 in order to read my suggestion.
Today, an editorial I read on the technology news site Electronista (formerly MacNN) inspired me to begin thinking about what the future of tablet computing may hold. The editorial posits that Google may be its own biggest enemy in competing with Apple for tablet dominance, through having two operating systems in the fight.
First, and foremost in the minds of consumers and technology pundits, is the Android operating system used by many popular cell phones and so far unpopular tablet devices.
Second, Google is on the verge of releasing Chrome OS which I will describe in further detail later on in this article.
The primary thrust of the Electronista article is that Google, by having two incompatible operating systems in what is likely to be seen as a singular “tablet space” on the market, is more likely to be competing with itself than it is with Apple. In order to properly examine the claims made by Don Reisinger in his article, it is important to evaluate just where Android stands today and where Chrome OS is likely to stand in the foreseeable future.
Android OS is by far the simplest of Google’s options to compare with the existing feature set offered by Apple’s iOS based devices. As the vast majority of Android devices are cellular phones, the cleanest comparison is between phones such as the Motorola Droid and the iPhone 4. Each phone has its strengths and weaknesses, some of which are based on hardware, but most of which stem from differences in the operating system and developer support. The intent of this article is not to compare the systems feature-for-feature, but to simply state that Android and iOS belong in the same class of operating system – they are by and large fully featured, user-alterable systems that run standalone applications and interface with other devices.
A far less clear comparison arises when Chrome OS is included in the mix. Chrome OS is not a known quantity for many, though information about the system has been circulating among the tech-savvy since at least 2008, when its existence was largely a rumor. Officially begun only in 2009, development on Chrome OS took a far different focus than the work Google had already been doing on Android. In the eyes of many pundits and watchers, myself included, Chrome was going to be a direct competitor to Microsoft’s dominance of the desktop.
We couldn’t have been more wrong.
The structure of Chrome has been explained as a web-based system, leveraging a network connection and applications designed largely to be based on a remote server. With an interface designed for simplicity, and an underlying system with a heavier focus on security than user modification, Chrome is expected to be used on devices with steady network connections and a limited requirement for users to create content or applications. Additionally, official Chrome OS devices will only be sold by companies approved by Google, implementing features in ways that Google requires. Users will not be able to download Chrome OS and install it on their own hardware, a tactic that carries strong echoes of the Apple, Microsoft or RIM approaches to the tablet and phone OS markets.
Reisinger’s article posits that Google will, by offering both Chrome OS and Android to consumers, splinter the marketplace and cause serious problems for itself. He suggests that Google will struggle with growing both the Android and Chrome market shares, with consumers confused about which operating system will be the most appropriate. I see a problem as well, but I cannot simply agree with Reisinger’s reasoning or conclusion.
Android and Chrome OS may both be offered in tablet forms, but the core functionality will be quite different, and the first indicator pundits should have of this difference is in the name of Chrome OS itself. Chrome is the name of Google’s web browser, now on the market and competing with Internet Explorer, Firefox, and Safari. By choosing to call this new system Chrome OS, Google is being quite clear in telling potential buyers that this OS is intended to leverage the power of the web. I believe that this message will eventually be conveyed, and that Chrome OS devices will make quite a distinctive impact on the market in 2011 and beyond.
The future of Android is far less clear. Currently the darling of the “not an iPhone” market, the Android operating system has made huge gains. In the first half of 2010, Android devices actually outsold iOS-based models, largely on the strength of Verizon’s introduction of the Motorola Droid in late 2009. The Droid, marketed by Verizon as “doing what iDon’t,” struck a nerve with the Apple faithful while finally offering clear temptation to many users who could not, or would not, do business with AT&T or Apple. The Droid, released with the then-new Android 2.0 operating system, finally felt like a true market-ready product, unlike the far more limited devices sold previously with 1.x versions of Android. The 2.0 system was reasonably fast, reasonably stable, and reasonably good looking. When it was updated to 2.1 only three months later, prospects for the system looked even stronger.
Unfortunately, Android’s ugly past continued to get in the way.
Phone after phone continued to be released with older versions of Android. Mobile carriers started seeing older versions of Android as a viable option for cheaper handsets. Device manufacturers encouraged this mentality by offering cut-spec devices at lower prices. Through the use of older Android versions such as 1.5 and 1.6, these companies released a steady stream of underpowered and disappointing hardware. And because Google allows virtually any manufacturer to use Android in one form or another, the practice flourished.
The worst and most recent offender in this trend is Dell. Having toyed with phone development in the past, Dell finally released an Android-based handset only three days ago, on August 24th, 2010. Just shy of ten months after the release of Android 2.0, the Dell Aero shipped to consumers with a base operating system of 1.6.
In addition to providing devices with outdated versions of Android, manufacturers like Dell, HTC, and Motorola also tend to offer devices with customized user interfaces. As a result, one Android device may look almost entirely different from another, and both may appear almost completely distinct from a Google “base” system.
Most of this article has focused on the cellular phone market, and at the current time, that’s where the real fight is happening. The iPad occupies a space nearly entirely its own, rapidly chewing away at the netbook market. Where many pundits decried the iPad as a threat to the eBook reader market, projections for dedicated readers now point to heavy growth, based on low pricing that Apple will most likely never attempt to match.
Android, on the other hand, has been marketed primarily for mobile phones. One of Google’s few restrictions on Android use has been to limit installation of the Android Market application store to mobile phones only. This is most likely due to piracy controls and unique device identification, which would not typically be included in a tablet such as the iPad when manufactured by companies less concerned with such matters than Apple. As such, Android has yet to appear in a widely adopted tablet. Many inexpensive options exist, largely as Chinese iPad knockoffs, but these devices typically sport outdated versions of Android or manage to be unavailable for actual purchase.
Chrome OS appears to be ready to avoid those pitfalls. Though Google has released the source code as required by the GNU General Public License covering many of its Linux-based underpinnings, the resulting software, known as Chromium OS, is unlikely to see widespread adoption due to an expected lack of manufacturer support. Companies using Chrome OS will be required to get Google’s sign-off on new hardware releases, giving Google a much tighter grip on the quality and presentation of the Chrome OS brand.
Ironically, Chrome OS might also serve as a savior of sorts for Android. As Google develops services and features for Chrome OS, it is almost a certainty that those features will be rolled into future releases of Android. As Android matures and Chrome OS rises, fragmentation in the Android space will likely decrease as low-tier manufacturers focus on low-cost, Google-blessed Chrome OS devices rather than unsatisfying Android offerings. Running Chrome OS will take much of the support hassle away from device manufacturers and place the largest burden of design and development on Google’s shoulders.
Taking all of the above into account, it is difficult for me to make a clear prediction as to who will “win” the tablet wars to come. Apple will certainly sell millions of devices, and Google’s twin systems will find themselves in other millions. Android and Chrome OS may, combined, even grow to dwarf the installed user base of iOS systems over a multi-year period; with the trend in sales of Android devices, this is a far more likely event than many would have suspected only a year ago. I do believe it is fair to say that the rise of Chrome OS – provided that hardware is significantly less expensive than Apple’s iPad offering – and a truly connected way of delivering information, bode well for Google’s fortunes.
Disclaimer: I own both an Apple iPad and a Motorola Droid. My previous cellular phone was an iPhone 3G.
I spent several days trying to think of the best way to write up the events and experiences of my final days in Korea, and unfortunately never found an appropriate way to do so. There was an evening on the town with friends and a gracious host, at a bar called “Cocks” that had nothing to do with genitalia or poultry but plenty to do with darts: an excellent little spot on the second floor of a building in one of Seoul’s downtown districts.
I was uncomfortable.
I slept badly.
I got sick.
I would go back in a heartbeat.
The memories will last forever, and even though it felt so good standing on the tarmac at the Reno airport, knowing that I was going to be able to read everything again, I do miss the people, culture, and sights. James and I met a lot of really great people in Korea, and I hope to see them again someday.