Last week, Apple announced its latest creation: the iPad. For those who have been living under a rock, it is a handheld computer that looks like something straight out of Star Trek. The iPad is a 9.7″ multi-touch display backed by Apple’s custom silicon, the A4 processor. Depending on the model, it comes with 16GB, 32GB, or 64GB of flash storage and 802.11n Wi-Fi, with 3G available on half the models. There are six models in all, with prices ranging from $499 to $829.

There are two big questions to ask of this device. First, what does the iPad mean for the average consumer? Second, how does the iPad change the way we interact with computers? I’m more interested in the second question than the first, but since most people are more concerned with the first question, I’ll start there.

The first rule of buying Apple products is simple: wait for the second version. That is my only piece of advice for the average consumer regarding the iPad.

Apple has a history of disappointing early adopters, and there’s no reason to think the iPad will be any different. Despite Apple’s incredible design team, there are inevitably missing pieces that get corrected in the second versions of their products. The iPhone 3G was much better than the original iPhone. MacBook Air prices dropped dramatically in subsequent revisions. Even the iPod, an almost instantaneous success, improved greatly with each generation. I can’t say specifically what is missing from the iPad (I haven’t even used one), but it’s such a different piece of technology that some things certainly aren’t quite right. Here are a few potential examples:

  1. No web cam.
  2. Not enough storage.
  3. No USB ports.
  4. No HDMI ports.
  5. No multitasking for third-party apps.

Now, some of these “obvious” omissions may eventually prove to be brilliant design decisions. Remember: good design is more about leaving things out than cluttering your product with too many features. The problem for early adopters is that we don’t know yet. If you’re an average consumer, then I would hold off.

Of course, if you’re not an average consumer and you have a few hundred bucks to burn on something that might revolutionize the way you live, then the second question suddenly becomes important: how does the iPad change the way we interact with computers? This is an extremely difficult question to answer with any certainty, but answers to it boil down to two broad categories: (1) the iPad improves computing, and (2) the iPad is a setback for computing.

There are a few important ways the iPad improves computing. First, as a recovering hardware geek, I was most excited to see Apple use its own silicon in the iPad. It’s nothing revolutionary in terms of hardware, but it’s definitely not x86. I generally disfavor computing monocultures. It’s not always that simple (PDF), but the general principle holds up well because nothing innovative happens when everyone makes the same assumptions.

Second, I believe Apple is right that there’s a need for some kind of computing device between a smartphone and a laptop. I don’t think netbooks are the answer, simply because they are too similar to laptops, and that similarity narrows their utility. I can’t imagine reading an electronic book on a netbook, but I can imagine it working well on the iPad. I’m not sure the iPad is the answer, but the fact that it is distinctly not a netbook and distinctly not a smartphone is evidence that it’s headed in the right direction.

Third, Apple didn’t include Flash on the iPad. This is the most important argument in favor of the iPad improving computing: it’s a sign that openness is winning the web. Adobe Flash is a proprietary, closed-source product that requires a browser plugin to run. Unlike most of the web, you cannot see the source used to render the page you’re viewing if you’re on a Flash-based website. View Source is a good thing. Closed-source development for non-differentiating infrastructure is a bad thing.

I’m not saying that no one should ever produce closed-source content because it’s somehow inherently evil. I am saying that it’s a bad idea unless you know that whatever you’re spending money on will actually increase your net revenue relative to your competitors. It makes sense to ensure that whatever you spend resources to produce is an actual business differentiator, and for web-based technologies, Flash no longer is one. As John Gruber says:

Used to be you could argue that Flash, whatever its merits, delivered content to the entire audience you cared about. That’s no longer true, and Adobe’s Flash penetration is shrinking with each iPhone OS device Apple sells. [...] Developers go where the users are.

—John Gruber

Flash used to enhance the web experience by creating interfaces that weren’t otherwise possible, but open technologies have now basically caught up. YouTube and Vimeo were previously the quintessential Flash-based websites, yet both already offer HTML5-based video. Heck, there’s even an open-source Flash runtime written in JavaScript. (It’s called Gordon, as in Flash Gordon…)
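To make the comparison concrete, here’s roughly what serving video without Flash looks like. This is a minimal sketch, not markup copied from YouTube or Vimeo; the file names and dimensions are hypothetical placeholders.

```html
<!-- A minimal HTML5 video player: no plugin required, and the markup
     is plain, View-Source-able text. File names are hypothetical. -->
<video width="640" height="360" controls>
  <!-- The browser picks the first source it can decode. -->
  <source src="movie.mp4" type="video/mp4">
  <source src="movie.ogv" type="video/ogg">
  <!-- Fallback content for browsers without HTML5 video support
       (this is where a site could still embed a Flash player). -->
  Your browser does not support the video element.
</video>
```

The fallback content is the key design point: a site can adopt the open format today while older browsers degrade gracefully, which is exactly how YouTube and Vimeo can offer HTML5 video alongside Flash.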

Most of the geek backlash against the iPad focuses on things that are missing, like Flash, but the key argument geeks have made against it is that it has a closed app ecosystem. This is the critical way in which the iPad is a setback for computing. As Tim Lee points out, closed app ecosystems are top-down approaches that work against powerful economic forces favoring an open development environment.

It’s interesting that Apple got open development completely right with its more traditional computers. While Microsoft forced developers to pay for Visual Studio, Apple ensured that any developer who wanted the best available Mac development tools could get them by registering on its website and downloading them for free. This is still true for the iPad; the SDK is available now. The problem is in the installation process. On Apple’s more traditional computers, users can purchase and install applications from anywhere on the Internet. (The actual installation process is also much easier than on Windows: there’s no registry, and applications are almost always completely self-contained, installable by dragging and dropping them somewhere on your file system.) That freedom isn’t available on the iPhone and the iPad, which are stuck with the bottleneck of a top-down app store.

When I talk to non-techies about things like the iPad’s closed app setup, their response is usually something along these lines: “But I’m never going to build my own applications, so why do I care?” That’s a fair question, and there are a couple of important responses. First, users still care about the applications they use, and the closed app store model puts the decision about what’s available in the hands of Apple rather than the users. Even if you never build an application yourself, you still want the pool of developers who might build the next application you love to be as large as possible.

Second, the closed app model doesn’t just restrict applications; it also restricts how you can use your data. The iPad uses digital rights management (DRM) to ensure that the books, movies, and other content users enjoy have been legitimately purchased. In an ideal world, this would be a good thing; no one wants thieves to prosper (except perhaps the thieves…). The problem is that DRM doesn’t actually accomplish this in the real world. In fact, DRM breaks more than it fixes because it restricts the rights of legitimate users. It enables censorship, limits free-market competition, and even allows Apple to delete content off your device without notice. Don’t think something like that could happen? Think again. For these reasons, Defective by Design has a petition against the DRM restrictions on the iPad. If you’re interested in more on the perils of DRM, I recommend the more than 200 excellent posts on the topic by the folks at Freedom to Tinker.

So what’s the overall verdict? One thing we know for sure is that the iPad is distinctly different. Consider this quote:

“The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.”

—George Bernard Shaw

Steve Jobs is a famously unreasonable man, and as a result, Apple has had some mega-hits, like the iPod, and some mega-flops, like the Lisa. The iPad is destined to be one or the other, and that’s far better than just another boring computer gadget.

We also know that the iPad influences two separate platforms: an open web platform and a closed hardware platform. If you think the iPad’s open web aspects matter more than its closed hardware, then it’s a great development for open technologies. If you hold the reverse opinion, then the iPad is a bad development for open technologies. Currently, I’m leaning towards the latter, but that’s more of a prediction than a settled opinion. We may have to wait for the second version of the iPad before we really know how this device will affect computing.