UPDATE 2/20/2015: This build failed after about 15 months, due to extensive drive failure. By extensive, I mean there were a total of 9 drive replacements before three drives gave out over a weekend. This correlates closely with data recently published by Backblaze, which suggested 3 TB Seagate drives are exceptionally prone to failure. I've replaced these with 6 HGST Deskstar NAS 4TB drives, which are rated highly and better suited for NAS environments.

For many years, I've had a lot of hard drives being used for data storage. Movies, TV shows, music, apps, games, backups, documents, and other data have been moved between hard drives and stored in inconsistent places. This has always been the cheap and easy approach, but it has never been really satisfying. And with little to no redundancy, I've suffered a non-trivial amount of data loss as drives die and files get lost. Now, I'm not alone in having this problem, and others have figured out ways of solving it. One of the most interesting has been in the form of a computer dedicated to one thing: storing data, and lots of it. These computers are called network-attached storage, or NAS, computers. A NAS is a specialized computer that has lots of hard drives, a fast connection to the local network, and...that's about it. It doesn't need a high-end graphics card, or a 20-inch monitor, or other things we typically associate with computers. It just sits on the network and quietly serves and stores files. There are off-the-shelf boxes you can buy to do this, such as machines made by Synology or Drobo, or you can assemble one yourself for the job.

I've been considering making a NAS for myself for over a year, but kept putting it off due to expense and difficulty. But a short time ago, I finally pulled the trigger on a custom-assembled machine for storing data. Lots of it; almost 11 terabytes of storage, in fact. This machine is made up of 6 hard drives, and is capable of withstanding the failure of two of them without losing a single file. If any drives do fail, I can replace them and keep on working. And these 11 terabytes act as one giant hard drive, not as 6 independent ones that have to be organized separately. It's an investment in my storage needs that should grow as I need it to, and last several years.

Building a NAS took a lot of research, and other people have been equally interested in building their own NAS storage system, so I have condensed what I learned and built into this post. Doing this yourself is not for the faint of heart; it took at least 12 hours of work to assemble and set up the NAS to my needs, and required a working knowledge of UNIX to build what I wanted. This post walks through a lot of that, but still requires skill in system administration (and no, I probably won't be able to help you figure out why your system is not working). If you've never run your own server before, you may find this to be too overwhelming, and would be better served by an off-the-shelf NAS solution. However, building the machine yourself is far more flexible and powerful, and offers some really useful automation and service-level tools that turn it from a dumb hard drive into an integral part of your data and media workflows.

Before we begin, I'd like to talk about the concepts and terminology to be discussed as part of the assembly. Feel free to skip this section if you already understand RAID, ZFS, and computer assembly.

Data Storage for Newbies

At its core, a NAS is just a computer with a number of hard drives in it. Its only purpose is to store and load data, and make all that stuff available over the network. Since all it's ever doing is holding on to lots of data, you typically don't need a lot of the things that you'd put into a normal computer; stuff like a graphics card, keyboard, mouse, and monitor aren't needed very much. You instead buy parts that focus on a few key areas: the number of hard drives you can connect, and how fast you can get data in and out. In this case, you need these parts:

  • a motherboard
  • a CPU
  • some RAM
  • a bunch of hard drives
  • a power supply
  • a case to put everything inside of

Your laptop has a hard drive in it. If you've ever plugged in an external drive or a Flash drive, you've seen that they're two separate places for you to store stuff. If one of them fails, you lose all of the data on it, but it doesn't affect the data on your other drives. And you have to organize everything yourself. Trying to scale up to 4 or 6 or 10 drives sounds like a disaster. What we really want is for all of those drives to pretend they're one giant hard drive. And we'd like to survive a hard drive dying without losing data.

There's a tool for this, and it's called RAID, or "redundant array of independent disks". RAID is a set of technologies that takes multiple hard drives, called an array, and combines them under the hood to make them look and act like one giant hard drive. The way this works is complicated, but the basic idea is that RAID takes a file, chops it up into little pieces, and spreads them out across all your hard drives. Then, when you want the file, RAID will grab all those pieces from each hard drive and combine them back into the original file. (Please note: this is an overly simplified discussion of the technology, and is not technically accurate, but is adequate for our purposes of conceptualizing.) There are different strategies called "RAID levels" you can use that will change the specific behavior; some are more focused on redundancy, some are focused on speed.

The benefits you get with most RAID levels are: a bunch of hard drives that look like one storage place, improved speed when reading/writing data, the ability to survive a drive failing, and the ability to replace a dead drive with a new one. However, the downside is potentially a big one. Because the files are never stored as a whole on one drive, if you lose enough drives at once and don't replace them in time, you lose all the data, even on drives that haven't failed. Depending on your RAID level, you can survive zero, one, two, three, or more drives failing. But the more dead drives you want to be able to withstand, the more storage of those drives gets used for redundant data. So it's a balance of how much storage you want vs. how much protection you want from dying drives. You can calculate how much storage you'll have based on how many drives you buy using a RAID calculator. A healthy minimum is that for every 3 drives you buy, you want to be able to withstand one failing. So 2 or 3 drives should withstand 1 drive failing, 4-6 drives should withstand 2 failing, 7-9 should withstand 3, etc.

For this build, I set up my array as a form of RAID called RAID-Z2. RAID-Z and RAID-Z2 are based on a technology called ZFS, which is a modern file system that supports "storage pools". This gives us the "make a bunch of hard drives act like one giant hard drive" behavior, which RAID-Z builds on to give us the "survive a hard drive failure" behavior we want. RAID-Z lets you survive one drive failure, RAID-Z2 lets you survive two, RAID-Z3 lets you survive three. The major downside to RAID-Z is that it requires all data to be processed by the CPU, so you'll want a reasonably fast processor. The more drives you add, the more work the CPU has to do.
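To make this concrete, here is roughly what creating a RAID-Z2 pool looks like from a FreeBSD shell. This is only a sketch: the pool name and the disk device names (da0 through da5) are placeholders, and FreeNAS performs this step for you through its web UI, as described later.

```sh
# Create one RAID-Z2 pool ("tank") out of six disks.
# Any two of the six member disks can fail without losing data.
zpool create tank raidz2 da0 da1 da2 da3 da4 da5

# Check the health and layout of the new pool.
zpool status tank
```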

Building the Computer

The part that was the most daunting for me to overcome was actually purchasing the pieces necessary to build the computer. I'm a software guy who's owned Macs all my life, so I've never actually assembled a computer before (I will take this opportunity to let all the nerds out there get a good laugh in before we move on). If the idea of building your own computer is scary, you may want to just go buy an off-the-shelf NAS, such as the Synology DS413j and stop reading. Keep in mind, though, that a preassembled NAS will be more expensive and far less flexible than building one yourself.

After waffling on this for months, I finally decided to go with the custom build approach. I figured I could make it cheaper, quieter, and run whatever services I wanted directly on the machine by building it myself. After putting some pieces together, here are the parts I went with. Prices are as of September 30, 2012. All Amazon links are affiliate links, so I get a tiny kickback. Feel free to search for the part names if you wish; you may be able to find these parts cheaper elsewhere on the Internet.

A few notes about this hardware configuration:

  • The case has 6 hard drive slots, so you can put up to 6 drives in it. You can, of course, put fewer in it.
  • The motherboard has 6 SATA ports, but only two are 6 Gbps, while the others are 3 Gbps.
  • The power supply has 5 SATA connections, so if you want to run 6 drives, you'll need a Molex to SATA power adapter.
  • Besides the Molex adapter, the parts mentioned include all the cables necessary for internal setup, but you will need your own power cable.
  • The motherboard includes some onboard graphics, and you'll want to have a DVI monitor available for making sure the machine is booting correctly. You won't need to keep it plugged in beyond setup, however.
  • RAM is cheap, and if you're accessing the same files over and over, they can stay cached in RAM, which is even faster than loading from disk. It's better not to skimp on this. Just make sure your CPU is 64-bit.
  • There's no Wi-Fi here, so you'll either need to get a wireless card or (ideally) plug an Ethernet cable into it connected to your network.

Installing the OS

For the operating system, I decided to use FreeNAS 8.2, a distro of FreeBSD that is designed to run ZFS-based RAID systems. It includes a web-based administration tool that lets you set up the array, monitor it, set up tests to detect failing drives, run services and plugins, and lots of other stuff. To run this, I copied it to a USB key (at least 2 GB necessary, you probably want 4 GB) and just leave that plugged in to the back of the machine all the time. Once you copy the image onto the key, you set the default boot drive to the USB key, and it will boot to it each time. You will also need a keyboard (and note, Apple's keyboards will not work with this setup, so have a USB or even a PS/2 Windows keyboard) to get into the BIOS settings. After you have the BIOS auto-boot set up, when you turn the computer on, it'll take a minute or two to set everything up, and then the web admin will be available on your local network. If you have a router that can tell you what's connected, you can get the IP there; otherwise, plug a monitor into the motherboard and it'll tell you the IP. If your router supports it, you should grab the MAC address and assign it to a static IP on your network so that your NAS is always available on the same IP address. Once this is all running automatically, you can disconnect the monitor and keyboard and just run the machine headless.
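For reference, here's roughly how the image gets written to a USB key from a Mac. The image filename and disk device below are illustrative, so double-check which device is your key with `diskutil list` before running anything; `dd` will happily overwrite the wrong disk.

```sh
# Find the USB key's device node (e.g. /dev/disk2) -- check the size carefully!
diskutil list

# Unmount (not eject) the key, then write the FreeNAS image to the raw device.
diskutil unmountDisk /dev/disk2
sudo dd if=FreeNAS-8.2.0-RELEASE-x64.img of=/dev/rdisk2 bs=64k
```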

The web admin is divided into a few sections. Along the top are the sections/actions that are most commonly used: System, Network, Storage, Sharing, Services, Account, Help, Alert Status, and Log Out. The absolute first thing you should do is click the Account button and change the username and password for the admin account (which you were logged into automatically). Once this is set, nobody will be able to log in to the web admin without these credentials, or without physical access to the machine (as you can disable the login from the console if you have a monitor/keyboard attached). You'll also want to click the Users tab in that section and create a user for yourself for connecting to the array. Make sure it's in the group "wheel", at the very least.

Once you have that out of the way, you can set up your storage array and actually get those hard drives to do something. Click Storage at the top to view the Active Volumes, which is empty, as we haven't set any up yet. Set one up by clicking the Volume Manager button; give the volume a name (I just called mine "Main"), select all the disks from your list, choose ZFS, then choose your RAID-Z level. Click Add, and after some processing, you'll have a giant hard drive. The amount of storage will be considerably less than the sum capacity of the hard drives you put in, as it is reporting the capacity left after setting aside space for the parity data it will eventually be storing. In my case, the 6x3TB drives have about 16.3 TB of raw capacity, but after the parity data in RAID-Z2 is accounted for, only 10.7 TB is available. Note: If you added 6 drives to the array, you should see 6 drives in the list when creating the volume; if you don't, you probably didn't connect something correctly inside the machine. Make sure you set the permissions on this new volume so your user can access it, and do this recursively.
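If you want to double-check the pool from the shell (via the SSH service described later), a couple of standard ZFS commands will confirm the layout and the usable space. The pool name matches the one I created above, and the arithmetic in the comments is just a rough sanity check of the reported capacity.

```sh
# Show the raidz2 vdev, its six member disks, and overall health.
zpool status Main

# Raw pool size, parity included: roughly 6 x 3 TB.
zpool list Main

# Usable space after parity: roughly (6 - 2) x 3 TB = 12 TB, or about 10.9 TiB,
# which lines up with the ~10.7 TB FreeNAS reports once metadata overhead is counted.
zfs list Main
```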

ZFS has a cool feature called "datasets". A dataset is essentially a folder with special rules attached, such as how big it can grow. You can set a quota, which is the maximum size the folder can grow to, and a reserved space amount, which (as the name implies) reserves a certain amount of space for use in that folder. You can customize permissions on these separately from the whole array. You can set compression levels based on whether you're more concerned with speed or space. All of these values can be changed later. You can also ignore all of this, and just use datasets for organization. So, for example, I have two primary datasets (there's a shell sketch of creating them after this list):

  • Media, which has no quota or reserved space, permissions set so that anyone can read but only I can write, and no compression so it can stream fast, and
  • Backups, for Time Machine, which has the maximum level of compression (as read/write speed doesn't matter), no access to anyone except my user, and a quota of 500 GB
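Purely as an illustration of the dataset properties described above, this is approximately what those two datasets look like when created from the shell. FreeNAS exposes the same options in its web UI, and the specific values here (gzip-9 compression, the 500 GB quota) are my choices, not requirements.

```sh
# Media: no compression, no quota (favor streaming speed).
zfs create -o compression=off Main/Media

# Backups: heaviest compression (read/write speed doesn't matter) and a 500 GB quota.
zfs create -o compression=gzip-9 -o quota=500G Main/Backups

# Confirm the properties took effect.
zfs get compression,quota Main/Media Main/Backups
```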

Actually Getting Data In/Out

So now I have a ZFS volume running RAID-Z2, /mnt/Main, which has two datasets, /mnt/Main/Media and /mnt/Main/Backups. Now we need to actually make them available for use by other computers. To do this, we set up Shares. FreeNAS has three different types of shares - AFP (for Macs), CIFS (for Windows, also known as SMB or Samba), and NFS (for Unix/Linux/FreeBSD). For our purposes, I will be setting up two AFP shares, one for each of the two datasets.

Shares are a type of Service, which is a program that FreeNAS will run automatically for you. Besides Shares, FreeNAS has services for things like FTP, LDAP, Rsync, SSH, UPS integration, and plugins. At the top of the admin UI, click Services, and click the On/Off switch next to the AFP service to start it up. Feel free to turn on whatever else you like (except Plugins, which will not quite work out of the box, but I'll discuss Plugins at greater length below). You may be prompted for settings before a given service will start.

Now you can create your Shares. Click the Sharing tab at the top, and make sure "Apple (AFP)" is selected. Click the "Add Apple (AFP) Share" button, and you'll be prompted with a daunting form. You can leave some of the more confusing fields as their default. The fields you really need to worry about are:

  • Name, the displayed name of the share
  • Path, where you want the share to point
  • Share password, if you want to set a password
  • Allow/Deny list and Read-Only/Read-Write Access, to control who can do what on the share
  • Disk Discovery, which will allow the share to be seen if you just ask the server for a list of shares
  • Disk Discovery Mode, which will let you toggle between a normal Finder share and a Time Machine backup share
  • Permissions, which let you control who can read, write, and run programs on the share

Once you have this in place, click OK, and you'll have created the Share. If you enabled Disk Discovery mode, your NAS should appear in the Finder's sidebar. If you did not, you can connect to it by selecting "Connect To Server" from the Go menu in the Finder (⌘K), and typing afp://NAS_IP/SHARE_NAME and filling in the NAS_IP and SHARE_NAME as appropriate. Authenticate if you set it up, and you should be connected. Then you can drag stuff from your hard drive into the share and it will copy over. You can also use cp from the Terminal to copy data.

When I tried setting this up originally, I got permissions errors while doing this. My rules for setting the permissions up are:

  • Make sure the user you want to have read/write access is in both the allow list and the read-write access list
  • If you want read-only access available to everyone, add @nobody to the allow list and the read-only list
  • Set all file/directory permissions to on, with the exception of "other/write".
  • Set the owner of the ZFS dataset to your user, and set all the permissions there to on, with the exception of "other/write".

To test the permissions on the ZFS dataset, the easiest thing to do is enable the SSH service, SSH into the machine with your user account, cd into the dataset, and try to touch a file. If it fails, you can't write. If it does work, cat the file; if it fails, you can't read. If that succeeds, but trying to connect via AFP doesn't let you read/write files, the error is on the AFP share permissions.
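Here's what that quick permission test looks like in practice. The IP address, username, and dataset path are placeholders for whatever you set up.

```sh
# SSH in as the user you created (not root).
ssh myuser@192.168.1.50

# Try to write and then read a file in the dataset.
cd /mnt/Main/Media
touch permission-test.txt   # fails -> this user can't write here
cat permission-test.txt     # fails -> this user can't read here
rm permission-test.txt      # clean up the test file
```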

Keeping Your NAS Healthy

If you have a system dedicated to making sure your data is reliably accessible, you want to know sooner rather than later if you're going to have hard drive problems. FreeNAS includes support for S.M.A.R.T., a drive self-testing system that can determine whether drives are behaving abnormally (higher temperature, higher error rates when reading data, lower throughput, etc.). The results can then be emailed to you, on a schedule you decide, for your analysis. These tests are not run on the array as a whole, but rather on the individual disks within the array. They can be created and found on the sidebar, under System > S.M.A.R.T. Tests.

I rely primarily on the "short" S.M.A.R.T. test which runs once a day, and occasionally a "long" test which runs manually when I won't need the array for awhile. The short test scans electrical circuits and selected parts of the disk for errors, and these tests take only a couple of minutes. The long test scans every bit on the drive for failures; this takes a very long time, especially on high capacity disks, so it should be run infrequently. There's also a "conveyance" test, which is useful to run before/after moving the drives, to determine if they were damaged during transport. Set these up at your preference.

The easiest way to see this data is to have it emailed to you. Test reports are sent to the email address associated with the root user. To change this, select Account > Users > View Users from the sidebar. In the list that appears, the root user will be at the top of the second list. The last button lets you change the email address, so set this to your email address. You then have to tell FreeNAS how to connect to an SMTP server with an account. You can use Gmail or iCloud for this. On the sidebar, select System > Settings and choose the Email tab. Fill out the fields as appropriate for your mail server. Once this is in place, you can send a test email. If you get it, you're all set up, and your S.M.A.R.T. tests will send their results to you when they run.

Extending with Plugins

Note: This is a more advanced topic, and to make this work you'll need an understanding of how SSH and shell access works, which is beyond the scope of this post.

FreeNAS 8.2 introduced a plugin system based on FreeBSD jails, which are sandboxed environments for programs to run in. Plugins are like other services that run automatically in the background, but instead of being services for managing the array themselves, they are apps that you might want to run directly on your storage array. As they are sandboxed, they will only be able to write to specific folders in your array. A number of services have been ported to the FreeNAS plugin format, and you can use these to extend your array's functionality and provide even more utility. I'll demonstrate how to set up Transmission, the BitTorrent client, to run natively on your NAS. You can find other plugins on the FreeNAS Forums, or even make them yourself if the app has been ported to FreeBSD.

To begin, we need a place on the array to store plugins, and to store the Jail. Create two ZFS datasets for this (I call them "Jail" and "Plugins"). You'll rarely need to go in here manually, but the plugin system needs a place for this stuff to live. All FreeNAS plugins are .pbi files, and in fact the service that runs the plugins is itself a pbi file, which is not installed by default. Once you have your datasets set up, go to the Services tab, and click the settings icon next to the Plugins service. There are three steps to the installation. First, it needs a temporary place to store the plugin while it installs (this will be the root of your ZFS volume). Next, it needs to know the path to your dataset for your jail and plugins folder, as well as the IP address you're going to use as the jail's IP (make this something unique, out of your DHCP range). Finally, it needs the plugin service PBI that is appropriate for the version of FreeNAS you're using and the architecture of your CPU.

If it installed successfully, you can then install plugins. Near the top is a tab called "Plugins". Here you can upload the pbi for whatever plugin you like. On the page where you downloaded the plugin service PBI, you can also download the pbi for Transmission. Download it from the site and upload it to your NAS. You'll have to set up the parameters before you can turn it on. Make note of the Download directory you specify, as we'll need it later (but you can leave it as the default). Then, you can turn it on and access it by going to http://JAIL_IP:9091/ in your browser.

Now, before we go on a download spree, we need to understand where those files will end up. They go into the Download directory specified in the settings, which for me was /usr/pbi/transmission-amd64/etc/transmission/home/Downloads. But there's a catch: since this is in a FreeBSD jail, that path is relative to the jail root, which is itself part of your array. Now, you can access that folder, but you probably will want to set up a nicer path for it, that doesn't go through your jail.

That's where Mount Points come in. A Mount Point is a way of making a folder outside your jail available inside of it. So you can set up a Downloads dataset at /mnt/Main/Downloads, and establish a Mount Point from that to the Transmission download folder, and suddenly everything Transmission downloads will appear in /mnt/Main/Downloads, even though Transmission itself is jailed. In the Plugins tab of Services, there is a "View Mount Points" button. When you add a mount point, it asks you for the source and destination you want to connect. So for the case above, we need a mount point that looks like this:

  • Source: /mnt/Main/Downloads
  • Destination: /mnt/Main/Plugins/Jail/usr/pbi/transmission-amd64/etc/transmission/home/Downloads

Once this is set up, turn it on, and it will just start writing data from the Transmission downloads folder into your Downloads dataset. You may have to fiddle with permissions; I found I had to make the folder within the jail writable by the user that was running the Transmission process. To enter a jail, SSH in to the NAS box as a user in the wheel group, su root, and run jexec 1 csh. To exit, just exit.
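For reference, the jail dance plus the permission fix I ended up doing looked roughly like this. The jail ID, username, IP address, and download path are specific to my setup, so treat them as placeholders.

```sh
# From your Mac: SSH to the NAS as a user in the wheel group, then become root.
ssh myuser@192.168.1.50
su root

# List the running jails, then enter the first one with a csh shell.
jls
jexec 1 csh

# Inside the jail: let the transmission user write to its download folder.
chown -R transmission /usr/pbi/transmission-amd64/etc/transmission/home/Downloads

# Leave the jail (and then the SSH session) with exit.
exit
```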

Result

The machine, named [Holocron](http://starwars.wikia.com/wiki/Holocron), sitting in its new home next to my media center. Given the level of nerdy this project entails, Twilight Sparkle is appropriate here. Forgive the wiring and other clutter; that's one of my next projects.

The case was larger than I expected, but not too large. It's about as tall and deep as my media center, so it sits nicely next to it (which is handy as that's where my Internet switch is). The case looks great, with off-black on all sides and no obnoxious branding on the front, and has some convenient USB ports on top. The only problem with the front is that the case has a power button on the top with a REALLY BRIGHT BLUE LED indicating that the machine is on; I would love to figure out a way to turn that off (or at least knock the brightness down). But the real win here is that the case is very quiet. It has noise-insulating material on the walls, which knocks down the sound, and the hard drive trays have rubber grommets on the screw holes, which helps quiet the spinning of the hard drives. The case emits so little sound that, even with 6 hard drives and fans, the entire thing is less noisy than a single Western Digital MyBook (and I had 5 of those to replace). It blew away my expectations for noise.

The machine is quite fast. It handles reading and writing stuff like a champ, downloading and streaming at the same time with no problems. It's been running for weeks at a time with no uptime issues. Even with 7 plugin services running, it has all run very, very smoothly. I've run into one or two bugs in the FreeNAS web admin UI, mostly happening when you try to save an options form that includes a permissions field (when you aren't actually changing permissions). When this happens, a manual reboot of the machine fixes the problem, and since you control when that reboot happens, you can wind down any active connections first. But you really shouldn't have to change those forms once they're set up, so this is a setup annoyance more than anything.

The permissions on the system remain the biggest single headache. I've definitely spent most of my time struggling to make sense of the permission model, which gets more complicated and difficult to track down when you introduce Shares and Mount Points into the mix. But once you have it figured out, you can build in the permissions you want to offer and it will stick. You can also SSH in to the system to see the permissions at the UNIX level, which is helpful if you're familiar with the shell.

The second biggest headache has been learning FreeBSD, which is starkly different from Linux or Mac OS X. There have been several times where I'll do some muscle-memory shell command, like sudo su transmission, and it will fail because FreeBSD does things a little differently (in this case, I've been doing su root followed by su transmission). These are probably just configuration differences, and there are ways to get it to do what I want, but it's not a big deal.

However, nits aside, once this system is running, it's providing a ton of value. As someone who has always cobbled together storage based on what I had and what was the easiest to get set up, this definitely took more discipline to configure and get working properly, but the investment is paying off hugely. Since everything is pooled together, I have more incentive to keep it organized and optimized for how I want to use it. The assumptions I set up for myself and through the plugins mean everything works as I want and everything ends up where I need it to be. The extra effort makes it a more useful system.

Building a NAS is not for the cheap or faint of heart. It requires money, time, and effort to build into a great storage system. It is also not a panacea for storage; you still want to back up critical stuff onto a different drive, ideally offsite or in the cloud, and you still need to worry about drives failing. But if you put that energy in, you'll end up with an indispensable tool that will be more reliable and more powerful than a glued-together system of disparate components and wonky services. It's an investment that I'm hoping will pay off for a number of years.

 

Smartphones have replaced lots of types of small devices. iOS and Android have made it easy to build apps that perform all kinds of functions, replacing other standalone devices like media players and GPS units. Many have wondered whether they would replace handheld gaming devices, and for many people they have. For a while, I thought they had, at least for my needs. But after trying to play games on touchscreen-only devices for years, I've largely come away unenthused; the deeper and more engaging games that come from big studios require a level of precision control that touchscreens just can't deliver.

The PS Vita caught my attention about a month before its launch in the US. It combines a lot of the best features of smartphones with the controls of console games. It has a gorgeous, large, high-resolution touchscreen (and a back panel that is touch-sensitive), as well as a tilt sensor and cameras for augmented reality games. But it also has almost all of the buttons of a typical PS3 controller, including two analog sticks. Sony managed to cram all of this functionality into a device that, while large, is not too big to fit into my pocket, and with long enough battery life for a busy day interspersed with some gaming. The combination of apps and games (which I will describe as just "apps" for the sake of this review) is powerful, and the hardware power and display size make it a compelling device.

Hardware

Put simply, the Vita is a delight to look at. Its black and silver case is easy on the eyes, and falls away while playing games. The device itself is almost entirely plastic, which does make it feel a little bit cheaper than the iPhone, but it's still quite comfortable to hold, at least when you don't have to use the back touch sensor (more on this below). The physical controls are small, but placed well; I have no difficulty moving my hands between the buttons and the analog sticks for the kind of twitch gaming that hardware buttons excel at.

The display is stunning to look at; at 220 DPI, pixels are almost never noticeable, and the color depth and contrast provide some incredible graphics. The pixel density is not as tight as the newest iPhones or Android phones, but it's just not an issue. The screen is a multitouch display with amazingly low latency; it feels ever-so-slightly faster to swipe something than on the iPhone (which may or may not be real, but it's at least as good). There are some minor issues with the display. It seems prone to banding in a few cases (an issue where a smooth transition between two colors appears as stripes, or bands, on the display). And the graphics, while high-resolution, occasionally show some slight jagged edges, especially in the OS UI. These issues are tiny, though, and aren't hugely apparent in gameplay.

Spanning the back of the device is a touch sensor which can be used for controlling games; it's as responsive as the front, but is almost too large. There are grips for your fingers, but these grips are too small for my hands. If I need to use them for a game which relies on the back touch sensor, I have to grip the device somewhat awkwardly. It's not uncomfortable, but it does make me worry a bit that I will drop the device due to a loose grip (a problem that has never actually happened in use).

There are a number of input ports on the device. Along the top are two trays, one containing an accessory port, and one for inserting the tiny game cartridges. These trays have a plastic cap that I found incredibly difficult and frustrating to open with just my hands, which will probably limit how many physical games I end up buying versus downloading through the store. Along the bottom is a proprietary "multi-use" port similar to Apple's dock connector, a headphone jack, and a memory card slot. The memory card slot is the only covered port along the bottom, and it is thankfully far easier to open than the ports on top. I have to wonder if this was a conscious decision by Sony to encourage purchasing games over the Internet; make the old-style games hard to swap, but make the memory card (which you can store downloaded games on) easy to swap, and people will tend to buy more online. There are also front and rear cameras; these take terrible photos/videos, and are basically useless for anything other than augmented reality games, which are actually really interesting (more on that in the Games section below). But it's not like you're buying this to replace a camera anyway.

The usual wireless technologies are here. Wi-Fi worked pretty well and generally connected automatically to 802.11b/g/n networks. I paired my Sennheiser MM 100 Bluetooth headphones to the Vita and they sounded great. You can also get a version of the Vita with 3G data. I ran into several issues with these in everyday use. While I did not get the 3G model, 3G is limited to 20 MB downloads (so basically no games), and multiplayer games cannot be played over 3G. It's basically useful for messaging and browsing, and that's about it. If you have a smartphone with tethering, it's probably best to just stick with that. And while the Vita had no issue auto-connecting to the Wi-Fi at my home and my office, it didn't seem to want to connect to my iPhone's tethering Wi-Fi until I went into the settings app and turned it on. Similarly, the Vita had no end of trouble automatically connecting to my Bluetooth headphones, leading to a similar jump through the settings app. Hopefully these are 1.0 issues that will be resolved with software updates, but it limits their use when you only have 10 minutes to play a quick game.

OS

The system OS is pretty well thought out in terms of interaction, though it has some rough edges. It's completely controlled by tapping and gestures on the touchscreen; none of the buttons do anything. You can either use your index finger, or the combination of both thumbs, to access every pixel on screen, and all the gestures are usable by just a single thumb. This might be an issue if you have smaller hands, but I have no issues with it. Navigation around the home screen is more fluid than any touch device I've ever used, and animates at very high framerate (probably 60 FPS).

Your apps are listed on the leftmost screen, stored on multiple pages you can access by swiping vertically. You can organize them as you would on a smartphone, and assign different wallpapers to each page. Tapping on any icon opens its LiveArea, which you can use to then launch the game. This is one thing I rather dislike about the Vita OS, as it requires two taps to open anything.

To the right of the app pages, you can find the list of recently running apps. Each shows what Sony calls a "LiveArea", a nearly full-screen page showing information about the game, some meta controls, and recent activity about the game (when you last played, recent Trophies you've gotten, etc.). App developers can place stuff on the LiveArea, such as announcements and links to downloadable content. The system also shows some common controls for apps, like an update button, a button to do a web search for the game name, and on-device instruction manuals for the games. You can close any of the LiveAreas just by swiping from the upper right to the bottom left, with a nice paper effect of throwing the page away.

The graphical style of the OS is not great, but it's livable. App icons are glossy bubbles on the home screen, which looks kind of cheesy. As far as I can tell, the Vita doesn't use anti-aliasing (at least on the home screen), which causes the round bubbles to appear extremely jaggy. If the display were higher resolution, this might work, but it just isn't quite high enough to warrant eliminating anti-aliasing. The LiveAreas look nice, but some of the stock apps use this to excess with bright, conflicting colors that just look under-designed. But it works, and it's intuitive.

The interaction between software and the OS is generally pretty great. When inside an app, the OS disappears except for a few interactions (loading/saving data, for example). Some popups will appear occasionally, such as when you unlock a Trophy or a friend comes online, in the upper right corner. At any time you can press the PlayStation button to suspend the app and return to its LiveArea; you can then switch to a few of the other apps, such as Settings or the Twitter app, do something, and return to the app in the exact same state. Unfortunately you can only have one app open at a time, which can be annoying (specifically for the Browser app). But this doesn't get in the way all too often.

Apps

The Vita comes with a handful of stock apps, none of which are particularly great, but they get the job done. I haven't gotten to play in-depth with all of them, primarily as this is a gaming device first. The Friends app lists your PSN friends and who is online, but has a lot of whitespace, leaving you to see only 6 people onscreen at a time. The Messaging app is handy for chatting with your friends, and requires no setup other than your PSN account, which is convenient. Maps is pretty capable, using your geolocation to show you places and driving/walking directions, and storing favorites (but has no public transit, which is a dealbreaker for me personally). The Browser is okay, and can view basic pages, but anything taking advantage of newer HTML5 features will probably not render well. Hopefully these are 1.0 issues that will be improved with system updates.

As of this writing there are four apps you can download from the PS Store - LiveTweet, Facebook, Flickr, and Netflix. I could not get the Facebook app to work; it just showed a "connecting to Facebook services failed" dialog and a cryptic error code. The Netflix app was slow and not particularly aesthetically pleasing, but it worked, and video streams quickly and plays very well. The best app is definitely the Twitter app, LiveTweet, which is a surprisingly full-featured Twitter client, supporting reading your timelines, pull-to-refresh, uploading images to Twitter's image sharing service, and lots of other little nuances of Twitter. It's a pretty great app, though it has some polish issues that will surely be resolved in updates.

The PS Store is the app you use to buy stuff and download free apps. It features one of the best LiveAreas in the system, showing popular content that you can find within the store. The Store itself works pretty well, albeit slowly and with some organization problems. You can see featured apps, new releases, and the most popular downloads. There are also a number of categories, such as Vita-specific games, PSP games, games that run on either the Vita or the PS3, and smaller games called "minis". These categories are generally grouped by their title, which is weird for me, as I prefer exploring all the games, not just the ones whose first letter is between E and H. You can also sort games by genre, which is probably my favorite view (but is inexplicably buried at the end of the list). There are a number of genres (including both the "Shooters" genre and the "Shooting" genre) for exploring games. Sadly the game pages themselves don't show screenshots, previews, or customer reviews; just an aggregate rating, a description, and the ESRB rating. This should really be fleshed out to show more detail, similar to the App Store or the Android Market.

UI-wise, there are some nice affordances. If you reach the end of a scrollable area, the content will either stretch (in the case of a single piece of content like a web page), or the items in the list will space themselves out (in the case of the Twitter or Messaging apps). Apps can fire off notifications, which appear in any app as a bubble and are collected in a notification space on the home screen, accessible by tapping the bubble in the upper right corner. Text input is generally easy, although there is no selection/cut/copy/paste (though you can tap-hold anywhere to zoom into the text to place the cursor where you like). The keyboard is pretty good, with a fairly intuitive layout and some OK autocorrect features which work similarly to Android's suggestion tray above the keyboard.

Games

There are over a dozen full Vita games available at launch, as well as a huge online catalog of PSP ports and mini games you can download through the PS Store. Of the games available at launch, I've played:

  • FIFA '12 (Vita)
  • Uncharted: Golden Abyss (Vita)
  • Unit 13 (Vita, demo)
  • Fireworks (Vita, a tech demo)
  • Final Fantasy IV (PSP)

So far, my favorites have been FIFA '12 and Final Fantasy IV. FIFA is EA's well-known soccer game, and its scope is pretty huge for a portable device. It's so complete, it feels like it belongs on my TV. Tons of gameplay modes, a huge array of national teams, an extensive Career Mode, and tons of character customization. The touchscreen controls are OK, but can be kind of gimmicky and in practice are only occasionally useful. It often takes too long to move your hand from the buttons to the touchscreen and back for them to be useful in action-packed gaming. They're more useful for throw-ins and other less intense moments. The rear touch surface lets you shoot the ball on goal extremely accurately, and this is where the touch controls truly shine in FIFA. And it just looks amazing.

Final Fantasy IV is a remake of the SNES original, one of the greatest Japanese RPGs ever made. Square Enix completely remade the graphics and made a great version for the PSP, and it looks great on the Vita's screen. PSP games have some additional features on the Vita, accessed by tap-holding on the touchscreen, such as changing how the image is upscaled and colored, determining which camera to use, and picking what to control with the right analog stick. If the $29 price tag is off-putting for a 20-year-old game, at least you're getting a polished remake with high-quality pixel graphics, cutscenes and video, and a bunch of supplemental material.

Uncharted was a game I was looking forward to, but it has mostly been disappointing. The jagged edge effect is more noticeable here than in any of the other games, simply because there's a lot going on onscreen. The game has so far tended to hold your hand throughout the entire process; walk for 50 feet, then a cutscene telling you exactly where to look and what to do. The combat controls are fairly good, but with one huge exception. If an enemy gets too close to you, it enters "melee mode", which wants you to use the touchscreen to draw gestures showing how to attack your opponent. As with FIFA, the switch from physical buttons to touchscreen is not fast, and the whole thing is somewhat jarring. The game uses the touchscreen for some "puzzles", which are so far pretty boring and repetitive tasks like "wipe this thing off" and "spin this object around to look at it". The one good use of the touchscreen is climbing. You can draw a gesture along rock walls to signal to the character where to climb, which is handy and doesn't seem to come up during fights.

Unit 13 is a tactical shooter game by Sony, which makes good use of the physical controls of the device. The graphics look pretty good, but not stellar, mostly like a PS2 game. The controls work very well, and I had no problems with moving around or hitting my target. And the game doesn't coddle you - it doesn't point out where enemies are, and it will happily let you die mid-mission if you take a few shots from the enemy. It makes light use of the touchscreen for controls, but it does so when you're supposed to have cleared the room of targets, so you're encouraged to avoid using it while in twitch mode, and to use it when things calm down. That's smart use of the touchscreen, and I hope more game developers will do that. I only have the demo, but will probably pick up the full game soon.

Fireworks is a free tech demo published by Sony that uses the augmented reality feature of the Vita very nicely. The system comes with six AR "cards", which are about as big as the Vita, with QR-like shapes printed on them. The idea is that you set one of these down on a table, point the Vita at it, and the camera will recognize the card and project graphics on top of it. In the case of Fireworks, it showed a small house shooting off fireworks, and you tap the rounds to make them explode. I've played AR games and apps on the iPhone, and found them to be lacking; if you moved the device, it was too slow to respond, leading to a disconnect between the real world and the augmented world. Not so on the Vita. The camera, display, and accelerometer work extremely well together, and it really feels like you're projecting onto the real world. If you move the device, the lag it takes to see the game update is nearly imperceptible. I can't really explain this one, you just have to see it in action. I hope to see more games (and apps) take advantage of this.

In general, games and demos look great, and the physical controls are quite snappy. This truly was built to be a gaming system first, and it shows. Touchscreen input is OK for instances where you don't have to make a lightning fast reaction, or don't need accuracy beyond a tap or a swipe, but you're not going to want to do it often. It sucks for all the reasons intense games suck on touchscreen-only devices. I'd love to see more use of the rear touch surface, though, which is handy because your hands are already there. And the augmented reality stuff could open up some really awesome possibilities, if everyone manages to keep from losing their AR cards.

Future

Sony is positioning this as another long-life console, like they are with the PS3. It very well could be; it certainly has the raw horsepower, a great (if maybe too large for the average person) form factor, and a wonderful blend of console mainstays and fresher smartphone ideas. It's pretty clear that the smartphone manufacturers aren't terribly interested in making the input side of gaming much better. And Nintendo's 3DS is gaining some traction, but is certainly not as big a success as they'd hoped. The big question remains: can a gaming device remain a standalone product and gain enough traction to warrant being a separate device?

I, for one, hope so. The Vita is extremely capable and, with some updates to the stock OS/apps and some additional software, could be (and this is probably a stretch) a competitor to the iPod touch. It makes sense to me that, as people want to use technology in ever-more-mobile spaces, they'd want to bring powerful games along with them, and smartphones just can't provide that beyond flinging birds and other simple games. The Vita shines because of its ability to provide the immersive experience, and it does that very well. I can't remember ever seeing three hours disappear playing an iPhone game; I did that this weekend on the Vita.

One way they can definitely attract consumers is by expanding the available apps to include all kinds of content, as well as indie games. The mobile software industry is exploding right now on all platforms that offer everyone the ability to tinker, from massive companies to hobby hackers to teenagers. A Vita that ignores that opportunity is leaving money on the table, both from lost software sales and from unsold devices. Sony has announced that they're bringing an SDK to developers, called the PlayStation Suite, but it's unclear if this will be a less restricted approach like on iOS and Android, or the locked-down, tightly controlled approval process that has been the status quo in the gaming industry since its inception.

If Sony can keep game makers interested in bringing massive new titles to the Vita, we are probably looking at some of the best days for mobile gaming ahead. Hopefully customers will notice, and be willing to fork over the premium for a better gaming experience. But it will be a tougher sell in a world of mobile computers crammed into smartphones.

Conclusion

Five days in, I love my Vita. I've been spending at least an hour or two on it every day, and that's only been going up. The games are pretty great for first-gen titles. This truly feels like a console experience merged with the best ideas the smartphone world has been building for years. There are some 1.0 issues, and some hardware quirks that are metaphorical rough edges, but the overall experience is solid and well thought out. If you are a fan of gaming, or are disappointed in the state of gaming on cell phones, a Vita will be a great asset to you. Hopefully Sony can sell a bunch of these things and keep game developers interested over the long run. And hopefully they open it up to indies and app developers to add that much more value as a great Internet communication device.

Edit 2/27/2012: As Kevin Ballard pointed out, I incorrectly called the LiveArea feature "LiveTile", which is actually a feature of Windows Phone 7. The Vita's feature is called LiveArea.

 

Adobe is finally putting an end to Flash Player. They've announced they're stopping development of the mobile Flash Player, which is where the future of tech innovation is heading, and the writing is on the wall for desktop Flash Player as well. This is a good thing for a myriad of reasons, both technical and political.

However, it is important to remember that Flash drove much of the innovation on the web as we know it today. When Flash was conceived over a decade ago, the web was a glimmer of what it is today. Creating something visually impressive and interactive was almost impossible. Flash brought the ability to do animation, sound, video, 3D graphics, and local storage in the browser when nothing else could.

Without Flash, MapQuest would not have been able to provide maps for years before Google did in JavaScript. The juggernaut YouTube would not have been possible until at least 2009, four years after its actual launch. Gaming on the web, which has been around as long as Flash, would only now be possible a decade later. Flash enabled developers to create rich user experiences in a market dominated by slow moving browser developers. Even in 2011 Flash exists to provide those more powerful apps to less tech-savvy people who still use old versions of Internet Explorer.

Flash Player itself seemed like a means to an end. Macromedia, and then Adobe after acquiring them, sells the tools you use to build Flash content. Thus, Adobe's incentive was not to build a great Flash Player, but a pervasive one that would sell its tools. Flash Player's technical stagnation gave browser developers a market opportunity to fill in those gaps natively. As a result, Adobe retains a huge market dominance in tools for building rich apps for the web, tools that HTML5 lacks.

This puts Adobe in a unique position. As HTML5 continues to negate the need for Flash Player, Adobe has the expertise to build equally capable authoring tools that target HTML5, and a market eager for them. Hopefully this move signals that Adobe will head in that direction, because the web DOES need great HTML5 tools for people who aren't savvy in JavaScript, especially the people who previously used Flash for that work.

HTML5 offers developers the ability to build high-performance, low-power apps and experiences. Browser innovation has never been faster; Apple, Google, Microsoft, and Mozilla are all competing to bring the best new features to their browsers in compatible ways. But they're just now filling in many features Flash Player has had for years. Adobe can harness this to help build a better web, and few others can. Hopefully they seize this moment.

 

I wrote a guest post for MacStories, covering the history of patent law surrounding patent trolls. While recent lawsuits from Lodsys and Kootol are causing panic and alarm from indie developers, it's not like this threat is suddenly new. Patent lawsuits have always been on the table, but they were ignored by the majority of small companies. Now it's clear that patent holders will pursue people who violate their patent. Whether ethical or not, they are legally required to defend their patents, and that means we will see more patent lawsuits pursued by trolls. Meanwhile, none of these small developers can afford to fight, so they settle, perpetuating the cycle.

 

It seems like every other week I'm reading a blog post from a person who went to a tech conference, or a meetup, or heard a talk, and was rightly offended that someone made a tactless joke about women, either about women in general or about specific women. It is disheartening to hear that any group would be made to feel less worthy of respect in our circle, especially at a time when our industry is undergoing one of the most massive and impactful revolutions in decades, and at a time when we need new blood the most. Every person in our industry should be fighting for inclusivity and should welcome new members with open arms and helpful tutorials. Why there aren't more people pushing for this, I don't know.

It is equally saddening to hear so many respectable people jump to conclusions about what your actual motives might be when trying to have an adult discussion about this sensitive subject. I'll make no bones about it; I'm a born-middle-class white guy, so right off the top, there will be people who will read this under the presumption that I'm either a misogynist or some kind of "Internet white knight". As a middle-class white guy, my exposure to injustice and inequality has been limited. I cannot possibly know how it feels to hear words thrown around that minimize the role of women in tech. But I also have rarely been a presenter; who am I to say that I know what Noah Kagan's motives were when he put "faceless bitch" on a slide at a recent conference? He could've been trying to lighten the mood, or he could have a vendetta against women in tech. I don't know.

I'm inclined to believe that incidents like this, where women are mocked by a presenter, are isolated events perpetrated by a non-representative few. When I go to conferences, I keep an active ear open for slurs against women, and have yet to hear any. But what's fascinating to me is how women are the group continually called out. A demographic survey created by A List Apart shows that women made up 17.8% of respondents; the same study also showed that Asians, blacks, and Hispanics each represented no more than 6% of the group (which is itself a completely separate topic of inequality that seems to be forgotten in these discussions). Yet women are the demographic so frequently mocked and shamed. It probably boils down to sex and the fact that the people that connect to this industry tend to be more introverted, but I don't know.

The only two things that unite everyone in this industry are that 1) we are all fascinated with high technology, and that 2) we are all humans. As humans we have cognitive biases which prejudice us towards recognizing things the way we'd like them to be. So when we hear that, over the course of several conferences, jokes were made that denigrate women, we're biased to believe that these events are misogynistic in nature, and that repeated incidents show a trend of sexist men trying to keep out women. It's possible that's what's happening; I think the truth is that these people generally are poor communicators and entertainers put into a role of communicating and entertaining, and failing. But I don't know.

I don't know the solutions to the problems we face, but I do know a few things that we all can do better, no matter what subset of demographics you belong to.

  • Actively call out unacceptable remarks when they're made at the expense of any group within our community. Whether that's at the expense of women, men, Android fans, Windows fans, Apple fans, anyone. There is no logical reason for our fledgling industry to show animosity towards any group.
  • Fight the groupthink mentality of labeling anyone who screws up. Nobody is perfect. Everyone makes mistakes. Few people are truly evil and rotten. I'm reminded of this YouTube video on race; watch it, but replace "racist" with "sexist". Address what they did, not who they are. Let's address individual problems without calling into question someone's motives, unless someone makes the same mistake over and over without remorse.
  • Consider not just how your message is delivered, but also how it will be perceived. Your audience will contain not only women, but members of every race, gender, religion, and sexual orientation. Joking at the expense of other groups is juvenile and unbecoming, and will reflect negatively not only on the speaker but on the tech community as a whole.
  • Mentor young people who are interested in high technology, and help them learn how to become successful and open-minded. This is something that we should be doing a better job of as an industry as a whole. A teenager who wants to become a software engineer will learn acceptance if they are accepted into a group dominated by grown-ups.

More non-middle-class-white-guy people in our industry will only benefit everyone, from developers to designers to companies to customers. We must be vigilant to keep prejudice out and embrace every single person who wants to contribute to this revolution. But we must be similarly careful not to vilify people for mistakes; hindsight is, after all, 20/20. Of course, maybe I'm wrong. I just don't know.

Be excellent to each other.

Thanks to Faruk Ateş, who has spoken at length on this issue, for his feedback on this post.

 

JailbreakMe.com is a website that offers visitors the ability to jailbreak their iPhone without a computer-based tether. It does this by exploiting the system-wide ability for applications to read PDF files, where an incorrectly-formatted PDF file can let a hacker do anything they want to your system. While this bug CAN be used maliciously to steal all the personal data from your phone, the developers in this instance used it to enable jailbreaking.

Others will tell you why you should or should not jailbreak your iPhone. Others will decry the developers for bringing to light a serious vulnerability in the iPhone OS. In this blog post, I won't do any of that, but will instead point out some things you should and shouldn't do if you decide to jailbreak.

Backup first, and backup the backup

It should go without saying that, before you start mucking around with the internals of the software on your phone, you should back everything up with iTunes. Sync down all the data into iTunes, and explicitly back up by right-clicking the iPhone in the sidebar and choosing "Back Up". Once that is done, you should back up the actual backup files to somewhere safe. This way, if you ever want to go back to a vanilla iPhone, it's fairly straightforward. The files are located in ~/Library/Application Support/MobileSync/Backup.
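
If you want a quick way to make that second copy, something like the following will do it. This is just a sketch; /Volumes/External is a placeholder for wherever you actually keep your archives:

    # Copy the MobileSync backup folder to an external drive.
    # "/Volumes/External" is a placeholder; point it at your own archive location.
    rsync -a --progress \
      "$HOME/Library/Application Support/MobileSync/Backup/" \
      "/Volumes/External/iPhone-Backups/"

Keep that copy somewhere other than the Mac the phone syncs with; the whole point is to survive a botched restore.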

Understand what you're doing

Jailbreaking lets you run apps on your iPhone that, for a variety of political and technical reasons, you could not run otherwise. Apple has gone to great lengths to prevent you from running unauthorized apps on your iPhone, for several reasons, the most important being security. Since jailbreaking is designed to let you run those apps, that means that in order for the jailbreak to work, several of those security measures are simply shut off and disabled. This does not mean that you'll automatically get viruses and have your data stolen, but it does open up more avenues for hackers to gain access to your data. You simply must be more vigilant and attentive about security when your phone is jailbroken.

Only add sources that you trust completely

When you jailbreak, you will notice a new app on your home screen, called "Cydia". You can think of this as the jailbroken App Store for your iPhone. You will be able to use this to install lots of apps; you can also install mods that change app icons and fonts, mods that change how apps behave, and mods that add new features system-wide. One way this differs from Apple's built-in App Store is that third parties can publish their own list of apps and mods at their own whim, and users can add those lists to Cydia. You can find lists of third-party sources available by doing some creative Googling.

Now, since you can add any third-party list you want, and those lists can contain mods which can access all of the data on your iPhone, you need to be extremely mindful of which sources you add. Seemingly innocuous apps, such as simple wallpaper lists, can contain code which subtly and sneakily siphons away your contacts, or worse. Since you don't have Apple vetting apps before they hit your phone, you won't be able to trust that an app isn't malicious if it's from an unknown source.

Only install what you need

Many of the apps and mods you can download through Cydia do things that are not technically possible on the iPhone using Apple's published APIs. An example of this is the project which provides a Growl-like UI for push notifications; it simply is not possible to do through the App Store. This means that you will have mods injecting code into the memory of other apps (sometimes into EVERY app). The more mods like this you have, the more they will start to clash with each other. This can lead to crashes, drained batteries, hangs, and system slowdown. You should consciously try to minimize the number of mods that you install, to preserve the experience of your iPhone.

Be mindful of OpenSSH

Packages in Cydia will often require other libraries to achieve their goals. These requirements are called dependencies in Cydia, and they will be listed when you try to install packages. Some packages will blindly install a dependency called OpenSSH, which installs a server on your iPhone that allows you to log in via a Terminal. This server authenticates against the iPhone's existing password database, in which the default root password happens to be 'alpine'. As you can imagine, many people never change that password, and instead just let the default stick; this led to disaster last year when someone used the default password to extort lazy iPhone jailbreakers.

If you install this package, the absolute first thing you should do is change the root password.
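
Roughly, that looks like the following from a Terminal on the same Wi-Fi network as the phone; the IP address below is a placeholder for your iPhone's actual address on the network:

    # SSH into the phone; the password for root is 'alpine' until you change it.
    ssh root@192.168.1.20

    # Once logged in, set a new password for root...
    passwd
    # ...and for the 'mobile' user, which ships with the same default password.
    passwd mobile
    exit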

Be wary of iOS software updates

In all likelihood, iOS software updates will be far more involved than they would be if you weren't jailbroken. The hacks used to enable jailbreaking are usually patched in the next update of the OS. This means that, if you want to keep your jailbreak mods, you will need to wait for the iPhone dev community to release an updated jailbreak procedure. Sometimes this takes hours, sometimes this takes weeks. Once the jailbreak is released, updating generally consists of backing up everything, restoring your iPhone to the new OS, re-jailbreaking, and reinstalling all of your jailbreak software. It is a far more involved process, on top of the already involved iOS update process. You will likely update the OS far less often than you would if you weren't jailbroken.

 

The iPhone had the first two store UIs: the iTunes Store for content like music and movies, and the App Store for software. The iPad will add a third, the iBookstore, for buying eBooks. These stores all provide content for users to extend the utility of their device. But each has a pretty different user interaction model for accessing, purchasing, and consuming that content.

  • The iTunes Store is a separate app that is completely distinct from the iPod app. When you find something to buy, it prompts you for your iTunes account password, then adds the purchase to the app's Downloads tab. Once you have purchased the content, you must then switch back to the iPod app to listen to or watch it.
  • The App Store is a separate app. When you purchase something, it prompts you for your iTunes password, and then exits to the home screen, switching to the screen where the app will live. The state of the download is reflected in the app icon. When the download is complete, you tap the icon on the home screen to use it.
  • The iBookstore (the one-word name is Apple's official spelling) is not a separate app, but lives within the iBooks app on the iPad. Purchasing content prompts for the iTunes password and downloads in-app, and the content can be read directly once it has finished downloading.

Each type of content follows a different workflow when going from access to purchase to use. If a goal of the iPad's low price is to drive content sales through the three stores, as some speculate is the case, then the purchase model should be equally streamlined across the different types of content. Forcing different workflows will only confuse users who can't remember which type of content comes from where.

 

We've all got our thoughts on what the Jesus Tablet will be, so here are my guesses. I fully expect to be completely wrong on all of this, as many of these answers are complete blind shots, and Apple will likely blow my expectations out of the water.

Hardware

  • 8"-10" touch screen, running at 1280x720
  • Very thin; less than 1/2" thick (the iPhone 3GS is 0.48" thick)
  • About 1 lb, light enough to hold in one hand
  • 8 hours of battery life
  • 32 or 64 GB SSD
  • WiFi
  • 3G over GSM, and Apple's US 3G partner will continue to be AT&T
  • There will be some way to pair your Tablet cell connection with your iPhone's cell connection; either with an official announcement of AT&T tethering, or by adding your Tablet to the 3G account
  • Front-mounted camera
  • Some kind of collapsible stand in the frame, so the device can sit on a table

Input/Output

  • Multi-touch on the display, exactly like the iPhone
  • Multi-touch on the back of the device, similar to the surface of the Magic Mouse
  • Photos and video via front-mounted camera
  • Audio via front-mounted microphone and speakers, wired headphones, or Bluetooth
  • Dock connector
  • Expanded voice recognition
  • Software keyboard, no Bluetooth keyboards available

Software

  • It will run the iPhone OS 4.0; or rather, the iPhone OS will become a "Mobile OS X", consisting of the heavyweight Tablet and the smaller iPhone.
  • It will allow multiple apps to run at the same time, with some UI for viewing multiple apps alongside each other. This may not be possible on the iPhone.
  • It is meant to replace a full PC for most common day-to-day needs
  • iPhone applications will not run "automatically", but will need to be resubmitted through the App Store approval process. Most applications will run without much modification. Icons will need to be higher resolution.
  • A system-wide Dock for documents, applications, and small widgets will be onscreen at all times
  • The home screen will be significantly revamped, and renamed to the Dashboard. App icons, web clippings, and widgets will be freely arrangeable.
  • Handwriting recognition will be available for text input, with an optional stylus, or with a gesture such as two closed fingers drawing as if you had a pen.
  • Some gestures will be used on the back of the device, such as scrolling and zooming.

Apps

  • The standard iPod and Internet communication apps that the iPhone OS comes with. iTunes video, iTunes LP content, Maps, and Safari web content will look phenomenal.
  • Sketchbook, an unlimited workspace to sketch and write notes, with collaboration features.
  • iWork, a full port of the iWork application suite, tied to the Internet (and expansion of the iWork.com web application), with collaboration features.
  • iChat, a port of the Mac app, with a heavy emphasis on video conferencing

SDK

  • The SDK will be available immediately, with a simulator.
  • There will be an emphasis on application interoperability.
  • Applications will be able to register plugins with view controllers and UTIs. When an application wants to expose an object (say, an image) to other apps, it will look for app plugins which respond to the "public.image" UTI, load one which matches the UTI, and present the view without leaving the application.
  • Applications will be able to expose services, similar to how they work on Mac OS X. Services will be integrated into the voice control system.

Product

  • 32 GB model will be available for $899
  • 64 GB model will be available for $999
  • Available in US in March, major countries by summer
  • There will not be a WiFi-only model at launch.

Other Predictions

  • Updated MacBook Pros and MacBook Airs, with the mobile Core i5 "Arrandale" processors from Intel.
  • There will be no mention of Verizon
  • There will be no updates to the iPod or the Apple TV
  • There will be no announcements of the iPhone 4G
 

"Luckily I speak l33t." This is painfully bad.

 

Beyond impressive. This is more than some governments have contributed so far.
