AMD’s Threadripper CPU platform snuck up on everyone last year and revealed itself to be an incredible option for high-end computing, with chips ranging from 8 to 16 cores. Just one year later, AMD is taking that platform all the way up to a mind-bending 32 CPU cores. Threadripper slots nicely between the mainstream Ryzen CPUs and the server-class Epyc CPUs, making it a natural choice for professional workstations.

The new chips come in 12-, 16-, 24-, and 32-core varieties, each with AMD’s take on Hyper-Threading (simultaneous multithreading), which effectively doubles the thread count. They are priced at $649, $899, $1299, and $1799 respectively, which works out to roughly $54–$57 per core. Intel, by comparison, can’t come close to those prices. Its workstation flagship, the 18-core i9-7980XE, costs $1879 while having 14 fewer cores, and at every rung of the ladder, Skylake-X costs significantly more per core. On the server side it’s even worse: Xeons often cost several thousand dollars.
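The per-core math is easy to verify yourself. Here’s a quick sketch using the prices and core counts above (the variable names are just for illustration):

```python
# Second-generation Threadripper lineup: core count -> launch price (USD)
lineup = {12: 649, 16: 899, 24: 1299, 32: 1799}

for cores, price in lineup.items():
    print(f"{cores:2d} cores at ${price}: ${price / cores:.2f} per core")

# Intel's 18-core i9-7980XE, for comparison
print(f"18 cores at $1879: ${1879 / 18:.2f} per core")
```

Every Threadripper part lands between $54.08 and $56.22 per core, while the Intel flagship works out to over $104 per core.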

AMD is singlehandedly responsible for revitalizing the desktop CPU market, leaving Intel scrambling, and I’m really excited about the future. I’m strongly considering stepping up to the 2920X and its 12 cores and 64 PCIe lanes, and finally building Hackintosh support into my machine.

My Giant Hard Drive: Building a Storage Box with FreeNAS


UPDATE 2/20/2015: This build failed after about 15 months due to extensive drive failure. By extensive, I mean a total of 9 drive replacements before three drives gave out over a single weekend. This correlates closely with data recently published by Backblaze, which suggests that 3 TB Seagate drives are exceptionally prone to failure. I’ve replaced them with 6 HGST Deskstar NAS 4 TB drives, which are highly rated and better suited for NAS environments.

For many years, I’ve used a pile of hard drives for data storage. Movies, TV shows, music, apps, games, backups, documents, and other data have been shuffled between drives and stored in inconsistent places. This has always been the cheap and easy approach, but it has never been satisfying. And with little to no redundancy, I’ve suffered a non-trivial amount of data loss as drives died and files got lost.

I’m not alone in having this problem, and others have figured out ways of solving it. One of the most interesting is a computer dedicated to one thing: storing data, and lots of it. These machines are called network-attached storage, or NAS, devices. A NAS is a specialized computer that has lots of hard drives, a fast connection to the local network, and…that’s about it. It doesn’t need a high-end graphics card, a 20-inch monitor, or the other things we typically associate with computers. It just sits on the network and quietly serves and stores files. There are off-the-shelf boxes built for this, such as those made by Synology or Drobo, or you can assemble one yourself for the job.

I’ve been considering building a NAS for over a year, but kept putting it off due to the expense and difficulty. A short time ago, I finally pulled the trigger on a custom-assembled machine for storing data. Lots of it: almost 11 terabytes, in fact. The machine holds 6 hard drives and can withstand the failure of any two of them without losing a single file. If drives do fail, I can replace them and keep on working. And those 11 terabytes act as one giant hard drive, not as 6 independent ones that have to be organized separately. It’s an investment in my storage needs that should grow as I need it to and last for several years.
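Six drives, two-drive fault tolerance, and one big volume is what ZFS (the filesystem underneath FreeNAS) calls a RAID-Z2 pool: two drives’ worth of capacity go to parity, and the rest holds data. As a rough sketch of where the “almost 11 terabytes” comes from, assuming the build’s original 3 TB drives and ignoring ZFS metadata overhead (so the numbers are approximate):

```python
# Rough usable-capacity estimate for a 6-drive double-parity (RAID-Z2) pool.
# Assumes 3 TB drives as in the original build; ignores ZFS metadata and
# reserved space, so this is an approximation, not an exact figure.

drives = 6
parity = 2                      # RAID-Z2 survives any two drive failures
drive_tb = 3                    # marketing terabytes (10**12 bytes)

raw_bytes = (drives - parity) * drive_tb * 10**12
usable_tib = raw_bytes / 2**40  # convert to binary terabytes (TiB)

print(f"{drives - parity} data drives x {drive_tb} TB = {raw_bytes / 10**12:.0f} TB raw")
print(f"which is about {usable_tib:.1f} TiB as most operating systems report it")
```

Four drives’ worth of data capacity (12 TB, or about 10.9 TiB once you account for the decimal-vs-binary terabyte difference) lines up with the “almost 11 terabytes” figure.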

Building a NAS took a lot of research, and since other people have been equally interested in building their own, I’ve condensed what I learned and built into this post. Doing this yourself is not for the faint of heart; it took at least 12 hours of work to assemble and set up the NAS to my needs, and it required knowledge of UNIX to build what I wanted. This post walks through a lot of that, but it still requires skill in system administration (and no, I probably won’t be able to help you figure out why your system isn’t working). If you’ve never run your own server before, you may find this overwhelming and would be better served by an off-the-shelf NAS solution. However, building the machine yourself is far more flexible and powerful, and offers some really useful automation and service-level tools that turn it from a dumb hard drive into an integral part of your data and media workflows.


Smartphones have replaced many kinds of small devices. iOS and Android have made it easy to build apps that perform all sorts of functions, displacing standalone devices like media players and GPS units. People have wondered whether they would replace handheld gaming devices too, and for many they have. For a while, I thought they had, at least for my needs. But after years of trying to play games on touchscreen-only devices, I’ve been left unenthused: the deeper, more engaging games that come from big studios demand a level of precision control that touchscreens just can’t deliver.

The PS Vita caught my attention about a month before its US launch. It combines many of the best features of smartphones with the controls of console games. It has a gorgeous, large, high-resolution touchscreen (plus a touch-sensitive back panel), as well as a tilt sensor and cameras for augmented-reality games. But it also has nearly all of the buttons of a typical PS3 controller, including two analog sticks. Sony managed to cram all of this into a device that, while large, still fits in my pocket, and that has enough battery life for a busy day interspersed with some gaming. The combination of apps and games (which I’ll call just “apps” for the sake of this review) is powerful, and the hardware power and display size make it a compelling device.
