One of my earliest memories of awe when learning how to program computers as a kid was wondering how the computer takes bits in the CPU and turns them into an image on a screen. When you don’t know how most of the system works, it’s easy to conclude that it is basically magic, and that the knowledge is arcane to the point of being inaccessible without tons of background in engineering and science.

As I grew up, this question never really left my mind, and it grew into the more abstract question of how a computer works at every level. Once you get into the high level of software and languages like C and Swift, you can abstract away a lot of the questions into programming language theory and ignore things like transistors and voltage. But peel back the curtain on the circuitry and there lies a world that is both simple in its principles and endlessly complex in the myriad of tricks used to keep this wrangled chaos ticking with utter precision.

I started watching Ben Eater’s incredible YouTube series on building a computer on breadboards a couple years ago. I’ve watched the whole thing end-to-end multiple times, and have built (or tried to build) a couple of the modules, including the clock and the ALU. I went from knowing very little about electronics to having a fairly broad understanding of how circuits work, at least in theory.

On the software side, I’ve tried writing video game console emulators, notably a Game Boy emulator in Swift, which works well enough to show how a higher-level system operates with all its pieces put together. The emulator is largely unusable (hence why it doesn’t exist in open source form), but it can run the start of a couple games, generate some graphics and sound, and process user input. Over the years I’d heard about things like Verilog and VHDL being used to define hardware in code. And in the last year or two I’ve gained an understanding of what FPGAs are (especially thanks to the MiSTer project and its focus on cycle-accurate emulation).

So, in my ever-expanding pursuit of learning how each piece of the computer works, I’ve picked up an FPGA development board, the DE10-Nano, and have begun using the Intel Quartus software to learn how to program it in Verilog. So far I’ve gotten a couple of small and ultra-basic projects working, namely getting an LED to blink off the internal 50 MHz clock via a clock divider, and a basic unclocked two-bit adder. As I continue down this route, I have a number of project ideas I want to try, including:

  • emulating Ben Eater’s breadboard computer
  • a video generator that outputs something to HDMI
  • an analog to HDMI video converter and scaler, akin to the OSSC
  • an HDMI to analog video converter, so I can play PC games like Undertale on my old CRT TV
  • a Dreamcast VMU clone
  • a basic graphics card
  • a RISC-V CPU
  • some kind of module for my modular synthesizer
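
The clock-divider blinker is a nice first exercise because the logic fits in a few lines. As a rough behavioral sketch (in Python rather than the Verilog it would actually be written in; the class name and values here are purely illustrative), the idea is just a counter that toggles an output once every N input clock edges:

```python
class ClockDivider:
    """Behavioral model of a divide-by-N clock: toggles its output
    once every N input clock edges, so the output period is 2*N cycles."""

    def __init__(self, divide_by):
        self.divide_by = divide_by
        self.count = 0
        self.out = 0

    def tick(self):
        # On each rising edge of the fast clock, increment the counter;
        # when it hits the divisor, toggle the slow output and reset.
        self.count += 1
        if self.count == self.divide_by:
            self.count = 0
            self.out ^= 1
        return self.out


# Dividing a 50 MHz clock down to a visible ~1 Hz blink would use a
# divisor on the order of 25,000,000; a tiny divisor shows the toggling.
div = ClockDivider(3)
outputs = [div.tick() for _ in range(12)]
print(outputs)  # → [0, 0, 1, 1, 1, 0, 0, 0, 1, 1, 1, 0]
```

In real Verilog this is one always block incrementing a register on the clock edge, but the counting-and-toggling behavior is the same.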

Will I be successful at actually building any of these? Who knows! But in the end, I’ll get a few steps closer to understanding how computers truly work at the silicon and chip level. And the journey is definitely more valuable than the destination.

I’ve never been able to get into a flight simulator before, but I really want to check this one out. This is a fantastic in-depth look at it. Make sure to crank the resolution all the way up.

The party set off from Neverwinter with the supplies en route to Phandalin. They came across two dead horses in a choke point on the road. Hagnar went to briefly investigate, then returned to the cart, only to be ambushed by two goblins with one more sniping with arrows from the trees. Tarot charmed one, while Thunder cleaved the arm off another, leaving the third to run away as Hagnar sniped it in the back.

Thunder discovered that the horses belonged to Gundren and Sildar, and convinced the goblin Grok to bring them to where they were taken. While distractedly talking to Tarot, Grok failed to point out one of the traps, nearly causing Hagnar to fall into a pit. They arrived at the mouth of a cave, where Grok turned tail and fled.

Peering into the cave, the party noticed it was dark, so Azunyan lit Thunder’s axe bright pink with sparkles. This caught the attention of two goblin guards, who shot arrows at the party from a bunch of nearby thickets, hitting Hagnar. The party took shots at the two goblins, with Tarot slicing one’s throat and the other getting hit with a firebolt from Azunyan and a cleave from Thunder.

Heading into the cave, the group saw three wolves and tried to tame them. Azunyan put one briefly to sleep and went to calm the others, but one was only made angrier, which led Thunder to quickly kill it. This drove the other into a rage, and the party decided this was more trouble than it was worth and headed deeper into the cave.

After a little while, they saw a goblin keeping watch, who noticed the party and began running off to alert the rest of the goblins. Thunder whipped an axe at him, killing him instantly and knocking him into the river below. They decided to try to climb up a rocky embankment, but it collapsed, knocking Thunder unconscious as he fell into the river. Tarot tried to pull him out but got dragged in as well. Hagnar and Azunyan pulled them out with rope, and the party spent a couple hours helping Thunder recover.

They headed back toward the wolves, and the now-conscious Thunder antagonized them with a face, stirring the wolves into a frenzy and causing one to yank its chain loose. The party coaxed it out of sight and killed it with a fire bolt from Azunyan and a scimitar strike from Hagnar. Azunyan then calmed the other wolf, now all alone, and formed a solid bond with it.

I was a guest on The Icon Garden podcast today, talking about the newly announced SwiftUI, how it co-exists with other tools like React Native, and what it means for the future of native apps and cross-platform hybrid apps.

So Zoom runs a web server on your Mac (even after you uninstall the app), and that web server can launch Zoom calls via URLs, and those Zoom calls can default to having your camera open. Which apparently makes it very easy to embed something into a web page (or an ad) in an attempt to trick people into unwittingly opening a video chat.

A remote video exploit is one of the worst-case scenarios for a security vulnerability, and this is exactly that. From the timeline, it looks like Zoom took over two months to even start responding, and if that’s true, it’s irresponsible security practice.

If you have Zoom installed on your Mac, check the “Patch Yourself” section of the article to block the functionality that allows this.

As of now my personal website should support WebSub on all pages, posts, and the RSS feed (basically everything linked in the sitemap). If you have the capability to subscribe to pages via WebSub, you should be able to point it at any page on this site and get notified when that page updates. That’s maybe not the most useful on individual pages, but it is useful for the home page, or if you want to be able to subscribe to individual tags or projects.

This should work well with social readers that let you subscribe to websites, at least in concept. To go with that, I’ve updated the microformats across much of the site to make them accessible to those readers. Things appear alright in web-based validators but I’ve been having trouble getting those changes to appear in the readers themselves. Hopefully that will fix itself as caches expire. But it’s one step further down the road of being a good IndieWeb citizen.

Behind the scenes, WebSub support is handled by Webmaster, my custom-built server that integrates with my Gatsby build system. When GitLab finishes building, it sends a webhook to Webmaster that signals that the site has changed, causing Webmaster to fetch the sitemap and scan the ETag of each file in it for changes. If any of those ETags are different from what’s already known, then that page has changed. The WebSub subscribers for that URL are fetched and notified with the changes. No WebSub hub necessary, but down the road I could switch to one pretty easily.
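
Webmaster’s code isn’t public, so this is only my guess at the shape of that diffing step, but its core can be sketched as a small pure function over two ETag maps (all URLs and values here are illustrative):

```python
def changed_pages(known_etags, fetched_etags):
    """Given the ETags recorded after the last build and the ETags just
    fetched for every URL in the sitemap, return the URLs whose content
    changed: a different ETag, or a newly added page."""
    return [
        url
        for url, etag in fetched_etags.items()
        if known_etags.get(url) != etag
    ]


# After the GitLab webhook fires, the server would fetch the sitemap,
# read each page's ETag header, then diff against the stored values:
known = {"/": 'W/"abc"', "/posts/websub": 'W/"def"'}
fetched = {
    "/": 'W/"abc"',                  # unchanged
    "/posts/websub": 'W/"xyz"',      # rebuilt page
    "/tags/indieweb": 'W/"new"',     # newly published page
}
print(changed_pages(known, fetched))  # → ['/posts/websub', '/tags/indieweb']
```

Each URL in the returned list would then have its WebSub subscribers notified, with no separate hub involved.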

Checking out the IndieWeb Summit remotely and working on my personal website today.

With the impending release of the third expansion, Shadowbringers, I’ve been rewatching this amazing documentary on Final Fantasy XIV, its very poor launch, and its reconstruction. It’s a fascinating tale of a rebuild that has probably never been attempted before (and likely never will be again).

In September 2018, the Verge posted a video designed to show people how to build a PC, and it was full of errors and mistakes. Some were inconsequential or merely bad practice, like sloppy cable management, which might impede airflow but wouldn’t necessarily impact performance. Some would cause performance problems but not damage, like putting the GPU in the wrong PCI-e slot. And some could cause irreversible damage, like using the wrong screws on the radiator, which could potentially penetrate the radiator tubing and cause coolant leaks. The internet quickly began criticizing the video for its flaws, making parodies and reaction videos, and the Verge disabled comments on the video before ultimately taking it down, amending the accompanying article to note that the video wasn’t up to their standards. Paul’s Hardware did a very good summary of the video and the reaction to it. The internet made fun of it for a while, and everyone largely moved on. Until this week.

On Tuesday evening, Kyle from the YouTube channel Bitwit tweeted that the Verge had used YouTube’s copyright strike system to take down his reaction video. The Verge did not issue a statement or public comment in response, but about a day later the claim was reversed after being disputed. According to Bitwit, he disputed the strike on the grounds that the video was fair use for transformative purposes (a characterization the Verge would go on to contest). The Verge also took down a video from the channel ReviewTechUSA, which broke the original video down and added a lot of commentary. Before the strikes were reversed, several large tech YouTube channels posted videos about the Verge’s actions, which to outsiders looked like the Verge trying to censor criticism, as the videos were transformative, critical, and highly viewed.

This morning, editor-in-chief Nilay Patel finally issued a statement on behalf of the Verge. In it he says that the legal team at Vox Media (the parent company of the Verge) found these videos and decided that they were not fair use, and issued copyright strikes to YouTube under their own purview. Later, when he was notified of these strikes, he had them rescinded despite believing that the legal team was correct in thinking that they did not fall under fair use. He then spent the morning responding to tweets about the issue, including my own, which were almost entirely negative.

Now, I’ve generally liked the Verge and Nilay Patel’s work, and have defended him and his position strongly when I agree with him. And after thinking about it, in some ways I can understand where they’re coming from. If we assume they’re being truthful in their public statement, they saw some videos, they felt they were not fair use, they tried to take them down. But their process failed in a few fundamental ways.


The New York Times has written a great dive into mobile apps that harvest data off your device. Many of these companies feel entitled to harvest and store your location data the moment you grant location access, and are in the business of selling that data to advertisers.

The book ‘1984,’ we’re kind of living it in a lot of ways.

Bill Kakis, a managing partner at Tell All

I’ve been removing a lot of the native apps I’ve relied on recently in favor of mobile web apps. I won’t let Facebook run code natively on any device I own, precisely because I know they go out of their way to capture every scrap of data they can. Running Instagram in a mobile web browser provides a much stronger sandbox, limiting the amount of data they can steal dramatically.

Apple and Google have largely destroyed any real marketplace for paid apps that don’t need to rely on selling data, and app review mechanisms have been unwilling or unable to protect customers from it. They deserve a huge share of blame for the status quo being what it is.