I’ve been experimenting lately with IPFS, the InterPlanetary File System, and learning more about distributed information systems like it. I think I mentioned this kind of thing in passing in a podcast a year or so ago, so I thought I’d do more of an explanation of it. First I demonstrate the client-server model which most Internet applications use, and why it’s increasingly fragile now that a handful of corporations control so much of our access to and ability to share information.
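To make the contrast with client-server concrete, here is what publishing looks like in IPFS's content-addressed model. This is a minimal sketch assuming the `ipfs` command-line tool is installed and a daemon is running; the filename is hypothetical:

```shell
# Add a file to the local IPFS node. The printed CID (content
# identifier) is derived from the data itself, not from any
# server's address:
ipfs add article.txt

# Anyone can then fetch the file by its CID from whichever
# peer happens to have a copy -- no single server to go down:
ipfs cat <CID printed above>
```

The key point is that the identifier names the content, not a location, so the data survives as long as any peer keeps a copy.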
I have a $5/month virtual server at Digital Ocean, which I use for some light work and for an extra location outside my usual networks from which to test connectivity. I noticed recently that they’d increased the RAM and disk space included for that price. It turns out I could have just clicked a button to expand it, but I decided to make a new droplet and move everything to it, since that’s really how you’re supposed to handle the cloud – lean toward spinning up new systems rather than getting attached to the ones you have.
I recently upgraded Emacs and BBDB, and auto-completion of addresses in Gnus stopped working. The error turned out to be that BBDB was trying to run bbdb-migrate to update the database, and I wasn’t loading that. So I just needed to add this to my .emacs: (require 'bbdb-migrate) and do a C-x C-e at the end of that line to execute it. The next time I used BBDB by auto-completing an address, it took a few moments to migrate the database, then worked fine.
I run FreeBSD on a Dell Latitude D520 laptop. One wrinkle in installing it is that the wireless doesn’t work out of the box; you have to install firmware for it. In this machine’s case, the needed firmware is in the net/bwn-firmware-kmod port. So you have to connect via the Ethernet port long enough to get that installed, or pull it in some other way, like a flash drive.
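From memory, getting that firmware in place looks something like this once you have a wired connection. The loader.conf variable names are the ones documented for the bwn(4) driver, but treat them as an assumption to double-check against the port's install message:

```shell
# Install the Broadcom firmware from packages (or build the
# net/bwn-firmware-kmod port yourself if you prefer):
pkg install bwn-firmware-kmod

# Then have the driver and firmware load at boot by adding
# these lines to /boot/loader.conf:
#   if_bwn_load="YES"
#   bwn_v4_ucode_load="YES"
```

After a reboot (or a kldload of the same modules), the bwn interface should show up in ifconfig and can be configured like any other wireless device.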
My current workstation has 8 CPU cores (each core can handle a stream of instructions independently, so it’s more or less like having 8 CPUs – 8 different “brains” that can each be running its own thing at the same time). My last computer had 2, so I’m guessing my next one will have 32. Chip makers seem to be hitting a wall on how fast a single core can run, so the next best thing is to stack more and more of them together.
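If you're curious how many cores your own machine has, the OS will tell you. A small portable sketch, trying the FreeBSD sysctl first and falling back to getconf elsewhere:

```shell
# Logical CPU count: hw.ncpu on FreeBSD, getconf on other systems.
ncpu=$(sysctl -n hw.ncpu 2>/dev/null || getconf _NPROCESSORS_ONLN)
echo "$ncpu"
```

Note this reports logical CPUs, so on machines with SMT/Hyper-Threading the number may be double the physical core count.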
For FreeBSD administrators, ZFS and jails combine to make virtualization easy, fast, and secure. A FreeBSD jail is a lightweight virtualized environment, not a full virtual machine, which can only access the resources assigned to it when it was created, so its processes have no access to the rest of the machine. ZFS is an advanced filesystem that makes it very easy to create and destroy filesystems whenever they are needed. Together, they make it a matter of moments to create a new virtual system for testing, walling off network services, or other projects.
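The whole cycle fits in a handful of commands. A sketch, assuming a pool named zroot with a dataset tree mounted under /zroot; the jail name, hostname, and address are examples:

```shell
# Carve out a dataset for the jail -- creation is nearly instant,
# and snapshots/clones make throwaway copies cheap:
zfs create zroot/jails/test
# (extract or clone a FreeBSD userland into /zroot/jails/test here)

# Start a minimal jail confined to that dataset:
jail -c name=test path=/zroot/jails/test \
     host.hostname=test.example.org ip4.addr=192.0.2.10 \
     command=/bin/sh

# Tear it down and reclaim the space just as quickly:
jail -r test
zfs destroy zroot/jails/test
```

In practice you would populate the dataset once, snapshot it, and `zfs clone` the snapshot for each new jail, so fresh test systems cost almost nothing.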
For those who know what it is, here’s my public key. I’m going to start signing my email with it, so you can use it to verify me, and feel free to encrypt email to me with it. Contact me via any other channel you like to get my fingerprint and verify that it matches this one, to make sure someone hasn’t compromised my web site and changed it. It’s a bit longer than usual because it has a JPEG of my smilin’ mug embedded in it, for another possible way to verify it.
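If you haven’t done this before, the verification steps look roughly like this with GnuPG; pubkey.asc and the address are placeholders for the key on this page and my actual email:

```shell
# Save the key from this page as pubkey.asc, then import it:
gpg --import pubkey.asc

# Print the fingerprint so you can compare it against one
# obtained over a separate channel (phone, in person, etc.):
gpg --fingerprint user@example.org

# Once you trust the key, check a signed message with:
gpg --verify message.asc
```

The fingerprint comparison is the important step; the key itself proves nothing if it came from the same possibly-compromised web site as the message.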
The first time I used the Unix shell, I was hooked. The idea of having all these little programs, each of which did one thing, and being able to chain them together to do more complicated things, made perfect sense. Coming from an 8-bit background, where you were always up against the limits of the machine and waiting for programs to load, keeping everything small and focused was great. I still reach for the toolchain on my own systems on a daily basis.
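That chaining is easy to show with a toy example, one small tool per job; the sample text and counts here are just an illustration:

```shell
# Count the most frequent words in some text, one tool per step.
printf 'the quick fox jumps over the lazy dog the end\n' |
  tr -cs '[:alpha:]' '\n' |   # split into one word per line
  sort |                      # group identical words together
  uniq -c |                   # count each group
  sort -rn |                  # highest counts first
  head -3                     # keep the top three
```

None of these programs knows anything about "word frequency"; the pipeline is the program, which is exactly the appeal.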
I’ve been doing FreeBSD sysadmin work and using it on my own systems since about 1998. I like its no-nonsense, professional attitude and the simplicity and openness of its licensing. I can build the kernel and OS from source (though that’s not necessary as often as it used to be). Other skills:
System and security updates
Security auditing
Installing and configuring ports
Networks (including wireless) and firewalls
Installing and administering services (web, email, etc.)
I occasionally use the wget utility with the -m option to download a mirror of an entire website. This is very handy, but wget respects the robots.txt file, so it won’t mirror a site if robots.txt disallows it. Obviously, you should respect the downloading restrictions of other sites, but there are times when you have a valid reason to ignore them (when it’s your site, for instance, but you don’t want to change robots.txt).
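For those cases, the combination I reach for looks like this; example.org is a stand-in for the actual site:

```shell
# -m mirrors the site (recursion, timestamping, infinite depth);
# -e robots=off tells wget to ignore robots.txt for this run.
# Only do this where you have permission -- e.g., your own site.
wget -m -e robots=off https://example.org/
```

The -e option just executes a .wgetrc-style command, so `robots = off` in your ~/.wgetrc would make it permanent instead.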