Last Friday I finished my summer internship at GrammaTech. A few days before that (I forget when exactly) the discussion on our IRC channel turned to cybernetic implants. We’re a company full of pretty hardcore software types, what do you expect? Though to be honest, I was the chief instigator. Anyway, the conversation quickly moved to the question of securing such implants. The questions raised are summarized by one coworker’s comment: “Which software vendor do you trust to write the operating system for your brain?” Given that everyday implant technology probably isn’t too far in the future, the question is a valid one. For now my answer is: no one.
Let’s be honest: most of our computer systems are hopelessly insecure. And making them secure isn’t as simple as installing antivirus software from a big vendor. Depending on just how secure you need or want to be, you potentially have to go very, very deep. In a lot of cases the trouble is not worth it. Want to take down my VPS running my personal website and storing my Git repos? Go ahead, it’ll take me all of five minutes to shut it down and spin it back up, maybe half an hour to restore everything. That’s far easier than statically analyzing every line of the Linux kernel, the GNU utilities and the web stack for vulnerabilities (and then fixing them without introducing new ones or breaking things). This is not to say that these aren’t worthwhile, important activities; they’re just not top priority for most users.
However, it’s another matter entirely when the systems are mission critical (banks, defense, the Internet backbone) or when they’re running inside our bodies. Coming back to the original problem, medical technology is quickly progressing to the point of us having fully functional implants replacing faulty organs. Insulin pumps are just the start. Cochlear implants and artificial limbs have been around for a while. Bionic eyes are slowly pushing forward and real cyborgs exist. We’re not going to see full cyberbrains just yet and we’re definitely not throwing out the wetware for full synthetic bodies. But as the number of computers inside our bodies gradually increases, it’s never too early to start thinking about how we’re going to keep them safe, especially if we want them connected to the Internet (and we will).
Having our implants connected to the Net is a matter of convenience as well as health and safety. Real-time monitoring, remote diagnostics and over-the-air software updates would greatly cut down on the amount of time you spend in your doctor’s waiting room. However, if you want your arm or eyes hooked up to the Internet you definitely want to be careful about who can connect to them. Asymmetric encryption and signing for all communications (especially updates) would be necessary, just for starters. I can see some kind of code-signing for the software itself being beneficial. But it raises the question of whether the user can/should be able to hack their own organs. I really don’t want to jailbreak a critical organ if there is a possibility of bricking it. But at the same time I do have a right to my own body parts, biological or synthetic.
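The signed-update idea can be sketched in a few lines. This is a toy illustration using textbook RSA with deliberately tiny parameters so the math is visible; a real implant would obviously use a vetted crypto library and 2048-bit-plus keys, and every name and value here is made up for the example:

```python
# Toy sketch of signed over-the-air updates: the vendor signs a hash of
# the firmware with its private key, and the implant verifies with the
# vendor's public key before installing.
# NOT real cryptography -- textbook RSA with tiny demo parameters.
import hashlib

# Tiny textbook RSA key pair: n = 61 * 53, e = 17, d = 2753.
N, E, D = 3233, 17, 2753

def sign(firmware: bytes) -> int:
    """Vendor side: hash the update, then apply the private exponent."""
    digest = int.from_bytes(hashlib.sha256(firmware).digest(), "big") % N
    return pow(digest, D, N)

def verify(firmware: bytes, signature: int) -> bool:
    """Implant side: recompute the hash and compare it against the
    signature raised to the public exponent."""
    digest = int.from_bytes(hashlib.sha256(firmware).digest(), "big") % N
    return pow(signature, E, N) == digest

update = b"pacemaker-firmware-v2.1"  # hypothetical update blob
sig = sign(update)
assert verify(update, sig)             # genuine update accepted
assert not verify(update, (sig + 1) % N)  # forged signature rejected
```

The point is that the implant only ever needs the public key, so even a fully compromised network link can’t push unauthorized firmware; only someone holding the private key can produce a signature that verifies.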
Aside: I wonder why cars don’t come with 3G connections for remote software upgrades. If the Kindle can do it, it can’t be that hard. Then again, car manufacturers haven’t exactly been the most innovative and forward-thinking in recent years. Maybe I should be talking to Elon Musk.
Even if the proper technical measures are in place, there is still the question of just whom we trust to provide and potentially control our body parts. I don’t mind Apple storing my music, and Amazon can store and sync my books. I do mind them locking me in, which is why I’m still hesitant to go completely digital. But do I trust either of them (or any for-profit corporate entity) with my vital organs, or even non-vital ones? Furthermore, do they get keys to shut down “malfunctioning” organs, for some definition of “malfunctioning”? What safeguards are in place to prevent them from misusing these keys? Given the life-threatening nature that such shutdowns might have, requiring a complex legal procedure to overturn them is dangerous and ethically negligent.
When implants start becoming mainstream and popular we’re going to start seeing issues and problems similar to the ones with computer systems. There are always going to be people who want differing degrees of control over their technology, whether that technology be cars, computers or prosthetics. It would be interesting to see something like a “homebrew” implant scene spring up, though I doubt it would rival the popularity of the homebrew computer scene. Like many important problems, this one is both technical and social in nature. So, who do you trust to write the operating system for your brain?