What if the Singularity already happened?

I’ve been re-reading one of my favorite science fiction books, Accelerando by British author Charlie Stross. In one of my favorite passages, some of the characters are sitting around talking about their belief in the Singularity. One of the characters makes the following claim (about when the Singularity happened):

“Au contraire. It happened on June 6, 1969, at 1100 hours, Eastern Seaboard Time,” Pierre counters. “That was when the first network control protocol packets were sent from the data port of one IMP to another — the first ever Internet connection. That’s the Singularity. Since then we’ve all been living in a universe that was impossible to predict from events prior to that time.”

While it’s typical to equate the Singularity with the future advent of superhuman artificial intelligences, I think this definition makes a lot more sense. The Internet has had more impact on our world in the recent past than any other technology (especially after the advent of pocket-sized, connected mobile computing devices), and furthermore, it came almost completely out of left field. Few of the “classic” science fiction stories I remember reading (particularly by Isaac Asimov) prominently feature networked computers, even though they have faster-than-light spaceflight, aliens, robots and the like. Perhaps we should take that as a warning: the most disruptive technologies are the ones we’re least cognizant of, until the disruption is well under way.

The Spirit of Jane Austen

After reading one too many posts about how to (and why we should) read more, last night I sat down to read an article in The Atlantic about Jane Austen. Though I remember reading Pride and Prejudice once upon a time, and am generally aware of her status as a cultural icon, I can’t say I know very much about Jane Austen. This piece was interesting as an insight into her cultural impact and changing interpretation over time. However, what stood out to me was the author’s interpretation of Austen and her characters as agents of the humanist revolution sweeping Europe and the West in the late eighteenth and nineteenth centuries. In particular, I was struck by this excerpt:

Spiritedness is a way of understanding oneself as having rights. It experiences those rights as a joy, as a sense of blossoming, of freedom; but also as something often in need of quickly roused defense. It is the style of the revolutions—American, French—encroaching on Austen’s Britain, put in the mouths of intelligent young women who know their own worth.

Elizabeth’s is a declaration of rights; she demands the pursuit of happiness.

Since we seem once more to be living in times where personal liberties and rights are being questioned, and to some extent redefined, perhaps it’s time to pick up some Austen.

Yesterday I rewrote about half (the entire front-end) of a project that took me and two other collaborators several months to complete a few years ago. At the end of a solid afternoon of work, I had things compiling and running, but in a very buggy state. Unexpectedly, the associated feelings turned out to be somewhat bittersweet and conflicted: I’m happy and proud of how much I’ve improved in the years since I first worked on this project, but I’m also sad thinking of how much faster and better things might have gone originally if I had known back then all the things that I know now.

Later in the evening, I learned something new (GADTs and how to use them in OCaml), which makes me hope that in a few years I’ll be even more capable than I am now. At the same time, I’m also wary of falling into the trap of rehashing old projects and ideas with new tools and techniques, rather than moving forward into new areas.
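
For the curious, here’s a minimal sketch of the classic motivating example for GADTs in OCaml, a tiny typed expression language. This is my own illustration, not something from that evening’s reading:

```ocaml
(* The expression type is indexed by the type of value it evaluates to,
   so ill-typed expressions are rejected at compile time. *)
type _ expr =
  | Int  : int -> int expr
  | Bool : bool -> bool expr
  | Add  : int expr * int expr -> int expr
  | If   : bool expr * 'a expr * 'a expr -> 'a expr

(* The return type varies with the index: eval (Int 1) is an int,
   eval (Bool true) is a bool. Note the locally abstract type. *)
let rec eval : type a. a expr -> a = function
  | Int n -> n
  | Bool b -> b
  | Add (x, y) -> eval x + eval y
  | If (c, t, e) -> if eval c then eval t else eval e

(* Add (Int 1, Bool true) would be a compile-time error, not a crash. *)
let _ = eval (If (Bool true, Add (Int 1, Int 2), Int 0))
```

The payoff is that eval needs no runtime tags or error handling; the type index does all the work.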

A part of me really wants things to be “just right”, not just when it comes to work, but in the rest of my life as well. It’s almost a visceral reaction when I have to deal with things that don’t seem (close to) perfect for long periods of time. But at the same time, keeping a steady pace of progress in research and engineering requires knowing when things are “right enough” and moving on to the next thing.

Navigating that boundary isn’t something I have a lot of experience with, but I’m hoping that, just like my programming skills, it’s something I’ll get better at in the coming years.

Problem: I am a human being

A relevant excerpt:

“If you are a person who has come of age during the time of ubiquitous internet access and github, you cannot know what having access to the source code of an entire operating system meant in the mid 90s. When I saw the impact this access had on my own life in the coming years, I began to view closed source software as unethical. In a society that was increasingly mediated by software, restricting access to learning about how software works is in my opinion a means of control and subjugation.”

For me, growing up in India, that time was the early 2000s. My first Linux distro was Ubuntu, thanks largely to Canonical shipping out Ubuntu CDs to people around the world, something that seemed like a ludicrous idea at the time.

I wouldn’t be where I am without free software (both free as in beer, and free as in freedom). Also, Star Trek.

Sunday Selection 2017-06-25

Around the Web

The Largest Git Repo on the Planet

I’m always a fan of case studies describing real-world software engineering, especially ones about deploying engineering tools that come with charts and data. This article describes Microsoft’s efforts to deploy the Git version control system at a scale large enough to support all of Windows development.

Why our attention spans are shot

While it’s no secret that the rise of pocket-sized computers and ubiquitous Internet connections has precipitated a corresponding decrease in attention spans, this is one of the most in-depth, well-researched articles I’ve seen on the issue. It references and summarizes a wide range of distraction-related issues and points to the relevant research if you’re interested in digging deeper.

Aside: Nautilus has been doing a great job publishing interesting, deeply researched, and well-written longform articles, and they’re currently having a summer sale. The prices are very reasonable, and a subscription would be a great way to support good fact-based journalism in the current era of fake news.

How Anker is beating Apple and Samsung at their own accessory game

I own a number of Anker devices — a battery pack, a multi-port USB charger, a smaller travel charger. The best thing I can say about them is that by and large, I don’t notice them. They’re clean, do their job and get out of my way, just as they should. It’s good to see more companies enter the realm of affordable, well-designed products.

From the Bookshelf

Man’s Search for Meaning

I read this book on a cross-country flight to California a couple of months ago, at a time when I was busy, disorganized, stressed, and feeling like I was barely holding on. The book is based on the author’s experience in Nazi concentration camps during World War II, and its first part focuses on how an average person survives and reacts to the brutality and extreme cruelty of life in a concentration camp. The second part introduces Frankl’s theory of meaning as expressed in his approach to psychology, logotherapy. In essence, the meaning of life is found in every moment of living, even in the midst of suffering and death.

Video

Black Panther Trailer

I’m a big fan of Ta-Nehisi Coates’ run on Black Panther and really enjoyed the Black Panther’s brief appearance in Captain America: Civil War. This trailer makes me really excited to see the movie when it comes out, and hopeful that it will be done well. If you’re new to the world of Wakanda, in which Black Panther will be set, Rolling Stone has a good primer.

When I was visiting Boston a couple of weeks ago, an old friend of mine remarked that he thought the humanities should be “the pinnacle of human achievement”. I wasn’t sure what he meant at the time, but the phrase has been rattling around in my head. In between working on a conference presentation, attending a law conference, and reading the Meditations of Marcus Aurelius, I think I’ve come up with an explanation that makes sense (at least for me):

Science and technology, for all their towering achievements, are concerned with improving and extending human existence. By contrast, the humanities (and I think also the arts) should inform and improve how humans live.

Moshe Vardi on Humans, Machines and Work

Yesterday evening I had the pleasure of listening to Professor Moshe Vardi talk about the effects of automation on the workforce and the economy, and how the continued development of AI and machine learning-based technologies might further tip that balance. This post is based on the notes I took during the talk; please read it as a high-level summary rather than a transcript. The talk itself contained many links and references that I haven’t had time to chase down, so any inaccuracies and misquotations are probably my own fault.

Professor Vardi teaches at Rice University, is currently Editor-in-Chief of the Communications of the ACM, and has won numerous awards, including the ACM Gödel Prize and the EATCS Distinguished Achievements Award. He is also an excellent speaker, and it was wonderful to see a coherent narrative formed out of many disparate threads.

The talk started with a brief mention of Turing’s thesis, which can be read as a compelling philosophical argument for thinking machines, and of the related intellectual questions. The early history of artificial intelligence was characterized by unbridled optimism (expectations that general-purpose AI would arrive within a generation), punctuated by several AI winters (1974-80 and 1987-93) during which funding and support for AI research dried up. However, 1997 marked the start of a new era in AI research, when a chess-playing computer, IBM’s Deep Blue, defeated Garry Kasparov. More recently, DeepMind’s AlphaGo defeated the European Go champion Fan Hui. Crucially, AlphaGo combines machine learning techniques (including deep learning) with search space reduction, resulting in an approach that could be termed “intuition”.
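
To make that last point concrete, here’s a toy sketch of my own (not from the talk, and far simpler than AlphaGo’s actual combination of Monte Carlo tree search and deep networks): a depth-limited game-tree search where a hypothetical learned policy prunes the move list, and a hypothetical learned value function scores leaf positions instead of searching to the end of the game. All the names and stubs (position, top_moves, value, apply) are stand-ins for a real game engine and trained models:

```ocaml
(* Toy sketch: model-guided game-tree search. The stubs stand in for a
   real game engine and trained networks; only the structure matters. *)

type move = int
type position = { to_play : int; moves : move list }

(* Hypothetical value network: estimates how favorable a position is
   for the side to move, in [-1.0, 1.0]. Stubbed as a constant here. *)
let value (_ : position) : float = 0.0

(* Hypothetical policy network: keeps only the k most promising moves,
   shrinking the search space. Stubbed as "the first k" here. *)
let top_moves (p : position) (k : int) : move list =
  List.filteri (fun i _ -> i < k) p.moves

(* Stub successor function; a real engine would actually play the move. *)
let apply (p : position) (_m : move) : position =
  { p with to_play = -p.to_play }

(* Depth-limited negamax over the pruned moves: the value function
   scores leaves instead of searching to the end of the game. *)
let rec search (p : position) (depth : int) : float =
  if depth = 0 || p.moves = [] then value p
  else
    top_moves p 3
    |> List.map (fun m -> -.search (apply p m) (depth - 1))
    |> List.fold_left max neg_infinity

let _ = search { to_play = 1; moves = [ 0; 1; 2; 3 ] } 4
```

The real systems add Monte Carlo sampling and training from self-play, but the core idea is the same: learned pruning plus learned evaluation in place of brute force.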

With the resurgence of AI research, automated driving has been the holy grail for about a decade. Cars were one of the most important developments of the 20th century: the automobile shaped geography, changed history, and drove enormous infrastructure development. By some estimates, there are over 4 million truck drivers in the US, and 15 million jobs involve operating a vehicle. Today there are about 30 companies working on self-driving vehicles, attacking an estimated market of $2 to $5 trillion a year. Experts predict that the main technical issues will be resolved in 5 to 15 years. While this will be a great technological achievement, it will produce profound business disruption. For starters, there is likely to be major industrial contraction (cars are idle 90% of the time), and a major loss of business for the insurance, legal, and medical fields as automobile accidents are drastically reduced.

Unfortunately, this industrial disruption follows a trend that has already been in progress for a while now. The last 40 years have had a harsh negative impact on the middle and working classes. For much of the 20th century there was a “Great Coupling” between productivity, private employment, median income, and GDP growth: they all followed a linked upward trend. Since the 70s, however, this trend has “decoupled”, a fact observable in many datasets. In particular, inequality has been increasing: a massive decline in earnings for the bottom 50% of earners, and a massive increase for the top 1%. There is a declining chance that a person in their early 30s will be better off than their parents.

This in turn has resulted in an “Age of Precariousness”: half of Americans would have trouble affording $400 for an emergency, and two-thirds would have trouble dealing with a $1000 emergency. Labor force participation for men aged 25-54 has dropped from 97% to 88%, and those with high school degrees or less were hit hardest: almost 20% are not working.

Technology is eating jobs from the “inside out”: high-paying and low-paying jobs are both growing, but middle-class jobs are declining. According to a 2016 Bloomberg report, as we move toward more automation, we need fewer people in manufacturing, and more people go into the service sector, historically a low-wage sector.

All this paints a pretty bleak future, and from Prof. Vardi’s talk it’s unclear what the way forward is. Universal Basic Income seems like one idea to help offset this dangerous trend, but UBI remains a hotly contested topic. The discussion that followed raised some interesting questions, including what the role of “work” and employment is in a mostly-automated society, and what the role and responsibility of educational institutions should be in the near future.

Personally, I feel lucky to be in a field where jobs are currently booming. Most of my work is creative and non-routine, and thus not yet amenable to automation. At the same time, I am very concerned about a future where the majority of people hold poorly paid service-sector jobs where they can barely eke out a living. I am also afraid that jobs that seem more secure today (administrators, doctors, lawyers, app developers) will gradually be pushed into obsolescence as machine learning techniques improve. Again, there is no good solution yet, but there is lots to think about, and hopefully work on, in the near future. As the Chinese proverb goes, we live in interesting times.