Moshe Vardi on Humans, Machines and Work

Yesterday evening I had the pleasure of listening to Professor Moshe Vardi talk about the effects of automation on the workforce and the economy, and how the continued development of AI and machine learning-based technologies might further tip that balance. This post is based on the notes I took during that talk. Please read it as a high-level summary rather than a transcript. The talk itself contained many links and references that I haven't had time to chase down, so any inaccuracies and misquotations are probably my own fault.

Moshe Vardi is a professor at Rice University, the current Editor-in-Chief of the Communications of the ACM, and the winner of numerous awards, including the ACM Gödel Prize and the EATCS Distinguished Achievements Award. He is also an excellent speaker, and it was wonderful to see a coherent narrative formed out of many disparate threads.

The talk started with a brief mention of Turing's thesis, which can be read as a compelling philosophical argument for thinking machines, and the related intellectual questions. The early history of artificial intelligence was characterized by unbridled optimism (expectations that general-purpose AI would arrive within a generation), punctuated by several AI winters (1974-80 and 1987-93) when funding and support for AI research dried up. However, 1997 started a new era in AI research when a chess-playing computer, IBM's Deep Blue, defeated Garry Kasparov. More recently, DeepMind's AlphaGo defeated European Go champion Fan Hui. Crucially, AlphaGo combines machine learning techniques (including deep learning) with search space reduction, resulting in an approach that could be termed "intuition".
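To make that combination a little more concrete, here is a toy sketch of the idea (my own illustration, not DeepMind's code or algorithm): a game-tree search over a simple Nim variant in which a stand-in "policy" keeps only the most promising moves. In AlphaGo the policy and value functions are deep neural networks trained on expert games and self-play; here they are cheap hand-written heuristics, but the shape of the idea is the same: learned priors shrink the search space.

```python
# Toy sketch (my illustration, not DeepMind's code): a negamax search
# over a simple Nim variant, where a stand-in "policy" keeps only the
# most promising moves at each node.

def legal_moves(pile):
    # In this Nim variant you may remove 1, 2, or 3 stones.
    return [n for n in (1, 2, 3) if n <= pile]

def policy_prior(pile, move):
    # Stand-in for a learned policy: prefer moves that leave the
    # opponent a multiple of 4 (the known winning strategy).
    return 1.0 if (pile - move) % 4 == 0 else 0.1

def negamax(pile, depth, top_k=2):
    """Search only the top_k moves the policy ranks highest."""
    moves = legal_moves(pile)
    if not moves:
        return -1, None            # no moves left: current player loses
    if depth == 0:
        return 0, None             # search horizon reached: call it neutral
    moves.sort(key=lambda m: policy_prior(pile, m), reverse=True)
    best_value, best_move = float("-inf"), None
    for m in moves[:top_k]:        # the search space reduction happens here
        value, _ = negamax(pile - m, depth - 1, top_k)
        if -value > best_value:    # negamax: the opponent's gain is our loss
            best_value, best_move = -value, m
    return best_value, best_move

print(negamax(10, depth=10))       # -> (1, 2): take 2 stones, leaving 8
```

Even expanding only two moves per node, the search finds the winning move. The catch, as with AlphaGo, is that the pruning is only as good as the learned policy behind it.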

With the resurgence of AI research, automated driving has been the holy grail for about a decade. Cars were one of the most important developments of the 20th century. The automobile shaped geography and changed history, and led to enormous infrastructure development. By some estimates, there are over 4 million truck drivers in the US, and 15 million jobs that involve operating a vehicle. Today there are about 30 companies working on self-driving vehicles, attacking an estimated market of $2 to $5 trillion a year. Experts predict that the main technical issues will be resolved in 5-15 years. While this will be a great technological achievement, it will produce profound business disruption. For starters, there is likely to be major industrial contraction (cars are idle 90% of the time), and a major loss of business for the insurance, legal, and medical fields as automobile accidents are drastically reduced.

Unfortunately, this industrial disruption follows a trend that has already been in progress for a while now. The last 40 years have had a harsh negative impact on the middle and working classes. For much of the 20th century there was a "Great Coupling" between productivity, private employment, median income, and GDP growth: they all followed a linked upward trend. However, since the 70s, this trend has "decoupled", a fact observable in many datasets. In particular, there has been increasing inequality: a massive decline in income for the bottom 50% of earners, and a massive increase for the top 1%. There is a declining chance that a person in their early 30s will be better off than their parents.

This in turn has resulted in an "Age of Precariousness": half of Americans would have trouble affording $400 for an emergency, and two-thirds would have trouble dealing with a $1000 emergency. Labor force participation for men aged 25-54 has dropped from 97% to 88%, and those with high school degrees or less have been hit hardest: almost 20% are not working.
Technology is eating jobs from the "inside out": high-paying and low-paying jobs are both growing, but middle-class jobs are declining. According to a 2016 Bloomberg report, as we move towards more automation we need fewer people in manufacturing, and more people move into the service sector, historically a low-wage one.

All this paints a pretty bleak future, and from Prof. Vardi's talk it's unclear what the way forward is. Universal Basic Income is one idea for offsetting this dangerous trend, but UBI is still a hotly contested topic. The discussion that followed raised some interesting questions, including what the role of "work" and employment is in a mostly-automated society, and what the role and responsibility of educational institutions will be in the near future.

Personally, I feel lucky to be in a field where jobs are currently booming. Most of my work is creative and non-routine, and thus not amenable to automation yet. At the same time, I am very concerned about a future where the majority of people hold poorly paid service sector jobs where they can barely eke out a living. I am also afraid that jobs that seem more secure today (administrators, doctors, lawyers, app developers) will also be gradually pushed into obsolescence as our machine learning techniques improve. Again, no good solution, but lots to think about, and hopefully work on in the near future. As the Chinese proverb goes, we live in interesting times.

Computer Science as a Creative Endeavor

Yesterday, Professor Eugene Wallingford posted about how computer science is not that different from a lot of other professions. Some parts of it are interesting and exciting, but a decent chunk is tedious, frustrating and boring as well. In his words:

Let’s be honest with ourselves and our students that getting good at anything takes a lot of hard work and, once you master something, you’ll occasionally face some tedium in the trenches. Science, and computer science in particular, are not that much different from anything else.

While I agree with the general message, in my experience, computer science is also a wonderfully creative and exciting endeavor (even if we often struggle to portray it as such).

Throughout college I had a variety of on-campus jobs. I was mostly trying to make some spending money and pay for the occasional trip with friends. The jobs I remember as most fulfilling and interesting all involved something that I would broadly call "computer science". Some of it was honest-to-goodness computer science research (which later helped me get into graduate school), some of it was data crunching, and some of it was simply straightforward scripting and system administration, with a dash of web design. In fact, my first job involved calling up alumni to ask for donations, and I promptly quit after a week once I got a job doing data processing for a professor.

Of course, I had found computers interesting for several years prior to my first college job, so I certainly wasn't unbiased when I started comparing jobs. I can also imagine that a lot of people would consider calling up alumni a far more interesting job than munging CSV files with Python scripts. And there are certainly parts of even technically demanding programming tasks that I find tiresome and would happily avoid (pretty much anything to do with CSS).
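For the curious, the kind of munging I mean looked roughly like this (a minimal sketch; the filename and column names below are hypothetical stand-ins, not the actual data I worked with):

```python
# A sketch of typical CSV munging; "donations.csv" and its columns
# are hypothetical stand-ins, not the actual data I worked with.
import csv
from collections import defaultdict

totals = defaultdict(float)
with open("donations.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Hand-entered data is messy: strip whitespace, dollar signs,
        # and thousands separators before converting to a number.
        year = row["year"].strip()
        amount = row["amount"].replace("$", "").replace(",", "").strip()
        totals[year] += float(amount or 0)

with open("totals_by_year.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["year", "total"])
    for year in sorted(totals):
        writer.writerow([year, f"{totals[year]:.2f}"])
```

Unglamorous, but oddly satisfying, and far more interesting to me than cold-calling alumni.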

All that being said (and with several years of hindsight, and after interacting with professionals in other fields), I would place computer science on the same level of creativity as screenwriting or practicing law. In all these cases, there are certain structures and rules you have to follow, some more flexible than others. Some of the tools and materials you have to work with are beautiful and elegant, others are messy and ugly but can be partly avoided, and some are just necessary evils. But within those rules, and using those tools, you have the chance to exercise creativity and ingenuity, and maybe even achieve some true beauty (if you're lucky and capable enough). In fact, having taken a bunch of law classes, I would say that the practice of computer science has a lot in common with the practice of law (though that's a matter for another post).

Perhaps the view of computer science as tedious is an indictment of our teaching methods, our tools, or the creeping incidental complexity that easily infests software projects of any size. But if projects like Pyret, Scratch and Jupyter notebooks are any indication, there seems to be a concerted effort to change that. I'm not a fan of the mindset that says HTML/CSS/JS must be taught to everyone, and it would be disingenuous to say that computer science is simple or easy. But as both academics and practitioners, I do hope that we can be honest about the effort and occasional drudgery involved, while helping people understand and appreciate the joy of programming and the thrill of computation.

Picking Your Battles

Over the weekend I took a few hours off from a paper push to watch a couple episodes of the second season of Gotham. In one of them, Alfred Pennyworth has some words of wisdom for the young Bruce Wayne (who has just signed him up for a fist fight):

Pick your battles, don’t let your battles pick you.

Just prior to watching this, I was busy planning a trip I could have planned weeks ago. That probably explains why I thought about Alfred's line in the context of planning and organization. I often put things off until the last moment, and then get them done just in the nick of time, feeling like my back's against the wall. As a result, I'm often doing just the bare minimum, repeatedly losing the opportunity to do a better job or, as happened with planning this trip, to get more out of an experience.

Being early and prepared lets you pick your battles. Procrastination and disorganization let your battles pick you, and often let them kick your ass.

A Kickstarted Reissue of Principia Mathematica

A small Spanish publisher, Kronecker Wallis, has a Kickstarter for a new edition of Isaac Newton's Principia Mathematica. As you can see from their Instagram account, the finished product is going to be beautiful. The publishers are making some interesting design choices, including producing a separate volume for each of the original's three books, using a visible binding that leaves the spine bare, using just two colors (petrol blue and coral orange), and setting the text in a low-contrast serif font.

[Image: the Principia reissue]

As of this writing, the Kickstarter is already a third of the way funded, with over three weeks left. There are a number of support tiers, starting with a single copy at just €45.

(Via Jason Kottke)

Star Trek Beyond

Was very enjoyable. Spoilers follow.

The movie was a lot of fun, and managed to hit a good mix of serious and light-hearted. I liked it much more than I did Into Darkness, and it might just be my favorite of the Abrams Star Trek movies.

As my favorite Star Trek blog calls it: it was a romp. It was a lot of fun and struck most of the themes that make Star Trek what it is—interesting characters, healthy optimism, underlying themes of unity, courage and friendship, and struggles both personal and epic. Take out the destruction of the Enterprise and squeeze it down to under an hour and the movie would have made a great TOS episode.

The visuals are of course simply beautiful (something true of the Abrams movies in general). The outfits, locales and effects in general are well done. The sequences showing life aboard the Enterprise and Starbase Yorktown are smooth, informative and impressive without being overwhelming. In fact, I would say that the scenes aboard Starbase Yorktown do one of the best jobs of showing off life in the Federation in any iteration of Star Trek.

Finally, the movie also does a good job of addressing Nimoy's death (and the loss of one of the main characters of both this and previous iterations of the franchise). It's not overly dramatic, but it is respectful, elegant and helps drive the rest of the story forward. And I absolutely love that one of the final shots of the movie is this photo of the original cast:

[Image: the original cast, from Star Trek V: The Final Frontier]

The movie wasn’t perfect: the action seemed choppy, some of the humor was unnecessarily forced, and some of the science was suspect. But it was a damn good Star Trek movie and a good movie in general. Would watch again.

Work, Life, Balance, Choose Two

Yesterday a friend of mine asked me how I manage to get everything done as a grad student. Truth be told, I don't always manage to. I often get things done either just in time or just after time, and some things are routinely put on hold (cooking, vacuuming) so that other things can get done on time (papers, code). The so-called "work-life balance" can be an elusive goal for graduate students, and I suppose for academics in general. While some academics I know are better at it than others, I suspect there are few, if any, who have truly nailed it down.

With that in mind, yesterday I also stumbled across a poem (and recording) by the late Kenneth Koch that seems relevant. Entitled "You want a social life, with friends", it is of course about the difficulty of (and the compromises involved in) making its title a reality. I won't endorse or dispute its claims, but at least on first reading, there does seem to be an undertone of truth to it. Without further ado:

You want a social life, with friends,
A passionate love life and as well
To work hard every day. What’s true
Is of these three you may have two
And two can pay you dividends
But never may have three.

There isn’t time enough, my friends—
Though dawn begins, yet midnight ends—
To find the time to have love, work, and friends.
Michelangelo had feeling
For Vittoria and the Ceiling
But did he go to parties at day’s end?

Homer nightly went to banquets
Wrote all day but had no lockets
Bright with pictures of his Girl.
I know one who loves and parties
And has done so since his thirties
But writes hardly anything at all.

Investing in the Open Web

It seems like every few days there's a new post lamenting the death of the Open Web and the corresponding rise of ad-driven social media machines and clickbait. Recent examples include this lament on the Cult of the Attention Web (prompted by Instagram moving to an algorithmic presentation, away from a chronological timeline), and Brendan Eich's response to online news publishers strongly objecting to the ad-blocking browser Brave.

At the risk of beating a dead horse, we seem to have collectively struck a number of Faustian bargains: free services in exchange for our personal information; free articles, audio and video in exchange for advertising, more personal information and ugly, slow sites; walled gardens, in whose operation we have little say, in exchange for ease-of-use. And while I would love to pit advertisers and social media giants against brave independent bloggers and developers in a black-and-white contest, the reality is never quite so simple.

If we really want a vibrant, independent, open web, we need to invest in it with our time, money, effort, and technical know-how. But I don't know if that investment exists, or if the people complaining about the state of the open web are ready to make it. Examples abound: the above piece about Instagram is posted on Medium, which might join said Cult of the Attention Web any day. WordPress, which powers a significant fraction of the open web (and on which this site is built), would rather pretend that it's a feed-reader and encourage me to "follow" other blogs than make it simple and quick to write or edit posts (it takes me four clicks from the WordPress.com page to start editing a draft). And I myself would rather rant about investing in the open web than build a CMS I would actually enjoy using.

If we seriously care about preserving an open web outside of walled gardens and free of ugly, privacy-destroying advertising, we need to be an active part of it. We need to publish to our own domains, backed by services that won’t turn into advertising machines tomorrow, maybe even pay for hosting. We need to vote with our wallets and actually subscribe to publications we want to read and support. We need to write code and build publication platforms that embody our ideals and values, and make it easier for others to do the same.

I do two of those three, though not as often as I would like to. I don’t exaggerate when I say I wouldn’t be where I am in my life without the open web. I would like to invest in it so that others can say the same in the future.