Books I read in 2017

A quick post about books I’ve read in 2017 to ring in the New Year. There weren’t many (something to change in 2018), but I’m very happy with the ones I did read.

Why Not Me? by Mindy Kaling

This was the first book I read in 2017. I don’t remember much about it, other than being impressed by how hard working Mindy Kaling was. Goodreads tells me I gave it 5 stars, so I must have really liked it at the time.

Man’s Search for Meaning by Viktor Frankl

I read this in one sitting during a cross-country flight. Reading about what victims of the Holocaust endured in Nazi concentration camps has a way of putting your life’s problems in perspective. The first part of the book focuses on how the average person survives and reacts to life in the brutality and extreme cruelty of a concentration camp. The second part of the book introduces Frankl’s theories of meaning as expressed in his approach to psychology: logotherapy. In essence, the meaning of life is found in every moment of living, even in the midst of suffering and death. 5 stars and highly recommended for everyone.

American Gods by Neil Gaiman

Very enjoyable and signature Neil Gaiman. I loved the themes, the concepts, the writing style, and especially how Gaiman weaves together so many different characters and ideas into a single coherent narrative. I was a little disappointed by what felt like an anti-climactic resolution, but the rest of the book is good enough to warrant 4 stars.

Frankenstein by Mary Shelley

I was really looking forward to reading this and comparing it to all the modern portrayals of Frankenstein (both the doctor and his monster). But full confession: Dr. Frankenstein comes off as a complete jerk and I got tired of his whining about two thirds through the book and couldn’t finish. The monster’s parts, by comparison, were captivating and very enjoyable. Maybe I’ll manage to get through it this year.

Deep Work: Rules for Focused Success in a Distracted World by Cal Newport

The only self-help book I read this year, and it was definitely a good choice. It’s a dense book and I haven’t implemented all the suggestions, but it has certainly helped me think more about how I go about my work, and made me reconsider and pay more attention to things like how many distractions I tolerate. Consider this required reading for anyone working in an intellectual or creative field. My only complaint is that some of the chapters are really long and packed with information; some restructuring into smaller segments would have helped.

Binti and Binti: Home by Nnedi Okorafor

Great examples of Afrofuturism and modern science fiction. I wouldn’t call them “hard” science fiction, but they are chock-full of interesting concepts and ideas, and the characters and their perspectives are refreshingly different from standard science fiction tropes. I’m looking forward to the final book in the series that’s due out soon.


We Need Hyperlink Literacy

A couple weeks ago, I was in a student discussion with James Grimmelmann, law professor at Cornell Tech who studies how laws regarding technology affect freedom, wealth and power. A large part of our discussion centered around search engines and media platforms, specifically how personalization and algorithmic filters may affect what users see, even when they don’t understand or know that they’re getting filtered content. One way to tackle this issue (in addition to regulation, or some form of opt-out) would be media literacy: teach people that what they see is not some kind of perfect, impartial truth, but might be tailored to their preference and past histories (and biased in other ways).

Fostering that kind of media literacy among the populace at large is at once sorely needed and immensely difficult. Given how much our society depends on the Internet, the World Wide Web, search engines, social media platforms and the (often inscrutable) algorithms behind them, it is crucial that people understand how they get their information, and what are the biases, agendas, and influences shaping what they see (and don’t see). This is clearly a massive challenge, and likely one that we don’t yet know how to overcome. Personally, I would settle for solving one much smaller piece of the puzzle first: a more general understanding and respect for hyperlinks.

The humble hyperlink is the foundation of the web as we know it. It takes the web from being just a digital version of a library or filing cabinet to something truly new: a system where direct access to a piece of information is as easy as naming it. Unfortunately, with the rise of digital walled gardens such as Facebook (and, to a lesser degree, Twitter), the hyperlink seems to be losing prominence. That’s ironic given that the hyperlink is a sharing mechanism, and Facebook would like to bill itself as a platform for sharing and connecting. On a normal web page, one can use snippets of text as anchors for a hyperlink, instead of using the raw link itself. Facebook doesn’t let you turn pieces of text in a status update into links. Furthermore, pasting more than one link at a time breaks the user interface. I suppose Facebook wants to give the link more prominence than what you have to say about it. People like Dave Winer and John Gruber have commented at length on how Facebook breaks the web. Poignantly, that last sentence (with two hyperlinks) would be impossible to write properly in Facebook.

And it’s not just Facebook. Twitter behaves in much the same way. Slack supports links just as poorly: there’s no obvious way to use pieces of text as anchors for links. It handles multiple links slightly better, showing previews for each one (though the previews are bigger and more prominent than the message containing the links). These systems are silos: they prefer you share and interact with posts within their own boundaries, rather than with content on the web as a whole.

By reducing the prominence of hyperlinks and truncating their utility, we create online cultures focused on sharing information, rather than ones that encourage combining and synthesizing multiple sources and viewpoints into coherent narratives. I would argue that in doing so we are giving up a large part of the power of the Web, to our detriment, and for no clear benefit.

So how do we fix this? Certainly, there is an argument to be made for reducing our dependence on platforms where we cannot sufficiently control our own writing. But beyond that, I would like to see hyperlinks become a more ingrained part of writing on a computer. I would love to see a world where whenever you write some text on a computer that references external sources, you link copiously to them, rather than just inserting references that readers have to look up manually. School and college writing classes would be the prime places to teach this. In the same way that we teach students to include citations to external sources, I would like to see students treat hyperlinks with the same importance and fluency.

In a deeply connected technological society such as ours, using the core technologies of the web should be a central part of any kind of digital or media literacy.

Steve Jobs, the Xerox Alto, and computer typography

A few nights ago I met a wonderful woman at a pipe organ concert who worked for several decades at Xerox, programmed in the Mesa and Cedar languages, as well as Smalltalk on the original Altos.

She told me that she eventually left programming because she felt like modern computing and programming had become bureaucratic and process-oriented, more like engineering and less creative. These days she was more interested in statistics and data science.

Personally, I’m glad to see computing and programming mature into an engineering discipline, but I would also very much like to see programming embraced as a creative endeavor. I hope that it’s possible to do both: embrace modern type systems, property-based testing and metaprogramming to build reliable systems, while interacting with clean and beautiful tools and interfaces instead of the modern mess of HTML/CSS/JavaScript (or similar messes in other technology stacks).

What if the Singularity already happened?

I’ve been re-reading one of my favorite science fiction books, Accelerando by British author Charlie Stross. In one of my favorite passages, some of the characters are sitting around talking about their belief in the Singularity. One of the characters makes the following claim (about when the Singularity happened):

“Au contraire. It happened on June 6, 1969, at 1100 hours, Eastern Seaboard Time,” Pierre counters. “That was when the first network control protocol packets were sent from the data port of one IMP to another — the first ever Internet connection. That’s the Singularity. Since then we’ve all been living in a universe that was impossible to predict from events prior to that time.”

While it’s typical to equate the Singularity with the future advent of superhuman artificial intelligences, I think this definition makes a lot more sense. The Internet has had more impact on our world in the recent past than any other technology (especially after the advent of mobile, pocket-sized connected computing devices), and furthermore, it came almost completely out of left field. Few of the “classic” science fiction stories I remember reading (particularly by Isaac Asimov) prominently feature networked computers, even though they have faster-than-light spaceflight, aliens, robots and the like. Perhaps we should take that as a warning: the most disruptive technologies are the ones we’re least cognizant of, until the disruption is well under way.

The Spirit of Jane Austen

After reading one too many posts about how to (and why we should) read more, last night I sat down to read an article on The Atlantic about Jane Austen. Though I remember reading Pride and Prejudice once upon a time, and am generally aware of her status as a cultural icon, I can’t say I know very much about Jane Austen. This piece was interesting as an insight into her cultural impact and changing interpretation over time. However, what stood out to me was the author’s interpretation of Austen and her characters as agents of the humanist revolution sweeping Europe and the West in the late eighteenth and nineteenth centuries. In particular, I was struck by this excerpt:

Spiritedness is a way of understanding oneself as having rights. It experiences those rights as a joy, as a sense of blossoming, of freedom; but also as something often in need of quickly roused defense. It is the style of the revolutions—American, French—encroaching on Austen’s Britain, put in the mouths of intelligent young women who know their own worth.

Elizabeth’s is a declaration of rights; she demands the pursuit of happiness.

Since we seem once more to be living in times where personal liberties and rights are being questioned, and to some extent redefined, perhaps it’s time to pick up some Austen.

Yesterday I rewrote about half (the entire front-end) of a project that took me and two other collaborators several months to complete a few years ago. At the end of a solid work afternoon, I had things compiling and running, but in a very buggy state. Unexpectedly, the associated feelings turned out to be somewhat bittersweet and conflicted. I’m happy and proud of how much I’ve improved in the years since I first worked on this project, but I’m also sad thinking of how much faster and better things might have gone originally if I had known back then all the things that I know now.

Later in the evening, I learned something new (GADTs and how to use them in OCaml), which makes me hope that in a few years I’ll be even more capable than I am now. At the same time, I’m also wary of falling into the trap of rehashing old projects and ideas with new tools and techniques, rather than moving forward into new areas.
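For the curious, here’s a minimal sketch of the kind of thing GADTs enable in OCaml (this is my own illustrative example, not the code from the project): a tiny expression language whose type index lets the evaluator return a value of the right type without any runtime tags or checks.

```ocaml
(* A small GADT-indexed expression language. The type parameter of
   [expr] records the type of the value the expression evaluates to. *)
type _ expr =
  | Int  : int -> int expr
  | Bool : bool -> bool expr
  | Add  : int expr * int expr -> int expr
  | If   : bool expr * 'a expr * 'a expr -> 'a expr

(* The locally abstract type [type a.] lets each branch refine [a],
   so [eval] is total and needs no runtime type dispatch. *)
let rec eval : type a. a expr -> a = function
  | Int n -> n
  | Bool b -> b
  | Add (x, y) -> eval x + eval y
  | If (c, t, e) -> if eval c then eval t else eval e

let () =
  Printf.printf "%d\n" (eval (If (Bool true, Add (Int 1, Int 2), Int 0)))
```

The payoff is that ill-typed expressions like `Add (Bool true, Int 1)` are rejected at compile time, which is exactly the kind of guarantee that makes rewriting an old front-end feel so much safer the second time around.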

A part of me really wants things to be “just right”, not just when it comes to work, but in the rest of my life as well. It’s almost a visceral discomfort when I have to deal with things that don’t seem (close to) perfect for long periods of time. But at the same time, keeping a steady pace of progress in research and engineering requires knowing when things are “right enough” and moving on to the next thing.

Navigating that boundary isn’t something that I have a lot of experience with, but I’m hoping that just like my programming skills, it’s going to be something I get better at in the coming years.

Problem: I am a human being

A relevant excerpt:

“If you are a person who has come of age during the time of ubiquitous internet access and github, you cannot know what having access to the source code of an entire operating system meant in the mid 90s. When I saw the impact this access had on my own life in the coming years, I began to view closed source software as unethical. In a society that was increasingly mediated by software, restricting access to learning about how software works is in my opinion a means of control and subjugation.”

For me, growing up in India, that time was the early 2000s. My first Linux distro was Ubuntu, thanks largely to Canonical shipping out Ubuntu CDs to people around the world, something that seemed like a ludicrous idea at the time.

I wouldn’t be where I am without free software (both free as in beer, and free as in freedom). Also, Star Trek.