We Need Hyperlink Literacy

A couple of weeks ago, I was in a student discussion with James Grimmelmann, a law professor at Cornell Tech who studies how laws regarding technology affect freedom, wealth and power. A large part of our discussion centered on search engines and media platforms, specifically how personalization and algorithmic filters may affect what users see, even when they don’t know or understand that they’re getting filtered content. One way to tackle this issue (in addition to regulation, or some form of opt-out) would be media literacy: teach people that what they see is not some kind of perfect, impartial truth, but might be tailored to their preferences and past history (and biased in other ways).

Fostering that kind of media literacy among the populace at large is at once sorely needed and immensely difficult. Given how much our society depends on the Internet, the World Wide Web, search engines, social media platforms and the (often inscrutable) algorithms behind them, it is crucial that people understand how they get their information, and what biases, agendas, and influences shape what they see (and don’t see). This is clearly a massive challenge, and likely one that we don’t yet know how to overcome. Personally, I would settle for solving one much smaller piece of the puzzle first: a more general understanding of, and respect for, hyperlinks.

The humble hyperlink is the foundation of the web as we know it. It takes the web from being just a digital version of a library or filing cabinet to something truly new: a system where direct access to a piece of information is as easy as naming it. Unfortunately, along with the rise of digital walled gardens such as Facebook (and, to a lesser degree, Twitter), the hyperlink seems to be losing prominence. That’s ironic, given that the hyperlink is a sharing mechanism, and Facebook would like to bill itself as a platform for sharing and connecting. On a normal web page, one can use a snippet of text as the anchor for a hyperlink, instead of pasting in the raw link itself. Facebook doesn’t let you turn pieces of text in a status update into links. Furthermore, pasting more than one link at a time breaks the user interface. I suppose Facebook wants to give the link more prominence than what you have to say about it. People like Dave Winer and John Gruber have commented at length on how Facebook breaks the web. Poignantly, that last sentence (with two hyperlinks) would be impossible to write properly on Facebook.

And it’s not just Facebook. Twitter behaves in much the same way. Slack handles links similarly poorly: there’s no obvious way to use a piece of text as the anchor for a link. Adding more than one link works slightly better: each link gets a preview (though the previews are bigger and more prominent than the message containing the links). These systems are silos: they prefer that you share and interact with posts within their own boundaries, rather than with content on the web as a whole.

By reducing the prominence of hyperlinks and curtailing their utility, we create online cultures focused on sharing isolated pieces of information, rather than ones that encourage combining and synthesizing multiple sources and viewpoints into coherent narratives. I would argue that in doing so we are giving up a large part of the power of the Web, to our detriment, and for no clear benefit.

So how do we fix this? Certainly, there is an argument to be made for reducing our dependence on platforms where we cannot sufficiently control our own writing. But beyond that, I would like to see hyperlinks become a more ingrained part of writing on a computer. I would love to see a world where, whenever you write some text on a computer that references external sources, you link copiously to them, rather than just inserting references that readers have to look up manually. School and college writing classes would be the prime places to teach this: just as we teach students to include citations to external sources, we should teach them to use hyperlinks with the same importance and fluency.

In a deeply connected technological society such as ours, using the core technologies of the web should be a central part of any kind of digital or media literacy.


Steve Jobs, the Xerox Alto, and computer typography

A few nights ago, at a pipe organ concert, I met a wonderful woman who had worked at Xerox for several decades, programming in the Mesa and Cedar languages, as well as Smalltalk on the original Altos.

She told me that she eventually left programming because she felt like modern computing and programming had become bureaucratic and process-oriented, more like engineering and less creative. These days she was more interested in statistics and data science.

Personally I’m glad to see computing and programming mature into an engineering discipline, but I would also very much like to see programming embraced as a creative endeavor. I hope that it’s possible to do both: embrace modern type systems, property-based testing and metaprogramming to build reliable systems, while interacting with clean and beautiful tools and interfaces instead of the modern mess of HTML/CSS/JavaScript (or similar messes in other technology stacks).
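To make that hope a little more concrete, here is a minimal sketch of property-based testing in OCaml. The choice of the QCheck library and the toy property are mine, for illustration only: rather than writing individual test cases, you state a property and let the tool try to falsify it with many randomly generated inputs.

    (* Requires the qcheck library (e.g. opam install qcheck). *)
    (* A toy property: reversing a list twice gives back the original list.
       QCheck generates 1000 random lists of ints and checks it on each. *)
    let rev_twice_is_identity =
      QCheck.Test.make ~count:1000 ~name:"rev_twice_is_identity"
        QCheck.(list int)
        (fun l -> List.rev (List.rev l) = l)

    (* Run the test; if the property fails, a shrunk counterexample is printed. *)
    let () = QCheck_runner.run_tests_main [ rev_twice_is_identity ]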

What if the Singularity already happened?

I’ve been re-reading one of my favorite science fiction books, Accelerando by British author Charlie Stross. In one of my favorite passages, some of the characters are sitting around talking about their belief in the Singularity. One of the characters makes the following claim (about when the Singularity happened):

“Au contraire. It happened on June 6, 1969, at 1100 hours, Eastern Seaboard Time,” Pierre counters. “That was when the first network control protocol packets were sent from the data port of one IMP to another — the first ever Internet connection. That’s the Singularity. Since then we’ve all been living in a universe that was impossible to predict from events prior to that time.”

While it’s typical to equate the Singularity with the future advent of superhuman artificial intelligences, I think this definition makes a lot more sense. The Internet has had more impact on our world in the recent past than any other technology (especially after the advent of mobile, pocket-sized connected computing devices), and furthermore, it came almost completely out of left field. Few of the “classic” science fiction stories I remember reading (particularly by Isaac Asimov) prominently feature networked computers, even though they have faster-than-light spaceflight, aliens, robots and the like. Perhaps we should take that as a warning: the most disruptive technologies are the ones we’re least cognizant of, until the disruption is well under way.

The Spirit of Jane Austen

After reading one too many posts about how to (and why we should) read more, last night I sat down to read an article on The Atlantic about Jane Austen. Though I remember reading Pride and Prejudice once upon a time, and am generally aware of her status as a cultural icon, I can’t say I know very much about Jane Austen. This piece was interesting as an insight into her cultural impact and changing interpretation over time. However, what stood out to me was the author’s interpretation of Austen and her characters as agents of the humanist revolution sweeping Europe and the West in the late eighteenth and nineteenth centuries. In particular, I was struck by this excerpt:

Spiritedness is a way of understanding oneself as having rights. It experiences those rights as a joy, as a sense of blossoming, of freedom; but also as something often in need of quickly roused defense. It is the style of the revolutions—American, French—encroaching on Austen’s Britain, put in the mouths of intelligent young women who know their own worth.

Elizabeth’s is a declaration of rights; she demands the pursuit of happiness.

Since we seem once more to be living in times where personal liberties and rights are being questioned, and to some extent redefined, perhaps it’s time to pick up some Austen.

Yesterday I rewrote about half (the entire front-end) of a project that took me and two other collaborators several months to complete a few years ago. At the end of a solid work afternoon, I had things compiling and running, but in a very buggy state. Unexpectedly, the associated feelings turned out to be somewhat bittersweet and conflicted. I’m happy and proud of how much I’ve improved in the years since I first worked on this project, but I’m also sad thinking of how much faster and better things might have gone originally if I had known back then all the things that I know now.

Later in the evening, I learned something new (GADTs and how to use them in OCaml), which makes me hope that in a few years I’ll be even more capable than I am now. At the same time, I’m also wary of falling into the trap of rehashing old projects and ideas with new tools and techniques, rather than moving forward into new areas.
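(As an aside for readers who haven’t run into them: below is a minimal sketch of the kind of thing GADTs enable in OCaml. The example, a tiny typed expression language, is a standard illustration and not code from the project above.)

    (* A tiny expression language: each constructor records the type of value
       it produces, so ill-typed terms such as Add (Bool true, Int 1) are
       rejected at compile time. *)
    type _ expr =
      | Int  : int -> int expr
      | Bool : bool -> bool expr
      | Add  : int expr * int expr -> int expr
      | If   : bool expr * 'a expr * 'a expr -> 'a expr

    (* The locally abstract type lets eval return an int for an int expr and
       a bool for a bool expr. *)
    let rec eval : type a. a expr -> a = function
      | Int n -> n
      | Bool b -> b
      | Add (x, y) -> eval x + eval y
      | If (c, t, e) -> if eval c then eval t else eval e

    (* Prints 3. *)
    let () = Printf.printf "%d\n" (eval (If (Bool true, Add (Int 1, Int 2), Int 0)))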

A part of me really wants things to be “just right”, not just when it comes to work, but in the rest of my life as well. It’s an almost visceral reaction when I have to deal with things that don’t seem (close to) perfect for long periods of time. But at the same time, keeping a steady pace of progress in research and engineering requires knowing when things are “right enough” and moving on to the next thing.

Navigating that boundary isn’t something I have a lot of experience with, but I’m hoping that, just like my programming skills, it’s something I’ll get better at in the coming years.

Problem: I am a human being

A relevant excerpt:

“If you are a person who has come of age during the time of ubiquitous internet access and github, you cannot know what having access to the source code of an entire operating system meant in the mid 90s. When I saw the impact this access had on my own life in the coming years, I began to view closed source software as unethical. In a society that was increasingly mediated by software, restricting access to learning about [how] software works is in my opinion a means of control and subjugation.”

For me, growing up in India, that time was the early 2000s. My first Linux distro was Ubuntu, thanks largely to Canonical shipping out Ubuntu CDs to people around the world, something that seemed like a ludicrous idea at the time.

I wouldn’t be where I am without free software (both free as in beer, and free as in freedom). Also, Star Trek.

Sunday Selection 2017-06-25

Around the Web

The Largest Git Repo on the Planet

I’m always a fan of case studies describing real-world software engineering, especially ones about deploying engineering tools, complete with charts and data. This article describes Microsoft’s efforts to deploy the Git version control system at a scale large enough to support all of Windows development.

Why our attention spans are shot

While it’s no secret that the rise of pocket-sized computers and ubiquitous Internet connections has precipitated a corresponding decrease in attention spans, this is one of the most in-depth and well-researched articles I’ve seen on the issue. It references and summarizes a wide range of distraction-related issues and points to the relevant research if you’re interested in digging deeper.

Aside: Nautilus has been doing a great job publishing interesting, deeply researched, and well-written longform articles, and they’re currently having a summer sale. The prices are very reasonable, and a subscription would be a great way to support good fact-based journalism in the current era of fake news.

How Anker is beating Apple and Samsung at their own accessory game

I own a number of Anker devices — a battery pack, a multi-port USB charger, a smaller travel charger. The best thing I can say about them is that by and large, I don’t notice them. They’re clean, do their job and get out of my way, just as they should. It’s good to see more companies enter the realm of affordable, well-designed products.

From the Bookshelf

Man’s Search for Meaning

I read this book on a cross-country flight to California a couple of months ago, at a time when I was busy, disorganized, stressed and feeling like I was barely holding on. The book is based on Viktor Frankl’s experience in Nazi concentration camps during World War II. The first part focuses on how the average person survives and reacts to the brutality and extreme cruelty of life in a concentration camp. The second part introduces Frankl’s theory of meaning as expressed in his approach to psychology: logotherapy. In essence, the meaning of life is found in every moment of living, even in the midst of suffering and death.

Video

Black Panther Trailer

I’m a big fan of Ta-Nehisi Coates’ run on Black Panther, and I really enjoyed the Black Panther’s brief appearance in Captain America: Civil War. This trailer makes me really excited to see the movie when it comes out, and hopeful that it will be done well. If you’re new to the world of Wakanda, in which Black Panther will be set, Rolling Stone has a good primer.