We Need Hyperlink Literacy

A couple weeks ago, I was in a student discussion with James Grimmelmann, a law professor at Cornell Tech who studies how laws regarding technology affect freedom, wealth and power. A large part of our discussion centered on search engines and media platforms, specifically how personalization and algorithmic filters may affect what users see, even when they don’t understand or know that they’re getting filtered content. One way to tackle this issue (in addition to regulation, or some form of opt-out) would be media literacy: teach people that what they see is not some kind of perfect, impartial truth, but might be tailored to their preferences and past history (and biased in other ways).

Fostering that kind of media literacy among the populace at large is at once sorely needed and immensely difficult. Given how much our society depends on the Internet, the World Wide Web, search engines, social media platforms and the (often inscrutable) algorithms behind them, it is crucial that people understand how they get their information, and what biases, agendas, and influences shape what they see (and don’t see). This is clearly a massive challenge, and likely one that we don’t yet know how to overcome. Personally, I would settle for solving one much smaller piece of the puzzle first: a more general understanding of, and respect for, hyperlinks.

The humble hyperlink is the foundation of the web as we know it. It takes the web from being just a digital version of a library or filing cabinet to something truly new: a system where direct access to a piece of information is as easy as naming it. Unfortunately, with the rise of digital walled gardens such as Facebook (and to a lesser degree, Twitter), the hyperlink seems to be losing prominence. That’s ironic, given that the hyperlink is a sharing mechanism, and Facebook bills itself as a platform for sharing and connecting. On a normal web page, one can use a snippet of text as the anchor for a hyperlink, instead of showing the raw link itself. Facebook doesn’t let you turn pieces of text in a status update into links, and pasting more than one link at a time breaks the user interface. I suppose Facebook wants to give the link more prominence than what you have to say about it. People like Dave Winer and John Gruber have commented at length on how Facebook breaks the web. Poignantly, that last sentence (with two hyperlinks) would be impossible to write properly on Facebook.

And it’s not just Facebook: Twitter has essentially the same limitations. Slack supports links poorly in the same way: there’s no obvious way to use a piece of text as the anchor for a link. Slack does handle multiple links slightly better, showing previews for each (though the previews are bigger and more prominent than the message containing the links). These systems are silos: they would rather you share and interact with posts within their own boundaries than with content on the web as a whole.

By reducing the prominence of hyperlinks and truncating their utility, we create online cultures focused on sharing information, rather than ones that encourage combining and synthesizing multiple sources and viewpoints into coherent narratives. I would argue that in doing so we are giving up a large part of the power of the Web, to our detriment, and for no clear benefit.

So how do we fix this? Certainly, there is an argument to be made for reducing our dependence on platforms where we cannot sufficiently control our own writing. But beyond that, I would like to see hyperlinks become a more ingrained part of writing on a computer. I would love to see a world where, whenever you write text that references external sources, you link copiously to them, rather than just inserting references that readers have to look up manually. School and college writing classes would be the prime places to teach this: in the same way that we teach students to cite external sources, I would like to see them treat hyperlinks with the same importance and fluency.

In a deeply connected technological society such as ours, using the core technologies of the web should be a central part of any kind of digital or media literacy.

Steve Jobs, the Xerox Alto, and computer typography

A few nights ago, at a pipe organ concert, I met a wonderful woman who had worked at Xerox for several decades, programming in the Mesa and Cedar languages, as well as Smalltalk on the original Altos.

She told me that she eventually left programming because she felt like modern computing and programming had become bureaucratic and process-oriented, more like engineering and less creative. These days she was more interested in statistics and data science.

Personally, I’m glad to see computing and programming mature into an engineering discipline, but I would also very much like to see programming embraced as a creative endeavor. I hope that it’s possible to do both: embrace modern type systems, property-based testing and metaprogramming to build reliable systems, while interacting with clean and beautiful tools and interfaces instead of the modern mess of HTML/CSS/JavaScript (or similar messes in other technology stacks).

Yesterday I rewrote about half (the entire front-end) of a project that took me and two other collaborators several months to complete a few years ago. At the end of a solid work afternoon, I had things compiling and running, but in a very buggy state. Unexpectedly, the associated feelings turned out to be somewhat bittersweet and conflicted. I’m happy and proud of how much I’ve improved in the years since I first worked on this project, but I’m also sad thinking of how much faster and better things might have gone originally if I had known back then all the things that I know now.

Later in the evening, I learned something new (GADTs and how to use them in OCaml), which makes me hope that in a few years I’ll be even more capable than I am now. At the same time, I’m also wary of falling into the trap of rehashing old projects and ideas with new tools and techniques, rather than moving forward into new areas.
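For readers who haven’t run into them, here is a minimal sketch of the kind of thing GADTs enable in OCaml (my own illustrative example, not code from any project mentioned here): a small typed expression language, where the type parameter tracks what each expression evaluates to, so ill-typed expressions are rejected at compile time.

```ocaml
(* A GADT: each constructor pins down the type parameter, so an
   expression carries the type of the value it evaluates to. *)
type _ expr =
  | Int  : int -> int expr
  | Bool : bool -> bool expr
  | Add  : int expr * int expr -> int expr
  | If   : bool expr * 'a expr * 'a expr -> 'a expr

(* The locally abstract type ("type a.") lets each branch refine
   the return type: Int returns int, Bool returns bool, and so on. *)
let rec eval : type a. a expr -> a = function
  | Int n -> n
  | Bool b -> b
  | Add (x, y) -> eval x + eval y
  | If (c, t, e) -> if eval c then eval t else eval e
```

With this definition, `eval (If (Bool true, Int 1, Int 2))` type-checks and returns an `int`, while something like `Add (Bool true, Int 1)` is a compile-time error rather than a runtime one — which is exactly the extra precision a plain variant type can’t express.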

A part of me really wants things to be “just right”, not just when it comes to work, but in the rest of my life as well. It’s almost a visceral reaction when I have to deal with things that don’t seem (close to) perfect for long periods of time. But at the same time, keeping a steady pace of progress in research and engineering requires knowing when things are “right enough” and moving on to the next thing.

Navigating that boundary isn’t something I have a lot of experience with, but I’m hoping that, just like my programming skills, it’s something I’ll get better at in the coming years.

Problem: I am a human being

A relevant excerpt:

“If you are a person who has come of age during the time of ubiquitous internet access and github, you cannot know what having access to the source code of an entire operating system meant in the mid 90s. When I saw the impact this access had on my own life in the coming years, I began to view closed source software as unethical. In a society that was increasingly mediated by software, restricting access to learning about how software works is in my opinion a means of control and subjugation.”

For me, growing up in India, that time was the early 2000s. My first Linux distro was Ubuntu, thanks largely to Canonical shipping out Ubuntu CDs to people around the world, something that seemed like a ludicrous idea at the time.

I wouldn’t be where I am without free software (both free as in beer, and free as in freedom). Also, Star Trek.

Sunday Selection 2017-06-25

Around the Web

The Largest Git Repo on the Planet

I’m always a fan of case studies describing real-world software engineering, especially ones that cover deploying engineering tools and include charts and data. This article describes Microsoft’s efforts to deploy the Git version control system at a scale large enough to support all of Windows development.

Why our attention spans are shot

While it’s no secret that the rise of pocket-sized computers and ubiquitous Internet connections has precipitated a corresponding decrease in attention spans, this is one of the most in-depth and well-researched articles I’ve seen on the issue. It references and summarizes a wide range of distraction-related issues and points to the relevant research if you’re interested in digging deeper.

Aside: Nautilus has been doing a great job publishing interesting, deeply researched, and well-written longform articles, and they’re currently having a summer sale. The prices are very reasonable, and a subscription would be a great way to support good fact-based journalism in the current era of fake news.

How Anker is beating Apple and Samsung at their own accessory game

I own a number of Anker devices — a battery pack, a multi-port USB charger, a smaller travel charger. The best thing I can say about them is that by and large, I don’t notice them. They’re clean, do their job and get out of my way, just as they should. It’s good to see more companies enter the realm of affordable, well-designed products.

From the Bookshelf

Man’s Search for Meaning

I read this book on a cross-country flight to California a couple months ago, at a time when I was busy, disorganized, stressed and feeling like I was barely holding on. The book is based on the author’s experience in Nazi concentration camps during World War II. The first part focuses on how the average person survives and reacts to the brutality and extreme cruelty of a concentration camp. The second part introduces Frankl’s theory of meaning as expressed in his approach to psychology: logotherapy. In essence, the meaning of life is found in every moment of living, even in the midst of suffering and death.

Video

Black Panther Trailer

I’m a big fan of Ta-Nehisi Coates’ run on Black Panther and really enjoyed the character’s brief appearance in Captain America: Civil War. This trailer makes me really excited to see the movie when it comes out, and hopeful that it will be done well. If you’re new to the world of Wakanda, in which Black Panther is set, Rolling Stone has a good primer.

When I was visiting Boston a couple weeks ago, an old friend of mine remarked that he thought the humanities should be “the pinnacle of human achievement”. I wasn’t sure what he meant at the time, but the phrase has been rattling around in my head. In between working on a conference presentation, attending a law conference and reading the Meditations of Marcus Aurelius, I think I’ve come up with an explanation that makes sense (at least for me):

Science and technology, for all their towering achievements, are concerned with improving and extending human existence. By contrast, the humanities (and I think also the arts) should inform and improve how humans live.

Computer Science as a Creative Endeavor

Yesterday, Professor Eugene Wallingford posted about how computer science is not that different from a lot of other professions. Some parts of it are interesting and exciting, but a decent chunk is tedious, frustrating and boring as well. In his words:

Let’s be honest with ourselves and our students that getting good at anything takes a lot of hard work and, once you master something, you’ll occasionally face some tedium in the trenches. Science, and computer science in particular, are not that much different from anything else.

While I agree with the general message, in my experience, computer science is also a wonderfully creative and exciting endeavor (even if we often struggle to portray it as such).

Throughout college I had a variety of on-campus jobs, mostly to make some spending money and pay for the occasional trip with friends. Looking back, the jobs I remember as most fulfilling and interesting all involved something I would broadly call “computer science”. Some of it was honest-to-goodness computer science research (which later helped me get into graduate school), some was data crunching, and some was simply straightforward scripting and system administration, with a dash of web design. In fact, my first job involved calling up alumni to ask for donations, and I promptly quit it after a week once I got a job doing data processing for a professor.

Of course, I had found computers interesting for several years prior to my first college job, so I certainly wasn’t unbiased when I started comparing jobs. I can also imagine that a lot of people would consider calling up alumni a far more interesting job than munging CSV files with Python scripts. And there are certainly parts of even technically demanding programming tasks that I find tiresome and would happily avoid (pretty much anything to do with CSS).

All that being said (and with several years of hindsight and interaction with professionals in other fields), I would place computer science on the same level of creativity as screenwriting or practicing law. In all these cases, there are certain structures and rules you have to follow, some more flexible than others. Some of the tools and materials you have to work with are beautiful and elegant, others are messy and ugly but can be partly avoided, and some are just necessary evils. But within those rules, and using those tools, you have the chance to exercise creativity and ingenuity, and maybe even achieve some true beauty (if you’re lucky and capable enough). In fact, having taken a bunch of law classes, I would say that the practice of computer science has a lot in common with the practice of law (though that’s a matter for another post).

Perhaps the view of computer science as tedious is an indictment of our teaching methods, our tools, or the creeping incidental complexity that easily infests software projects of any size. But if projects like Pyret, Scratch and Jupyter notebooks are any indication, there seems to be a concerted effort to change that. I’m not a fan of the mindset that says HTML/CSS/JS must be taught to everyone, and it would be disingenuous to say that computer science is simple or easy. But as both academics and practitioners, I do hope that we can be honest about the effort and occasional drudgery involved, while helping people understand and appreciate the joy of programming and the thrill of computation.