Are Computers Still a Bicycle for the Mind?

Steve Jobs had an enormous appreciation for the computer, believing it was the greatest human invention, and he commonly likened it to a bicycle for our minds. It's an analogy he explained in interviews and presentations throughout his career.

He refined his delivery over the years, but the underlying analogy was always the same: the bicycle dramatically increases the efficiency of human locomotion, and likewise the computer dramatically increases the efficiency of human thought. That still holds when computers, the Internet, and increasingly Artificial Intelligence and Machine Learning are used as tools that leverage our innate abilities to solve huge, complex problems, but they can also become other things for the mind that are not so useful. More and more, as computers proliferate, shrink, and become ever more convenient and ubiquitous, they stop being treated as a tool and start being treated as a toy or simply a distraction. Maybe computers are becoming less like a bicycle for the mind and more like something else.

For decades the computer has been an essential tool for transforming and advancing our civilization. We use computers for nearly everything we build and do now: manufacturing, design, construction, finance, banking, communication, supply chain management, retail, medicine, research, exploration, and on and on. We use computers to design and develop newer and faster computers—a profound use for a tool—and we've been using them in this way since they were first capable of doing the job. It's the ultimate tool that can be used to make anything more advanced, even itself. The computer is the driving force behind its own exponential improvement. That characteristic makes it unique among all of the technological tools we've invented.

As computers have gotten more powerful, we have also designed them to be smaller and more capable to the point where now we can carry around a supercomputer in our pocket that is always connected to even more powerful supercomputers in the cloud. This supercomputer can run for hours on a small battery and connect us to nearly the entirety of human knowledge with a touch of the screen. Having such an incredibly powerful tool at our fingertips should enable us to do amazingly advanced things, and in most cases it does.

However, the smartphone in particular was not designed to create amazing new things, at least in the physical world. Sure, it can definitely be used to create in the virtual world: YouTube videos, Instagram pictures, or stories that can go viral and spread through the Internet in a millisecond, but it isn't normally used to design and build real new things like the next interplanetary rocket engine or distributed renewable energy system. These tasks are left to its more powerful and capable predecessor—the computer workstation. The smartphone was simply not designed for complex design tasks because it lacks the complex inputs that such tasks require. It was designed for something else.

The smartphone was designed to hold our attention. Maybe not at first. The first smartphones needed to serve a useful purpose to justify their existence and their sale to a willing consumer. Being able to take a quick picture, check your email, or look up the nearest restaurant while away from your desktop computer was a reasonable value proposition for a lot of people (and holding your entire music collection was a major selling point for the original iPhone). Those first useful, need-fulfilling apps quickly expanded into a sea of apps and features that constantly grab at our attention with notifications, badge app icons, and never-ending feeds.

Holding as many people's attention for as long and as often as possible is definitely the name of the game for smartphones now. The companies that have figured out how to do this best have grown the largest with Apple, Google, and Facebook at the top of the list. Amazon is right up there, too, but they deal as much in selling physical goods and providing the computing infrastructure that most other companies use to build their businesses as they do in providing entertainment. Twitter would be up there as well, if they could only figure out how to monetize their users' attention as effectively as the other companies do.

Where does all of this attention seeking get us? We're not using this feat of human ingenuity to create even more useful things. We're using it to burn our time playing simple repetitive games with in-app purchases, trolling other people on Twitter, and reading all about the latest scandals and horrible fear-inducing news on our Facebook news feeds. We've even reached the point where journalists routinely write entire news articles about what people said and how other people responded on Twitter. Who had the best burn? Who made the best meme? How is this news?! This civilization-advancing technology, this bicycle for the mind, is no longer being used to take us to new places. We're now using it to go in circles, wasting time without making any progress. We've created a technology carousel.

We haven't just created and unleashed this time-wasting carousel; we can't get off of it. We keep going round and round, checking notifications, scrolling feeds, and searching and searching and searching. We're searching for approval, for signs that friends and acquaintances liked our posts. We're searching for responses to DMs. We're searching for that next inspiring post or cute pet pic. Studies show that the average person now spends two hours a day on social media (careful, it seems like a spammy website, and of course I couldn't find the actual source). A Morgan Stanley report cited in The Atlantic says that "depending on which study you choose to look at, adults on average check their smartphones as many as 80 or even 200 times each day." According to a report on a study in The Washington Post, teens are on different forms of media for nearly nine hours a day, including online music and videos.

The number of hours taken from our lives and handed over to social media companies for advertising dollars is simply astonishing. We trade our precious time for the next dopamine hit again and again because we're addicted to the artificial sense of connection we feel when we send and receive these virtual messages out over the ether. We're not making real connections with other human beings, though. We're making stronger connections with our phones, which is exactly what was intended to happen by design. We may try to convince ourselves that all of this checking and searching is accomplishing something important or filling a need, but most of the time what we're accomplishing for ourselves is further isolation and distraction.

We are succeeding in reducing our attention span and destroying our ability to focus. If we obsessively need to check our phones every few minutes, that doesn't leave much mental capacity for deep, creative thought. As Nicholas Carr, author of The Shallows: What the Internet Is Doing to Our Brains, describes, our mental functions end up fragmented and shallow, and we're unable to follow complex lines of thought to more innovative solutions to our problems. When we can no longer function without our phones, they may not be just a technology carousel but a technology crutch that we depend on to hold us up.

One would think that with all of the talk about how quickly technology is advancing, and how we're all walking around with supercomputers in our pockets, we would be having a cultural renaissance. One would expect that with that kind of technical leverage, productivity would be accelerating through the roof, but that's not what we're seeing. The following plot shows total factor productivity in the US up to 2014.

Plot of Total Factor Productivity 1950-2014

Productivity clearly stalled out in the 70s, picked up again in the 80s and 90s, and then decelerated again in the 2000s. Unfortunately, this graph doesn't show the last four years, but what we do see is still disconcerting. According to the technology evangelists, the 2000s and beyond should have seen accelerating productivity, because that's when computers really came into their own and smartphones and apps started penetrating the market and eating the world.

I'm not claiming that smartphones and social media are the cause of this productivity slowdown. I'm not an economist, and even the experts don't seem to have a great handle on why this is happening. But these newest technology entrants certainly aren't helping productivity, and there are a number of reasons why that may be, beyond distracting us with the technology carousel.

First, digital recording technologies—embodied in many ways by the smartphone—seem to be replacing our memories. Another report in The Atlantic, with the provocative title Is Technology Making Us Dumb?, cites studies showing that if you record something, whether in audio, images, or video, you remember it less. Beyond recording our memories instead of remembering them, we have become better at finding information but worse at retaining it, because Google is just a click away. We don't have to remember how to do things anymore. We can just look it up whenever we need to, so as soon as we've accomplished whatever goal we had, the information is jettisoned and we get back on the carousel.

That behavior of always looking things up also has another side effect. We practice skills less. Without sufficient practice, we can't develop fundamental skills into more complex problem-solving abilities. So many useful skills are of the use-it-or-lose-it variety, and always looking things up instead of practicing has the nasty side effect of losing it. Google Maps is a great example. Before Google Maps we needed to do route-planning and navigation with an actual map. Even folding up the map after using it was its own problem solving exercise. Now we just follow the directions fed to us from the Google Assistant in the phone stuck to the dashboard. That Atlantic article had some great words on this subject specifically:
Using your GPS for all navigation is also damaging to your brain. A series of studies showed that people who rely on GPS to get around have less activity in the hippocampus, an area of the brain involved in both memory and navigation, than those who use maps and learn to navigate based on landscape indicators. Your spatial memory develops far less when you are on GPS autopilot than when you need to observe what is around you to determine where you are going and how to get back.
GPS is certainly easy and convenient, but it's not doing us any favors in the mental health department. After using navigation for a few months, most people will be lost without it. It quickly becomes a technological crutch.

Google Assistant is gaining new abilities every year, too. Now it can be your personal secretary, making appointments and ordering food for you, as seen at Google I/O:

This AI performance is incredibly cool, no denying it, but what is it taking away from us? Will we start losing the skill of human interaction as these AIs do more and more of these things for us? People are already horrible to each other on Twitter and Facebook because they've stopped appreciating that there's another human being on the other end of the conversation, not just boxes of black-and-white text. How is Google Assistant going to change our culture as we interact with fewer and fewer people on a daily basis?

Driving is another skill that we probably won't commonly exercise in the near future. Of course, I look forward as much as anyone to the day my car can drive itself, because driving is boring and tedious and humans have consistently shown how bad they are at it, but we should still acknowledge what we're giving up. Even though traffic deaths would be significantly reduced, driving is a reasonably complex task, and if we're not doing it, what are we replacing it with? Another spin on the carousel?

The problem here is that a small fraction of people are permanently solving the problems that the rest of us used to routinely solve for ourselves every day. They package the solution up in an AI and distribute it to everyone, so we no longer have to solve that problem on a daily basis. Each individual problem may seem trivial, but they all add up. As Google Assistant, Siri, and Alexa gain more skills, we stop using those same skills, and we lose them. We become dependent on the AI crutch.

We're making such progress on solving all of these tedious day-to-day problems that people have started making solutions for problems that don't even exist. Case in point: Bitcoin. Digital currencies and the blockchain are solutions looking for a problem and coming up short. They don't seem to solve any pressing problem that isn't already solved with the existing monetary system, unless you're a drug dealer or financial speculator. They do, however, consume an inordinate amount of electricity.

Instead of putting more energy and effort into solved problems or finding more ways to distract and entertain everyone, we should look at the pressing problems and exploratory frontiers staring us in the face. We could be developing renewable energy systems much more rapidly than we are. We could be pouring resources into educating the world's population and elevating it out of poverty. We could be building advanced space probes, spaceships, and telescopes to explore the universe and learn more about how it all works. There are incredibly challenging and fascinating problems all around us. Just look at the long list of problems that need solving to combat the existential threat of global warming.

New, advanced technologies can play an essential role in solving all of these problems and more. Using the vast computer power at our disposal to make progress towards solutions in these spaces would truly be using the computer as a bicycle. With all of the time we're freeing up by offloading those tedious tasks to our digital assistants, we should have plenty of time and creative energy to tackle the real, complex problems ahead of us. If we could only redirect all of the energy and attention we're spending on using computers as a carousel or a crutch, we could get back on that bike and go somewhere.