The zombie apocalypse started on June 29, 2007. If you don’t believe me, here’s the video from the ABC News archives:


The video is coverage of the release of the first Apple iPhone. By 2012, smartphones, both the Apple variety and Android, were nearly ubiquitous. Today, seeing someone with a flip phone is akin to seeing an old Volkswagen Beetle on the road. It’s quaint.

Also in 2007, Facebook launched its mobile platform and LinkedIn was mobile by 2008. Both Facebook and LinkedIn had over one million users prior to launching their mobile apps. Twitter was launched in 2006, but did not take off until 2007. Instagram launched as a mobile app for the iPhone in October 2010 and reached a million users before the end of the year. Pinterest and Snapchat both launched mobile versions in 2011.  

Ten years later, several of the creators of these phones and apps have had a crisis of conscience over the business models behind their products. Each of these models banks on user information by vying for user attention: the longer you are on your phone and interacting with apps, the more information you provide to companies, which can then be sold to advertisers.

To keep users engaged for as long as possible, smartphone operating systems and apps use the same psychological hacks the gambling industry uses to get people hooked. Everything from color choices, to when alerts occur and how they sound, to whether you swipe up to refresh or left to ignore is carefully crafted to ensure that you, the user, spend as much of your life as possible on the phone.

Time well spent and the Center for Humane Technology

Now these designers, engineers, and strategists have realized that they have created a monster. Several former employees of Google and Facebook, as well as other tech companies, have started a non-profit called the Center for Humane Technology. According to the Center’s website the problem is that “[o]ur society is being hijacked by technology. What began as a race to monetize our attention is now eroding the pillars of our society: mental health, democracy, social relationships, and our children.”

One of the co-founders of the Center is Tristan Harris, a former design ethicist at Google. I wrote about him here in 2016, when his non-profit was called Time Well Spent, a phrase Mark Zuckerberg recently used to describe how Facebook will refocus the content of its news feeds.

Harris became an advocate for humane technology after learning how social media apps and smartphone designers use psychological tools to hack people’s brains.

For example, the fact that Facebook’s notifications are red is no accident; red tends to prompt people to respond. You know how notifications can ping your phone at any moment? That isn’t an accident either. It is based on the idea of intermittent variable rewards, which give the user a little dopamine rush whenever a notification happens to arrive. That dopamine rush makes you respond to the app in an almost Pavlovian way. Slot machines work on the same principle.

It would be more user-friendly, in the sense of regarding the user’s well-being, for all notifications to download at one time rather than sporadically. But these little hacks are important for prompting people to open the app frequently. Then tricks, like a bottomless newsfeed, help keep people there longer. The longer the app holds your attention, the more advertisements you see. The more you interact with the app, the more information its algorithms glean from your actions. This is why Facebook is “free and always will be”: social media apps make a substantial amount of revenue from selling user information.

Keeping your kids from getting high

Justin Rosenstein, one of the Center’s advisors and co-inventor of Facebook’s “like” button, said in an interview with The Guardian that Facebook likes are “bright dings of pseudo-pleasure.” After his crisis of conscience, he removed all apps from his phone and had his assistant set up parental controls to prevent him from downloading any new ones. He is one of several former Silicon Valley engineers who do not use the apps they helped design, and he sends his children to a prestigious low-tech private school that emphasizes hands-on learning.

Most Silicon Valley elites protect their children from too much screen time to a degree that few of us do.

One article described Steve Jobs’s and Bill Gates’s level of concern over their children’s screen time as borderline paranoid. Jobs touted the iPad as so easy to use that a toddler could master it, but he did not allow his own children to have one. Bill and Melinda Gates did not allow their children to have a cell phone before they were fourteen years old, and they had a standing rule of no technology at the dinner table. The Gateses sent their children to a school that emphasizes hands-on learning, using paper and pencil and very little technology. Neither Jobs nor Gates allowed their children to use screens before bedtime.

James Williams is a former Google strategist. He will not give his children an iPad or any other digital reading device but promises his kids as many physical books as they like. In a Guardian article, he recounts his discomfort over seeing a dashboard in Google’s headquarters that showed the amount of attention time Google had “commandeered for advertisers.” Williams said, “I realised: this is literally a million people that we’ve sort of nudged or persuaded to do this thing that they weren’t going to otherwise do.”

Take back your technology and your time

The Center for Humane Technology is not an anti-technology group. It is interested in technology that helps human flourishing rather than robbing people of time and attention in the name of profits. According to its website, its mission is “Reversing the digital attention crisis and realigning technology with humanity’s best interests.”

They accomplish their mission by both informing people of the ways these technologies were designed to be addictive and by providing practical ways to make your technology more user-friendly.

I’ve tried a few of these suggestions and found them very helpful. For example, RescueTime helped me see that I spent more time than I thought reading non-research-related articles and news on the internet. It also gave me a realistic picture of how many hours I really spent with work-related programs, such as Word or Excel, as my active window.

I have been more diligent about not looking at any screens, including television, for at least 30 minutes before going to sleep, which has been helpful, but all of us have had times when we have to burn the midnight oil. Two of their suggestions, turning on Night Shift and installing f.lux, remove the blue light from your phone and computer screens after a certain time in the evening. This is a simple way to protect your sleep even when you have to stay up late working.

Tech tools and digital treats

Ultimately, though, we need to break our unhealthy attachment to our technology completely. Technology is a tool that can help us in many ways, but there is a difference between using technology as a tool and passively consuming it. I’m writing this article on a word processor, which helps me communicate clearly. When I worked in a chemistry lab, Excel and ChemDraw were my favorite programs because they allowed me to organize and analyze my data easily, in ways that were more accurate than hand-written records.

But using my computer to create this article or to analyze data is different from using it to scroll through a bottomless social media feed or checking Facebook for the third time to see if anyone liked my post. In the first sense, I’m using my computer to create what I first conceptualized in my mind. In the second sense, I’m looking for something that makes me feel good. It’s the difference between using a tool and looking for a treat.

While the means may change with every cultural fashion, human nature stays the same. We are quite adept at turning something good into something that can be used to exploit others. In the 2007 ABC video the interviewer asks author Steve Levy if Apple is a religion. Levy replies, “To some people, yes. That’s why some people call it the Jesus phone.”

But Apple, or Steve Jobs, is not the only one responsible for the zombie apocalypse. There are also the social media companies whose apps exploit the user’s vulnerabilities in order to nudge their behavior. And there’s us.

Levy pointed out that “the biggest risk is inflated expectations. In a sense, it’s got to disappoint a lot of people because it’s not going to cure your pimples.”

While tongue-in-cheek, the point is that we expect technology to solve more than just our technological problems. Rather than a religion, perhaps the tech industry is more like a cult: it uses manipulation tactics to keep you there and to shape how you think about certain things. And the cult members joined because they were searching for something.

Heather Zeiger is a freelance science writer with advanced degrees in chemistry and bioethics. She writes on the intersection of science, culture, and technology.