Recently, we’ve looked at a number of ways the Internet can help or harm our lives. The major harm is that in a virtual world, fakery succeeds much better than in an actual world. Our lives could be guided by fake news and dominated by fake friends. We could be bullied, harmed, or shamed by myriad people to whom we have no connection at all: people who are not good for us and who need never have mattered to us.
While teens are often more savvy than older folk about the technology itself, they may not have the life experience needed to interpret the online frenzy. For one thing, natural cues do not always work because we are not getting natural signals. One outcome for teens and young adults can be reduced social and academic skills.
That said, I am not a fan of lecturing teens about moral issues as such. They usually have a hard time understanding how a single wrong choice can entail big, possibly long-term consequences. So they are turned off by warnings about things they cannot imagine.
And, let’s face it, many of us seniors learned the hard way, not the best way.
However, I am hoping readers will offer input on one approach: Help young people see how much of the online world is believable flimflam. It’s not just click farms in poor countries manufacturing cool fake friends or sore losers posting malicious edits to one’s Google information.
It could also be scamsters posting fake bad online reviews to ratings sites. Communications prof Joseph Reagle explains,
In early 2015, the Belfast Telegraph sent reporter Kim Kelly undercover to visit Northern Ireland’s “worst” hotel — according to its online reputation. Kelly reported that although some TripAdvisor reviews had called it a “hell hole” and “dustbin,” she was pleasantly surprised with the “clean and compact” rooms.
Because of the overwhelming popularity of reviews sites, he says,
Beyond consumers, merchants, and review platforms, there’s another actor keen on benefiting from online reviews: illicit manipulators. From overseas “sweatshops” (that earn pennies per post) to the “boutique” reputation services for the rich, there is a vast market for online deceit. By finding patterns in posts (such as the ratio of positive to negative words) and activity (such as a negative review quickly followed by a positive one), studies estimate that 10%–30% of reviews are fake. More.
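The patterns Reagle mentions can be made concrete with a toy sketch. This is purely an illustration, not the method any of those studies actually used: it assumes made-up word lists and thresholds, and flags the classic countermeasure pattern of a gushing positive review posted soon after a negative one.

```python
from datetime import datetime, timedelta

# Toy word lists -- real studies use far larger sentiment lexicons.
POSITIVE = {"great", "amazing", "clean", "friendly", "perfect", "best"}
NEGATIVE = {"dirty", "awful", "rude", "worst", "broken", "hell"}

def sentiment_ratio(text):
    """Ratio of positive to negative words (higher = more gushing)."""
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return pos / (neg + 1)  # +1 avoids division by zero

def flag_suspicious(reviews, window_hours=24, gush_threshold=3.0):
    """Flag reviews matching one damage-control pattern: an over-the-top
    positive review posted shortly after a negative one."""
    flagged = []
    reviews = sorted(reviews, key=lambda r: r["time"])
    for prev, cur in zip(reviews, reviews[1:]):
        soon = cur["time"] - prev["time"] <= timedelta(hours=window_hours)
        if (prev["rating"] <= 2 and cur["rating"] >= 4 and soon
                and sentiment_ratio(cur["text"]) >= gush_threshold):
            flagged.append(cur)
    return flagged

reviews = [
    {"time": datetime(2015, 3, 1, 9), "rating": 1,
     "text": "dirty rooms rude staff worst hotel"},
    {"time": datetime(2015, 3, 1, 15), "rating": 5,
     "text": "amazing perfect clean friendly best stay ever"},
]
print(len(flag_suspicious(reviews)))  # the gushing follow-up gets flagged
```

Real detectors combine many such signals statistically; the point here is only that the tells are mechanical enough for software to spot.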
Do we need these misleading and potentially dangerous hijinks if we are backpacking through Europe? We could end up in the actual worst hotel, get sick, and have to cut our hike of a lifetime short.
And then there’s reputation management. Recently, we looked briefly at Jon Ronson, that British writer who discovered that some academics had created a distasteful digital clone of him (an “infomorph”). He wrote a book about his and others’ experiences, So You’ve Been Publicly Shamed. There he details the case of Lindsey Stone, a Cape Cod woman who worked at a non-profit that helped learning disabled adults with independent living.
Now, Stone and a friend had a running gag where they posted to their Facebook pages pictures of themselves disobeying signs. Smoking in front of a No Smoking sign was their usual style.
A Facebook group called “Fire Lindsey Stone” had 10,000 likes in just a couple of days. Her address and phone number ended up online. Her phone rang off the hook. News vans showed up at her house. Within days of the photo’s arrival online, she was fired from her job. Lindsey stopped leaving her house, but she was unable to stop checking up on the people who hated her.
Ronson, who needed her story for his book, offered the now-reclusive Stone a deal she could hardly refuse: Big budget reputation management.
How does that work? Ronson had found out quite by accident, when a contact’s former classmate was arrested for tax evasion:
A while later, Graeme started getting new, bizarre Google Alerts about Phineas. An article about Phineas being appointed the “Head Finance Curator of Venture Cap Monthly,” and being voted philanthropist of the month by “Charity News Forum.” But when he went to these sites, he found that they barely looked legitimate. Phineas Upham had engaged a reputation manager.
The interlinked pages, which rank highest on Google, show Upham doing reputable things, and they push the other stuff down to page 2. Says Digg,
According to a study by the online ad network Chitika, 33% of Googlers only click on the first link. And it just gets worse from there. Almost no one continues on to the second page.
And now, thanks to a big online reputation management firm,
The “Fire Lindsey Stone” Facebook page is now nowhere to be found, but another Facebook page, “Hire Lindsey Stone” is the second result for a search on Lindsey Stone. And then, of course, there are the pages that Reputation.com made. Pages like lindseymstone.com which is a seemingly endless, almost pictureless website dedicated to “promot[ing] awareness of autism spectrum disorders and support the research and outreach efforts of domestic and international organizations.” More.
Few of us would wish to be vindictive toward the newly sobered Lindsey, who got another job helping autistic children. But there is this: Less scrupulous firms than Reputation.com (the one Ronson contacted) scrub the reputations of far less savoury characters, using these and similar methods. So we really don’t know who we are dealing with online.
Now, my question for readers: Do teens learn this stuff in school? Should they? Would it make a difference if they knew?
Here’s a Molly Maid for routine online reputation issues:
Denyse O’Leary is a Canadian journalist, author, and blogger.