"Something is seriously out of kilter in our communications environment
when safe food products and proven technologies can be torpedoed by sensationalist, misleading, yet entertaining social media campaigns. We should all take several steps back and remember the critical thinking skills we were taught in school."
Defenders of unregulated social media, and there are plenty, counter: We were also taught democracy in school. If not for throngs of Facebook friends and everyday tweeters, U.S. Rep. Chellie Pingree, a Maine Democrat, might never have introduced a labeling bill to at least let consumers know when they're buying pink slime.
Change.org -- the petition site that gave rise to the pink-slime crusade (and also sharpened national attention on the killing of Florida teen Trayvon Martin) -- removes discriminatory causes and postings that call for violence. Website spokeswoman Megan Lubin said those cases were rare: "Most everyone is responsible when using" the open platform.
"It was the first time in history that more than 1 million comments were generated on a food petition at the FDA," said Sue McGovern, spokeswoman for the Just Label It Campaign. "The exact number was 1,149,967 ... It's those mammoth, historical numbers that Washington, D.C., is taking a look at" in the viral age.
Some contend the best way to thwart the dangers of social media is to fight fire with fire -- better technology.
The U.S. government is pushing to detect online persuasion campaigns and to develop "counter-messaging" software against "adversaries (who) may exploit social media and related technologies for disinformation," according to a Pentagon statement to The Wall Street Journal.
"It's an arms race," said disinformation sleuth Menczer of Indiana's Center for Complex Networks and Systems Research, recipient of a $2 million Defense Department grant. "We may develop better detection tools only to see political and commercial interests invest in beating these tools."
The center he directs has a website, Truthy.indiana.edu, that monitors the Twittersphere to detect how political groups take advantage of it.
The Truthy project spotted suspicious patterns in the 2010 elections. Several Twitter accounts created simultaneously -- along with Web links launched the same day -- gave the illusion of real people having conversations. In fact, they were dummy accounts automatically tweeting and re-tweeting each other.
Followers of those accounts would get the sham tweets and be directed to Web sites resembling news organizations, Menczer said. Some of the reports would accuse a campaign's opponent of backing legislation such as health reform and cap-and-trade proposals for personal gain.
Once the strategy goes viral and a topic, or "meme," is followed -- with Menczer's computers tracing common hash tags, URLs and repeated phrases -- digital images of the activity do resemble a biological virus.
But tracking this tangle of tweets, links and retweets back to the original source can be difficult, giving political campaigns deniability if confronted about the schemes.
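The detection approach described above -- grouping tweets into "memes" by shared hashtags and URLs, then looking for clusters where a handful of accounts generate most of the traffic -- can be illustrated with a small sketch. This is a hypothetical simplification for readers, not the Truthy project's actual code; the field names (`user`, `hashtags`, `urls`) and the thresholds are illustrative assumptions.

```python
from collections import defaultdict

def group_memes(tweets):
    """Group tweets into memes keyed by shared hashtags and URLs."""
    memes = defaultdict(list)
    for t in tweets:
        for tag in t.get("hashtags", []):
            memes["#" + tag].append(t)
        for url in t.get("urls", []):
            memes[url].append(t)
    return memes

def flag_suspicious(memes, min_tweets=4, max_accounts=2):
    """Flag memes where a tiny set of accounts produces heavy traffic --
    a crude proxy for dummy accounts tweeting and re-tweeting each other."""
    flagged = []
    for meme, tweet_list in memes.items():
        accounts = {t["user"] for t in tweet_list}
        if len(tweet_list) >= min_tweets and len(accounts) <= max_accounts:
            flagged.append(meme)
    return flagged

# Two dummy accounts amplifying one hashtag, plus one ordinary user.
sample = [
    {"user": "bot1", "hashtags": ["smear"], "urls": []},
    {"user": "bot2", "hashtags": ["smear"], "urls": []},
    {"user": "bot1", "hashtags": ["smear"], "urls": []},
    {"user": "bot2", "hashtags": ["smear"], "urls": []},
    {"user": "alice", "hashtags": ["news"], "urls": []},
]
print(flag_suspicious(group_memes(sample)))  # → ['#smear']
```

Real systems like Truthy add timing analysis (accounts created simultaneously, links launched the same day) and network analysis of who retweets whom, which is what makes tracing activity back to an original source so difficult.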
In the 2010 Massachusetts race for U.S. Senate, Wellesley College scientists P. Takis Metaxas and Eni Mustafaraj detected a pattern of "Twitter-bombs" against Martha Coakley, the Democratic candidate.
During the week leading up to the vote, the researchers noted a spike in Web searches that directed users to a disproportionate flurry of tweets smearing Coakley. The social-media traffic built enough for Google to tag the race a "trending topic," and Republican Scott Brown scored a surprise victory.
The race in Massachusetts "was the first election in which social media absolutely changed the conversation," said Mustafaraj, who noted the anti-Coakley tweets carried morsels of truth.
"In order for these things to spread, it can't be a complete falsehood," Mustafaraj said. "You hope that other media will pick up on the story."
In time, other research shows, a social-media falsehood finds ways to die. Tracking the tweets from the zone of an earthquake that devastated Chile in 2010, computer analyst Barbara Poblete discovered that accurate reports from victims traveled faster and farther than did the false rumors.
Melin, of Overland Park's Spiral16, notices the same: Bad information has ways of correcting itself, a phenomenon that social media defenders attribute to the collective wisdom of crowds.
"It does seem to actually work in the end," he said. "Believe it or not."