When Facebook and Google are ‘weaponized,’ the victim is reality

When Facebook and Google are ‘weaponized,’ the victim is reality. In a society that’s fast becoming fake, this is dangerous: the truth will go unheard, because truth is bothersome, a cause of ire. The faker a society becomes, the more anger the uttered truth provokes.


Well-intentioned warnings to journalists were circulating early in the week as the news of the Las Vegas massacre broke.

The Society of Professional Journalists urged accuracy, fairness and respect.

The Poynter Institute cautioned reporters to avoid speculating on mental illness and told editors not to use images that glorify the shooter.

But the purveyors of viral lies weren’t listening to this good advice, and never will.

Facebook and Google served up disinformation on their all-powerful platforms. They promoted rumors that not only named the wrong gunman but blamed his supposedly liberal politics.

“Social media has become totally weaponized,” Kara Swisher, co-founder of the technology news site Recode, said at a conference Tuesday. The former Wall Street Journal and Washington Post reporter couldn’t be more right.

And the managing editor of the fact-checking website Snopes.com warned that “it’s getting more polarized.”

“There’s this mad scramble to paint the guy as a Democrat or a Republican, so they can cheer,” Brooke Binkowski told the Guardian. “A lot of this is pushed by trolls deliberately to muddy the conversation.”

The worst of it this week was probably the falsehood, spread on the anonymous message board 4chan, that the shooter was someone named Geary Danley, whose Facebook “likes” supposedly showed he was a Trump-hating liberal. It quickly rose to prominence on the big social media sites. The Las Vegas tragedy is only the latest example. We’re only now beginning to get a grip on social media’s role during last year’s presidential election.

CNN reported Wednesday that Russian-backed advertising on Facebook — damaging to Hillary Clinton — was targeted at her campaign’s weak spots in Michigan and Wisconsin. She lost both states by narrow margins, and they were important to Donald Trump’s electoral college triumph.

“The ads employed a series of divisive messages aimed at breaking through the clutter of campaign ads online, including promoting anti-Muslim messages,” according to the CNN story.

These platforms couldn’t be more powerful or influential. (Facebook now has 2 billion monthly users. Google’s parent company is worth more than $600 billion.) Millions get their news — or what looks like news — from these behemoths.

At a time when the truth is under assault, including from the Oval Office, it’s clear that the weaponizing trend couldn’t be more damaging.

What’s far less clear is what should be done.

This week, the platforms made their usual weak-tea excuses, accompanied by vague pledges of reform — delivered in the equally familiar robotic prose:

“Unfortunately, early this morning we were briefly surfacing an inaccurate 4chan website in our Search results for a small number of queries. Within hours, the 4chan story was algorithmically replaced by relevant results. This should not have appeared for any queries, and we’ll continue to make algorithmic improvements to prevent this from happening in the future.”

Facebook’s response had a similar mind-numbing tone.

To be fair, they’ve made a start. Under fire, Facebook has announced that it will hire 1,000 people to review ads. (This after dismissing a host of editorial employees not too long ago.) In a full-page print ad in Wednesday’s Washington Post, Facebook said it would fight election interference. And both companies have supported news literacy efforts and journalistic initiatives.

Granted, an obvious solution is elusive.

“No amount of ‘fixing’ Facebook or Google will address the underlying factors shaping the culture and information wars in which America is currently enmeshed,” wrote Danah Boyd in Wired.

Undeniably true, as is her point that it’s hard to define “fake news” in a terribly divided society. (The president’s definition, for example, is anything — true or false — that paints him in a negative light.)

Still, there is such a thing as verifiable reality. There is such a thing as valid, fact-based journalism. And while it may be hard to pin down, there is such a thing as truth.

And there’s the opposite of these things: hoaxes, conspiracy theories, flat-out lies in the form of news stories, advertising from a foreign adversary meant to sway a presidential election.

Given the platforms’ outsize role in these problems — and their immense wealth — they need to step up with some serious solutions and stop blaming their own technology.


Fast FWD to today...

'Russia weaponizing Facebook' is a tipping point for how much we rely on tech, says author

* The ease with which Russia was able to buy political ads on Facebook shows how much influence tech companies have amassed in America, says Scott Galloway.
* The notion that someone with a credit card "can pay in rubles to start advertising and sowing chaos here is probably the tipping point," he says.
* Galloway is author of the new book, "The Four: The Hidden DNA of Amazon, Apple, Facebook, and Google."

The troubling ease with which Russia was able to buy thousands of political ads on Facebook in an attempt to influence the 2016 presidential election shows just how much influence major technology companies have amassed in modern-day America, NYU business school marketing professor and author Scott Galloway told CNBC on Wednesday.

The notion that someone with a credit card "can pay in rubles to start advertising and sowing chaos here is probably the tipping point," said Galloway, author of "The Four: The Hidden DNA of Amazon, Apple, Facebook, and Google." The new book explores the sway big tech holds over consumer attention, loyalty and personal information.

Facebook on Monday said roughly 10 million people saw advertisements bought by Russian groups trying to influence the November election. The social network released those details as it turned over about 3,000 ads to House and Senate lawmakers.

Special counsel Robert Mueller and congressional committees are investigating possible links between President Donald Trump's campaign and Russia, which both sides deny.

"The most innovative use of technology in 2017 was Russia weaponizing Facebook," Galloway said in a "Squawk Box" interview while discussing his book. "The thing that's made it worse is the underreaction or half measures by Facebook, refusing to acknowledge, in my view, the important role that the fourth estate plays in our society."

Facebook has been asked to appear at public hearings by three different committees in the coming weeks to give details on the Russian ad effort. The company said there are limits to its ability to stop people from using its site to undermine democracy.

Facebook did not immediately respond to a request for comment.


What Happens When the Government Uses Facebook as a Weapon?
It’s social media in the age of “patriotic trolling” in the Philippines, where the government is waging a campaign to destroy a critic—with a little help from Facebook itself.

Rodrigo Duterte walked down the aisle of a packed auditorium at De La Salle University in downtown Manila, shaking hands and waving to nearly 2,000 college students snapping photos of the rising political star. At the front of the hall, waiting for him in a sharp red jacket, was Maria Ressa, co-founder of the Philippines’ largest online news site, Rappler.

Ressa, something of a journalistic legend in her country, had invited five candidates for the 2016 Philippine presidential election to a Rappler forum called #TheLeaderIWant. Only Duterte showed on this January afternoon. So, after the crowd stood for the national anthem, Ressa introduced the lone candidate and his running mate. “The stage is yours,” she said to applause.

For the next two hours, Duterte, under bright lights, sat in a white leather chair as Ressa lobbed questions that had been crowdsourced on Facebook, the co-sponsor of the forum. This was a peak moment for both interviewer and subject. While the event elevated Ressa and her four-year-old company, it also gave the then-mayor of Davao City, known as “the Punisher” for his brutal response to crime in the southern Philippine city, an exceptional opportunity to showcase his views. It was broadcast on 200 television and radio stations, and viewing parties on more than 40 college campuses across the Philippines tuned in as the event was livestreamed.

The Philippines is prime Facebook country—smartphones outnumber people, and 97 percent of Filipinos who are online have Facebook accounts. Ressa’s forum introduced Duterte to Filipino millennials on the platform where they live. Duterte, a quick social media study despite being 71 at the time of the election, took it from there. He hired strategists who helped him transform his modest online presence, creating an army of Facebook personalities and bloggers worldwide. His large base of followers—enthusiastic and often vicious—was sometimes called the Duterte Die-Hard Supporters, or simply DDS. No one missed the reference to another DDS: Duterte’s infamous Davao Death Squad, widely thought to have killed hundreds of people.

“At the beginning I actually loved it because I felt like this was untapped potential,” Ressa says. “Duterte’s campaign on social media was groundbreaking.”

Until it became crushing. Since being elected in May 2016, Duterte has turned Facebook into a weapon. The same Facebook personalities who fought dirty to see Duterte win were brought inside the Malacañang Palace. From there they are methodically taking down opponents, including a prominent senator and human-rights activist who became the target of vicious online attacks and was ultimately jailed on a drug charge.

And then, as Ressa began probing the government’s use of social media and writing stories critical of the new president, the force of Facebook was turned against her.


To get to the offices of Rappler—the word is a portmanteau of “rap” and “ripple”—you wind through the hilly streets of Manila, swept along by a tide of motorcycles, small cars, and vibrantly decorated jeepneys. Perched at the top of a hill is an up-and-coming neighborhood just north of the Pasig River, where Rappler sits on the third floor of a nondescript tower nestled between the Coffee Bean & Tea Leaf and a Tim Hortons. The elevator doors open to a large concrete hall, with orange (Rappler’s signature color) balloons floating on the ground in front of a glass-enclosed office. Inside, about 100 editors, reporters, videographers, and other staff churn out various types of content—breaking news, lifestyle stories, edgy video features in the style of Vice News. On the day I visited in October, a video team was editing a virtual-reality documentary about the city of Marawi, which for nearly six months had been embroiled in a war between the Philippine government and Islamic militants.

Rappler’s varied content is a reflection of Ressa, a woman who has so many ideas that she often shifts topics midsentence and will occasionally run from desk to desk for meetings. She spent almost two decades on-air with CNN, then led the news division of the largest broadcaster in the Philippines, ABS-CBN Corp. Born in Manila and raised in New Jersey, she broke major stories after the Sept. 11 terrorist attacks, connecting the masterminds of the plot to terror cells in the Philippines. She wrote two books on Southeast Asian jihadi networks, and in 2008 personally negotiated the release of three members of her news staff who’d been kidnapped by Abu Sayyaf, an al-Qaeda affiliate in the southern Philippines.

So it may be surprising to learn that the interview that launched Rappler six years ago was a Facebook video filmed in Ressa’s apartment with Alodia Gosiengfiao, a young cosplay (as in costume play) model. She was best known for dressing up as pigtailed anime characters and buxom video game heroines. With nearly 1 million Facebook followers, she helped Rappler position itself outside the old strictures of traditional news.

Rappler demonstrated its seriousness, however, by dominating the 2012 coverage of the impeachment trial of the chief justice of the supreme court. The next year the company put together a public debate forum for Senate candidates that was livestreamed on Facebook. As each candidate answered questions, audience members clicked on what Rappler called a mood meter, and a line gauging their reactions popped up on a screen next to the candidate. It was a breakout moment for Rappler, even if the candidates vowed never to participate in that setting again—they described the experience as nerve-wracking. (Ressa says that reaction partly explains why Duterte was the only candidate to accept her invitation for her presidential forum.)

Rappler was given another boost in March 2015 when it entered into a partnership with Internet.org, a free service established by Facebook Inc. aimed at giving the world’s then nearly 5 billion unconnected people access to the internet—and, of course, to Facebook. The program was meant to highlight the company’s expansive vision of itself. Facebook wasn’t just about connecting friends anymore. It was becoming a basic necessity, a powerful tool for poor and sometimes isolated people in Colombia, India, Ghana, Kenya, Tanzania, Zambia—and now the Philippines.

To advertise the global rollout of Internet.org, Facebook Chief Executive Officer Mark Zuckerberg posted a picture on his page of a young Filipino looking at a phone while sitting in the cab of a colorful motorized tricycle of the sort that is ubiquitous in the Philippines. “Here’s a photo of Jaime, a driver in Manila who uses Facebook and the internet to stay in touch,” Zuckerberg wrote. “We’re one step closer to connecting the world. ... Now everyone in the country can have free access to internet services.” Rappler would be one of the free network’s featured sites.

As the campaign for the 2016 Philippine presidential election got under way, Facebook began receiving inquiries from candidates on how they could best use the platform. In January the company flew in three employees who spent a week holding training sessions with candidates. When it was Duterte’s turn, the Facebook team gathered with the campaign inside the Peninsula Manila Hotel. The campaign staff was trained in everything from the basics of setting up a campaign page and getting it authenticated with the trademark blue check mark to how to use content to attract followers. As an example of the use of unscripted video, the Duterte campaign was shown a live Facebook video of Barack Obama preparing for his State of the Union speech in 2016. The clip garnered more views than a video of the actual address to Congress.

Armed with new knowledge, Duterte’s people constructed a social media apparatus unlike that of any other candidate in the race. The strategy relied on hundreds of volunteers organized into four groups—three in the Philippines, based on geography, and one comprising overseas Filipino workers, a crucial constituency—to distribute messages created by the campaign. Every day the campaign would tee up the messages for the following day, and the volunteers would distribute them across networks that included real and fake Facebook accounts, some with hundreds of thousands of followers.

Facebook started receiving complaints about inauthentic pages. At first it seemed harmless enough—they supported a range of candidates, and most of them appeared to originate from zealous fans. Soon, however, there were complaints about Duterte’s Facebook army circulating aggressive messages, insults, and threats of violence. Then the campaign itself began circulating false information. In March one of the campaign’s Facebook pages posted a fake endorsement by Pope Francis, with the words “Even the Pope Admires Duterte” beneath the pope’s image. The Catholic Bishops’ Conference of the Philippines posted a statement saying, “May we inform the public that this statement from the Pope IS NOT TRUE. ... We beg everyone to please stop spreading this.”

Duterte ended up dominating the political conversation so thoroughly that in April, a month before the vote, a Facebook report called him the “undisputed king of Facebook conversations.” He was the subject of 64 percent of all Philippine election-related conversations on the site.

After Duterte won, Facebook did what it does for governments all over the world—it began deepening its partnership with the new administration, offering white-glove services to help it maximize the platform’s potential and use best practices. Even as Duterte banned the independent press from covering his inauguration live from inside Rizal Ceremonial Hall, the new administration arranged for the event to be streamed on Facebook, giving Filipinos around the world insider access to pre- and post-ceremonial events as they met their new strongman.

Internet.org was just one part of a decade-long campaign of global expansion for Facebook. In countries such as the Philippines, the efforts have been so successful that the company is able to tout to its advertisers that its network is, for many people, the only version of the internet they know. Repressive governments originally treated Facebook, and all social media, with suspicion—they saw how it could serve as a locus for dissidents, as it had in the Arab Spring in 2011. But authoritarian regimes are now embracing social media, shaping the platforms into a tool to wage war against a wide range of opponents—opposition parties, human-rights activists, minority populations, journalists.

The phenomenon, sometimes referred to as “patriotic trolling,” involves the use of targeted harassment and propaganda meant to go viral and to give the impression that there is a groundswell of organic support for the government. Much of the trolling is carried out by true believers, but there is evidence that some governments, including Duterte’s, pay people to execute attacks against opponents. Trolls use all the social media platforms—including Twitter, Instagram, and YouTube, in addition to the comments sections of news sites. But in the Philippines, Facebook is dominant.

Ressa got her first direct taste of this in September 2016, a little more than three months after the election. On a Friday night, a bomb ripped through a night market in Davao City, Duterte’s hometown, killing 14 and injuring dozens more. Within hours, Duterte implemented a nationwide state of emergency. That weekend, the most-read story on Rappler was an archived item about the arrest of a man caught planting an improvised explosive device, also in Davao City. The article had been written six months earlier, and the incident had no connection to the night market bombing—but it was circulating on the same Facebook pages that promoted Duterte’s presidency, and people were commenting on it as if to justify the state of emergency.

This, and another earlier incident, became the basis of the article that altered Ressa’s relationship with her government. She titled it “Propaganda War: Weaponizing the Internet.” Within hours of publication, she and Rappler were being attacked through Facebook. She began receiving rapid-fire hate messages. “Leave our country!!!! WHORE!!!!!!” read one. The messages became increasingly violent: “I want Maria Ressa to be raped repeatedly to death.” When she later reported that she was getting as many as 90 such messages per hour, including rape threats, the tidal wave began again. The onslaught became so disturbing that Ressa sent her social media team to counseling. She installed an armed guard in front of her office. By November an #UnfollowRappler campaign led to Rappler losing 52,000 of its Facebook followers, or about 1 percent.

Manila was changing. The economy had boomed under the previous administration, but much of the wealth gains went to the top, and some Filipinos had taken to calling the capital “Imperial Manila.” Duterte, who was born in one of the nation’s poorest regions, positioned himself as a champion for regular people. He told Filipinos the nation was being ruined by drug abuse and related crime, and promised to bring to the capital the merciless strategy he had employed in Davao. Soon, Duterte’s death squads prowled the streets at night in search of drug dealers and other criminals. Images of blood-smeared bodies slumped over on sidewalks, women cradling dead husbands, and corpses in satin-lined caskets went viral. As the bodies piled up—more than 7,000 people have been killed as part of Duterte’s war on drugs—the social media war escalated.

Ressa had already watched Duterte’s supporters undo his opponents. Senator Leila de Lima, who had led an investigation into Duterte’s extrajudicial killings in Davao City, was targeted by viral Facebook articles with headlines like “Leila de Lima is an idiot” and “Leila de Lima is the patron saint of drug lords.” An #ArrestLeilaDeLima campaign began—the origins are unclear—and in February she was arrested, on drug charges that she disputes. (De Lima is listed by Amnesty International as one of the world’s “Human Rights Defenders Under Threat.”) Duterte also targeted the Philippine Daily Inquirer, one of the nation’s most prominent newspapers, in part because it maintained what it called a kill list—a record of drug war victims. In public remarks, Duterte called the owners of the newspaper “sons of bitches” who “went too far” in their “nonsense” and warned that “someday, karma will come.” In July, the family that owned the paper announced it was selling it to a wealthy businessman who is a close friend of Duterte’s.

Ressa grew more alarmed after the powerful campaign bloggers were brought even closer—in one case, into the administration itself. Mocha Uson, an actress and DDS blogger with more than 5 million followers, was named assistant communications secretary. R.J. Nieto, who runs the influential pro-Duterte site Thinking Pinoy, which has frequently taken aim at Ressa, was hired as a consultant to the Department of Foreign Affairs. (“Pinoy” is slang for Filipino.)

The Rappler data team had spent months keeping track of the Facebook accounts that were going after critics of Duterte. Now Ressa found herself following the trail of her own critics as well. She identified 26 accounts that were particularly virulent. They were all fake (one account used a photo of a young woman who was actually a Korean pop star) and all followed one another. The 26 accounts were posting nearly the exact same content, which was also appearing on faux-news sites such as Global Friends of Rody Duterte and Pinoy Viral News.

The messages being posted consistently linked back to pro-Duterte pages. Ressa and her team put all these accounts into a database, which grew rapidly as they began automating the collection of information, scraping Facebook pages and other public sites. They took to calling their database the Shark Tank. Today it contains more than 12 million accounts that have created or distributed pro-Duterte messages or fake news. Ressa isn’t sure how many of these accounts are fake.

“Either they’re negligent or they’re complicit in state-sponsored hate”

Even in the U.S., where Facebook has been hauled before Congress to explain its role in a Russian disinformation campaign designed to influence the U.S. presidential election, the company doesn’t have a clear answer for how it will stem abuse. It says it will add 10,000 workers worldwide to handle security issues, increase its use of third-party fact-checkers to identify fake news, and coordinate more closely with governments to find sources of misinformation and abuse. But the most challenging questions—such as what happens when the government itself is a bad actor and where to draw the line between free speech and a credible threat of violence—are beyond the scope of these fixes. What stays and what goes from the site is still decided subjectively, often by third-party contractors—many of them stationed, as it happens, in the Philippines, a long-standing outsourcing hub.

Facebook is inherently conflicted. It promises advertisers it will deliver interested and engaged users—and often what is interesting and engaging is salacious, aggressive, or simply false. “I don’t think you can underestimate how much of a role they play in societal discourse,” says Carly Nyst, a London-based consultant on technology and human rights who has studied patriotic trolling around the world. “This is a real moment that they have to take some responsibility. These tools they’ve promised as tools of communication and connection are being abused.”

Facebook’s executives say the company isn’t interested in being an arbiter of truth, in part because it doesn’t want to assume the role of censor or be seen as having an editorial opinion that may alienate users. Nonetheless, it’s been under increasing pressure to act. In the Philippines, it began conducting safety workshops in 2016 to educate journalists and nongovernmental organization workers. These cover the basics: an overview of the company’s community standards policies, how to block a harasser, how to report abusive content, how to spot fake accounts and other sources of misinformation. The company has increased the number of Tagalog speakers on its global Community Operations team in an effort to better root out local slurs and other abusive language.

Still, Facebook maintains that an aspect of the problem in the Philippines is simply that the country has come online fast and hasn’t yet learned the emergent rules of the internet. In October the company offered a “Think Before You Share” workshop for Filipino students, which focused on teaching them “digital literacy” skills, including critical thinking, empowerment, kindness, and empathy.

Nyst says this amounts to “suggesting that digital literacy should also encapsulate the ability to distinguish between state-sponsored harassment and fake news and genuine content.” The company, she says, “is taking the position that it is individuals who are at fault for being manipulated by the content that appears on Facebook’s platform.”

In Europe, that isn’t good enough: The U.K., Germany, and France have threatened fines and increased regulation if the company doesn’t take more steps to prevent fake news and extremist propaganda. Ten days before the French elections in April, Facebook announced it would suspend 30,000 fake accounts. Ressa wondered why the company was willing to act in France but in the Philippines said people needed to bone up on online etiquette. “We are going through much worse than any of the Western nations, and our institutions are far weaker,” she says. “It made me really realize that I needed to speak up.”

In April, Ressa met with Zuckerberg at the F8 conference in San Jose, an annual event for Facebook developers. After a keynote by Zuckerberg, Ressa joined a group of other entrepreneurs for a meeting with the Facebook founder. When it was her turn to talk, she described how critical Facebook was to Filipinos, that it was essentially the country’s most important public space. Politely, she also expressed dismay at how it had become a tool to spread what she called government propaganda. She then invited Zuckerberg to come to the Philippines.

A few days later she sent an email to a New York-based Facebook manager in charge of journalism projects saying that the issues she’d raised in earlier emails to the company’s Asia-Pacific division had not been addressed. She attached some of the underlying data from the Shark Tank and outlined the scope of the harassment she was enduring. In May she wrote again, this time to two additional U.S.-based Facebook managers. “Please take a closer look at the Philippines,” she wrote. “While you’ve taken action in Europe, the danger is far worse for us, and Facebook is the platform they use to intimidate, harass, and attack. It is dangerous. I fear where this may lead. Best, Maria.” In yet another email, she suggested the company consider changing its algorithm to take into account the difference between credible news, harassment, and government propaganda.

In a response to questions from Bloomberg Businessweek, Mia Garlick, Facebook’s director of Asia-Pacific safety programs, said, “We are committed to helping ensure that journalists around the world feel safe on Facebook as they connect their audiences with meaningful stories. We permit open and critical discussion of people who are featured in the news or have a large public audience based on their profession or chosen activities, but will remove any threats or hate speech directed at journalists, even those who are public figures, when reported to us.”

The pressure on Ressa increased in May after Rappler published a transcript of a call between Duterte and U.S. President Donald Trump, in which Duterte called the leader of North Korea a “madman.” Nieto, the government consultant, posted a video on Facebook calling Ressa a “traitor” who had made the Philippines a target of North Korea. The video got 83,000 views and drew comments like “Declare Rappler & Maria Ressa as enemies of the Filipinos” and “#ArrestMariaRessa.” In July, in his annual state of the nation address, Duterte stood at a podium before the Philippine Congress and for nearly two hours hammered against illegal drugs, corruption, and pollution. Then he began a tirade against news organizations, saying that by law they’re supposed to be entirely owned by Filipinos. That’s when he singled out Ressa’s company: “Rappler, try to pierce the identity and you will end up with American ownership,” he said.

In August the Philippine Securities and Exchange Commission established a special panel to investigate Rappler. In the complaint, a copy of which was obtained by Bloomberg Businessweek, the panel ordered Rappler to produce evidence that the company wasn’t in violation of a constitutional provision limiting ownership of media companies to Philippine citizens. Rappler has overseas investors, including North Base Media Ltd., a Cayman Islands-based venture capital firm with investors from around the world, and Omidyar Network, a venture capital firm started by EBay Inc. founder Pierre Omidyar. Rappler’s response to the special panel was that the investments were made legally through a common financial instrument called a Philippine Depositary Receipt, which, unlike traditional shares, does not confer ownership or control. It isn’t clear whether this explanation will suffice. The SEC has broad powers to refer cases to the Philippines’ Department of Justice for criminal charges.

Rappler was born on Facebook and lives there still—it’s the predominant source of Rappler’s traffic. So Ressa finds herself in an awkward spot. She has avoided rocking the boat, because she worries that one of the most powerful companies in the world could essentially crush her. What if Facebook tweaked the algorithm for the Rappler page, causing traffic to plummet? What if it selectively removed monetization features critical to the site’s success? “There’s absolutely no way we can tell what they’re doing, and they certainly do not like being criticized,” she says. But after more than a year of polite dialogue with Facebook, she grew impatient and frustrated.

In a trip to Washington in early November, she met with several lawmakers, telling them that she believes Facebook is being used by autocrats and repressive regimes to manipulate public opinion and that the platform has become a tool for online hooliganism. She did the same in a speech at a dinner hosted by the National Democratic Institute, where Rappler was presented with an award for “being on the front lines of fighting the global challenge of disinformation and false news.”

As she accepted her award, Ressa recalled that she started as a journalist in the Philippines in 1986, the year of the People Power Revolution, an uprising that ultimately led to the departure of Ferdinand Marcos and the move from authoritarian rule to democracy. Now she’s worried that the pendulum is swinging back and that Facebook is hastening the trend. “They haven’t done anything to deal with the fundamental problem, which is they’re allowing lies to be treated the same way as truth and spreading it,” she says. “Either they’re negligent or they’re complicit in state-sponsored hate.”

In November, Facebook announced a new partnership with the Duterte government. As part of its efforts to lay undersea cables around the world, Facebook agreed to team up with the government to work on completing a stretch bypassing the notoriously challenging Luzon Strait, where submarine cables in the past have been damaged by typhoons and earthquakes. Facebook will fund the underwater links to the Philippines and provide a set amount of bandwidth to the government. The government will build cable landing stations and other necessary infrastructure.

That’s the sort of big project Facebook embraces. It’s also testing a solar-powered drone that will beam the internet to sub-Saharan Africa and has a team of engineers working on a brain implant to allow users to type with their minds. To Ressa, Facebook looks like a company that will take on anything, except protecting people like her.

I continue to believe that elevating the judgment of experienced, intelligent news experts — editors, by any other name — must be a central part of the answer.

The tech giants need to fully grasp what’s happening here, and devote their attention and plentiful resources to addressing it. And soon — before their weaponized platforms kill what’s left of our democratic society.

An open letter to a tech giant...

We work for Google. Our employer shouldn't be in the business of war


In this open letter to Google’s CEO, over 3,000 employees urged the company not to work on a Pentagon ‘AI surveillance engine’ used for drone warfare


Open letter signed by Google employees



Dear Sundar,


We believe that Google should not be in the business of war. Therefore we ask that Project Maven be cancelled and that Google draft, publicize and enforce a clear policy stating that neither Google nor its contractors will ever build warfare technology.


Google is implementing Project Maven, a customized AI surveillance engine that uses “wide area motion imagery” data captured by US government drones to detect vehicles and other objects, track their motions and provide results to the Department of Defense.


Recently, Googlers voiced concerns about Maven internally. Diane Greene responded, assuring them that the technology will not “operate or fly drones” and “will not be used to launch weapons”. While this eliminates a narrow set of direct applications, the technology is being built for the military, and once it’s delivered it could easily be used to assist in these tasks. This plan will irreparably damage Google’s brand and its ability to compete for talent.




Amid growing fears of biased and weaponized AI, Google is already struggling to keep the public’s trust. By entering into this contract, Google will join the ranks of companies like Palantir, Raytheon and General Dynamics. The argument that other firms, like Microsoft and Amazon, are also participating doesn’t make this any less risky for Google. Google’s unique history, its motto “don’t be evil”, and its direct reach into the lives of billions of users set it apart.


We cannot outsource the moral responsibility of our technologies to third parties. Google’s stated values make this clear: every one of our users is trusting us. Never jeopardize that. Ever. This contract puts Google’s reputation at risk and stands in direct opposition to our core values. Building this technology to assist the US government in military surveillance – and potentially lethal outcomes – is not acceptable.


Recognizing Google’s moral and ethical responsibility, and the threat to Google’s reputation, we request that you: 

1. Cancel this project immediately. 

2. Draft, publicize and enforce a clear policy stating that neither Google nor its contractors will ever build warfare technology.




