LAPD partnered with tech firm that enables secretive online spying

The Los Angeles police department pursued a contract with a controversial technology company whose software could enable police to use fake social media accounts to surveil civilians, and which claims its algorithms can identify people who may commit crimes in the future.

A cache of internal LAPD documents obtained through public records requests by the Brennan Center for Justice, a non-profit organization, and shared with the Guardian, reveals that LAPD in 2019 trialed social media surveillance software from the analytics company Voyager Labs.

Like many companies in this industry, Voyager Labs’ software allows law enforcement to collect and analyze large troves of social media data to investigate crimes or monitor potential threats.

But documents reveal the company takes this surveillance a step further. In its sales pitch to LAPD about a potential long-term contract, Voyager said its software could collect data on a suspect’s online network and surveil the accounts of thousands of the suspect’s “friends”. It said its artificial intelligence could discern people’s motives and

— source theguardian.com | Sam Levin, Johana Bhuiyan | 17 Nov 2021

Nullius in verba


How Facebook Programmed Our Relatives

Three years ago, on his birthday, a law professor watched his e-mail inbox fill with Facebook notifications indicating that friends had posted messages on his wall. The messages made him sad. The clogged inbox was annoying, but what really upset him was having disclosed his birth date to Facebook in the first place. It’s not necessary for social networking or to comply with privacy laws, as some people mistakenly believe. He hadn’t paid much attention when he signed up—as with most electronic contracts, there was no room for negotiation or deliberation about terms. He complied with Facebook’s instructions, entered the data and clicked a button.

A few days later, the law professor decided to change the birth date on his Facebook profile to avoid the same situation next year. But when the fake date rolled around, his inbox again flooded with Facebook notifications. Two of the messages were from close relatives, one of whom he had spoken with on the phone on his actual birthday!

How could she not realize that the date was fake?

Our hypothesis: she’d been programmed!

That law professor was one of us (Brett Frischmann), and the episode confirmed his suspicion that most people respond automatically to Facebook’s prompts to provide information or contact a friend without really thinking much about it. That’s because digital networked technologies are engineering humans to behave like simple stimulus-response machines.

— source blogs.scientificamerican.com | Brett Frischmann | Jun 21, 2018

Social media makes it difficult to identify real news

The study found that people viewing a blend of news and entertainment on a social media site tended to pay less attention to the source of content they consumed — meaning they could easily mistake satire or fiction for real news. People who viewed content that was clearly separated into categories — such as current affairs and entertainment — didn’t have the same issues evaluating the source and credibility of content they read.

The findings show the dangers of people getting their news from social media sites like Facebook or Twitter. We are drawn to these social media sites because they are one-stop shops for media content, updates from friends and family, and memes or cat pictures. But that jumbling of content makes everything seem the same to us. It makes it harder for us to distinguish what we need to take seriously from that which is only entertainment.

The study appears online in the journal New Media & Society.

The results showed that when the content was not grouped by distinct topics — in other words, news posts appeared on the same page with entertainment posts — participants

— source Ohio State University | Mar 30, 2020

Nullius in verba


How Facebook let fake engagement distort global politics

Shortly before Sophie Zhang lost access to Facebook’s systems, she published one final message on the company’s internal forum, a farewell tradition at Facebook known as a “badge post”.

“Officially, I’m a low-level [data scientist] who’s being fired today for poor performance,” the post began. “In practice, in the 2.5 years I’ve spent at Facebook, I’ve … found multiple blatant attempts by foreign national governments to abuse our platform on vast scales to mislead their own citizenry, and caused international news on multiple occasions.”

Over the course of 7,800 scathing words, Zhang outlined Facebook’s failure to combat political manipulation campaigns akin to what Russia had done in the 2016 US election. “We simply didn’t care enough to stop them,” she wrote. “I know that I have blood on my hands by now.”

Zhang knew that this was not a tale that Facebook wanted her to tell, so when she hit publish, she also launched a password-protected website with a copy of the memo and provided the link and password to Facebook employees. Not only did Facebook temporarily delete the post internally, but the company also contacted Zhang’s hosting service and domain registrar and forced her website offline.

Now, with the US election over and a new president inaugurated, Zhang is coming forward to tell the whole story on the record. (Excerpts of her memo were first published in

— source theguardian.com | Julia Carrie Wong | 20 Apr 2021

Nullius in verba


Facebook planned to remove fake accounts in India

Facebook allowed a network of fake accounts to artificially inflate the popularity of an MP from India’s ruling Bharatiya Janata party (BJP) for months after being alerted to the problem.

The company was preparing to remove the fake accounts but paused when it found evidence that the politician was probably directly involved in the network, internal documents seen by the Guardian show.

The company’s decision not to take timely action against the network, which it had already determined violated its policies, is just the latest example of Facebook holding the powerful to lower standards than it does regular users.

“It’s not fair to have one justice system for the rich and important and one for everyone else, but that’s essentially the route that Facebook has carved out,” said Sophie Zhang, a former data scientist for Facebook who uncovered the inauthentic network. Zhang has come forward to expose the company’s failure to address how its platform is being used to manipulate political discourse around the world.

Facebook’s failure to act against the MP will also raise questions about Facebook’s relationship with the Hindu nationalist party. Facebook has repeatedly treated rule violations

— source theguardian.com | Julia Carrie Wong, Hannah Ellis-Petersen | 15 Apr 2021

Nullius in verba


BJP Gets Cheaper Ad Rates on Facebook Due to its Polarising Content

Facebook’s algorithm grossly favours Bharatiya Janata Party’s (BJP) advertisements because of the divisive nature of the content, as revealed by part four of a year-long investigation conducted by Kumar Sambhav of The Reporters’ Collective (TRC) and Nayantara Ranganathan of ad.watch, a research project that studies political ads on social media.

The first three parts of the investigation focused on how Facebook helps the BJP in the digital space by promoting advertisements from its surrogates. The first part found that Facebook carried 718 surrogate political ads from the Reliance Jio-funded New Emerging World of Journalism Limited promoting the BJP and denigrating its rivals; the ads cost Rs 52,00,000 and were viewed more than 290 million times over a 22-month period.

The second part of the investigation discovered that at least 23 ghost and surrogate advertisers paid more than Rs 5.8 crore to Facebook to run 34,884 ads either to promote

— source newsclick.in | 17 Mar 2022

Nullius in verba


Facebook ‘Charged BJP Less’ Than Rivals

In a huge and unfair advantage to the Narendra Modi-led Bharatiya Janata Party (BJP) government, Mark Zuckerberg’s Facebook promoted the party’s advertisements at prices 29% lower than those paid by its arch-rival Congress during 10 elections between February 2019 and November 2020 (22 months), allowing it to reach a wider audience than opposition parties.

Part three of a year-long investigation conducted by Kumar Sambhav of The Reporters’ Collective (TRC) and Nayantara Ranganathan of ad.watch, a research project that studies political ads on social media, and published by Al Jazeera has revealed that Facebook charged the BJP, its candidates and affiliated organisations an average of only Rs 41,844 to show an ad one million times.

On the other hand, the report discovered that the social media giant charged the Congress, its candidates and affiliated organisations an average of Rs 53,776 to show an ad the same number of times.

— source newsclick.in | 16 Mar 2022

Nullius in verba