Privacy News: January 17

Algorithms of oppression (re)producing racism, pushing back on AI hype, a new approach to try to make GDPR's protections real, state privacy legislation ... and more!

When Black Death Goes Viral: How Algorithms of Oppression (Re)Produce Racism and Racial Trauma

Dr. Tiera Chantè Tanksley on SAGE Perspectives Blog

"Concerned about how seeing images of Black people dead and dying would affect young social media users, I conducted a study to understand how digitally mediated traumas were impacting Black girls’ mental and emotional wellness.  As I completed interviews with nearly 20 Black girls (ages 18-24) across the US and Canada, my initial fears were quickly confirmed: Black girls reported unprecedented levels of anxiety, depression, fear and chronic stress from encountering Black death online. The most common phrases participants used were “traumatizing,” “exhausting” and “PTSD.” Many of the girls endured mental, emotional and physiological effects, including insomnia, migraines, nausea, prolonged “numbness” and dissociation.... Mental health concerns directly affected schooling experiences, with many girls too overwhelmed, triggered or physically exhausted to fully engage in academics following high profile police killings."

Algorithms Allegedly Penalized Black Renters. The US Government Is Watching

Khari Johnson on WIRED

The Department of Justice has warned SafeRent, a provider of tenant-screening software, that its tenant-screening algorithms must comply with fair housing law. SafeRent claims it's exempt from the law because its discriminatory algorithms only advise landlords and don't make decisions, but DOJ's Kristen Clarke isn't having any of that:

“Housing providers and tenant screening companies that use algorithms and data to screen tenants are not absolved from liability when their practices disproportionately deny people of color access to fair housing opportunities.”

Stop feeding the hype and start resisting

Iris van Rooij on Iris van Rooij

As van Rooij points out, it’s almost as if academics are eager to do the PR work for OpenAI (the company that created ChatGPT).  Why?

"As Tamar Sharon, professor of Ethics and Political Philosophy and co-director of iHub at the Radboud University, notes in the Dutch newspaper NRC1: “the ideals of OpenAI are not credible”, the company is “founded by millionaires based on their ideology of Effective Altruism, EA” and while they talk about making “beneficial AI”, so far this type of tech is realised by exploiting cheap labour of underpaid workers and the push to make Large Language Models (LLMs), such as ChatGPT, larger and larger creates a “gigantic ecological footprint” with implications for “our planet that are far from beneficial for humankind”....

Please join me in resisting and start helping to curb the hype."

Will Europe’s Privacy Bill of Rights Ever Truly Be Enforced?

Julia Angwin on The Markup

As Angwin says, the GDPR is essentially a privacy bill of rights.

Amid rising commercial surveillance, it offered EU residents some essential rights over their data—including the right to see their data, to correct it, to delete it, to restrict or object to specific uses, and to not have data used about their sex, health, religion, political identity, and other sensitive topics....

But many of the rights have been nearly impossible to exercise because legal challenges based on the law have often languished at the Irish Data Protection Commission, which handles most GDPR complaints against Big Tech.

Angwin talks with Tanya O'Carroll, co-founder and former director of Amnesty Tech, who's trying to go around the Irish regulator by filing a lawsuit in a U.K. court against Meta for violating the GDPR.

Mastodon and the Fediverse

How you could build a search that the fediverse would welcome

Anil Dash

Mastodon doesn't have full-text search.  This is a hassle from a user interface perspective, and as people check out Mastodon and other compatible fediverse software as alternatives to Twitter, there have been several recent attempts to implement search ... all of which have gone down in flames after intense community pushback.  As Dash says:

[S]earch is also a signifier to those who pioneered and established the current era of the fediverse, symbolizing the extractive and exploitative hypergrowth systems that often ruined the positivity and promise of the human web.

One of the issues with all of the previous search proposals is that they're opt-out based: posts get indexed for search unless people say they shouldn't.  Dash suggests a consent-based, opt-in approach, which is much better aligned with the fediverse's values – and with trends in privacy law.

Facial recognition

Lawyers Barred by Madison Square Garden Found a Way Back In

Kashmir Hill on the New York Times

MSG Entertainment resorted to facial recognition technology to kick out legal foes, but some have undermined the ban using a law passed to protect theater critics in 1941.

Does the European Parliament use Facial Recognition Technology?

Kris Shrishak on Irish Council for Civil Liberties

ICCL has received documents indicating that the European Parliament requested CCTV cameras with facial recognition capability.

State privacy legislation

Privacy in the City that Never Sleeps: New NY Privacy Bill

Odia Kagan on

New York's S 365 is the latest version of the New York Privacy Act.  It follows GDPR's terminology (controllers, processors, and personal data), and takes a similar approach to California's when it comes to deidentified data, sale of data, and sensitive data.  It also has requirements for automated decision making.  It'll be interesting to see how far it gets this session.

South Carolina earns mid-pack scores for online privacy laws

T.A. DeFeo on The Center Square

South Carolina ranked in the middle of the pack nationwide for its laws governing online privacy.  The Palmetto State scored 5.5 out of 25 points in a new analysis from Comparitech, a website focused on cyber security and online privacy. According to Rebecca Moody, head of data research at Comparitech,

“Its score stems from data disposal laws which cover companies and government entities, legislation to protect K-12 student information, a shield law to protect journalists, and an insurance data security law. It also scores half a point thanks to its protections for genetic testing when it comes to insurance quotes and coverage.”

Virginia’s Data Privacy Law Just Went Into Effect – What You Should Know.

Melody McAnally on JD Supra

On January 1, 2023, Virginia’s Consumer Data Protection Act (“Virginia Privacy Law”) went into effect.  Here's what you need to know.

And ...

Wikipedia needs different safety rules, says foundation

Chris Vallance on BBC News

The Online Safety Bill shouldn't treat community-run sites like big tech firms, its foundation says.

US obtains exclusion of NGOs from drafting AI treaty

Luca Bertuzzi on EURACTIV

Civil society organisations have been excluded from the drafting process for the first international treaty on Artificial Intelligence, based on a request from the US to avoid countries' positions becoming public.

AI and privacy: Experts worry users may have already ‘traded a lot’ for services

Irelyne Lavery on Global News

With individual data being largely unregulated, AI companies are walking a fine line with the privacy of consumers, according to one expert.

Elizabeth MacBride on CNBC

Many individuals and businesses rely on Google and Microsoft email programs created long ago, and big tech email ‘age’ is a big cybersecurity risk.

AI art gets its first major copyright lawsuit

James Vincent on The Verge

Meta alleges surveillance firm collected data on 600,000 users via fake accounts

Sam Levin on The Guardian

The social media giant sued to prevent a surveillance company partnering with the Los Angeles police from obtaining data.

Data Is What Data Does: Regulating Use, Harm, and Risk Instead of Sensitive Data

Daniel Solove on TeachPrivacy

Sensitive data is unworkable and counterproductive, especially in an age of Big Data. It is also based on a conceptual mistake.

An Interview With the Guy Who Has All Your Data

Thomas Germain on Gizmodo

It’s 10 pm. Do you know where your data is? Chad Engelgau does. He’s the CEO of Acxiom, a data broker. Your info is probably on one of his servers.

Meta sues “scraping-for-hire” service that sells user data to law enforcement

Dan Goodin on Ars Technica

Israeli firm says it uses AI to analyze “billions of ‘human pixels’ and signals.”

The Gray Area with Sean Illing: Is ethical AI possible?

Apple Podcasts

Timnit Gebru joins Sean Illing to discuss whether ethical AI is possible.

‘SweepWizard’ Policing App May Have Exposed Personal Data on Hundreds of Officers and Thousands of Suspects

The A.V. Club on Gizmodo

A flaw in the app’s API reportedly let anyone with a specific URL view data on officers, suspects, and the operations they were engaged in.

New York Enacts First State “Right-to-Repair” Law

Jeffrey D. Neuburger on The National Law Review

At the close of 2022, New York Governor Kathy Hochul signed the "Digital Fair Repair Act" (S4101A/A7006-B) (to be codified at N.Y. GBL §399-nn) (the "Act").

In new GOP-led House, Rep. Sara Jacobs targets global peacekeeping, digital privacy and children’s issues

Deborah Sullivan Brennan on San Diego Union-Tribune

Last year, during her first term, Jacobs worked to boost military pay, housing and child care and protect digital privacy for reproductive health.

Three law firms join forces for mega privacy case against Medibank

John Davidson on Australian Financial Review

Merging a class action lawsuit with a complaint to the Australian privacy watchdog could result in billions of dollars in payouts to Medibank customers.

Gary Drenik on Forbes

In our ever-digital world, privacy is the hottest commodity. Brands want to buy it, users want to keep it for themselves, and the middleman collecting all this information doesn’t quite know what’s right.

People of colour: there’s a bias in how pictures are used to depict disease in global health publications

Esmita Charani on The Conversation

Through the choice of images in publications, women and children of colour in low- and middle-income countries were treated with less dignity and respect than those in high-income countries.

Report about govt purchase of Israeli spy technology rocks Bangladesh

Benar News

Home minister acknowledges agencies are monitoring social media to thwart “anti-government activities.”