Chris Gilliard on The Atlantic (theatlantic.com)
Surveillance isn’t just imposed on people: many of us buy into it willingly. Gilliard's latest outstanding article starts with a scenario of somebody using Amazon Halo, Amazon Echo, Amazon Ring, Amazon Neighbors ... what could possibly go wrong?
These “smart” devices all fall under the umbrella of what the digital-studies scholar David Golumbia and I call “luxury surveillance”—that is, surveillance that people pay for and whose tracking, monitoring, and quantification features are understood by the user as benefits. These gadgets are analogous to the surveillance technologies deployed in Detroit and many other cities across the country in that they are best understood as mechanisms of control: They gather data, which are then used to affect behavior. Stripped of their gloss, these devices are similar to the ankle monitors and surveillance apps such as SmartLINK that are forced on people on parole or immigrants awaiting hearings. As the author and activist James Kilgore writes, “The ankle monitor—which for almost two decades was simply an analog device that informed authorities if the wearer was at home—has now grown into a sophisticated surveillance tool via the use of GPS capacity, biometric measurements, cameras, and audio recording.”
The functions Kilgore describes mirror those offered by wearables and other trackers that many people are happy to spend hundreds of dollars on. Gadgets such as Fitbits, Apple Watches, and the Amazon Halo are pitched more and more for their ability to gather data that help you control and modulate your behavior, whether that’s tracking your steps, looking at your breathing, or analyzing the tone of your voice. The externally imposed control of the formerly incarcerated becomes the self-imposed control of the individual.
Lauren Forristal on TechCrunch (techcrunch.com)
Privacy After Roe
Aden Klein and Joanne Kim on Tech Policy Press (techpolicy.press)
Police departments already purchase personal data from third-party brokers. Now, that information may be used to criminalize abortion.
Coupled with preexisting surveillance systems, the massive amount of information that data brokers can assemble on women and people who get pregnant, as well as expecting families broadly, illustrates a problem of terrifying scale. The almost entirely unrestricted access that police departments have to commercial data and surveillance tools is equally horrifying, a subject that was discussed in detail at a House Judiciary Committee hearing this summer titled Digital Dragnets: Examining the Government’s Access to Your Personal Data....
Police use of data brokers to spy on citizens is well documented. However, following the money provides additional evidence of such practices. We analyzed public data and prior reporting on states banning or imminently banning abortion to identify companies and tools that could be used to aid in enforcing abortion laws. Even a high-level overview of public documents in St. Louis County, the State of Missouri Department of Public Safety, the City of Dallas, and the City of Houston sheds light on the access that police have to supposedly private information.
Cracking Down on Federal Aid for Reproductive Health Surveillance: The Computer Analysis Response Team
Jake Laperruque, Center for Democracy and Technology (cdt.org)
The fourth in a series of blog posts examining how the federal government and the White House — which has pledged to fight the criminalization of abortion — can prevent federal surveillance assistance to state and local law enforcement from being used to investigate and prosecute reproductive health care choices. Previous posts looked at Regional Computer Forensic Laboratories (where federal officials help state and local law enforcement to access and analyze electronic devices), the National Domestic Computer Assistance Center (a federal entity that helps local law enforcement collect communications content and other records from companies), and fusion centers (hubs across the country that intake immense amounts of data and distribute it to other law enforcement entities, with significant federal funding and access to federal law enforcement information). This post focuses on the FBI’s Computer Analysis Response Team (CART), a component of the agency designed to provide digital forensics support across the country.
Federal Privacy Legislation
Lydia X. Z. Brown on Center for Democracy and Technology (cdt.org)
Disabled people are one of the most hyper-surveilled communities within the U.S. Public and private entities alike collect enormous amounts of often deeply personal data about our lives and health, for purposes ranging from benign (such as tracking disability-targeted hiring benchmarks) to malicious (such as profiling students as likely future criminals). Algorithm-driven systems now commonly power recruiting and hiring processes, tenant background checks, public benefits applications, and even remote test proctoring, all with outsized impact on people with disabilities. Meanwhile, researchers and developers are racing to create increasingly sophisticated algorithms to detect disability and predict future diagnosis of mental health disabilities....
This year, Congress made history by advancing out of committee a bipartisan comprehensive privacy bill, the American Data Privacy and Protection Act (ADPPA).... For disabled people in particular, ADPPA offers heightened protections. ADPPA defines as “sensitive” any data that is or could be interpreted or perceived as relating to disability, or mental and physical health in general, including past, present, or future disabilities and any data that can be used to infer such information.
Cybersecurity Will Not Thrive in Darkness: A Critical Analysis of Proposed Amendments in Bill C-26 to the Telecommunications Act
Christopher Parsons on The Citizen Lab (citizenlab.ca)
This report offers 29 recommendations for the draft legislation in an effort to correct its secrecy and accountability deficiencies, while also suggesting amendments that would impose some restrictions on the range of powers the government would be able to wield.
Nilay Patel / @reckless on The Verge (theverge.com)
Signal recently killed SMS support as part of its privacy mission.
Meghan McCarty Carino on Marketplace (marketplace.org)
New research suggests the proliferation of doorbell cameras creates an environment of high surveillance that puts delivery drivers on edge.
Elizabeth Denham on International Association of Privacy Professionals (iapp.org)
David Flaherty – who studied with Alan Westin, published "Protecting Privacy in Surveillance Societies" in 1989, and served as British Columbia's first Information and Privacy Commissioner in the 1990s – died on October 11.
Jeff Kosseff on WIRED (wired.com)
A case heading to SCOTUS claims platforms should be held responsible for their algorithmic recommendations. The author of The Twenty-Six Words That Created the Internet argues that a history of the statute suggests otherwise.
Innovating Like an Optimist, Preparing Like a Pessimist: Ethical Debt and the Problem of Unanticipated Consequences for AI
Casey Fiesler on Colorado Law - University of Colorado Boulder (siliconflatirons.org)
Thursday, October 20, noon - 1:00 pm Mountain Time (11:00 am - noon Pacific).
Artificial Intelligence, or AI, is increasingly in use by both the government and the private sector, from the allocation of government benefits to the creation of smart contracts. However, poorly designed AI systems can lead to significant harms, including but not limited to discrimination. The question of how to regulate and build ethical AI is central. This lecture series will emphasize the practical applications of AI technology and ways to ensure principle-based ethics are a key focus of both development and regulation.
What is Fog Reveal? A legal scholar explains the app some police forces are using to track people without a warrant
Anne Toomey McKenna on The Conversation (theconversation.com)
Some US law enforcement agencies are using a commercial app that tracks people all day long via their phones – without a court order or oversight.
Eli Hager, photography by Stephanie Mei-Ling, special to ProPublica on ProPublica (propublica.org)
Each year, child protective services agencies inspect the homes of roughly 3.5 million children, opening refrigerators and closets without a warrant. Only about 5% of these kids are ultimately found to have been physically or sexually abused.
The A.V. Club on Gizmodo (gizmodo.com)
The Do Not Track Kids app will help you fight back against tech companies that harvest kids’ data.
Steven Gallagher on JD Supra (jdsupra.com)
On September 29, 2022, Governor Newsom signed AB 984 into law, allowing for digital license plates (i.e. “alternative devices”). Employers using these to monitor employees run significant risks.
Scott Frey, PossibleNOW on VentureBeat (venturebeat.com)
Apple’s privacy changes mean brands must switch to zero-party data, new social platforms and engagement through traditional marketing.
Tory Shepherd on The Guardian (theguardian.com)
Exposed passport numbers blocked from being used in national Document Verification System
Bernice Hillier on CBC (cbc.ca)
A third Newfoundland family is speaking up about a reported breach of privacy involving their loved one, who is a resident of long-term care at the Baie Verte Peninsula Health Centre.
Ollie Williams on Cabin Radio (cabinradio.ca)
Reported health privacy breaches in the NWT nearly tripled in the past year, for reasons ranging from Covid-19 data mishaps to improvements in reporting.
Anna Delaney on bankinfosecurity.com
In the latest “Proof of Concept,” Lisa Sotto of Hunton Andrews Kurth LLP and former CISO David Pollino join ISMG editors to discuss the first California...
Adrian Shahbaz, Allie Funk, and Kian Vesteinsson on Freedom House (freedomhouse.org)
At home and on the international stage, authoritarians are on a campaign to divide the open internet into a patchwork of repressive enclaves.
Patrick Lecomte on The Conversation (theconversation.com)
Augmented reality (AR) uses wearable tech to enhance the physical world. To develop and enhance AR experiences, companies are tracking users’ eye movements, which may be more revealing than intended.