Spying on journalists? Just say no
How existential risk became the biggest meme in AI
It’s rare to see an article about AI these days that doesn’t include the words “existential threat.” While some dismiss this language as harmless hyperbole, it’s a dangerous distraction from the real risks of AI technologies. So why are some tech company leaders engaging in this kind of fear-mongering? It may be part of a strategy to gain power, argues Signal’s Meredith Whittaker: “Ghost stories are contagious [...] It’s a significant thing to cast yourself as the creator of an entity that could be more powerful than human beings.” Read more via MIT Technology Review
Exclusive: OpenAI lobbied the E.U. to water down AI regulation
A week after the European Parliament voted to approve the E.U. AI Act, TIME broke the news that while OpenAI – the maker of ChatGPT – was publicly calling for stronger AI regulations, it was also working behind the scenes to weaken the proposed legislation. “It’s very confusing because they’re talking to politicians saying, ‘Please regulate us,’ they’re boasting about all the [safety] stuff that they do, but as soon as you say, ‘Well, let’s take you at your word and set that as a regulatory floor,’ they say no,” says Access Now’s Daniel Leufer. Read more via TIME
In Jordan, poverty-targeting algorithms harm rights
Social security nets are supposed to protect people from poverty. But what happens when an algorithm decides who gets help? A new Human Rights Watch report reveals that Jordan’s automated cash transfer system, which was developed with support from the World Bank, uses flawed and discriminatory socio-economic indicators to rank people's worthiness for assistance — leading to denial of funds even as people "go hungry, fall behind on rent, and take on crippling debt." Read more via Human Rights Watch
Humans are biased. Generative AI is even worse
A Bloomberg investigation analyzed 5,000 images generated by Stable Diffusion, revealing that the model produces images of lighter-skinned people when prompted with terms such as “CEO,” “lawyer,” or “architect,” while the terms “housekeeper” or “janitor” consistently produce images of darker-skinned subjects — depicting women three times as often as men. Read more via Bloomberg
Watch: How to address the risks of generative AI
A new report from Forbrukerrådet, the Norwegian Consumer Council, examines the real-world risks of generative AI in detail, offering policy recommendations to ensure that its development and use are "safe, reliable, and fair." Watch the launch webinar for an overview of the report and analysis from a panel of experts, including Access Now's Daniel Leufer. Watch Now via Forbrukerrådet
When no internet = no food, no pay, no vote
Report: “No internet means no work, no pay, no food”
Every time Indian authorities order an internet shutdown, ordinary people pay the price. In a new report, Human Rights Watch and the Internet Freedom Foundation document just how devastating deliberate network disruptions can be. The Indian government’s increasing push to digitize almost all services – from applying for food rations to recording work attendance for salary payments – means that when the internet is cut off, so is access to those services, leaving people without information, vital government support, and even food. Read more via Human Rights Watch
Opportunities and other highlights
Help us improve STOP, the Shutdown Tracker Optimization Project
As the #KeepItOn campaign enters its eighth year, Access Now is committed to improving how we document, verify, and report on internet shutdowns around the world. If you’d like to help, please fill out this quick survey. We thank you for your time and valuable feedback. Read more via Access Now
AI to content moderation: Five highlights from RightsCon
What were the top takeaways from this year’s RightsCon? While the Access Now team continues to reflect on the summit, its outcomes, and next steps, Context offers a useful overview of some of the most-discussed themes. If you want to dive deeper into a specific subject, check out the sessions we’re sharing on our YouTube channel. Read more via Context