News

Malicious exploitation of AI tools: the scenario is always the same. Individuals retrieve photos of women from their public accounts, often influencers, activists, journalists, or content creators ...
Heidy Khlaaf, profiled as part of TechCrunch's series on women in AI, is an engineering director at the cybersecurity firm Trail of Bits.
Artificial intelligence technology that has been used to create fake, nonconsensual nude photos of women is now being used to cover women up so their clothing appears less revealing. Conservative ...
Elon Musk's Grok AI has been found responding to requests for explicit images of women on X. As flagged by Kolina Koltai, a researcher at Bellingcat, X users have been asking Grok AI to undress ...
There are seven common photo types these AI tools rely on. Avoid them, and you make the job a hell of a lot harder for perverts.
Women are 1.5 times more likely than men to need to change jobs by 2030 because of AI-driven workplace automation, according to a study by McKinsey & Co.
Last summer, Tinder announced Photo Selector, an AI tool to help pick out the best pictures for your dating app profile. But privacy experts warn that there may be risks associated with the tool ...
San Francisco officials filed a landmark lawsuit against popular deepfake websites that use artificial intelligence to “undress” images of clothed women and girls.
Hairstylists explain why AI-generated images are problematic. One hairstylist remarked, “I’ve been getting so many AI ‘grey blending’ pics.”
Several of the companies that made the commitment in partnership with Thorn have faced scandals related to child sex abuse material and AI.
The Gemini AI app can now edit your photos with simple text prompts. Even better, Google is working on placing actual watermarks on them.
Women in Congress are 70 times more likely than their male counterparts to be victims of AI-generated deepfakes. A new report found that 1 in 6 active congresswomen have had nonconsensual imagery made of them.