2024-04-08
I'm Nicholas Kristof.
I’m a columnist for “The New York Times.”
We’ve been hearing a lot lately about the potential dangers of AI.
That includes deepfakes.
Before the primary this year, voters in New Hampshire got a robocall that sounded a lot like President Biden.
Voting this Tuesday only enables the Republicans in their quest to elect Donald Trump again.
There's also a fake video of Taylor Swift endorsing Donald Trump.
This video is obviously fake, and it's been manipulated.
If Taylor Swift ever endorsed Donald Trump at the Grammy Awards,
I think this would have been all over the news by now.
But I think there's a bigger problem that we hear much less about,
and that is deepfake nude photos and videos.
Deepfakes are AI-generated images that often use real people's faces and identities.
One study found that 98 percent of deepfake videos online are pornographic.
And of those, 99 percent of the time, those targeted are women and girls.
I first got a sense of the scale of this problem from an activist who was trying to fight it.
My name is Breeze Liu, and I am a survivor of online image abuse.
Her story begins back in April 2020 when she was just 24 years old.
I got a message from a friend of mine.