The most recent criminal case involving artificial intelligence emerged last week from a Maryland high school, where police say a principal was framed as racist by a fake recording of his voice.
The case is yet another reason why everyone — not just politicians and celebrities — should be concerned about this increasingly powerful deep-fake technology, experts say.
“Everybody is vulnerable to attack, and anyone can do the attacking,” said Hany Farid, a professor at the University of California, Berkeley, who focuses on digital forensics and misinformation.
Here’s what to know about some of the latest uses of AI to cause harm:
Manipulating recorded sounds and images isn’t new. But the ease with which someone can alter information is a recent phenomenon, as is its ability to spread quickly on social media.
The fake audio clip that impersonated the principal is an example of a subset of artificial intelligence known as generative AI, which can create hyper-realistic new images, videos and audio clips. The technology has become cheaper and easier to use in recent years, lowering the barrier for anyone with an internet connection.
Deepfake of principal’s voice is the latest case of AI being used for harm. Copyright © 2024 Earth Essence news portal.