MIT researchers develop tool to battle deepfakes and AI manipulation; Know how PhotoGuard works
Deepfakes have emerged as a major talking point this year as a malicious side effect of artificial intelligence (AI). Many bad actors have exploited the current boom in the space, using AI editing tools to create fake images of people and institutions. Multiple reports have described criminals creating fake nude images of people and then threatening to post them online unless the victims paid money. But now, a group of researchers at the Massachusetts Institute of Technology (MIT) has come up with a tool that can help combat this problem.

According to a report by MIT Technology Review, the researchers have created a tool called PhotoGuard that alters images to protect them from being manipulated by AI systems. Hadi Salman, a contributor to the research and a PhD researcher...
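The core idea behind this kind of protection is to add tiny, effectively invisible perturbations to an image's pixels so that an AI editing model misreads the picture and produces unusable results when it tries to manipulate it. As a rough illustration only, the sketch below shows what such an "immunization" step could look like as a projected-gradient-descent loop in PyTorch; the `encoder` callable, the target latent, and all hyperparameters here are illustrative assumptions, not the MIT team's actual implementation.

```python
# Hypothetical sketch of an encoder-targeting "immunization" step.
# `encoder` stands in for a real image encoder (e.g., the encoder of a
# generative editing model); names, shapes and hyperparameters are
# assumptions for illustration only.
import torch

def immunize(image: torch.Tensor,
             encoder,                      # callable: image tensor -> latent tensor (assumed)
             target_latent: torch.Tensor,  # latent to steer toward, e.g. a plain gray image's latent
             eps: float = 8 / 255,         # max per-pixel change, keeps the edit imperceptible
             step_size: float = 1 / 255,
             steps: int = 100) -> torch.Tensor:
    """Return a copy of `image` with a small perturbation that pushes its
    latent representation toward `target_latent` via projected gradient descent."""
    delta = torch.zeros_like(image, requires_grad=True)
    for _ in range(steps):
        latent = encoder(image + delta)
        # Make the encoder's view of the image look like the target latent.
        loss = torch.nn.functional.mse_loss(latent, target_latent)
        loss.backward()
        with torch.no_grad():
            # Step in the direction that reduces the loss...
            delta -= step_size * delta.grad.sign()
            # ...while keeping the perturbation small and the pixels in [0, 1].
            delta.clamp_(-eps, eps)
            delta.copy_((image + delta).clamp(0, 1) - image)
        delta.grad.zero_()
    return (image + delta).detach()
```

In a sketch like this, the perturbation stays below a small per-pixel budget (`eps`), so the protected photo looks unchanged to a person, while an editing model that relies on the encoder's output is steered toward a useless target and struggles to produce a convincing manipulation.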