Facebook Promises to Develop AI That Can Detect Terrorist Videos

Facebook has been trying to stop the spread of video of the shootings at mosques in New Zealand. The social media giant acknowledged that it had failed to stop the video immediately while it was being broadcast.

In a recent statement, Guy Rosen, VP of Integrity at Facebook, discussed the company's successes and shortcomings in handling the situation, as well as its plans to keep such videos from spreading on the social network in the future.

He explained that although the platform's AI can quickly detect videos containing suicide or other dangerous content, the shooting footage went undetected.

Training AI to recognize a particular type of content requires a large amount of training data.

As Facebook explained, such data is hard to come by because “this event is very rare”. In addition, no user reported the video while it was live: the first user report came in 29 minutes after the broadcast started, 12 minutes after the live stream had ended.

The live broadcast was viewed fewer than 200 times, while the original video was viewed about 4,000 times in total.

Rosen also explained how more than 300,000 copies still spread on the platform even though Facebook's systems detected and blocked 1.2 million copies of the video at upload. He said “a core community of bad actors” kept uploading edited versions of the video.

By uploading slightly modified versions rather than the original copy, these users could evade the platform's filters. Some even played the video on their computers and re-recorded it with their phones.
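
This kind of evasion works because exact-match filters only flag bit-identical copies. As a rough sketch of the general idea (not Facebook's actual system; the file names and threshold below are hypothetical), a perceptual “difference hash” maps visually similar frames to nearby hash values, so a re-encoded or re-recorded copy can still be caught by comparing Hamming distance against a threshold:

```python
from PIL import Image

def dhash(image_path, hash_size=8):
    """Difference hash: shrink to grayscale, compare each pixel to its
    right-hand neighbor, and record the comparisons as a bit list."""
    img = Image.open(image_path).convert("L").resize(
        (hash_size + 1, hash_size), Image.LANCZOS
    )
    pixels = list(img.getdata())
    width = hash_size + 1
    return [
        1 if pixels[row * width + col] < pixels[row * width + col + 1] else 0
        for row in range(hash_size)
        for col in range(hash_size)
    ]

def hamming(a, b):
    """Number of bit positions where two hashes disagree."""
    return sum(x != y for x, y in zip(a, b))

# A cropped, re-encoded, or phone-recorded frame hashes close to, but not
# exactly equal to, the original -- an exact-match filter misses it, while
# a small distance threshold still flags it as a near-duplicate.
# (The paths and the threshold of 10 are illustrative assumptions.)
if hamming(dhash("original_frame.png"), dhash("modified_frame.png")) <= 10:
    print("near-duplicate frame detected")
```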

In total, Facebook identified more than 800 visually distinct variants of the video.

To prevent similar videos from circulating in the future, Facebook plans to improve its artificial intelligence, including through the use of audio-based detection.
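
Audio matching helps because a video's soundtrack often survives the visual edits (cropping, re-encoding, filming a screen) that defeat image hashes. As a minimal, hypothetical sketch of one audio-fingerprinting idea (dominant spectral peak per window; not Facebook's actual technology, and real systems use far more robust schemes):

```python
import numpy as np

def audio_fingerprint(samples, window=4096, hop=2048):
    """Toy fingerprint: the dominant frequency bin in each analysis window."""
    peaks = []
    for start in range(0, len(samples) - window, hop):
        spectrum = np.abs(np.fft.rfft(samples[start:start + window]))
        peaks.append(int(np.argmax(spectrum[1:]) + 1))  # skip the DC bin
    return peaks

def match_score(fp_a, fp_b):
    """Fraction of aligned windows whose peak bins agree."""
    n = min(len(fp_a), len(fp_b))
    return sum(a == b for a, b in zip(fp_a[:n], fp_b[:n])) / max(n, 1)

# A copy filmed off a screen keeps roughly the same dominant frequencies,
# so its fingerprint still scores high against the original's.
rate = 44100
t = np.arange(rate) / rate
original = np.sin(2 * np.pi * 440 * t)            # synthetic stand-in track
copy = original + 0.1 * np.random.randn(len(t))   # noisy "re-recorded" copy
print(match_score(audio_fingerprint(original), audio_fingerprint(copy)))
```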
