Friday, 22 March 2019

To place a time delay on live streams


"Artificial Intelligence systems rely on "training data". Social media companies feed their software on this "training data" as recognizable examples of content to take down. 
Facebook uses such systems to help catch and take down content like nudity.
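
To make the idea concrete, here is a minimal, hypothetical sketch of such a classifier in Python, using scikit-learn on toy text posts. The examples, labels, and model choice are illustrative assumptions only; Facebook's real systems operate on images, video and many other signals and are not described by this sketch.

# Toy sketch: an automated take-down classifier only learns to flag content
# that resembles the labelled examples ("training data") it was fed.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled training data: 1 = violates policy, 0 = allowed.
posts = [
    "graphic violence clip reupload",
    "explicit nudity photo set",
    "family picnic photos from the weekend",
    "recipe for lentil soup",
]
labels = [1, 1, 0, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

# Content similar to the training examples tends to be flagged; genuinely
# new kinds of content may not be, which is the gap discussed below.
print(model.predict(["another graphic violence clip"]))   # expected: [1]
print(model.predict(["photos from a birthday party"]))    # expected: [0]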

The social media company's vice president of integrity, Guy Rosen, wrote (https://newsroom.fb.com/news/2019/03/technical-update-on-new-zealand/) that the New Zealand shooter's video did not trigger Facebook's automatic detection systems because its artificial intelligence did not have enough training to recognize that type of video.
The shooter livestreamed 17 minutes of the horrific attack on Facebook. The company said that while the video was live, fewer than 200 people watched it; it was later viewed 4,000 times before Facebook took it down. Facebook has not said exactly when it removed the video. Since the attack, the video has been downloaded and re-uploaded millions of times to various platforms, and New Zealand leaders have criticised Facebook for not taking enough action to remove all versions of it.

Some critics have called for Facebook to place a time delay on live streams.
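
As a rough sketch of what such a delay could look like mechanically, the Python fragment below buffers incoming stream chunks and releases each one only after a fixed number of seconds, leaving a window in which a human or automated review step could cut off the broadcast. The chunk format, the 30-second figure, and the review hook are all illustrative assumptions, not a description of any real platform's implementation.

import time
from collections import deque

DELAY_SECONDS = 30  # hypothetical delay length

def relay_with_delay(incoming_chunks, publish, review=None):
    """Forward stream chunks to publish() only after DELAY_SECONDS have passed.

    incoming_chunks yields (arrival_timestamp, chunk) pairs as they arrive;
    review(chunk), if given, may return False to cut off the broadcast.
    """
    buffer = deque()
    for arrived_at, chunk in incoming_chunks:
        buffer.append((arrived_at, chunk))
        # Release every buffered chunk whose delay window has elapsed.
        while buffer and time.time() - buffer[0][0] >= DELAY_SECONDS:
            _, ready = buffer.popleft()
            if review is not None and review(ready) is False:
                return  # stop publishing: the stream was flagged during the delay
            publish(ready)
    # A real relay would also drain the remaining buffered chunks, still
    # respecting the delay, once the incoming stream ends.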


