https://www.nytimes.com/2017/04/17/t...pgtype=article
This is dangerous. I'm going to start by saying that right now. A video like that should be pulled down immediately, and it's unfortunate that it stayed up for that long, since now it will likely be on the internet forever. Once you put something on the internet, it's on there; you can never get rid of it. But if you give people the power to pull down anything from the internet that they want, they are going to abuse it.

On Easter, Steve Stephens drove around downtown Cleveland on what he said was a mission to commit murder — and soon he had an audience of millions for his shooting of Robert Godwin Sr., 74, which he recorded and posted on Facebook, the police in Cleveland said.
On Monday, the authorities nationwide were looking for Mr. Stephens, 37, with the police as far away as Philadelphia saying they had received calls about sightings of him in that area.
Now Facebook is facing a backlash over the shooting video, as it grapples with its role in policing content on its global platform.
It is an issue that Facebook, the world’s largest social network, has had to contend with more frequently as it has bet big on new forms of media like live video, which give it a venue for more lucrative advertising. The criticism of Facebook over Mr. Stephens’s video built swiftly Monday, with critics calling it a dark time for the company and outrage spreading on social media over how long it had taken — more than two hours — for the video to be pulled down. Ryan A. Godwin, the victim’s grandson, pleaded with other users on social media to stop sharing the video online.
Going to comment on YouTube's situation first. It's bullshit, because ads are targeted: they can appear next to anything. If you search for certain products and then go on YouTube and watch a video about, I don't know, say, pro white genocide, that does not mean that product supports the message in that video. I don't know where this came from or why this is happening. And Twitter's hate speech thing is also a load of shit. They're left leaning; they recently banned a lot of right-leaning posters for no apparent reason, and they're still doing it. Their flagging system is automatic.

Facebook’s dilemma is part of a debate that has pulled in other technology giants, including Twitter, Amazon and Google. As these companies have rushed to provide tools for people to widely share their intimate moments more frequently, they are dealing with a rising tide of calls to more proactively filter the type of content that appears. In recent weeks, Google’s YouTube has been scrutinized for posting advertising next to racist video content, while Twitter contends with hate speech almost daily.
This is happening more and more often. Facebook Live, YouTube streaming: you can stream from anywhere with a good Wi-Fi connection now, or if your phone company is good enough. And it's going to keep happening. Traumatic events can be streamed live, and companies seem helpless to stop it, because they won't take notice until thousands are already watching, and then they will be blamed for it. I don't think they can actually do anything about it without screwing up, like taking down streams that probably shouldn't be taken down.

But the attention is often focused on Facebook because of its nearly two billion users and global influence. It is an issue that is bedeviling Mark Zuckerberg, the company’s chief executive. Facebook has encouraged users to post more — it has spent the last two years emphasizing its push into photographs and video, underpinned by a thesis that cameras have become more important in how people share moments of their lives with their friends.
The company was not prepared for the consequences of that push. Last summer, the death of Philando Castile, a Minnesota man shot by the police during a traffic stop, was broadcast by his girlfriend live across Facebook. In January, three men in Sweden were arrested on suspicion of raping a woman and streaming the assault live to a private Facebook group. In February, two radio journalists in the Dominican Republic were fatally shot during a Facebook Live broadcast.
Some groups have pressured Facebook to take a stronger role in reviewing content posted on its platform. In a letter this year to Joel Kaplan, Facebook’s director of global policy, the American Civil Liberties Union called for the social network to be more transparent in its censorship process and to agree to an external audit of its practices.
I suppose the question is this: should Facebook and Google be held responsible for violent content that makes it onto their sites, even if it gets taken down quickly?