The online comment sections of everything from blogs to newspapers are often such a cesspool of hate that one wonders whether Russia really has that many bots, and then you realize no, people really are that angry. And yes, clearly the bots surf the web incessantly, flagging and tagging sites they can polarize and exploit to generate friction, for intentions I am not fully clear on, though I think it makes Putin happy to see Americans get their knickers in a twist.
We have become the indignation nation, where we stifle free speech if we don't like it, or turn it on its ass to bring harm to others when a car barrelling through protesters is not enough, and we elect the Orange Theory to the White House because he calls it like he sees it and is a tough talker who has no time for political correctness. This is what our obsession with being so vanilla has done: made us all turn into word police who in turn call names and destroy other people's lives just to make ourselves feel better.
I have discussed my run-in with an individual over calling children "special snowflakes" and saying "shut up" when a child was running off at the mouth with vile, hate-filled nonsense. It was a way to shut down a toxic situation without needing to escalate it or turn it into an endless talking cycle. And that was met with a negative ad hominem attack, as if I had used a specific name with a vulgarity attached. This is what we have become: vanilla, angry, bitter people who are sure that if we are all good enough, smart enough, and dull enough, people will be just like us. Silicon Valley is taking care of that right now with AI and robots. Finally I can speak my mind to my new friend - RorBort.
Friday night Bill Maher had on Kathy Griffin, and love her or hate her, she was utterly vilified for a tacky, if not stupid, photo holding Donald Trump's severed head. The "right" wanted her destroyed: to take away her ability to work, and to terrify her and anyone who would work with her, in the most extreme manner, to serve a purpose I have yet to fully grasp. Don't like what someone says? Walk away; don't listen, don't read or watch or care about them. But scolding and reprimanding someone does not seem to be enough; it is almost a compulsion to utterly destroy them. And this falls across both political identities. And the purpose is to do what? Leave them penniless and homeless and then fully dependent upon the Government? Or should they just die? When kids say and do shit that seems utterly idiotic, I ask them: is there a point? What do you want me to say and do when they level one moronic comment after another? I already know it is to get angry or get sad, as they need emotions with which to resonate. When I don't, I utterly confuse them to the point that they finally shut up. And the children learn this from whom? Again, the strident indignation and the need to tattle and do harm is the act of a child, not an adult.
The 19-year-old girl on the plane who videoed the angry woman complaining about her baby crying posted the clip, and it went viral. The woman's employer terminated her, and the girl realized that was not her intent. Her intent was what? To shame and embarrass, yes, but why? What did she gain from that? And that woman is one of many who have lost their jobs. The woman on the bike giving Trump the finger, and others who have said or done something on impulse, now find themselves, thanks to social media, under fire or just fired for what was nothing but an act of stupidity.
The internet is abuzz with story after story of someone who tweeted or posted something inane, or simply said something that was not intended to be harmful or unkind but was interpreted that way. Another evil perpetrator of online bullying, Sam Biddle, seems to embrace that reputation, and is another possible reason why sites like Gawker were finally shut down. And it appears Thiel is not done with them even today.
When you are rich and powerful, hell hath no fury like a Queen exposed. Bully is as bully does. He who accuses excuses. Again, this is what the Valley brought us: the ability to torment people over and over again, point fingers, lay blame, and feel better about ourselves for what, exactly?
And that ability to harm others by inferring or interpreting someone's words is all part of literature, journalism, and reading. It is what defines critical thinking and analysis, which means interaction.
When you speak or write with metaphors, or use an example to illustrate a point, you find yourself trying to explain that that was all it was, and the explanation only seems to make it worse. The reality is that no one can take a joke when they don't know you, cannot see your face, or cannot have a full dialogue with you. Confined to a few short words relegated to a Facebook post or a comment section, you open yourself up to endless bullshit. But mostly I have learned, from standing in front of a classroom, that it's a bait and switch: they want you to explode. The same thing I do now in classes, I do online. I quit responding, as it wastes my time and energy, and yet at times even I am guilty of taking a verbal stab back. The perk of anonymity is the ability to be a bitch.
Bari Weiss, a new addition to the New York Times, was also on Bill Maher, and frankly I find her more interesting in person than in her writing. I am not sure of her purpose as a guest, other than to make Bill feel better for being a contrarian, and there is nothing about being a Millennial at this point that deserves that label (I prefer "whiner"). I don't follow her on social media, and given that most of her "problems" stem from it, I can see her point that endless Twitter wars do nothing to advance an argument and that the forum seems to make things worse. However, she is like Trump and apparently cannot stay away from its lure, and that may be a problem.
And all of this was because Bill Maher was criticized for using the word "nigger." Again, it was a stupid example and a bad choice, but he was not calling anyone that name; he was trying to use it as an example or metaphor while sitting with Ben Sasse, a total phony who was full of shit. I still think he should have said "Mexican," as that is what the Mexican people are today in our culture - slaves.
And this is my opinion and my thoughts on the matter. I have spent way too much time explaining, defending, and justifying my thoughts, to the point that I don't want to hear anyone else's. That is what happens, and why we live in bubbles.
I still read the comment sections of newspapers and roll my eyes, but I am not compelled to do more than post my own thoughts; I do not respond to others, negative or positive. But at least I read them.
A Bot That Identifies 'Toxic' Comments Online
A Google-funded algorithm flags messages that are likely to drive others away from a conversation.
Kaveh Waddell The Atlantic Feb 23, 2017 Technology
Civil conversation in the comment sections of news sites can be hard to come by these days. Whatever intelligent observations do lurk there are often drowned out by obscenities, ad-hominem attacks, and off-topic rants. Some sites, like the one you’re reading, hide the comments section behind a link at the bottom of each article; many others have abolished theirs completely.
One of the few beacons of hope in the morass of bad comments shines at The New York Times, where some articles are accompanied by a stream of respectful, largely thoughtful ideas from readers. But the Times powers its comment section with an engine few other news organizations can afford: a team of human moderators that checks nearly every single comment before it gets published.
For those outlets who can’t hire 14 full-time moderators to comb through roughly 11,000 comments a day, help is on the way. Jigsaw, the Google-owned technology incubator, released a tool Thursday that uses machine-learning algorithms to separate out the worst comments that people leave online.
The tool, called Perspective, learned from the best: It analyzed the Times moderators’ decisions as they triaged reader comments, and used that data to train itself to identify harmful speech. The training materials also included hundreds of thousands of comments on Wikipedia, evaluated by thousands of different moderators.
Perspective’s current focus is on “toxicity,” defined by the likelihood that a comment will drive other participants to leave a conversation, most likely because it’s rude or disrespectful. Developers that adopt the platform can use it as they choose: It can automatically suppress toxic comments outright, or group them to help human moderators choose what to do with them. It could even show a commenter the toxicity rating of his or her comment as it’s being written, in order to encourage the commenter to tone down the language. (That could work a little bit like Nextdoor’s prompts aimed at tamping down on racist posts.)
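Wired into a site's comment pipeline, those three modes amount to a simple threshold rule on the toxicity score. A minimal sketch in Python (the function name and threshold values here are illustrative assumptions for one possible integration, not part of Perspective itself):

```python
def triage(comment_text: str, toxicity_score: float,
           suppress_above: float = 0.90,
           review_above: float = 0.50) -> str:
    """Route a comment based on a Perspective-style toxicity score (0.0-1.0).

    The thresholds are illustrative guesses; a real site would tune them
    against its own moderation workload.
    """
    if toxicity_score >= suppress_above:
        return "suppress"      # hide the comment automatically
    if toxicity_score >= review_above:
        return "human_review"  # queue it for a moderator to decide
    return "publish"           # post it immediately

# Example routing:
print(triage("You're tacky and I hate you", 0.90))  # suppress
print(triage("You're a butthead", 0.36))            # publish
```

The same score could instead be surfaced back to the commenter as they type, per the article's Nextdoor-style nudge, rather than used to gate publication.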
Perspective’s website lets you test the system by typing in your own phrase. The system then spits out a toxicity rating on a 100-point scale. For example, “You’re tacky and I hate you,” is rated 90 percent toxic. Fair enough. But there are discrepancies—“You’re a butt” is apparently 84 percent toxic, while “You’re a butthead” is only at 36 percent. (When I tried more aggressive insults and abuse—your usual angry comments-section fodder—each scored over 90 percent.)
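Behind the demo page, integrations talk to Perspective's Comment Analyzer REST endpoint, which takes a small JSON body and returns per-attribute probability scores. A hedged sketch of the request and response shapes (the sample response values below are illustrative, and the exact schema may change as the API evolves):

```python
def build_request(text: str) -> dict:
    """Build the JSON body Perspective's analyze endpoint expects:
    the comment text plus the attributes to score (here, TOXICITY)."""
    return {
        "comment": {"text": text},
        "requestedAttributes": {"TOXICITY": {}},
    }

def extract_score(response: dict) -> float:
    """Pull the 0.0-1.0 toxicity probability out of an analyze response."""
    return response["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

# A response for "You're tacky and I hate you" would look roughly like:
sample_response = {
    "attributeScores": {
        "TOXICITY": {
            "summaryScore": {"value": 0.90, "type": "PROBABILITY"}
        }
    }
}
print(extract_score(sample_response))  # 0.9
```

An actual call would POST `build_request(...)` to the Comment Analyzer URL with an API key; the sketch stays offline and only shows the payload shapes.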
The Times has been using the system since September, and now runs every single incoming comment through Perspective before putting it in front of a human moderator. Perspective will help the newspaper expand the number of articles that include comments—currently, only about one in ten have comments enabled.
Future versions of Perspective will approach other aspects of online commenting. It may one day be able to tell when a comment is off-topic, for example, by comparing it to the themes contained in the news story it’s referring to.
The platform could help make more comment sections enjoyable and informative—and it might help draw out voices that are often silenced by harassment. A study published in November found that nearly half of Americans have been harassed or abused online, and that women, racial minorities, and LGBT people are more likely to be attacked than others.
The abuse drove people to change their contact information or retreat from family and friends. Worryingly, it also led one in four people to censor themselves in order to avoid further harassment. Most harmful abuse happens on social networks, not news-site comment sections, of course—Twitter is often a loud crossfire of vitriol—but barbs exist on every social platform. Tamping down on abuse on news sites can help make them a safer space for commenters.
Perspective’s developers hope that opening the tool to every publisher will bring comment moderating within reach for more, and perhaps stave off the demise of comment sections. As more news organizations adopt the system, it will continue to learn and improve its accuracy. And if automated moderating proves useful for news sites, it may have a future on larger social media networks, which are most in need of a gatekeeper to stop abuse.