Why is this such a big conservative gripe? Just trying to understand it.
1/12/2021 9:27:03 PM
Republicans LOVE unfettered capitalism when it benefits THEMSELVES, but once the capitalism becomes fettered (or worse, starts benefiting others) they are SO NOT DOWN
1/12/2021 9:29:00 PM
It ain't just a conservative gripe. Dems hate it too
1/23/2021 11:53:01 AM
Tell me why I hate Section 230.

-]
1/23/2021 11:56:28 AM
Progressives and Dems hate it for the same reason Trump hates it: because they can't sue tech platforms over speech they don't like. The only difference between the two sides is which speech they dislike.

It cannot seriously be argued that there is not bipartisan hatred of Section 230.

https://www.bloomberg.com/news/articles/2020-08-11/section-230-is-hated-by-both-democrats-and-republicans-for-different-reasons
https://www.cnet.com/google-amp/news/democrats-and-republicans-agree-that-section-230-is-flawed/
https://reason.com/2020/12/18/the-bipartisan-push-to-gut-section-230-will-suppress-online-communication/?amp

[Edited on January 23, 2021 at 12:24 PM. Reason : ]
1/23/2021 12:02:43 PM
There's a bit more daylight between those positions than your post implies. But you've always been a both-sides kinda guy, so yeah, no surprise there.
1/24/2021 2:37:10 PM
For fuck's sake, man. Joe Fucking Biden said it should be revoked. Trump said it should be revoked. How much "daylight" do you see between those two positions?

https://www.npr.org/2020/05/30/865813960/as-trump-targets-twitters-legal-shield-experts-have-a-warning
1/24/2021 8:44:29 PM
you mean the right and the far right want it gone. i don't know any progressives who do.
1/24/2021 9:46:39 PM
1/24/2021 9:56:04 PM
Certainly I hate it. Websites are publishers and should be treated as such. The state of the country and world would be immeasurably better if we'd been doing that for the last 20 years.

It's a little silly to say everyone wants to sue tech firms for speech "they don't like." The First Amendment still exists. But when Alex Jones lunatics post videos telling people to harass shooting victims, posting their home addresses, etc., YouTube needs to be accountable for enabling that.
1/25/2021 10:18:04 AM
^
1/25/2021 12:24:41 PM
Why should YouTube bear the legal burden over Jones personally? Or, if Jones is liable, why should YouTube bear the burden at all?
1/25/2021 1:14:31 PM
RIP TWW
1/25/2021 1:27:31 PM
Is it even feasible for a website like Youtube to monitor their users on a level that this would require? I really don't know; I'm fairly ignorant with respect to this, which is why I'm asking.

I'm thinking Youtube has millions of users, so I don't think it's feasible for Youtube to have human monitoring of all these interactions. Sure, you can have some level of algorithmic monitoring, looking for hate speech and threats, but I feel sure things can be phrased in such a way as to slide past most any algorithm like that; in that case, I don't think Youtube should be held responsible, with or without 230.
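To illustrate what I mean (a toy sketch in Python, not anything a real platform actually runs - the banned list and function names are made up for the example): even a straightforward keyword filter gets beaten by trivial obfuscation.

```python
import re

# Toy keyword filter -- hypothetical banned list, purely for illustration.
BANNED_PHRASES = ["harass", "home address"]

def flags_post(text: str) -> bool:
    # Normalize to lowercase letters and spaces, then look for banned phrases.
    normalized = re.sub(r"[^a-z ]", "", text.lower())
    return any(phrase in normalized for phrase in BANNED_PHRASES)

print(flags_post("go harass him at his home address"))  # True: caught
print(flags_post("go h@rass him at his h0me addre55"))   # False: leetspeak slides right past
```

Real systems use ML classifiers rather than keyword lists, but the adversarial problem is the same in spirit: the poster gets to rephrase until something gets through.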
1/25/2021 1:38:51 PM
so should TWW be responsible for the shitty poasts and child pr0n (shit's gross, son) users may upload?[Edited on January 25, 2021 at 1:49 PM. Reason : not saying it exists, just a more relatable scenario]
1/25/2021 1:49:00 PM
With other media, one who publishes material that is, say, defamatory, has the same liability for the material as the individual who created it, on the theory that a publisher has the means and responsibility to check its veracity. Nobody bats an eye at this, and for that matter, very few people successfully sue these publishers for anything. Again, holding a publisher liable doesn't mean it's open season on everything that someone might disagree with. The standards for successfully suing someone over speech are extremely high.

So why should YouTube get different treatment from a newspaper or magazine? If the News & Observer is held responsible for what it publishes, why isn't Facebook? Cabbage points to scale. OK, how much volume does a publisher have to put out before they're immune from liability? And what kind of fucked up logic is it that the bigger and more powerful you get, the fewer rules we're going to put on you?

I get that moderating content would be enormously difficult for YouTube, Facebook, etc. So? They made this enormous, unaccountable thing. These guys are Dr. Frankenstein - they created a monster; the monster started causing damage; they hemmed and hawed about reining in the monster; the monster caused a lot more damage; now they need to set off on the quest to destroy it. That might mean massive changes to how these sites operate. Would that be the worst thing in the world?

Oh, and one more thing before I move on - most states impose penalties on people that raise losing defamation cases against publishers. Anti-SLAPP laws, in a rare case of legal terms being fun. There won't be millions of people frivolously suing - at least, not for long, not after they all get their cases thrown out and have to pay $texas to Mark Zuckerberg.
1/25/2021 3:00:13 PM
1/25/2021 3:51:35 PM
And now that I've thought about it a bit more, one problem with having an open-source algorithm to filter content is that it would actually make it easier for people to bypass the filter, so I'm really not sure that's the best approach after all. I'm just spitballing some ideas here.
1/25/2021 4:06:48 PM
So you touch on the only real distinction between traditional publishers (books, newspapers) and online publishers: whether material is reviewed before it is published rather than after. And this is the area in which I'd agree there do need to be separate rules to deal with the practical aspect, but which ultimately arrive at the same end: responsibility for what gets published. A grace period is in order; I don't expect YouTube to take down defamatory material within minutes or even hours. So whereas a newspaper publisher can be expected to do their due diligence before running the presses, we might allow social media a day after the fact before they're liable.

You point out that holding these companies accountable will lead to massive, even catastrophic changes in how they operate. I get that. I'm happy about that. Putting social media companies in a position where they have to get really serious about fake news, bots, troll farms, etc. is a good thing. Surely we can't spring it on them overnight - give them time to reorient their businesses. But if the future of social media is slower, less wild, more moderated, and finally curated with an eye towards truth and civic responsibility rather than algorithmically click-generated profit, I say: so much the better.

(Note to any TWWers who fall into the more libertarian/free market camp: First of all, hi, I didn't think any of you were left. Second of all, I'm not anti-profit or generally in favor of forcing private enterprise to behave in accordance with my understanding of "civic responsibility." In this case, I refer merely to their having to follow the same rules as the rest of us.)
1/25/2021 5:33:43 PM
Some 64% of users who join an extremist Facebook group do so because Facebook recommends it to them. Facebook actively promotes defamatory and extremist content to drive engagement and make money. Social media companies are not passive participants here and should not be treated as such. The speed and range of social media are far beyond anything possible with physical media. If anything, requirements for online publishing should be more restrictive than those for traditional publishing.
1/25/2021 6:54:55 PM
^^ Yeah, I agree with pretty much all of that. In case my position wasn't clear, I'm really not trying to advocate a pro or con social media position here, merely trying to point out some issues that I think make it complicated.

I remember seeing on Bill Maher's show a few years ago they had, I think, a Google employee with a job title something like "Social Engineer." Another guest mentioned how Orwellian that sounded, and he defended it by basically saying that Google gets such widespread use that they need to be aware that their search results (how they are ordered, what results come at the top of the list, for example) contribute to shaping opinions and points of view, and that it is incumbent upon them to put effort into at least pushing that in a socially positive direction. I'm not presenting that example to promote a pro-Google position; I don't know enough about their sorting algorithms (and their results) to have a pro or con opinion on how good a job Google is doing on that front. I merely bring it up to point out I'm aware of the issues you mention; in fact, I believe I mostly agree with your position.

I just think we should be careful not to let the pendulum swing in the opposite direction, introducing draconian measures that would hold a social media platform responsible for the conduct of any of its members whenever some post vaguely references that conduct. There should be some clear metric that says THIS is what you, as a platform, need to maintain for the good of society. It won't be perfect, and when a platform meets that standard but something still slips through, we shouldn't hold it accountable.
1/25/2021 8:21:03 PM
All good points. But can we start with the simple? These platforms, since their inception, have had or developed policies to combat some of these issues, but time and time again, they've defended not removing content / users except in the most extreme cases, up until recently. Point being, in addition to ^^, these platform operators know about this content, talk about this content, and outright refuse[d] to act. They are complicit and only recently have seen at least some light. They enable. They invite. They willingly give platforms to known bad actors.
1/26/2021 10:04:11 AM
Anyone want to talk about the consequences of removing Section 230 protection?

The legal standard before Section 230 - and the standard we would return to if it were repealed - is that a platform could only avoid liability for third-party content if it imposed no moderation whatsoever. In this day and age that means everywhere would turn into a cesspool of porn bots and Nazis, because everyone else would be driven off.

If websites did impose moderation to remove what undesirable content they could, they would have liability for what's left. If repealed, that would mean no new or small social media sites or forums or sites with comments, since even the costs of defending frivolous litigation would break them. You can't effectively and comprehensively moderate at scale. If you try to automate moderation, you're going to miss bad content and have a lot of false positives.

Any remotely controversial content would never see the light of day, since banning it is way cheaper than potentially being sued over it. So there would be no negative product reviews. Blogging platforms (remember LiveJournal?) would likely go away completely. Fan fiction sites gone. YouTube would just be corporate content. Want to complain about local government? You're going to have to mail a newsletter. There are platforms for the disadvantaged that only exist because of the low barrier to entry provided by sites otherwise protected by Section 230. So many useful sites would be stripped of their utility if that protection is stripped. This place literally wouldn't exist, because Ken isn't going to risk losing his house for when The Soap Box devolves to name calling or someone posts pics of Anita Flick and then a lawyer gets involved.

Section 230 was created for a reason, and it wasn't to help mega corporations or enable assholes to harass people. Its purpose is to allow online communities to moderate what they can without getting sued for the content they missed. The goal is to foster good communities. The Internet is bigger than Facebook and YouTube, and people need to carefully consider the consequences of what they're asking for.
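Some back-of-the-envelope math on that last point, with made-up but not crazy numbers (every figure below is an assumption for illustration, not any platform's real stats):

```python
# Base-rate arithmetic for automated moderation. All numbers are
# hypothetical, chosen only to show how the math scales.
uploads_per_day = 500_000   # assumed daily uploads
bad_rate = 0.001            # assume 0.1% of uploads are actionable
accuracy = 0.99             # assume a 99%-accurate automated classifier

bad_uploads = uploads_per_day * bad_rate
missed = bad_uploads * (1 - accuracy)                               # bad content that slips through
false_positives = (uploads_per_day - bad_uploads) * (1 - accuracy)  # good content wrongly flagged

print(f"bad uploads missed per day:   {missed:.0f}")           # ~5
print(f"good uploads flagged per day: {false_positives:.0f}")  # ~4995
# Every miss is potential liability; every false positive is an angry user.
# Both scale linearly with volume, and volume is enormous.
```

Even at 99% accuracy, wrongly flagged legitimate content outnumbers the bad content caught by roughly ten to one, and a handful of misses still get through every single day. Without 230, each of those misses is a lawsuit waiting to happen.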
1/26/2021 11:17:10 AM
That’s one way it could be done, I suppose. There are tons of smart engineers out there who would find ways to moderate content without being so onerous. Shoot, there are already message board systems that manually moderate content posted for your first X number of posts to determine that you’re not a piece of shit before allowing general access. Also, verified/trusted users are a thing that can be done.

I firmly believe that there is a way to rein in the Wild West culture while still respecting the “town square” culture.

This isn’t a technical problem; it’s a scale problem.

[Edited on January 26, 2021 at 2:31 PM. Reason : Z]
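The kind of gating I'm describing is simple to sketch (hypothetical names and thresholds, not any real board's code):

```python
# Minimal sketch of post-count-based gating. The User record,
# REVIEW_THRESHOLD, and routing labels are all invented for this example.
from dataclasses import dataclass

REVIEW_THRESHOLD = 25  # a new account's first N posts go to a moderation queue

@dataclass
class User:
    name: str
    post_count: int = 0
    verified: bool = False  # e.g., identity-verified or long-standing member

def route_post(user: User, text: str) -> str:
    """Decide whether a post publishes immediately or waits for human review."""
    if user.verified or user.post_count >= REVIEW_THRESHOLD:
        return "publish"
    return "moderation_queue"

newbie = User("new_account")
regular = User("old_timer", post_count=5000, verified=True)
print(route_post(newbie, "first post!"))   # moderation_queue
print(route_post(regular, "hello again"))  # publish
```

The point is that the review burden concentrates on the small fraction of accounts that are new or untrusted, instead of on every post from everyone.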
1/26/2021 2:30:07 PM
1/26/2021 3:30:50 PM
Even defending frivolous suits takes time and $$$. You can be bankrupt and on the street before the process plays out. It's begging the rich to prey on the weak - a play on the adage, "The process is the punishment." Also, anti-SLAPP isn't universal. Just look at the Devin Nunes cow defamation cases. They're obviously a farce, but you have folks six figures in debt defending those cases.
1/26/2021 3:57:15 PM
Speech is great. Free speech is great. For 250 years in this country, we've largely agreed on this point. We've also agreed that free speech is not boundless. You can't stand in the middle of the street at 3:00 AM with a bullhorn and shout that Barack Obama is a lizard person from planet Muslim. We're fine with these limits in every other context; why are they suddenly so horrifying in the context of the internet? Because it will be hard for poor, put-upon tech giants to deal with? The novelty of their industry doesn't grant immunity.

When Ford rolled out the Model T and people started getting hit by cars, we didn't say, "Well, I know it's terrible that these people are being run over, but [traffic laws, safety standards, etc.] are going to put an insurmountable burden on car manufacturers. Nobody will want to buy one because if they hit somebody they might get sued. At the end of the day, this is about freedom of movement and I'm not going to be easily convinced we need less of it." No. We made rules about what you could do with cars and what would happen if something went wrong. In the process, we created a new industry - auto insurance - to deal with the fact that people were going to have to take responsibility for their actions and mistakes. And now there are 273 million automobiles in the United States.

If you get rid of 230 and provide a robust legal mechanism to penalize frivolous or bullying lawsuits - effectively a beefed-up version of anti-SLAPP rules at the federal level, given that online publications are necessarily interstate - then I envision some enterprising portion of America's glut of lawyers will move to fill the new gap, defending against such cases free or at very low cost to the client, the lawyers to be paid out of the penalties. Possibly this would take the form of a new arm of personal liability insurance. Whatever the form, well-structured penalties offer an incentive for accessible legal defense and a disincentive against spamming websites with lawsuits.
1/26/2021 5:14:12 PM
Let's come at this from another angle. Why should the website hosting third party content be liable if you can just go after the author? Should I sue Ford because someone driving an F-150 ran over my cat? Should they have not vetted the driver before they passed over the keys to the vehicle they built with their name plastered across the side?
1/26/2021 9:54:11 PM
First problem: assuming you can even identify the author of something on the internet. Anonymity is baked into so many sites, and even places that nominally require your real name (say, Facebook) hardly enforce it. But even if you could require that users verify their identity before signing up for a site, and even if you made those identities publicly available (so that they can be held liable) - how are you going to verify that the owner of the account is the one who posted a given item?

Second problem: even if you could identify everybody on the internet, most of them live outside of the United States and are beyond the practical reach of the U.S. legal system.

Third problem: going after the author doesn't get rid of the material causing the problem. Your hypothetical F-150 doesn't keep driving around flattening cats after you bring the first driver to trial, but a defamatory video keeps getting views unless the website takes it down. OK, you say, we'll make the websites take down videos that are the subject of lawsuits. But what are we going to do if they don't? Sue them?

None of which is to say that authors should be immune, obviously.
1/27/2021 7:39:49 AM
1/27/2021 10:56:39 AM
^^^ A better analogy is Tesla's Autopilot. Technology advances from a '48 F-1 to an Autopilot-equipped Tesla mean that manufacturers are taking on responsibility for operating the car. They're no longer providing a simple machine; they're actively involved in the operation of the vehicle. Should Tesla be liable when a Model 3 runs over your cat while on Autopilot? Maybe. Certainly no one would argue Tesla should enjoy blanket immunity for incidents involving their vehicles. (See also: 737 Max.)

Likewise, user-based websites have evolved from basic forums to massive social media platforms. I believe forums (e.g. TWW, craigslist, and even cesspools like Stormfront, 4chan, etc.) should generally be free of liability for user-generated content. A key feature for me is that these sites perform little promotion or curation and have relatively limited reach.

Social media companies should be open to liability for content posted on their sites, a distinguishing feature being that they select specific content and promote it to other users (potentially millions). Users don't just see what others post; they see user content the platform has selected for them (promoted posts, "you might like," autoplay, etc.). These sites take an active role in what content their users see and should share responsibility for that content.

Obviously there's a continuum between small, basic forums like TWW and a massive social media company like Facebook that actively curates its content. I'm not exactly sure where the lines should be drawn between promotion of selected content and responsibility for that content. Where would a Reddit-style voting system fall?

I also share your concern about chilling effects. Craigslist was browbeaten into removing its adult forums. I'm sure it wouldn't take much for Ken and other small operators to pull the plug, and it's not hard to imagine larger companies deciding it's not worth the trouble and killing support forums, comment sections, etc.

-]
1/27/2021 1:22:12 PM
1/27/2021 10:25:57 PM
1/27/2021 11:10:09 PM
Regardless of why 230 was created to begin with, I'm concerned with the problems it enables now.

The ineffectiveness of state SLAPP laws is immaterial when I'm suggesting that any repeal of 230 come with a federal equivalent - which seems logical enough, given the necessarily interstate nature of the internet - that would need to be considerably more effective for me to favor it.

The doom-and-gloom predictions that all these sites are going to close or become pale imitations of their former selves don't seem to be standing on much. Many of these businesses are enormously profitable and are not going to throw up their hands and walk away because of a regulatory change. Plenty of rules have made plenty of industries more expensive or complicated, but few industries have imploded under the weight.

Comment sections on news articles? Good riddance. It isn't as though people had a right to mass-mail, for free, their opinions to everyone who reads the New York Times.

Reviews will go away? How do you figure? Reviews exist in plenty of traditional media outlets. Reviews are, at their core, opinion, and thus protected speech.
1/28/2021 11:03:59 AM
1/29/2021 5:53:29 PM
2/2/2021 1:50:53 PM
2/4/2021 1:00:37 AM
2/4/2021 1:01:09 AM
2/4/2021 9:52:03 AM
...
(c) Protection for “Good Samaritan” blocking and screening of offensive material

(1) Treatment of publisher or speaker
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

(2) Civil liability
No provider or user of an interactive computer service shall be held liable on account of—
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).
...
2/4/2021 1:24:48 PM
I'm backing off of this one. I still think 230 needs changes, but in the course of arguing for that I've backed myself into a corner of advocating, or seeming to advocate, its obliteration without any replacement, which is not a position I can defend.

Section 230 has done a number of important things that I do not want to destroy. It has also inadvertently created an environment in which the services it protects now incubate an existential threat to the liberal democratic system. I find it difficult to believe that there is no way to reduce the damage while preserving the benefit.

[Edited on February 4, 2021 at 3:19 PM. Reason : ]
2/4/2021 3:18:49 PM
Belongs in other thread[Edited on November 9, 2024 at 11:12 PM. Reason : ]
11/9/2024 11:02:30 PM
Also, is it just me, or did that seem AI-generated as fuck
11/9/2024 11:03:37 PM