How does the logic follow that:
Choosing to remove some speech -> be legally responsible for all content under platform?
That would be like saying a business owner who kicked out a drunkard is now legally responsible for any criminal activity performed in his store (or even committed using his products).
Let's have a discussion.
They 'choose to remove some speech' in the same way that major record labels 'choose to allow some speech'. They screen literally everything and only permit that which is politically neutral, favourable or profitable to them. They SHOULDN'T be legally responsible for anything which happens on the platform, if they were to treat all the content agnostically like a phone or mail institution would, but when they deliberately foster only certain flavours of speech then it makes sense that they become responsible for that which remains.
If AT&T were in the habit of cutting off your phone calls every time you insulted Mohammed, I think it wouldn't be unreasonable for them to be investigated further for any jihad attack that was successfully planned through their platform.
When a record company publishes a record, it does so under their name and the record is effectively theirs. Therefore they would be responsible if, for example, a record they published encouraged criminal activities. I do not see it the same way when a website (or a business) allows posts from the public, then picks and chooses whichever ones they don't like and removes those. It is not the same in scope (record companies typically don't publish many millions of records), and it is not the same in principle.
We've already established that they remove speech because they don't like it. That part we agree on.
How does it follow that they now own what speech remains?
I do think phone companies should have the option to remove customers for any reason. I would also financially support those companies' competitors, because this is a behavior I disagree with. Nevertheless, they should have that right. Just like private business owners should have the right to do or not do business with whomever they wish, hire or not hire anyone they wish, bake or not bake cakes, for any and no reason whatsoever.
Once you engage in editorial oversight, i.e. removing content for arbitrary reasons, you become a publisher, not a platform. And publishers have legal obligations to police their content and can be punished for not doing a good enough job.
This effectively boils down to 'the law is so and so.' I am not interested in debating what is currently in the law books, but rather how things ought to be.
No, once you are aware of it you should do what is reasonably possible to remove said illegal activities from your premises and turn the perpetrators over to the police.
Facebook already runs 100% of user-generated content through their algorithms, and uses that analysis to decide whether each communication is allowed to happen or not.
The phone company does not screen your calls to decide whether you are allowed to speak.
"Private companies can do whatever they want without repercussions!" Spoken like a real liberal.
It’s moot anyway. Section 230 was amended in 2018 to exclude sex trafficking, so Facebook has zero protection from civil liability for this type of conduct under the law now.