How does the logic follow that:
Choosing to remove some speech -> being legally responsible for all content on the platform?
That would be like saying a business owner who kicks out a drunkard is now legally responsible for any criminal activity carried out in his store (or even committed using his products).
Let's have a discussion.
Once you engage in editorial oversight, i.e., removing content for arbitrary reasons, you become a publisher, not a platform. And publishers have legal obligations to police their content and can be punished for not doing a good enough job.
This effectively boils down to 'the law is so and so.' I am not interested in debating what is currently on the law books, but rather how things ought to be.