In fact, this proposal is pretty much the polar opposite of what we should want - they are saying "Yeah, as long as you play ball and do what we say, you're safe," codifying the current de facto system of control.
I disagree. I don't see why any platform should be liable for their users' postings. Those posts belong to the user. The law that gets paired with 230 in these discussions (the DMCA, I believe - a separate statute, not part of 230 itself) is what enables copyright strikes. That system is widely believed to suck, so by all means replace it. But I think that protection for websites like this one is good.
Almost everyone moderates, so almost no one can be anything other than a publisher by your definition. And that's fine. I don't see the need for a distinction between platform and publisher. Just make the person responsible for content the person who generated it, and if you need some parallel copyright enforcement mechanism, figure one out.
The obvious issue with holding users responsible is what to do with anons. I think not much. They are still responsible for their posts, but we must acknowledge that a large number of untraceable posts will be made. Some mechanism must exist so that even though an anon cannot be held responsible, his lawbreaking posts can still be taken down. And you do this without punishing platforms that comply.
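DMCA-style notice-and-takedown already roughly models this. Here is a minimal sketch of the safe-harbor logic I have in mind, in Python; every name in it (TakedownNotice, handle_notice, the 24-hour window) is hypothetical, not any real statute or API:

    from dataclasses import dataclass, field
    from datetime import datetime, timedelta

    @dataclass
    class TakedownNotice:
        post_id: str
        claim: str            # e.g. "copyright" or "illegal content"
        received_at: datetime

    @dataclass
    class Platform:
        posts: dict = field(default_factory=dict)   # post_id -> content
        window: timedelta = timedelta(hours=24)     # compliance window; the number is made up

    def handle_notice(platform: Platform, notice: TakedownNotice, now: datetime) -> bool:
        """Remove the post and report whether the platform kept its safe harbor.

        Liability stays with the (possibly anonymous, untraceable) poster;
        the platform answers only for whether it complied promptly."""
        platform.posts.pop(notice.post_id, None)    # take it down even if the poster is anon
        return now - notice.received_at <= platform.window

The point of the return value is that compliance, not the mere existence of the post, is what the platform gets judged on.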
The guy is talking about "search results" and what makes it to trending, not so much the content itself. The user is always responsible for his own content, but why shouldn't the publisher also be responsible if they have proven they had the power to filter it out with mysterious algorithms and didn't? Facebook/Twitter/Reddit are totally responsible for any CP traded there if they don't do anything to remove it as soon as possible.
> why shouldn't the publisher also be responsible if they have proven they had the power to filter it out with mysterious algorithms and didn't?
The details seem like a nightmare to regulate. What happens if they have a CP filter, but an item of CP makes it through? Are they then hit with the penalty for CP distribution? "As soon as possible" sounds impossible to litigate. I like CP as an example because Twitter doesn't want to distribute that (ostensibly). This isn't a matter of viewpoint filtering. This is a matter of law enforcement.
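For what it's worth, the structure of these filters is exactly why something will always make it through: they match uploads against hashes of already-identified material, so novel content has no entry to match. A rough sketch in Python (real systems like PhotoDNA use perceptual hashes that survive re-encoding; exact SHA-256 here is my simplification):

    import hashlib

    def sha256_hex(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    # Hypothetical blocklist seeded with hashes of already-identified material.
    known_bad = {sha256_hex(b"previously identified content")}

    def filter_upload(data: bytes) -> bool:
        """Return True if the upload should be blocked."""
        return sha256_hex(data) in known_bad

    assert filter_upload(b"previously identified content")   # caught: hash is in the list
    assert not filter_upload(b"never-seen-before content")   # novel item slips through

So a filter can be working exactly as designed and still pass new material; the legal question is whether that counts as "not doing anything".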
Having thought about it some more, perhaps there is more than one element of regulation here. One goal is that you want pirated and otherwise illegal material taken down. The other, I think, is that you want some fairness in moderation - a lack of viewpoint discrimination. You can address these separately.
Section 230 needs to be FIXED. Revoke Section 230 protection from entities that edit/publicize their search results. Specify PLATFORM vs PUBLISHER.
Don't fucking use the "for the children" excuse - if you're going to fuck with Section 230, FUCKING FIX IT!