This site is based on Reddit, which has a large number of bots controlling narratives, so it suffers many of the same risks.
Upvoting is very powerful. It determines which posts reach rising or hot, where most users see them. It also affects which replies people see, although simply being the first to reply, or the first to reply to the top reply, can do that without bots. Most users who read comments will upvote the topmost comment but don't bother scrolling down to upvote later ones.
An AI that reads new posts, or a few cheap staffers who upvote or downvote them, would be effective.
I'm not aware of .win having public API access, so building complex bots is harder, but not impossible. That includes auto-moderators like Reddit has. Subreddits also censor by requiring that new posters have their posts approved.
Someone once posted a list of users active across sections of the .win network, so that kind of profiling is possible, although not everything they posted was accurate.
Why waste your time on a bot farm? Have a user with admin power pick what you want to give 10,000 upvotes and post it on a sock account ready to manipulate. Indian phone farms are what poor people use to manipulate online media. People with real money and power tell the guy doing it for free what to post, then send him a Mario lunch box signed by Chris Pratt.
Do some accounts upvote the same posts at the same time beyond a safe percentage? Are some accounts posting identical posts? Does a post include a link to a known scam site?
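The first of those checks can be sketched in a few lines. This is a minimal illustration, not a production detector: the function name, the `overlap_threshold` and `min_votes` knobs, and the vote-event format are all made up for the example.

```python
from collections import defaultdict
from itertools import combinations

def flag_coordinated_voters(votes, overlap_threshold=0.8, min_votes=20):
    """Flag pairs of accounts whose upvote histories overlap suspiciously.

    votes: list of (account_id, post_id) upvote events.
    Returns (account_a, account_b, overlap) tuples above the threshold.
    """
    by_account = defaultdict(set)
    for account, post in votes:
        by_account[account].add(post)

    flagged = []
    for a, b in combinations(by_account, 2):
        va, vb = by_account[a], by_account[b]
        if min(len(va), len(vb)) < min_votes:
            continue  # too little history to judge either account
        # Fraction of the smaller account's votes shared with the other.
        overlap = len(va & vb) / min(len(va), len(vb))
        if overlap > overlap_threshold:
            flagged.append((a, b, overlap))
    return flagged
```

A real system would also weight by time (votes landing within seconds of each other), since two regulars on a small forum will naturally overlap a lot over months.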
Flag for review (not suspension) if an account keeps posting links with the same text. This can detect unknown scam sites or referral links. You're not looking for someone who posts multiple links to the same news site, but if they exclusively post links from that site, it could mean they're affiliated with it.
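That heuristic might look something like the sketch below. The function, its thresholds, and the `(url, title)` input shape are all hypothetical; the point is only that "nearly all links go to one domain, or reuse the same text" is easy to compute per account.

```python
from collections import Counter
from urllib.parse import urlparse

def should_review(link_posts, min_posts=5, dominance=0.9):
    """Flag an account for human review (not suspension) when nearly all
    of its link posts point at one domain, or reuse identical post text.

    link_posts: list of (url, title_text) pairs posted by one account.
    min_posts and dominance are illustrative tuning knobs.
    """
    if len(link_posts) < min_posts:
        return False  # not enough history to judge
    domains = Counter(urlparse(url).netloc for url, _ in link_posts)
    titles = Counter(title for _, title in link_posts)
    top_domain_share = domains.most_common(1)[0][1] / len(link_posts)
    top_title_share = titles.most_common(1)[0][1] / len(link_posts)
    return top_domain_share >= dominance or top_title_share >= dominance
```

Keeping the output as "review" rather than "suspend" matters: a superfan who only ever links one news site would trip this too, which is exactly why a human looks before anything happens.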
Accounts with AI or people behind them are much harder to detect.