Sorry if this is not exactly on topic.
I was wondering what the moderation privileges look like now, ESPECIALLY for the timed comments that we can't flag as inappropriate.
I have noticed some offensive comments floating around. Is it possible to report these somewhere?
I’ve created a new thread. In the future, if you cannot find an appropriate thread to post in, please just start a new one.
As for timed comments, they can be deleted by managers and moderators of that channel, as before. You're probably best off reporting the comments to them, or perhaps asking the manager to add you as a moderator to help with it.
What @xomachi says is right. Timed comments can be deleted by the moderators and CMs of the channel, and normally they are checked from time to time, but it's a lot of work to keep checking everything, especially when there are a lot of episodes or the series has stopped airing. So if you find anything offensive in the timed comments, notify a mod/CM about it. Please also include the series, the episode, and maybe the username/time so the CM/mods can find the comment more easily.
Pros:
- No worries about misspelling.
- No need to track every comment with a screenshot or remember the exact time.
- Can search by username or by the offensive comment itself.
- Can report multiple comments on the same video easily.
- Don't have to copy the video link separately.
Viki should have built a moderation tool by now that can remove things quickly through their API, and maybe even let channel managers or moderators use it. They have everything they need at their disposal. A report button for each timed comment couldn't hurt either; the time, video, and user ID would be included automatically.
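Just to illustrate, here is a minimal sketch of the kind of payload such a report button could send. The endpoint URL and field names are assumptions of mine for the example, not Viki's actual API.

```python
import requests

# Hypothetical report sent when a viewer clicks "report" on a timed comment.
# The endpoint and field names are made up; Viki's timed-comment API is not
# public, so this only shows the data the button would need to capture.
def report_timed_comment(video_id: str, comment_id: str,
                         reporter_id: str, time_sec: int, reason: str) -> None:
    payload = {
        "video_id": video_id,        # which episode the comment appears on
        "comment_id": comment_id,    # the offending timed comment
        "reporter_id": reporter_id,  # who clicked "report"
        "time": time_sec,            # playback position in seconds
        "reason": reason,            # e.g. "offensive language"
    }
    # Hypothetical endpoint, for illustration only.
    resp = requests.post("https://example.com/api/timed_comment_reports",
                         json=payload, timeout=10)
    resp.raise_for_status()
```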
If I remember correctly, they have to go to the show page, open up the moderation page, select the episode, and then search for the offending comment.
The people reporting might not give all the details (username, time, comment, or which specific video, with URL).
What else would be better?
Automating as much as possible. A report button for each comment would be ideal; the moderator would then accept or reject the deletion. A separate interface covering all the shows a volunteer is moderating would also help, so they don't have to jump around. If it's a busy time and they're getting 15 reports per show per week, then across all their shows that's a lot of time wasted just navigating to show pages. In lieu of that, a tool that parses the API reports, verifies the integrity of the report, verifies permissions, and then deletes or keeps the comment would work (a rough sketch follows).
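Purely as a sketch of that verify-then-act loop, assuming reports and comments come back as plain dictionaries (the field names and helper functions here are hypothetical, not Viki's real API):

```python
from typing import Callable, Optional

def triage_reports(reports: list[dict],
                   fetch_comment: Callable[[str, str], Optional[dict]],
                   moderator_channels: set[str],
                   delete_comment: Callable[[str, str], None]) -> None:
    """Walk through pending reports and remove the comments that check out."""
    for report in reports:
        # Verify permissions: the volunteer must actually moderate this channel.
        if report["channel_id"] not in moderator_channels:
            continue
        # Verify the integrity of the report: the comment must still exist
        # and sit at the reported time on the reported video.
        comment = fetch_comment(report["video_id"], report["comment_id"])
        if comment is None or comment.get("time") != report.get("time"):
            continue
        # A real tool would show the comment here and let the moderator
        # accept or reject the deletion; this sketch just deletes it.
        delete_comment(report["video_id"], report["comment_id"])
```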
In the 7 years I've been moderating, I have never received a single report for a show I was moderating. Maybe it's a coincidence, because I have reported comments occasionally myself, so I know it does happen. But from what other mods tell me, it's rather rare to get even one report for a 16-episode show. Most people, sadly, don't bother. (They might turn off Timed Comments instead, or try to cover the offensive comment with one of their own.) You could retort that this is because it's not so easy, and that if a report button made it easy, there would be more - and you'd probably be right. But not several per week, LOL!
Comments on the comment page under the show, on the other hand, get reported much more often, from what I know.