SESSION OVERVIEW
- Title
- Building a Better Dispute Resolution System
- Day & time
- Friday, 11:45am
- Session link
- Building a Better Dispute Resolution System
- Speakers
- Trevor Bolliger & Sydney Poore
- Note takers
- hersei
DISCUSSION & QUESTIONS
Detection, Prevention
What measures are being used to prevent disputes?
- Blocking
- Page protection
What other measures could be used in the future?
How can we guide newcomers to the pages/places where they can more fruitfully dive into the (controversial) topic that made them click the edit button in the first place? The same editor starting out on the lead of Global Warming vs. a stub in the same topic area will have very different outcomes. Can we steer early edits away from the dispute hotspots without telling users to steer clear of the entire topic?
Reporting
On your home wiki, where do users currently report disputes?
- It is difficult to define disputes at all. Some disputes, such as two people disagreeing on the name of an article, are correctly discussed on talk pages; others, where one party starts calling the other names or attacking them based on who they are, fall more into what we want to talk about here.
- Biggest problem -- conflicts about content turn into personal insults
- It's difficult to find the line, many variables
- Need to find when is the right time to react
- There's a thin line between "your idea is stupid" and "you are stupid"; some people perceive the former as the latter
- Both of the people think of themselves as the victim -- defense mechanisms kick in and people quit
- "unblockables" -- we know that this person is problematic but don't want them to stop contributing
- AbuseFilter: people wouldn't agree to it, and it's too complicated to write a filter that enforces an interaction ban
- Admins are afraid to block because they know there will be controversy
- Difference between "vandalism" and "harassment" -- we've generally treated problems directed at user pages as vandalism, when it's actually harassment
Are these appropriate places to report disputes?
- Enwiki - multiple places, poorly defined, open noticeboards. Mix of dispute resolution, behaviour, and content. A maze of noticeboards and procedures. Very confusing. Reports are rejected because of process problems. WikiLawyering is rampant; users must wikilawyer to get a result, so people feel it is useless to try. Language used is very aggressive - cursing, insults ON the PAGE!
- He.wiki - village pump, central noticeboard, requests for admins. New users don't know who to talk to
- Bg.wiki - Clear place, but sometimes requests go directly to admins
- Eu.wiki - No specific place, usually D.R. is just a short discussion
- Cross-wiki: no real place, possibly meta, but nobody to handle them.
- Affiliates: no place to do this
- Small wikis: there is a place to report, but they lack the capacity to respond with a variety of remedies. For example, they don't have resources for interaction bans, so they're often stuck with the ban-or-don't-ban option in the software.
Are these reports responded to? Ignored?
- People who respond quickly, often not admins, deal with problems badly and without context
- ANI discussions are shallow, quick replies, low quality. Aggressive comments are common. Problems can be rejected for strange reasons
- Edit conflicts are a BIG PROBLEM on busy noticeboards
- Speed is an advantage
- Quote: "I would rather wait 2 months for a good review than get a bad one in two minutes"
- "Revenge" reporting happens (reporting party is reported themselves by the reported party)
- Archiving without closure happens often. This is sometimes used as a reason to call the problem "solved" or "stale"
- German: reports are often ignored. People have different levels of thin or thick skin; someone who complains about being called names is often told that it doesn't rise to the level of a block and that there's nothing else to do, so that's that.
What type of information is requested in these reports?
- There is no standard form. It can be difficult to tell who started things (for example, when looking into an interaction ban, investigating who is approaching whom, and when, can be complex and span multiple pages and spaces).
What type of information is commonly omitted?
- Background information
- Collection of links
What type of information would you like to see included?
- Some kind of list of all interactions between two users over a period of time, no matter what pages they occur on. Possibly even across multiple wikis in some cases. (A rough sketch of such a timeline follows below.)
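One way to picture this idea: a minimal sketch, assuming the standard MediaWiki action API and its usercontribs list, that merges two editors' contributions into a single chronological timeline restricted to pages both of them have edited. The wiki URL and usernames are placeholders, and a real tool would also need to handle pagination, talk-page threads, reverts, and cross-wiki activity.

```python
# Minimal sketch: merge two editors' contributions into one chronological
# "interaction" timeline, keeping only pages that both of them have edited.
# Uses the public MediaWiki action API (list=usercontribs); the wiki URL and
# usernames are placeholders, and API pagination is ignored for brevity.
import requests

API = "https://en.wikipedia.org/w/api.php"

def contributions(user, limit=500):
    """Fetch up to `limit` recent contributions for one user."""
    params = {
        "action": "query",
        "list": "usercontribs",
        "ucuser": user,
        "uclimit": limit,
        "ucprop": "title|timestamp|comment",
        "format": "json",
    }
    resp = requests.get(API, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()["query"]["usercontribs"]

def interaction_timeline(user_a, user_b):
    """Chronological list of both users' edits on the pages they share."""
    a, b = contributions(user_a), contributions(user_b)
    shared_pages = {c["title"] for c in a} & {c["title"] for c in b}
    merged = [c for c in a + b if c["title"] in shared_pages]
    # ISO 8601 timestamps sort correctly as plain strings.
    return sorted(merged, key=lambda c: c["timestamp"])

if __name__ == "__main__":
    for edit in interaction_timeline("ExampleUserA", "ExampleUserB"):
        print(edit["timestamp"], edit["user"], edit["title"], edit["comment"])
```

Restricting the merged list to shared pages is only one possible definition of "interaction"; user-talk messages and thanks/reverts would need their own queries.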
Who responds to these dispute reports?
- All sorts of people. Sometimes admins, sometimes just bored people, or trolls
- Sometimes people or admins who are involved in the problem control the conversation.
- On large wikis it can be ArbCom, while on small wikis it tends to be whichever admins are available.
How is it decided who investigates cases?
- Often resource and time based. Especially on smaller wikis, there are fewer admins and it is whoever has time and is able to respond.
Which types of reports are made publicly? Which are made privately?
- Most are made publicly
Should these be private or public? Or a mix of both?
- Public reports risk backlash
- Private reports lack community oversight
- Public by default, escalate to private when appropriate?
- One user uses email to admins directly because they don't want to be humiliated publicly. But this can lead to pressure on the admins.
- Should be public ideally, but in reality we lose people because of privacy violations.
- Email is problematic as personal addresses are revealed upon reply, unless people are careful.
- People tend to react negatively to public consequences like blocking, often against the people deciding against them, even if the dispute itself is private. This can lead to calls for transparency so people can defend themselves privately.
Who should be able to view private reports?
- Ideas: the discussion is not immediately open for comment (a cool-down period). Could be optional, chosen by the reporter.
- Admins often have too much work, or didn't sign up to be moderator.
Asymmetric disputes (between newcomers and experienced editors)
- How to deal with the very common situation of newcomers perceiving harassment when experienced editors (correctly) revert their (good faith) contributions
- Making dispute/harassment reporting usable and discoverable for newbies may open a firehose of reports
- Potentially a major problem for experienced editors if the burden is on them to demonstrate that they, the reverters, are *not* harassing but simply enforcing policy / community norms
- Maybe a key is making it easy to do gentle corrections of the common newbie mistakes?
Remedies
What technical remedies are currently being used as enforcement?
- Blocks
- Blocks should not be used:
- in retaliation against users;
- to disparage other users;
- as punishment against users;
- or where there is no current conduct issue of concern.
- AbuseFilter
Are these effective?
- Sort of?
- The most difficult thing is people who go just up to the line and step back before the block -- "the unblockables" — people who are so productive that admins aren't willing to "punish" them with blocks
What are the shortcomings of these tools?
- Users who don't care about their status may simply create a new account
What social remedies are used?
- Bans -- interaction bans, topic bans
- Warnings and cautions (talk page)
- "smacks of optimism" that the person would actually change
- Talking about the issue on ANI
Are these effective?
- ANI discussions, to the extent that they deal with underlying issues, usually don't help solve those issues and instead only address individual manifestations of them as they relate to a single user/group of users
Potential other tools/methods
- We need the ability to slow down how fast someone is editing: limit someone's posts per day on a specific (talk) page. Rate limiting as a way of preventing users from getting caught up in the moment. (A rough sketch of the idea follows after this list.)
- This may be possible with AbuseFilter but maybe not that cheap
- Assuming filters were cheap, we could still end up with a *lot* of filters targeting a lot of individual users, and it may be difficult to keep track of them.
- Filter categorization?
- Edit filters might be too general a tool; a moderation-specific solution might be warranted
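As a language-neutral illustration of the rate-limiting idea above (not a description of how AbuseFilter actually works), here is a minimal sketch of a per-user, per-page daily post limit. The limit, the in-memory store, and the function name are invented for the example; a production version would need persistent storage and an integration point such as an edit filter or extension hook.

```python
# Minimal sketch of the "limit posts per day on a specific page" idea:
# an in-memory, per-user, per-page counter over a rolling 24-hour window.
# The limit and storage are illustrative; a real MediaWiki implementation
# would more likely live in an edit filter or an extension hook.
from collections import defaultdict
from datetime import datetime, timedelta, timezone

DAILY_LIMIT = 3                     # hypothetical cap on posts per page per day
WINDOW = timedelta(hours=24)

# (user, page) -> timestamps of that user's recent posts to that page
_recent_posts = defaultdict(list)

def allow_post(user, page, now=None):
    """Return True and record the post if `user` may still post to `page`."""
    now = now or datetime.now(timezone.utc)
    key = (user, page)
    # Keep only timestamps inside the rolling window.
    _recent_posts[key] = [t for t in _recent_posts[key] if now - t < WINDOW]
    if len(_recent_posts[key]) >= DAILY_LIMIT:
        return False                # over the limit: ask the user to slow down
    _recent_posts[key].append(now)
    return True
```

An AbuseFilter with per-filter throttle settings might approximate this, but, as noted above, doing it per user could mean many filters that are hard to keep track of.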
- Potentially use a warning before save that would discourage an edit but not prohibit the publish
- Use "pending changes" for a user group of troublemakers -- are you approving content or behavior?
- problem is that the page still gets changed, asynchronous "edit conflicts"
- Problem is branching versions; our "version control system" doesn't support that
- Deferred Changes: https://en.wikipedia.org/wiki/Wikipedia:Deferred_changes may be of help here
- Pending changes level 2 https://en.wikipedia.org/wiki/Special:PermaLink/758665364
- Post-conflicts -- what happens to quitters, people who leave?
- We have sticks, no carrots? How can we—should we—"reward" the victims of bad behaviour?
- Is there a way to create a note that would say "this user has been involved in conflicts"?
- Public shaming, or does the public nature of it make everything worse?
- Does it become a "badge of honor" for trolls?
- Admins/checkusers may prefer to keep track of sockpuppets via a category. On enwiki this is done by putting a Template:Sock template on the user page; however, sometimes the socks may find this gratifying
- Useful as moderation tool, though? Perhaps if restricted to users in "moderation" roles (not necessarily admins) it could reduce workload by documenting "sub-block" troublemaking
- Warnings and blocks are never automatic -- could they be?
- 3RR -- this could get enforced or flagged automatically
- people would try to get around it by adding a comma in between edits -- although you could factor that into the design -- won't be perfect but it could prevent some incidents
- Do this as a warning message before the 3rd revert? A caution that you're about to violate a policy (rough sketch below)
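Purely as a sketch of how such a pre-save warning could be triggered (this is not an existing MediaWiki feature), the check below counts a user's recent reverts to one page from a simplified, assumed edit feed; the revert heuristic, field names, and window are illustrative.

```python
# Minimal sketch: warn before a potential 3RR breach by counting how many
# reverts a user has made to one page in the last 24 hours. Spotting a
# "revert" from the edit summary is a rough heuristic; MediaWiki's revert
# change tags (mw-undo, mw-rollback, mw-manual-revert) would be more reliable.
from datetime import datetime, timedelta, timezone

REVERT_MARKERS = ("undid", "revert", "rv ", "rvv")

def looks_like_revert(summary):
    return any(marker in summary.lower() for marker in REVERT_MARKERS)

def should_warn(edits, user, page, now=None, window_hours=24, threshold=3):
    """True if `user`'s next revert on `page` would be revert number `threshold`.

    `edits` is an iterable of dicts with 'user', 'page', 'timestamp'
    (timezone-aware datetime) and 'summary' keys -- an assumed edit feed.
    """
    now = now or datetime.now(timezone.utc)
    window = timedelta(hours=window_hours)
    recent_reverts = [
        e for e in edits
        if e["user"] == user
        and e["page"] == page
        and now - e["timestamp"] <= window
        and looks_like_revert(e["summary"])
    ]
    # Warn once the next revert would hit the threshold (the 3rd by default).
    return len(recent_reverts) >= threshold - 1
```

As the notes say, people could game any such counter (e.g. by adding a comma between edits), so a design like this could only catch some incidents, not all.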
- A private wiki just for admins, to help communicate more privately about troublesome users?
- you can't document sockpuppet investigations publicly
- This could create a social gap between admins and other users if there's a private "admin wiki"
- should we have a system to help people contact people privately?
- there's no system to report problems
- public reports risk backlash!
- Emailing admins is one way to contact them privately about harassment; then they can block the user(s) involved
- Admin emails do risk a lack of documentation about the process behind an admin action; accountability
- a new user role? -- "moderators"
- key difference would be between *technocratic* admins and *socially empowered* moderators
- A help desk! Something similar to the Teahouse, or any kind of link or software to nudge newbies toward better templates or better guides for onboarding.
- Greater control for users themselves to control their spaces. Muting, preventing someone from thanking them for edits, or controlling who can post on their talk page.
- Enforceable interaction bans?
- Technical support for remedies that can be done only by big wikis right now. Agree with the person above about technical support for interaction bans so that admins on small wikis can use them, and perhaps for specific topic or word bans as well, which would prevent users from posting certain things. (A rough sketch of interaction-ban enforcement follows below.)
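For illustration only, here is a minimal sketch of what "technical support for interaction bans" might check: flag (rather than hard-block) an edit by one banned party that touches the other party's user pages or a page the other party edited recently. The edit-feed shape, page-name prefixes, and the two-week window are assumptions.

```python
# Minimal sketch: flag an edit that may breach an interaction ban between two
# users. `edit` and the items of `recent_edits` are dicts with 'user', 'page',
# and 'timestamp' (datetime) keys -- an assumed, simplified edit feed.
from datetime import timedelta

def violates_interaction_ban(edit, other_party, recent_edits,
                             window=timedelta(days=14)):
    """True if `edit` touches the other party's space or their recent pages."""
    page = edit["page"]
    # Direct contact: the other party's user and user-talk pages.
    if page in (f"User:{other_party}", f"User talk:{other_party}"):
        return True
    # "Following" the other party to pages they edited within the window.
    cutoff = edit["timestamp"] - window
    return any(
        e["user"] == other_party
        and e["page"] == page
        and e["timestamp"] >= cutoff
        for e in recent_edits
    )
```

A flag-and-review approach avoids the false positives a hard block would cause on pages with broad participation; whether to warn, tag, or prevent the edit is exactly the policy question raised in the session.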
Training
- harassment training would be good for all admins to have
- If we build a new tool, should we require people to get training in order to use it? Training in how to be a better listener and how to evaluate the situation.
- Requiring people to get particular training would have to be accepted by each community, even if it is developed via the WMF
- Ukrainian wiki did a training on conflict resolution -- helpful for the people who went, but the people who provoke conflict didn't take the training!
- Could be required for social "moderators" if created as a new user group