Sorry for the late reply - this got stuck in my drafts...
@bluewavenet I looked further at that Wiki page and I see what you mean. I believe you are deeply knowledgeable about this stuff, so I tend to accept what you say.
However, I'm also concerned about the greater policy issue of plain wrong info - whether AI-generated or not. (If people are "just sayin' stuff" on the Forum, there's a mechanism for shutting them off.) But I don't think we have a policy for the Wiki.
I decided to see what the experts do, so I asked ChatGPT to summarize, "How does Wikipedia moderate contentious topics?" (The irony of using ChatGPT to get this summary is not lost on me.) I pasted the response at the end.
There aren't any surprises here: Everyone should be guided by what's true; People should follow Rule 12 ("Be nice to each other"); There's a dispute resolution process that involves the parties talking directly (that I don't believe exists today for the Wiki); Temporary or permanent blocks if necessary; etc.
This is the first time I'm aware of this kind of problem on the Wiki. (I wouldn't necessarily notice: I'm not an admin there, and only have deep knowledge of a couple of subjects.)
I think you have taken a good first step: cloning the article with the correct info, and posting a note on the other article saying, "This contains unreliable information." I will ping the Wiki admins to see what dispute resolution policies and mechanisms are available.
Thanks
How does Wikipedia moderate contentious topics?
ChatGPT says: Wikipedia moderates contentious topics through a combination of policies, guidelines, and enforcement mechanisms, including:
1. Protection Mechanisms
Page Protection: Pages related to contentious topics (e.g., politics, religion, current events) may be:
Semi-protected (only autoconfirmed users can edit).
Extended-confirmed protected (only users with a higher level of trust can edit).
Fully protected (only administrators can edit).
Pending Changes Protection (edits from unregistered or new users must be reviewed before being published).
2. Content Guidelines & Neutrality
Neutral Point of View (NPOV): Content must be balanced, representing all significant viewpoints fairly.
Verifiability: Claims must be backed by reliable, third-party sources.
No Original Research: Wikipedia does not allow unpublished interpretations or personal opinions.
3. Dispute Resolution
Talk Pages: Editors discuss content disagreements on the article’s talk page.
Request for Comment (RfC): A broader community discussion can be initiated for disputes.
Third-Party Mediation: Experienced editors may mediate conflicts.
Arbitration Committee (ArbCom): The highest dispute resolution body, used for serious cases of editor misconduct or long-standing disputes.
4. Editor Conduct Rules
Civility Policy: Editors must remain respectful.
Assume Good Faith: Editors should not assume malicious intent without evidence.
Edit Warring Rules: Repeatedly undoing others’ edits (revert warring) can lead to temporary blocks.
5. Sanctions & Blocks
Topic Bans: Users may be restricted from editing certain topics.
Blocks & Bans: Violators of Wikipedia policies can be blocked from editing.
Sockpuppet Investigations: Wikipedia monitors for users creating multiple accounts to push a viewpoint.
6. Automated & Human Monitoring
Wikipedia Bots: Some bots revert vandalism automatically.
Such a process does exist in the Forum (it "comes for free" with the Discourse forum software), but not in the Wiki. The Wiki has been built on a culture of deep trust in the people who post there.
I would like to maintain that culture, but come up with clearly stated policies for keeping the content up-to-date and rigorously correct, and enforcing those policies as needed.
Wikipedia seems to have been affected by not following its own NPOV guidelines in the last few years, although tech pages still seem OK for the most part.
As an interesting comparison of the quality of AI output, here is what Grok has to say; it seems much more detailed:
How does Wikipedia moderate contentious topics?
Wikipedia employs several mechanisms to moderate contentious topics, ensuring that the content remains neutral, factual, and representative of multiple perspectives. Here's an overview of how they handle such topics:
Policies and Guidelines:
Neutral Point of View (NPOV): This is one of Wikipedia's core content policies, requiring that articles and other content represent viewpoints fairly, proportionately, and without bias.
Verifiability: Information must be backed by reliable sources to prevent disputes over facts.
No Original Research: Editors are not allowed to add their own interpretations or unpublished facts; all content must be sourced from reliable, published material.
Community Involvement:
Editing and Discussion: Contentious topics often attract a lot of attention from editors. Discussions take place on talk pages where editors can debate the content, sources, and neutrality of articles.
Consensus: Changes to controversial topics usually require a consensus among editors. This process can be lengthy but aims to ensure that the content reflects a balanced view.
Administrative Tools:
Protection: Articles that are subject to edit wars or vandalism can be "protected," limiting editing rights to more experienced users or admins for a period.
Arbitration Committee (ArbCom): For extremely contentious issues, Wikipedia has an Arbitration Committee. This group hears disputes and can impose sanctions or make decisions on how a topic should be handled, including bans on editing certain topics for involved parties.
Community Sanctions:
Discretionary Sanctions: For subjects known to be contentious, Wikipedia can implement discretionary sanctions where admins can act more freely to prevent disruption, including reverting edits or blocking users.
Mediation and Dispute Resolution:
Mediation Committee: This committee helps in resolving disputes by facilitating discussions between editors.
Third Opinion: For smaller disputes, an uninvolved editor can offer a third opinion to help resolve simple disagreements.
Monitoring and Oversight:
Watchlists: Many editors keep contentious articles on their watchlists to monitor changes and revert vandalism or biased edits quickly.
Bots and Tools: Automated tools and bots help in maintaining article integrity by flagging potential issues or enforcing style guidelines.
Education and Training:
Editor Training and Workshops: New and existing editors are encouraged to participate in training to understand Wikipedia's policies better, particularly around handling contentious material.
Sometimes, especially in areas like science or history, Wikipedia might rely on experts for input, although this is not formal peer review but more of an informal consultation.
These mechanisms collectively aim to manage the tension around controversial subjects by fostering a collaborative, transparent, and rule-based environment. However, the process is not without its challenges, as achieving true neutrality can be subjective, and the dynamics of large, open communities can lead to complex interactions and disputes.
All well and good, but this is all far too complex for the effectively unmoderated OpenWrt Wiki.
I stand by my first suggestion that the Wiki can be 99% self-moderated by accepting edits via the OA/Timeout method, i.e. an edit generates a notification to the OA (Original Author); the OA can accept the edit or not, giving the reason and hopefully suggesting alterations. If the OA fails to respond within a timeout period, the edit is accepted anyway.
This will stop "Edit Warring" from ever occurring.
The notification part is already functional.
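To make the proposed flow concrete, here is a minimal sketch of the OA/Timeout decision rule. All names and the 14-day window are hypothetical illustrations, not an actual DokuWiki or Discourse API:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# Illustrative review window; the actual timeout would be a policy decision.
REVIEW_TIMEOUT = timedelta(days=14)

@dataclass
class PendingEdit:
    page: str
    editor: str
    submitted: datetime
    oa_decision: Optional[str] = None  # "accepted" or "rejected", set by the OA

def resolve(edit: PendingEdit, now: datetime) -> str:
    """Return the edit's state under the OA/Timeout rule."""
    if edit.oa_decision is not None:
        return edit.oa_decision          # OA responded in time; their call stands
    if now - edit.submitted >= REVIEW_TIMEOUT:
        return "accepted"                # OA silent past the timeout: edit goes live
    return "pending"                     # still within the OA's review window

# Example: an edit awaiting review, then auto-accepted after the timeout.
edit = PendingEdit("docs/guide-user", "newcontributor", datetime(2024, 1, 1))
print(resolve(edit, datetime(2024, 1, 5)))   # pending
print(resolve(edit, datetime(2024, 1, 20)))  # accepted
```

Because the OA's explicit decision always takes precedence over the timeout, an edit can never be silently overridden while the OA is actively reviewing it, which is what prevents edit warring.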
Could we use the current badging system going forward for all new requests for wiki edit privileges?
All wiki edit privileges must already be granted by the moderator team. We create an interactive certification tutorial that lays out the tools, responsibilities, rules, and protocols expected, as well as the consequences of any violations, before granting said privileges.
The OA simply informs the moderator team, and violators' privileges are suspended.
Certified Wiki Editor
This badge is granted upon successful completion of the interactive wiki editor tutorial.
You have taken the initiative to learn the basic tools, responsibilities, protocols, and requirements expected of you as a wiki editor, and now you're certified!