
This page is about the use of AI for quality assurance (detecting vandalism, outdated content, contradictions, etc.).

Tools for fighting vandalism

  • mw:ORES – a web service and API, maintained by the Wikimedia Machine Learning team, that provides machine learning as a service for Wikimedia projects (a minimal query sketch follows below).
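
As a rough illustration of how a client might call the ORES scoring API, here is a minimal Python sketch that requests the "damaging" probability for a single revision. The endpoint URL, model name, and response layout follow the publicly documented v3 API but are stated here as assumptions (the service has reportedly been superseded by Lift Wing), so check them against the current documentation before relying on the sketch.

import requests

def ores_damaging_score(wiki: str, rev_id: int) -> float:
    """Ask the ORES v3 API for the 'damaging' probability of one revision.

    Endpoint and response structure are assumptions based on the public
    ORES documentation; adjust if the service layout has changed.
    """
    url = f"https://ores.wikimedia.org/v3/scores/{wiki}/"
    resp = requests.get(url, params={"models": "damaging", "revids": rev_id}, timeout=10)
    resp.raise_for_status()
    data = resp.json()
    score = data[wiki]["scores"][str(rev_id)]["damaging"]["score"]
    return score["probability"]["true"]

if __name__ == "__main__":
    # Hypothetical revision ID, used purely for illustration.
    print(ores_damaging_score("enwiki", 123456789))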

The operators of ClueBot NG, an anti-vandalism bot on the English Wikipedia, describe its accuracy as follows:

„As ClueBot NG requires a dataset to function, the dataset can also be used to give fairly accurate statistics on its accuracy and operation. Different parts of the dataset are used for training and trialing, so these statistics are not biased.

The exact statistics change and improve frequently as we update the bot. Currently:

  • Selecting a threshold to optimize total accuracy, the bot correctly classifies over 90% of edits.
  • Selecting a threshold to hold false positives at a maximal rate of 0.1% (current setting), the bot catches approximately 40% of all vandalism.
  • Selecting a false positive rate of 0.25% (old setting), the bot catches approximately 55% of all vandalism.

Currently, the trial dataset used to generate these statistics is a random sampling of edits, each reviewed by at least two humans, so statistics are accurate.

Note: These statistics are calculated before post-processing filters. Post-processing filters primarily reduce false positive rate (ie, the actual number of false positives will be less than stated here), but can also slightly reduce catch rate.“
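
To make the threshold trade-off in the quote concrete, here is a small sketch on synthetic data (not ClueBot NG's actual code): it picks the lowest score threshold whose false positive rate on a labelled trial set stays at or under a cap, which is also the admissible threshold with the highest catch rate, and reports both numbers. All arrays and values are made up for illustration.

import numpy as np

def catch_rate_at_fpr_cap(scores, labels, fpr_cap=0.001):
    """Find the lowest threshold whose false-positive rate is <= fpr_cap
    and report (threshold, false-positive rate, catch rate).

    scores: vandalism probabilities from the classifier (higher = more suspicious)
    labels: 1 for vandalism, 0 for constructive edits
    """
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=int)
    negatives = max(int(np.sum(labels == 0)), 1)
    positives = max(int(np.sum(labels == 1)), 1)
    for thr in np.unique(scores):            # thresholds in ascending order
        flagged = scores >= thr
        fpr = np.sum(flagged & (labels == 0)) / negatives
        if fpr <= fpr_cap:
            # Catch rate shrinks as the threshold rises, so the first
            # admissible threshold is the one that catches the most vandalism.
            catch = np.sum(flagged & (labels == 1)) / positives
            return thr, fpr, catch
    return None

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    labels = rng.integers(0, 2, size=2_000)                      # synthetic trial set
    scores = np.clip(labels * 0.6 + rng.normal(0.3, 0.2, size=2_000), 0, 1)
    print(catch_rate_at_fpr_cap(scores, labels, fpr_cap=0.001))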

Improving content


„We show that the process of improving references can be tackled with the help of artificial intelligence (AI) powered by an information retrieval system and a language model. This neural-network-based system, which we call SIDE, can identify Wikipedia citations that are unlikely to support their claims, and subsequently recommend better ones from the web.“
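
The flag-and-recommend loop described in the quote can be sketched in a few lines. The sketch below is not SIDE itself: where SIDE uses a neural retriever and a language-model-based verifier, this stand-in scores claim–passage support with plain TF-IDF cosine similarity, and it assumes the pool of candidate web passages has already been retrieved. The function names, the threshold value, and the example data are illustrative assumptions.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def support_score(claim: str, passage: str, vectorizer: TfidfVectorizer) -> float:
    """Crude stand-in for a neural claim-verification model: cosine
    similarity between the TF-IDF vectors of claim and cited passage."""
    vecs = vectorizer.transform([claim, passage])
    return float(cosine_similarity(vecs)[0, 1])

def review_citations(claims_with_citations, candidate_passages, threshold=0.2):
    """Flag weakly supported claims and propose better-matching passages.

    claims_with_citations: list of (claim, cited_passage) pairs
    candidate_passages:    pool of passages retrieved from the web (assumed given)
    """
    corpus = ([c for c, _ in claims_with_citations]
              + [p for _, p in claims_with_citations]
              + list(candidate_passages))
    vectorizer = TfidfVectorizer().fit(corpus)
    report = []
    for claim, passage in claims_with_citations:
        score = support_score(claim, passage, vectorizer)
        if score < threshold:
            # Rank candidate replacements by how well they match the claim.
            ranked = sorted(candidate_passages,
                            key=lambda p: support_score(claim, p, vectorizer),
                            reverse=True)
            report.append((claim, score, ranked[:3]))
    return report

if __name__ == "__main__":
    pairs = [("The Eiffel Tower is 330 metres tall.",
              "Paris is the capital of France.")]
    pool = ["The Eiffel Tower stands 330 metres tall including antennas.",
            "The Louvre is the world's most visited museum."]
    for claim, score, suggestions in review_citations(pairs, pool):
        print(claim, round(score, 2), suggestions[0])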