Wikipedia’s model of collective knowledge generation has demonstrated its ability to create verifiable and neutral encyclopedic knowledge. The Wikipedian community and WMF have long used AI to support the work of volunteers while centering the role of the human. Today we use AI to support editors to detect vandalism on all Wikipedia sites, translate content for readers, predict article quality, quantify the readability of articles, suggest edits to volunteers, and beyond. We have done so following Wikipedia’s values around community governance, transparency, support of human rights, open source, and others. That said, we have modestly applied AI to the editing experience when opportunities or technology presented itself. However, we have not undertaken a concerted effort to improve the editing experience of volunteers with AI, as we have chosen not to prioritize it over other opportunities.
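For anyone wondering what "detect vandalism" and "predict article quality" actually look like in practice: that's mostly the public revision-scoring models (ORES, which Wikimedia has been migrating to its Lift Wing service), not generative AI. Here's a rough sketch of asking it about a single edit; the endpoint shape and response keys are from memory and may be out of date, so treat it as illustrative rather than canonical:

```python
# Illustrative only: query Wikimedia's revision-scoring service (ORES) for how
# likely one English Wikipedia edit is to be damaging / made in good faith.
# ORES is being superseded by Lift Wing, so this endpoint may be deprecated.
import requests

REV_ID = 963110639  # placeholder revision ID

resp = requests.get(
    "https://ores.wikimedia.org/v3/scores/enwiki/",
    params={"models": "damaging|goodfaith", "revids": REV_ID},
    timeout=10,
)
resp.raise_for_status()

# Response is nested by wiki -> revision -> model; key names are from memory.
scores = resp.json()["enwiki"]["scores"][str(REV_ID)]
for model, result in scores.items():
    score = result["score"]
    print(model, score["prediction"], score["probability"])
```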
Is anyone else hating a lot of these current articles that are sparse as fuck on detail? How are they actually using generative AI? Where is it being applied? Just telling me that it’s tools for editors and volunteers doesn’t tell me what the tool is doing. 😤
I’m a manager of sorts, and one of the people who report to me used gen AI in their mid-year reviews. Basically, they said, “make this sound better” and the AI spit out something that reads better while still having the same content. In the past, this person had continually been snarky and self-deprecating, and the AI helped make it sound more constructive.
I hope that’s what’s happening here. A human curates the content, runs it through the AI to make it read better, then edits from there. That last part is essential though.
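If you wanted to script that loop instead of pasting into a chat window, it’s basically one API call. The sketch below assumes the OpenAI Python SDK with a placeholder model name and prompt; it’s not whatever tool my report actually used, and the last step is still a human reading and editing the output.

```python
# Minimal sketch of the "make this sound better" step. Model name and prompt
# are placeholders; the human writes the draft and edits whatever comes back.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

draft = "Shipped the feature late again, classic me, but at least the tests pass now."

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder
    messages=[
        {
            "role": "system",
            "content": "Rewrite the user's text to be clear and constructive. "
                       "Keep the same facts; do not invent accomplishments.",
        },
        {"role": "user", "content": draft},
    ],
)

print(response.choices[0].message.content)  # review and hand-edit before using
```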
What kind of sorts do you manage?
Software engineers. I’m also a software engineer.
Here’s the actual source: https://meta.m.wikimedia.org/wiki/Strategy/Multigenerational/Artificial_intelligence_for_editors
Ah, so no generative AI used in actual article production, just in meta stuff and for newcomers to ask questions about how to do things.
Yeah, this article seems like an anti-Wikipedia article. They’re just using it for translation, spelling errors, content quality, etc.