CONTENT MODERATION
Social media companies point to the promise of artificial intelligence to moderate content and provide safety on their platforms, but AI is not a silver bullet for managing human behavior. Communities adapt quickly to AI moderation, augmenting banned words with purposeful misspellings and creating backup accounts to avoid getting kicked off a platform.
Human content moderation is also problematic, given social media companies' business models and practices. Since 2022, social media companies have implemented massive layoffs that struck at the heart of their trust and safety operations and weakened content moderation across the industry.
Congress will need hard data from the social media companies – data the companies have not provided to date – to assess the appropriate ratio of moderators to users.
THE WAY FORWARD
In health care, professionals have a duty to warn if they believe something dangerous might happen. When these uncomfortable truths surface in corporate research, little is done to inform the public of threats to safety. Congress could mandate reporting when internal studies reveal damaging outcomes.
Helping teens today would require social media companies to invest in human content moderation and meaningful age verification. But even that isn't likely to fix the problem. The challenge is confronting the fact that social media as it exists today thrives on having legions of young users spending significant time in environments that put them at risk. These dangers for young users are baked into the design of current social media, which calls for much clearer statutes about who polices social media and when intervention is required.
One of the motives for tech companies not to segment their user base by age, which would better protect children, is how it would affect advertising revenue. Congress has limited tools available to enact change, such as enforcing laws about advertising transparency, including "know your customer" rules. Especially as AI accelerates targeted marketing, social media companies are going to continue making it easy for advertisers to reach users of any age. But if advertisers knew what proportion of ads were seen by children, rather than adults, they might think twice about where they place ads in the future.
Despite a number of high-profile hearings on the harms of social media, Congress has not yet passed legislation to protect children or make social media platforms liable for the content published on their platforms. But with so many young people online post-pandemic, it's up to Congress to put in place guardrails that ultimately place privacy and community safety at the heart of social media design.
Joan Donovan is Assistant Professor of Journalism and Emerging Media Studies, Boston University. Sara Parker is Research Analyst at the Media Ecosystem Observatory, McGill University. This commentary first appeared in The Conversation.