Although YouTube has improved its algorithm over the past year to prevent problematic videos from being recommended, it is still recommending videos that users may find offensive.
The Wall Street Journal (WSJ) reported this on the 7th (local time), citing research results from the Mozilla Foundation, a software nonprofit organization.
The Mozilla Foundation conducted an experiment, using software to track the YouTube activity of 37,000 volunteer participants for 10 months.
In the experiment, when a participant reported that a video was problematic, for instance because it contained sexual content or made false claims, the program tracked whether YouTube had recommended the video to the participant or whether the participant had found it on their own.
Study of YouTube Algorithm Finds 71% of Reported Videos Came From Recommendations
The results showed that 71% of the videos participants reported as objectionable had been recommended to them by YouTube's algorithm.
Although YouTube later deleted 200 of the videos reported by participants, Mozilla said those videos had already accumulated more than 160 million views.
YouTube said that, by various measures, recommendations of harmful content over the past year have fallen to less than 1% of videos viewed.
In particular, it said its automated systems detected 94% of videos that violated YouTube's policies, and that most of those videos were deleted before reaching 10 views.
Mozilla owns and operates Firefox, an Internet browser that competes with Google Chrome.