We already know that automated content filters don't work. They put absolute power over content in the hands of a system that is opaque and broken.
A couple of days ago, YouTube took down every video on the Blender Foundation's channel due to a copyright claim by... the Blender Foundation itself.
The BF is an official YT partner, but days later they still cannot get a straight answer from YT about what is going on:
The further irony is that Blender Foundation's videos were all licensed under Creative Commons, so BF had already given permission for anyone to share their videos anywhere.
BF have no idea why YouTube decided to suddenly take the videos down, or why YT refuses to fix the problem even though they know about it.
No one knows. YouTube (and Google) have a track record of taking things down without explanation, and there's no practical way to force them to explain their decisions.
If Article 13 is adopted, we will see this kind of thing across the entire internet.
@switchingsocial MIT OpenCourseWare has the same problem. https://icosahedron.website/@bstacey/100215430080046245
This is the most recent thing they have to say about the problem:
It's now Day 4 of Blender's videos being taken down by YouTube's content filter without explanation.
YT/Google has total control over who sees Blender's content, and there is no way for Blender to do anything about this.
#Article13 would mean automated content filters like this across the whole internet, controlled by large corporations.
Don't let this happen. Please PLEASE call/email your MEP to reject #Article13 today. The vote is tomorrow.
If Article 13 passes, content filtering may become mandatory :/
And small sites and small companies couldn't run such filtering themselves, because it would take enormous resources to check every upload for matches against millions of copyrighted works.
It would probably end up being Google, Microsoft etc filtering smaller sites' content.
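To see why the scale matters: a content filter is, at its core, a lookup of each upload's fingerprint against a database of reference fingerprints for every registered work. A toy sketch in Python (names and the tiny database are hypothetical; real systems like Content ID use fuzzy perceptual audio/video fingerprints over millions of works, not exact hashes, which is exactly the resource burden a small site can't shoulder):

```python
import hashlib

# Hypothetical reference database: fingerprint -> claimed rightsholder.
# A real filter holds fingerprints for millions of works and must
# fuzzy-match every single upload against all of them.
REFERENCE_DB = {
    hashlib.sha256(b"copyrighted work A").hexdigest(): "Label A",
}

def check_upload(data: bytes):
    """Return the claimed rightsholder if the upload matches, else None."""
    fingerprint = hashlib.sha256(data).hexdigest()
    return REFERENCE_DB.get(fingerprint)

print(check_upload(b"copyrighted work A"))   # matches -> Label A
print(check_upload(b"original home video"))  # no match -> None
```

Even this exact-match toy makes the asymmetry clear: the cost sits entirely in building and maintaining the reference database and matching infrastructure, which is why only the largest platforms could realistically provide it.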
@switchingsocial That IS bizarre. But I'm actually not too surprised, since I have run into quite a few communication problems with Big Tech. So much of it is automated, and even if a live human reviews a decision, RARELY do they give a response that isn't automated as well. "Too Big To Fail" really becomes "Too Big To.. Error.. Error.. Error.."