I'm not sure we can really point the finger at YouTube too much here.
It's their fault for radicalising people! Their algorithms are designed for exactly that purpose, which gets especially bad if someone's browsing history is taken into consideration for investigations, prosecutions, vetting, etc.
Think about it this way - take a video, any video, on any topic. For the purposes of discussion, let's use... an introduction to cross-stitching.
Now, we need to come up with suggestions for other videos a viewer of this video might watch.
Which is where Google's f'ng useless. I have 18 'recommended' videos in my browser, and they're generally the same videos every time I check. So for whatever reason, it throws up one about cross-stitching. If I have no interest in that subject, there's no way to tell YT to recommend me something else. And if you do click on a 'recommended' video, YT's AlGorithm takes that as a signal that you want even more videos on the same subject 'recommended' to you... which gets really f'ng annoying.
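That feedback loop - click a recommendation once, get ever more of the same - can be sketched as a toy model. To be clear, the names and weighting here are my own invention for illustration, not YouTube's actual ranking system:

```python
from collections import Counter

def recommend(weights: Counter, slate_size: int = 3) -> list[str]:
    # Rank topics purely by accumulated click weight - no novelty,
    # no decay, and no way for the user to say "not this, thanks".
    return [topic for topic, _ in weights.most_common(slate_size)]

def click(weights: Counter, topic: str) -> None:
    # One click = more of the same: the clicked topic's weight grows,
    # so it crowds out everything else in future slates.
    weights[topic] += 1

weights = Counter({"cross-stitching": 1, "woodworking": 1, "cooking": 1})
click(weights, "cross-stitching")  # clicked once out of idle curiosity...
click(weights, "cross-stitching")  # ...and the loop reinforces itself
print(recommend(weights))          # cross-stitching now tops every slate
```

The point of the sketch is that with pure positive feedback and no "don't show me this" signal, a single idle click is indistinguishable from genuine interest.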
Ok, so it was a lot worse a couple of weeks ago, when YT decided I really wanted a much smaller selection of recommendations and switched to a thumb-friendly, large-thumbnail template - which was also really f'ng annoying when I have 3x30" monitors.
Now, not all viewers will reach the extreme end, and those who do will take different amounts of time to get there, but you can follow a chain of recommendations all the way to the more "edgy" content.
Or you can bias the system to steer people away from 'edgy' content towards Google-approved, ideologically sound content. And if they try to subvert this re-conditioning, then obviously those people are up to no good...