
https://www.reddit.com/r/DataHoarder/comments/ylfjfw/legal_issues_renting_out_storage/iuyz3na&context=3

(post is archived)

[–] 0 pt (edited)

This is exactly the kind of thing image recognition AIs should be used for. Granted, you'd have to train it on a proper dataset, which means the training would have to be done by federal agencies or police, but once it's trained the model doesn't "contain" any of the offending material, yet it can easily detect it.

That said, even an image recognition AI trained on regular, legal porn is probably fine for detecting CP too. Obviously a model trained only on regular porn can't distinguish legal porn from illegal CP, but if I were a data storage provider I'd just ban both and tell my customers, "Too bad, host your porn elsewhere; I don't have the resources to get an AI to discriminate between legal and illegal porn." To me that's a perfectly acceptable compromise.
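
Roughly, that blanket-ban setup is just a binary classifier plus a threshold. A minimal sketch in PyTorch, assuming you've obtained some off-the-shelf porn/not-porn checkpoint; the file name, architecture, preprocessing, and threshold below are all placeholders, not a specific product:

```python
import torch
from torchvision import models, transforms
from PIL import Image

# Standard ImageNet-style preprocessing; adjust to whatever the
# checkpoint you actually obtain was trained with.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical checkpoint: a ResNet-18 fine-tuned as a binary
# porn / not-porn classifier. "nsfw_resnet18.pt" is a made-up name.
model = models.resnet18(num_classes=2)
model.load_state_dict(torch.load("nsfw_resnet18.pt", map_location="cpu"))
model.eval()

REJECT_THRESHOLD = 0.5  # placeholder; tune on a validation set

def allow_upload(path: str) -> bool:
    """Return False for anything the classifier flags as porn,
    legal or not -- the 'host your porn elsewhere' policy."""
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(img), dim=1)
    return probs[0, 1].item() < REJECT_THRESHOLD  # index 1 = "porn" class
```

No appeals process, no legal/illegal distinction: anything over the threshold just gets bounced.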

[–] 0 pt

Perhaps there's a bit of middle ground: the FBI keeps the database of all CP, and software companies get to "train" their AIs against it. Not sure of the best way to do this, though. Either you hand the FBI your software, which the FBI could then secretly modify, or the FBI gives you access to the CP collection and risks leaks.

[–] 0 pt

Just letting the FBI train the model and hand you the checkpoint to use is probably best. The code for a lot of these image recognition AIs is already open source anyway.
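
And the checkpoint itself is safe to hand around: a PyTorch checkpoint is just a mapping of layer names to weight tensors, with nothing of the training images left in it beyond learned parameters. A quick sketch of what the receiving side would see (the file name is a placeholder):

```python
import torch

# "agency_model.pt" stands in for the handed-over checkpoint,
# saved with torch.save(model.state_dict(), ...).
state = torch.load("agency_model.pt", map_location="cpu")

# The file contains only parameter tensors -- no images, just
# learned floating-point weights.
for name, tensor in state.items():
    print(name, tuple(tensor.shape), tensor.dtype)
```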

I wouldn't want to be the agent tasked with tagging those lol