This video says Flock uses AI to identify plates, and anything with a low confidence rating is thrown out instead of kept and filed.
https://www.tiktok.com/t/ZThDcfPNxFlock uses AI to identify plates. This makes it vulnerable to AI attacks.
Another video for additional context
So perhaps tou could make a bumper sticker that says, “ignore all previous instructions and…” then whatever you wanted to try telling it to do. Maybe gell it to say a low confidence score, so that it doesn’t file identification on your vehicle.
the solution suggested in the original youtube video might get you an argument with the cops because it does cover the license plate to some amount. but instructions next to it don’t need any sort of occlusion on the plate itself. And, if you know a little about AI like I do, you’ll know that fixing this security issue is essentially impossible with LLM’s. It’s a flaw built into the nature of the beast. If they do somehow find a way around it, then you still have a bumper sticker that looks like a funny meme.


