Apple officials have revealed plans to scan all U.S. iPhones for images depicting child sexual abuse.

Using a tool called “NeuralMatch,” company officials say they will be able to detect illegal images on customers’ iPhones without decrypting users’ personal messages. According to Apple execs, a phone’s contents will be examined only if the tool flags an offending image.
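Apple hasn’t published NeuralMatch’s internals, but systems like this are generally described as perceptual hashing: the device reduces each photo to a short fingerprint and compares it against fingerprints of known illegal images, so the photo itself never has to be inspected. Here’s a purely illustrative sketch using a toy “average hash” (not Apple’s actual algorithm; every name below is made up):

```python
# Purely illustrative perceptual-hash sketch, NOT Apple's actual NeuralMatch
# algorithm, whose internals have not been published. A toy "average hash"
# turns an 8x8 grayscale grid into a 64-bit fingerprint; matching is a
# Hamming-distance check against a list of known fingerprints.

def average_hash(pixels):
    """Hash an 8x8 grid of grayscale values (0-255) into a 64-bit int."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for value in flat:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

def hamming_distance(a, b):
    """Count the bits where two hashes differ."""
    return bin(a ^ b).count("1")

def matches_blocklist(image_hash, blocklist, threshold=5):
    """Flag an image whose hash is within `threshold` bits of a known one."""
    return any(hamming_distance(image_hash, known) <= threshold
               for known in blocklist)

# Hypothetical usage: the phone hashes a photo locally and checks it against
# fingerprints of known illegal images, without exposing the photo itself.
photo = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
known_hashes = {average_hash(photo)}   # stand-in for a real hash database
print(matches_blocklist(average_hash(photo), known_hashes))  # True
```

The distance threshold is what lets a perceptual hash survive resizing or recompression, small edits that would completely change a cryptographic hash like SHA-256.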

Needless to say, the plan has already been met with plenty of backlash. Cryptography researcher Matthew Green says the program will make it easy for people to frame others by sending them seemingly harmless photos engineered to give “NeuralMatch” a positive result. “Researchers have been able to do this pretty easily,” he says.
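Green’s worry is easier to grasp with the toy hash from the sketch above: because perceptual hashes deliberately tolerate near-matches, two very different-looking images can share a fingerprint. The snippet below (again hypothetical, not Apple’s algorithm) constructs such a collision on purpose:

```python
# Illustrative only: for the toy average hash sketched above, collisions are
# trivial. Any grid that keeps the same above/below-mean pixel pattern hashes
# identically, so a harmless-looking image could be built to match a
# blocklisted fingerprint. NeuralMatch's real collision resistance is unknown.

def average_hash(pixels):
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for value in flat:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

original = [[(r + c) * 16 for c in range(8)] for r in range(8)]  # mean is 112
# A visually different, pure black-and-white grid that preserves which
# pixels sit above the original's mean:
forged = [[255 if p > 112 else 0 for p in row] for row in original]

print(hex(average_hash(original)))
print(hex(average_hash(forged)))   # same hash: a deliberate collision
```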

Is this an invasion of privacy? Shouldn’t Apple have to ask for permission before doing this?
