Apple’s program to scan customers’ devices in an effort to combat child exploitation has been put on hold. Sort of. According to Ars Technica, the tech giant is temporarily suspending the launch of its search for exploitative images to address criticism. But the company still plans to go ahead with it, despite outcry from privacy and human rights groups.

Here’s what Apple said:

Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material [CSAM]. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

“Make improvements” means “we’re still going to do it.” And that still has privacy advocates deeply concerned about what such tech could do in the hands of authoritarian governments:

Apple has claimed it would refuse government demands to expand photo-scanning beyond CSAM. But privacy and security advocates argue that once the system is deployed, Apple likely won’t be able to avoid giving governments more user content.

“Once this capability is built into Apple products, the company and its competitors will face enormous pressure—and potentially legal requirements—from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable,” 90 policy groups from the US and around the world said in an open letter to Apple last month. “Those images may be of human rights abuses, political protests, images companies have tagged as ‘terrorist’ or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them. And that pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance and persecution on a global basis.”

The company may think it can keep government snoops at bay. But for all its market power, Apple is hardly immune to state coercion. If anything, the company’s business model, which demands it sell as much product as it can in nations with few, if any, reservations about human rights violations, makes Apple even more susceptible to pressure.