Check if your photos were used to develop facial recognition systems with this free tool
Exposing.ai can find out whether your images were used in AI surveillance research
Rapid results
I tested out the tool on Flickr accounts that have shared photos with the public under a Creative Commons license. The second account I tried was spotted in the datasets.
Unfortunately, there isn’t too much you can do once your photos have been scraped.
It’s not possible to remove your face from image datasets that have already been distributed, although some dataset maintainers allow you to request removal from future releases. The Exposing.ai team said they’ll soon include information about this process below your search results.
Exposing.ai also only works with Flickr and doesn’t cover every image training dataset used for facial recognition. The creators say future versions could include more search options.
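The underlying check is conceptually simple: a tool like this compares identifiers from your Flickr account against the records a dataset ships with. Here is a minimal, hedged sketch of that idea in Python; the manifest contents and function name are invented for illustration, not taken from Exposing.ai's actual implementation.

```python
# A minimal sketch of the kind of lookup a tool like Exposing.ai might perform.
# Assumption: a dataset publishes a manifest listing the Flickr photo IDs it
# contains (the IDs below are invented for illustration).

def photos_in_dataset(my_photo_ids, manifest_ids):
    """Return which of your Flickr photo IDs appear in a dataset manifest."""
    return sorted(set(my_photo_ids) & set(manifest_ids))

# Hypothetical manifest of Flickr photo IDs shipped with a dataset:
manifest = {"50432178901", "49887712345", "51200045678"}

# Your own photo IDs, e.g. listed from your Flickr account:
mine = ["49887712345", "12345678901"]

print(photos_in_dataset(mine, manifest))  # any matches were used in the dataset
```

A real search service would index millions of such IDs (plus usernames and hashtags), but the membership test at its core is the same set intersection shown here.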
At this point, the tool’s main power is exposing how our photos are used to develop facial recognition without our consent. Only changes to laws and company policies can prevent the practice.
Story by Thomas Macaulay
Thomas is a senior reporter at TNW. He covers European tech, with a focus on AI, cybersecurity, and government policy.