Find out if your photo is in this AI training dataset


Face recognition systems are everywhere, from security cameras that try to spot criminals to the way Snapchat finds your face before overlaying rabbit ears. Computers need a lot of data to learn to recognize faces, and some of it comes from Flickr.

IBM released a "Diversity in Faces" dataset earlier this year, which is in some ways a good thing: many early face recognition algorithms were trained on thin, white celebrities, because celebrity photos are easy to find in bulk. Your data source shapes what your algorithm can do and understand, which is why so many algorithms end up racist and sexist. This dataset aims to help by providing facial images annotated with attributes such as skin color.

But most people who uploaded their personal snapshots to Flickr probably did not realize that, years later, their faces and those of their friends and families could be used to train the next big mega-algorithm. If you applied a Creative Commons license to your photos, even a "non-commercial" one, you may find yourself in this dataset.
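If you want to audit your own account, Flickr's public API reports a numeric license ID for each photo (via `flickr.photos.search` with `extras='license'`). Below is a minimal sketch that flags Creative Commons photos from such metadata; the license-ID table and field names are assumptions based on Flickr's published API documentation, not something from this article, so verify them against the live API before relying on the results.

```python
# Sketch: flag photos whose Flickr license ID marks them as Creative Commons.
# The ID-to-name table assumed here follows Flickr's
# flickr.photos.licenses.getInfo listing; confirm against the live API.
CC_LICENSES = {
    1: "CC BY-NC-SA",
    2: "CC BY-NC",
    3: "CC BY-NC-ND",
    4: "CC BY",
    5: "CC BY-SA",
    6: "CC BY-ND",
}

def cc_photos(photos):
    """Given photo dicts with a 'license' field (as returned by
    flickr.photos.search with extras='license'), return the ones
    published under a Creative Commons license, with its name attached."""
    return [
        {**p, "license_name": CC_LICENSES[int(p["license"])]}
        for p in photos
        if int(p["license"]) in CC_LICENSES
    ]

# Hypothetical API-style records for illustration only.
sample = [
    {"id": "101", "title": "beach", "license": "0"},   # All Rights Reserved
    {"id": "102", "title": "family", "license": "2"},  # CC BY-NC
]
flagged = cc_photos(sample)
```

Here `flagged` would contain only the second record, since license ID 0 (All Rights Reserved) is not a Creative Commons license. Note that a "non-commercial" ID like 2 still qualifies, which is exactly the surprise the article describes.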

According to NBC, IBM says it will delete images from the dataset at the request of the photographer or the person photographed, but it has not made the dataset public, so there is no way to know for sure whether you are in it. Getting a photo deleted will not be easy, but if you want to know whether one of yours has been used, you can enter your Flickr username into NBC's lookup tool. This is not necessarily the only dataset that could contain your photos, but it is at least one way to find out whether they have been used.

