-Google [recently](https://news.ycombinator.com/item?id=22373635) [sent](https://www.reddit.com/r/programming/comments/f6pyiu/cloud_vision_api_will_not_return_gendered_labels/) out an email to their [Cloud Vision API](https://cloud.google.com/vision) customers, notifying them that the service will stop returning "woman" or "man" labels for people in photos. Being charitable, I can think of reasons why I might defend or support such a decision. Detecting the _sex_ of humans in images is going to significantly less reliable than just picking out the humans in the photo, and the way the machines do sex-classification is going to depend on their training corpus, which might contain embedded cultural prejudices that [Google might not want to](https://ai.google/principles/) inadvertently use their technological hegemony to reproduce and amplify. Just using a "person" label dodges the whole problem.
+Google [recently](https://news.ycombinator.com/item?id=22373635) [sent](https://www.reddit.com/r/programming/comments/f6pyiu/cloud_vision_api_will_not_return_gendered_labels/) out an email to their [Cloud Vision API](https://cloud.google.com/vision) customers, notifying them that the service will stop returning "woman" or "man" labels for people in photos. Being charitable (as one does), I can think of reasons why I might defend or support such a decision. Detecting the _sex_ of humans in images is going to be significantly less reliable than just picking out the humans in the photo, and how the machines classify sex will depend on their training corpus, which might contain embedded cultural prejudices that [Google might not want to](https://ai.google/principles/) inadvertently use their technological hegemony to reproduce and amplify. Just using a "person" label dodges the whole problem.