X-Git-Url: http://unremediatedgender.space/source?p=Ultimately_Untrue_Thought.git;a=blobdiff_plain;f=content%2Fdrafts%2Fcloud-vision.md;h=a37f9760e8e1b1c0c5086bb2253dc1662a4b9a31;hp=5f6b52ec295660269a04a0d39c11c0fc14e1a1d0;hb=06c38b2813b1be78b80a9f10a4ee428f632a2076;hpb=bcb5c59eb795a7d4b2335b18f0ec58230cbbd44b diff --git a/content/drafts/cloud-vision.md b/content/drafts/cloud-vision.md index 5f6b52e..a37f976 100644 --- a/content/drafts/cloud-vision.md +++ b/content/drafts/cloud-vision.md @@ -4,7 +4,7 @@ Category: other Tags: cathartic, news Status: draft -Google [recently](https://news.ycombinator.com/item?id=22373635) [sent](https://www.reddit.com/r/programming/comments/f6pyiu/cloud_vision_api_will_not_return_gendered_labels/) out an email to their [Cloud Vision API](https://cloud.google.com/vision) customers, notifying them that the service will stop returning "woman" or "man" labels for people in photos. Being charitable, I can think of reasons why I might defend or support such a decision. Detecting the _sex_ of humans in images is going to significantly less reliable than just picking out the humans in the photo, and the way the machines do sex-classification is going to depend on their training corpus, which might contain embedded cultural prejudices that [Google might not want to](https://ai.google/principles/) inadvertently use their technological hegemony to reproduce and amplify. Just using a "person" label dodges the whole problem. +Google [recently](https://news.ycombinator.com/item?id=22373635) [sent](https://www.reddit.com/r/programming/comments/f6pyiu/cloud_vision_api_will_not_return_gendered_labels/) out an email to their [Cloud Vision API](https://cloud.google.com/vision) customers, notifying them that the service will stop returning "woman" or "man" labels for people in photos. Being charitable (as one does), I can think of reasons why I might defend or support such a decision. Detecting the _sex_ of humans in images is going to significantly less reliable than just picking out the humans in the photo, and the way the machines do sex-classification is going to depend on their training corpus, which might contain embedded cultural prejudices that [Google might not want to](https://ai.google/principles/) inadvertently use their technological hegemony to reproduce and amplify. Just using a "person" label dodges the whole problem. I think of my experience playing with [FaceApp](https://www.faceapp.com/), the _uniquely best piece of software in the world_, which lets the user apply neural-network-based transformations to their photos to see how their opposite-sex analogue would look! (Okay, the software actually has lots of transformations and filters available— @@ -13,3 +13,5 @@ I think of my experience playing with [FaceApp](https://www.faceapp.com/), the _ "Given that a person's gender cannot be inferred by appearance [...]" reads the email. _Cannot_ be inferred. This is [Cultural Revolution](https://www.goodreads.com/review/show/2671118186) shit! This is [Lysenko](https://en.wikipedia.org/wiki/Lysenkoism)-tier mindfuckery up in here! + +coronavirus, Elizabeth Warren