diff --git a/content/drafts/cloud-vision.md b/content/drafts/cloud-vision.md
index 5f6b52e..d76ba2d 100644
--- a/content/drafts/cloud-vision.md
+++ b/content/drafts/cloud-vision.md
@@ -4,7 +4,7 @@ Category: other
 Tags: cathartic, news
 Status: draft
 
-Google [recently](https://news.ycombinator.com/item?id=22373635) [sent](https://www.reddit.com/r/programming/comments/f6pyiu/cloud_vision_api_will_not_return_gendered_labels/) out an email to their [Cloud Vision API](https://cloud.google.com/vision) customers, notifying them that the service will stop returning "woman" or "man" labels for people in photos. Being charitable, I can think of reasons why I might defend or support such a decision. Detecting the _sex_ of humans in images is going to be significantly less reliable than just picking out the humans in the photo, and the way the machines do sex-classification is going to depend on their training corpus, which might contain embedded cultural prejudices that [Google might not want to](https://ai.google/principles/) inadvertently use their technological hegemony to reproduce and amplify. Just using a "person" label dodges the whole problem.
+Google [recently](https://news.ycombinator.com/item?id=22373635) [sent](https://www.reddit.com/r/programming/comments/f6pyiu/cloud_vision_api_will_not_return_gendered_labels/) out an email to their [Cloud Vision API](https://cloud.google.com/vision) customers, notifying them that the service will stop returning "woman" or "man" labels for people in photos. Being charitable (as one does), I can think of reasons why I might defend or support such a decision. Detecting the _sex_ of humans in images is going to be significantly less reliable than just picking out the humans in the photo, and the way the machines do sex-classification is going to depend on their training corpus, which might contain embedded cultural prejudices that [Google might not want to](https://ai.google/principles/) inadvertently use their technological hegemony to reproduce and amplify. Just using a "person" label dodges the whole problem.
 
 I think of my experience playing with [FaceApp](https://www.faceapp.com/), the _uniquely best piece of software in the world_, which lets the user apply neural-network-based transformations to their photos to see how their opposite-sex analogue would look! (Okay, the software actually has lots of transformations and filters available—
 
@@ -13,3 +13,13 @@ I think of my experience playing with [FaceApp](https://www.faceapp.com/), the 
 
 _"Given that a person's gender cannot be inferred by appearance [...]"_ reads the email. _Cannot_ be inferred. This is [Cultural Revolution](https://www.goodreads.com/review/show/2671118186) shit! This is [Lysenko](https://en.wikipedia.org/wiki/Lysenkoism)-tier mindfuckery up in here!
+
+coronavirus, Elizabeth Warren
+
+https://twitter.com/ewarren/status/1230577418559270913
+
+
+
+> "We must be guided by solidarity, not stigma," the director general of the World Health Organization warned on Saturday. "The greatest enemy we face is not the virus itself; it’s the stigma that turns us against each other."
+https://twitter.com/nytimes/status/1229832977472946184
+https://archive.is/1pQ4p