From: M. Taylor Saotome-Westlake
Date: Sat, 22 Feb 2020 21:02:00 +0000 (-0800)
Subject: starting to draft "Cloud Vision"
X-Git-Url: http://unremediatedgender.space/source?a=commitdiff_plain;h=4c0e5cc057e393628bc11814fe213ad2664624b5;p=Ultimately_Untrue_Thought.git

starting to draft "Cloud Vision"
---

diff --git a/content/drafts/cloud-vision.md b/content/drafts/cloud-vision.md
new file mode 100644
index 0000000..84b9dab
--- /dev/null
+++ b/content/drafts/cloud-vision.md
@@ -0,0 +1,13 @@
+Title: Cloud Vision
+Date: 2021-01-01
+Category: other
+Tags: cathartic
+Status: draft
+
+Google [recently](https://news.ycombinator.com/item?id=22373635) [sent](https://www.reddit.com/r/programming/comments/f6pyiu/cloud_vision_api_will_not_return_gendered_labels/) out an email to their [Cloud Vision API](https://cloud.google.com/vision) customers, notifying them that the service will stop returning "woman" or "man" labels for people in photos. Being charitable, I can think of reasons why I might defend or support such a decision. Detecting the _sex_ of humans in images is going to be significantly less reliable than just picking out the humans in the photo, and the way the machines do sex-classification is going to depend on their training corpus, which might contain embedded cultural prejudices that Google might not want to inadvertently use their technological hegemony to reproduce and amplify. Just using a "person" label dodges the whole problem.
+
+I think of my experience playing with [FaceApp](https://www.faceapp.com/), the _uniquely best piece of software in the world_, which lets the user apply neural-network-based transformations to their photos to see how their opposite-sex analogue would look! (Okay, the software actually has lots of transformations and filters available—
+
+[...]
+
+This is [Cultural Revolution](https://www.goodreads.com/review/show/2671118186) shit! This is [Lysenko](https://en.wikipedia.org/wiki/Lysenkoism)-tier mindfuckery up in here!
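
The label change the draft describes amounts to collapsing gendered person labels into a single generic one before they reach the customer. A minimal sketch of that policy, assuming nothing about Google's actual server-side implementation (the function and label names here are hypothetical, for illustration only):

```python
# Hypothetical illustration of the announced policy: gendered person
# labels are replaced by a generic "Person" label. This is NOT Google's
# actual Cloud Vision API code; names are made up for the sketch.

GENDERED_LABELS = {"Man", "Woman"}

def degender_labels(labels):
    """Replace gendered person labels with the generic 'Person' label,
    deduplicating so at most one 'Person' entry survives."""
    out = []
    for label in labels:
        replacement = "Person" if label in GENDERED_LABELS else label
        if replacement not in out:
            out.append(replacement)
    return out

print(degender_labels(["Woman", "Smile", "Man", "Outdoor"]))
# → ['Person', 'Smile', 'Outdoor']
```

Note that the collapse is lossy by design: once "Man" and "Woman" both map to "Person", no downstream consumer can recover the distinction, which is exactly the point of the policy.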