starting to draft "Cloud Vision"
author M. Taylor Saotome-Westlake <ultimatelyuntruethought@gmail.com>
Sat, 22 Feb 2020 21:02:00 +0000 (13:02 -0800)
committer M. Taylor Saotome-Westlake <ultimatelyuntruethought@gmail.com>
Sat, 22 Feb 2020 21:07:17 +0000 (13:07 -0800)
content/drafts/cloud-vision.md [new file with mode: 0644]

diff --git a/content/drafts/cloud-vision.md b/content/drafts/cloud-vision.md
new file mode 100644 (file)
index 0000000..84b9dab
--- /dev/null
@@ -0,0 +1,13 @@
+Title: Cloud Vision
+Date: 2021-01-01
+Category: other
+Tags: cathartic
+Status: draft
+
+Google [recently](https://news.ycombinator.com/item?id=22373635) [sent](https://www.reddit.com/r/programming/comments/f6pyiu/cloud_vision_api_will_not_return_gendered_labels/) out an email to their [Cloud Vision API](https://cloud.google.com/vision) customers, notifying them that the service will stop returning "woman" or "man" labels for people in photos. Being charitable, I can think of reasons why I might defend or support such a decision. Detecting the _sex_ of humans in images is going to be significantly less reliable than just picking out the humans in the photo, and the way the machines do sex-classification is going to depend on their training corpus, which might contain embedded cultural prejudices that Google might not want to inadvertently reproduce and amplify with their technological hegemony. Just using a "person" label dodges the whole problem.
+
+I think of my experience playing with [FaceApp](https://www.faceapp.com/), the _uniquely best piece of software in the world_, which lets the user apply neural-network-based transformations to their photos to see how their opposite-sex analogue would look! (Okay, the software actually has lots of transformations and filters available—
+
+[...]
+
+This is [Cultural Revolution](https://www.goodreads.com/review/show/2671118186) shit! This is [Lysenko](https://en.wikipedia.org/wiki/Lysenkoism)-tier mindfuckery up in here!