So many studies have biased samples. One example that affects many people daily concerns typical office temperatures. My wife, like many other women, is cold at work. Most office buildings in the US are heated and cooled according to standards based on Fanger's thermal comfort equation, which was derived from studies of offices filled with men back in the 1960s. Men typically prefer colder temperatures than women.
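For context, and hedging on the details from memory, Fanger's model predicts a predicted mean vote (PMV) on a cold-to-hot comfort scale, roughly of the form:

$$\mathrm{PMV} = \left(0.303\,e^{-0.036\,M} + 0.028\right) L$$

where $M$ is the occupant's metabolic rate and $L$ is the thermal load, the gap between the heat the body produces and the heat it loses to the room. The bias enters through $M$: if the standard plugs in the metabolic rate of an average 1960s male office worker, the "comfortable" setpoint it yields runs cold for anyone with a lower metabolic rate, which on average includes women.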
Chapter 7: Demand Transparency When The Computers Say No
I often wonder whether algorithms reinforce the systemic biases in the data sets they were trained on. I see making algorithms transparent as a major research area over the next few years.
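As a toy illustration of what transparency could mean in practice (this is my own hypothetical sketch, not anything from the book), one simple check is whether a model's favorable outcomes are spread evenly across groups:

```python
# Minimal sketch of one transparency check: compare a model's
# positive-outcome rate across groups. All names and data here
# are hypothetical, purely for illustration.
from collections import Counter

def selection_rates(predictions, groups):
    """Return the fraction of positive predictions per group."""
    totals, positives = Counter(), Counter()
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += pred
    return {g: positives[g] / totals[g] for g in totals}

# Hypothetical loan decisions: 1 = approved, 0 = denied
preds  = [1, 0, 1, 1, 0, 1, 0, 0]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]
print(selection_rates(preds, groups))  # {'a': 0.75, 'b': 0.25}
```

A gap like that doesn't prove bias on its own, but it's exactly the kind of number a transparent system could be required to publish.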
Chapter 8: Don’t Take The Statistical Bedrock for Granted
We recently experienced the lack of a statistical bedrock with COVID. The Director of the CDC has spoken about the agency's need to improve its testing and surveillance after decades of neglect. The CDC did not have the infrastructure to provide actionable information to the US population, so people had to depend on universities like Johns Hopkins and the University of Washington for information. This is not ideal, because we need national standards for gathering data so that we know what we have (see Chapter 3).
This is such a great write-up! Your points on Chapter 6 also remind me of some of the smaller annoyances I often deal with: cell phones are too large for me to use with one hand, and pants pockets for women either don't exist or are too small to be useful (this graphic on women's pants pockets from The Pudding remains one of my favorites).
Re: Chapter 7, I'm super curious whether we'll dig into this a lot more with our December book, Atlas of AI! I think there's a lot to unpack here, and it's worth getting a handle on!