About three weeks ago, I wrote a blog post about racially skewed Google search results and received some thoughtful comments. Since then, I have been trying to write a more detailed post in response to these comments, and never did. Here it is, finally.
Why Google search results might be different: Google search is a very, very complex thing, and I am sure I do not know even half of the factors contributing to the results. Yes, it depends on geography, but not just on a country or state. It depends on your zip code, which defines the socioeconomic majority; on your computer, operating system, and browser; on emails you recently received; on websites you visited; on recent searches from your computer and your zip code; on what news sites you visit, what Kindle books you read, and what audiobooks you listen to.
And yes, mostly it depends on who pays :).
All of the above explains what we mean when we say that searches should be properly tested. When we run tests on application code, we have test cases, and we know how to tell whether the code works correctly. How can we test whether a search works correctly? It works correctly if we receive the expected results. But what results are expected? Should we expect to find pictures of white families on exotic beaches when the search is initiated in my zip code? Or should we expect to receive diverse results? More importantly, which search results should a local fifth-grader expect?
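One way to make this testable at all is to give up on asserting an exact expected result list and instead assert a measurable property of the results, such as how diverse they are. Below is a minimal sketch of that idea; the function name, the labels, and the use of normalized Shannon entropy as a "diversity score" are all my own illustrative assumptions, not anything Google actually does.

```python
# Hypothetical sketch: score a set of search results by how evenly a
# labeled attribute is distributed across them, using Shannon entropy
# normalized to [0, 1]. All names and labels here are invented for
# illustration only.
import math
from collections import Counter

def diversity_score(labels):
    """Normalized entropy of the label distribution:
    0.0 = completely homogeneous, 1.0 = perfectly even mix."""
    counts = Counter(labels)
    n = len(labels)
    if len(counts) <= 1:
        return 0.0
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return entropy / math.log2(len(counts))

# A test then asserts a property, not a fixed list of results:
homogeneous = ["group_a"] * 10
mixed = ["group_a"] * 5 + ["group_b"] * 5

assert diversity_score(homogeneous) == 0.0  # all results alike
assert diversity_score(mixed) == 1.0        # perfectly even split
```

Of course, this only pushes the question back a level: someone still has to decide what score counts as "correct," which is exactly the judgment call the paragraph above is worrying about.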
My Canadian follower's results were most likely different from mine because Canada is more progressive than the US. On the other hand, the fact that she received very few results with all-black families might mean that there are not that many homogeneous black communities in Canada compared to the US. To summarize, the search results reflect, at least in part, what's going on in people's minds—both in the minds of those who use the search engines and those who make them work.