The short answer is “yes.” The concept of “privilege” is probably familiar to you. It describes the advantages, both subtle and obvious, that flow toward certain groups in society and away from others. It is a collection of unearned rewards, benefits, and/or advantages triggered by affiliation with the dominant side of a power system. We experience this in the United States primarily as white privilege, male privilege and heterosexual privilege. Being non-white, female, or queer, among other classifications, has been amply demonstrated to reduce one’s opportunities in the world. One need only examine some prison statistics to see how this plays out for people of color, or the evidence of pay disparities by gender to see how this plays out for women. So, how does this play out in the digital world?
We have taken to calling the experience of our online lives the “information society.” It is an increasingly apt description because it accurately describes how our social, political and legal lives are migrating out of physical space and into the digital. It should not be surprising that as both the positive and negative human inclinations found in the material world find expression in the digital world, a move to separate and segment society is finding its way there as well. The inequities that comfort some groups and oppress others in modern society are not absent in the digital world. In fact, they are expressed in more subtle and pernicious ways and may prove even harder to combat. By now, you probably know a thing or two about state surveillance due to the disclosures of former NSA contractor Edward Snowden. Maybe you were already somewhat concerned about corporate surveillance too—the practices of all sorts of entities collecting information about your browsing habits, cell phone use, and so on. While most of us probably think mainly about how all this data collection affects our own lives, something that is obscured behind the basic problems of unregulated surveillance is how the dramatic increase in surveillance capabilities and data processing affects some people much more than others.
The journalist Natasha Singer has written a number of articles about the profiling and scoring of consumers based on what can be discovered about them via their online activities and habits. As Singer illustrates, new scoring algorithms—which work similarly to traditional credit scores, but without any regulatory restraint—have the power to influence everything from your eligibility for a loan or a job to whom you date. These are proprietary systems that operate with a great deal more information about you than you might imagine. The numbers you dial on your phone, the websites you access, the purchases you make (or decline), where you drive your car or swipe a transit pass, plus thousands of other data points gleaned from the increasingly transparent nature of our transactional and information-seeking lives, are aggregated, crunched and calculated for efficient ad-targeting…and social sorting. You could be tagged a hot prospect for a great deal on airline tickets, or you could be identified as a credit or security risk and denied an apartment rental, all through an entirely opaque and unaccountable web of algorithms and hidden interactions.
While you may or may not be concerned about this for your own life opportunities, this can have pretty disastrous effects on certain segments of society. Consider the intersection of race with the technology of modern policing. As one example, body-worn cameras used by the police (which hold out much promise in addressing police/civilian violence) have raised concern due to everything else that is collected by those cameras during police interactions, and the web of technology that can act on that data. Police body-cams capture video not only of suspects, but also of the people around them. Being the neighbor of a troublemaker dramatically increases your chances of playing a role in a police video. Facial recognition software has advanced to astonishing accuracy and will only get better. The increasing availability to the public of police data, joined with facial recognition software and digital scoring algorithms, suggests a scenario in which one’s “scores” could be downgraded simply from living next door to a police target, resulting in curtailed opportunities and choices. Even if this example of “bycatch” seems far-fetched to you, consider at least that the actual targets of police activity are also being unfairly impacted in new ways by technology. Racial bias in arrests and convictions is well-established and pretty hard to dispute. This increased likelihood of being targeted by the police based on race now also means an increase in data about people of color entering the data stream and staying there pretty much forever, where it can permanently damage one’s future prospects. Add in the web of parsing and scoring algorithms and a picture begins to emerge of race-based algorithmic discrimination.
Race is not the only privilege factor that surfaces inequities in the information society. Women are having a very different experience of information technology than men. Revenge porn, the unauthorized publication of nude or pornographic imagery, typically made public by disgruntled ex-boyfriends, overwhelmingly affects women and can have profound social and economic consequences as well as put women in physical danger. Revenge porn victims have been fired, denied employment opportunities, socially ostracized and stalked. While some states are getting around to criminalizing the practice, many of the websites that publish revenge porn media are out of the reach of law enforcement. Just as with criminal incident data, it can be nearly impossible to remove every instance of the data once it shows up somewhere online. Women experience other forms of harassment as well, including threatening behavior by “trolls,” as happened in the Gamergate controversy earlier this year.
When a person or organization captures information about you and then uses it against you, that is a privacy-based informational harm. It is also an exercise of power. What opaque and proprietary monitoring systems know about you empowers them to manipulate and control you, while simultaneously diminishing your autonomy. We are all subject to this form of disempowerment at the hands of those who operate the technology upon which we are increasingly reliant. Lacking relative power at the outset, as is the case with the unprivileged, places some people in a far more vulnerable position, one they are unlikely to be able to extricate themselves from.
For the affluent and privileged, the market is responding to the increased awareness of, and sensitivity to, ubiquitous surveillance. For example, there are costly smartphones available that ship “hardened” with enhanced encryption and “data leakage” protection. A new industry called “reputation management,” aimed at businesses and the affluent, is growing rapidly; it enables clients to manage their online profiles in order to minimize any potential damage from any sort of bad behavior. Unprivileged people are more likely to have black marks against their profiles, and they are also likely to lack the economic means to buy expensive phones and reputation-cleanup services, while the privileged can avoid discriminatory surveillance practices and can pay to maintain squeaky-clean online profiles. This means that existing social divides will only get deeper and more entrenched. In the information society, the unprivileged are increasingly captured and shamed with demeaning and punitive data and can do little but become even more marginalized.