
On HP’s Racist Webcams (Or Lack Thereof)

To start, I’m sure many of you have seen or heard about the YouTube video of the black dude who shows that the webcam on the HP MediaCenter does not track his face but does track the face of his white co-worker. The vid is here, in case you haven’t seen it. It’s pretty funny, too, because the dude (Desi) seems like a fun guy. When he says “I’m going to go on record and say HP computers are racist,” you know he’s mostly joking, though it is really messed up that the camera doesn’t recognize his face as a face.

Now, this vid was uploaded to YouTube (ironically, using the HP MediaCenter) on December 10th, but it took a few days to really blow up around the ’net. HP caught wind of it a couple of days ago and put up a post on their blog citing lighting conditions and saying they were working to solve the problem. But that hasn’t stopped tons of commenters on blogs and Twitter and Facebook from declaring that HP is racist or, at least, its webcams are.

I find myself in a strange position here, because I’m about to say something I don’t normally say: people, there’s no racism here.

That’s not to say there isn’t a problem, and a serious one. But it’s more along the lines of the stuff I pointed out yesterday with the digital frames: a problem of not thinking or considering, of privilege and blindness. I’m just failing to see how racism is involved.

Let’s back up a bit. In case you’re not sure what’s going on here technologically, there is a feature in some webcam software that is designed to zoom in on the face of a person looking into the camera. I don’t know why this feature is necessary, but obviously someone likes it. Anyway, Face Tracking is supposed to keep your face in close-up no matter where you move within the webcam’s field of vision. It identifies what counts as a “face” by an algorithm I won’t even try to explain because I don’t know how it works. HP said something about measuring the distance between the eyes and cheekbones but, again, I have no clue. That’s what Desi was trying to get to work in the video but could not.
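
For the curious, here’s a minimal sketch of the general technique using OpenCV’s stock face detector. To be clear, this is not CyberLink’s actual algorithm (I have no clue what they use); it’s just the basic pattern: find a face rectangle in each frame, then crop and zoom so it stays centered.

```python
# Minimal face-tracking sketch with OpenCV's bundled Haar-cascade detector.
# This illustrates the general technique, NOT CyberLink's actual algorithm.
import cv2

# Pretrained frontal-face detector that ships with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        # "Track" by cropping to the first detected face, with some padding,
        # then scaling the crop back up to fill the window.
        x, y, w, h = faces[0]
        pad = w // 2
        crop = frame[max(0, y - pad):y + h + pad, max(0, x - pad):x + w + pad]
        frame = cv2.resize(crop, (frame.shape[1], frame.shape[0]))
    cv2.imshow("face tracking sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

The relevant detail is that detectors in this family key off light-and-dark contrast patterns in the image, which is part of why lighting, and how much light a face reflects back to a cheap sensor, matters so much.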

The software behind all this is part of HP’s MediaCenter suite which looks like one big program all created by HP. However, that’s not exactly true. When I was playing around with the program I noticed that it was really similar to CyberLink’s YouCam software, from the way the buttons and settings menus were designed to the kinds of effects and avatars available.

It’s no secret that vendors often bring in third-party software and then put their own branding on it. Why develop webcam software in house when perfectly good software already exists? You can find YouCam software on a ton of computers, not just HP’s, and you can also download it yourself. I put it on a computer of mine, tried the Face Tracking feature, and it works the same way. So, if anything, the software is “racist”, not the webcam and not the computer manufacturer.

Though HP probably did some testing to ensure that the software interacted well with their system, I doubt anyone at the company tested all of the features. That’s not their job, actually, that’s the job of the software developers. So if we’re going to look for culprits here, we need to turn our attention to CyberLink. I don’t know for sure, but I’m going to guess that the folks at CyberLink tested the Face Tracking with a few people, but either not with any dark-skinned employees (assuming they have some) or not in enough varying lighting conditions with said employees.

The webcams included with most notebooks and all-in-one PCs are not of the highest quality. They’re for Skype chatting and making silly YouTube reaction videos or lip dubs. The brightness, contrast, and backlighting correction are rarely the best (I know, as I’ve tested dozens). And that’s where the software runs into problems.

Go look at this video, then this one. Together they show that a simple change in the software’s settings makes the difference between the webcam being able to track the face of a dark-skinned person and not being able to. (Note that different shades of dark skin make a difference, too.) So what’s the real problem here? It’s twofold: one, the software developers didn’t properly take dark-skinned owners into consideration when creating the product; two, crappy webcams make everything worse in life.
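
For the technically inclined, here’s a rough sketch of what that kind of settings change amounts to. The gamma-correction helper and the file name below are hypothetical, not anything from YouCam; it just illustrates why lifting the shadows (what backlight compensation or a brightness boost does) can turn “no face found” into “face found” for the detector.

```python
# Rough sketch: run the same detector on a dark frame before and after a
# simple brightness boost. The file name and gamma value are hypothetical.
import cv2
import numpy as np

def detect_faces(frame_bgr, cascade):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

def brighten(frame_bgr, gamma=0.5):
    # Gamma < 1 lifts shadows, similar in spirit to backlight compensation.
    table = np.array([((i / 255.0) ** gamma) * 255 for i in range(256)],
                     dtype=np.uint8)
    return cv2.LUT(frame_bgr, table)

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
frame = cv2.imread("underexposed_frame.jpg")  # hypothetical dark webcam capture

print("as captured:", len(detect_faces(frame, cascade)), "face(s) found")
print("brightened: ", len(detect_faces(brighten(frame), cascade)), "face(s) found")
```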

Given all this, I don’t see racism here. I think this is a fine wake-up call for CyberLink, or whoever actually made that software, to expand their testing parameters. I’m willing to bet they didn’t take dark-skinned people into consideration, but I’m also willing to be told I’m wrong. If they didn’t, it’s probably because all of the developers on the team were fairer-skinned (which doesn’t mean white; the webcam works fine for East Asian and light-skinned Black faces, for example). It’s looking more like a case of blindness due to privilege. Like I said, problematic, but not malicious or even unfixable.

For my part, I’m going to continue to enjoy the video that started it all. Because it’s damn funny. And though I hope people will stop just parroting the “HP Is Racist” line and start asking “Who made the software?” and “How can we get them to fix this problem?”, I can’t force them to. Instead, I will just pop popcorn and watch the drama unfold.

17 thoughts on “On HP’s Racist Webcams (Or Lack Thereof)”

  1. Anon Nymus says:

    I post anonymously because I speak of my employer. . . who shall forever remain nameless. Let’s call it Company X.
    Company X published an advertisement that could be considered racist. With its (few) black employees up in arms, the media hailing the ad as “incredibly insensitive,” and fun bloggers painting swastikas all over it, Company X realized that the people responsible for developing and approving the ad were not familiar enough with American history, and black people’s place in it, to know how or why the ad could be seen as racist. . . The blindness of privilege, right? Problem fixed by simply running further ads by the diversity manager – a black woman :)

    This is just one of those things that is so blatantly stupid that it had to either be a (big) mistake or a joke. . .

  2. Tablesaw says:

    I’m a little confused by the label “not racist,” because I’m not understanding how it lines up with the definition of racism as stated by this blog: “Racism = Prejudice + Power.”

    As a major computer manufacturer/distributor, HP does seem to have a considerable amount of power. And if the software is getting widely distributed by companies in addition to HP, that bespeaks a certain amount of power as well. And “a case of blindness due to privilege” does seem to me to be an example of prejudice. More generally, I usually see insensitivity born of privilege called racism, rather than treated as a separate but related issue.

    I’m also confused by the line, “Like I said, problematic, but not malicious or even unfixable.” When I read this, it seems to imply that for something to be called racist it needs to be one or both of these things. But maliciousness certainly implies intent, and intentions are usually specifically cited as not being taken into account when it comes to identifying racism. And the idea that fixable or even easily fixable things don’t count as racism seems to throw a wrench into the conception of what anti-racist work is.

    Is there something I’m missing in your thinking? Is “racist” being evaluated differently here?

    1. nojojojo says:

      Ditto Tablesaw on this. The failure to create an inclusive and diverse sample set for testing is an act of privilege and blindness — which is racism. Not intentional, true, but racist in that it conveys some implicit assumptions as to what their audience is like. Plus — and worse — it’s bad research design. Thinking about how the software will perform with different kinds of people under different lighting conditions should be a fundamental part of the QA process. Either they took shortcuts and didn’t complete the QA process — which is possible, maybe they were rushing to get it out for Christmas or something — or their QA team needs to be fired.

    2. The Angry Black Woman says:

      Good point. My thinking was along the lines of there being not even an unconscious bias, just a failure to grasp all the conditions under which their product might be used.

      Is being privileged and not recognizing it or the consequences of such the same as prejudice? My first inclination is to say no or, at least, not always, but I’m definitely willing to debate that point.

      1. Tablesaw says:

        I think I’ve keyed into this because it’s an issue I’ve been wondering about for a few months now. I was particularly struck by Samuel R. Delany saying that racism as a system “can be fueled by chance as much as by hostility or by the best of intentions.” And I’m also generally uncomfortable with the “it wasn’t racist/sexist/transphobic, it was a mistake” defense, as though it can’t be both. (LJ’s recent mishandling of the gender field is an example of this.)

        Like I said, I’ve been tentatively thinking about theory along these lines for a while (though I haven’t focused on it enough to go back to the library to find the book that I think might have some guidance). It seems to lead to unpredictable places theoretically, but it also holds people (and non-person entities, in the case of corporations and such) responsible for ending the system of racism more reliably.

        I’ve been writing and deleting a lot on this comment, which is about where I am in thinking about this. I have to go do seasonal shopping, but I’ll try to get back to the topic soon.

      2. Gaudior says:

        Both “prejudice” and “being privileged and not recognizing it” are serious problems – and the differences between them are important to acknowledge because they need to be tackled in different ways. They’re related, come from some of the same sources, and have some of the same results, but I think that calling them the same thing oversimplifies the situation, and so makes it harder to come up with useful analyses.

  3. Mia says:

    You know, blind people don’t generally have much to do with the everyday use or design of cameras. Did you mean ignorance or unawareness, or unintentional thoughtlessness, up there?

    1. nojojojo says:

      Mia,

      Ordinarily I would say you’re right to check us on the ableist language here — I used it too, thoughtlessly, which I shouldn’t have done. But as I now think about this, I believe it’s actually appropriate, since the problem we’re referring to is linked to colorblindness — which is specifically about willful blindness, the very conscious effort to pretend one doesn’t see something, to the point of absurdity. Somebody in this software development life cycle probably decided that the non-racist thing to do was to pretend that color didn’t matter, even though we’re dealing with technology that uses color (or reflected light, which = color) to function. So they neglected to diversify the test sample. Because “color doesn’t matter” leads to the racial default — e.g., “a sample set of only light-skinned people is fine”.

      And yes, the term “colorblindness” incorporates ableism. But it would confuse the issue to try and fix the root term when we’re trying to talk about its application. This isn’t “color unawareness”, or “color thoughtlessness”. It might not even be unintentional, though I guess we can give them the benefit of the doubt. Anyway, only “colorblindness” has the nuance and history to capture what’s happened here.

  4. Mia says:

    Okay. I can accept that.

    1. The Angry Black Woman says:

      What Nora said is what I was trying to get at over on the Alas thread before I had to go back to actually doing my job instead of messing around on blogs.

  5. Mia says:

    And regarding the actual topic of this post, I kind of think that the failings of Cyberlink and HP etc. are the result of a kind of privilege and institutional racism that made it possible to go through the whole development and testing process without considering (much less catching) the issue of darker skin tones.

  6. Pete says:

    Software such as this is based on the concept of pattern recognition… Pattern recognition works by using contrast to identify objects (quick transition from a light pixel to a dark pixel usually means a border)…

    If there isn’t enough contrast the software will have trouble identifying the object, in this case, the person’s face. This video is an example of a dark skinned person in low light conditions. This can also happen to a fair skinned person in a very brightly lit room with no exposure compensation.

    This does not mean HP did not test this feature with dark skinned people… this does mean that the software probably worked fine in the well lit labs that most high tech companies have… but it failed in the real world…
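
    A tiny sketch of what “using contrast to identify objects” means in practice (the file names here are hypothetical): the same face shot in dim light simply gives the software weaker edges to latch onto.

    ```python
    # Compare average edge (contrast) strength of two hypothetical captures.
    import cv2
    import numpy as np

    def edge_strength(path):
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0)   # horizontal gradients
        gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1)   # vertical gradients
        return float(np.mean(np.hypot(gx, gy)))  # mean light-to-dark transition

    print("well lit:", edge_strength("well_lit_face.jpg"))
    print("dim:     ", edge_strength("dim_face.jpg"))
    ```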

  7. Jonquil says:

    What you said, but also what your commenters said. Nobody at HP said “Our cameras shouldn’t show black people”; instead, their subcontractor very likely didn’t think to test on anybody but white people. It wouldn’t shock me if the “standard algorithms” HP refers to were developed by CS grad students who tested them on the people they knew and nobody else. In other words: privilege, unconscious privilege, and not thinking about nonwhite people because they didn’t *have* to think about nonwhite people.

    It’s very much like the problem with black people being underrepresented in medical research, which means that the standard recommendations for, e.g., high blood pressure are actually “standard white people” recommendations. That one, of course, is gravely complicated by people’s memories of Tuskegee and their justified lack of trust of the medical establishment.

  8. Lesley says:

    I’m really curious about this, and I’d love to see more video examples of this zoom system by other users. I used to own a webcam that supposedly had a face-tracking zoom ability, but I ended up returning it because it wouldn’t track my face either.

  9. Maysie says:

    The article and the comments have been very interesting so far, and I wanted to add an observation about something both odd and ironic about this fail.

    In a world where Black folks and all darker-skinned folks are targeted as criminals and harassed and “racially profiled” (aka systemic racism, d’oh), isn’t it one of those moments of racism biting itself in its own butt when, through the magic of light-skinned privilege and the erasure of those with dark skin, these same dark-skinned people are NOT able to be targeted through software that doesn’t see them?

  10. Benjamin Rosenbaum says:

    The question of whether it’s racism in the “result of institutional disparities of power and perceived worth” sense, not in the “malicious intentions” sense, seems to me to be solved by posing the question: “had there been a similar problem focusing on Northern-European-type skin in low-light conditions, would it have been fixed?”

    I’m betting it would have.

  11. alfred liwanag, Jr. says:

    HP did end up looking into the matter. HP e-mailed a statement acknowledging that the webcam “may have issues with contrast recognition in certain lighting situations”. Tests conducted by various sites indicated that, given appropriate lighting conditions, the webcam worked just fine.

    Racist HP Webcam: Tracks White People Only?
