December 23, 2009

The webcam that can't see black people.

"I think my blackness is interfering with the computer's ability to follow me..."



Wanda Zamen and Desi Cryer say they didn't really mean to make a viral video and cause everyone to talk about whether HP computers are racist.

84 comments:

Brian said...

Woo hoo, I'm the first to post (I hope).

It's a function of lighting. Lights in the background make the faces darker, and sorry, but skin tone matters. The software is trying to frame the face, which means it's probably looking for eyebrows, eyes, mouth, etc. So it needs a minimum contrast on these facial features to work. The software can probably be tweaked to make it easier to follow a darker skinned person, but proper lighting is still required.

Who knew photography was racist?
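
Brian's "minimum contrast" point can be sketched in a few lines of Python. This is a toy illustration, not HP's actual code: the `face_like` check and its threshold are invented for the example, but they show why an exposure that crushes the face into a narrow dark range defeats a contrast-based detector.

```python
# Toy contrast check in the spirit of feature-based face detection
# (hypothetical -- not HP's algorithm). The detector "fires" only if the
# eye band is measurably darker than the cheek band below it.

def mean(values):
    return sum(values) / len(values)

def face_like(eye_band, cheek_band, min_contrast=20):
    """True if the cheeks are at least min_contrast gray levels brighter."""
    return mean(cheek_band) - mean(eye_band) >= min_contrast

# Well-lit face: eyes around 60, cheeks around 145 on a 0-255 gray scale.
print(face_like([60, 55, 65], [140, 150, 145]))  # True

# Same face, heavily backlit: the camera exposes for the bright background,
# so the whole face is crushed into a narrow dark range (eyes ~20, cheeks ~30).
print(face_like([20, 18, 22], [30, 32, 28]))     # False
```

Either face would pass under even front lighting; only the exposure differs.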

Edgehopper said...

There was a whole "Better Off Ted" episode based on this.

More seriously, it's probably a lighting issue. It looks like they're lit from behind in that video, and of course black people reflect less light than white people even when well lit. I'd be interested to see if it works if you shine a light directly on the black guy's face.

MadisonMan said...

Bug, or feature?

Unknown said...

Is it because he's black or because she's unattractive? Too many variables.

The Crack Emcee said...

"Wanda, get back in the frame,..."

Insane.

Freeman Hunt said...

I was going to say the lighting thing too. If he puts a light behind the computer to light up his face, it should work.

Phil 314 said...

Cue Kanye West

rhhardin said...

Imus's Tony Powell complained about his baby pictures, where all you could see was a black empty space wrapped in a blanket.

It's a Kodak problem too.

HP has the technical question of how to say delicately that black comes from being black, photonwise.

Rialby said...

I am still amazed at viralness. Who would have thought 15 years ago what kind of power the internet would put into the hands of the common woman or man?

Richard Lawrence Cohen said...

"The technology we use is built on standard algorithms that measure the difference in intensity of contrast between the eyes and the upper cheek and nose," wrote Tony Welch, the lead social media strategist for HP's Personal Systems Group. "We believe that the camera might have difficulty 'seeing' contrast in conditions where there is insufficient foreground lighting."

In other words, it's black people's fault because they don't have standard skin color the way white people do. Whenever they use our product, they'll have to set up special lighting to make up for their regrettable deficiency. And if the majority in the US were black, the same algorithms would have been adopted as the standard, so the majority of customers would have to set up special lighting.

Ron said...

Didn't they get memo that HP now stands for Honky 'Puters?

lucid said...

I love this kind of post. It exposes all of the mind-numbing and dis-thinking that goes on in our attempts to talk about race.

While affirmative action policies no doubt have achieved some worthy goals, their conceptual underpinnings have thoroughly corrupted our ability to make any kind of sense when we try to talk to each other about race. It is way past time for them to go.

Big Mike said...

One of the more insidious consequences of Microsoft's domination of the world's software industry is that programmers think they don't need to test their software.

It could be worse. One of my acquaintances told me he and his team were checking out some new face recognition software they got from a university. And they discovered that it couldn't distinguish between African-American faces -- the software literally "thought" that all Black people look alike.

That's what they get for using grad students to write their code.

Freeman Hunt said...

RLC, what else would you propose they do? To recognize a face, the computer is going to have to be able to see the face. If you backlight anything, it's going to be harder to see. It's especially hard to see something dark that's backlit. Does that make the reality of light racist?

Unknown said...

Since it is a lighting/contrast issue, Desi is technically correct. His blackness is interfering with the computer's ability to follow him.

David said...

Richard Wright was ahead of his time.

Andre said...

Freeman,

The thing is, this shouldn't be the kind of error that they miss. It's funny to me as a black man that it's obvious they never thought to test their algorithms on black people. Of course a contrast based program is going to treat blacks differently from whites. QA is about catching that and fixing it before some goofball on youtube forces you to announce "Our webcams aren't racist."

Anonymous said...

Freeman,

HP can adjust their algorithms and/or install better cameras. Cameras are pretty remarkable and sensitive instruments these days.

It's interesting only a few people get it. RLC for one, and Big Mike the other.

What this is is a failure of testing.

Ann Althouse said...

I like the interaction between Cryer and Zamen and the lightheartedness of the demonstration.

***

I've seen embarrassing school facebooks where the black students' features are completely in the dark. TV used to have the same problem. It's an issue of lighting, but with this computer feature, it's a matter of the settings.

bearbee said...

Negro?!!

Racist!!

Bug, or feature?

My thought also.

Many reasons why one would not want to be tracked by a camera.

Brian's explanation makes sense.

Try tracking tanned whites, Asians, Middle-Easterners, Hispanics.

bearbee said...

Richard Wright was ahead of his time.

The Invisible Man?

Ralph Ellison

Cedarford said...

"David said...
Richard Wright was ahead of his time."

But Gordon Parks figured it out, photographing blacks right, so why can't the Chinese and Japs that design & build all the high tech electronic devices?

You have "Twilight Mode" and "Candlelight Mode"....on your selector function. Why not "Negro Mode" or "People of Serious Color Selector"?? Probably wouldn't be PC, but it would DAMN SURE get used!

I had a winter get together with 3 white families and two black ones. The group shot was taken with a Canon AE-1 seventeen, eighteen years back. The photo was worse than perfect shots of whites and black silhouettes...because the black subjects faces & eyes and smiles came through in dazzlingly good detail.

On the other hand, the photo was damn funny. And one black family kept a copy as a "Oh, no! That's bad!" kind of thing. Even framed it.

We did get a chance to reshoot a few weeks later, as I explained the "racist Canon camera" was to blame...but it could be a nice group photo we should try again for. Same snow, same people more or less (one white family couldn't make it), with an Asian oddball thrown in (friend of daughter's).
But I put them against a darker background with plenty of snow in foreground and went with several shots inc. manual one stop down and manual flash. The blacks, whites, and the stray Asian ALL came out fine.

NotWhoIUsedtoBe said...

When I was in the Navy I traveled in my dress blues. They were black wool. I noticed that airport bathroom plumbing sensors couldn't see me, leading to a lot of hand waving.

David said...

Apologies to Mr. Ellison and thanks to Bearbee for the low snark correction.

Cedarford said...

Rats, typo!

"...because the black subjects faces & eyes and smiles came through in dazzlingly good detail"

Should read

"...because the black subjects faces were featureless except that their eyes and smiles came through in dazzlingly good detail.

Scott M said...

This is perfect timing for such a story simply because we here in the United States don't have enough racially-charged baggage to tote about.

Shanna said...

The thing is, this shouldn't be the kind of error that they miss. It's funny to me as a black man that it's obvious they never thought to test their algorithms on black people.

Is this a new problem with new software, or is this a pretty standard problem across the board? Would a program that makes black people easier to see screw up the pictures for white people?

The Crack Emcee said...

"Wanda, get back in the frame,..."

Andre said...

Shanna,

I wondered the same thing, so since it's a slow day in the office I've been playing with the Logitech webcam we use for web conferences. So far it tracks me just fine. Or at least no noticeable difference between me and my lily-white coworker. This isn't an industry wide "problem" whatever it is.

campy said...

A wise latina programmer could figure out how to see all faces perfectly.

Ken B said...

Amazing. A device that depends on reflected light doesn't work when the lighting is wrong. Conclusion: racism.

Beyond parody. I mean you, Richard L Cohen.

halojones-fan said...

I'm more concerned about a putatively professional media producer (CNN) confusing "it's/its" in a for-publication work product.

As for the webcam: Add a light, problem solved. Offer a free coupon for a five-dollar clip-on light to anyone who already bought one of these things.

DADvocate said...

I always had me suspicions about those guys at HP. Now they've been found out.

former law student said...

Black people are darker than white people -- there, I said it and I'm not ashamed.

This reminds me of when my aunt took poorly-lit Polaroids of us kids years ago. She showed them to a coworker, who asked if her sister had married an Italian, because our pasty skin appeared swarthy in the pictures.

Ignorance is Bliss said...

lucid-

This seems like an odd story in which to complain about affirmative action. While I disagree with AA, this is pretty clearly a case where they didn't test it on blacks. It was most likely developed in an all-white developer & QA organization. It's a case where they should have hired someone based on their race, because their race affects how they do this (QA) job. ( Or at the very least, they should have called 1-800-RENT-A-NEGRO. )

Andre said...

Here let me be clear:

Stupid. Not racist.

Though in (minor) defense of RLC, HP's response was sad. "Standard Algorithms" that assume everyone is white are...unfortunate. Find a better way to word that. Isn't that why companies pay PR companies?

former law student said...

It was most likely developed in an all-white developer & QA organization.

No. Knowing HP these days, it was most likely developed in an all-Asian developer and QA organisation. HP needs to find some local black folks in China.

Freeman Hunt said...

Everyone's blaming testing and the algorithm, but do we know how it works without heavy backlighting? Maybe it works just fine under regular lighting conditions where there's more light on people's faces. Also, maybe if they adjusted the camera settings. (I think this is what Althouse was referring to.) I have to do that on mine because I'm backlit at my desk.

Unknown said...

Ann, i knew you'd post this, and i have to admit, i was looking forward to people going nuts over this !

Desi is hilarious !!

i wonder if they knew about the bug when they put the product on the market. If not, they really need to improve their software testing ! If so, i wonder if the gains in profits over Christmas will be as substantial as they predicted, esp. in light of all this (bad ? good ? funny ?) publicity.

The Counterfactualist said...

In other words, it's black people's fault because they don't have standard skin color the way white people do. Whenever they use our product, they'll have to set up special lighting to make up for their regrettable deficiency.

Or, perhaps some black people do not properly light themselves when they speak in front of a webcam. They do not need special lighting, just proper lighting. They do not think skin color matters, so they light themselves inadequately.

Scott M said...

Unfortunately for ideologues, physics can be a real bitch.

Shanna said...

Though in (minor) defense of RLC, HP's response was sad. "Standard Algorithms" that assume everyone is white are...unfortunate.

That was what made me wonder if they reused some sort of code from other stuff, which made me wonder if this was a problem with all webcams. If it wasn’t, then it’s just BS’ing.

Or it could be the lighting, but I haven’t been able to see the video yet, so I don’t want to comment on that.

bearbee said...

Apologies to Mr. Ellison and thanks to Bearbee for the low snark correction.

Sorry. Snark not meant. Was not certain of your reference, hence, the question mark.

Read Ellison, along with Wright, Baldwin, others, eons ago. The Invisible Man left a profound impression.

One of the best of 20th century American novels.

Sigivald said...

27183: It's a webcam. Making it much more sensitive would price it out of the market.

It's probably not a matter of just trivially "adjusting the algorithm". Generic cheap webcams don't provide the data required, and can't for cost reasons.

And if you try to winkle the appropriate responses for someone with backlighting and really dark skin in... I bet with 99% certainty you introduce more problems than you solve for the normal case of not backlit and higher contrast.

"Nobody tests, test more"... no.

Big Mike's blather about Microsoft is snarky and biased at best, and simply wrong at worst.

People test (especially big companies like HP). Lots of testing happens.

Bugs still get through because bugs always get through in anything where you don't have an immense budget, immense time, and a strictly controlled system (e.g. avionics, where they have multiple independent implementations run against a 100% coverage mandate of automated tests). Being proven correct is the only way to ensure zero bugs - and it's a very expensive bar to meet.

In the real world for consumer hardware and software you can't do that.

You can't test against all possible hardware and software combinations (even only where it matters)*, and you can't guarantee results - in the avionics case they have a fully defined test suite and ensure the code matches the outputs.

(* Yes, HP has some advantages in that they know what hardware they're shipping and what software it comes with. But then users always modify both, and you can buy an HP camera and software for a non-HP computer...)

How can you do that with a webcam? Any failures would just be blamed on HP Not Picking The Right Test Faces Because They're Racist.

They could spend 100 times as much on testing and QA and they'd still get whined at.

Can you imagine why they don't bother? It's not remotely worth it to still not stop some internet douchebag from making "ur racist!@!#" videos.

Celia Hayes said...

It is all about the lighting, when doing straight photography or video; I knew this, back when doing news or training videos for the military. If your subjects were white, then the usual straight lights and reflectors (which put out a bluish-tinged light) would do well enough. But if your subject was very dark, you would have to use gels on the lights to get a more reddish or orange tone, which would be more flattering.

Beta Conservative said...

I would give the face tracking software a B+.

Shanna said...

Oh! A post about HP and I completely forgot to spread my message that my HP laptop completely crapped out on me after barely over a year, and despite this being a problem with enough of them that they had to extend the warranty, it wasn't extended for my specific PO number or something so they just wanted me to eat it.

But, because I bought it on AmEx, the manufacturer's warranty was doubled. And the guy at AmEx had talked to enough people with the same problem that he knew immediately that it was an HP and exactly what was wrong with it! Now I have a Toshiba.

So, bad publicity for HP makes me very happy! HP SUCKS!

Your Correspondent said...

Actual Legal Question
Can HP now advertise for blacks to help test their stuff? How do they do this without violating some law and getting sued by some white gold-digger?

Less-serious Tech notes:
Black people are less reflective than white people.

And white people have a higher albedo.

Dark Eden said...

I would love to be a fly on the wall in the offices of HP right now.

Brian said...

Metternich said:
And white people have a higher albedo.


Hot Damn! We can stop the AGW crowd in their tracks, and they'll be unable to do anything about it!

We need to get East Anglia CRU folks on the case, and research the effects of skin tone on global warming. When they come back with even the smallest indication that darker skin means warmer temps, the whole kit and caboodle will be thrown out as racist.

Every politician in the country will have to disavow ever knowing anything about global warming.

Sofa King said...

We need to get East Anglia CRU folks on the case, and research the effects of skin tone on global warming. When they come back with even the smallest indication that darker skin means warmer temps, the whole kit and caboodle will be thrown out as racist.

Or maybe they'll decide we need more white people. A LOT more white people. But, ah, with the proper breeding techniques and a ratio of say, ten females to each male, I would guess that we could then work our way back to the present mean global temperature within say, twenty years. It is, you know, a sacrifice required for the future of the human race. I hasten to add that since each man will be required to do prodigious... service along these lines, the women will have to be selected for their sexual characteristics which will have to be of a highly stimulating nature. Mr. President, we must not allow a white people gap!

KCFleming said...

Clearly, 12.8% of face recognition software programmers need to be black, and the white and Asian workers need reeducation.

A Black Software Studies Chair underwritten by hp would help.

mariner said...

I suspect HP's best option is to simply ignore this.

About 13% of the American population is black. I suspect that half or fewer (a SWAG) are in the market for webcams. That's 6%.

I certainly hope that most (or at least many) of those are smart enough to understand that webcams are not racist.

Synova said...

I would hope that they test the recognition algorithms on a whole bunch of very different looking people. No doubt now they *will*.

I agree with Freeman that it's likely that the bright back-lighting of the store may have played a significant part of messing up the software. Taking pictures toward a bright light source never works well.

Big Mike said...

Big Mike's blather about Microsoft is snarky and biased at best, and simply wrong at worst.

Them's fighting words, sonny. I'm guessing that you either (A) work for those folks, in which case you've drunk the Kool-Aid; or (B) you've never used their software. (In particular, you either never used Vista or are the only person in the Universe who did and found no stupid bugs.)

I also work for a firm that develops software, and I want to assure you that people don't test, or, to be precise about it, hardly any programmers are capable of testing their own code very thoroughly. You need a testing organization that is empowered to spend a lot of time thinking about ways to break the software and a reward structure that motivates them to do so. Unless Microsoft has radically overhauled their approach to developing and testing new releases from the timeframe documented in Pascal Zachary's Show-Stopper (which Vista suggests they haven't) then testing is not a high priority.

I used to think better of HP, but not after this webcam software. As far as this face tracking software is concerned, I can only assume that it was not tested particularly thoroughly using ordinary light sources (including fluorescent, incandescent, daylight, and mixes thereof), and that they used few, if any, African-Americans as test subjects. If anybody got paid a bonus for releasing this software, they should be forced to give it back.

Your Correspondent said...

When I worked closely with Microsoft, they had a very large testing operation. They have programmers who write test code and several different kinds of user-testing.

Different groups do things differently within Microsoft, though, and I'm not an eyewitness to the Vista project, specifically.

Testing always is pushed from one side by the programming team that wants to take more time and the marketing team that wants to deliver early. So any organization can find that testing gets crunched.

Chip Ahoy said...

Ha Ha Ha Ha Ha ... eh.

When the man steps forward he blocks the overhead lights behind him, which you'll notice are blowing out the highlights, and the sensors can then distinguish the shades of his skin tone. When he steps back, the bright overhead lights take over the extreme right end of the histogram. They're way overexposed, and his own color tones become crammed, along with the darkest things that hit the sensors, at the opposite end of the histogram. RACIST!

The histogram can be tweaked (slider) oppositely so that the same thing happens in reverse to lose the detail of the woman's skin tones, and likewise it can be tweaked to a happy medium.

My relatives all use their phone cameras for facebook. *whispers* Their photography is dreadful mostly because they haven't bothered trying to understand their cameras.

‹anecdote alert›
James wanted to do the dawn at Haleakala thing. (It's freaking freezing up there predawn) A crowd of tourists was assembled waiting for the sun. A tourist asked me if I would take a picture of him and his wife with the dramatic sunrise behind them. He handed me his camera. I checked the camera's menu. It lacked a backlight feature that would compensate for that situation. I told him they cannot have the photo they're imagining, they must step to the side or else they'll be reduced to mere silhouettes. He said I must be professional. I go, "Hardly."
‹/anecdote alert›
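
Chip Ahoy's histogram tweak can be put in numbers. A minimal sketch, assuming a simple gamma lift stands in for the camera's backlight-compensation slider: raising the dark end of the histogram spreads out the crammed shadow tones and restores eye/cheek contrast, while barely moving the bright end.

```python
# Hypothetical sketch of "tweaking the histogram": a gamma lift
# (out = 255 * (in/255)**gamma, with gamma < 1) brightens shadows much
# more than highlights, spreading out gray levels crammed at the dark end.

def gamma_lift(pixels, gamma=0.5):
    return [round(255 * (p / 255) ** gamma) for p in pixels]

# Backlit, underexposed face region: eyes vs. cheeks differ by 10 levels.
eyes, cheeks = [20, 18, 22], [30, 32, 28]

spread_before = sum(cheeks) / 3 - sum(eyes) / 3
lifted_eyes, lifted_cheeks = gamma_lift(eyes), gamma_lift(cheeks)
spread_after = sum(lifted_cheeks) / 3 - sum(lifted_eyes) / 3

print(spread_before, spread_after)  # the eye/cheek spread grows after the lift
```

The same curve pushed the other way would crush the bright end instead, which is the reverse effect described above.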

bearbee said...

Laptop Magazine tested it and found that when adjusting the Backlight Correction, the webcam will respond to various skin tones. Scroll down for 2 videos.

HP Face Tracking Software Not Racist, Just Contrast Challenged

MadisonMan said...

HP should be using the infrared part of the electromagnetic spectrum.

KCFleming said...

MM, ssshhhh.

There's a university Black Software Diversity Chair just screaming for endowment.

Sofa King said...

HP should be using the infrared part of the electromagnetic spectrum.

But the CCD chip is designed to detect visible wavelengths, since after all we do want an image on our computer, not a thermogram. The tracking software only has the incoming image to work with, so I don't think this idea is feasible.

MadisonMan said...

Well, yes, you would need two detectors -- a visible one to show the image, an infrared one to do the detection. Is cost an object?

Dust Bunny Queen said...

HP needs to find some local black folks in China.

I believe that one of Obama's half brothers lives in China.

former law student said...

Is cost an object?

In a consumer product? I'm gonna say yes.

Joe said...

One of the more insidious consequences of Microsoft's domination of the world's software industry is that programmers think they don't need to test their software.

That has nothing to do with Microsoft and everything to do with how people think and work. It goes way beyond software and computer hardware.

Deming had an awful lot to say about testing and quality control.

Penny said...

"There's a university Black Software Diversity Chair just screaming for endowment."

That was really funny, Pogo...until it became a wry head shaker.

Penny said...

But hey! Wry AND rye is a good thing.

Heck, Althouse even uses it in her winter flower box.

Freeman Hunt said...

They should have trained a gifted gnat for each computer camera to recognize the faces and follow them by use of a tiny joystick.

Ned said...

Dude is guaranteed to be a failure the rest of his life...all because of the webcam...go call Jackson and Sharpton, you complete loser.

I'm Full of Soup said...

A lot of youse commenters are brainy AND funny. And I feel the need to compliment you on your talents at least once a year [and go thank your parents if you still can].

Merry Christmas, Happy Hanukkah, Happy Kwanzaa, Happy New Year and Happy Solstice/Festivus or Happy Nothing [pick one or two or as many as you want] to all of you!

blake said...

Well, first of all, of course, Microsoft tests their software. But other things besides quality have precedence, and always have. Quality is job 12 at MS.

Their success has never had to do with quality so, you know, there's a limit to what it's worth to them. Consider how many products Microsoft has released whose quality was so bad, they would have killed a company without a monopoly. (In fact, many of MS's competitors were killed by products not half as flawed.)

You can't really blame them for this, since the market rewarded all this behavior. (Well, you can blame them for the stream of illegal and unethical behavior, and you can blame the press for being in their pocket the whole time.)

Nobody needed Windows ME or Vista or Hailstorm or Bob, etc. Well, not nobody: Microsoft needed them. Their customers didn't.

It's also true that this has created a mentality of testing what the consumer will accept as far as bugs go. This is especially bad in the gaming world. And the Xbox has probably made it worse. (If your console doesn't have a hard drive, the game you ship on the disc is the one people play. No patching!)

That's not terribly relevant here, though. First, it's not like there aren't any dark-skinned Asians (assuming this came from Asia).

Second, are we gonna attack the voice recognition software next because certain timbres—as a matter of physics—are easier to pick out from background noise than others?

'cause I know I feel discriminated against on that front.

Wince said...

Let me be the contrarian here and be the first to suggest it may be that the HP computer is trying too hard not to be racist.

In stores, black customers often feel discriminated against: why? Because they are being followed.

Similarly, people "driving while black" often complain about being profiled and followed by the police.

Yet the complaint now is that the HP web cam isn't profiling the black man and following him?

Make up your mind!

Maybe the HP computer is being sensitive after diversity training?

I'm glad the fellow in the video did the whole thing in such good humor with his co-worker.

sammy small said...

What these cameras need is an algorithm known as Global Local Area Contrast Enhancement (GLACE), sometimes known as Local Area Processing (LAP). This is common on high end infrared sensors and addresses similar symptoms in the IR imaging band when sensing broad areas of temperature differences.
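
The local-area idea can be sketched roughly. A minimal sketch, assuming per-tile linear stretching stands in for the (much more sophisticated) GLACE/LAP processing: each tile of the frame is remapped to the full gray range on its own, so a dim face keeps usable contrast regardless of a bright background elsewhere in the frame.

```python
# Rough per-tile contrast stretch (hypothetical -- not the real GLACE/LAP
# algorithm). Each tile is linearly remapped to span 0-255 independently,
# so local contrast survives large brightness differences across the frame.

def stretch(tile):
    lo, hi = min(tile), max(tile)
    if hi == lo:
        return [0 for _ in tile]  # flat tile: nothing to stretch
    return [round(255 * (p - lo) / (hi - lo)) for p in tile]

def local_contrast(tiles):
    return [stretch(t) for t in tiles]

# One dim "face" tile and one bright "window" tile from the same frame.
frame = [[18, 20, 22, 30], [200, 220, 240, 255]]
print(local_contrast(frame))  # each tile now spans 0-255 on its own
```

A global stretch over the whole frame would do nothing here, since the frame already spans the full range; the benefit comes from working per tile.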

Alex said...

Where's Jesse Hi Jackson?

The Crack Emcee said...

"Wanda, get back in the frame,..."

Penny said...

"Wanda, get back in the frame,..."

Now STOP that, Crack! This is the third time today you made me giggle with that line.

Well, OK, it's not YOUR line exactly, but you know what I mean!

Shanna said...

I'm glad the fellow in the video did the whole thing in such good humor with his co-worker.

The video was funny (although it could have been half a minute shorter) and I don't blame the guy. He gets a computer he thinks does one thing and it doesn't. He seems to be joking about it, not ready to call Jesse.

I will say that initially, before seeing the video, I thought you couldn't see him and now realize it just isn't following him. Is that a normal function in a webcam? Following you around? That's not as big a problem as it would be if you couldn't see him.

Sofa King said...

Is cost an object?

I'm sorry, I forgot you live in Madison.

MB said...

It's not racist - it's sexist! It is only interested in females.

MB said...

This comment has been removed by the author.

XWL said...

I think this slide scan from a Christmas '74 picture of me and my parents illustrates the difficulties in capturing fair and dark skinned folks with the same lighting.

The Crack Emcee said...

"Wanda, get back in the frame,..."

As I see it, "Wanda" is part of the problem here - it should've been she who said a camera's algorithm can't be racist.

It reminds me of an incident, years ago, when I used to greet other black men by saying, "What's up, brother?" I did it one day, while in the presence of a German friend that I shared everything with - food, music, intimacies - and he remarked with a stare, "He's not your 'brother', I am!" I thought long and hard about that one, but I "got" it.

White people play a role in racism, and what role they decide to play is very important. Are you a part of the problem or part of the solution?

"Wanda" is part of the problem - a white liberal supporting this "smart" black guy in his insane theory - and she deserves to be in the frame.

It is a funny line, though. I used to hang out with a lot of professional comedians and always marveled at the "punch lines" that can tickle you with no set-up but their delivery. As a songwriter, I'm always looking for conventional phrases that have that recognition factor. David Byrne does as well. I understand he collects them on strips of paper and then moves them around, like word magnets on the refrigerator, until he has the lyrics for a song. I usually collect my own phrases that have that aha! moment for me. Fun fact:

Many times, for the blog, I'll steal my banner subtitles from things people say here.

Y'all didn't know that, did you?

blake said...

I think Crack just admitted to cultural theft.
