Google Says That It Is Improving How Its Phones Photograph Black Skin

Google is changing the way that its Android smartphone cameras process darker skin tones in order to address historic problems relating to how people of color are portrayed in photographs.

Google’s Android VP Sameer Samat announced that the company is working on making its phones improve the way they render darker skin tones and different types of hair, drawing on a diverse range of experts as part of its development.

“As part of our ongoing commitment to product inclusion, we’re working to make technology more accessible and equitable,” Samat explained. The changes are expected to appear on phones later this year.

In the past, photographic processes have been geared towards lighter skin tones, as evidenced by the Shirley Card developed by Kodak, which was used to calibrate colors when processing images. More recently, even hugely respected photographers have come under criticism for struggling to portray dark skin, as with the Annie Leibovitz photograph of Simone Biles published on the cover of Vogue magazine last year.

Even in the digital era, technology has often failed to cope. Early webcams failed to track the faces of people with darker skin, and Twitter’s thumbnail algorithm continues to favor white faces over Black ones.

In a tweet, YouTuber Marques Brownlee welcomed the development, explaining that these changes had the potential to improve every smartphone camera.

14 Comments

Kirk Darling

That "Shirley Card" story is a myth. Color film technology preceded the creation of the Shirley Card by a half century. Color emulsion scientists have always pursued /accurate/ color, not Caucasian skin tones. Portraits weren't even the primary color film market in its first half century. Physicists, the military, horticulturalists, and biologists were some of the early critical markets, and they wanted accurate reproduction of all colors.

Color film was being produced by multiple companies in the US, Europe, and Japan, and for sure Fuji emulsion scientists were not using Shirley Cards. The Shirley Card was created by Kodak only after Congress forced Kodak to break up its virtual monopoly on color print processing in the 50s. The Shirley Card helped ensure Kodacolor print processing by non-Kodak labs would match that produced by Kodak's own labs.

The most important part of the Shirley Card was not the Caucasian skin, it was the gray and color patches that could be accurately read by a densitometer.

Lee Christiansen

To be honest, as a seasoned pro photographer, I struggle sometimes with how to accurately portray darker skin tones.

Our eyes work differently to cameras and metering. A technically accurate meter reading doesn't always show how we perceive the world around us because of how our eyes/brains calculate shadows and highlights and dynamic range, and because we also interpret movement to add detail and definition to our surroundings.

So even though my meter is accurately calibrated to each individual lens' characteristics (it takes an age to do this), I'll often find a meter reading doesn't quite cut it.

And then of course we have how our Raw processors handle the tones and colours. I use C-1, which is great, but even it has a few quirks which can mean perhaps the highlights need a little help or the contrasts need a tweak.

And don't get me started on how different screens can deliver. So I work with a very accurate Eizo CG319X, but much of the world isn't so blessed and I have to deliver images that can survive poor displays.

Typically I've found that metering and then overexposing by between 1/2 and 1 stop, depending on how dark a subject's skin is, will give a more faithful representation of what my eyes perceive. Yes, it goes against the laws of optics, but when I'm shooting tethered and the subject is before me, it makes for a better correlation.
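
A minimal sketch of that rule of thumb, assuming a hypothetical helper that takes an estimate of the subject's skin luminance; the 1/2 to 1 stop range is from the comment above, while the luminance range and the linear mapping are assumptions for illustration only:

```python
def exposure_compensation(skin_luminance: float) -> float:
    """Extra exposure in stops, relative to the meter reading (illustrative only)."""
    darkest, lightest = 0.2, 0.6  # assumed linear reflectance range for skin
    t = min(max((skin_luminance - darkest) / (lightest - darkest), 0.0), 1.0)
    return 1.0 - 0.5 * t  # ~1 stop for the darkest skin, ~1/2 stop for lighter

# Example: if the meter says 1/125 s, ~1 stop of compensation means shooting
# at roughly 1/60 s (or opening the aperture by one stop instead).
print(exposure_compensation(0.25))  # ~0.94 stops
print(exposure_compensation(0.55))  # ~0.56 stops
```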

Interestingly as a DoP in TV, I tend to make this adjustment less - probably because there is that additional movement in the frame to add information.

(One of my bugbears in film / TV lighting is when the DoP has gone all "natural-one-light" to be highly realistic... forgetting that eyes and cameras see the world differently and "natural" is usually created with quite unnatural lighting.)

One of the issues is our point of reference. I'm white (well, a pasty pink hue...), so shooting Caucasian skin is an easier reference for me, because, well - I'm covered in the stuff.

But if I'm shooting someone with darker skin or perhaps more olive skin, then I'm either relying on the absolute accuracy of my laptop when shooting tethered, or my memory - and that's a variable these days... :) And although many will cry "ColourChecker," we all know that isn't a perfect solution either.

I've learnt that the combo of camera profiles, metering (even when accurate), and things like curves / contrast / black points, can affect an image - and it can become more critical when these elements are away from the mid point of the tonal range. An error in a mid grey can often be less critical than an error in darker shadows, for example - particularly with the variance of screens and prints out there.
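
As a rough illustration of why the same small tonal error matters more in the shadows than at mid grey (the reflectance values and the size of the error below are assumptions, purely for illustration): a fixed offset is a much larger shift, measured in stops, near black than near the mid point.

```python
import math

def stop_shift(value: float, offset: float) -> float:
    """Change in stops caused by adding a small linear offset to a tone."""
    return math.log2((value + offset) / value)

mid_grey = 0.18   # linear reflectance of mid grey
shadow = 0.03     # linear reflectance of a deep shadow
error = 0.01      # the same small processing error applied to both

print(stop_shift(mid_grey, error))  # ~0.08 stops: barely visible
print(stop_shift(shadow, error))    # ~0.42 stops: a clearly visible shift
```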

Now when I get Caucasian skin which is a bit red or blotchy, I can tweak the colour and tone of that skin without (probably) causing offence or issues. It seems there is a great deal of latitude in this respect, and we see all sorts of "interpretations" of what Caucasian skin looks like.

And in the field of headshots, I see a horrible range of overexposure, bleached skin, etc., which seems to be a "style" of many photographers. So in many cases, tonal representation goes happily out of the window.

I find with darker skin, there is a very wide range of how dark the skin may be, and a wider range of colour too. When I searched for makeup in the darker tones, I was dismayed how this entire sector is so poorly represented - there seems to be a crazy assumption that black skin only goes so dark, and always seems to have a reddish tint... and of course this just isn't so. (I've been putting makeup on my subjects for years now and have an extensive kit - but I've had to work harder to find solutions for darker skin, which is frustrating. The makeup industry should be ashamed.)

So with this very wide range of skin tones and colours, I find myself second guessing myself after a shoot. Even if my tethered laptop is calibrated, it isn't the same as controlled conditions and my Eizo. I'll find myself wanting to make changes, (not forgetting how cameras and meters trick the eye).

With product work I can keep the product as a reference under a proofing light, but I find clients are less willing to stay with me for a few days just so I can make reference checks - ha.

Last year I had a lovely client who asked me to shoot her 50th birthday party. A grand affair with some very complex lighting, where the combo of gelled flash and ambient lighting meant that shooting was always a best guess and required some tweaking later. My client was dark skinned, her husband very dark skinned, the kids a slight mix and some with more makeup on (see my comments on the makeup industry), and their guests had every skin tone and colour you could think of - so no reference points.

I had panic attacks that I was accidentally making my client too light, or maybe too dark, or maybe the contrast range between guests wasn't correct... I'm sure at some point I got it close enough, but even shooting in the same room with the same people, I found I had to work hard to keep consistency across the pages of the albums.

Recently I did a headshot session with a black lady who wanted retouching. And she also wanted a dark background. So the challenge was to have a great shot which looked like her, and also translated across several mediums - including poor screens. Again I had a personal struggle with myself always worrying that I'd changed her skin colour... was it olive, was it more brown...and then that combo of quite dark skin against dark background but having it look great everywhere and not just on my calibrated screen.

I find I worry less with Caucasian skin, possibly because I find there is more latitude when making things a bit too light. I'll tend to err that way and make skin "pop" rather than a flatter dynamic. And because I guess I'm also conscious that we've not had years of Caucasian skin being as badly represented as darker skin.

I'll welcome any developments that phone companies can bring to give us more accurate skin tones. Heck, with some of the underexposed / backlit pictures I've seen, I'll welcome anything... :) With AI and facial recognition it is going to be easier than the old days of guessing an average exposure. I wonder whether C-1 and LR may start offering us something similar (by which time I'll be retired, so they can make it as easy as they like for the amateur... ha).

I think though, in all this, it needs to be remembered that replicating the human face is always going to be difficult - especially with such a wonderfully diverse set of colours and tones. Unless there is some obvious botch-up, I'm usually prepared to accept some tolerances in how those faces are captured.

Andy Day

This should be an article.

Mike Shwarts

It is good that AI can do this now, but "racially inclusive" sounds like Google advertising. Phones, or other cameras, are not biased. They are dumb tools. Most people don't know they can use exposure compensation and other settings to get photos that are more representative of what the eye/mind sees. They use their phones as point-and-shoot devices.

While the new technology is good, tying it to racial issues is dumb.

Andy Day

An inanimate object cannot be biased. However, technology is always a product of society, and the social processes that shape its creation can contain biases.

Kirk Darling

The "bias" has always been toward achieving accurate color, not good Caucasian skin tones. Imaging scientists have never set out to create film or sensors that were biased toward or against skin colors. They use carefully controlled color patches, they have always done so...and you know that.

Alex Herbert

They didn't set out to create a bias, but then you don't always have to, do you? These things 'just happen' because people aren't consciously being inclusive.

I reckon Marques Brownlee would know what he's talking about, given that he is dark skinned and has tested probably every smartphone camera to come out in the past decade or so.

Kirk Darling

It has nothing to do with being "inclusive." It has to do with creating accurate color and dynamic range. Sensor scientists should be aiming for accurate color and greatest dynamic range, not "inclusivity."

And what would that mean? Human beings come in a great variety of skin tones. Black people are not one color, or even one shade. We can be a variety of shades with yellow, red, or even purple undertones. "Inclusiveness" is absurd. Accuracy should be the goal.

The issue has really been with photographers, since we have been able to simultaneously capture details in a bride's white dress and the groom's black tuxedo for quite a while now. The failure to do the same thing with black and white people in the same image has been the photographer's lack of skill. I've been photographing black people for 50 years, and I've been able to do it. And I've been both black and photographing black people longer than Marques Brownlee has been alive.

Alex Herbert

This is about the auto mode on smartphones, not teaching professional photographers how to properly expose an image. Did you read the article?

Kirk Darling

The first sentence of the article reads: "...in order to address historic problems relating to how people of color are portrayed in photographs." It further says, "...In the past, photographic processes have been geared towards lighter skin tones...."

Those assertions are specious. The "historic problems" have not been with some kind of implicit bias engineered into the technology against people of color, as the article implies. The historic problems have been with the capabilities of early color technology (primarily dynamic range) and photographers who were unable or unwilling to manage the technology's shortfalls in the ways that were available and well-known.

If they can further advance the technology in color accuracy and dynamic range to overcome lack of skill on the part of photographers...well that's what photo-technologists have been striving for all along. It has nothing to do with some newly awakened social consciousness to become "inclusive."

Mike Shwarts's picture

I agree to a point. But in the film days, it wasn't always about accuracy. That might have been true for specialized film for scientific use, but other than that, Kodak, Fuji, Agfa, etc. created unique films. Most if not all companies made multiple color films, all with different looks. Well-known examples were Kodachrome with its strong reds and yellows, Velvia biased toward greens and blues, and Ektachrome, which had four different ISO 100 emulsions.

Mind you, I'm not saying they were biased in the sense this article implies about phones. Those looks were intended for general use. While they gave people of all appearances some uniqueness in appearance, they did so for every other subject that was photographed. And like today, if you didn't use a p&s or disposable camera, you could control exposure, light etc. to achieve a more or less representative look of the subject.

Kirk Darling

I remember when Fuji was making its first foray into the American market, trying to wedge that green onto the shelves between all that yellow, and their advertising had the catchphrase, "The Japanese see color differently." Wow, that wouldn't even be PC today.

But the fact is, Mike, you and I can look at an object and agree, "That's red"...but we have no way of knowing if your brain actually perceives exactly the same color that my brain perceives. We've just learned to label what each of our brains perceive by the same label.

Maybe, to take an example from engineering, what we're really talking about is precision rather than accuracy. If it's "off," at least it should be consistently off so that we can apply a consistent factor to correct it to our own perception.
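
A quick sketch of that precision-vs-accuracy point, with made-up readings and a made-up correction factor purely for illustration: a consistent (precise) error can be removed with a single factor, while an inconsistent one cannot.

```python
# Illustrative only: readings that are consistently 10% low can be corrected
# with one factor; readings that are off by varying amounts cannot.
targets = [0.18, 0.36, 0.72]                      # the "true" values

consistent = [0.90 * v for v in targets]          # precise but inaccurate
inconsistent = [0.17, 0.40, 0.60]                 # neither precise nor accurate

correction = 1 / 0.90
print([round(v * correction, 2) for v in consistent])    # [0.18, 0.36, 0.72]
print([round(v * correction, 2) for v in inconsistent])  # still off: [0.19, 0.44, 0.67]
```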