A digital camera that gets dark skin right: what took so long?



“This work is based on personal experience,” says Florian Koenigsberger, who heads Google’s Image Equity team. Koenigsberger’s mother is Jamaican and black; his father is German and white. His own skin is relatively pale, but his brother’s is a little darker. So those Thanksgiving family photos have always been a problem.

Taking good pictures of black or brown faces is a well-known challenge, especially when they share the frame with white faces or a brightly lit background. Expose properly for the rest of the image and a black face can become an indistinct blur. Raise the exposure to bring out the features of a black face and everything else blows out. The extra light can also wash out the natural warmth of black skin, leaving an ashen gray cast.

You can blame the technical limitations of cameras, and you’d be partly right. But Lorna Roth, professor emeritus of communication studies at Concordia University in Canada, said the problem dates back to the era of film.

For many years, the chemical formulas of film emulsions were designed to make white skin look good. Industry leader Eastman Kodak even created a skin-tone reference card to help professional photographers calibrate their equipment. It was called the “Shirley card,” after the first model who posed for it, a white woman.

Kodak could have improved how dark skin looked on its film. As a former lighting technician at Black Entertainment Television told Roth, the problem would not have existed if the film had been designed with black skin in mind. But for decades the company didn’t bother. In 1978, the French film director Jean-Luc Godard, who was supposed to be making a film in Mozambique, refused to use Kodak film because it made black people look bad on screen, and he denounced the film as racist.

But it wasn’t malicious racism, according to Roth, so much as sheer thoughtlessness. Kodak was run almost entirely by white people who gave little thought to the needs of their non-white customers. And even as the civil rights movement roiled the nation in the 1950s and 1960s, black citizens were focused on voting rights and school desegregation. Better camera film was not a high priority.

Kodak began making improvements in the 1960s, but the initial push had nothing to do with race. Advertising agencies representing chocolate makers and furniture manufacturers complained that their clients’ dark-toned products looked awful on Kodak film. So Kodak developed film stocks that handled darker tones better. In the 1990s it introduced a new Shirley calibration card featuring three women: one Asian, one black, and one white.

But by then, film was giving way to digital cameras that suffered from the same problem with darker skin. Again, technology is partly to blame. Ramesh Raskar, who directs the Camera Culture group at the Massachusetts Institute of Technology’s Media Lab, said that until recently smartphone cameras simply couldn’t accommodate every skin tone.

“Only in the last three or four years, maybe three years, have photosensors been good enough to capture a large enough dynamic range,” said Raskar.

But today’s digital camera makers have run out of excuses. Not only have the cameras gotten much better; so have the processors inside the phones. Every time you take a smartphone picture, the phone’s processor instantly reworks it to produce the best possible image. It’s called computational photography, and it’s what enables the exceptionally rich, sharp images from today’s best phones.

The Pixel 6 uses a number of clever tricks. For example, most high-end phone cameras don’t take a single photo when you press the shutter button. Instead, they capture five, six, or more frames, each with different exposure and color-balance settings. The phone’s processor then merges those frames, using portions of each to create the finished photo.
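Google hasn’t published the details of the Pixel 6’s merging pipeline, but the general idea of multi-frame exposure fusion can be sketched in a few lines. The snippet below is an illustrative simplification only, assuming the frames arrive as already-aligned 8-bit RGB arrays; real pipelines also align the frames, merge them at multiple scales, and denoise.

```python
import numpy as np

def fuse_exposures(frames):
    """Blend several differently exposed shots of the same scene.

    Each pixel is weighted by how close it sits to mid-gray (a rough
    "well-exposedness" score), and the frames are averaged with those
    weights, so each region of the result comes mostly from the frame
    that exposed it best. A toy sketch, not Google's actual pipeline.
    """
    frames = [f.astype(np.float32) / 255.0 for f in frames]        # scale to [0, 1]
    weights = []
    for f in frames:
        luma = f.mean(axis=-1, keepdims=True)                      # rough per-pixel luminance
        weights.append(np.exp(-((luma - 0.5) ** 2) / (2 * 0.2**2)))  # favor mid-tones
    total = np.sum(weights, axis=0) + 1e-6                         # avoid divide-by-zero
    fused = sum(w * f for w, f in zip(weights, frames)) / total
    return (fused * 255).clip(0, 255).astype(np.uint8)
```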

The Pixel 6 software is tuned to pick out the best-looking faces from those multiple shots. If one frame captures the face well but the background is overexposed, that face is combined with a better background from another frame. At the same time, artificial intelligence algorithms adjust the color balance and lighting of each face so it is well lit and shows accurate skin tones.
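The face-and-background merge described above can be illustrated with a toy example. The sketch below assumes you already have two aligned frames and a soft face mask from a face detector (all hypothetical inputs); it simply feathers the mask and blends the frames, which is only a rough stand-in for what the Pixel’s software actually does.

```python
import numpy as np

def blend_face_and_background(face_frame, bg_frame, face_mask, feather_iters=15):
    """Take the face region from one frame and the background from another.

    `face_mask` is assumed to come from a face detector: 1.0 where the
    face is, 0.0 elsewhere. Its edges are softened with a crude repeated
    box blur so the seam between the two frames is not visible. Purely
    illustrative; not the Pixel's real algorithm.
    """
    m = face_mask.astype(np.float32)
    for _ in range(feather_iters):                     # soften the mask edge
        m = (m + np.roll(m, 1, 0) + np.roll(m, -1, 0)
               + np.roll(m, 1, 1) + np.roll(m, -1, 1)) / 5.0
    m = m[..., None]                                   # broadcast over RGB channels
    out = m * face_frame.astype(np.float32) + (1.0 - m) * bg_frame.astype(np.float32)
    return out.clip(0, 255).astype(np.uint8)
```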

The overall effect is subtle but noticeable. The camera delivers clear images of brown faces, with plenty of detail and rich color, even in low light.

To teach the camera, Google analyzed huge databases of images showing human skin in every shade. The company also brought in Black, Hispanic, and Asian photographers and videographers to help train its AI software.

“The tool that we say was developed for you was also developed with people who look like you,” said Koenigsberger.

These are all things camera makers could have done years ago, but the recent reckoning over racial justice may have pushed tech companies to act. And not just Google. The social network Snapchat has announced software that will improve how existing smartphone cameras photograph dark-skinned people. Apple says it has updated the AI in its latest iPhones to better render dark skin, and a spokeswoman for smartphone giant Samsung said the company’s latest model, the Galaxy S21, is tuned for brown skin as well.

“I think that’s fantastic,” said Koenigsberger. “We should get to a place where this doesn’t have to be a competitive thing, right? Where everyone knows that no matter what tool they pick up, they will be seen fairly for who they are.”


Hiawatha Bray can be reached at [email protected]. Follow him on Twitter @GlobeTechLab.

