How we tested the Pixel’s new inclusive camera features


In October, Google unveiled the Pixel 7 and Pixel 7 Pro, the latest step in our work to create an inclusive and accessible camera that works for everyone. We introduced a major upgrade to Real Tone, which originally launched on the Pixel 6 to improve images of people of color, especially those with darker skin tones. And we're continuing to expand accessibility with Guided Frame, which uses audio cues, high-contrast animations and haptic (tactile) feedback to help people who are blind or have low vision take selfies and group photos.

The testing process for both of these features centered on the communities they're meant to serve.

How we tested Real Tone

To improve Real Tone, the team worked with more people in more places, including internationally. The goal was to see how the Real Tone upgrades performed around the world, which required expanding the pool of testers.

“We worked with image-makers representing the U.K., Australia, India and Laos,” says Florian Koenigsberger, Real Tone’s lead product manager. “We really tried to have a globally diverse perspective.” And Google didn’t just send out a checklist or a survey; someone from the Real Tone team was there in person, working with the experts and collecting data. “It was important that we showed up for these people in a real, human way,” Florian says. “It wasn’t just, ‘Hey, show up, sign this paper, boom boom boom.’ We really tried to tell people the history of the project and why we’re doing it. We wanted everyone to feel respected in the process.”

Once on the ground, the Real Tone team asked these experts to try to “break” the camera; in other words, to take pictures in situations where cameras have historically failed people with darker skin tones.

Team members also asked for, and were granted, permission to watch the experts edit their images, which was a big request of the photographers. “That’s a really intimate thing for them,” says Florian. “But then we could see whether it was the exposure slider or the color slider they reached for, or whether they were adjusting skin tones individually. That was really interesting information to get back.”

To get the best feedback, the team shared prototype Pixel devices very early in development, so early that the phones sometimes crashed. “The Real Tone team member there had the specialized tools and techniques needed to keep the product up and running,” says Isaac Reynolds, lead product manager for Pixel Camera.

After gathering this information, the team worked to identify what Florian calls “headline issues”: things like hair texture not looking quite right, or color from the surrounding environment bleeding into skin tones. Then, with input from the experts, they decided what to work on for 2022. After that came the engineering, and with it more testing. “Whenever you make a change to one part of our camera, you have to make sure it doesn’t have an unexpected negative ripple effect on another part,” Florian says.
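The ripple effect Florian describes is essentially a regression-testing problem. As a purely illustrative sketch (the Rendition type, the meanSkinToneError metric and the threshold below are all invented here, not Google's tooling), a guard against such ripples might re-render a fixed set of scenes after every tuning change and flag any scene that got worse relative to an approved baseline:

```kotlin
// Hypothetical sketch of a golden-set regression check: after a camera tuning
// change, re-measure a fixed set of scenes and flag any scene whose (made-up)
// error metric drifts past a tolerance compared with the approved baseline.

data class Rendition(val scene: String, val meanSkinToneError: Double)

const val MAX_DRIFT = 0.02 // allowed worsening of the illustrative error metric

fun findRegressions(baseline: List<Rendition>, candidate: List<Rendition>): List<String> {
    val baselineByScene = baseline.associateBy { it.scene }
    return candidate.filter { new ->
        val old = baselineByScene[new.scene] ?: return@filter false // scene not in baseline
        new.meanSkinToneError - old.meanSkinToneError > MAX_DRIFT   // metric got worse
    }.map { it.scene }
}

fun main() {
    val baseline = listOf(Rendition("backlit portrait", 0.10), Rendition("night scene", 0.20))
    val candidate = listOf(Rendition("backlit portrait", 0.09), Rendition("night scene", 0.25))
    println("Scenes needing review: ${findRegressions(baseline, candidate)}") // [night scene]
}
```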

Finally, thanks to these partnerships with image experts around the world, Real Tone now works better in many situations, especially Night Sight, Portrait Mode and other low-light scenes.

“I feel like we’re starting to see the first meaningful pieces of the progress we originally set out to make here,” Florian says.


Photos taken with the Pixel 7 Pro. The photo on the left was taken without Night Sight; the photo on the right was taken with Night Sight, enhanced by Real Tone.

How we tested Guided Frame

The concept of Guided Frame came about when the camera team invited Pixel’s accessibility group to participate in an annual hackathon in 2021. Googler Lingeng Wang, a technical program manager focused on product inclusion and accessibility, worked with colleagues Bingyang Xia and Paul Kim on a new way to make taking selfies easier for people who are blind or have low vision. “We were initially focusing on the wrong solution: telling blind and low-vision users where the front-facing camera was,” says Lingeng. “After doing initial research with three blind Googlers, we discovered we needed to offer real-time feedback while these users were actively taking selfies.”

To bring Guided Frame beyond the hackathon, various teams at Google began working together, including the accessibility, engineering and haptics teams, among others. First, the teams tried a very simple test: taking a selfie with their eyes closed. Even this imperfect test made it abundantly clear to sighted team members that “we were underserving these customers,” says Kevin Fu, product manager for Guided Frame.

It was extremely important to everyone working on this feature that testing focused on blind and visually impaired users throughout the process. In addition to internal testers, known as “dogfooders” at Google, the team involved Google’s main accessibility team and sought advice from a group of blind and visually impaired Googlers.

After coming up with the first version, they asked blind and visually impaired users to try out the yet-to-be-released feature. “We started by giving volunteers access to our early prototypes,” says user experience designer Jabi Lowe. “They were asked about things like audio, haptics — we wanted to know how everything was coming together because using the camera is a real-time experience.”

Victor Tsaran, a Material Design Accessibility Lead, was one of the blind and low-vision Googlers who tested Guided Frame. Victor remembers being impressed with the prototype even then, but he also noticed that the team listened to and addressed his feedback, improving it over time. “I was also happy that Google Camera was getting a great accessible feature of this quality,” he says.


The team was soon experimenting with a number of ideas based on that feedback. “Blind and low-vision testers helped us identify and develop the ideal combination of audio cues, haptic vibrations and high-contrast visual elements,” says Lingeng. From there, the team scaled up testing, sharing the final prototype with even more volunteer Googlers.

This testing taught the team how important it is for the camera to automatically capture a photo once the person’s face is centered in the frame. While sighted users generally don’t like this behavior, blind and low-vision users appreciated not having to hunt for the shutter button. Voice guidance also proved key, especially for selfie enthusiasts aiming for the perfect composition: “People quickly knew if they were to the left, to the right or in the middle, and could adjust accordingly,” says Kevin. Letting people take the product home and test it in everyday life also showed that users wanted Guided Frame to help with group photos as well as selfies, Kevin says.
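To make that feedback loop concrete, here is a minimal, hypothetical Kotlin sketch of the behavior described above: announce which way the face is off-center, and fire the shutter automatically once it lands in the middle. Everything in it (the Face and Cue types, the guide function, the 10% centering tolerance and the direction wording) is invented for illustration; the article doesn't describe Google's actual implementation.

```kotlin
import kotlin.math.abs

// A face's center position, normalized so (0.5, 0.5) is the middle of the frame.
data class Face(val centerX: Float, val centerY: Float)

sealed class Cue {
    data class Move(val direction: String) : Cue // spoken guidance plus a haptic tick
    object Centered : Cue                        // trigger auto-capture
}

// How close to the frame center the face must be before we auto-capture.
const val CENTER_TOLERANCE = 0.1f

fun guide(face: Face): Cue {
    val dx = face.centerX - 0.5f
    val dy = face.centerY - 0.5f
    return when {
        // Close enough to center: capture instead of making the user find the shutter.
        maxOf(abs(dx), abs(dy)) <= CENTER_TOLERANCE -> Cue.Centered
        // Otherwise guide along whichever axis is further off, one cue at a time.
        // (The real mapping would account for front-camera mirroring; this is illustrative.)
        abs(dx) >= abs(dy) -> Cue.Move(if (dx > 0) "Move right" else "Move left")
        else -> Cue.Move(if (dy > 0) "Move down" else "Move up")
    }
}

fun main() {
    // Simulate a face drifting toward the center across successive camera frames.
    val frames = listOf(Face(0.8f, 0.5f), Face(0.62f, 0.45f), Face(0.52f, 0.48f))
    for (face in frames) {
        when (val cue = guide(face)) {
            is Cue.Move -> println("Speak + haptic tick: ${cue.direction}")
            Cue.Centered -> println("Face centered: auto-capture photo")
        }
    }
}
```

Run per camera frame, a loop like this naturally produces the experience testers described: continuous left/right/center cues, then a hands-free capture the moment the face is centered.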

The team wants to expand what Guided Frame can do, and that’s something they’re continuing to explore. “We’re not done on this journey of creating the world’s most inclusive, accessible camera.”
