Dhruv Bhutani / Android Authority
TL;DR
- A recent study by DXOMARK indicated that AI training bias might be negatively affecting the way images of people with darker skin tones are processed.
- The test scenes used for camera calibration were another factor that might be causing inconsistencies in image processing.
- DXOMARK highlighted the importance of more diversity in data sets and improved tuning strategies to ensure consistent image quality.
Ever taken a portrait photograph only to be disappointed with the results? A recent study by DXOMARK, conducted as part of its DXOMARK Insights initiative, sheds light on why this might be happening, even if you're using the best Android phone or iPhone.
AI training bias could affect the way portrait photographs are processed
DXOMARK’s study revealed a significant disparity in user satisfaction ratings for portrait-mode images that featured people with darker skin tones. This was observed regardless of the respondent’s own skin tone.
The study pointed to AI training bias as one of the main factors behind this disparity. Most AI models are trained on limited data sets that often lack a diverse range of skin tones, which can lead the AI in smartphones to detect and process images of people with darker skin tones inaccurately.
DXOMARK’s report included two photographs of individuals with different skin tones to show just how starkly the camera’s focus and exposure settings can vary between the two. In the paired examples, the model with a lighter skin tone appears in focus and properly exposed, while the photo of the model with a darker skin tone is underexposed and out of focus.
Tuning bias might play a role as well
Camera calibration is another factor that might further exacerbate this problem. If the test scenes used for tuning came predominantly from certain geographies, the camera’s calibration might not be a good fit for a diverse user base. This can cause differences in the way devices render skin tones, and the study noted that some smartphones appear to favor “lighter skin tones over darker ones.”
The way forward
DXOMARK’s study and resulting report highlight the need for more diverse data sets and improved tuning strategies to ensure consistent image quality, regardless of the subject’s skin tone or appearance. In the report, DXOMARK emphasized the need for more use cases spanning a range of settings, conditions, and skin tones, along with richer annotations and multiple examples of the final rendering, so that a satisfactory photography experience is delivered to all users.
The good news is that smartphone manufacturers are starting to recognize and address this issue. Google’s Real Tone technology, used in Pixel devices, is proof of this. The Mountain View tech giant notes that it has worked closely with people of color to develop its camera technology, so that people with darker skin tones don’t appear lighter or darker than they actually are in photographs. This diversity in data sets is what helps many Pixel devices capture skin tones more accurately.
Many other smartphone manufacturers are working on similar tech, sometimes without being vocal about it. Apple’s #shotoniPhone campaign, for instance, has photographed a diverse range of people, many of them people of color, for years now. Given this, smartphone camera tech will likely continue to improve, delivering a more accurate photographic experience for all users.