I don’t care what color the dress is. Instead, let’s focus on the three questions around this Internet tempest in a teapot that actually are intriguing.
- What caused the camera to take such a weirdly inaccurate photo?
- Why does the human perceptual system cause this image to be perceived differently by different people?
- Was one side’s perception of the color more ‘correct’ than the other’s? (Spoiler: Yes, and I’ll prove it.)
By way of full disclosure, I mostly see the dress as goldish tan and very light blue. However, I am ambi-dress-trous: with some effort, I can force myself to see it as blue/black as well. Below, I’ll show you how you may be able to trick your brain into seeing it the other way too. First, a few facts.
- Secondary photos show that the physical dress is, in reality, blue/black. However, this proves nothing about what the original inaccurate photo of the dress shows. A photo of the dress is not the dress, so knowing the true color of the dress doesn’t change what the original, inaccurate photo shows.
- The color cues in the photo are unintentionally deceptive. How your visual system interprets these cues leads to the difference in perception of colors.
- The Photoshop color picker demonstrates that the original, inaccurate photo shows the dress colors to be a very light bluish hue and a goldish brown which fades to tan.
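You can verify this kind of claim yourself without Photoshop. Below is a minimal sketch of the same check: take an RGB triple as a color picker would report it and classify its hue. The RGB values here are hypothetical illustrations of a pale desaturated blue and a golden brown, not actual samples from the photo.

```python
import colorsys

def describe_hue(r, g, b):
    """Rough hue name for an RGB color, the way one might
    sanity-check values read off a color picker."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    degrees = h * 360
    if s < 0.1:
        return "near-neutral"      # too desaturated to call a hue
    if 180 <= degrees <= 260:
        return "bluish"
    if 30 <= degrees <= 60:
        return "goldish"
    return "other"

# Hypothetical picker readings, NOT actual samples from the photo:
print(describe_hue(150, 160, 190))  # a very light, desaturated blue
print(describe_hue(130, 100, 50))   # a goldish brown
```

Run on real samples, readings like these would land squarely in the bluish and goldish ranges, which is exactly the point: whatever the physical dress is, the pixels in the photo are light blue and gold-brown.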
This brings us to the first question: what caused the camera to take this photo with inaccurate colors?
1. Why the camera screwed up the photo
When a camera takes a photo, it tries to determine the correct white balance for the scene. In other words, it has to decide what is true white. It also has to decide what is true black in the photo. To do this, the camera scans the image, looks for the whitest white and the blackest black, and adjusts the rest of the colors accordingly. This is called “white balancing.” It usually works quite well, but we’ve all seen it fail occasionally, as in photos where skin tones look blue.
So what caused the camera’s color interpretation error in the original photo? I don’t think it’s a white balance error, I think it’s an exposure error. Why? Because there are three white reference sources in the image and the camera kept them all white.
1. At the top right of the image, behind the dress, is what appears to be daylight coming from a window. We can tell it’s daylight because of the blue halo glow around the top right of the image. That’s called chromatic aberration, and it often comes from daylight, so it’s reasonable to infer we’re seeing through a window to the outdoors.
2. Closer to the camera is what appears to be a store display cabinet with internal lighting.
3. The store’s interior ceiling lights which are illuminating the white-to-beige gradient on the floor. These appear to be halogen lights, due to the slightly warmer hue of the light.