Studies show that consumers have a hard time recognizing brand colors when asked to pick them from color swatches. It’s not that they don’t know Coke’s brand color is red; they just can’t choose the correct red from the swatches. The printing industry is obsessed with color, and with brand colors in particular. In the digital world, not so much: there are too many variables that affect how a color is viewed. Taking that into consideration, along with the limits of consumers’ color memory, is there any value in maintaining strict color standards for brands? I’m not suggesting a rainbow of variations, but close seems to be good enough. I’ve copied a blog post from Eddy Hagen, who actually did a study. I would love to know what you think.

You can’t correctly remember an iconic color, not even Coca-Cola red

POSTED BY: EDDY HAGEN | PUBLISHED: 18/04/2018 – LATEST UPDATE: 01/11/2018


A few months ago, I published an article about color memory, including a short test. The test was to check whether you could remember an iconic color, Coca-Cola red, correctly. Although the number of respondents is still limited, my statement seems to be confirmed: you can’t correctly remember an iconic color. Not even that famous and ubiquitous Coca-Cola red.

THE TEST
In the test, I show six variations of Coca-Cola red, picked from other brands’ reds, such as Tesla’s and Adobe’s. And that is what makes it so difficult: they are all variations of red. If you read my first article on color memory, based on a Johns Hopkins University study, you will already know that our color memory is rather ‘rough’: we know that something was red, but not the exact shade of red. That was also the reason I developed this small test.

The six variations had a Delta E 76 of up to 21 (Delta E is a measure of the distance between two colors in a color space), a color deviation that would not be acceptable in print. So let’s take a look at the results.

As you can see in the graph above, the right color was not the most popular one! The color with the highest score had a Delta E 76 of no less than 11! Another interesting observation: very few people picked the closest color (Delta E 76 of 4).
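For readers unfamiliar with the metric: Delta E 76 (CIE76) is simply the Euclidean distance between two colors in CIELAB space, and a value around 1 to 2 is commonly cited as roughly a just-noticeable difference under controlled viewing. Here is a minimal sketch in Python, with hypothetical Lab values (the actual swatch values from the test are not published here):

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two CIELAB colors."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

# Hypothetical Lab values, for illustration only; not the actual swatches from the test.
reference = (48.0, 71.0, 55.0)  # assumed reference red
variant = (52.0, 62.0, 48.0)    # assumed look-alike red

print(round(delta_e_76(reference, variant), 1))  # 12.1
```

That puts the winning swatch’s Delta E 76 of 11 in perspective: a deviation that, as noted above, would not be acceptable in print.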

Yes, but…
Yes, I know, there are a few remarks to be made about this test. First: the number of participants is still limited (UPDATE 01/11/2018: the number of participants has grown significantly, from 19 when the article was first published to 312).

Second: most people didn’t look at the colors on a calibrated screen (only 3 of them did, and none picked the right color – UPDATE 01/11/2018: at this moment 69 did the test on a calibrated screen, and only 14 (20%) picked the right color). And while this remark is correct, think about this: when do you ever, in real life, watch a TV commercial or a web page on the right type of (calibrated) screen and in the right environment (ISO standards 12646:2015 and 14861:2015)? Or when do you ever look at a product under D50 light, conforming to ISO standard 3664:2009? Probably never, and certainly not in your local supermarket.

And that’s just the point: the viewing conditions are never what ISO standards mandate, except when judging print quality… Only when you check a press proof will you see the colors in a stable, known environment. All other times, the amount of light will be very different (on the beach versus in a dark pub), and the color temperature will be very different (at noon on that same beach in Florida, or in the evening in a park in Antwerp, Belgium). So please: be realistic when judging color differences. Print defects are much more important than (small) deviations in color.


Why is this important?
When judging print quality, print buyers can be too rigorous. Small color deviations, whether from the ideal color, from a previous batch, or between proof and print, don’t really matter: we can’t even remember the most iconic color in the world correctly, as this test shows.

UPDATE 18/05/2018: thanks to the endorsement by color scientist John ‘The Math Guy’ Seymour on LinkedIn, and mentions in the newsletters of Grafoc and Tao Colorist, the number of participants has grown significantly. When I first published the results, there were only 19 participants; now (01/11/2018) there are 312, making the results statistically more relevant.
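To put that ‘statistically more relevant’ point in rough numbers: the margin of error on a survey proportion shrinks with the square root of the sample size. A minimal sketch using a normal approximation (an assumption on my part; the per-swatch counts needed for a real analysis aren’t given here):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a sample proportion (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# Assume roughly 20% of respondents pick a given swatch, as in the calibrated-screen subgroup above.
for n in (19, 312):
    print(n, round(100 * margin_of_error(0.20, n), 1))  # about 18.0 points at n=19, 4.4 at n=312
```

Going from 19 to 312 participants cuts the uncertainty on each swatch’s share roughly fourfold.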