Orange You Accessible? A Mini Case Study on Color Ratio

March 22, 2019 | Ericka O'Connor

I recently completed work for a client who used orange heavily in their branding: it was in their print ads, brochures, posters, and signage, and they wanted to carry it over to their website. This led me to compare two orange button treatments.

Can you guess which button is more accessible?

[Image: two orange buttons with the same background, one with black text and one with white text]

If you guessed the left button, congrats! You are an accessibility wizard. If you guessed the right, you're not alone. White on orange appears clearer, but black on orange is technically more accessible. As a designer, I treat accessibility compliance as a top priority. Accessibility ensures that we are being empathetic and inclusive of people of all abilities, including, but not limited to, those with visual limitations such as color blindness.

The difference between the buttons is complex and opens up additional questions of accessibility. Join me as I qualitatively deconstruct why the white text button is more legible than the black text button, despite what quantitative results such as contrast ratios would suggest. 

A Colorful Background

People with color blindness are less likely to perceive the contrast between certain colors. Tools built on standardized equations can help determine whether a color combination passes accessibility standards. One such tool, Colorable, calculates the contrast ratio between two colors and checks it against the Web Content Accessibility Guidelines (WCAG), an established set of recommendations for making web content more accessible. AA compliance, the most commonly targeted level, requires a minimum contrast ratio of 4.5:1 (3:1 for large text), while AAA compliance, the more stringent and rarely targeted level, requires a minimum of 7:1 (4.5:1 for large text). For both AA and AAA, WCAG defines large text as at least 24px (18pt), or 18.66px (14pt) if bold.
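If you're curious what Colorable is doing under the hood, the WCAG math is compact enough to sketch out. Below is a minimal TypeScript version of the contrast-ratio calculation as WCAG 2.x defines it; this is my own illustration, not Colorable's source, and the orange in the example is a hypothetical stand-in, not our client's actual brand color (plug in the real brand orange and you'd reproduce the 6.44 and 3.26 ratios shown in Table 1).

```typescript
// WCAG 2.x contrast ratio, per the formulas published in the guidelines.

type RGB = [number, number, number]; // 8-bit sRGB channels

// Linearize one sRGB channel (WCAG 2.x uses the 0.03928 cutoff).
const linearize = (channel: number): number => {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
};

// Relative luminance: 0 for pure black, 1 for pure white.
const relativeLuminance = ([r, g, b]: RGB): number =>
  0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);

// Contrast ratio: (lighter + 0.05) / (darker + 0.05), ranging from 1 to 21.
const contrastRatio = (a: RGB, b: RGB): number => {
  const [hi, lo] = [relativeLuminance(a), relativeLuminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
};

// A stand-in orange (#F07C00 is hypothetical, not the client's brand color):
const orange: RGB = [0xf0, 0x7c, 0x00];
console.log(contrastRatio(orange, [0, 0, 0]).toFixed(2));       // vs. black text
console.log(contrastRatio(orange, [255, 255, 255]).toFixed(2)); // vs. white text
```

Note how heavily the green channel is weighted (0.7152, versus 0.2126 for red): a saturated orange has a strong red channel but only a middling green one, which lands it at a mid-range luminance where the black-versus-white question gets counterintuitive.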

An orange background results in the following contrast ratios: 

[Image: our orange background shown with black text and with white text, each labeled with its contrast ratio]

Table 1: Color Contrast Ratio of Our Orange with Black and White Text

Although white looked significantly clearer to me, it wasn't AA compliant for my 14px button text. I couldn't change the background color: orange was a branded color used in advertisements and billboards across the city, so any other color would feel disconnected. And although white stood out more to me, black was technically the more accessible alternative. But black on orange felt like Halloween, and it lost that modern tie-in to the brand's non-digital presence.

As I looked at the two buttons, I was still convinced that white was more accessible. It seemed to have far more contrast, yet the contrast ratios differed hugely between the black text button (6.44, comfortably above the AA minimum) and the white text button (3.26, below the 4.5 minimum for normal-size text and passing only as AA Large). I decided to deconstruct the disparity to see if there was a loophole, using a few tools I've picked up over the years to figure out how and why this was happening.
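A quick back-of-envelope check (inferring our orange's relative luminance rather than quoting it) shows the two numbers are self-consistent. A ratio of 6.44 against black text, which has luminance 0, implies a background luminance of about 0.27, since (0.27 + 0.05) / (0 + 0.05) ≈ 6.4; that same 0.27 gives (1.0 + 0.05) / (0.27 + 0.05) ≈ 3.3 against white. The formula's crossover point, where black and white text score equally, sits at a relative luminance of about 0.18; our orange, at roughly 0.27, is above it, so the math favors black, even though luminance alone ignores the hue and saturation cues that make white text pop to the eye.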

Squint Test 

I started with the squint test because it was the easiest and quickest way to determine if this issue was worth exploring. It’s a frequently used technique and can be used by anyone. By squinting, you can tell which elements stand out on the page — in this case, the prominence of CTAs. Squinting is a natural contrast checker, but it doesn’t have science behind it. It wouldn’t hold up in the court of accessibility standards. 
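If you want a squint test that's slightly more repeatable than your own eyelids, you can approximate one in the browser by blurring a screenshot of the design; squinting is essentially a low-pass filter. A quick sketch, where the element ID is made up for illustration:

```typescript
// Approximate a squint test by blurring a mockup image on a canvas.
// Assumes an <img id="design-mockup"> already on the page; the ID is hypothetical.
const mockup = document.getElementById('design-mockup') as HTMLImageElement;

const canvas = document.createElement('canvas');
canvas.width = mockup.naturalWidth;
canvas.height = mockup.naturalHeight;

const ctx = canvas.getContext('2d');
if (ctx) {
  ctx.filter = 'blur(8px)'; // a heavier blur simulates a harder squint
  ctx.drawImage(mockup, 0, 0);
  document.body.appendChild(canvas); // whichever CTA still reads, wins
}
```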

Conclusion: The white text button wins, but there isn't any science behind what I tested, and since I'm not color blind, there are huge gaps of knowledge to fill in. Let's look at how color-blind users see these buttons to see if that explains it further.

Color Blind Simulator 

There are many tools that calculate accessibility and simulate color blindness. While they take a little longer than the one-second squint test, they're still quick and easy ways to inform accessible design decisions. I pulled in a couple of them to take my research a step beyond the squint test.

Using the Sketch plugin Stark, I simulated a few different types of color blindness. Then I put the button background HEX/RGB colors generated by Stark's simulator into Colorable to determine whether the answer lay in the color contrast ratios.

[Image: the orange background under various color blindness simulations, each shown with black and white text options and their contrast ratios]

Table 2: Color Contrast Simulator & Color Contrast Ratios by Color Blindness Types

Looking at the colors produced by each color blindness simulation, I ran a new squint test. The white text buttons still appeared to have more contrast. I then used an eyedropper on each simulated background to capture its HEX/RGB value and checked whether the contrast ratios changed. Interestingly, they stayed the same, with the same wide gap between the black text button and the white text button. In hindsight, this is likely because color blindness simulations mostly shift hue while roughly preserving lightness, and the WCAG contrast ratio depends only on relative luminance.
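Stark does this work inside Sketch, but the shape of the pipeline, simulate the color and then re-run the contrast math, is easy to sketch. The matrix below is a published deuteranopia approximation from Machado et al. (2009); real simulators, presumably including Stark, use more refined models, so treat this as an illustration of the idea rather than Stark's implementation.

```typescript
// Simulate deuteranopia for a single color; the result can be fed back into
// Colorable (or the contrastRatio() sketch earlier) to recheck the ratios.

type RGB = [number, number, number]; // 8-bit sRGB channels

// Deuteranopia approximation matrix (Machado et al., 2009),
// applied in linear RGB space.
const DEUTERANOPIA = [
  [0.367322, 0.860646, -0.227968],
  [0.280085, 0.672501, 0.047413],
  [-0.011820, 0.042940, 0.968881],
];

const toLinear = (channel: number): number => {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
};

const toSRGB = (linear: number): number => {
  const clamped = Math.min(1, Math.max(0, linear));
  const c =
    clamped <= 0.0031308 ? clamped * 12.92 : 1.055 * Math.pow(clamped, 1 / 2.4) - 0.055;
  return Math.round(c * 255);
};

// Approximate how a deuteranope might perceive `color`.
function simulateDeuteranopia(color: RGB): RGB {
  const lin = color.map(toLinear);
  return DEUTERANOPIA.map(
    (row) => toSRGB(row[0] * lin[0] + row[1] * lin[1] + row[2] * lin[2])
  ) as RGB;
}

// The hypothetical stand-in orange again; prints the simulated color,
// ready to be dropped into a contrast checker.
console.log(simulateDeuteranopia([0xf0, 0x7c, 0x00]));
```

One detail worth noticing: each row of the matrix sums to roughly 1, so pure black and pure white pass through nearly unchanged, which is part of why the measured ratios in Table 2 barely moved.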

Conclusion: The tools gave me different answers. When I used Stark to simulate different types of color blindness and repeated the squint test, the white text button still showed up most clearly. However, Colorable told another story: the black text button won by a wide margin. To see whether the problem lay with the tools or some other variable, I needed to understand the human factor.

User Testing with Color Blind Participants

Since I’m not color blind, I needed to survey real color-blind users. With a sample set of about 20 color-blind colleagues, I asked three questions:

  1. What type of color blindness do you have?
  2. Which option is easier to read?
  3. Why?

Q1: What Type of Color Blindness Do You Have?

[Image: chart of survey participants by type of color blindness]

Graph 1: Type of color blindness

The majority of my users had deuteranopia or deuteranomaly (missing or anomalous green cones), the most common type of color blindness. The second most common type is protanopia; deuteranopia and protanopia combined affect 8% of men and 0.5% of women. Since this distribution mirrors the general population, I felt our colleagues were a representative mix of subjects.

Q2: Which Option is Easier to Read?

[Image: survey results: 61% of participants preferred the white text button, 39% the black text button]

Graph 2: Easier to read — All Participants

Of everyone surveyed, 61% preferred the white text button, so even among color-blind users, a majority found the white text more legible. I was curious how the other 39% landed on the black text button, so I cross-referenced the answers to questions one and two to see how the type of color blindness affected the preference.

[Image: charts of button preference broken down by type of color blindness]

Graph 3: Easier to read — Participants by Color Blindness

The results indicated a clear pattern of preference depending on the type of color blindness. Users with protanopia/protanomaly favored white text 71% of the time, while users with deuteranopia/deuteranomaly were split 50/50. The single tritanopia/tritanomaly user favored the white text, and the one monochromacy/achromatopsia user favored the black text.

Our color contrast tools, color blindness simulators, and the underlying math don't tell us that preferences can differ between users. But in design, we have to be empathetic to all experiences and find the right path for each of them. Marrying the brand goals with the user goals becomes extremely tricky here, because no matter which option we choose, some users' color-blind preferences are being overridden.

Q3: Why is That Option Easier to Read?

My final question was an attempt to understand why a specific combination was easier to comprehend: whether it was contrast that made it easier to read, or some other factor.

Contrast was consistently chosen as the reason for clarity. Here are a few of the responses related to contrast that provide some added depth:

“Whatever colour this is (I don't really know lol) this is easier for me to read with the white text.”

- Deuteranopia/Deuteranomaly, White Text Button

“Difference between the two is relatively small, but definitely more contrast between white and surrounding color than I see with the black text.”

- Protanopia/Protanomaly, White Text Button

“Black is more easily identifiable (and faster) — the white falls into the background.”

- Deuteranopia/Deuteranomaly, Black Text Button

“The black blends together with the orange.”

- Tritanopia/Tritanomaly, White Text Button

Beyond those who cited contrast, a few responses stood out: they described accessibility problems such as buzzing on the screen, headaches, and eye strain with the black text:

“I honestly don't have trouble reading either of these but the white text just seems slightly easier on my eyes.”

- Protanopia/Protanomaly, White Text Button

“I can read the black text just fine, but it makes my head hurt to look at it for a long time.”

- Unsure, White Text Button

“Not sure. Neither one is difficult to read. The white text is slightly easier on my eyes. The black text with the orange background has a slight halo effect around it. The white is easier to track as I scroll.”

- Protanopia/Protanomaly, White Text Button

Conclusion: Overall, users preferred the white text button over the black text button, primarily because of contrast. But different types of color blindness produced different results; notably, the monochromacy/achromatopsia user preferred the black text. I also uncovered some interesting legibility concerns: some users had unique issues reading the black text, saying it caused buzzing and headaches. Even though black text is the accessible option by WCAG standards, those standards fail to account for this kind of accessibility issue.

Next Steps / Orange You Sure?

In the squint test, the color blindness simulations, and user testing, the white text button wins. It is only in the contrast-ratio math, and for certain types of color blindness, that the black text button wins. But this conclusion comes with several caveats.

There were still color-blind users who found black text more legible, and we must always account for users with varying abilities when designing. How far we go to favor these users is an important question. They might be rare enough that accommodating them falls under AAA compliance, and there would be no way to tell without scientifically studying each individual's color blindness severity. Knowing what we know about the different types of color blindness, we try to account for as many people as we can within the parameters of WCAG.

Since the math is what dictates how the law decides whether a site is accessible, it is critical to design based on the math. However, the equations and color contrast standards I researched left me with more questions than answers. I would like to believe there is an outlier, something about orange in particular, that throws these numbers off. Further research is needed to determine why the white text button was preferred. If you're hoping for a clear answer to our orange black/white challenge, unfortunately I don't have a tidy resolution. I'd encourage you to follow the official guidelines, and as accessibility standards develop and evolve, I hope to better understand why these outliers appear to break the established standards.


Resources

Learning more about accessibility is a great first step toward making your websites and products more accessible. It's not an easy process, but adhering to accessibility best practices from the earliest stages of design will ensure a thoughtful and accessible experience. We love sharing about accessibility; check out our post on Empathy in Web Accessibility, as well as the collection of great links and tools below: