Google is working on an alternative to the Fitzpatrick Skin Type scale

Most women will at some point have classified their skin type using what's known as the Fitzpatrick Skin Type scale. It has been the default since it was devised in the 1970s by a Harvard dermatologist.

Any time you buy colour cosmetics online or visit a dermatologist, you're likely to learn your Fitzpatrick score. And yet, like many things that have become the norm, seen with a fresh perspective it turns out not to serve us anymore.

In fact, had beauty and dermatology been listening to their clients of non-white heritage for decades, they would have known that the Fitzpatrick Skin Type scale, while serviceable for some for a time, was far from ideal – or even useful – for anyone on the darker end of the skin-tone spectrum. Even the US Department of Homeland Security has advised against using it in screening tools, as the scale is a blunt instrument that doesn't work for diverse populations.

So what is the FST?

Harvard dermatologist Dr. Thomas Fitzpatrick created the Fitzpatrick Skin Type scale in the 1970s, initially covering only four white skin types, for patients suffering from psoriasis. The scale didn't include non-white skin types for several years, only later adding monolithic "brown" and "black" categories that lumped together all skin tones that didn't fit the four white ones. With ranges like Rihanna's Fenty Beauty, which carries 50 different shades of foundation, selling out in minutes, it's clear a six-category scale isn't nuanced enough.

Google’s alternative to the FST

In an exclusive article with Reuters, Google said: "We are working on alternative, more inclusive, measures that could be useful in the development of our products, and will collaborate with scientific and medical experts, as well as groups working with communities of color." The company declined to offer further details on the effort for now, both to Reuters and to Professional Beauty when contacted for comment.

How will Google’s new scale work?

The search for a Fitzpatrick Scale alternative has been ongoing for several years, and interest was roused when Google announced its new AI derm tool – DermAssist – a few months ago. DermAssist is a forthcoming web-based application that will use your smartphone's camera to help match your skin, hair or nail concern to possible conditions and related information. The AI initially used the FST, but Google realised the scale was inadequate and has been working on a different way to recognise and classify skin types.

There's no word yet on how it will work, but new features on the forthcoming Pixel's camera, which hinge on light absorption and white balance, may offer a hint.

CNET reports that "As part of the product inclusivity project, Google said its engineers partnered with image-making experts who have used thousands of images to 'diversify our image datasets' to create a 'guidebook to capture skin tones,' improve auto exposure algorithms and overhaul auto white balance accuracy.

"'We're making auto white balance adjustments to algorithmically reduce stray light, to bring out natural brown tones and prevent over brightening and desaturation of darker skin tones,' [Sameer Samat, Google vice president of Android and Google Play] said. 'We're also able to reflect wavy hair types more accurately in selfies with new algorithms that better separate a person from the background in any image.'"

Background on Google DermAssist

With Google Search fielding nearly ten billion searches a year for hair, skin and nail issues – and the company noting that two billion people a year suffer from dermatologic issues, with a huge shortage of specialists worldwide to address their needs – a tool that recognises only six skin types was always bound to fall short.

Google says on its Keyword blog: "Our AI-powered dermatology assist tool is a web-based application that we hope to launch as a pilot later this year, to make it easier to figure out what might be going on with your skin. Once you launch the tool, simply use your phone's camera to take three images of the skin, hair or nail concern from different angles. You'll then be asked questions about your skin type, how long you've had the issue and other symptoms that help the tool narrow down the possibilities. The AI model analyzes this information and draws from its knowledge of 288 conditions to give you a list of possible matching conditions that you can then research further.

"Our landmark study, featured in Nature Medicine, debuted our deep learning approach to assessing skin diseases and showed that our AI system can achieve accuracy that is on par with U.S. board-certified dermatologists. Our most recent paper in JAMA Network Open demonstrated how non-specialist doctors can use AI-based tools to improve their ability to interpret skin conditions."
