Text-to-image AI generators are a rabbit hole of wonder and bias. As a test I entered the prompt, “enlightenment arising from the prefrontal cortex”. Below are the images produced by OpenAI's DALL-E 2. All male, though the second from left might be viewed as female or androgynous.
Then I typed, “enlightenment arising from a female prefrontal cortex”. Below are the images DALL-E 2 generated.
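For anyone who wants to reproduce this experiment, here is a minimal sketch. It assumes the `openai` Python package and an `OPENAI_API_KEY` environment variable; the prompt strings are the exact ones I used, while the function name and image count are illustrative choices, not anything prescribed by the API.

```python
# Minimal sketch for reproducing the experiment. Assumes the `openai`
# Python package (v1 client) and an OPENAI_API_KEY environment variable.

PROMPTS = [
    "enlightenment arising from the prefrontal cortex",
    "enlightenment arising from a female prefrontal cortex",
]

def generate_image_urls(prompt: str, n: int = 4) -> list[str]:
    """Request n DALL-E 2 images for a prompt and return their URLs."""
    # Imported inside the function so the file loads even without the package.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.images.generate(
        model="dall-e-2", prompt=prompt, n=n, size="1024x1024"
    )
    return [image.url for image in response.data]

# Example usage (makes live API calls, so it is left commented out):
#   for prompt in PROMPTS:
#       print(prompt)
#       for url in generate_image_urls(prompt):
#           print("  ", url)
```

Four images per prompt mirrors the grid DALL-E 2 returned for each of my two prompts.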
I was taken aback by the cultural male-female stereotyping this AI generator was so clearly mirroring.
In the first grouping:
- A cerebral, serious emotive tone pervades images 1, 3, and 4, the most male-like and Caucasian.
- Image 2 (the most female-like) has a softer, more contemplative feel; though the skin is brown, the face is European.
In the second grouping:
- Image 1 looks dazed and entranced, not awakened
- Images 2 and 4 are joyful, elated, with no contemplative feel at all
- Image 3 is a feminized rehash of a traditional Buddha image
- All four are Caucasian.
AI generators are supposedly trained on massive quantities of paired images and text, learning which visual features human minds associate with specific concepts and words. One could therefore expect a generator to consume a wide swath of information, distill it into the most common associations, and deliver an image reflective of that process.
I suspect what may have been delivered in this case is a mash-up of the stereotyped male favoritism found in two unavoidable source domains: medical illustration and religious iconography. Images make societal assumptions plainly visible, especially wholly incorrect ones.
Group 1 delivers a very specific message: enlightenment occurs only in Caucasian male brains. Most surprising to me is that the vast majority of depictions of enlightened states come from thousands of years of Asian art: Upanishadic, Buddhist, and Taoist religious images. Though the vast majority of those images are male, none are Caucasian. Add into the mix medical illustration, which until very recently depicted only male anatomy of the head, and the scale tips radically and completely toward the Caucasian male. The Asian origins of the framework of enlightenment are utterly wiped out! Could this also be helped along by the more recent Western bias for depicting meditators as Caucasian males? Maybe.
Group 2 delivers another sort of specific message: when enlightenment occurs in a female brain, happiness is all a woman will experience. No awakening, no clarity, no shift in understanding. And frankly, as someone who has been a female meditation practitioner for many decades, until very recently my scholarship of textual material and dedication to practice were largely minimized by male teachers in both Yogic and Buddhist training contexts. So the AI generator is merely following suit. One can almost forgive it for its Caucasian female bias: it is clearly feeding from the trough of blond, blissful Caucasian women gracing mindfulness and meditation magazine covers and Instagram posts. The assumptive biases of Group 2 turn my stomach even more than those of Group 1.
What is to be done about this quagmire of bias and inaccuracy in AI image generation? I don’t have an answer. I do have an unsettling concern that the AI world cares little about historical inaccuracies, or about doing its part to discontinue millennia of imagistic injustice and bias.