Black artists say AI shows bias, with algorithms erasing their history
The First Art Newspaper on the Net | Established in 1996 | Wednesday, November 27, 2024


Stephanie Dinkins at work in her Brooklyn studio in New York on June 10, 2023. For the past seven years, Dinkins has experimented with AI’s ability to realistically depict Black women smiling and crying. Despite improvements, there are often machine distortions that mangle facial features and hair textures. (Flo Ngala/The New York Times)

by Zachary Small



BOSTON, MASS.- The artist Stephanie Dinkins has long been a pioneer in combining art and technology in her Brooklyn-based practice. In May she was awarded $100,000 by the Guggenheim Museum for her groundbreaking innovations, including an ongoing series of interviews with Bina48, a humanoid robot.

For the past seven years, she has experimented with artificial intelligence’s ability to realistically depict Black women, smiling and crying, using a variety of word prompts. The first results were lackluster if not alarming: Her algorithm produced a pink-shaded humanoid shrouded by a black cloak.

“I expected something with a little more semblance of Black womanhood,” she said. And although the technology has improved since her first experiments, Dinkins found herself using runaround terms in the text prompts to help the AI image generators achieve her desired image, “to give the machine a chance to give me what I wanted.” But whether she uses the term “African American woman” or “Black woman,” machine distortions that mangle facial features and hair textures occur at high rates.

“Improvements obscure some of the deeper questions we should be asking about discrimination,” Dinkins said. The artist, who is Black, added, “The biases are embedded deep in these systems, so it becomes ingrained and automatic. If I’m working within a system that uses algorithmic ecosystems, then I want that system to know who Black people are in nuanced ways, so that we can feel better supported.”

She is not alone in asking tough questions about the troubling relationship between AI and race. Many Black artists are finding evidence of racial bias in AI, both in the large data sets that teach machines how to generate images and in the underlying programs that run the algorithms. In some cases, AI technologies seem to ignore or distort artists’ text prompts, affecting how Black people are depicted in images, and in others, they seem to stereotype or censor Black history and culture.

Discussion of racial bias within AI has surged in recent years, with studies showing that facial recognition technologies and digital assistants have trouble identifying the images and speech patterns of nonwhite people. The studies raised broader questions of fairness and bias.

Major companies behind AI image generators — including OpenAI, Stability AI and Midjourney — have pledged to improve their tools. “Bias is an important, industrywide problem,” Alex Beck, a spokeswoman for OpenAI, said in an email interview, adding that the company is continuously trying “to improve performance, reduce bias and mitigate harmful outputs.” She declined to say how many employees were working on racial bias, or how much money the company had allocated toward the problem.

“Black people are accustomed to being unseen,” the Senegalese artist Linda Dounia Rebeiz wrote in an introduction to her exhibition “In/Visible,” for Feral File, an NFT marketplace. “When we are seen, we are accustomed to being misrepresented.”

To prove her point during an interview with a reporter, Rebeiz, 28, asked OpenAI’s image generator, DALL-E 2, to imagine buildings in her hometown, Dakar. The algorithm produced arid desert landscapes and ruined buildings that Rebeiz said were nothing like the coastal homes in the Senegalese capital.

“It’s demoralizing,” Rebeiz said. “The algorithm skews toward a cultural image of Africa that the West has created. It defaults to the worst stereotypes that already exist on the internet.”

Last year, OpenAI said it was establishing new techniques to diversify the images produced by DALL-E 2, so that the tool “generates images of people that more accurately reflect the diversity of the world’s population.”

Minne Atairu, an artist featured in Rebeiz’s exhibition, is a doctoral candidate at Columbia University’s Teachers College who had planned to use image generators with young students of color in the South Bronx. But now she worries “that might cause students to generate offensive images,” Atairu explained.

Included in the Feral File exhibition are images from her “Blonde Braids Studies,” which explore the limits of Midjourney’s algorithm in producing images of Black women with natural blond hair. When the artist asked for an image of Black identical twins with blond hair, the program instead produced a sibling with lighter skin.

“That tells us where the algorithm is pooling images from,” Atairu said. “It’s not necessarily pulling from a corpus of Black people, but one geared toward white people.”

She said she worried that young Black children might attempt to generate images of themselves and see children whose skin was lightened. Atairu recalled some of her earlier experiments with Midjourney before recent updates improved its abilities. “It would generate images that were like blackface,” she said. “You would see a nose, but it wasn’t a human’s nose. It looked like a dog’s nose.”

In response to a request for comment, David Holz, Midjourney’s founder, said in an email, “If someone finds an issue with our systems, we ask them to please send us specific examples so we can investigate.”

Stability AI, which provides image generator services, said it planned to collaborate with the AI industry to improve bias evaluation techniques across a greater diversity of countries and cultures. Bias, the company said, is caused by “overrepresentation” in its general data sets, though it did not specify whether overrepresentation of white people was the issue here.

Earlier this month, Bloomberg analyzed more than 5,000 images generated by Stability AI, and found that its program amplified stereotypes about race and gender, typically depicting people with lighter skin tones as holding high-paying jobs while subjects with darker skin tones were labeled “dishwasher” and “housekeeper.”

These problems have not stopped a frenzy of investments in the tech industry. A recent rosy report by the consulting firm McKinsey predicted that generative AI would add $4.4 trillion to the global economy annually. Last year, nearly 3,200 startups received $52.1 billion in funding, according to the GlobalData Deals Database.

Technology companies have struggled against charges of bias in portrayals of dark skin from the early days of color photography in the 1950s, when companies like Kodak used white models in their color development. Eight years ago, Google disabled its AI program’s ability to let people search for gorillas and monkeys through its Photos app because the algorithm was incorrectly sorting Black people into those categories. As recently as May of this year, the issue still had not been fixed. Two former employees who worked on the technology told The New York Times that Google had not trained the AI system with enough images of Black people.

Experts who study artificial intelligence said that the bias goes deeper than data sets, tracing it to the technology’s early development in the 1960s.

“The issue is more complicated than data bias,” said James E. Dobson, a cultural historian at Dartmouth College and the author of a recent book on the birth of computer vision. There was very little discussion about race during the early days of machine learning, according to his research, and most scientists working on the technology were white men.

“It’s hard to separate today’s algorithms from that history, because engineers are building on those prior versions,” Dobson said.

To decrease the appearance of racial bias and hateful images, some companies have banned certain words from text prompts that users submit to generators, like “slave” and “fascist.”
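The kind of keyword ban described here can be sketched in a few lines. This is a minimal illustration, not any company's actual implementation; the banned terms are the two cited in the article, and the word-matching logic is an assumption for demonstration purposes.

```python
# Hypothetical sketch of a prompt blocklist filter of the kind the article
# describes: prompts containing a banned term are rejected outright.
BANNED_TERMS = {"slave", "fascist"}  # the two example terms cited in the article

def is_prompt_allowed(prompt: str) -> bool:
    """Return False if the prompt contains any banned term as a whole word."""
    words = {w.strip(".,!?\"'").lower() for w in prompt.split()}
    return BANNED_TERMS.isdisjoint(words)

print(is_prompt_allowed("a painting of a slave ship"))   # False
print(is_prompt_allowed("a painting of a pirate ship"))  # True
```

The second example shows why such filters are brittle: as Dinkins found, substituting a near-synonym like "pirate ship" slips past the blocklist while leaving the underlying model's biases untouched.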

But Dobson said that companies hoping for a simple solution, like censoring the kind of prompts that users can submit, were avoiding the more fundamental issues of bias in the underlying technology.

“It’s a worrying time as these algorithms become more complicated. And when you see garbage coming out, you have to wonder what kind of garbage process is still sitting there inside the model,” the professor added.

Auriea Harvey, an artist included in the Whitney Museum’s recent exhibition “Refiguring,” about digital identities, bumped into these bans for a recent project using Midjourney. “I wanted to question the database on what it knew about slave ships,” she said. “I received a message saying that Midjourney would suspend my account if I continued.”

Dinkins ran into similar problems with NFTs that she created and sold showing how okra was brought to North America by enslaved people and settlers. She was censored when she tried to use a generative program, Replicate, to make pictures of slave ships. She eventually learned to outwit the censors by using the term “pirate ship.” The image she received was an approximation of what she wanted, but it also raised troubling questions for the artist.

“What is this technology doing to history?” Dinkins asked. “You can see that someone is trying to correct for bias, yet at the same time that erases a piece of history. I find those erasures as dangerous as any bias, because we are just going to forget how we got here.”

Naomi Beckwith, chief curator at the Guggenheim Museum, credited Dinkins’ nuanced approach to issues of representation and technology as one reason the artist received the museum’s first Art & Technology award.

“Stephanie has become part of a tradition of artists and cultural workers that poke holes in these overarching and totalizing theories about how things work,” Beckwith said. The curator added that her own initial paranoia about AI programs replacing human creativity was greatly reduced when she realized these algorithms knew virtually nothing about Black culture.

But Dinkins is not quite ready to give up on the technology. She continues to employ it for her artistic projects — with skepticism. “Once the system can generate a really high-fidelity image of a Black woman crying or smiling, can we rest?”

This article originally appeared in The New York Times.

Today's News

July 5, 2023



Founder:
Ignacio Villarreal
(1941 - 2019)
Editor & Publisher: Jose Villarreal
Art Director: Juan José Sepúlveda Ramírez
Writer: Ofelia Zurbia Betancourt