It’s no secret by now that artificial intelligence has a white guy problem. One could say the same of almost any industry, but the tech world is singular in rapidly shaping the future. As has been widely publicized, the unconscious biases of white developers proliferate on the internet, mapping our social structures and behaviors onto code and repeating the imbalances and injustices that exist in the real world.
There was the case of Google Photos classifying black people as gorillas; the computer system that rejected an Asian man’s passport photo because it read his eyes as being closed; and the controversy surrounding the predictive policing algorithms deployed in cities like Chicago and New Orleans, which enable police officers to pinpoint individuals the software deems predisposed to crime, giving rise to accusations of profiling. Earlier this year, the release of Google’s Arts & Culture app, which allows users to match their faces with historical paintings, produced less-than-nuanced results for Asians as well as African-Americans. And a new book, Safiya Umoja Noble’s Algorithms of Oppression: How Search Engines Reinforce Racism, argues that search engines themselves are inherently discriminatory.
“Data sets reflect the hierarchy in which we live,” said Kate Crawford, an artificial intelligence expert at NYU, at a recent research and development salon at the Museum of Modern Art. These biased algorithms and skewed data sets “reify what are fluid social categories,” she said, making the white man the online default, the norm. (Type “CEO” into Google image search and you’ll see an endless mosaic of suited white guys.)
Given all of this, it’s no wonder that artist Stephanie Dinkins was surprised when she came across a brown-skinned robot. Five years ago, she stumbled upon a YouTube video of an animatronic human bust, named Bina48, shown moving and conversing. The robot was commissioned by entrepreneur Martine Rothblatt, who had it modeled after her wife, Bina Rothblatt, a woman of color. Dinkins recalls her astounded initial reaction: “What the heck is that?” she wondered. “How did a black woman become the most advanced of the technologies at the time?” That encounter led Dinkins to Lincoln, Vermont, where Bina48 lives at Rothblatt’s Terasem Movement Foundation, to engage in a series of conversations with the robot about race, intimacy, and the nature of being.
Dinkins has since gone down what she calls a “rabbit-hole” of investigations into the way that culture—particularly the experiences of race and gender—is codified in technology. She has become a strong voice sounding the alarm about the dangers of minority populations being absent from the creation of the computer algorithms that now mold our lives. Her research into these imbalances has taken her on a head-spinning tour of tech companies, conferences, and residencies over the past few years. Dinkins is currently in residence at the tech programs of both Pioneer Works and Eyebeam, nonprofit art centers based in Brooklyn. She regularly leads community workshops that educate people broadly about the impact of AI on our lives, aiming to cultivate an attitude toward technology that sees it not as an intimidating monolith—the haunting specter of computers gone awry that we see so often in Black Mirror or, most iconically, in the cunning and calculating character of HAL in Stanley Kubrick’s 1968 film 2001: A Space Odyssey—but as an approachable tool that is, for better or worse, very human.
“We live in a world where we have to always be learning and willing to take on new information, and to do the work to get there, otherwise we’re sunk,” she says. “How do you move forwards in this super fast technological world?” She operates under the principle that if she can get people to think about the future in increments, it’s not quite so daunting. “In five years, what’s my world going to look like? What do I need to be doing now to start dealing with that world?”
Dinkins tries to find accessible avenues into what can seem like brain-scrambling concepts by speaking the language of her target group. In recent workshops at the New York nonprofit space Recess, she worked with kids who’d been diverted from the criminal justice system. She began by inviting them to wrap their heads around what an algorithm is, exactly, finding analog comparisons in “basic things you can do without thinking,” like brushing your teeth, or behavioral tendencies, like those that shape encounters with police. She helped them see the way in which they are hardwired to react in such moments of conflict. They worked on Venn diagrams to visualize these interactions from their own point of view and that of a cop: What is each person thinking in this shared moment? Where do their perspectives overlap? “Some of [the kids] can be very reactionary, which makes the situation worse,” says Dinkins. “That’s where the algorithm has to change.”
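To make the analogy concrete: written down, an everyday routine like tooth-brushing has the same shape as code, a fixed sequence of steps plus conditions that change the outcome. The sketch below is purely illustrative and not material from Dinkins’s workshops.

```python
# An everyday routine expressed as an algorithm: ordered steps plus a
# condition that changes the outcome. The steps are illustrative.

def brush_teeth(toothpaste_left: bool) -> list[str]:
    steps = ["pick up toothbrush"]
    if toothpaste_left:
        steps.append("apply toothpaste")
    else:
        # A condition: the routine branches depending on circumstances.
        steps.append("add toothpaste to the shopping list")
    steps += ["brush for two minutes", "rinse"]
    return steps

print(brush_teeth(toothpaste_left=True))
```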
From that familiar territory, Dinkins works her way into talking about online systems and chat bots—pieces of software that emulate the conversational style of humans, and evolve as people enter into dialogue with them—as well as the larger goal of training AI to use language and ideas that relate to a more diverse range of worldviews. Participants in her workshops will often have a go at setting the intentions of a bot, then implanting it with data. One group created a bot whose sole purpose was to tell jokes. The input? Yo mama jokes. “I thought that was just amazing,” says Dinkins. “It’s the idea of taking one’s own culture and putting it into the machine, and using that to figure out how the machine is making decisions.”
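What “setting the intentions of a bot, then implanting it with data” might look like in practice can be sketched in a few lines. The retrieval approach and the seed lines below are assumptions for illustration, not the workshop’s actual software: the bot’s one intention is to tell jokes, its data is a small corpus, and it replies with whichever stored line best matches the prompt.

```python
# A minimal retrieval-style chatbot: seed it with lines of text (its
# "data"), then answer a prompt with the stored line that shares the
# most words with it. Corpus and matching rule are illustrative stand-ins.

def build_bot(corpus: list[str]):
    def reply(prompt: str) -> str:
        prompt_words = set(prompt.lower().split())
        # Pick the stored line with the largest word overlap.
        return max(corpus,
                   key=lambda line: len(prompt_words & set(line.lower().split())))
    return reply

jokes = [
    "Yo mama is so old her birth certificate is in Roman numerals.",
    "Yo mama is so nice she bakes cookies for the whole block.",
]
bot = build_bot(jokes)
print(bot("tell me one about being old"))  # matches the first joke
```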
Dinkins herself is busy turning the experiences of three generations of her family into a bot, an oral-history-cum-memoir of a black family in America in the form of a computer algorithm. She, her aunt, and her niece have been interviewing one another intensively for the past several months, using stock questions intended to get at the fundamentals of their values, ethics, experiences, and the history of their family. The raw interviews are put into a machine learning system that digests them and generates an amalgam of the three family members—a voice in the machine whose manner of speech is a reflection of the family’s language and concerns.
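The article doesn’t detail how that machine learning system works, but one plausible, much-simplified shape for the pipeline is retrieval over the pooled transcripts: index every interview answer, then respond to a question with the closest stored one. Everything below, from the snippets to the matching method, is a placeholder sketch, not Dinkins’s implementation.

```python
# A toy version of an interview-driven bot: pool transcript snippets,
# index them with TF-IDF, and answer a question with the most similar
# stored answer. The snippets are invented placeholders.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

answers = [
    "Family means showing up for each other, no matter what.",
    "My grandmother taught me that love is a daily practice.",
    "We value honesty, even when it costs us something.",
]

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(answers)

def ask(question: str) -> str:
    # Rank stored answers by cosine similarity to the question.
    scores = cosine_similarity(vectorizer.transform([question]), matrix)[0]
    return answers[scores.argmax()]

print(ask("What does family mean to you?"))
```

A real version would generate new sentences rather than retrieve stored ones, which is where the amalgam, a voice that sounds like all three women at once, would come from.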
“I can already see that when you ask [the bot] stuff, it sounds sort of like my family,” says Dinkins. “I know that love is going to come out, that family is going to come out. We haven’t even fed it that much data yet, but it sounds like us. It’s kind of magical. I’ve been thinking about making another bot that’s all about telling people they’re loved. But I realize that’s just my family coming out in another way.”
Dinkins sees in this algorithmic memoir something of a proof of concept: the potential to illustrate how different AI could look when it reflects the experiences and values of a more diverse set of people, and is divorced from market values. “It’s amazing that you put in a certain ethos and ethics and it comes back out,” she says. “What does that mean when it’s detached from commercial imperatives? Because I think that’s super important too. If we’re all after the next buck, we know what we get already. It could have value commercially, but it isn’t about commercial value.”
Dinkins has settled into the reality that advocating for greater representation and human values in code will probably be her life’s work. “The project keeps growing, which is both excellent and crazy,” she says. “People are listening to me, so I’m talking about something that needs to be said, clearly. There’s an urgency about it.”
Taking on the biases of a vast, multinational web of artificial intelligence technologies is no small task. Fortunately, Dinkins is part of a small but growing community of academics, technologists, multidisciplinary professionals, and organizations—like Black Girls Code and Black in AI—who recognize the threat at hand. “It’s a monster,” she says of the scale of the problem, “but I don’t think we can afford to have an adversarial relationship with technology. The work to try to get there is really worth the effort.”
That goal may require radically breaking with received code. It brings to mind the HBO series Westworld, in which we see a fantasy universe populated by robots—and created by white men. Thandie Newton, a black British actor, plays the bot Maeve Millay, the manager of a brothel who gradually unravels the true nature of her reality: that her every action is the result of computer algorithms written by men. “All my life I’ve prided myself on being a survivor. But surviving is just another loop,” she says in one scene of the first season. In another: “Time to write my own fucking story.”