a project proposal in the form of a stream of consciousness
just went to the yerba buena center for the arts
for their annual public square event
showcasing the work that fellows there have done over the year
and as i was looking at all of the displays and installations,
i wondered if i myself could do this work,
if i were to be a fellow here,
what would it take?
what would i make?
would it be poetry? music? what else?
the installations presented ranged from
zines to sticky notes on a wall to short films
to music remixes to photographs to poetry readings
all centered on things involving social justice and equity
this is a prestigious fellowship, and
brilliance was all around.
the voice inside, the one i had suffocated for so long,
feels it could be brilliant too.
and i started to think.
.
what could i make?
to be engaging
to make a statement
to make a difference
.
my mind immediately drifted to
artificial intelligence — the work that i do now —
and how it relates to society
how it relates to race
how it relates to oppression
and amplifies it
and amplifies our hate
and amplifies the systems of inequities that have existed throughout our history that tech doesn’t disrupt like it thinks it does but actually builds off of and takes life from
and i thought and thought
and couldn’t think of anything
and thought some more
i know how to write books
but who reads a book at an installation?
is a book accessible?
is a book provocative?
in some spaces, yes
but for an installation in person, maybe that’s not the best i could do
there were zines today on display for one of the installations, but
do they engage the audience in the way i want to engage them?
i love the poetry, i love the music
but in the ways that we move in these spaces
in the ways that i observed today
maybe there’s something else to do, a better way to bring them in and say what i want to say but also have it be felt. be known. be understood at the core of their being.
.
i’m being dramatic
i can aim for that lofty goal, but
there’s a certain destructive hubris that comes with trying to be too grandiose
just… make something you think will make people stop and think
there’s always computer vision
remember that time at disneyworld (or was it universal studios?) when you came off the escalators and started to enter the park and there was a big screen playing a live video feed of the people entering? there was something chilling about that and so unavoidable by the eye when you saw it because the display was so huge and there were so many people passing by
and you wonder,
why are they monitoring you
who’s monitoring you
are these being recorded?
who keeps these videos?
did i consent to this?
and my mind continued to think
maybe i can do an installation around computer vision and have a camera in the room and when people walk in, they see themselves, and the system is designed to use AI to be able to track human beings captured by the camera
there are things called bounding boxes in AI, where in pictures and videos you can teach a computer to draw a box around where it thinks there’s an object — let’s add that to the display, so people can see that the AI is following them. observing them. analyzing them.
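a toy sketch of that idea, just to see it — no real detector here, only hypothetical, hand-placed detections drawn as boxes on a character grid, the way an overlay would draw them on a video frame:

```python
def draw_box(grid, x, y, w, h, label):
    """draw the outline of one bounding box onto a 2-d character grid."""
    for cx in range(x, x + w):
        grid[y][cx] = "-"          # top edge
        grid[y + h - 1][cx] = "-"  # bottom edge
    for cy in range(y, y + h):
        grid[cy][x] = "|"          # left edge
        grid[cy][x + w - 1] = "|"  # right edge
    # write the label just inside the top-left corner
    for i, ch in enumerate(label[: w - 2]):
        grid[y + 1][x + 1 + i] = ch

def render(width, height, detections):
    """render every detection onto a blank 'frame'."""
    grid = [[" "] * width for _ in range(height)]
    for d in detections:
        draw_box(grid, d["x"], d["y"], d["w"], d["h"], d["label"])
    return "\n".join("".join(row) for row in grid)

# two made-up detections, as if a tracker had found two people in the frame
frame = render(24, 6, [
    {"x": 1, "y": 0, "w": 8, "h": 5, "label": "person"},
    {"x": 12, "y": 1, "w": 9, "h": 5, "label": "person"},
])
print(frame)
```

the real systems do the same thing with pixels instead of characters: a box of coordinates, and a label, stamped on top of a person.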
it brings in questions of surveillance, of consent
of safety
why are we being watched
who is being watched
how are we being watched
now i’m thinking about joy buolamwini’s gender shades project, where these computer vision AI systems fail far more often on black faces (esp faces of black women) but perform very, very well on white faces (esp faces of white men)
i’m thinking about how these systems often guess your gender, fitting you to the gender binary, and assuming that optics are what defines your gender identity
i’m thinking about how these systems are used to find criminals and deployed by government agencies… how propublica’s machine bias article showed us that black defendants are systematically scored as higher risk than their white counterparts, even when their prior records would not lead you to that conclusion
.
so what if the installation was that:
- a computer vision system
- a camera tracking people entering the room
- and the screen, where people can see the AI doing the work of identifying where the people are in the room
- the public observers entering the room see the AI system on the screen, following their every move
- and, based on the biased training data that many systems deployed in the world use to discriminate unfairly against black individuals, the system labels you as a criminal, or not
that would be a shocking display to see
but shocking to whom?
the black individual who walks into that room sees the video as videos always see them — a black body, surveilled. objectified. and sometimes, invisible.
the white individual does not have this experience. and yet, they are so much of my target audience. the audience i want to shock and make realize, that for many of the rest of us, for the people of color in the world, our reality is so different from yours.
.
i think of derrick bell and his call to reverse reality to see the absurdity of the systems of oppression that are all around us.
.
so how about:
we train the computer vision system on a reversal:
a large dataset where
the criminals included are largely white people
and not the people of color
the system learns from that data
amplifies that bias, now biased against white people
and in this display, people see the automation happening
where as white people enter the room
and engage with the installation
they are labelled as criminals
with boxes around them
the camera, watching them
the AI following their every movement:
criminal.
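a minimal sketch of that mechanic, assuming nothing about real tooling — a toy "model" that just memorizes label frequencies per group from a skewed dataset, the crude version of how a real classifier soaks up the base rates in whatever data it is fed (the groups, labels, and counts here are all invented for illustration):

```python
from collections import Counter, defaultdict

def train(examples):
    """learn, per group, how often each label appears in the training data.
    examples: a list of (group, label) pairs."""
    counts = defaultdict(Counter)
    for group, label in examples:
        counts[group][label] += 1
    return counts

def predict(counts, group):
    """predict the majority label seen for this group -- skewed base
    rates coming back out as 'objective' output."""
    return counts[group].most_common(1)[0][0]

# a hypothetical reversed dataset: the 'criminal' label attached
# mostly to white faces, inverting the skew of real-world data
reversed_data = (
    [("white", "criminal")] * 80 + [("white", "not criminal")] * 20 +
    [("black", "criminal")] * 10 + [("black", "not criminal")] * 90
)

model = train(reversed_data)
print(predict(model, "white"))  # -> criminal
print(predict(model, "black"))  # -> not criminal
```

the system never decides anything. it only repeats the data back, with confidence.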
.
how would that make them feel?
how unnerving would that be?
would they be confronted with the realization — that’s what we’re made to feel like, always?
but of course, “the system is objective”, right?
if the AI system says it’s true
if the algorithm is “objective”, then
this is the truth right?
and so, maybe, they begin to think.
and realize,
the system can be manipulated
the system is manipulated.
it always has been.
.
.
and of course, logistically speaking,
if i applied to be a fellow,
the work that would take a year to do
could be to learn how to create this computer vision system
learn how these systems are built and trained in the real world
select the dataset the way datasets are selected in industry
and follow all the steps to create these very real systems that are very really out here
and document that whole process
and put in that data that has the biases of the world, but in reverse.
that would be an interesting technological contribution
an interesting pedagogical tool
an interesting experiment to showcase how these systems actually amplify bias, in a concrete and tangible way,
and then showcase it, in person, with a live, captive audience,
and make the white individuals really, really have to grapple with this reality.
i think that would be an interesting project.