Many cultural figures I came to admire from childhood to adulthood have passed away over the past few years. One of them was Trugoy the Dove, aka Plug 2, aka Dave, from the foundational hip-hop group De La Soul. Looking back, their interpretation of artificial intelligence, which they called "Art Official Intelligence," highlighted how the digital hyper-intelligence we create as a society can be used to amplify what's organically powerful about the vibrant experiences and contributions of human culture. So I ask, as these technologies continue to advance: what's currently happening with AI, particularly as it is being integrated into the learning experiences of Black and Brown students?
Reading Ruha Benjamin's book Race After Technology: Abolitionist Tools for the New Jim Code taught me that code is only as good as the mind, or the collective minds, of the coder(s). It doesn't matter if that code is "analog," like a school code of conduct policy, a board resolution, or instructional guidance in a printed curriculum/textbook, or "digital," like a search engine, facial recognition technology, chatbots, or typing assistance software. If the minds are polluted, so are the mechanisms. Ask the traumatized workers in Kenya, whom OpenAI paid less than $2 an hour to comb through and label extensive amounts of profane and offensive data from the darkest parts of the internet, all to stop ChatGPT from spitting out sexist and racist remarks.
So what does this mean for Black and Brown students when the “analog code” has already led to lowered expectations about rigorous work, disproportionate suspension rates, less access to teachers who look like them (despite the data-based benefits), and more access to less experienced teachers? When things become more deeply and intuitively digital in the classroom, school, and system, will the code be just as deeply and intuitively racist?
AI chatbot therapy tools for students already exist. If the collective mind that made the analog code in any way informs these digital endeavors, there may be problems. Thanks to the labor of those exploited Kenyan workers, ChatGPT can be a powerful engine for creating or revising curricular materials to be rigorous, relevant, relational, and intellectually accessible (i.e., providing scaffolds). However, if the user's analog policies for instruction were based on low expectations and bias, that same AI program can be used to weaken, narrow, and oversimplify curricular materials. One may say, "Well, that's not the program's fault," but the truth is that enough of this data will inform how the AI operates, the same way earlier iterations of ChatGPT spewed volatile content.
I am glad to see folks like Digital Promise, Ed Tech Equity, and Getting Smart provide audits, supports, and parameters to ensure that curriculum publishers and professional learning networks leave the bias out of the binary. I am grateful for AI-based curriculum efforts like Century, Carnegie Learning, and Cognii, and for their due diligence in creating curriculum that is personalized and informed by neuroscience and untainted data. With that said, I am looking forward to the day when AI-based curriculum goes to the next step, where it is all those things but also "Art Official," meaning it's not just responsive to student needs but identity-catalytic as well: serving as an amplifier, not a stabilizer, nullifier, or replacement.
If you are also curious about what that may look like, I'll be checking out the book Techno-Vernacular Creativity and Innovation: Culturally Relevant Making Inside and Outside of the Classroom. Also, check out a convo I got to have with Varun Arora about his book Artificial Intelligence in Schools. We can learn what this may look like together! Peace and Progress.
RIP Dave.