A group of Portland Community College faculty and administrators is interested in having a public conversation about the ethics and reality of AI. To that end, PCC's AI and Cultural Computing cohort is hosting the AI Symposium, a three-day event grounded in both the science of computing and larger ethical considerations.
“By giving folks an understanding of what the technology is, we can have a discussion around it and thus have more control over it,” David Torres, new media art instructor, told The Skanner.
“How are we able to use this tool and manage it, ethically, morally?”
It is a question the group felt wasn’t being widely explored.
“Having that equity lens was really something we wanted to bring into the conversation,” Anne Grey, teaching and learning coordinator at PCC, told The Skanner, noting that human-developed algorithms have already been shown to reflect racism and misogyny. “For example, we don’t question our databases: Where do they come from? How do they collect data? That’s the premise: Unless we go back to really examine existing inequities, existing flaws, we will be perpetuating the racism. … It’s about how we are inputting that data. And then talking about the other part of ethics: What is considered ethical? What are the lines we are willing to cross? How are we going to be voicing these things?”
To explore this, the group created a three-day program with daily themes: Oct. 18 covers “AI and Education,” Oct. 19 explores “AI and Industry,” and Oct. 20’s focus will be “AI in Everyday Life.”
Kicking off the event is a keynote by a leading voice in computing ethics.
Brandeis Marshall is a computer scientist and former college professor who founded DataedX Group, a data ethics education agency that takes aim at “automated oppression efforts” and promotes culturally responsive approaches instead. It is a rebuttal to Facebook founder Mark Zuckerberg’s breezy “move fast and break things” philosophy.
The cohort used Marshall’s Data Conscience: Algorithmic Siege on our Humanity as a textbook when exploring how best to prepare students for a changing tech landscape.
“There are really few people who are actually talking about it – and not really talking about the impact this would have, not just culturally but in re-establishing or magnifying some of the inequities that we already see,” Grey said.
Marshall focuses on the need for transparency in how AI is developed and applied, accountability in AI development, and strategies for governing AI through both law and the algorithms themselves. Her work has been described as the meeting of social justice and science – presented in an accessible, even engaging, way.
“She really changed my thinking in a way that it hasn’t been changed in a while,” cohort member Melissa Manolas, composition and literature instructor, told The Skanner.
Marshall helped her understand, for example, large language models – the algorithms trained on extensive data sets to model language and generate text.
“You start to understand why the large-language models end up being so misogynistic and racist – it’s how they’re trained,” Manolas said.
“Marshall is knowledgeable about the culture of programming, and points out there are already ethical issues coming up that are being rushed through, but which are really crucial to what you get as an end product. So we need to slow down and really call for that kind of mindfulness about equity at that stage – not just when you release something and then deal with it, but when you’re creating it and programming for it.”
Alongside her critiques, Marshall offers hope.
“It’s her vision of accountability,” Manolas said.
“(These flaws) are not inevitable. Sometimes we hide behind the sense of the inevitable as a way to be non-active about it, and she really doesn’t let people off the hook like that.”
One immediate fix, Marshall suggests, lies in computer science education.
“Marshall pointed out time and again that very few curricula and programs that are producing programmers have ethics classes,” Manolas said. “If they did, if you stopped and slowed it down, you wouldn’t have to wait until you already have these problems. She makes you realize, they’re completely aware that might happen when they’re doing the programming, but there’s such an impetus to rush through that phase and then deal with it once it’s out there.”
Alongside concerns about AI is excitement for its potential as a creative tool – and a means of access.
“Technology and new media arts has this history with access,” Torres said. “When the camera was given to the public in the ’60s, usually you had to have a lot of money to make your own films, and then the common folk were able to use the camera to make their own content. Fast forward to now, YouTube is mostly people just making their own videos. AI is doing something similar. We start asking, OK, what actually is art? Like when it came to cinema during the ’60s when folks had the camera: Beforehand, because Hollywood had cameras, they could just call cinema whatever. Now we’re much more critical, because someone on YouTube could do the same thing.”
Torres continued, “I go back to when Photoshop first came out: Everybody was using it in a very cheesy way, and it was for all the effects it did. But over time, what ends up happening is you fine-tune how the tool can still be used with the human hand, and I think creatively, that’s how things happen in the arts. You look at it in movies, there was a huge craze when it came to 3D in cinema, it was this cheesy thing, and it kind of went away. It’s like that with a lot of technology: Usually artists find a way to include that as an extra brush in their toolkit that they use. There was an era where we thought VR was the new thing. In actuality, VR has these specific moments, whether it’s in healthcare, but it’s not everywhere.”
The symposium, funded by a federal AI education grant, is open to the public and, organizers hope, will spark a broad public conversation.
It is also the result of PCC faculty’s extensive research, study and conversations with experts like Marshall.
“This is the culmination of the cohort coming together and wanting as an artifact to put this symposium together to share, to disseminate and to have a discussion within the community,” Grey said.
Symposium events are either virtual or in-person. Video of virtual presentations will be made available after the symposium.
Symposium agenda below. For more information on Marshall’s work, including free articles, visit https://www.brandeismarshall.com/medium.
Oct. 18
Keynote – Dr. Brandeis Marshall. 10 a.m. to noon, Virtual. Marshall is founder and CEO of DataedX Group, a data ethics learning and development agency that helps educators, scholars and practitioners counteract automated oppression efforts with culturally responsive instruction and strategies. She teaches, speaks and writes about the racial, gender, socioeconomic and socio-technical impact of data operations on technology and society. She wrote Data Conscience: Algorithmic Siege on our Humanity as a counter-argument to tech’s “move fast and break things” philosophy. An ASL interpreter will be present for this keynote event.
AI and Education – 12:30-2 p.m., Virtual. Ahead of the session, AI experts and researchers Cynthia Alby, Kevin Kelly, and Colin Koopman submitted responses to questions from PCC faculty, staff, and students on the impact of AI on teaching and learning. Their responses will serve as the basis of an open discussion among PCC AICC cohort members and session attendees.
AI at PCC – 2:30-4 p.m., Virtual. Presentation and Q&A with PCC AICC cohort members and Academic Affairs administrators on topics including professional development, instructional support, academic integrity and AI, and ChatGPT protocols and best practices. There will also be breakout groups to foster conversations and resource sharing.
Oct. 19
AI Campus Workshop – 4:30-5:30 p.m., Room 225, Technology Education Building. AICC cohort members will host an open lab for students and the general PCC community to showcase resources and equipment available on campuses and facilitate hands-on exercises with commonly used AI tools.
AI in the Workplace – 6-8 p.m., Moriarty Arts and Humanities Building Auditorium. Join us for an opportunity to network and hear a panel discussion with industry experts about AI in the workplace. Panel guests include Will Landecker, Data Science Tech Lead, Nextdoor; Emmanuel Acheampong, co-founder, RoboMUA; and Melissa Evers, Vice President, Software and Advanced Technology Group, and General Manager of Strategy to Execution at Intel Corporation.
Oct. 20
Spotlight Speaker Nick Insalata – 10-11:30 a.m., Moriarty Arts and Humanities Building Auditorium. Join us to hear spotlight speaker Nick Insalata, PCC Computer Science faculty member and AICC cohort member, talk about the impacts of AI on our everyday lives. Insalata is interested in the challenges of making complex problems accessible, properly contextualized, and interesting and fun for learners of all levels.
AI Campus Workshop – 12:30-2 p.m., Room 225, Technology Education Building. Hands-on labs will cover how to create text, images, and even music with AI tools, as well as different ways to incorporate ChatGPT. AI tools include DALL-E 2, Modo Screen, Soundraw, Looka, Legal Robot, and Deep Nostalgia.
AI in Everyday Life Panel Presentation – 2:30-4 p.m., Moriarty Arts and Humanities Building Auditorium. Join us as AICC cohort members lead a discussion of how AI promises transformations both subtle and stunning to our daily lives, from doomscrolling to deepfakes and personal assistants to the ‘end of work.’ Topics will include superintelligence, virtual and augmented reality, ethical AI, and advanced humanoid robots.