Claiming Criticality

In 1977 the poet and educator Adrienne Rich gave a convocation speech at Douglass College in which she reminded her audience:

If university education means anything beyond the processing of human beings into expected roles, through credit hours, tests, and grades (and I believe that in a women’s college especially it might mean much more), it implies an ethical and intellectual contract between teacher and student. This contract must remain intuitive, dynamic, unwritten; but we must turn to it again and again if learning is to be reclaimed from the depersonalizing and cheapening pressures of the present-day academic scene (‘Claiming an Education’).

This year in the reading group we are taking our cue from Rich’s still-resonant words to think about what generative AI means for critical pedagogy in 2025. All of us, students and teachers alike, are still trying to get to grips with how technology has or has not changed what counts as a ‘basic’ skill in university education. Do we need to learn to do things without AI to ensure we understand and think for ourselves? Or should we be integrating automating tools and processes into our work to create time to focus even more on ‘higher-level’ skills, such as critical judgment and analysis? Technology optimists regularly promise that automating basic processes leaves us free to focus on critical ones, but we all know that we use our smartphones to scroll Insta or go shopping at any hour of the day or night, rather than using our freed-up time to read more books or actively reflect on the world we want to build. What does it mean to claim an education in 2025 when there are so many attempts to automate, depersonalize, and commodify teaching and learning?

If you would like to take part in this year’s discussions, or just stay informed about the Reading Group meetings and our other events, please do subscribe to our mailing list at: https://www.jiscmail.ac.uk/cgi-bin/webadmin?SUBED1=CPG&A=1

Tethered to the Grid and Rendered Intelligible: CPG Annual Lecture with Caitlin Heppner

We cordially invite you to attend this year’s annual lecture, which will be delivered by Caitlin Heppner (University of Ottawa). The lecture begins at 4 pm on June 6th, 2025.

Tickets to attend in person or via live stream are available to book now at: https://buytickets.at/studentpartnershipuow/1231625

Tethered to the Grid and Rendered Intelligible

Algorithmic processes are increasingly tasked with normative evaluation within state infrastructure, answering questions such as: Who is a criminal? Where is crime likely to occur? Which family homes are safe? In answering these questions, algorithms classify and constrain subjects through the co-creation of kinds of people. This algorithmic creativity tends to bury both social construction and moral evaluation under the guise of natural classification: a cataloguing of so-called natural properties legitimized by the authority of “The Algorithm.”

In the spirit of Michel Foucault, my work aims to unveil the role of power in the algorithmic construction and control of people. I examine three algorithmic systems to demonstrate how the veiling of social and contingent classifications tricks us into believing that these kinds are natural, and that the ordering of these kinds is therefore natural too. The algorithmic political order is portrayed as simply the natural order of things: there can be no other way. This is the threat of the algorithmic state. By automating normative evaluation (the identification of who might be a criminal or a terrorist, of where dangerous neighbourhoods are, or of what makes a good home), we are denied the possibility of thinking and doing otherwise.

About our speaker

Caitlin Heppner is a PhD student in philosophy at the University of Ottawa in Canada and a researcher at the Canadian Robotics and Artificial Intelligence Ethical Design Lab. Her co-authored work in development includes a “Critical Research into Tech” toolkit for navigating the risks of critiquing Big Tech, a bioethics framework for the public testing of AI, and a series of storytelling modules for ethically conscientious design. This applied work is motivated by the skeptical intuition that nothing should be done simply because it has always been done.

How to find the lecture: The lecture will be held at the University of Westminster, Marylebone Campus. Full travel and access information can be found at: https://www.westminster.ac.uk/about-us/visit-us/marylebone