Accountable Computer Systems

ARST 556Q – CPSC 538S – LAW 432D – LIBR 569C

We rely on computer systems to store, process, and transmit practically all data whose reliability, accuracy, authenticity, privacy, security, and integrity are vital and regulated. Applications in human resources, medicine, education, finance, and public surveillance use machine learning models to make critical decisions that directly affect people. However, the systems on which these applications run are opaque: we rarely know what decisions are being made, how and why they are being made, or how much certainty any piece of software has regarding them. This lack of transparency erodes public trust and deprives people of their agency. Solutions to this problem lie at the intersection of technology, recordkeeping and preservation, law, public policy, and business. Yet few individuals understand the language, concepts, constraints, requirements, and possibilities of more than one of these fields, let alone all of them. This course brings together students from a diverse set of backgrounds to learn from an equally diverse group of faculty and outside experts in law, computer science, public policy, artificial intelligence, digital records management and preservation, philosophy, and machine learning. Students will learn how to identify real problems that may require technical or partially technical solutions, the language in which to communicate across disciplines, and possible approaches for addressing the most pressing challenges.

We invite graduate students to consider the complex societal impact of computer systems. Students will form interdisciplinary teams to undertake a project of their choice, reimagining technology to reduce its negative impact based on technical, ethical, socio-economic, and legal considerations.

Course format

This course combines lectures, presentations by guest speakers, and student-led discussions. We emphasize students’ ability to engage in non-academic public scholarship.

Learning Outcomes

After taking this course, students will be able to:

  1. Identify and analyze the ways in which technical solutions can affect society.
  2. Address conflicting disciplinary concepts and viewpoints, and suggest solutions.
  3. Communicate and work in interdisciplinary teams.
  4. Produce scholarly material and activities that engage the public.
  5. Communicate effectively with diverse audiences.
  6. Interact with the public in a professional way.

Course admissions

This course is open to any UBC graduate student but is designed for those enrolled in graduate programs in Law, Information Studies, Sociology, and Computer Science. Admission is based on home department criteria.

Where to register?

Information Science Students: LIBR 569C 002

Computer Science Students: CPSC 538S

Law Students: LAW 432D 003

Archival Studies Students: ARST 556Q 002

Invited Speakers

Christopher Millard

Lilian Edwards

Lorenzo Cavallaro

UBC Expert Speakers

Vered Shwartz

Instructors

Benjamin Goold

Cristie Ford

Laura Nelson

Margo Seltzer

Teaching Assistants

Jessie Gomberg

Soo Yee Lim

Schedule

Winter Term 2 – Mondays, 2:00-5:00 pm

| Date | Lecture I | Lecture II | Activity | Deadline (Fridays) |
|---|---|---|---|---|
| January 8 | Course Introduction (Prof. Pasquier) | Computer Security I (Prof. Pasquier) | Ice Breaker | N/A |
| January 15 | Computer Security II (Prof. Pasquier) | Computer Security III (Prof. Pasquier) | Reading Group/Seminar | N/A |
| January 22 | Computer Security IV (Prof. Pasquier) | Computer Security V (Prof. Pasquier) | Student-led discussion | N/A |
| January 29 | Project Proposal Presentation | | | Project Proposal |
| February 5 | Introduction to Cloud Computing (Prof. Pasquier) | Data Sovereignty (the Indigenous perspective) (Prof. Nelson) | Student-led discussion | Op-ed 1 (draft) |
| February 12 | Guest: Generative AI and Education (Prof. Shwartz) | Introduction to Fairness (Prof. Pasquier) | Reading Group/Seminar | Op-ed 1 (peer) |
| February 19 | Mid-term Break | | | Proposal Review |
| February 26 | Technological authenticity and authentication I (Prof. Duranti) | Technological authenticity and authentication II (Prof. Duranti) | Student-led discussion | Op-ed 1 (final); Proposal (final) |
| March 4 | Technological authenticity and authentication III (Prof. Duranti) | Data Provenance (Prof. Seltzer) | Reading Group/Seminar | N/A |
| March 11 | Risk to Privacy on the Internet I (Prof. Goold) | Risk to Privacy on the Internet II (Prof. Goold) | Student-led discussion | Op-ed 2 (draft) |
| March 18 | Project Presentation | | | Op-ed 2 (peer) |
| March 25 | TBC (Prof. Pasquier) | Surveillance Capitalism (Prof. Nelson) | Student-led discussion | Op-ed 2 (final) |
| April 1 | Easter Monday | | | |
| April 8 | Bias and Machine Learning – the sociological perspective (Prof. Nelson) | Bias and Machine Learning – legal dimensions (Prof. Thomasen) | Reading Group/Seminar | Project Report |

Public Lectures

March 20th – Trustworthy AI for Systems Security

April 9th – Regulation in the Age of Cloud Computing & Generative AI

Assessments

Evaluation Criteria and Grading

  1. Classroom discussion and facilitation – 20%
  2. Individual Op-eds – 20%
  3. Interdisciplinary project – 60% (proposal 10%, report 20%, presentation 15%, video 15%)

Classroom discussion and facilitation – 20%

Each week, we will spend part of our class discussing the assigned reading material for that week. For selected classes, small teams of 2 or 3 students will serve as discussion leaders (leaders will be identified in Week 2). Discussion leaders will have two responsibilities:

  • Before class. Prepare 2-3 discussion questions on a topic chosen by the students and relating to material covered during classes. Students are encouraged to look into recent events relating to computer technology and its impact on individuals and society. You may add up to two additional articles (academic articles or good-quality news articles) related to your topic of choice. Send the list of questions and reading material to all class members (including the instructors) by 4:00 pm on the Friday before the Monday class so that your classmates can reflect on the questions and come to class prepared for discussion.
  • During class. Lead a total of ∼50 minutes of activities: a brief introduction (5-10 minutes) to the topic of the day, including the rationale for the discussion questions mentioned above, followed by 30-45 minutes of discussion or activities. Students must demonstrate the ability to (1) explain in depth both the background and the current context of class topics, and (2) design a creative and engaging “game plan” for the class discussion.

We expect all class members to complete the required readings and to participate actively in class discussions and activities. The grade for this component will be based both on the ability to lead a discussion and on participation throughout the term.

Individual Op-eds – 20%

At the end of Weeks 8 and 11, students will submit op-eds (∼600-800 words) based on a topic covered in one of the preceding weeks’ classes. Classmates will review these submissions before grading, to encourage interaction between different viewpoints. The op-eds must be accessible to a broad, non-expert audience.

Submission Instructions.

Draft + Peer Review on HotCRP (link available on Canvas).

Final submission should be made directly on Canvas.

Interdisciplinary project – 60%

Interdisciplinary groups of 3-5 students will explore and address a concrete challenge, examining its technical, societal, and legal aspects and proposing ways of addressing it. At the end of Week 4, students will submit a proposal that will be both peer-reviewed and discussed with the instructors (10% of the course grade). Groups will work on this project for the duration of the course. They will prepare a presentation (15% of the course grade) to be delivered to an audience of their peers. Each group will also prepare a video (15% of the course grade) for a general audience, to be published on the course website. Finally, students will write a report (∼4,000 words, not including figures/tables/charts and references) that must be accessible to an interdisciplinary academic audience (20% of the course grade). Careful consideration must be given to the target audience of each project outcome.

Submission Instructions.

Proposal + Peer Review on HotCRP (link available on Canvas).

Final Report and Video Presentation submission should be made directly on Canvas.

Written Assignment Grading Criteria

Assignments will be graded on a 0 to 10 scale across the following key areas:

Structure & Logic

Thesis Clarity: Is there a clear thesis statement outlining the main argument?

Organization: Is the content logically organized from introduction to conclusion?

Argument Linkage: Are arguments logically connected and supportive of the thesis?

Argumentation & Evidence

Argument Relevance: Are the chosen arguments directly relevant to the thesis?

Evidence Use: Are arguments supported with appropriate evidence and examples?

Counterargument Consideration: Are opposing viewpoints (when appropriate) acknowledged and addressed?

Understanding of Subject

Conceptual Mastery: Is there a thorough understanding of the topic and key concepts?

Contextual Insight: Does the work reflect awareness of the broader context and implications?

Clarity & Presentation

Language: Is the assignment written in clear, correct English?

Readability: Are complex ideas communicated effectively?

Audience: Is the work understandable by the target audience?

Visual Aids

Design & Integration: Are figures and tables well-designed and relevant to the text?

Course Policies

Late Assignments. You will not receive credit for late assignments. Contact the instructor or your TA promptly (i.e., as soon as you are aware of the problem) if a medical or family reason prevents you from handing in any component of your written assignments on time. The same policy applies to oral presentations.

In extraordinary circumstances, we may allow late submission of some assignments if you contact course staff (send an e-mail using your UBC e-mail account) with a clear explanation of the problem well in advance of the deadline (i.e., at least 48 hours). Poor planning or procrastination does not constitute extraordinary circumstances.

Academic Integrity. The academic enterprise is founded on honesty, civility, and integrity. As members of this enterprise, all students are expected to know, understand, and follow the codes of conduct regarding academic integrity. At the most basic level, this means submitting only original work done by you, acknowledging all sources of information or ideas, and attributing them to others as required. It also means you should not cheat, copy, or mislead others about what is your work. Violations of academic integrity (i.e., misconduct) lead to the breakdown of the academic enterprise, and therefore serious consequences arise and harsh sanctions are imposed. For example, incidents of plagiarism or cheating may result in a mark of zero on the assignment or exam, and more serious consequences may apply if the matter is referred to the President’s Advisory Committee on Student Discipline. Careful records are kept in order to monitor and prevent recurrences.

Academic Misconduct at UBC. Official information about Academic Integrity and Misconduct can be found at the following links:

Respectful Environment. Everyone involved with this course is responsible for understanding and abiding by the UBC Statement on Respectful Environment for Students, Faculty and Staff. The statement speaks to our freedoms and our responsibilities, and provides the guiding principles to support us in building an environment in which respect, civility, diversity, opportunity and inclusion are valued.

Reading Material

Week 1

Cobbe, Jennifer, Michael Veale, and Jatinder Singh. “Understanding accountability in algorithmic supply chains.” Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency. 2023. link

Kroll, Joshua A. “Accountability in computer systems.” The Oxford handbook of ethics of AI (2020): 181-196. link

Bonneau, Joseph, et al. “The quest to replace passwords: A framework for comparative evaluation of web authentication schemes.” 2012 IEEE Symposium on Security and Privacy. IEEE, 2012. link

Ion, Iulia, Rob Reeder, and Sunny Consolvo. “‘… no one can hack my mind’: Comparing Expert and Non-Expert Security Practices.” Eleventh Symposium On Usable Privacy and Security (SOUPS 2015). 2015. link

Week 2

Akhawe, Devdatta, et al. “Here’s my cert, so trust me, maybe? Understanding TLS errors on the web.” Proceedings of the 22nd international conference on World Wide Web. 2013. link

Akhawe, Devdatta, and Adrienne Porter Felt. “Alice in warningland: a Large-Scale field study of browser security warning effectiveness.” 22nd USENIX security symposium (USENIX Security 13). 2013. link

Week 3

Dingledine, Roger, Nick Mathewson, and Paul F. Syverson. “Tor: The second-generation onion router.” USENIX security symposium. Vol. 4. 2004. link

Murdoch, Steven J., and George Danezis. “Low-cost traffic analysis of Tor.” 2005 IEEE Symposium on Security and Privacy (S&P'05). IEEE, 2005. link

Watch: Blockchains Are a Bad Idea (James Mickens), Harvard Business School

Week 4

N/A (Project Proposal presentations)

Week 5

Armbrust, Michael, et al. “A View of Cloud Computing.” Communications of the ACM 53.4 (2010): 50-58. link

Will Engle and Valeria De La Vega. “Open Dialogues: Daniel Heath Justice on Decolonizing Open.” 2020. link

Desi Rodriguez-Lonebear (2016). “Building a Data Revolution in Indian Country.” In T. Kukutai & J. Taylor (Eds.), Indigenous Data Sovereignty. Canberra: Australia National University Press. link

Kimberly R. Huyser (2020). “Data and Native American Identity.” Contexts 19 (3): 10-15. link


Week 6

Understanding the world of AI, CBC Vancouver link

Kyi, Lin, et al. “Investigating deceptive design in GDPR’s legitimate interest.” Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. 2023. link

The Trouble with Bias - NIPS 2017 Keynote - Kate Crawford

Tutorial: fairness definitions and their politics

Fairness and Machine Learning: Limitations and Opportunities. Solon Barocas, Moritz Hardt, Arvind Narayanan (this is a book; you are not expected to read all of it)

How We Analyzed the COMPAS Recidivism Algorithm. Jeff Larson, Surya Mattu, Lauren Kirchner and Julia Angwin

Artificial intelligence needs to be trained on culturally diverse datasets to avoid bias. Vered Shwartz

Week 7

Luciana Duranti, Corinne Rogers and Kenneth Thibodeau, “Authenticity,” Archives and Records, 43:2 (July 2022), 188-203. [article on canvas]

Jeremy Davet, Babak Hamizadeh and Pat Franks, “Archivist in the Machine: Paradata for AI-Based Automation in the Archives,” Archival Science (2023) 23: 275–295. [article on canvas]

Hoda Amal Hamouda, “Authenticating Citizen Journalism Videos by Incorporating the View of Archival Diplomatics into the Verification Processes of Open-source Investigations (OSINT),” IEEE Sorrento 2023, Conference Proceedings link

Week 8

Carata, Lucian, et al. “A Primer on Provenance: Better understanding of data requires tracking its history and context.” Queue 12.3 (2014): 10-23. link

Week 9

Clement, Andrew, and Jonathan A. Obar. “Canadian internet “boomerang” traffic and mass NSA surveillance: Responding to privacy and network sovereignty challenges.” Law, privacy and surveillance in Canada in the post-Snowden era (2015): 13-44. link

Solove, Daniel J. “The myth of the privacy paradox.” Geo. Wash. L. Rev. 89 (2021): 1. link

R. v. Bykovets, 2024 SCC 6 link

Week 10

N/A (Project presentations)

Week 11

N/A

Week 12

N/A (Easter)

Week 13

Alina Arseniev-Koehler & Jacob G. Foster (2022). “Machine Learning as a Model for Cultural Learning: Teaching an Algorithm What it Means to be Fat.” Sociological Methods & Research, 51(4), 1484–1539. link