Estimated reading time: 4 minutes, 22 seconds
Today at 6pm UTC join the Teach-In #AgainstSurveillance to support Ian Linkletter – to donate, register, and find out more, go to https://tiny.cc/againstsurveillance
If you are reading this after the fact, here is the link to watch the recording (there is still time to donate!): https://againstsurveillance.net
I’m writing this blog post as a companion to my participation in this event. I am one of the speakers, along with Audrey Watters, Jesse Stommel, and Cory Doctorow – and my fellow panelists kicking off the event “in conversation”: sava saheli singh, Chris Gilliard, and Benjamin Doxtdator.
First off, I want to share a curation of the tweets people posted in response when I asked on Twitter why they were against surveillance. For the most accessible experience if you have a visual impairment, click on the actual tweet below and read the thread of responses:
Tell me, Twitter, why are YOU #AgainstSurveillance?— ℳąhą Bąℓi, PhD مها بالي 🌵 (@Bali_Maha) November 27, 2020
Join us @savasavasava @hypervisible @doxtdatorb @Jessifer @audreywatters @doctorow & me
Dec 1st 6pm UTC & help raise money to support @Linkletter #DigPed #AcademicChatter
Sign up https://t.co/3oonHQyAzr pic.twitter.com/ArobFCdZGk
For curation purposes, I have taken screenshots and added them to these Google slides, along with some tweets from a previous Twitter question I had asked in May/June about technologies people would un-invent; surveillance tech was by far the one most people would choose to un-invent. (This is the thread I incorporated into my OLC Innovate keynote in June, where the CEO of Proctorio tried to tell me how to give a “balanced” keynote.)
Anyway, here are those slides:
Now why am I #AgainstSurveillance ?
- First and foremost, regardless of how you feel about anything else, I think that if our goal in education is to cultivate ethical human beings, we should recognize that this purpose means we want people to behave ethically *even when nobody is watching*. Using surveillance technology implies to students that they are not trusted by default, and it normalizes a culture of surveillance and of complying with surveillance, which has long-lasting repercussions in their lives beyond university. It doesn’t help them become more ethical. It teaches them to fear being watched, and it teaches them that they are considered cheaters until proven honest. My students tell me that the harder teachers try to proctor them with multiple means, the more creative they get at subverting those measures. But if assessments are authentic, interesting, genuinely difficult to cheat on, and yet achievable within the timeframe given – because answers are divergent but students have the tools to answer – they won’t cheat, because they will see the value of learning. Students have also said that Turnitin makes them behave in similar ways: finding ways to subvert the tech rather than genuinely understanding the value of attribution and of the originality of their own ideas and words.
- As a Muslim who travels abroad, I am very aware of the ways foreign governments surveil me – every time I was in line at an airport and chosen for a “random check” (random here meant 50% of my flights between 2001 and 2010) and frisked in public.
- As an Egyptian living in this country, I know that people can get arrested for what they share or like on social media, or even say in private – because surveillance can happen with tech and algorithms, but it can also happen with humans!
- In Egypt, the difference in connectivity from one student to another – honestly, from one minute to the next – is huge. Everyone may get the same amount of time to do an exam, but one person’s device can work much more slowly than others’, whether because it is an older model or because the internet is wonky (some surveillance tech works offline, so connectivity is less of an issue there).
- The additional stress and anxiety induced by this kind of tech makes it questionable what we are actually assessing when we use it. (Doing this in a pandemic, when many people are suffering trauma beyond their normal levels, means that a large number of students will be negatively affected – but even if only a smaller number were affected, what kind of education and assessment is that?) Is it the ability to work while being watched, or to stay still and not look to the side? How well-resourced a home is, such that someone can have a quiet room to themselves for an hour or two? The ability to endure indignity, as Anne-Marie Scott tweeted, or how well students comply with normative rules?
- We know that algorithms and face recognition technologies are biased against darker-skinned people and neurodivergent people. Read Shea Swauger on this.
- Surveillance tech does not promote rigor or integrity. It makes teachers think it’s OK to keep giving poorly designed exams when, in reality, searching online and collaborating with colleagues should be things #HigherEd ENCOURAGES and rewards. What does it mean when we focus our energy on RESTRICTING these good learning practices?
- The hidden curriculum of surveillance tech has way worse long-term and even short-term consequences than any benefit it may have. I understand some people teaching intro STEM courses feel there’s no other way. This is a failure of imagination and a failure to see the big picture. People who know why they are educators will always find a better way.
I will stop here and hope to see you at #Againstsurveillance later today!