I woke up this morning with this thought, related to Academic Integrity (AI) and Artificial Intelligence (the other AI).
What if we took a “disclosure of learning process” approach rather than a prevent-and-punish approach? Ask students to show how tech (and people!) helped them along the way. This would enhance their metacognition and give us insight into how they learn, with or without AI.
I shared this idea with a new colleague, Nadin, today, and she inspired this new wording of “transparency”.
So now I wanna call it a culture of “transparent assessment”.
What do you think? Shall I try it next semester?
BTW, I posted a briefer version of this on Twitter first. The conversation in response is really interesting, so I’m posting my initial tweet here:
Header image of a transparent bubble with snow and sky background from Myriam Fotos on Pixabay.
I think the more we learn about how students are using it, the more we (and they) can decide what seems appropriate, ethical, reasonable, etc.
I love how you frame this collaboratively. Open pedagogy might be a good framework for figuring out how to handle AI with students.
aww, thank you so much for taking the time to tell me this. It means the world! Truly. You know, I wake up with an idea and I think, “would this benefit anyone if I shared it? what the heck, I’ll share just in case”. So it helps to hear this from you!
I love this idea of transparent assessment. I wonder where the boundary would be for too much artificial intelligence in a submission?
I wonder if this will be even more of a continuum and gray area than with plagiarism. It seems like the teacher would want to be able to define that for a particular learning activity.
Thank you so much for articulating this! I would love to read more from you on this idea of transparency with AI and student work and how we can do that with an ethos of hospitality rather than surveillance.