EdTech Tools and Privacy

Generative AI Tools & Privacy

Generative AI applications produce new content, such as text, images, video, music, and other media, based on user inputs. These systems learn from vast datasets containing millions of examples, recognizing patterns and structures without explicit programming for each task. This learning enables them to produce new content that mirrors the style and characteristics of the data they were trained on.

AI-powered chatbots like ChatGPT can replicate human conversation. Specifically, ChatGPT is a sophisticated language model that understands and generates language by identifying patterns of word usage. It predicts the next words in a sequence, which proves useful for tasks ranging from writing emails and blogs to creating essays and programming code. Its adaptability to different writing and coding styles makes it a powerful and versatile tool. Major tech companies, such as Microsoft, are integrating ChatGPT-style models into applications like MS Teams, Word, and PowerPoint, a trend other companies are likely to follow.
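To make the "predicts the next words" idea concrete, here is a toy sketch in Python. It is not how ChatGPT actually works (real systems use large neural networks trained on enormous datasets); it only illustrates the underlying principle of predicting the next word from patterns in training text. The sample text is invented for illustration.

```python
from collections import Counter, defaultdict

# Toy bigram "language model": for each word, count which word follows it
# in the training text. This is NOT how ChatGPT is built (it uses a large
# neural network), but the core idea -- predict the next word from patterns
# seen in training data -- is the same.
training_text = (
    "students submit their work students review their work "
    "students improve their work"
)

follows = defaultdict(Counter)
words = training_text.split()
for current_word, next_word in zip(words, words[1:]):
    follows[current_word][next_word] += 1

def predict_next(word: str) -> str:
    """Return the word most often observed after `word` in the training text."""
    candidates = follows.get(word)
    return candidates.most_common(1)[0][0] if candidates else "<unknown>"

print(predict_next("their"))     # -> "work" (follows "their" three times)
print(predict_next("students"))  # -> "submit" (ties broken by first occurrence)
```

Scaled up by many orders of magnitude, this same pattern-completion principle is what lets a model continue an email, an essay, or a block of code.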

Despite their utility, these generative AI tools come with privacy risks for students. As these tools learn from the data they process, any personal information included in student assignments could be retained and used indefinitely. This poses several privacy issues: students may lose control over their personal data, face exposure to data breaches, and have their information used in ways they did not anticipate, especially when data is transferred across countries with varying privacy protections. To maintain privacy, it is crucial to handle student data transparently and with clear consent.

Detection tools like Turnitin now include features to identify content generated by AI, but these tools also collect and potentially store personal data for extended periods. While Turnitin has undergone privacy and risk evaluations, other emerging tools have not been similarly vetted, leaving their privacy implications unclear.

The ethical landscape of generative AI is complex, encompassing data bias concerns that can result in discriminatory outputs, and intellectual property issues, as these models often train on content without the original creators’ consent. Labour practices also present concerns: for example, OpenAI has faced criticism for the conditions of the workers it employs to filter out harmful content from its training data. Furthermore, the significant environmental impact of running large AI models, due to the energy required for training and data storage, raises sustainability questions. Users must stay well-informed and critical of AI platform outputs to ensure responsible and ethical use.


This article is part of a collaborative Data Privacy series by Langara’s Privacy Office and EdTech. If you have data privacy questions or would like to suggest a topic for the series, contact Joanne Rajotte (jrajotte@langara.ca), Manager of Records Management and Privacy, or Briana Fraser, Learning Technologist & Department Chair of EdTech.

EdTech Tools and Privacy

Peer Assessment and Privacy Risks

Instructors, have you considered how privacy, security, and confidentiality apply to teaching and learning, and specifically to the data you gather as part of assessment?

To support teaching and learning, you gather and analyze data about students throughout the year and in many forms, including anecdotal notes, test results, grades, and observations. The tools we commonly use in teaching and learning, including Brightspace, also gather information, and the analytics they collect and the reports they generate are sophisticated and constantly changing. We should therefore carefully consider how to better protect student data.

When considering privacy, instructors should keep in mind that students’ personal information belongs to the students and should be kept private. Students trust their instructors to hold their data in confidence and to share it carefully. This information includes assessment results, grades, student numbers, and demographic information.

Although most students are digital natives, they aren’t necessarily digitally literate. Instructors can help protect students’ privacy by coaching them on what is appropriate to share and helping them understand the potential consequences of sharing personal information.

One area of teaching and learning in which you may not have fully considered privacy, or coached students to withhold personal information and respect confidentiality, is peer assessment. Peer assessment, or peer review, is a structured process in which students critique and give feedback on one another’s work. It helps students develop lifelong skills in assessing the work of others and equips them to self-assess and improve their own. In sharing their work, however, students may also be sharing personal identifying information, such as student numbers, or personal experiences. To help protect students’ personal information and support confidentiality, we recommend you consider the following points.

Privacy Considerations for Peer Assessment 

  • If student work will be shared with peers, tell students not to disclose sensitive personal information. Sensitive personal information may include, for example, medical history, financial circumstances, traumatic life experiences, or their gender, race, religion, or ethnicity. 
  • Inform students of how their work will be assessed by their peers.
  • Consider having students evaluate anonymized assignments to encourage more objective feedback.
  • Coach students to exclude all identifying information, including student numbers; a simple automated scrub, sketched after this list, can help catch obvious identifiers.
  • If students’ work is to be posted online, consider associated risks, such as
    • another person posting the work somewhere else online without their consent; and
    • the content being swept into the training data of generative AI tools like ChatGPT, which learn from content scraped across the internet to craft responses to users’ queries.
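To illustrate what an automated scrub of obvious identifiers might look like, here is a minimal, hypothetical Python sketch. The patterns are assumptions for illustration (a nine-digit student number format and a generic email pattern), not Langara’s actual formats, and such a scrub supplements rather than replaces coaching students and reviewing work by hand.

```python
import re

# Hypothetical identifier patterns -- adjust to your institution's real formats.
# Automated scrubbing is a backstop, not a substitute for human review.
PATTERNS = {
    "student number": re.compile(r"\b100\d{6}\b"),  # assumed 9-digit format
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scrub_identifiers(text: str) -> str:
    """Replace each match of a known identifier pattern with a placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} removed]", text)
    return text

# Invented sample submission line for demonstration.
sample = "Reviewed by 100123456 (jdoe@example.ca): the thesis is clear."
print(scrub_identifiers(sample))
# -> "Reviewed by [student number removed] ([email address removed]): the thesis is clear."
```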

This article is part of a collaborative Data Privacy series by Langara’s Privacy Office and EdTech. If you have data privacy questions or would like to suggest a topic for the series, contact Joanne Rajotte (jrajotte@langara.ca), Manager of Records Management and Privacy, or Briana Fraser, Learning Technologist & Department Chair of EdTech.