Accessibility Handbook for Teaching and Learning

Langara’s Educational Technology Department is excited to officially launch the Accessibility Handbook for Teaching and Learning. This free book is a comprehensive guide to digital accessibility, explaining key concepts such as colour contrast, alternative text, and closed captioning. That foundational knowledge is complemented by detailed, step-by-step instructions for creating accessible content in platforms like Brightspace, PebblePad, PowerPoint, and Word.

With the introduction of the Accessible British Columbia Act, post-secondary institutions will be required to adhere to accessibility standards for service and education. Langara has formed an accessibility committee and created an accessibility plan to identify, remove, and prevent barriers to accessibility. To meet those goals, educational materials must be accessible and inclusive. Creating accessible material will require consistent effort and a refocusing of how digital content is created.

Screenshot of quick guide from the handbook

Accessibility is not a feature; it’s a necessity. The Accessibility Handbook for Teaching and Learning provides faculty, staff, and students with the knowledge and tools to incorporate accessibility into their existing workflows. Inside this deliberately structured and fully accessible web book, readers will find an extensive resource that includes quick-start guides, step-by-step instructions, demonstration videos, and links to further resources. Best of all, the guide is maintained and frequently updated by Langara’s Assistive Technologist, Luke McKnight.

The Accessibility Handbook for Teaching and Learning empowers all Langarans with the knowledge and tools to make accessible digital content. Accessible content is essential to learner success and democratic access to information, and it minimizes feelings of othering. Creating and choosing more accessible content is essential to making inclusivity a reality for your team, your classroom, and your classmates.

The handbook is available for free under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, allowing you to use, share, and adapt the material as needed.

Contact Luke McKnight or assistivetech@langara.ca for more information, including direct support in creating accessible content.

Using AI to Enhance Accessibility

The Langara Accessibility Handbook for Teaching and Learning has a new chapter on AI Generated Alt Text. Generative AI chatbots have rapidly improved their ability to recognize and describe images, and Copilot, ChatGPT, and Gemini have proven useful for generating image descriptions that provide a starting point for writing alternative text (alt text). This new resource explains:

  • How to upload images to Copilot, ChatGPT, and Google Gemini.
  • Effective prompts to get image descriptions.
  • How to refine output to write effective alt text.
  • What alt text is, when to use it, and how to write it.
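
The handbook chapter walks through the chat interfaces for Copilot, ChatGPT, and Gemini. For anyone who would rather script the same workflow, the sketch below shows the general idea using the OpenAI Python SDK; the model name, file name, and prompt are illustrative assumptions rather than steps from the handbook, and the output is only a draft that needs human review before it becomes alt text.

```python
from openai import OpenAI
import base64

# Minimal sketch, assuming the OpenAI Python SDK and an API key in the
# OPENAI_API_KEY environment variable. The image file name is hypothetical.
client = OpenAI()

with open("course_diagram.png", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text",
             "text": "Describe this image in one or two sentences for use as "
                     "alt text. Focus on the information it conveys, not on "
                     "decorative details."},
            {"type": "image_url",
             "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
        ],
    }],
)

# The description is a starting point only: review, shorten, and correct it
# before adding it to the alt text field in Brightspace, Word, or PowerPoint.
print(response.choices[0].message.content)
```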

In addition to this new resource, EdTech is offering a workshop on using AI to enhance accessibility. Join EdTech on May 15th for this interactive session, perfect for anyone creating content with visual elements. Adding alt text to images is essential to creating accessible content. AI can be a useful tool to help start the process of writing alt text, and this session will introduce effective tools and prompts.

The session offers a blend of theory and practice, with hands-on exercises in crafting prompts that yield precise and useful results. This workshop will equip Langarans with the skills to use generative AI in support of accessibility and inclusion. EdTech AI and accessibility experts will be on hand to help. Participants are encouraged to bring visual material that needs alt text.

Register for Enhance Accessibility Using AI today!

EdTech Tools and Privacy

Generative AI Tools & Privacy

Generative AI applications produce new content, such as text, images, videos, music, and other forms of media, based on user inputs. These systems learn from vast datasets containing millions of examples, recognizing patterns and structures without needing explicit programming for each task. This learning enables them to produce new content that mirrors the style and characteristics of the data they were trained on.

AI-powered chatbots like ChatGPT can replicate human conversation. Specifically, ChatGPT is a sophisticated language model that understands and generates language by identifying patterns of word usage. It predicts the next words in a sequence, which proves useful for tasks ranging from writing emails and blog posts to drafting essays and programming code. Its adaptability to different writing and coding styles makes it a powerful and versatile tool. Major tech companies, such as Microsoft, are integrating the technology behind ChatGPT into applications like Teams, Word, and PowerPoint, a trend other companies are likely to follow.

Despite their utility, these generative AI tools come with privacy risks for students. As these tools learn from the data they process, any personal information included in student assignments could be retained and used indefinitely. This poses several privacy issues: students may lose control over their personal data, face exposure to data breaches, and have their information used in ways they did not anticipate, especially when data is transferred across countries with varying privacy protections. To maintain privacy, it is crucial to handle student data transparently and with clear consent.

Detection tools like Turnitin now include features to identify content generated by AI, but these tools also collect and potentially store personal data for extended periods. While Turnitin has undergone privacy and risk evaluations, other emerging tools have not been similarly vetted, leaving their privacy implications unclear.

The ethical landscape of generative AI is complex, encompassing data bias concerns that can result in discriminatory outputs, and intellectual property issues, as these models often train on content without the original creators’ consent. Labour practices also present concerns: for example, OpenAI has faced criticism for the conditions of the workers it employs to filter out harmful content from its training data. Furthermore, the significant environmental impact of running large AI models, due to the energy required for training and data storage, raises sustainability questions. Users must stay well-informed and critical of AI platform outputs to ensure responsible and ethical use.


This article is part of a collaborative Data Privacy series by Langara’s Privacy Office and EdTech. If you have data privacy questions or would like to suggest a topic for the series, contact Joanne Rajotte (jrajotte@langara.ca), Manager of Records Management and Privacy, or Briana Fraser, Learning Technologist & Department Chair of EdTech.

What’s an Assistive Technologist?

To find out the answer to this question and learn a bit about what Langara’s Assistive Technologist Team has been working on, check out the accompanying video. 

If you have any questions or would like to learn more about how the Assistive Technologist can support you and your students, please email assistivetech@langara.ca. If you would like to demo the course, you can self-register from the Brightspace homepage.