Generative AI and STEM

Background

Artificial intelligence is not new. It has been part of our personal and work lives for a long time (autocorrect, facial recognition, satnav, etc.), and large language models like ChatGPT have been a major topic in education since version 3.5 was released in late November 2022. Large language models (LLMs) are trained on enormous amounts of data in order to recognize the patterns of and connections between words, and then produce text based on the probability of which word is most likely to come next. One thing that LLMs do not do, however, is computation. That said, the most recent OpenAI release, GPT-4, has made notable strides on standardized tests in many STEM areas, and GPT-4 now has a plug-in for Wolfram Alpha, which does do computation.
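
To make the "next most likely word" idea concrete, below is a toy sketch of next-word sampling. This is our own illustration of the general principle only; real LLMs use neural networks over long contexts, not a lookup table.

    # Toy illustration of next-word prediction: sample each word from a
    # probability table conditioned only on the previous word.
    # Real LLMs learn these probabilities with neural networks over long
    # contexts; this lookup table is purely illustrative.
    import random

    next_word_probs = {
        "the": {"cat": 0.5, "dog": 0.5},
        "cat": {"sat": 0.7, "ran": 0.3},
        "dog": {"sat": 0.4, "ran": 0.6},
    }

    word, sentence = "the", ["the"]
    while word in next_word_probs:
        options = next_word_probs[word]
        word = random.choices(list(options), weights=list(options.values()))[0]
        sentence.append(word)
    print(" ".join(sentence))  # e.g. "the dog ran"

Each run produces a plausible-sounding sequence without any notion of truth or computation, which is why fluency alone is no guarantee of correctness.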

Chart from OpenAI: exam result improvements from GPT-3.5 to GPT-4

Andrew Roberts (Math Dept) and Susan Bonham (EdTech) did some testing to see how ChatGPT (3.5), GPT-4, and GPT-4 with the Wolfram plugin would handle some questions from Langara’s math courses.

Test Details

Full test results are available: an accessible version of the problems, the full details of each "chat," and a discussion of each AI response can be found at the link.

The following questions were tested:

 

Problem 1: (supplied by Langara mathematics instructor Vijay Singh)

 

Problem 2: (Precalculus)

 

Problem 3: (Calculus I)

 

Problem 4: (Calculus II)

 

Discussion

Responses from current versions of ChatGPT are not reliable enough to be accepted uncritically.

ChatGPT needs to be approached as a tool, and careful proofreading of responses is needed to check for errors in computation or reasoning. Errors may be blatant and readily apparent, or subtle and hard to spot without close reading and a solid understanding of the concepts.

Perhaps the biggest danger for a student learning a subject is the "plausibility" of many responses even when they are incorrect. ChatGPT presents its responses with full confidence in their correctness, whether or not that confidence is justified.

When errors or a lack of clarity are noticed in a response, further prompting is needed to correct and refine the initial response. This requires a certain amount of base knowledge on the part of the user in order to guide ChatGPT to the correct solution.

Algebraic computations cannot be trusted as ChatGPT does not “know” the rules of algebra but is simply appending steps based on a probabilistic machine-learning model that references the material on which it was trained. The quality of the answers will depend on the quality of the content on which ChatGPT was trained. There is no way for us to know exactly what training material ChatGPT is referencing when generating its responses. The average quality of solutions sourced online should give us pause.

Below is one especially concerning example of an error encountered during our testing sessions:

In its response to the optimization problem (Problem 3), GPT-3.5 attempts to differentiate the volume function, but it incorrectly differentiates the first term with respect to R while correctly differentiating the second term with respect to h.
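
The original equations appeared as images in the source post. As a purely hypothetical illustration of this class of error (this is not the actual Problem 3), suppose the volume function were

    V(R, h) = \pi R^2 h - \tfrac{1}{3}\pi h^3

Differentiating with respect to h (treating R as constant) should give

    \frac{\partial V}{\partial h} = \pi R^2 - \pi h^2

but a mixed-variable slip of the kind described above would instead produce

    2\pi R h - \pi h^2

where the first term has been differentiated with respect to R and the second with respect to h.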

It is the plausibility of such a solution (despite the serious error) that is dangerous for a student who may take the ChatGPT response at face value.

Access to the Wolfram plugin in GPT-4 should mean that algebraic computations performed within requests sent to Wolfram can be trusted. But errors in reasoning and interpretation can still occur in the steps between those requests.

Concluding Thought

It will be important for us to educate our students about the dangers involved in using this tool uncritically, while acknowledging the potential benefits if it is used correctly.

Want to Learn More?

EdTech and TCDC run workshops on various AI topics. You can request a bespoke AI workshop tailored to your department or check out the EdTech and TCDC workshop offerings. For all other questions, please contact edtech@langara.ca.

Brightspace Accessibility in Five, Bonus: Accessible Uploads

Brightspace is an excellent tool to provide equitable, inclusive access to course content, documents, and media.

As you create content, take advantage of Brightspace’s built-in tools and the Accessibility Checker to ensure what you share is accessible. Accessible content is inclusive, democratic, and maximizes learner independence. 

Brightspace is also a good tool to distribute other material, such as lecture slides and documents. It is important that this material also be accessible.

Creating accessible Word and PowerPoint documents is straightforward. Ensuring a PDF is accessible requires additional time and understanding of unique tools and code. 

The best practices (link text, colour contrast, headings, tables, and text equivalents) listed in this series apply to documents of all types. The process to ensure accessibility is slightly different depending on software.  

Microsoft Office Files

Word and PowerPoint have a built-in accessibility checker. To use this tool: 

  1. Navigate to Review 
  2. Select Check Accessibility 

Read more about making Office documents accessible.

PDF

To make accessible PDFs, it is best practice to make a Word or PowerPoint document accessible and then export it to PDF. Adobe Acrobat Pro is required to verify that your PDFs are accessible. Try to avoid PDFs for content, except for forms and content specifically meant to be printed. For more information on making PDFs accessible, consult Langara’s Accessibility Handbook for Teaching and Learning.

docReader

Brightspace now features the docReader tool. When a Word, PowerPoint, or PDF file is uploaded to a Brightspace course, students can have it read aloud using the Open with docReader button below the document viewer pane.

This tool does not absolve content creators of the responsibility to create accessible content; docReader will not be able to read inaccessible documents.


Check out the other posts in the Brightspace Accessibility in Five series:

  1. Link Text
  2. Colour
  3. Headings
  4. Tables
  5. Text Equivalents

A.I. Detection: A Better Approach 

Over the past few months, EdTech has shared concerns about A.I. classifiers, such as Turnitin’s A.I. detection tool, AI Text Classifier, GPTZero, and ZeroGPT. Both in-house testing and statements from Turnitin and OpenAI confirm that A.I. text classifiers cannot reliably differentiate between A.I.- and human-generated writing. Given that these tools are unreliable and easy to manipulate, EdTech discourages their use. Instead, we suggest using Turnitin’s Similarity Report to help identify A.I.-hallucinated and fabricated references.

What is Turnitin’s Similarity Report?

The Turnitin Similarity Report quantifies how similar a submitted work is to other pieces of writing, including works on the Internet and those stored in Turnitin’s extensive database, highlighting sections that match existing sources. The similarity score represents the percentage of writing that is similar to other works. 
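
To make the idea of a similarity score concrete, here is a toy sketch that scores a submission by the share of its word 5-grams found in known sources. This is our own illustration of the general concept; Turnitin’s actual matching algorithm is proprietary and far more sophisticated.

    # Toy "similarity score": the percentage of a submission's word
    # 5-grams that also appear in a set of known source texts.
    # Illustrative only; this is not Turnitin's algorithm.

    def ngrams(text, n=5):
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    def similarity_score(submission, sources, n=5):
        sub_grams = ngrams(submission, n)
        if not sub_grams:
            return 0.0
        known = set().union(*(ngrams(s, n) for s in sources))
        return 100 * len(sub_grams & known) / len(sub_grams)

Under this logic, a genuine quotation or reference matches its source and scores high, while a fabricated one matches nothing, which is exactly the signal described below.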

A.I.-Generated References

A.I. researchers call the tendency of A.I. to make stuff up a “hallucination.” A.I.-generated responses can appear convincing but may include irrelevant, nonsensical, or factually incorrect content.

ChatGPT and other natural language processing programs do a poor job of referencing sources and often fabricate plausible-looking references. Because the references seem real, students often mistake them for legitimate sources.

Common reference or citation errors include: 

  • Failure to include a Digital Object Identifier (DOI) or incorrect DOI 
  • Misidentification of source information, such as journal or book title 
  • Incorrect publication dates 
  • Incorrect author information 

Using Turnitin to Identify Hallucinated References 

To use Turnitin to identify hallucinated or fabricated references, do not exclude quotes and bibliographic material from the Similarity Report. Quotes and bibliographic information will be flagged as matching or highly similar to source-based evidence. Fabricated quotes, references, and bibliographic information will have zero similarity because they will not match source-based evidence.

Quotes and bibliographic information with no similarity to existing works should be investigated to confirm that they are fabricated.  

References

Athaluri, S., Manthena, S., Kesapragada, V., et al. (2023). Exploring the boundaries of reality: Investigating the phenomenon of artificial intelligence hallucination in scientific writing through ChatGPT references. Cureus, 15(4), e37432. https://doi.org/10.7759/cureus.37432

Metz, C. (2023, March 29). What makes A.I. chatbots go wrong? The curious case of the hallucinating software. New York Times. https://www.nytimes.com/2023/03/29/technology/ai-chatbots-hallucinations.html

OpenAI. (2022, January 27). Aligning language models to follow instructions. https://openai.com/research/instruction-following

Weise, K., & Metz, C. (2023, May 1). When A.I. chatbots hallucinate. New York Times. https://www.nytimes.com/2023/05/01/business/ai-chatbots-hallucination.html

Welborn, A. (2023, March 9). ChatGPT and fake citations. Duke University Libraries. https://blogs.library.duke.edu/blog/2023/03/09/chatgpt-and-fake-citations/

screenshot of a Turnitin Similarity Report, with submitted text on the left and the report panel on the right

AI Tools & Privacy

ChatGPT is underpinned by a large language model that requires massive amounts of data to function and improve. The more data the model is trained on, the better it gets at detecting patterns, anticipating what will come next and generating plausible text.

Uri Gal notes the following privacy concerns in The Conversation:

  • None of us were asked whether OpenAI could use our data. This is a clear violation of privacy, especially when data are sensitive and can be used to identify us, our family members, or our location.
  • Even when data are publicly available their use can breach what we call contextual integrity. This is a fundamental principle in legal discussions of privacy. It requires that individuals’ information is not revealed outside of the context in which it was originally produced.
  • OpenAI offers no procedures for individuals to check whether the company stores their personal information, or to request it be deleted. This is a guaranteed right in accordance with the European General Data Protection Regulation (GDPR) – although it’s still under debate whether ChatGPT is compliant with GDPR requirements.
  • This “right to be forgotten” is particularly important in cases where the information is inaccurate or misleading, which seems to be a regular occurrence with ChatGPT.
  • Moreover, the scraped data ChatGPT was trained on can be proprietary or copyrighted.

When we use AI tools, including detection tools, we are feeding data into these systems. It is important that we understand our obligations and risks.

When an assignment is submitted to Turnitin, the student’s work is saved as part of Turnitin’s database of more than 1 billion student papers. This raises privacy concerns that include:

  • Students’ inability to remove their work from the database
  • The indefinite length of time that papers are stored
  • Access to the content of the papers, especially personal data or sensitive content, including potential security breaches of the server

AI detection tools, including Turnitin, should not be used without students’ knowledge and consent. While Turnitin is a college-approved tool, using it without students’ consent poses a copyright risk (Strawczynski, 2004).  Other AI detection tools have not undergone privacy and risk assessments by our institution and present potential data privacy and copyright risks.

For more information, see our Guidelines for Using Turnitin.

Getting Started with ChatGPT

Tips for writing effective prompts

Prompt-crafting takes practice:

  • Focus on tasks where you are an expert & get GPT to help.
  • Give the AI context.
  • Give it step-by-step directions.
  • Get an initial answer. Ask for changes and edits.

Provide as much context as possible and use specific and detailed language; a sample prompt combining these ingredients follows the list below. You can include information about:

  • Your desired focus, format, style, intended audience and text length.
  • A list of points you want addressed.
  • What perspective you want the text written from, if applicable.
  • Specific requirements, such as no jargon.
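
To see these ingredients together, here is a sample context-rich prompt wrapped in a minimal Python sketch using the 2023-era openai package (this assumes an OPENAI_API_KEY environment variable is set; the prompt content is our own illustration):

    # Minimal sketch of sending one context-rich prompt to ChatGPT.
    # Assumes the 2023-era `openai` package and an OPENAI_API_KEY
    # environment variable; the prompt itself is only an example.
    import openai

    prompt = (
        "You are writing for first-year college students with no biology "
        "background. In about 200 jargon-free words and an encouraging "
        "tone, explain photosynthesis as three bullet points covering: "
        "light absorption, water and carbon dioxide, glucose production."
    )

    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    print(response.choices[0].message.content)

The same prompt pasted directly into the ChatGPT web interface works just as well; the point is the audience, format, tone, length, and list of points packed into one request.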

Try an iterative approach

Ethan Mollick offers the following (a scripted version of this iterative approach is sketched after the list):

  • The best way to use AI systems is not to craft the perfect prompt, but rather to use it interactively. Try asking for something. Then ask the AI to modify or adjust its output. Work with the AI, rather than trying to issue a single command that does everything you want. The more you experiment, the better off you are. Just use the AI a lot, and it will make a big difference – a lesson my class learned as they worked with the AI to create essays.
  • More elaborate and specific prompts work better.
  • Don’t ask it to write an essay about how human error causes catastrophes. The AI will come up with a boring and straightforward piece that does the minimum possible to satisfy your simple demand. Instead, remember you are the expert, and the AI is a tool to help you write. You should push it in the direction you want. For example, provide clear bullet points to your argument: write an essay with the following points: -Humans are prone to error -Most errors are not that important -In complex systems, some errors are catastrophic -Catastrophes cannot be avoided.
  • But even these results are much less interesting than a more complicated prompt: write an essay with the following points. use an academic tone. use at least one clear example. make it concise. write for a well-informed audience. use a style like the New Yorker. make it at least 7 paragraphs. vary the language in each one. end with an ominous note. -Humans are prone to error -Most errors are not that important -In complex systems, some errors are catastrophic -Catastrophes cannot be avoided
  • Try asking for it to be concise or wordy or detailed, or ask it to be specific or to give examples. Ask it to write in a tone (ominous, academic, straightforward) or to a particular audience (professional, student) or in the style of a particular author or publication (New York Times, tabloid news, academic journal). You are not going to get perfect results, so experimenting (and using the little “regenerate response” button) will help you get to the right place. Over time, you will start to learn the “language” that ChatGPT is using.
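
The iterative pattern can also be scripted by keeping the conversation history and appending requests for changes, as in this minimal sketch under the same assumptions as the example above:

    # Iterative refinement: keep the message history and ask for edits
    # rather than starting over. Same assumptions as the earlier sketch.
    import openai

    messages = [{"role": "user",
                 "content": "Draft a 5-question quiz on cell biology."}]
    first = openai.ChatCompletion.create(model="gpt-3.5-turbo",
                                         messages=messages)
    draft = first.choices[0].message.content

    # Ask for a change to the draft instead of issuing a new command.
    messages += [
        {"role": "assistant", "content": draft},
        {"role": "user",
         "content": "Make question 3 multiple choice and add an answer key."},
    ]
    second = openai.ChatCompletion.create(model="gpt-3.5-turbo",
                                          messages=messages)
    print(second.choices[0].message.content)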

Get ChatGPT to ask you questions

Instead of coming up with your own prompts, try getting the AI to ask you questions to get the information it needs. In a recent Twitter post, Ethan Mollick notes that this approach produced surprisingly good results.
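
For example (our own illustration, not Mollick’s exact wording), you might begin with: “I want you to help me design a rubric for a first-year essay assignment. Ask me questions, one at a time, until you have enough information, and then draft the rubric.”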

Ideas for using ChatGPT with students

For lots of great ideas and advice, watch Unlocking the Power of AI: How Tools Like ChatGPT Can Make Teaching Easier and More Effective.

  • Use it to create counterarguments to students’ work. Students can use the AI output to further refine their arguments and help them clarify their positions.
  • Use it to write something for different audiences and have students compare the output and identify how writing changes for a general versus expert audience.
  • Use ChatGPT for a first draft and then have students edit a second draft with critiques, corrections, and additions.
  • Use it to start a discussion. For example, ask ChatGPT why one theory is better than another. Then, ask again why the second theory is better.
  • Use it to generate a list of common misconceptions and then have students address them.
  • Ask students to generate a ChatGPT response to a question of their own choosing, and then write an analysis of the strengths and weaknesses of the ChatGPT response.

Some ways you can use ChatGPT

  • Use it to create a bank of multiple choice and short-answer questions for formative assessment. It can also pre-generate sample responses and feedback.
  • Use it to create examples.
  • Use it to generate ten prompts for a class discussion.

Further reading and resources

Heaven, W.D. (2023, April 6). ChatGPT is going to change education, not destroy it. MIT Technology Review.

Liu, D., et al. (2023). How AI can be used meaningfully by teachers and students in 2023. Teaching@Sydney.

Mollick, E. R., & Mollick, L. (2022). New modes of learning enabled by AI Chatbots: Three methods and assignments. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.4300783

Rudolph, J., et al. (2023). ChatGPT: Bullshit spewer or the end of traditional assessments in higher education? Journal of Applied Learning and Teaching, 6(1).

ETUG Spring Workshop 2023

The Educational Technology Users Group (ETUG) is a community of BC post-secondary educators focused on the ways in which learning and teaching can be enhanced through technology. ETUG’s mission is to support and nurture a vibrant, innovative, evolving, and supportive community that thrives with the collegial sharing of ideas, resources, and ongoing professional development through face-to-face workshops and online activities.

Spring Workshop

This two-day online and in-person workshop will showcase how instructors, education developers, and education technologists are approaching design. For example, we’ll explore how digital literacy, inclusive technology, and AI could be “baked” into courses and how instructors are supported in making design decisions around technology. We’ll also consider the ongoing communication and capacity-building at institutions around digital literacy, accessibility, and AI, such as how teaching and learning centres and libraries get the word out to instructors and students about new approaches and resources.

Join ETUG online in Zoom or in-person at Kwantlen Polytechnic University Lansdowne Road Campus in Richmond, B.C. for this 2-day hybrid event, sponsored by BCcampus.

  • Day 1: June 1, 2023: 9:00 AM to 4:30 PM Pacific Time
  • Day 2: June 2, 2023: 9:00 AM to 4:00 PM Pacific Time

Workshop Rates: 2-day Registration Only

  • Early Bird In-Person: $175 CAD + 5% GST (ends April 29 at 11:59 PM)
  • Regular Rate In-Person: $200 CAD + 5% GST
  • Online: $150 CAD + 5% GST
  • Students: Free

Childcare

Attending an event sometimes means choosing professional development at the expense of time with family, but for large, multi-day events hosted by BCcampus, participants do not have to choose one over the other. Please let us know when you register if you will require childcare. You can read more about our childcare program and provider on the Childcare Program Information page.

In order to secure nannies in time, our childcare registration cut-off date is May 15, 2023. 

Registration

Register online to attend the ETUG Spring Workshop

 

Improve Students’ Experience in Brightspace by Adding Dates

Adding dates in Brightspace is a great way to support students’ time management. Dates populate the Work To Do and Calendar widgets so that students can keep track of what’s due and when. Due dates can be added to almost all items, activities, and modules in Brightspace.

Students have two options for seeing upcoming items with a due date: the Work To Do widget and the Calendar widget.

Work To Do is an organizational widget, meaning that it appears on Langara’s Brightspace homepage and provides learners with a summary of assigned learning activities from all their courses that are overdue or have an upcoming due date or end date. Work To Do can also be added to a course homepage to show only due and overdue learning activities from the course.

The Calendar widget can be added to a course homepage and displays content items with due dates and all calendar events.

Start Dates, Due Dates, End Dates, Access, and Availability

In Brightspace, Start Date, Due Date, and End Date can be set for Assignments, Quizzes, and Content modules and module content items. Start and End dates can also be set for Discussions.

Due Date: Due dates specify when an activity or item is due. When you add a due date to an item, activity, or module, it will automatically be visible in the students’ Calendar.

Note about Due Date: Due dates do not restrict students’ ability to submit materials. If students submit work past the due date, the submission will be identified as late. To restrict access to an item, add a start and/or end date.

Start Date: Start dates specify when students can access items, activities, or modules. Before the start date students can see that an item exists, but they cannot access it. If no start date is set, students will be able to access the item immediately (unless it is hidden).

End Date: End dates specify when students will no longer be able to access an item, activity, or module. After the end date, students will only be able to see the title. If you want students to be able to submit late, do not set an end date.

Tip: Since items, activities, or modules with an end date are not accessible and students cannot submit their work after the date, it is important to clearly communicate expectations to students to prevent any misunderstandings.

Special access: It is possible to make exceptions for the end date for specific students with the Special Access option. You can use special access conditions to open content access for certain students outside the specified dates.

Availability: You will sometimes see this header before the options to add start and end dates. These start and end dates work the same as described above.

Note about Availability: Be careful with Availability and Hide from Users. Clicking the Hide From Users checkbox will hide content from users until you uncheck the box. An Availability start date does not override the Hide from Users option. We recommend you only use one.

Display to calendar: Not all dates/times show up automatically on the Brightspace calendar. If a Display to calendar check box appears, select this option to push the dates to the calendar.

Our Recommendations

  • Add due dates to marked activities and assessments, so students can use the Calendar, Work To Do Widget, and Notifications to help manage their time.
  • Add Start and End dates when you want to limit access.
  • Use Due Dates or End Dates, but not both.
  • Use Dates judiciously, marking only those activities and assessments that have a firm due or end date.
  • For content due dates, rather than placing the due date on the content item, create a checklist with a due date for all the content and ungraded work that must be completed for a specific class.

Brightspace Accessibility in Five, 1: Link Text

Brightspace plus accessibility logo

Brightspace is an exceptionally accessible platform. Using Brightspace for your course content, documents, and media is an excellent way to provide equitable, inclusive access to learning material.

Take advantage of Brightspace’s built-in tools and the Accessibility Checker to ensure what you share is accessible. Accessible content is inclusive, democratic, and maximizes learner independence.

In the first of this five-part series, we will learn about adding link text to your Brightspace content.

Link Text

Link text should provide a clear description of the destination, independent of the surrounding text.

Students with a visual impairment may use screen reader software that allows them to navigate by links. Descriptive link text helps orient and guide them to resources. A list of “click here”, “click here”, “Read more”, etc. does not provide users with any meaningful information. Pasting raw URLs into Brightspace should also be avoided: hearing “https://iweb.langara.ca/edtech/blog” read aloud, for example, is jarring and not a useful indicator of where the link leads.

Additionally, sighted users can more easily spot or relocate a link when it has a clear text description, and all users benefit from quality link text to understand why they would want to click on a link.

Effective link text should be:

  • Descriptive
    • Describe the destination
  • Concise
    • Try to limit link text to a few words
  • Unique
    • If two links on a page go to the same destination, they should have the same link text; otherwise, ensure all link text is unique
  • Visually distinct
    • Links should be visually distinct from surrounding text. In Brightspace, stick with default formatting (blue underlined text) for links.

To Add Link Text in Brightspace

  1. Highlight the text to be linked and select Add/Edit Link
  2. The highlighted text will appear in the Title field. Paste the URL in the Link field and select Create.

Find more information about link text in the Langara Accessibility Handbook and read more about adding hyperlinks in Brightspace.

Accessibility Checker

Brightspace includes a built-in accessibility checker. The checker appears on the second row of the editor toolbar.

  1. Select More Actions to reveal the second row of the toolbar
  2. Select Accessibility Checker

The accessibility checker will highlight many accessibility issues and offer solutions to correct them.


Watch for more posts in the Brightspace Accessibility in Five series coming soon, including:

  1. Link Text
  2. Colour
  3. Headings
  4. Tables
  5. Text Equivalents
  6. Bonus: Accessible Uploads

New Text to Speech Tools in Brightspace

EdTech is excited to announce new text to speech tools in Brightspace.

A new toolbar (pictured below) automatically appears on content pages, Quizzes, Assignments, and Discussions.

Screenshot of ReadSpeaker toolbar

The simple, intuitive interface allows users to hear text read aloud. In Brightspace, simply select Listen and the toolbar instantly creates an audio version of the text.

This tool offers students the choice of reading, listening, or both simultaneously. Allowing users choice and customization accounts for learner needs and preferences.

This tool may assist learners with:

  • Increased understanding
  • Improved reading comprehension
  • Information retention and recall
  • Vocabulary
  • Fluency and accuracy
  • Motivation and attitudes toward reading

Available user features include:

  • Customization of colour, style, and size of font
  • Choice of reading voice and speed
  • Synchronous text highlighting
  • Page masking and text-only view
  • Ability to select content to be read aloud
  • No download required
    • Learners can use this tool on campus, at home, on their phone, or on the bus

In addition to Brightspace pages, Word and PDF documents uploaded to Brightspace also have a text to speech reader option.

While a benefit to all learners, this tool is especially important for users who need content read aloud. The addition of text to speech is an important step in Langara’s work toward accessibility and universal design for learning.

For more information, read about the toolbar’s features or contact assistivetech@langara.ca.

Using Peer Assessment for Collaborative Learning

Peer Assessment

There are several benefits to using peer assessment within your course, one of which is providing students with a more engaging experience. Opportunities to assess other learners’ work help students learn to give constructive feedback and gain different perspectives by viewing their peers’ work. There is evidence that including students in the assessment process improves their performance (1, 2, 3).

Research also shows that students can improve their linguistic and communicative skills through peer review (4). Exposure to a variety of feedback can help students improve their work and can even enhance their understanding of the subject matter. Furthermore, learning to give effective feedback helps develop self-regulated learning, as ‘assessment for learning [shifts] to assessment as learning’ in that it is ‘an active process of cognitive restructuring that occurs when individuals interact with new ideas’ (5).

In addition to the benefits to students, peer assessment can also provide instructors with an efficient way of engaging with a formative assessment framework where the student is given the chance to learn from their initial submission of an assignment.

Options for Peer Assessment within Brightspace:

Within Brightspace, there are several ways that instructors can set up peer assessment activities depending on the nature of the assignment and the needs of the instructor. Here we highlight several use cases.

Peer Assessment Example #1:

The instructor wants to have students assess each other’s group contributions for an assignment within Brightspace.

Using a fillable PDF, which gives students a rubric-like experience, a student can rate their peers on criteria that have been built into the assessment by the instructor. Students can provide feedback on a rating scale and can also provide more in-depth written feedback if needed.

The advantage of using a fillable PDF is that the student can easily download the file and fill in the blanks. The student can reflect on the built-in criteria and the entire process should be quick and easy. The scores are calculated, and the instructor can interpret the results once the student has uploaded the PDF into the assignment folder.

A few disadvantages of this method are that the instructor must download each fillable PDF and manually enter a grade if marks are captured for peer assessment. The other issue is the level of student digital literacy: directing students to download the fillable PDF to their desktop, rather than filling it in within the browser, is a key step for this process to work, as not all students are aware that fillable PDFs cannot be used successfully in-browser.

Peer Assessment Example #2:

Students are working towards a final paper that is worth 15% of their overall mark. Before they submit the final version to the instructor, they will have the opportunity to evaluate another student’s draft and their own work using a rubric. If time is limited for this activity, learners can be invited to submit just the first paragraph of the paper, rather than the whole draft.

Through peer assessment, learners can often receive feedback more quickly than if they had to wait for the marker or instructor to review the class’s work.

Students upload their work to Brightspace Assignments where they are given a link to Aropä, a third-party open software which pairs students so they can assess each other’s work using a built-in rubric. Assessment can be anonymous, and the instructor can restrict feedback to students who have already submitted one review. Self-assessment can be required.

The advantages of Aropä are that it is free and that it gives instructors the ability to modify rubrics to suit their objectives. The disadvantage is that it requires more time to set up, and rubrics provide only basic options: radio buttons or comment boxes. Instructors should be aware of privacy issues with Aropä and upload only students’ first names, avoiding student numbers.

Peer Assessment Activity #3:

Students complete group presentations after which the class assesses each group’s performance, including their own group’s presentation, using a predetermined marking scheme.

Assessing presentations encourages engagement with the work rather than passive observation; because students are required to give feedback, the activity encourages deeper learning and enhances retention.

The advantages of using the H5P Documentation Tool are that H5P content can be created directly within Brightspace, and that it looks polished and is versatile. The disadvantage is that learners will have to export their feedback and then upload it into Brightspace. This two-step process requires some digital literacy skills.

Sample H5P Documentation Tool

Peer Assessment Activity #4:

This peer assessment activity is more about checking completion: the instructor needs to ensure accountability in group work.

Students are given an MS Form with some basic criteria by which to rate themselves and their peers in terms of attendance to meetings, work on the final product / assignment and collaboration. Students will use a point rating scale and need to justify their evaluation by providing a concrete example.

Similar to Example #1, students can complete a form using a fillable PDF or other software such as Jotform or MS Forms to reflect on and assess their own work as well as the work of their teammates. Jotform allows for more complex form building and will calculate totals for each student, while MS Forms will not calculate totals but will give you a sense of how students are doing overall with a basic rating on each criterion (a focus on qualitative assessment).

Sample MS Form

Sample Jotform

A Note on Third Party Peer Review Software:

There are many different software packages available for peer assessment. EdTech is currently testing several of them and hopes to pilot them in the spring or summer semester. Currently, the only one we recommend (because it is free) is Aropä. Aropä does a great job of providing several options for peer assessment, including self-assessment, privacy options for students, anonymous assessment, etc. It does not integrate completely with Brightspace, which is one disadvantage compared with some of the paid peer assessment programs currently available. Programs such as peerScholar, FeedbackFruits, and Peerceptiv can integrate with the Gradebook, making it very easy to provide marks for the feedback that your students provide for one another.

For more information on any of the above tools, please contact edtech@langara.ca.

References

  1. Wu, W., et al. (2022). Evaluating peer feedback as a reliable and valid complementary aid to teacher feedback in EFL writing classrooms: A feedback giver perspective. Studies in Educational Evaluation, 73. https://doi.org/10.1016/j.stueduc.2022.101140
  2. Double, K. S., et al. (2020). The impact of peer assessment on academic performance: A meta-analysis of control group studies. Educational Psychology Review, 32(2), 481–509. https://doi.org/10.1007/s10648-019-09510-3
  3. Planas-Lladó, A., et al. (2018). Using peer assessment to evaluate teamwork from a multidisciplinary perspective. Assessment & Evaluation in Higher Education, 43(1), 14–30.
  4. de Brusa, M. F. P., & Harutyunyan, L. (2019). Peer review: A tool to enhance the quality of academic written productions. English Language Teaching, 12(5), 30–39.
  5. Western and Northern Canadian Protocol for Collaboration in Education (2006), p. 41.