Canvas Accessibility Testing and Evaluation Report

Table of Contents

  1. Introduction
  2. Global Features
    1. Login and Configuration/Compatibility Testing
    2. Personalization and Customization
    3. Navigation
    4. Forms
    5. Help and Documentation
    6. Rich Content Editor
  3. Main User Interface (UI)/Canvas Core Tools/Components
    1. Canvas Landing Page – User Dashboard
    2. Courses
    3. Calendar
    4. Inbox
    5. Profile
      1. Account: Profile
      2. Account: Settings
      3. Account: Notifications
      4. Account: Files
      5. Account: ePortfolio
    6. Course Landing Page – Course Dashboard
    7. Announcements
    8. Assignments
    9. Discussions
    10. Grades
    11. People
    12. Pages
    13. Files
    14. Syllabus
    15. Outcomes
    16. Rubrics
    17. Quizzes
    18. Modules
    19. Settings
  4. Learning Tools Interoperability
    1. Collaborations
    2. Conferences
    3. Chat
    4. Panopto Recordings
  5. Conclusion

Introduction

Overview of Learning Management Systems and Accessibility

Learning Management Systems (LMS) are becoming central to the higher education academic experience. Students typically use an LMS to receive course announcements, access assigned resources, participate in discussions, turn in assignments, take quizzes and exams, and track all of their course activities within a single application. Instructors use an LMS to deliver and manage their course materials and activities, as well as review and evaluate students' work and assign grades.

LMS interfaces are becoming richer, more complex web applications, and unless they are designed with accessibility in mind, they can pose problems for students and instructors with disabilities. Depending on the features enabled for a given course, students with disabilities may find that participating independently and effectively is nearly impossible. If the LMS is not accessible, faculty will find adding course content, administering tests, and interacting with their students equally impossible.

Until a few years ago, accessibility was either poorly supported by vendors or not considered at all in coding these platforms. Because of inaccessible or limited accessibility support in learning management systems, some students were not able to fully or independently use these applications or participate in course activities such as discussion forums, chat rooms, or assignments.

Thanks to the hard work of various LMS accessibility working groups and their respective LMS companies, LMS vendors have begun to understand the need for universal accessibility of their tools. Although LMS vendors have started providing accessibility features that allow users with disabilities to access the LMS, we are still not at the point where users with disabilities can use these systems effectively.

History

ATHEN is a volunteer network of higher education professionals committed to improving accessibility for students, faculty, staff, and administrators. The purpose of ATHEN is to collect and disseminate best practices in access technology in the higher education environment as well as present a collective voice for the professional practice of access technology in higher education. Established in 2002, ATHEN is best known for providing in-depth accessibility reviews of enterprise systems such as Google Documents, Google Calendar and Gmail, and it also performs research, sponsors surveys, and holds meetings at major accessibility conferences.

In 2010, a group of IT accessibility professionals consisting of Hadi Rangin and Marc Thompson from the University of Illinois, Ken Petri from The Ohio State University, and Joe Humbert from Indiana University began testing, evaluating, and reporting on the accessibility of four major LMSs in the North American market: Blackboard, Desire2Learn, Moodle, and SAKAI. This work took place in 2011 and 2013, and two side-by-side comparisons were presented at the California State University, Northridge International Technology and Persons with Disabilities Conference (CSUN) (2011 paper and 2013 paper).

Because it was still in its infancy, Canvas was not included in these earlier studies, but its surging adoption across many higher education systems merited an evaluation of its own. Instructure, the maker of Canvas, has recognized the importance of accessibility since the earliest versions of Canvas and has demonstrated its commitment by working with the educational accessibility community toward improving Canvas accessibility.

Early in 2014, several IT accessibility professionals from higher education institutions, under the leadership of Terrill Thompson from the University of Washington, established a collaboration group with Instructure to identify accessibility issues and work closely with the Canvas development team to address them. At the time of this study (March 2016), the group had 92 participants representing 68 institutions.

About the Canvas Accessibility Testing & Evaluation (CATE) Project

From November 2015 to March 2016, sixteen representatives from five higher education institutions and one college system conducted an accessibility evaluation of Canvas, led by Hadi Rangin. The group tested and collected data regarding the functional accessibility of Instructure's Canvas. The evaluation was inspired by the previous LMS accessibility evaluation projects Hadi Rangin led in 2011 and 2013. Many issues discussed in the results apply beyond Canvas, and all LMS vendors can benefit from our global recommendations. This document discusses our testing process, the results, and possible recommendations.

Purpose

Author Credits

This project was initiated and spearheaded by Hadi Rangin, Information Technology Specialist at the University of Washington, who has established and led collaboration groups with many higher education system vendors over the past 12 years.

This project could not have gotten off the ground without the hard work and dedication of our project partners and collaborators. We collectively spent over 1,000 hours testing, evaluating, discussing and verifying our findings, and compiling the report. I would like to thank all of our collaborators, including my students who helped me with this project. A special thanks to Dana Danger, our Canvas partner, who provided us with the testing platform and patiently answered many questions while we were working on this project.

The full names and affiliations of our collaborators follow:

University of Washington, Seattle Campus

University of Central Florida

University of Michigan

Virginia Tech

California Community Colleges Technology Center

Testing and Evaluation Methodologies

The testing was divided into three major components: Global Features, Main user interface (UI)/Canvas core tools/Components, and selected third-party LTIs.

The Global Features component included tools specific to Personalization/Customization, Forms, and Help, allowing us to identify basic usability/accessibility features that can enhance the user experience for all users.

The Main User Interface (UI) component comprised Canvas core tools and features that are unique to the student experience, including Assignments, Discussions, and Grades. These features generally require additional awareness from instructional designers and consultants to improve content accessibility, such as proper use of headings and careful handling of proprietary formats such as Adobe PDF and Microsoft PowerPoint.

Third-party plug-ins are also referred to as Learning Tools Interoperability (LTI) tools. Although Canvas does not have developmental control over LTIs used in conjunction with Canvas, we included these components in our testing because the inaccessibility of one LTI can affect the overall accessibility of Canvas if institutions adopt it without any accessibility considerations.

Each of the three components was divided further into features with several categories of criteria, such as navigation, operational, management, and instructor-only tasks. Each criterion was assigned a weight of 1–5, denoting that task’s relative impact within the feature, with 5 having the most beneficial impact on accessibility. Testers were assigned an assistive technology, browser, and operating system based on availability and expertise. The testers rated each criterion from 1–5, with 5 being fully accessible and 1 being completely inaccessible. During the testing process, we held weekly meetings to address any issues encountered in testing. Once the results were finalized, discrepancies between similar assistive technologies were noted and evaluated in detail.

After the completion of testing, we normalized the data, retested and verified selected results, and combined the results into an Excel chart. Redundant criteria were consolidated and final averages were calculated.
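The report does not spell out the aggregation arithmetic beyond weighting and averaging; purely as an illustration, one natural way to combine a feature's criterion weights and tester ratings into a single score is a weighted mean:

    \[
      S_{\text{feature}} = \frac{\sum_{i=1}^{n} w_i \, r_i}{\sum_{i=1}^{n} w_i},
      \qquad w_i \in \{1,\dots,5\},\quad r_i \in \{1,\dots,5\}
    \]

Under such a scheme, a feature scores 5 only if every criterion, however heavily weighted, was rated fully accessible.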

The assistive technologies employed by the testers included the following:

Testing for Windows

Testing for Mac OS

We did not test the application with mobile technology. The Canvas layout is not fully responsive: absolute font sizes induce horizontal scrolling when text is zoomed for magnification. Instructure provides Canvas mobile apps for Android and iOS; however, we did not have the resources to test the Canvas apps for accessibility, although this would be an interesting future project.

Originally, we planned to test voice-control interaction using Dragon NaturallySpeaking 12.5 with Internet Explorer. However, Canvas proved almost completely unusable with Dragon due to Canvas' extensive use of dynamic HTML and WAI-ARIA and Dragon's poor ARIA support at this time. For instance, HTML-based modal dialogs, menus, and popovers could be accessed only through keyboard navigation or the MouseGrid and mouse-movement functionality, and entering text into fields in a modal window required the dictation box. We instead used the built-in Windows Speech Recognition (WSR) for our test suite because it provided better, though still limited, support for these more modern web techniques and standards.

Tools used for testing & evaluation

Disclaimers

Global Features

1. Login and Configuration/Compatibility Testing

Rationale

Login is the first point of contact for users interacting with any application. For an LMS such as Canvas, hosting institutions often integrate their own localized authentication process, but for this evaluation, the Canvas login platform was tested. As for configuration and compatibility testing, the use of JavaScript and other third-party components may require additional action by the user, and it is vital that any components used in conjunction with the LMS be accompanied by clear, straightforward instructions.

Criteria

Login and Configuration/Compatibility Testing criteria belonged to the “Operational” category, in which we tested whether the user can interact with the login page, submit credentials with certainty, detect error messages, recover from errors, and retrieve credentials properly.
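As a minimal sketch (not Canvas's actual markup), the following illustrates the kind of error handling these criteria look for: the error is announced as soon as it appears and is programmatically tied to the field it concerns.

    <form action="/login" method="post">
      <!-- role="alert" makes screen readers announce the message when it is rendered -->
      <p role="alert" id="login-error">Error: The password you entered is incorrect.</p>

      <label for="username">Username</label>
      <input type="text" id="username" name="username" autocomplete="username">

      <label for="password">Password</label>
      <!-- aria-describedby associates the error text with the field it concerns -->
      <input type="password" id="password" name="password"
             aria-describedby="login-error" autocomplete="current-password">

      <button type="submit">Log in</button>
    </form>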

Results Summary and Feedback

2. Personalization and Customization

Rationale

Due to the magnitude of user needs and the complexity of LMS interaction, it is vital that the LMS provide an accessible default layout and customizable individual configurations rather than forcing the user to learn or adapt. There are many global settings that can significantly augment or limit the usability and accessibility of an LMS, including, but not limited to, alert type, layout, session timeout duration, and rich content editor type. The high-contrast UI, currently in beta, is an example of a customization from which many users could benefit.

Criteria

Personalization and Customization criteria are organized into “Viewing” and “Management” categories.

In the “Viewing” category, we evaluated the accessibility of the default layout and alert types. We also examined the application's session timeout duration and process, checking whether the user was alerted prior to timing out and given the opportunity to save any work in progress.

In the “Management” category, we tested whether the user could configure desired personalization such as layout, contrast level, background and foreground colors, font size and type, editor settings, and session length, and whether the LMS preserved these individual settings after re-login.
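For the timeout criterion in particular, a minimal sketch of an accessible warning (assuming an HTML-based dialog, not Canvas's actual implementation) would move focus to an alert dialog and give the user a way to extend the session:

    <!-- On opening, focus moves into the dialog; on closing, it returns to the page -->
    <div role="alertdialog" aria-modal="true"
         aria-labelledby="timeout-title" aria-describedby="timeout-desc">
      <h2 id="timeout-title">Your session is about to expire</h2>
      <p id="timeout-desc">You will be logged out in 5 minutes. Unsaved work will be lost.</p>
      <button type="button">Stay logged in</button>
      <button type="button">Log out now</button>
    </div>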

Results Summary and Feedback

3. Navigation

Rationale

Navigation, the most critical component of an LMS, must be defined by proper, logical, and consistent structural markup to be effective and efficient. Canvas currently has global, course, and breadcrumb navigation to allow its users to quickly and easily access all areas of the LMS.
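Illustrative markup only (not Canvas's own): distinctly labeled navigation landmarks let assistive technology users jump directly to the global, course, and breadcrumb navigation.

    <nav aria-label="Global">
      <ul>
        <li><a href="/dashboard">Dashboard</a></li>
        <li><a href="/calendar">Calendar</a></li>
      </ul>
    </nav>

    <nav aria-label="Course">
      <ul>
        <li><a href="/courses/101/assignments">Assignments</a></li>
        <li><a href="/courses/101/grades">Grades</a></li>
      </ul>
    </nav>

    <nav aria-label="Breadcrumb">
      <ol>
        <li><a href="/courses/101">ENGL 101</a></li>
        <!-- aria-current identifies the current page within the trail -->
        <li aria-current="page">Syllabus</li>
      </ol>
    </nav>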

Criteria

If users are not able to properly use a given LMS navigational tool, accessibility will be functionally broken for areas linked to it. Navigation criteria are divided into the following categories:

Results Summary and Feedback

4. Forms

Rationale

Forms should have properly labeled, focusable controls that help the user enter data with certainty, provide consistent validation upon form submission, and give guidance for fixing invalid submissions.

Criteria

Forms were tested under two categories. The “Form Control Labels” category tested whether the LMS correctly labeled controls and required fields without relying on dynamic “onChange” events. The “Form Submission” category checked for proper verification messages upon submission and whether the user can easily identify the places where errors have occurred and navigate to them.
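A minimal sketch of what the “Form Control Labels” criteria look for (field names are hypothetical): every control has a programmatically associated visible label, required fields are marked in markup as well as visually, and submission happens via an explicit button rather than an “onChange” event.

    <label for="assignment-title">Assignment title (required)</label>
    <input type="text" id="assignment-title" name="title"
           required aria-required="true">

    <!-- Changing the selection does nothing by itself; the button submits -->
    <label for="assignment-group">Assignment group</label>
    <select id="assignment-group" name="group">
      <option>Homework</option>
      <option>Exams</option>
    </select>

    <button type="submit">Save</button>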

Results Summary and Feedback

5. Help and Documentation

Rationale

Learning Management Systems can add to the challenge of online learning for those who are attempting to learn how to use a new LMS, interact with the LMS through their assistive technology, or even learn a new assistive technology if their preferred technology is not supported. Understanding the general layout through external help documentation can help users with disabilities make sense of tools and tasks that are not easy to use. Ideally, the external documentation would also consolidate a quick navigation and interaction guide for users with disabilities in a fully accessible format outside the LMS. Additionally, inline help allows the user to learn, troubleshoot, and work effectively in the LMS.

Criteria

Help and Documentation criteria were evaluated to test if the LMS help was “Operational”, offered “Help Across Application”, and provided accessible “Tutorials and Guides.”

“Operational” criteria determined whether the user could search the Canvas guides, report a problem, post a question to the Canvas community, and submit a feature idea. “Help Across Application” tested inline help and tip links or buttons throughout the application, while “Tutorials and Guides” tested content accessibility on both proprietary and public platforms.

Results Summary and Feedback

6. Rich Content Editor

Rationale

The Rich Content Editor (RCE) is used throughout Canvas, embedded into any module that requires text-editing functionality. Pages that include the RCE include (but are not limited to) Discussions, Outcomes, Quizzes, Announcements, and Pages. The RCE is not only a necessary means by which end users communicate with each other, but is also used by instructors to create course content, which affects the overall accessibility of the course. The RCE therefore has the potential to educate content authors about the accessibility of the content they create, for example by requiring authors to supply alternative text (alt attributes) for informative graphics.

Criteria

Our focus when testing the RCE was twofold: are the RCE controls accessible, and is the content produced by the RCE accessible? We have organized the RCE criteria into the following categories: Navigation, Viewing, and Operational.

As with other tools, the “Navigation” criteria evaluated whether or not the user could easily discover the tool and navigate to it.

“Viewing” criteria tested the content generated by the RCE. The user should be able to produce semantic headings, lists, images, tables, and forms. Shortcut information should be exposed to all users.
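For instance, an accessible editor should let an author produce output like the following (content invented for illustration): real heading and list elements, alternative text on informative images, and data tables with header cells.

    <h2>Week 3 readings</h2>
    <ul>
      <li><a href="/files/chapter-4.pdf">Chapter 4</a></li>
      <li><a href="/files/chapter-5.pdf">Chapter 5</a></li>
    </ul>

    <img src="/files/plant-cell.png"
         alt="Diagram of a plant cell with the nucleus and chloroplasts labeled">

    <table>
      <caption>Grading weights</caption>
      <tr><th scope="col">Component</th><th scope="col">Weight</th></tr>
      <tr><td>Quizzes</td><td>40%</td></tr>
      <tr><td>Final exam</td><td>60%</td></tr>
    </table>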

“Operational” criteria tested the RCE controls. The user should be able to align text, indent and outdent, alter font size, and change paragraph styling. Additionally, the user should be able to insert and remove tables, URL links, math equations, images, recorded media, and uploaded media.

Results Summary and Feedback

Canvas currently uses TinyMCE as its content editor, which unfortunately has accessibility shortcomings and therefore negatively impacts any tool that requires its use. The problems with the RCE were significant enough (outlined below) that we would strongly recommend replacing it with another, more accessible editor, such as CKEditor, which is used by Drupal, another platform popular in higher education.

Main User Interface (UI)/Canvas Core Tools/Components

1. Canvas Landing Page – User Dashboard

Rationale

The User Dashboard is the gateway to all courses and can be customized in the Courses tool. The user can view and link to courses and also receive reminders about upcoming events and assignments. The user has the ability to toggle between “Card” view and “Recent Activities” view. Within the Card view, a newly implemented “metro” style UI, the user can select which courses will be displayed on the user dashboard (specified in “All Courses” under the Courses menu). The user can customize the card color and can see the number of new updates to Notifications, Files, Discussions, and other tools that the instructor decides to use. The Recent Activities view displays the communication activity related to all courses. 

Criteria

We have organized the User Dashboard criteria under the “Navigation” and “Viewing” categories. As with other tools, “Navigation” criteria evaluated whether or not the user can easily discover the tool and navigate to it.

“Viewing” criteria tested whether the user is able to see the most urgent announcement, assignments, and feedback; toggle between views; and access additional pop-over information associated with To Do List items.

Results Summary and Feedback

2. Courses

Rationale

Selecting the Courses menu option opens a “drawer” flyout menu with a list of courses (previous and current) and a link to a page where the user can customize which courses will appear in the User Dashboard Card view.

Criteria

We have organized the Courses criteria into the following “Navigation” and “Operational” categories.

“Navigation” criteria evaluated whether or not the user could easily discover the tool and navigate to it. The “Operational” criteria tested whether the user could specify which courses were to be added to the user dashboard and also distinguish between hidden and visible courses.

Results Summary and Feedback

3. Calendar

Rationale

The Calendar allows the user to see upcoming dates and schedule events related to courses. Canvas recommends that users of assistive technology access the Agenda view, which we tested. We did not test the other views.

Criteria

We organized the Calendar criteria into “Navigation”, “Viewing”, “Operational”, and “Appointment Groups” categories. “Navigation” criteria evaluated whether or not the user could easily discover the tool and navigate to it. “Viewing” criteria determined if the user could choose between different course calendars, view a list of undated agenda items, and preview agenda items. “Operational” criteria tested whether the user could create, modify, and delete events and receive an accessible confirmation. “Appointment Group” criteria evaluated whether the user could create, edit, and delete Appointment Groups. We also tested if the user could send messages to users within the appointment group, search the calendar for a particular appointment group, and sign up for an appointment slot.

Results Summary and Feedback

4. Inbox

Rationale

The Inbox includes all messages exchanged by the user and can be filtered by course title, message status (inbox, unread, starred, sent, archived, and submission comments), or a recipient's name. A message composed using the Inbox feature can be sent within Canvas to course participants by recipient role or individually. A duplicate copy of the message can be sent to the recipients' external email systems.

Criteria

We have organized the Inbox criteria into “Navigation” and “Operational” categories. “Navigation” criteria evaluated whether or not the user could easily discover the tool and navigate to it. “Operational” criteria tested whether the user could easily operate the following message controls and receive proper feedback: create, reply, reply all, delete, archive, forward, mark as read or important, and filter by course or message type. The user should be able to interact with all application fields in order to compose messages, such as adding recipients from the list of available Canvas users and groups.

Results Summary and Feedback

5a. Account: Profile

Rationale

Users can create a profile with the following information: name, title, contact information, biography, and relevant links. The Name entry is the only required information.

Criteria

Criteria for this tool have a single “Operational” category, testing whether all the fields, features, and settings of the module are accessible.

Results Summary and Feedback

5b. Account: Settings

Rationale

The Account Settings allow the user to edit their full name, display name, sortable name, language setting, and time zone. Additionally, the user can authorize external applications via Google Drive, Skype, and LinkedIn; provide additional email addresses or contact methods; or download their submissions. The user can also set Feature Options such as High Contrast (beta version) from within Canvas.

Criteria, Results, Summary and Feedback

Similar criteria, as well as results, for this feature can be found under 2. Personalization and Customization and 4. Forms.

5c. Account: Notifications

Rationale

We determined if the user can change the email frequency of the application’s course activities, discussions, conversations, scheduling, groups, administrative alerts, and conferences.

Criteria

We have divided the Notifications criteria into two categories: “Navigation” and “Operational.”

The “Navigation” criteria evaluated whether or not the user could easily discover the tool and navigate to it, and the “Operational” criteria evaluated whether the user could easily and reliably change their notification settings.

Results Summary and Feedback

5d. Account: Files

Rationale

The user can organize the files in folders, see the file hierarchy, search for files, or upload files that can be used throughout the application.

Criteria

The feature had no direct testing criteria; however, file uploading was tested using 5e. ePortfolio.

5e. Account: ePortfolio

Rationale

The ePortfolio allows students to collect their work from different courses into one feature. For example, in an English class, the user can turn in their final portfolio to the professor using this feature.

Criteria

We have organized the ePortfolio criteria into the following categories of “Navigation” and “Operational.”

“Navigation” criteria evaluated whether or not the user could easily discover the tool and navigate to it. “Operational” criteria verified whether a user could create, download, and delete an ePortfolio. The ePortfolio has sections that the user can edit, reorder, and rename. Each section can have pages where the user can edit, reorder, and delete course submissions, embed HTML/rich-text content, and add images and files.

Results Summary and Feedback

6. Course Landing Page – Course Dashboard

Rationale

The course editor customizes the landing page, so it is important to ensure that the landing page for a course is accessible.

Criteria

We have organized the course dashboard criteria into the “Navigation” and “Operational” categories.

“Navigation” criteria evaluated whether or not the user can easily discover the tool and navigate to it. “Operational” criteria tested whether the user could change the publishing status of a course. (Depending on the administrator's configuration of the Landing Page, the user may be able to perform additional operations.)

Results Summary and Feedback

7. Announcements

Rationale

The user can search by title, body, or author, filter unread announcements, and add an external feed or announcement. An instructor can close comments or delete an announcement.

Criteria

We have organized the Announcements criteria into two categories: “Navigation” and “Viewing.”

“Navigation” criteria evaluated whether or not the user could easily discover the tool and navigate to it. “Viewing” criteria tested whether the user could create, edit, delete, search, and filter announcements. We also tested whether a user could add and delete an external feed and open announcements for comments.

Results Summary and Feedback

8. Assignments

Rationale

An instructor can sort by grading periods, search for an Assignment, create Assignments and Assignment Groups, and weight each Assignment Group.

Criteria

We organized the Assignment criteria into the following categories: Navigation, Operational, and Instructor Tasks.

“Navigation” criteria evaluated whether or not the user could easily discover the tool and navigate to it. “Operational” criteria tested the download and upload features of the application. “Instructor Tasks” criteria determined the instructor's ability to easily create assignments; edit details and sections; and reorder, move, publish, and delete assignments in assignment groups. We also evaluated whether the instructor could message students who have not signed up for a self-sign-up group.

Results Summary and Feedback

9. Discussions

Rationale

The Discussions feature is one of the most important components of an LMS. It is used for collaboration and communication between students and instructors, exchanging ideas, and posting general announcements. Discussions are frequently used to turn in short assignments or post questions to the professor and TAs.

The Discussions tool in Canvas consists of topics, threads, and posts. A user can reply to a post or reply to another reply. A user can subscribe to a specific thread/topic, which enables the user to receive any subsequent posts at the email associated with the account. Both students and instructors can create a new discussion thread. Canvas uses the Rich Content Editor (RCE) for the discussion tool, allowing users to create and post rich-text content. Similar to Assignments, the user can search for a title, body, or author, and sort by unread or similar operations. The instructor can also add and “pin” a discussion.

Criteria

The Discussions criteria were divided into the following categories:

“Navigation” criteria evaluated whether or not the user could easily discover the tool and navigate to it. “Viewing” criteria tested whether the user could view new threads or posts, filter, sort, identify the nesting level of replies, and perceive the presence of attachments. “Operational” criteria evaluated whether a user could create, delete, or edit a topic, and also attach files to a post or comment.

“Management” criteria tested whether the user could mark a post or thread as read or unread, change a discussion to be published or unpublished, close and open the comments of a discussion topic, expand and collapse display of all Replies to a Discussion Topic, pin a discussion topic, interact with the search function, receive e-mail notification of new posts, subscribe and unsubscribe to discussion topics, and reply directly to a post by email.

“Instructor Tasks” criteria examined setting up the Discussion tool, deleting a topic, and pinning a topic.

Results Summary and Feedback

10. Grades

Rationale

The Grades tool offers three views: the default table view upon clicking on “Grades” in the Course navigation, the recommended individual view (via the default table view), and the SpeedGrader view.

Criteria

We have organized the Grades criteria into the categories of “Navigation”, “Viewing”, and “Instructor Tasks” under both the Default and Individual Views. “Navigation” criteria evaluated whether or not the user can easily discover the tool and navigate to it. “Viewing” criteria tested whether the user was able to access their individual scores, instructor feedback, current cumulative grade for completed work, and relative score based on the class average. There is an additional “What If?” tool within Canvas that allows the user to predict their grade based on user input. “Instructor Tasks” criteria in both the Default and Individual Views determined whether the user could import scores, enter scores and comments, change assignment group weight, mute assignments, show details, and view all grades and grading history while moving from one student to another. The Instructor criteria also evaluated whether the user could message students based on submission status, display content for a particular student/assignment, and download all submissions and the exported CSV file from the Gradebook.

Results Summary and Feedback

11. People

Rationale

The user can see the list of participants from the course.

Criteria

We have organized the People criteria into the following categories: “Navigation” and “Instructor Tasks.” “Navigation” criteria evaluated whether or not the user could easily discover the tool and navigate to it. We tested whether the user could sort and filter contacts by name and role, find essential contact information, and identify groupsets, groups in groupsets, and participating students in a group.

“Instructor Tasks” criteria examined functions such as adding, editing, and removing groupsets, groups, and participating students in a group. We tested whether a user could manually assign and move students and set a student as a Group leader. We also verified whether an instructor could visit a Group Home Page; clone a Group Set; view prior student assignments, user details, interaction summaries, and course access reports; navigate between Groups; and resend invitations to a course.

Results Summary and Feedback

12. Pages

Rationale

Pages content can be edited using the Rich Content Editor (RCE), and Pages has Wiki-like features allowing the user to link one page to another. Under the Images tab, the user can upload a new image, select an existing image to add to a page, or search for an image from Flickr to add.

Criteria

We organized the Pages criteria into the following categories of “Navigation,” “Operational,” and “Instructor Tasks.”

“Navigation” criteria evaluated whether or not the user could easily discover the tool and navigate to it. “Operational” criteria evaluated whether the user could add various links, new images, existing images, and image search results from Flickr to the course content using the page editor. “Instructor Tasks” criteria verified whether an instructor could create, edit, delete, and sort pages; set a page as the front page; and view the page history and permissions.

Results Summary and Feedback

13. Files

Rationale

The Files UI allows the user to review uploaded files in a folder browsing tree sortable by name, date created, date modified, who modified them, and size.

Criteria

We have organized the Files criteria into the following categories of “Navigation” and “Operational.”

As with other tools, we evaluated whether or not the user could easily discover the tool and navigate to it. We evaluated whether the user was able to read, search, sort, upload, download, and view files in preview mode. We also tested whether a user could create, edit, delete, and move between folders in the folder hierarchy.
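As a generic sketch of the sortable listing these criteria describe (not Canvas's markup), each sortable column header can expose its sort state through aria-sort and be operable from the keyboard:

    <table>
      <caption>Course files</caption>
      <tr>
        <!-- aria-sort tells assistive technology which column is sorted, and how -->
        <th scope="col" aria-sort="ascending"><button type="button">Name</button></th>
        <th scope="col" aria-sort="none"><button type="button">Date Modified</button></th>
        <th scope="col" aria-sort="none"><button type="button">Size</button></th>
      </tr>
      <tr><td>lecture-01.pdf</td><td>2016-01-12</td><td>88 KB</td></tr>
      <tr><td>syllabus.pdf</td><td>2016-01-05</td><td>42 KB</td></tr>
    </table>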

Results Summary and Feedback

14. Syllabus

Rationale

The student can see assignments in a data table with two columns, Date and Details. Within the body of the table, a date can have more than one relationship: multiple assignments can be related to the same date, and an assignment may have assignment details that add additional table cells.
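Illustrative markup of that one-date-to-many-details relationship (dates and assignments invented): marking the date cell as a row header spanning its rows keeps each detail cell programmatically tied to its date.

    <table>
      <caption>Course syllabus</caption>
      <tr><th scope="col">Date</th><th scope="col">Details</th></tr>
      <tr>
        <!-- rowspan ties both detail cells back to the same date -->
        <th scope="row" rowspan="2">Mon, Apr 4</th>
        <td>Essay 1 due</td>
      </tr>
      <tr><td>Quiz 2 opens</td></tr>
    </table>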

Criteria

We have organized the Syllabus criteria into the following categories of “Navigation” and “Viewing.”

As with other tools, in the “Navigation” category we evaluated whether or not the user can easily discover the tool and navigate to it.

In the “Viewing” category, we evaluated whether the user can view the contents of the syllabus table, jump to the current day, and properly read assignment group weightings.

Results Summary and Feedback

15. Outcomes

Rationale

Outcomes are created to track mastery in a course, and Canvas allows outcomes to be added to the grading rubrics on assignments. The Outcomes tool is a folder structure in which the user can create, organize, and find outcomes and outcome groups and manage rubrics.

Criteria

We have organized the Outcomes criteria into the following categories of “Navigation” and “Operational.”

As with other tools, in the “Navigation” category we evaluated whether or not the user can easily discover the tool and navigate to it. In the “Operational” category, we tested whether the user could create, edit, move, delete, and search for outcomes and outcome groups.

Results Summary and Feedback

16. Rubrics

Rationale

Rubrics outline the necessary and quantifiable milestones the student must achieve in order to be successful in a course.

Criteria

We have organized the Rubrics criteria into “Navigation”, “Viewing”, and “Instructor Tasks.” As with other tools, in the “Navigation” category we evaluated whether or not the user can easily discover the tool and navigate to it. In the “Viewing” category, the user should simply be able to read and understand the given table. We also examined instructor-specific functions such as creating, editing, and deleting a rubric and its criteria and ratings.

Results Summary and Feedback

17. Quizzes

Rationale

Quizzes are an essential feature of a learning management system; the user is often given a grade for their participation and/or performance based on the quiz interface that Canvas provides.

Criteria

We have organized the Quizzes criteria into the following categories: “Navigation”, “Viewing”, “Management”, “Operational”, and “Instructor Tasks.” As with other tools, in the “Navigation” category we evaluated whether or not the user can easily discover the tool and navigate to it. In the “Viewing” category, the user should be able to navigate between questions, determining each question's answered state, point value, and required status, while easily obtaining the elapsed/remaining time and any error notifications for the quiz. In the “Management” category, the user should be able to easily save the quiz without the page redrawing, take questions one at a time, bookmark and find questions, and receive verification upon submission. In the “Operational” category, we evaluated whether the student could interact with various question types, while in the “Instructor Tasks” category we evaluated the instructor's capabilities. Question types included, but were not limited to, multiple choice, true/false, fill in the blank, matching, multiple dropdowns, essay, formula with a single variable, and numeric answer. The instructor tasks also included whether the user could manage question groups and banks, search, moderate, regrade, and view results and logs for a quiz.
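As a sketch of an accessible multiple-choice question (question text invented, not Canvas's markup), grouping the choices under the question with fieldset/legend lets assistive technology announce the question, its point value, and each choice together:

    <fieldset>
      <legend>Question 3 (2 points, required): Which tag marks a top-level heading?</legend>
      <label><input type="radio" name="q3" value="a"> &lt;h1&gt;</label>
      <label><input type="radio" name="q3" value="b"> &lt;p&gt;</label>
      <label><input type="radio" name="q3" value="c"> &lt;title&gt;</label>
    </fieldset>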

Results Summary and Feedback

18. Modules

Rationale

A module can contain lecture slides, readings, assignment pages, and more, through which student users proceed from start to finish.

Criteria

We have organized the Modules criteria into the “Navigation”, “Viewing”, and “Operational” categories.

As with other tools, in the “Navigation” category we evaluated whether or not the user can easily discover the tool and navigate to it.

The “Viewing” criteria tested whether the user could stay aware of their own progress while completing required items.

The “Operational” criteria evaluated whether the user can create, edit, reorder, and delete modules and their content items. The tool should allow the user to move through each module item with the proper prerequisites.

Results Summary and Feedback

19. Settings

Rationale

The course settings allow the instructor to edit the course, determine sections, view the list of enrollments, and preview the course as a student.

Criteria

We have organized the Settings criteria into “Navigation”, “Viewing”, and “Operational.”

As with other tools, in the “Navigation” category we evaluated whether or not the user can easily discover the tool and navigate to it.

In the “Viewing” category, the user should be able to preview the course as a student, determine the sections, and see their lists of enrollments.

In the “Operational” category, the user should be able to change and reorder modules used in the course navigation. The user can also add, edit, and remove sections and set page/section permissions, course details, and time limits of a course.

Results Summary and Feedback

Learning Tools Interoperability (LTI)

1. Collaborations

Rationale

Collaborations redirect the users to Google Docs, which is currently inaccessible.

Criteria

We have organized the Collaborations criteria into the categories of “Navigation” and “Management.” As with other tools, in the “Navigation” category we evaluated whether or not the user can easily discover the tool and navigate to it. In the “Management” category, the user should be able to easily add and remove people and groups in new and existing collaborations and to remove collaborations.

Results Summary and Feedback

2. Conferences

Rationale

This feature uses Adobe Flash for the external conferencing application, which is not accessible.

Criteria

Conferences criteria were divided into the “Navigation” and “Operational” categories. As with other tools, in the “Navigation” category we evaluated whether or not the user can easily discover the tool and navigate to it. Under the “Operational” category, the user should be able to join and participate in a conference using the external application; an instructor should additionally be able to edit, end, and delete a conference.

Results Summary and Feedback

3. Chat

Rationale

Chat allows the user to have an ongoing conversation. It is important that the user be able to connect to other users and follow the conversation in this possibly third-party platform.

Criteria

We have organized the Chat criteria into “Navigation”, “Viewing”, “Management”, and “Operational” categories. As with other tools, the “Navigation” category evaluated whether or not the user can easily discover the tool and navigate to it. The “Viewing” category tested whether the application uses built-in text-to-speech or the OS TTS with ARIA live regions to update and notify the user. The user should be able to distinguish between private and public messages, read through the chat history, and discover the chat participants. Additionally, under “Management”, the user should be able to configure the chat thread chronologically, suppress time stamps that can be redundant during an ongoing conversation, and change the font size and style. In the final “Operational” category, the user should be able to use basic editing features, submit upon pressing the Enter key, select and copy the contents of the chat log using the keyboard, click hyperlinks within it, and locate a desired participant and send private messages if needed.
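A minimal sketch of the live-region pattern these criteria describe (names and messages invented): messages appended to a log region are announced by screen readers without moving focus away from the input.

    <!-- role="log" implies polite live-region behavior: additions are read
         after the user's current activity, without stealing focus -->
    <div role="log" aria-label="Chat history">
      <p>Alice (2:14 PM): Has the quiz been posted yet?</p>
      <p>Bob (2:15 PM): Yes, it is under Week 5.</p>
    </div>

    <label for="chat-message">Message</label>
    <input type="text" id="chat-message">
    <button type="submit">Send</button>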

Results Summary and Feedback

4. Panopto Recordings

Rationale

Panopto Recordings is an external application that allows instructors to record their lectures and capture their screens in order to provide a resource outside of lecture for students who missed class, have a learning disability, or simply want to relisten to the lecture. It links the user out to the Panopto web application, in which a student can watch recorded videos of the lecture and a faculty member can edit videos using the content editing platform.

Criteria

We have organized the Panopto Recordings criteria into the following categories of “Navigation” and “Operational.”

As with other tools, in the “Navigation” category we evaluated whether or not the user can easily discover the tool and navigate to it.

The “Operational” category covered interacting with the video features of the application, such as pausing, playing, rewinding, and advancing the screencast. While doing so, the user should be aware of the elapsed time of the video, have access to the closed captioning and audio description features, and be able to easily sort and filter content.

Results Summary and Feedback

Conclusion

We have observed that Instructure has been working hard to make the Canvas LMS more accessible in recent years. They have a dedicated accessibility liaison for the product, and we applaud their genuine and serious efforts in reaching out to the accessibility community and working with them to enhance the accessibility of their product. Their willingness to participate in regular meetings with the higher education accessibility community is also commendable; more companies should make regularly addressing accessibility issues part of how they operate.

Overall, Canvas is technically accessible in most areas but lacks consistency in design patterns and coding best practices; consequently, certain areas are inaccessible, cumbersome to use, or not functionally accessible. Technical accessibility refers to the application or web site passing automated tests successfully and is a key starting point for accessibility. However, we consider functional accessibility to be the goal: a user with a disability should be able to complete tasks and have the same experience as a user without a disability.

As the CATE team doesn’t know the dependencies and constraints surrounding all the widgets and interfaces we have analyzed, we were not always able to provide a clear design or propose implementation solutions. However, we believe we have collected valuable data that we can share with Instructure and the Canvas Accessibility Collaboration Group to discuss the issues we found and work out optimal solutions that could enhance the overall usability and accessibility of the application.

Offering flexibility through personalization and customization could significantly improve the user experience. There is minimal customization available for users in the form of a high contrast view (still in beta). We believe Canvas needs to provide even more customization and personalization options for users with different needs. Examples of additional personalization options include adjusting alert types, layout of the dashboard, session timeout duration, help verbosity, and the type of content editor applet to be used. Basic personalization would also include font family, size and color as well as background color.

Inconsistent implementation of widgets and coding practices was noted throughout Canvas. For example, the application should clearly distinguish between link and button functionality: links go somewhere and buttons do something. Because the site uses absolute font sizes, zooming the page view can result in horizontal scrolling. In many places, controls had no visible labels, and their hidden labels were not unique, making navigation using Windows Speech Recognition more difficult. Also, the visible labels sometimes did not match the accessible name given to the control by other means. The lack of voice-control interaction functionality in Canvas excludes many people with dexterity limitations and highlights the danger of producing a highly scripted interface. It does not matter to users with disabilities whether the blame is laid at the feet of Instructure or Nuance; the end result is that these users, both faculty and students, cannot participate fully in a highly competitive learning environment. Therefore, a sounder strategy would be to rely as much as possible on standard HTML elements when they can do the job and to avoid gratuitous use of JavaScript. Canvas developers should comb through the application to ensure that design and coding practices are consistent across the application.
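Two of these points reduce to very small markup and styling choices. As a sketch (not drawn from Canvas's code): use a link only for navigation and a button for an action, and size text in relative units so zoomed text reflows instead of forcing horizontal scrolling.

    <!-- Link: goes somewhere -->
    <a href="/courses/101/assignments/3">View assignment</a>
    <!-- Button: does something -->
    <button type="button">Delete assignment</button>

    <style>
      /* Relative units (%, em, rem) let zoomed text reflow;
         absolute pixel sizes are what induce horizontal scrolling */
      body { font-size: 100%; }
      h1   { font-size: 1.5rem; }
    </style>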

Within the views of the Assignment modal dialogs, there were novel interface components that employed non-standard keyboard interaction patterns, poorly labeled controls, and improper focus management. In general, HTML-based modal dialogs should not themselves shoulder the burden of many pages of content with complex interface components. The best strategy might be to move this functionality to an entirely separate page or even multiple pages.
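Where a modal dialog is genuinely warranted, the expected semantics and focus behavior are roughly as follows (a sketch of the standard pattern, not Canvas's implementation):

    <div role="dialog" aria-modal="true" aria-labelledby="edit-title">
      <h2 id="edit-title">Edit assignment</h2>
      <!-- On open: move focus to the first control and trap Tab inside the dialog.
           On Escape or Cancel: close and return focus to the triggering button. -->
      <label for="points">Points possible</label>
      <input type="text" id="points" name="points">
      <button type="button">Save</button>
      <button type="button">Cancel</button>
    </div>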

The Rich Content Editor, which had been unusually accessible in prior reviews, is now highly problematic. Although rich content editors and other LTI components are not Canvas core components, from the end user's perspective it does not matter whether a component is a core component or a third-party LTI: as soon as an inaccessible component is introduced to Canvas, the overall usability of Canvas declines. We encourage Canvas to develop minimum accessibility standards for any LTI that they offer, as well as guidelines that would help their clients make an informed choice when selecting an LTI. This will help guarantee the overall accessibility of the application.

The Canvas help system was problematic in a variety of areas. We recommend that a fresh help tutorial be designed and deployed that does not rely on the forum-style system currently provided. There should be a dedicated area where accessibility information can be delivered and easily located by users. Additionally, the inline and external help that could potentially ease interaction was difficult to access and needs improvement, including user-adjustable verbosity of the inline help system throughout the application.

Canvas can also improve the content creation process by prompting instructional designers and instructors to offer closed captioning for media, to avoid proprietary software like Adobe Flash or PDF, and to provide text transcripts for video/audio content. Institutions also have a responsibility to train faculty and staff on the accessibility of the content they create and upload. Canvas can lead and inspire an accessibility culture by offering resources for creating accessible content in PDF, PowerPoint, and more.

We are confident that despite the functional and technical accessibility problems noted in this report, Instructure will be willing to address the problems indicated and work toward further repairs, redesign, and feature development.