Accessibility review of online survey tools

Here in UW-IT Accessible Technology Services (ATS), we are frequently asked which of the online survey tools commonly used at the UW produces the most accessible survey forms. To answer this question confidently, the ATS IT Accessibility Team recently reviewed the following five tools, listed alphabetically:

  • Catalyst WebQ
  • Google Forms
  • Microsoft Office Forms
  • Qualtrics
  • Survey Monkey

With each survey tool, we created a test survey that included the following form fields:

  1. A simple text field of “Your name,” with supplemental instructions in addition to the label (“Enter your full name [first, middle, and last]”). To be accessible to screen reader users, both the label and the help text need to be explicitly associated with the input field; associating the help text requires the aria-describedby attribute. This field is required. (See the markup sketch below this list.)
  2. Another simple text field of “Your email,” with validation where available.
  3. A date field of “Your birthdate,” with validation where available.
  4. Two multiple-choice questions, one with one possible answer (radio buttons) and another with multiple possible answers (checkboxes). For both of these questions, users must select at least one response.
  5. A grid of Likert-scale questions, where each row contains the item to be rated, and each column contains a radio button on which to rate that item (e.g., “Excellent”, “Good”, “Neutral”, “Bad”, “Horrible”). Users must select one item in each row (see screenshot below).

A Likert question for rating multiple conferences on a 6-point scale
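
For readers who want to see what this looks like in code, here is a minimal sketch of the “Your name” field. This is markup we wrote for illustration, not the output of any of the tools tested:

    <!-- Minimal sketch of an accessible required text field.
         The label is associated with the input via for/id,
         the help text via aria-describedby, and required
         enables the browser's built-in validation. -->
    <label for="name">Your name *</label>
    <input type="text" id="name" name="name"
           aria-describedby="name-help" required>
    <p id="name-help">Enter your full name (first, middle, and last)</p>

With this markup, a screen reader that moves into the input announces the label, the required state, and the help text.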

We tested each form by attempting to complete it with the keyboard alone (no mouse); then Hadi Rangin, a full-time screen reader user, tested each form with both JAWS and NVDA (Windows screen readers). We also inspected the HTML code to see whether it conformed to HTML standards and accessibility best practices, and we checked for other accessibility issues such as insufficient color contrast.

The following is a high-level summary of what we found:

  • Both Google Forms and Microsoft Office Forms produce highly sophisticated forms that use ARIA extensively to make the complex relationships between a form field’s parts accessible to screen reader users. (ARIA stands for “Accessible Rich Internet Applications”; for more on its role in web accessibility, see Using ARIA for Web Applications.) Of the tools tested, Google and Microsoft produce the most accessible survey forms, although the amount of information provided to screen reader users may in fact be excessive (the forms became extremely verbose).
  • Google Forms and Microsoft Office Forms seem to be the only tools tested that do client-side error checking. If WebQ, Qualtrics, or Survey Monkey have this capability, it wasn’t apparent when we were creating the test surveys. Client-side error checking is preferred because it provides immediate feedback if the user skips a required field or enters an invalid response (see the first sketch after this list). If users don’t learn about their mistake until after they’ve submitted the form, it can be challenging for them to (a) figure out what they did wrong, and (b) find their way back to the field that needs fixing.
  • Qualtrics doesn’t seem to indicate whether fields are required, either in the code or in the user interface. It’s the only tool we tested that doesn’t at least place an asterisk (*) next to required fields. There are many possible ways to communicate this information, the best being the HTML required attribute (as in the name-field sketch above), which is well supported by browsers and assistive technologies. In a Qualtrics survey, users have no idea which fields are required until they’ve submitted the form. Then they are, in a sense, reprimanded for breaking rules they didn’t know existed.
  • Both Qualtrics and Survey Monkey have quirky code, and screen readers are quirky in how they handle it. As we studied the code behind their surveys, we often found ourselves completely puzzled as to why they had coded something the way they did. I’ll be exploring that code in detail in a presentation at the HighEdWeb conference on October 5 (see below for more details). When testing with screen readers, we frequently experienced unexpected and inconsistent output with surveys created using these tools, which may be due to the quirky code.
  • The Likert question is presented visually as a table. However, functionally it isn’t a table; it’s a form. The best practice is therefore to remove the table markup, either by using non-semantic elements such as <div> instead of table elements, or by adding role="presentation" to the table elements (see the second sketch after this list). If the interface is not coded this way, screen readers announce both the table-related information and the form-related information, which makes the interface extremely verbose and therefore extremely cumbersome to complete. Another problem is that both JAWS and NVDA fail to recognize relationships between radio buttons when they’re distributed across separate table cells. If radio buttons are grouped together in a common block element, screen readers announce them as “radio button X of Y,” where Y is the number of options and X is the position of the current option within the set. If the radio buttons are in separate table cells, screen readers announce each button as “radio button 1 of 1,” which is inaccurate and potentially confusing for users. Google Forms and Microsoft Office Forms are the only tools tested that code this interface correctly.
  • In the default Qualtrics theme, there is no visible difference between checkboxes and radio buttons (see Figure 1, below).
  • The default Survey Monkey theme has a very low color contrast (see Figure 2, below).
  • Catalyst WebQ has generally good HTML code, but it is no longer actively developed, so it does not use modern techniques for coding HTML forms or handling errors.
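
To illustrate the client-side error checking point above: browsers now provide basic client-side validation through HTML attributes alone, with no scripting required. Here is a minimal sketch (again, written for illustration, not taken from any of the tools tested); the form action is a placeholder:

    <!-- Minimal sketch: built-in HTML5 constraint validation.
         With required and type="email", the browser blocks
         submission, moves focus to the first invalid field, and
         shows an error message before anything reaches the server. -->
    <form method="post" action="/survey">
      <label for="email">Your email *</label>
      <input type="email" id="email" name="email" required>
      <button type="submit">Submit</button>
    </form>

This gives users immediate feedback at the field itself, rather than after a round trip to the server.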
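
To illustrate the Likert point: here is a minimal sketch of a single Likert row coded as a form rather than a table (the row label “Conference A” is a placeholder). A visual grid layout could then be applied with CSS, or with table elements carrying role="presentation":

    <!-- Minimal sketch of one Likert row. The radio buttons share
         a name and a common fieldset, so screen readers announce
         them as "radio button X of 5" rather than "1 of 1".
         required on one radio makes the whole group required. -->
    <fieldset>
      <legend>Conference A</legend>
      <label><input type="radio" name="conf-a" value="excellent" required> Excellent</label>
      <label><input type="radio" name="conf-a" value="good"> Good</label>
      <label><input type="radio" name="conf-a" value="neutral"> Neutral</label>
      <label><input type="radio" name="conf-a" value="bad"> Bad</label>
      <label><input type="radio" name="conf-a" value="horrible"> Horrible</label>
    </fieldset>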

Figure 1: Radio buttons and checkboxes in Qualtrics

Fieldsets with radio buttons and checkboxes look exactly the same in Qualtrics

Figure 2: Color contrast in Survey Monkey

Colour Contrast Analyser dialog shows that Survey Monkey's default light green on white text fails all contrast requirements

Conclusion and recommendations

We’ve been using Catalyst WebQ for decades for our own forms and surveys. We have trusted its accessibility and, if we found problems, we knew who to call. However, web coding practices have evolved over the years, including methods for coding forms and for making them accessible. Since WebQ is no longer actively developed, it no longer offers the most accessible survey solution.

Today, the best tools for creating accessible forms, based on our tests, are Google Forms and Microsoft Office Forms. However, which of these tools is best may change at a moment’s notice. When we did some preliminary tests two years ago, Google’s client-side validation triggered an error message on every keystroke as users typed their responses (see screenshot below). This was silly and annoying for sighted users (of course that’s not a valid email address; I’ve only typed one character!), but for screen reader users it constantly interrupted them and made the form nearly impossible to complete. Google has since fixed that bug, and currently the forms produced by both Google and Microsoft are highly accessible. However, given the frequent updates these products receive, we recommend carefully testing your forms before deploying them, including testing with screen readers.

Email field with a single character typed, and an error message: That is not a valid email address!

More details at HighEdWeb

I’ll be giving a presentation on this topic at the HighEdWeb virtual conference next week. The conference is Monday and Tuesday (October 4 and 5), and my presentation, Accessible Online Forms, is the last session of the conference, at 2:00pm Pacific on Tuesday. Registration is open until Friday, October 1, at 11:59pm.