Both the Kent Academic Repository (KAR) and the Kent Data Repository (KDR) have been around for some time but needed work to improve accessibility.
These systems are based on the leading open source EPrints software, with a mixture of third-party plugins, in-house modifications and styling added. As the software is open source, people can modify it and contribute their changes back to the community so that others can benefit from that work. As a result, many people have contributed to the repositories' code base. Fortunately, because we can access the source code, we can also amend it ourselves to make the repositories more accessible.
In this post I will cover how we detected accessibility issues, some of the issues we found, how we went about addressing them, and some of the tools that helped us along the way.
We had to look at each component of the system that generated any output to ensure that:
- Keyboard navigation worked as expected, e.g. when pressing the tab key in a menu, the next menu item became highlighted. It was also necessary to make sure that all elements in the page were reachable just through keyboard navigation.
- Elements were correctly labelled so that screen readers could announce them properly, e.g. a navigation button should be labelled as such to make its purpose clear.
- HyperText Mark-up Language (HTML) tables used for presentation were removed so that screen readers could navigate the page properly. HTML tables were once widely used to lay out pages, but screen readers treat them as representing tables of information, and may struggle to convey the layout of a page if tables have been used inappropriately.
- Colour combinations used on the site were of high enough contrast to not cause any problems for people with colour vision deficiency.
- Both consuming and creating functionality were equally accessible. Users of assistive technology should be able to use the system just as easily when creating or editing records as when looking for information.
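As an illustration of the labelling checks described above, here is a minimal sketch (not code from the EPrints code base, just a hand-rolled example) that uses Python's standard `html.parser` to flag links and buttons with no accessible name, i.e. no text content and no `aria-label` or `aria-labelledby` attribute:

```python
from html.parser import HTMLParser


class AccessibleNameChecker(HTMLParser):
    """Flag <a> and <button> elements with no accessible name
    (no text content and no aria-label / aria-labelledby)."""

    def __init__(self):
        super().__init__()
        self._stack = []      # open (tag, has_label, text_parts) frames
        self.unlabelled = []  # tag names flagged as unlabelled

    def handle_starttag(self, tag, attrs):
        if tag in ("a", "button"):
            attrs = dict(attrs)
            has_label = bool(attrs.get("aria-label") or attrs.get("aria-labelledby"))
            self._stack.append((tag, has_label, []))

    def handle_data(self, data):
        # Text content counts towards the accessible name of every open element.
        for frame in self._stack:
            frame[2].append(data)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1][0] == tag:
            name, has_label, text = self._stack.pop()
            if not has_label and not "".join(text).strip():
                self.unlabelled.append(name)


checker = AccessibleNameChecker()
checker.feed('<button aria-label="Open navigation"></button>'
             '<a href="/download"><img src="pdf.png"></a>')
print(checker.unlabelled)  # the icon-only link has no accessible name
```

Real screen readers apply the full accessible name computation (which also considers `alt` text, `title` and more), so a simple check like this only catches the most obvious cases, but it shows the kind of problem the tools below report.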
As this work was being done, regular testing took place to ensure that accessibility levels were improving. Testing with people is one of the best ways to test, but as we needed to test our changes continuously, it would have been a lot to ask people to repeat the same tests over and over again. Tools such as Lighthouse, which is built into Google Chrome, and WAVE can help by generating prioritised lists of outstanding issues. As a site is made more accessible, this list should shrink each time the tests are run. Testing with users is best done after these automated tests, as by then most common issues should have been caught.
A lot of work has already been done by EPrints Services, the commercial company that provides support services for EPrints users. Unfortunately, these improvements were mainly to be found in a major new version that we could not yet upgrade to for operational reasons. Instead, we incorporated those changes into our source code, modifying them to work with the version of EPrints we use.
Further changes were made to our instance of the EPrints source code, to each plugin and to the theme used, to meet the goals above. The tools below were used iteratively until improvements were observed.
Many software tools are freely available to assist in making websites more accessible, so there is no need for any expensive specialist software. Different tools excel in different areas, so using several in combination proved a very useful path to issue discovery and resolution.
Keyboard navigation issues were isolated using the ANDI bookmarklet. This utility generates a list of elements that will not be announced correctly by screen readers. It can also demonstrate keyboard navigation, letting you tab through the page and showing the label that would be announced for each element.
The Accessibility panel in Mozilla Firefox's developer tools was also used, as it can overlay the tab order on a representation of the web page so you can compare it with expectations. Checks were also made using the NVDA screen reader, listening to its output to see if it matched expectations.
The Lighthouse tests showed a number of colour contrast issues. These were solved using the Colour Contrast Analyser tool to generate adjusted colour schemes that looked similar to the original but did not cause problems for people with colour vision deficiency.
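For reference, the contrast checks these tools perform follow the WCAG 2.1 definition of contrast ratio, which can be sketched in a few lines of Python (a hand-rolled illustration, not code from any of the tools mentioned):

```python
def relative_luminance(rgb):
    """WCAG 2.1 relative luminance of an sRGB colour given as 0-255 channels."""
    def linearise(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearise(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b


def contrast_ratio(fg, bg):
    """Contrast ratio between two colours, from 1:1 up to 21:1.
    WCAG AA requires at least 4.5:1 for normal-sized text."""
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)),
                             reverse=True)
    return (lighter + 0.05) / (darker + 0.05)


# Black on white achieves the maximum possible ratio of 21:1,
# while mid-grey (#777777) on white falls just short of the AA threshold.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
print(contrast_ratio((119, 119, 119), (255, 255, 255)) < 4.5)  # True
```

A helper like this made it easy to confirm that each adjusted colour pair in a candidate scheme cleared the 4.5:1 threshold before applying it to the theme.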
Not all issues can be solved by automated tooling. Advice was also sought from accessibility champions inside our organisation, and knowledge was gained by attending virtual accessibility conferences. One unexpected side effect of the Covid pandemic has been that attending conferences became much easier: there was no need to physically attend or to justify travel expenses, so it was possible to attend several events and learn a great deal from the presenters.
The arrival of the new regulations has brought a welcome focus on improving sites for users of assistive technology. While complying with the regulations would be a wonderful start, it is important to remember that this work is about people and not just regulations. In the future we would like to incorporate feedback from assistive technology users directly in our planning and development. We would also like to incorporate User Stories into our planning that include a wide and diverse audience using different parts of the system. This would help us better evaluate the impact of choices on different groups.
Accessibility will now form a regular strand of work in the development of our institutional repositories. A “continuous improvement” approach will be taken to help ensure that our systems are usable by all.