Evaluating the application

The new living wall on the Grimond Building at the University of Kent, UK.

University of Kent Faculty librarians have suggested a number of reports which I have now included in the list of pre-configured reports within Raptor. We have the application installed, running well, and available to the library staff who will help us evaluate the service. Last week we held a short workshop to introduce library staff to Raptor and to get some initial feedback. These staff will familiarise themselves with Raptor and then be asked to complete a series of tasks and provide feedback.

We are approaching the evaluation in two segments:

  • the usability and user-friendliness of the Raptor UI, and
  • the usefulness of the resulting reports to library staff in various roles.

The report facility does allow us to configure things to a limited degree, but it may be necessary to port data into Excel if users want greater control over the final appearance of a graph. In particular, the PDF report can only be edited if the user has a full copy of Adobe Acrobat rather than just the free Reader. This is not to say that the PDF report has no use – for a quick snapshot it is pretty good. Later in the project we will explore how we can integrate the Raptor data with Microsoft Reporting Services, which should give us much more control over the appearance of reports.
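To give a flavour of what that Excel porting step might look like, here is a minimal Python sketch. It assumes a report has been exported from Raptor as a two-column CSV – the file name raptor_report.csv and the column names are mine, not Raptor's – and uses pandas and openpyxl to produce a workbook with an embedded chart that staff can then restyle however they like:

```python
# A minimal sketch, assuming a Raptor report exported as CSV with two
# columns: "resource" and "authentications". The file names and column
# names are hypothetical, not part of Raptor itself.
import pandas as pd
from openpyxl import load_workbook
from openpyxl.chart import BarChart, Reference

df = pd.read_csv("raptor_report.csv")
df.to_excel("raptor_report.xlsx", sheet_name="Report", index=False)

# Re-open the workbook and add a bar chart so staff can restyle it in Excel.
wb = load_workbook("raptor_report.xlsx")
ws = wb["Report"]
chart = BarChart()
chart.title = "Authentications per resource"
data = Reference(ws, min_col=2, min_row=1, max_row=ws.max_row)  # values + header
cats = Reference(ws, min_col=1, min_row=2, max_row=ws.max_row)  # category labels
chart.add_data(data, titles_from_data=True)
chart.set_categories(cats)
ws.add_chart(chart, "D2")
wb.save("raptor_report.xlsx")
```

Once the data is in a native Excel chart, all of Excel's formatting controls apply, which is exactly the degree of control the built-in reports cannot offer.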

The workshop produced requests for some data that Raptor does not currently collect and probably never will – for instance, a measure of how long a user stayed on a particular e-resource site. This would be difficult even if the log files stored such data: we would need to distinguish between users who merely had the resource open in a browser tab or another window without interacting with it, and those who were actively searching or reading. Not so easy.
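To illustrate why, here is a rough Python sketch of the usual workaround: an idle-timeout heuristic that groups timestamped requests into sessions. Everything in it – the log tuples, the user names, the 15-minute cut-off – is a made-up assumption rather than anything Raptor records. Note how the lone 11:00 request collapses to a zero-length session, even though that user may have been reading the whole time:

```python
# A sketch of why "time on site" is ambiguous, using made-up data.
# Each tuple is (user, timestamp) for one logged request; the users,
# events, and 15-minute idle timeout are all assumptions.
from datetime import datetime, timedelta
from collections import defaultdict

IDLE_TIMEOUT = timedelta(minutes=15)

events = [
    ("alice", datetime(2012, 5, 1, 9, 0)),
    ("alice", datetime(2012, 5, 1, 9, 5)),
    ("alice", datetime(2012, 5, 1, 11, 0)),  # open tab all morning, or a new visit?
    ("bob",   datetime(2012, 5, 1, 9, 30)),
]

sessions = defaultdict(list)  # user -> list of (start, end) spans
for user, ts in sorted(events):
    spans = sessions[user]
    if spans and ts - spans[-1][1] <= IDLE_TIMEOUT:
        spans[-1] = (spans[-1][0], ts)   # within the timeout: extend session
    else:
        spans.append((ts, ts))           # otherwise: start a new session

for user, spans in sessions.items():
    for start, end in spans:
        print(user, start, "->", end, "duration", end - start)
```

Whatever timeout you pick, a user who reads one page for an hour and a user who wandered off to make coffee look identical in the logs – which is why this is not a report we expect Raptor to provide.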

A common observation from Raptor users, I am sure, is that the drop-down parameters are hard to decipher and there are often too many of them – some of no relevance. Again, this may be something we can tackle when we look at integrating with Microsoft Reporting Services.

Workshop participants also identified one or two reports that did not work as expected, but I have yet to establish whether this is down to Raptor or to badly configured XML files – which would be down to me…

I am currently working on an evaluation survey to gather opinions on the usability of Raptor for library staff.
