User feedback for the Raptor Application

Library staff who have been part of a group assisting with the evaluation of Raptor were recently asked to complete a short survey on their experiences with the application. I am now analysing the results – that sounds rather grand, doesn't it? – so let's say I am having a look at what they said. The test group was small but was picked from library staff who would be likely to use Raptor if it is adopted as a service and who would benefit from the data Raptor can provide. From my own experience with Raptor I think it is obvious that useful data is available and useful charts can be produced, either directly from within the application or by porting data into Excel. However, I don't think I am best placed to judge this, as I have been working fairly regularly with the application for a few months now. On the other hand, I didn't think it would be a useful exercise simply to sit someone in front of Raptor for the first time and ask them for an opinion on how useful it is. So we worked on a compromise.

At the end of last month I ran a short workshop for library staff, going over the main features and operation of Raptor and allowing time for a bit of experimentation and questions. Raptor is installed as a test service at Kent and, once placed in the user group, these library staff were able to access the application using their usual network username and password (LDAP) from anywhere they were working. I asked the test group to try to find time to experiment with the application over the following weeks and then sent out a link to an online survey asking questions on the user interface, customisation options and the usefulness of the reports available. The survey featured some straightforward questions but also asked the users to produce a graph showing authorisations to a specific resource over a specified period. They were then asked to change the parameters and sort order for this graph and finally to export a pdf of the graph. The survey asked for comments as well as ranked responses on ease of use and usefulness.

This was a very small test group, so we should be wary of too strict an interpretation of the results. However, in general the testers gave positive feedback on Raptor. Suggestions were made for improvements to the interface, which will be fed back to the developers. At Kent, response times were not always great, though this may not be the fault of the software. More frustrating was the lack of 'feedback' during the period between an update being requested and Raptor producing – or sometimes failing to produce – the requested graph. Users did not know whether 'anything was happening or not'. When updates failed it was not always apparent that this had happened, as the previous version of the graph remained visible and the Processing Status is not particularly prominent. These are minor issues which I am sure can be eradicated in future releases – Raptor is still in the early stages of development. Overwhelmingly, the group considered Raptor to be a useful tool which would assist them in their work and planning.

 


Evaluating the application

The living wall of the Grimond Building at the University of Kent, UK.

University of Kent Faculty librarians have suggested a number of reports which I have now included in the list of pre-configured reports within Raptor. We have the application installed and running well and available to the library staff who will help us evaluate the service. Last week we held a short workshop to introduce library staff to Raptor and to get some initial feedback. These staff will familiarise themselves with Raptor and then be asked to complete a series of tasks and provide feedback.

We are approaching the evaluation in two segments:

  • the usability and user-friendliness of the Raptor UI, and
  • the usefulness of the resulting reports to library staff in various roles.

The report facility does allow us to configure things to a limited degree, but it might be necessary to port data into Excel if users want a greater degree of control over the final appearance of a graph. In particular, the pdf report can only be edited if the user has a copy of Adobe Acrobat rather than just the free Reader. This is not to say that the pdf report has no use – for a quick snapshot it is pretty good. Later in the project we will explore how we can integrate the Raptor data with Microsoft Reporting Services, which should give us a lot more control over the appearance of reports.

The workshop produced requests for some data that Raptor does not currently collect and probably never will – for instance, a measure of how long a user stayed on a particular e-resource site. This would be difficult even if the log files did store such data. We would need to distinguish between users who just had the resource open in a browser tab or another window but were not actually interacting, and those who were searching for data or reading. Not so easy.

I am sure a common observation from Raptor users is that the drop-down parameters are hard to decipher and there are often too many of them – some of no relevance. Again, this may be something we can tackle when we look at how we can integrate with Microsoft Reporting Services.

Workshop participants also identified one or two reports that did not work as expected, but I have yet to establish whether this is down to Raptor or to badly configured XML files – which will be down to me…

I am currently working on an evaluation survey to gather opinions on the usability of Raptor for library staff.


Raptor and Data Protection issues

It is always fascinating in the early weeks of a project to observe how what might seem like a fairly straightforward proposition rapidly spins out its tentacles and gets much more complicated. One of the issues that has arisen already at Kent is that of data protection and long-term data storage. The library staff would like to see data trends over periods of time – ideally as long as five years. Raptor, supplied with log files for the required period, has no problem producing appropriate reports. However, the general advice on keeping log files from the European Directive on Privacy and Electronic Communications says we should not retain data for longer than the time necessary to carry out the specific purpose for which the data was collected. Most people would say that log files are created and kept to help troubleshoot problems and to increase security – log files can be analysed to spot unusual patterns of activity. It is pretty difficult to justify keeping data for more than a few months if that is indeed the purpose of collecting the data. But if we are using log files to help with projections of usage, or to produce reports and graphs which simply analyse access to e-resources, do we need to state specifically that this is what we are intending to do with the data?

The relationship between a university and its users – generally staff and students – is somewhat different to that between, for instance, an ISP and its users. In the former case there is an assumption, even an expectation that data will be retained much longer than six months – in fact in order to comply with education and employment law, organisations are required to keep the data throughout the period of education. But five years is probably beyond what is reasonable.

The guidelines do not state that we cannot keep the data for longer periods, but stipulate that if we do keep it, it must be anonymised. Anonymised data is less rich – it may not be possible to extract the type of affiliation from anonymised data, for instance, if there is no longer a username to look up. The problem is not insurmountable, but it does bring into focus a more general issue with Raptor and its use by non-technical staff.
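To give a flavour of what the anonymisation step might involve, here is a minimal Python sketch that replaces usernames with a keyed hash (HMAC), so the same user always maps to the same pseudonym – long-term usage trends survive, but the username (and hence any affiliation lookup) is lost. The log format, field position and key are all placeholder assumptions for illustration, not Raptor or Shibboleth specifics.

```python
import hmac
import hashlib

# A secret key held locally by the institution. With a keyed hash the
# mapping is consistent (same user -> same pseudonym, so trends survive)
# but cannot be reversed without the key.
SECRET_KEY = b"replace-with-a-locally-held-secret"

def pseudonymise(username: str) -> str:
    """Replace a username with a stable, non-reversible pseudonym."""
    digest = hmac.new(SECRET_KEY, username.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

def anonymise_line(line: str, user_field: int = 2) -> str:
    """Swap the username field in a space-delimited log line.

    The assumption that the username is the third field is purely for
    illustration; real authentication log formats differ and would need
    their own parsing.
    """
    fields = line.rstrip("\n").split(" ")
    if len(fields) > user_field:
        fields[user_field] = pseudonymise(fields[user_field])
    return " ".join(fields)

# Two hypothetical log entries for the same user, months apart: both map
# to the same pseudonym, so per-user trends can still be charted.
line1 = anonymise_line("2012-03-01T09:15:02 idp.example.ac.uk jsmith42 jstor OK")
line2 = anonymise_line("2012-06-14T11:03:40 idp.example.ac.uk jsmith42 sciencedirect OK")
```

A script along these lines could be run over log files before they pass the retention limit, with the anonymised copies kept for long-term trend analysis – though whether a keyed pseudonym counts as fully anonymised for the purposes of the guidelines is a question for your data protection officer, not for the code.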

An ideal tool for liaison librarians and library management would be one that could handle ad hoc reports. By its nature, an ad hoc report should not have to be delayed whilst someone in the Learning & Research Department finds time to create it. The seriousness of this will vary across the sector depending on the level of technical expertise among library staff. Some university libraries may well have staff who are comfortable tinkering with the XML to create new and modified reports, or who can anonymise data and import the modified logs for Raptor to parse. But many libraries will rely on their IT department to do this work for them. Whilst IT departments will be happy to provide this service, it will not always be possible to respond as quickly as the requesters would like.

I would be interested in hearing other Raptor users' views on data protection issues and how they intend to tackle them.

BTW, it feels a little mean criticising Raptor, but I guess that is part of the evaluation. So I would also like to say that we think Raptor is a really useful tool, and one that was much needed.

 


Laying the foundations

Following the news of our successful bid, it's down to work – putting the administration in place and preparing the communication channels, such as this blog, the SharePoint site and the project website. The Project Plan and the text for the JISC Project website will be delivered to our JISC Programme Manager – Chris Brown – by the end of the week.

I am very pleased to have been given the role of Project Manager for this project and am looking forward to working with colleagues in the Library on delivering our goals. ARK is a relatively short six-month project, but we are fortunate that many of the Project Team have already worked on the pilot project and bring very welcome experience.

The UKC Project website will be up and running very soon, giving a summary of the aims and goals of the project and links to the project plan and progress reports as they become available. Here is a paragraph from the website summarising what the project is about.

The JISC-funded ARK project will analyse the impact of, and look at the potential gains from, the University's implementation of the Raptor toolkit. Raptor is a software suite for the accounting of authentication information, primarily designed to help organisations account for e-resource usage. Raptor reports will improve our understanding of the demand for, and use of, electronic resources such as journals and databases by the staff and students of the University. They will also allow us to assess usage by each academic school and to use this data to explore how we can improve internal charging models.

Other work to be done in these early days of the project includes setting up the Steering and Implementation groups and agreeing a schedule of meetings for the duration of the project.

So, in summary, not much to report yet, but the project team are very enthusiastic about what ARK can deliver. Prior to the implementation of Raptor the University had been unable to monitor the use of e-resources with the level of detail needed to inform strategic planning. There is little doubt that the impact of the ARK project will be useful, not just to Kent but to the wider HE/FE community.
