Well, it's already July and between two courses and work at the Inforum, the summer is flying by. I have recently wrapped up my first big project in my role as summer intern with the Information Services team: analysis and reporting on the results of the 2010 Information Services student survey.
Each year, the Information Services (IS) team at the Faculty of Information conducts a student user survey to help evaluate Information Services and find ways to improve. This year, the IS team tried something different from what they've done in past surveys. Rather than a long list of multiple-choice questions, the 2010 survey consisted of three open-ended questions in which student respondents were asked what they liked (Q3), what they didn’t like (Q4), and what other information they wanted to share with the Information Services team (Q5).
As a result, the data were also pretty different from what the IS team had collected in the past! Responses to each of the three questions ranged from simple, one- or two-word answers to paragraphs of up to several hundred words. Of course, this type of data requires different analysis than does a tally of multiple-choice answers. Since I had previous experience doing qualitative data analysis (QDA) during my BA and MA, I took on the job of coding and analyzing the qualitative survey data. I sorted the data into themes, assigned codes, and completed the analysis using NVivo 8 software. I have to admit that I was disappointed in NVivo. First, the software is only available for Windows machines, which always irks Mac users! More importantly, I was surprised by the limited options for importing data. Why isn't there a solution for importing Survey Monkey results, for instance? Given the wide range of new data sources emerging from Web 2.0 applications, I am certain someone has figured this out. I have begun looking into alternative QDA software packages and will post the results of what I find here.
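For readers curious what this kind of coding looks like in practice, here is a minimal sketch in Python of tallying theme codes across open-ended responses from a Survey Monkey-style CSV export. Everything here is hypothetical: the column names, the keyword codebook, and the sample responses are invented for illustration, and real qualitative coding is an iterative, human process rather than simple keyword matching.

```python
import csv
import io
from collections import Counter

# Hypothetical Survey Monkey-style CSV export: one row per respondent,
# one column per open-ended question (column names are illustrative).
sample_csv = """respondent_id,Q3_liked,Q4_disliked
1,"friendly staff; quiet study space","limited hours"
2,"quiet study space","printing costs; limited hours"
"""

# A toy keyword-based codebook mapping phrases to theme codes.
# (Codes are made up; a real codebook emerges from reading the data.)
codebook = {
    "staff": "service",
    "study space": "facilities",
    "hours": "access",
    "printing": "cost",
}

# Count how often each theme code appears across all responses.
counts = Counter()
for row in csv.DictReader(io.StringIO(sample_csv)):
    for question in ("Q3_liked", "Q4_disliked"):
        text = row[question].lower()
        for keyword, code in codebook.items():
            if keyword in text:
                counts[code] += 1

print(counts.most_common())
```

Even a rough tally like this can suggest which themes dominate before a closer reading in a dedicated QDA tool.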
The results of the open-ended survey (while perhaps more difficult to analyze than results from a multiple-choice survey) provided us with a rich description of students' understandings of the work done by the Information Services team. Ultimately, I think it is this sort of qualitative examination of users' perceptions that is vital for shaping strategic planning in service-based organizations like libraries. As web applications like Survey Monkey make online surveys easier, and as more and more students turn to Web 2.0 applications like Twitter and Facebook to express their thoughts and opinions, it is really only our imaginations that limit the kinds of user surveys we can do.
The Information Services 2010 User Survey, which was released to the iSchool community on June 27th, summarizes the data collected from the survey and includes the IS team's plans to address students’ issues and make improvements. You can find a PDF version of it here.