Measure for Measure: A Surveyor’s Guide
The AISL listserv recently had a discussion of reading preferences, digital vs. print, and someone asked about student practices. I was able to zip over, open a file, and respond that our students have shown a steady preference for print, particularly for leisure reading: 71% prefer print, 8% prefer digital, and 21% don’t care about format. For research, 50% prefer print, 15% prefer digital, and 35% don’t care about format. These ratios have been fairly stable over the past several years.
The reason I had those numbers at my fingertips is that those questions have been included on our annual student survey for the past five years. An annual survey is a challenging tool. It’s tricky to build, and arguably trickier to interpret, but it can provide supporting information to help steer your curriculum in the most useful direction, or to make an iron-clad case for a much-needed capital improvement project.
We use SurveyMonkey to create our surveys. Our school has a subscription, so we’re able to incorporate useful ‘advanced’ elements like Logic, which routes respondents to different sections depending on how they answer multiple-choice questions. Our 2017 survey went out to all students via email last May 16, followed by a reminder on the 22nd. This year we had 339 responses out of an Upper School population of 970, a response rate of about 35%. The number of respondents fluctuates yearly, and we’re always trying to increase it: we had 260 responses in 2013, 382 in 2014, and our highest number ever, 474, in 2015. Since then the number has been dropping again. I’m thinking that adding a (candy?) reward for those who fill out the survey may bring in more responses. It’s clear we also need to work on our promotion of the survey.
There is always a balance to strike between creating a survey that is comprehensive and one that is brief enough to be completed quickly. Our most recent survey had 27 questions divided into three sections (or ‘pages’ in SurveyMonkey lingo). I’m not confident in our arrangement of pages, but here’s how we do it for now.
* Overall Library Experience includes questions on what year you are in, which libraries you use, whether you ask a librarian for help with specific resources (that one includes an ‘Other’ box), how your experience could be improved, whether it has ever been too noisy, and your use of Silent Study. Silent Study is our ‘absolutely silent’ room with 36 carrels, used for study and testing. When Silent Study works well, I’m not so worried about noise in the rest of the library, so this is a vital question to track.
* Library Resources asks whether students have the skills required to search a database such as JSTOR or ProQuest effectively (yes, no, sort of, plus a box for “please explain”), how easy or hard it was to manage citations, whether students buy books for research (and if so, why), whether they prefer print or digital, and whether they have ever taken a book out without checking it out (and if so, why).
* Recreational Reading explores whether students use library materials for leisure reading, which format they prefer, and whether they were aware of the various materials the library has available. This section also has a box for “any suggestions for books, magazines or other resources”, as well as a box for “anything else you’d like to comment on not covered in the survey”.
As I look at it now, I can see a number of changes we may incorporate into our next survey. For one thing, the page names themselves may be altering how students progress through the survey: by calling the last section “Recreational Reading”, we may inadvertently lead students to quit early, thinking they don’t do anything called “Recreational Reading” and so don’t need to continue.
Interpreting Results
Each year I comb through the survey results for useful data. It’s important to keep some questions consistent so you can compare responses from year to year; if you alter your questions too dramatically, you lose the ability to gauge changes over time.
Like other librarians, we hope to provide a positive UX (user experience), and many survey questions reflect how students interact with library staff. On one question, we noticed the number of students who would ask librarians for help shrinking from year to year. My reflexive reaction was that we must be scary librarians who don’t encourage return business. Then I noticed the following answer in a ‘tell us more’ box: “During one of our history classes, a librarian came in and told us how to use the catalog and databases, so I did not have to ask for help”. A carefully crafted survey will help tease out the reasons behind the answers students give.
Another caution: before wigging out at one specific negative response, look at the denominators. Even though 339 students responded to the survey, it might be that only 8 people answered a given question. The data may show that 25% of students don’t like a particular thing, but if only 8 students answered that question and 2 of them didn’t like it, those 2 students (well under 1% of all respondents) account for the entire 25% negative response.
Acting on Results
A few points of action come to mind as I reflect on our surveys. When we started seeing student comments mentioning discrepancies between teachers’ instructions and librarians’, I knew I needed to touch base with the team leader to clarify where our presentations had diverged. It turned out that a new teacher had not been managing things the same way as the other teachers.
Another important point came from a question asking whether students felt they could successfully search a database. The number responding ‘yes’ was lower for Sophomores, higher for Juniors, and (surprise) lower again for Seniors. On reflection, it makes perfect sense. We teach information literacy to our Sophomores and Juniors, but we don’t teach Seniors, in large part because many teachers assume (falsely, it seems) that Seniors ‘know this stuff’. We can use these details as support for adding Seniors to our curriculum.
On a practical note, last year’s survey indicated many students wanted more soft seating, so we got two additional beanbag chairs. They’re a big hit. We also had students asking for more carrels. No, we didn’t go out and buy more carrels, but we are now more aware that they’re a valued commodity, so we patrol more often and keep students from ‘claiming’ a carrel as their own private territory between classes.
Reflections
Each year I learn more about our library, our program, our strengths and our challenges. It is important to hold a staff meeting soon after the survey closes to take note of how the survey worked ‘this year’ and to discuss how it might be changed for next year. A final piece of advice: never, ever send a survey out on a Friday. Ever!
Our survey is one of the most important tools we have to help improve our library. Do you have a survey you find useful? Please share it here. I’ve included a link to the questions on our 2017 survey. Let me know what you think.