Measuring impact
Hello, I attended Elisabeth’s webinar (the second part of the Information Literacy series) and she suggested posting questions here. Apologies if I have ended up in the wrong place. [Moderator note: Topic moved to this forum from Skills, skills frameworks and stages in the FOSIL Cycle on 8th May 2020]
My question is: does anyone “measure” the effects of Information Literacy on students? In my, albeit limited, experience I have found that schools have to measure because the bottom line is grades. I’m just wondering what everyone does.
I was thinking that any or all of the following might apply:
N.B. I had originally posted essentially the same question elsewhere in the Forum but, following the webinar, I have decided to post it here…
Hi Vittoria – sorry I didn’t manage to answer your question before. It is an important one, because it’s very easy to get caught up in doing what we do because we think it ‘ought’ to work, without actually making sure that it does. It is also important because, as you rightly point out, schools are often very interested in measuring impact, and being able to ‘prove’ that what we do has an impact can be an important (vital?) advocacy tool.
On a national and international scale, there have been a number of academic studies (particularly in the States) into the impact of school libraries and “School Library Media Programs” (which usually include what we would term Information Literacy), e.g. the Colorado studies by Keith Curry Lance et al. This 2017 National Literacy Trust research report into School Libraries is quite a good summary of the research (although a few years old now).
But I don’t think I am answering your question – which is what we do – largely because I don’t actually do as much as I would like when I sit back and think about it; this important task gets lost in the everyday busyness (until it’s too late and you suddenly find you need evidence that you don’t have!).
With the caveats that I would like to (and fully intend to…) do more in this area, and that we have now travelled so far down the road of Inquiry that I find it almost impossible to think about assessing Information Literacy as something separate from Inquiry as a whole, here are some things we have done:
These are mostly qualitative measures – they are about staff and student attitudes – and while this is important, quantitative data is also an important tool. I don’t think exam results are necessarily helpful to us as individual librarians, because it is too hard for us to control for other factors and say that any impact on results is down to our intervention (the Lance studies did look at standardised test results, but they were able to do so because they were large studies across a whole state).

However, it should be possible to look at pieces of work where we can pick out skills we know we have taught and analyse our impact that way. One study I would love to do (and keep threatening to do when I have the time!) is to look at research essays produced in the first few weeks of term as part of our Year 12 induction and analyse their technical aspects (e.g. formatting, citing and referencing), both to assess the level of technical competence and to see whether there are any differences between our ‘home grown’ students and new joiners. This might give us a sense of how effectively we are embedding these inquiry skills further down the school (depending, of course, on what is or isn’t being done elsewhere!).
In 2011, when I did my MSc in Information and Library Studies, my dissertation was on “The impact of Library interventions to support the IB Extended Essay at Oakham School”, and in a sense I did a similar thing. I compared the technical aspects of the Extended Essays our IB students produced in the year before and the year after a significant change was made to our support programme. Although it was a small sample, there was evidence that the new interventions had had an impact and that they were well received by students and staff. This was a useful advocacy tool for continuing and expanding our provision.
I think that over the next few years it will be important for our Library to start pulling all these disparate bits of feedback together into a single ‘Annual Report’, running alongside traditional Library measures such as usage statistics. In an economic downturn, where school budgets are bound to be squeezed even further, it is more important than ever to be able to demonstrate clearly how we support the mission of the school and add value – because I believe passionately that we do, but it is often more challenging for us to demonstrate that than it is for other academic departments, because the value we add is spread across the whole school and cannot easily be measured by subject-based exam results (although, as Lance demonstrates, it almost certainly impacts upon them).
Thank you for prompting me to think that through ‘out loud’ – sometimes we need a prompt like that to spur us on into action! I would be very interested to hear what others do. I have considered, on occasion, a September/July questionnaire for a year group, but wonder how good the uptake would be – perhaps it would work best if we could tie it to particular events/workshops/courses/lessons that the Library delivers? Does anyone else do a September/July questionnaire? With which year groups? How do you get students to respond?
[Note: I have moved this part of the thread into the ‘Effecting change’ forum because I think that is probably where ‘impact assessment’ belongs. Although I completely understand why you posted it where you did following the webinar, it doesn’t really relate to Anne’s question about priority standards]