Gathering Feedback from Patrons

This blog post was written for STAR Net by NASA@ My Library Mentor Atlas Logan

Having worked in a public library setting for quite some time, I can attest to how difficult it can be to gather meaningful feedback from patrons. There can be added pressure when a specific type of data is needed to fulfill grant reporting requirements.

Most libraries keep track of basic statistical information such as the number and types of programs being offered, target audiences, and how many people attend those events… but what about collecting more qualitative data, such as direct statements and evaluations of patron experience? These can be just as important as quantitative data and provide impactful success stories that you can share with partners and stakeholders.

Building the rapport needed to gather patron feedback and evaluation can begin before the event has even occurred. To make sure you are offering the programs and services your patrons need, consider hosting a community dialogue event. Use these conversations as an informal way to understand your community members and find ways to better serve them. These dialogues can also generate buy-in from community members who may become future attendees at your events. Keep the informal conversation going before and after your programs; get to know your attendees in a low-pressure environment. This is a great method for building rapport and gathering information about what they might need, how they felt about the event, ideas for improvement, and more.

Consider using a talkback board. This is a simple strategy that involves providing a board or area with prompts that you’d like responses to. Example prompts include:

  • I came to this program because…
  • The most useful thing I learned was…
  • I’d like to learn more about…
  • Next time I’d like to see…

Make sure to leave sticky notes and pens out so that your patrons can anonymously leave brief responses to your prompts. For all of these informal conversations, I have found it helpful to maintain a document or file (I use a Google Form) that can be quickly accessed after the interaction so that you can record any relevant information quickly and accurately.

These methods of gathering feedback are certainly useful and help to build connections, but depending on your needs or reporting purposes, you may also consider more formal interviews or documentation, such as surveys and other evaluation tools. While these can seem intimidating, there are plenty of tips, tricks, and resources freely available to make the process easy and effective.

When administering surveys or evaluations, there are a few basics to consider. First and foremost, keep it simple and easy to use. Your patrons are more likely to complete a questionnaire if they can easily understand what you are asking and can anticipate that it will not take long to complete. Whenever possible, provide your evaluation tools in the languages most needed by your community. For both surveys and interviews, use open-ended questions when possible and allow an opportunity to provide general comments so that you can get direct statements from your patrons.

Free library-geared evaluation tools to get you started include ALA’s Project Outcome and the Library Research Service. If you are interested in tools with more of a science background, the Center for Advancement of Informal Science Education (CAISE) has a great five-part primer on evaluation in informal science education. Each page has extensive resources on how to design and implement evaluation research.

Another potential science programming evaluation tool is Chapter 12 of UNC Chapel Hill’s open online textbook Picking Up STEAM: Science, Technology, Engineering, Arts, and Mathematics Instruction in the Public Library. This chapter provides a wonderful introduction to evaluation and assessment for these types of events.

Other free online resources to explore for evaluation templates include, but are not limited to:

  • Citizenscience.org’s Research and Evaluation Group
  • STEM Learning and Research Center
  • National Informal STEM Education Network

A few final thoughts on gathering feedback for program evaluation and improvement, especially in light of increasingly virtual interactions with patrons: if your event requires registration, consider collecting email addresses so that you can send attendees a follow-up requesting feedback. If your event is fully virtual, consider capturing any comments received in the chat or recording the session so that feedback can be transcribed.

Last but not least: once you have that feedback, don’t be afraid to share your success stories with others! There are many reasons beyond grant reporting requirements that make qualitative data collection useful. Share with your community, partners, stakeholders, potential funders, and anyone else who may benefit from hearing about the amazing science programming taking place at the library!
