
4. What Does It Mean?

After analyzing all of these student responses and comparing them to the faculty suggestions, I had a little trouble remembering why I was analyzing any of this in the first place. It’s easy for me to get wrapped up in the tools and the process and forget about why I’m using the tools and going through the process. I reminded myself of the original questions I had and developed answers for those.

1. WHAT TOPICS WOULD BE WELL-ADDRESSED BY A TUTORIAL?

While topic modeling was not especially helpful, the text analyses did reveal that none of the topics students suggested needed to be covered in a tutorial. Most of their responses dealt with services and facilities; our department was thinking more about information literacy concepts.

2. WHAT FORMAT WOULD WORK FOR WHAT ISN'T TUTORIAL-WORTHY?

Most of our current students indicated that first-year students need basic information about the library: when it’s open, where it is, and what it offers. These topics could easily be covered in a brief video welcoming students to the library, formatted either as a tour or as an advertisement. Whether we want to create a video like this is up to the department head.

Essentially, by answering these questions I had met my goals. All I wanted to do was analyze student responses and answer these two questions. I didn’t realize, though, that I would discover some other important information while digging through these responses.

The Tool Must Fit the Text

When I started this project, I thought topic modeling would be the most useful tool and the place to spend most of my time. However, I quickly noticed that I wasn’t getting the results I wanted: survey responses were too varied, and there weren’t always enough words occurring together to create cohesive topics. While text analysis proved to be a useful method, I was reminded of the importance of choosing assessment methods that fit what you are assessing. Forcing a method onto texts it doesn’t mesh with can render the assessment ineffective.

Librarians & Students Think Differently

This probably does not need to be said, but comparing the department’s suggestions with students’ responses reemphasized the differences in how librarians and students think about the library. In Education and Outreach Services, we as librarians think a lot about information literacy and how students use the library. We want to make sure that they are able to use and evaluate information in addition to being able to find it. Students see the library not necessarily as their ticket to becoming good information consumers but more as a space to spend time, relax, and learn. It is important to acknowledge both of these viewpoints as valid.

Student Feedback Is Valuable

The other (perhaps also obvious) insight was that collecting student feedback is necessary and useful. These students’ responses did not match what our librarians suggested, but without the Spring Fling survey, we would not have known. It was also good not just to collect students’ feedback but to include them in the process: we had a project, we wanted to know what students would think, so we asked. Now we know how students think about the library, which will hopefully help us serve them better.

© 2018 by Grace Therrell