Without evaluation we would have no clear understanding or definition of success. To know whether something succeeded, we must identify the original objective, define success with clear markers such as milestones and timelines, and then assess whether we reached that goal and whether it was worth the time, budget, and effort. As Lankes says in The Atlas of New Librarianship: “The mission of librarians is to improve society through facilitating knowledge creation in their communities.” This means keeping end users and service outcomes in mind while evaluating and assessing how relevantly and capably we met our goals and served our community. To question ourselves and our services is to build the best user experience and end product while cultivating trust and relationships among all stakeholders, internal and external (McDonald, 2018, p. 180).
Many of my classes at SJSU built my understanding of this competency. In INFO 210 I evaluated and analyzed multiple information transactions using the RUSA guidelines, visiting different libraries to observe and evaluate their services. In INFO 285 I wrote a research paper evaluating the efficacy of a reading program that utilized canines. In INFO 204 my group created a SWOT analysis and an environmental scan with specified goals and evaluation measures, using the mission and vision statements as grounding points. For INFO 202 my group evaluated and redesigned a library website, with end-user needs as one of the evaluation criteria, and for INFO 282 Project Management I evaluated the efficacy of project management software, assessing it as an organizing tool against the ultimate goal of a successful project. Evaluation is something I will do continually, and this process of identifying original goals, using milestones and timelines as markers, and using mission and vision statements as standards is one I can carry forward.
Just as Barefoot (2018) describes a leader’s self-diagnosis during a change in management, evaluation serves as an organization’s self-diagnostics, making sure things are running smoothly and effectively so that the end goals and outcomes are met. It is also an opportunity to explore whether there are ways to improve services or work product, serve a wider audience, fill in gaps, and streamline outdated processes (Barefoot, 2018, p. 253). To best serve all patrons and all staff members while creating the best work product and a healthy work environment, evaluation and assessment must take place. This includes assessment with criteria tied to the mission and vision of the organization and to agreed-upon, measurable data points. According to Gilman, assessment in academic libraries serves two primary duties: quality assurance and institutional improvement (2018, p. 90). We must assess continuously to remain relevant to our communities and to each other, and in academic libraries these assessment efforts usually focus on reference, information and technology literacies, collections, research needs, and programs and services (Gilman, 2018).
To evaluate anything we must have agreed-upon standards, set in advance, so that results can be measured and compared with past and future data. The process starts with identifying the purpose of the service, assessing users’ needs, designing a service to meet those needs, and then assigning criteria to measure how well the need was met, the level of user satisfaction, and how effective the service was. One example is the Reference and User Services Association (RUSA) guidelines, which help information professionals better serve end users by continuously evaluating services in a continually evolving information world (ALA, 2023). RUSA breaks its guidelines down into general, in-person, and remote interactions, with five key criteria: visibility/approachability, interest, listening/inquiring, searching, and follow-up. Measurements for success can be gathered with many different tools, including surveys, interviews, focus groups, observations, feedback cards, and usability statistics, to name a few. The ACRL and ALA approved a set of standardized statistical measures and metrics for public services in archival repositories and special collections libraries in 2017 so that the field would have commonly accepted measures for evaluating best practices; these standards cover user demographics, reference transactions, collection use, and events (ALA, 2018).

There are also set procedures for evaluating an employee, and performance evaluations are common across all information professions. To conduct a performance evaluation I would start with the job description and duties, the “goal” of the job, and a formalized set of criteria including attendance, punctuality, completion of day-to-day tasks, completion of work product, and participation in group projects. There should be clear, transparent goals and merit-based rewards to motivate, along with clearly stated outcomes for future action (Goch et al., 2018, p. 271). Regardless of whether the evaluation is done internally with fellow employees or externally on a service with the end user in mind, feedback is essential for all of us. Evaluation is essential to our continual evolution as an industry and as individuals.
Description: For INFO 210 we had multiple assignments in which we visited different libraries and observed and evaluated their services based on the RUSA guidelines.
Justification: No matter what library I work in, I will need to understand the RUSA guidelines and keep them in mind when evaluating services in different information environments. General visibility and approachability, the interest with which you interact with patrons, listening, searching, and follow-up are all elements that apply to any information environment, and I will keep them in mind as I go forward in my career. As a person with eighteen years of customer service experience, I know the importance of quality customer service. The RUSA guidelines support patron retention and stakeholder engagement; evaluating services benefits both the end user and organizational efficacy and should be viewed as an ongoing, beneficial process.
Description: For INFO 285 I created a research project evaluating the OCPL Paws to Read Summer Program. It includes an introduction to the end users of the OCPL and their history and needs; an annotated bibliography evaluating current research on the efficacy of using canines in reading programs; and a method section describing how to do purposive sampling, what data collection instruments I would use, and the procedure for collecting the data.
Justification: At some point in my career as an information professional I will need to justify a program’s funding or evaluate whether a service is meeting the needs of the community I serve. This piece of evidence shows how I can define end users’ needs for a service, evaluate what others in the field have done, design my own method of evaluation with measurable criteria and survey-based data collection, and conduct the evaluation ethically. After completing this evaluation I would need to compare the results with past data and repeat the process at scheduled intervals to produce truly actionable results.
Description: For INFO 204 my group created an organizational analysis of the Highlander Research and Education Center, including an environmental scan and a strategic plan, all measured against the mission and vision of the organization. I was team leader for the whole semester, providing scaffolding for all the projects. I helped edit all three documents from start to finish and contributed to the legal section of the environmental scan and to the SWOT analysis. I also edited the mission and vision section and wrote the executive summary and Goal 2 of the strategic plan.
Justification: This project utilized every element of evaluation, from the SWOT analysis of the organization, to the assessment and rewrite of the mission and vision, to creating a timeline with actionable goals and criteria for future measurement. This process is used across many different information institutions, and I can now replicate it.
Description: For INFO 202 my group chose to evaluate and propose a redesign of the Auckland Libraries’ website. I wrote the executive summary and served as team lead, facilitating group discussions.
Justification: This involved identifying the end user and holding many conversations about best practices and usability. After stating the criteria we would evaluate against, based on accessibility and usability, we produced a redesign of the website.
Description: For INFO 282 Project Management I created an evaluation of project management software: Monday.com. I began by acknowledging my own biases and researching vendor reputations, then evaluated the software based on ease of use, pricing, and compatibility with other software.
Justification: There is so much technology available to someone like me that it can be overwhelming. Once I have identified a problem, such as needing software for effective project management, I must actually examine different software vendors and evaluate which one works best for my intended purpose, just as a service is evaluated against an organization’s mission or vision statement. This evidence is an example of that process: I started with Monday.com’s reputation as a vendor, then looked at pricing scales, ease of use, and my needs as a project manager. New technologies are always emerging, and this process of identifying, using, and evaluating them will likely be a constant in my life as an information professional.
Evaluation is essential to the growth and evolution of all organizations; without it, libraries will cease to be relevant or useful to the people they serve. Feedback and assessment of what we do can not only act as a foundation for advocating for future budgets but also show where and how we can best meet the needs of our community and how we can do better.
American Library Association. (2023, January 24). Guidelines for behavioral performance of reference and information service providers. http://www.ala.org/rusa/resources/guidelines/guidelinesbehavioral
American Library Association. (2018, January 7). Standardized statistical measures and metrics for public services in archival repositories and special collections libraries. https://www.ala.org/acrl/sites/ala.org.acrl/files/content/standards/statmeasures2018.pdf
Barefoot, R. (2018). Change management. In S. Hirsh (Ed.), Information services today: An introduction (2nd ed., pp. 246-254). Rowman & Littlefield.
Gilman, T. (2018). Learning and research: Academic libraries. In S. Hirsh (Ed.), Information services today: An introduction (2nd ed., pp. 81-93). Rowman & Littlefield.
Goch, R., Haller, B., DiStefano, D., & Mackenzie, M. L. (2018). Managing personnel. In S. Hirsh (Ed.), Information services today: An introduction (2nd ed., pp. 266-277). Rowman & Littlefield.
McDonald, C. (2018). User experience. In S. Hirsh (Ed.), Information services today: An introduction (2nd ed., pp. 81-93). Rowman & Littlefield.