Oh Wow! Handling Unexpected Results

Think for a moment about a time when someone shared results from an assessment project at a professional conference or division-wide meeting on campus. The results were neatly packaged and told a cohesive story. The results, both positive and negative, were explained rationally and were coupled with clear implications and suggestions. Despite the prevalence of these stories, assessment results do not always come back as expected; instead, they fall under the umbrella of what is commonly known as an “Oh Wow! Moment.” Unexpected results can come in many forms. Sometimes the numbers look different than in previous years, or they contradict best practices or what the department has come to embrace as conventional wisdom. Other times the data show no improvement despite numerous programs that should have an impact, or students express dissatisfaction and displeasure. Unfortunately, unexpected results are often not discussed, and when they are shared broadly, confusion or frustration about the process is often glossed over. Because the work needed to make sense of unexpected results is often complex, and because unexpected results often raise emotions, sharing them may require political savvy, additional information, or alternate strategies. When college union and student activities professionals face unexpected results, decisions must be made about whether, what, and how to share those results with broader audiences. This article describes strategies for facing unexpected assessment results and making decisions related to them.

The Oh Wow! Moment Defined and Described 

Assessment projects start with goals and objectives, methods are established, and data are collected and analyzed. The “Oh Wow! Moment” comes during analysis. Briefly defined, unexpected results are, somewhat obviously, unforeseen and potentially concerning, and they can include anything surprising, good or bad. They may not be what the department wants, they may not match what everyone thinks they should be, or they may simply not make sense. 

Regardless of their form, unexpected results draw attention precisely because they are unexpected. But they may also trigger confusion, panic, or, in high-stakes environments, dread. The results set off a process of reflection, exploration, and decision making that can be unsettling, so having strategies for these situations makes the process easier to manage.  

Build a Core Team

A primary strategy for handling unexpected results is to have a trusted team. The right individuals can help think through not just the results, but also their implications. A core group does not need to be a formal team, nor do the individuals have to meet as a group, but any professional facing unexpected results will benefit from having a core group for support. The potential members may vary depending on the project, but a valuable team should include those with specific dispositions, knowledge sets, and perspectives. 


Having a core team whose members reflect various dispositions and qualities can bring stability and confidence to the process, so look for team members who are calm, thoughtful, and pragmatic. These qualities help ensure that emotions do not cause panic or a rush to judgment. These members should also be trusted so that discussions can be transparent without fear of consequences. Team members should be curious, ask questions, and explore alternatives. Ultimately, qualities such as calmness, thoughtfulness, trustworthiness, and curiosity will encourage smart thinking and better choices when dealing with unexpected results.  


A core team should also include good coverage of specific knowledge areas. At least one team member should have a solid understanding of the research and analysis methods used in the assessment project. This person will be able to raise questions related to methods and propose alternative explanations for results based on the methods. 

Some team members should be subject matter experts with a deep understanding of the content of the assessment. They may have even helped design the assessment. When reviewing the results, these people can ask questions about content issues that could impact results. For example, if the assessment is related to training of student leaders, the group should include someone familiar with the goals of the training, the theory behind the training, the topics covered, and the methods used.  

If the project involved student participants, then the team should have a person with a deep understanding of the student experience. That person can describe findings from the student perspective, including alternate understandings of interview language, survey questions, or student quotes. 

Finally, the team needs a person who knows the broader context of the assessment. Specifically, someone needs to understand not only how the assessment was conducted but also the campus context in which it was conducted.  

All four of these subject areas—research and analysis, content, students, and context—could affect assessment results, so having a core team with deep knowledge in each area could help make sense of unexpected results. 


The third consideration for a core assessment team is perspective. Three perspectives should be represented on the team: a big-picture thinker, a detail person, and a challenger. These categories are common in many discussions and are vital to ensuring effective conversations about unexpected results. Big-picture thinkers ensure that results are evaluated for usefulness, importance, and fit with department goals. The big-picture perspective can even prevent problems such as analysis paralysis and endless deliberation. Detail-oriented people play an important role because of their ability to dig into results. They may find nuggets that explain high-level unexpected findings, or they may uncover deeper patterns that show which results are anomalies and which are widespread. The role of the challenger is critical, and not just for avoiding groupthink. Challengers use questions to test the traditional narratives that reinforce what was done in practice or during the assessment project. Big-picture thinkers, detail people, and challengers bring three unique perspectives that can prove insightful and move conversations about unexpected results into productive territory. 

Overall, assembling smart, thoughtful, curious, and calm people to discuss results is good practice for any assessment project. Unexpected results, and particularly the “Oh Wow! Moment,” can feel lonely, but working with a group can be both supportive and productive. Group members help think through results, explore alternatives, and can preview the range of issues that may come up if results are shared with a broader audience. 

That said, college union and student activities professionals will need more than a core team: Having a set of specific strategies for thinking through the actual results is also important. 

A campus official was dismayed when she saw much higher binge-drinking rates on her campus than the national averages. In her case, looking at the respondent demographics clarified the results. Her respondents were traditional-aged college students; few students were over the age of 25. The national data, in contrast, included a much higher proportion of students 25 years or older. So, when she looked at only students under 25 years old, the results were comparable. In another instance, a college union professional was surprised at results that indicated few students used the bookstore. He received the results at the beginning of the semester, and the results did not match what he was seeing. Once he determined that the survey specifically asked about visits in the previous week and had been given at mid-semester, the results made more sense. 

Explore the Measures and Methods 

When making sense of unexpected results, a good place to start is with the assessment measures. Assessment results are often presented with broad labels or themes, or the language may contain words specific to program goals or initiatives. If the data were collected using different language and the results don't make sense, examining the exact measure (how each question was asked) may provide clarity.  

For example, results may refer to perceptions of “personal competencies,” but the specific questions may cover a variety of topics, including articulating personal values, awareness of limitations, and awareness of talents. Results may seem unexpected if assumptions about what is being measured do not match the specific questions, so survey language can shape how results are interpreted.  

Survey structure can also affect results. For instance, survey results may indicate that 20% of students are not aware of the student activities offered by the university. Yet if that question sits within a branch and was only asked of students who had not participated in student activities during the current academic year, the 20% may represent only 5% of the full respondent group. If a student activities professional expects 5% and sees 20%, the result is unexpected but easily explained. Similarly, qualitative assessment results are usually presented as broad themes, so an unexpected result may reflect a misunderstanding of a theme. Examining specific quotes may be useful to ensure a common understanding of the broad theme. 

In addition to looking at the measures, college union and student activities professionals should consider the methods: Unexpected results may stem from assumptions about who the participants are, the timing of the assessment, or other methodological choices. 

Unexpected results can sometimes be clarified by examining the measures (how something is asked, operationalized, or coded) or the methods (how the data are collected and from whom). When these strategies do not help, college union and student activities professionals need to turn elsewhere. 

Consider the Context

Professionals may benefit from additional contextual information, which can come in the form of broader campus context or smaller microcosms. For instance, a student activities professional looking at longitudinal survey results on leadership training for student leaders noticed that one year showed lower levels of participation and collaboration among leaders. The year in question was one in which the department had been short-staffed, so her knowledge of departmental history helped her make sense of that result.  

Serious incidents, such as a recent death or act of violence on campus, are obvious sources of context. But smaller-scope issues can also explain unexpected findings. A single case of vandalism in one union restroom may seem small in scale, yet it may explain unexpectedly low responses on a facilities or safety survey. Similarly, responses may vary greatly by program in unforeseen ways after an incident if student leaders handled the situation in different ways. It is not unusual to see open-ended responses that describe these situations; unexpected results may be tied to the incidents themselves or to the efforts of student staff.  

Dig Deeper & Find Alternative Explanations 

Because unexpected results come in many forms, they sometimes raise further questions, and more digging is required to find an explanation. High-level results can mask issues that appear only when professionals dig deeper into the data; results broken down by subpopulation can tell a much different, more nuanced story. Similarly, college union and student activities professionals need to consider alternate explanations. 

In one case a student activities professional was shocked that perceptions of the student organization membership process dropped drastically after implementation of a new system that allowed students to sign up as members of an organization via a smartphone application. The professional had expected the new system would improve perceptions since the previous process required hard-copy membership forms to be delivered to the campus activities office by student organization leaders. 

The first potential explanation was relatively straightforward: Perhaps the new system was not effective. But discussions with colleagues suggested three alternative explanations. First, it was possible that the lower scores resulted from student organization leaders being unhappy about learning a new process. Second, perhaps the rollout had hiccups, and the challenges of implementing a new process caused the dissatisfaction. If the rollout were part of the issue, those challenges could be addressed in the future. The third alternative explanation centered on expectations. Student expectations of any process involving paper forms were likely low. Students expected the process to be clunky and inefficient, but as long as the process worked, students were happy. 

Moving to a web-based system could raise the bar on expectations, and if student leaders expected Amazon-like interactions but instead found a basic system, dissatisfaction would be expected. In this example, the alternatives provided depth to the explanations and calmed the nerves of the professional as preparations were made to share the results. 

A committee was surprised to find that the response rate to a yearly union survey had dropped 15% compared to the previous year, even though the survey administration plan (who, when, why, and how) was the same. The committee was at a loss until members realized that a second survey related to student activities had been implemented at the same time as the union survey. Some students did not realize there were two surveys and only completed one. The low response rate was easily explained in the context of competing efforts. The following year, the department coordinated the timing of the two surveys and split the student population to ensure there was no overlap.

But Really, What Do I Do?

Fundamentally, college union and student activities professionals with unexpected results face decisions related to sharing and acting upon those results. The first decision is whether to do anything. The unexpected results could simply be ignored. On the surface, professionals may think that the ethical choice is always to be transparent, but there are scenarios when ignoring an unexpected result is not only viable but also appropriate. 

Ignoring unexpected results may be appropriate when results contradict one another. If one data point runs counter to others, focusing on that one data point may be misleading. Additionally, unexpected results may be tangential to the goals and missions of the department, its programs, and its students, so highlighting them may distract from departmental efforts. A national survey may include questions that are not applicable to or within the purview of the department, so the results of those questions may not be useful. For instance, some college union departments have limited influence on the prices that restaurants in the union charge, so sharing results related to dining offerings or pricing with dining services may make sense, but sharing them broadly may not be productive. 

Professionals may choose to ignore the unexpected results temporarily while waiting for more information. A smaller group may be exploring the methods or measures, gathering context information, collecting more data, or discussing alternative explanations. Waiting until the full picture is in place may be more responsible. 

When college union and student activities professionals decide that acting on unexpected results is the correct course, they still have options. They can share the results, recommend collecting more data, and use the results as a way to ask for outside input. A student activities professional surprised by student dissatisfaction with leadership development opportunities could share the results with student leaders and student organization advisors, using the process to validate the findings and gather more qualitative information. 

In other situations, collecting additional data might require a more formal process. The unexpected results could prompt another assessment, or even follow-up focus groups to understand the reasons behind the results. In the case of the new student organization membership process, focus groups might uncover implementation issues or high expectations. Professionals can use unexpected results to prompt further research. 

A second option for action is to pass the issue along. In short, college union or student activities professionals may decide that the unexpected results should be passed to others who can act on them: another department, level, or committee. The other group can review and interpret the results or act on them. For example, results related to campus safety concerns may need to be passed to facilities, the campus police department, or a campus safety committee. Student activities professionals may decide that unexpected results have implications for the responsibilities of other groups, such as custodial and maintenance staff. In those situations, once the owner of the assessment has made sense of the unexpected results, they would likely pass them along to the relevant stakeholders.  

College union and student activities professionals who have dug deeper into unexpected results may decide to explain the results using additional insights. In the case of the department with a drop in response rates, they shared not only the response rates but additional information about the reason for the drop and the resulting actions. In subsequent years, their displays of longitudinal response rates always showed the single year with the drop and their explanations. This practice of sharing unexpected results, the explanations, and subsequent actions is common in conference presentations. Unexpected results become a point of discussion. 

When a college union heard from student leaders that some students were complaining about the “cliqueness” of college union activities, the department undertook a climate survey and was puzzled to find positive results related to diversity, social integration, and sense of belonging tied to college union activities. Since this ran counter to the complaints, the department worried about sharing positive results without digging deeper. Additional analysis of subpopulation results identified lower responses and important differences across subpopulations.

Beyond sharing information, professionals have two more action-focused options. One is to present the information and, in essence, fight it. A student activities professional facing the drop in satisfaction with a new student organization membership system may choose to share the information. But in doing so, they may provide context, including the reduction in staff time, better data collection, and increasing usage, with the discussion emphasizing the transition to a new process, the benefits of the new process, and the early indicators of improvement. The professional may argue that data from the first year of implementation should become a baseline against which to measure future performance, framing what “might look like negative results” within the context of bigger wins.  

A final option that college union and student activities professionals have when facing unexpected results is to make changes. The results can be used to challenge existing thinking and shape next steps. Comparisons among organizations can be used to draw attention to bright spots that demonstrate high results are possible. Changes could focus on learning what the bright spots are doing and spreading those practices.  

Unexpected results may also expand the range of ideas being discussed. When one campus had a high percentage of respondents choose “other” when asked why they use the college union, the typical categories were clearly missing some perspectives. Because the survey allowed respondents to elaborate on the “other” option, the professionals used those responses to recommend improvements to outreach and marketing and to get staff talking about new issues and perspectives.  

A student activities director was surprised when an assessment found that student leaders preferred communications from the student activities department via email over information and updates via social media, which the department had been emphasizing in an effort to better reach student leaders. Student leaders said important information was getting lost in the mix of other social media posts, which prompted changes in communication flows and an awareness of the need to continually monitor the needs of student leaders. 

Unexpected results often prompt an “Oh Wow! Moment,” but they can also prompt new thinking and spur action. Having concrete strategies, like building a core team, examining the methods and measures, understanding the context, digging deeper, and exploring alternative explanations, can help professionals respond constructively. Simply understanding that the “Oh Wow! Moment” is not a unique experience can help calm emotions. Regardless of the action taken, college union and student activities professionals should think of unexpected results as opportunities. 


Sherry Woosley, Ph.D., is the director of analytics and research at Skyfactor (formerly EBI MAP-Works). She drives the research and analytics for the Mapworks Student Success System as well as more than 50 Benchworks national program assessments.

Matthew Venaas, M.S.Ed., is a research manager on the Analytics & Research Team at Skyfactor. In this capacity, he is responsible for building and delivering educational webinars, research notes, and reports related to analysis and findings from over 50 national benchmarking assessments and the Mapworks student success and retention system.