artifact: San Diego History Center Survey Analysis
DATA-BASED DECISION-MAKING: Value the use of data as the starting point for professional work
I’ve selected the Survey Analysis Report from our 795 Client Project to demonstrate this competency. Shawn Shepard and I worked closely with Marinta Skupin, the Educational Programs Director at the San Diego History Center (“SDHC”), on identifying opportunities for an intervention. Early in the consulting process we identified a need to get more schools through the door to take advantage of the educational programs the Center has to offer. Because SDHC is just one of many facilities where schools can choose to spend their program dollars, we determined that a survey of San Diego Unified School District (“SDUSD”) educators might provide insight into what they expect from supplemental programs. Because SDHC has existing exhibits designed to supplement the social-studies standards for California 3rd, 4th, and 5th grade programs, the logical choice was to target SDUSD educators with experience in those domains.
This has been the most comprehensive collection and analysis of data that I’ve had the opportunity to work with to date. We built a solid prototype SurveyMonkey survey and successfully tested it with experienced educators, which facilitated a number of improvements in our final survey. The survey logically branched in several places, accommodating respondents’ experience and familiarity with SDHC, and we were able to filter and analyze responses by grade where appropriate. Shawn’s extensive experience in this area was a great asset, and his facilitation of our analysis helped me better understand the nuances of filtering the data and the overall analysis process.
Challenges & Opportunities
The night after we closed the survey, Shawn and I spent almost six hours on Skype analyzing and discussing the 24 survey questions and their corresponding responses. Towards the end of the evening, punchy but determined to finish our review, we came upon a set of question responses whose meaning neither of us could wrap our heads around. We compared the data to similar questions and tried to generalize how they related, but could not make a solid connection. We considered what alternative meanings respondents might have read into the question, but found that unproductive. We were diligently trying to fit this lopsided data into our otherwise well-developed story, and I began to dread the possibility that we’d have to reconsider so many conclusions to make this one thing fit. Finally (with what sounded like a shrug) Shawn said, “I think it’s a bad question. Let’s move on to the last one.” I remember thinking, “Wait…is that allowed? Are we allowed to do that?”
He was right, of course: it was a bad question. It didn’t add to the story we had begun to build; in fact, it distracted from it. The question seemed redundant, and the response options seemed poorly suited to it, both in terms of its context and its placement within the overall survey. With a shrug, Shawn gave us agency to dismiss it. Common sense. But in the midst of studious data analysis, weighing variables and building a hypothesis, I hadn’t considered the importance of that perspective. I don’t know if it comes across here, but I think this was an “aha” moment.
After our initial analysis marathon, we split the survey in half and each focused on expanding our discussion notes from that night, supporting each conclusion with appropriate commentary and graphics. As I page through it, I am proud of the continuity of the report. The analyses flow seamlessly and convincingly build the story we had outlined in our initial session. This speaks not only to what I believe was insightful analysis, but also to the quality of the survey’s design and development. We built the survey in a logical way, branched the appropriate participants to the appropriate questions, and, as a result, collected valid data. This made the process of analysis, if not easy, at least agreeable.
Nothing in this report feels imposed; the conclusions and recommendations, while insightful, are also logical. This is what I perceive to be responsible data-based decision-making. I feel lucky that our comprehensive front-end work was rewarded by significant and serious responses. This experience has built my confidence in my ability to assess good data, and I now look forward to opportunities to further practice this competency in the field.