Dr. Scott Dueker

The Perils of Survey Research for a SCRD Guy


Thanks to the COVID pandemic, research with human subjects ground to a halt. It just wasn't safe to work with the public in the usual way. My bread and butter is individualized interventions aimed at improving a skill. I've done a lot of different studies, each including anywhere from three to seven kids. My team would work with them one-on-one to deliver interventions and record data. But then the world was plunged into a global pandemic and life just sort of stopped.


As someone who needs to conduct research as a job requirement, I had to figure out a way to explore my interests without interacting directly with anyone. Enter the survey. We've all taken them. In education research, we use surveys to assess what people know about certain topics and to identify areas for further study. This type of research is generally foreign to me, at least from an execution standpoint. Because I need to publish continually, I had to consider doing some survey research. Fortunately, I did have a topic in mind.


I wanted to explore the use of procedural fidelity (PF) measures in clinical settings. We use these measures pretty extensively in research, but I couldn't find much information on how clinicians use PF measures in their regular practice. So I started putting together questions I wanted to ask. That was the easy part. What I underestimated was how difficult it would be to make the questions flow logically. Having never actually created a survey before, I wasn't sure how to organize things. I did a little research, came up with some ideas, and ended up with a decent survey that I shared with a couple of colleagues to check for flow.


Distributing the survey also proved challenging. Because this study was unfunded, there was no money available to purchase access to pertinent mailing lists. My team had to rely on social media and individual cold emails to recruit participants. After a few hundred emails and multiple postings across different social media platforms, we began getting responses. Given the hodgepodge nature of our solicitations, we had no lofty expectations about how many responses we would receive, so we were pleasantly surprised to watch the numbers climb to over 200.


After we closed the survey, we began the data analysis. Demographic and simple descriptive statistics came easily; SPSS broke those down nicely for us. It was the correlations that tripped us up a bit. Because I generally use single case research design (SCRD), visual analysis of the data points is how we determine effects. Survey data was a new animal for me and the team. We ran some simple correlations between related questions. But then we wanted to dig a little deeper and analyze some responses based on certain demographic variables. That required regression analysis, and that was a brand-new world.
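For anyone curious about the correlation step, here is a rough sketch of the kind of thing we ran. We did the actual work in SPSS, and the item names below are made up, but the same idea in Python looks something like this:

# Hypothetical example: correlating two related survey items.
# The file name and column names (pf_collect_freq, pf_review_freq) are invented for illustration.
import pandas as pd
from scipy import stats

responses = pd.read_csv("survey_responses.csv")  # assumed export of the survey data

# Pearson correlation between how often respondents collect PF data
# and how often they review it
r, p = stats.pearsonr(responses["pf_collect_freq"], responses["pf_review_freq"])
print(f"r = {r:.2f}, p = {p:.3f}")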


It is amazing how quickly you can learn about a topic when you really need to. These deeper analyses of responses to specific questions were really important, so I did a deep dive into how regression works and how to build the models in SPSS. After several attempts at running the analyses, I finally got what I wanted. Moving forward, I think I will take advantage of statisticians when planning and analyzing my survey data instead of trying to do it all myself. Lesson learned.
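To give a flavor of what that looked like, here is a minimal sketch of the kind of regression we ended up running. Again, we built the actual models in SPSS, and the variable names below are hypothetical, but a Python equivalent would look roughly like this:

# Hypothetical example: regressing a PF-use score on demographic predictors.
# Variable names (pf_use_score, years_experience, setting) are invented for illustration.
import pandas as pd
import statsmodels.formula.api as smf

responses = pd.read_csv("survey_responses.csv")

# Ordinary least squares with years of experience as a continuous predictor
# and work setting treated as a categorical predictor
model = smf.ols("pf_use_score ~ years_experience + C(setting)", data=responses).fit()
print(model.summary())

The formula interface keeps the model readable, and the summary output gives the coefficients and p-values we were after.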


So now the write-up is nearly finished and ready to be sent out for publication. I hope the editors look favorably on the project. There is a follow-up study planned for the fall that will look at some of the responses more deeply and try to pull out more specific information on how PF measures are being used in clinical settings. I can use the troubles I had during this first attempt to greatly improve both the survey and the process of analyzing the data. This just might turn out to be a positive aspect of working through a global pandemic: acquiring a new skill.

