Meet Jared R. Morris!

Name: Jared R. Morris, Ph.D.

Institution/Organization Affiliation: Brigham Young University

SERA Research Partner Bio:

Dr. Jared Morris is an Assistant Professor in the Department of Counseling Psychology and Special Education at Brigham Young University. Jared completed his Ph.D. in special education with a minor in educational psychology at The Pennsylvania State University. He also completed a graduate certificate in applied behavior analysis at The Pennsylvania State University. He received a master’s degree in special education from The University of Utah and a bachelor’s degree in English from Brigham Young University. Jared taught students with disabilities in a variety of settings for five years.

What made you interested in partnering with SERA?

I became interested in SERA after seeing a need to help further the research base for students with disabilities. Accelerating research through systematic, large-scale replications using the SERA model has the potential to strengthen that research base by providing more robust data, thanks to the increased size and diversity of research samples.


Crowdsourcing & Open Science

Educators strive to improve and maximize the learning outcomes of their students by applying effective instructional practices. Although no instructional approach is universally effective, some teaching practices are more effective than others. It is therefore important to reliably identify and prioritize the most effective instructional practices for populations of learners. Rigorous experimental research is generally agreed to be the most reliable approach for identifying “what works” in education. However, scientific research has important limitations and does not always generate valid findings.

In large-scale replication projects in psychology and other fields, researchers have often failed to replicate the findings of previously conducted studies (e.g., Klein et al., 2018; Open Science Collaboration, 2015), casting doubt on the validity of research findings. Researchers, including those in education, have also reported using a variety of questionable research practices such as p-hacking (exploring different ways to analyze data until desired results are obtained), selective outcome reporting (reporting only analyses with desired results), and hypothesizing after results are known (HARKing; e.g., Fraser et al., 2018; John et al., 2012; Makel et al., 2019), all of which increase the likelihood of false positive findings (Simmons et al., 2011). Moreover, research studies in education often involve relatively small, underpowered samples that do not adequately represent the population being studied, which further threatens the validity of study findings. Finally, most published research lies behind a paywall, inaccessible to many practitioners and policymakers (Piwowar et al., 2018), which reduces the potential application and impact of research. Open science and crowdsourcing are two related developments in research that aim to address these issues and improve the validity and impact of research.
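
To make the power concern concrete, here is a minimal illustrative sketch in Python using the statsmodels library; the effect size and per-group sample sizes are hypothetical, chosen only to show how pooling participants across sites changes statistical power for a simple two-group comparison.

    from statsmodels.stats.power import TTestIndPower

    power_calc = TTestIndPower()

    # Hypothetical moderate effect (Cohen's d = 0.4), two-sided test, alpha = .05
    # A single-site study with 20 participants per group:
    single_site = power_calc.solve_power(effect_size=0.4, nobs1=20, alpha=0.05)
    # Ten such sites pooling 200 participants per group:
    pooled = power_calc.solve_power(effect_size=0.4, nobs1=200, alpha=0.05)

    print(f"Power with 20 per group:  {single_site:.2f}")   # roughly 0.23
    print(f"Power with 200 per group: {pooled:.2f}")        # roughly 0.98

This sketch treats the pooled data as if they came from a single simple random sample; the actual analysis of a multi-site study would also need to account for the clustering of students within sites.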

Open science is an umbrella term for a variety of practices that aim to make all aspects of research open and transparent, with the goal of increasing its validity and impact (Cook et al., 2018). For example, preregistration involves making one’s research plans transparent before conducting a study, which discourages questionable research practices such as p-hacking and HARKing by making them easily discoverable. Data sharing is another key open-science practice; it allows the research community to verify the analyses reported in an article and to analyze the data in other ways to examine the robustness of reported findings, thereby serving as a further protection against p-hacking. Materials sharing involves openly sharing the materials used in a study (e.g., intervention protocols, intervention checklists) so that other researchers can replicate the study as faithfully as possible. Finally, open-access publishing and preprints provide free access to research for anyone with an internet connection, thereby democratizing the benefits and impact of scientific research beyond those with institutional subscriptions to academic journals.

Crowdsourcing in research involves large-scale collaboration on various elements of the research process and can take many forms (Makel et al., 2019). For example, multiple research teams might conduct independent studies examining the same issues, as the Open Science Collaboration (2015) did when it conducted 100 replication studies to examine the reproducibility of findings in psychology. Crowdsourcing has also taken the form of multiple analysts analyzing the same data set to answer the same research question, in order to examine the effects of different analytic decisions on study outcomes (Silberzahn et al., 2018). The most frequent application of crowdsourcing in research, which we have adopted in the Special Education Research Accelerator, is to involve many research teams in the collection of data, thereby increasing the size and diversity of the study sample, which in turn improves the study’s power and external validity. For example, Jones et al. (2018) involved more than 200 researchers collecting data from more than 11,000 participants in 41 countries to evaluate perceptions of faces. Regardless of how it is employed, “crowdsourcing flips research planning from ‘what is the best we can do with the resources we have to investigate our question?’ to ‘what is the best way to investigate our question, so that we can decide what resources to recruit?’” (Uhlmann et al., 2019, p. 713).

Although open science and crowdsourcing are independent constructs (i.e., research that is open is not necessarily crowdsourced, and crowdsourced studies are not necessarily open), the two approaches are closely aligned and complementary. The ultimate goal of both is to improve the validity and impact of research, and although they pursue that goal through different means, crowdsourcing facilitates making research open, and open science facilitates crowdsourcing. For example, in a crowdsourced study in which data are collected by many research teams, study procedures must be determined and disseminated to collaborating researchers before the study begins. Because those procedures are determined and documented in advance, they can readily be posted as a preregistration. Additionally, if data are being collected by many researchers in a crowdsourced study, the project needs a clear data-management plan so that data can be collected and entered reliably across sites. Well-organized data-management plans that include metadata not only facilitate integrating data across researchers on the project, but also make the data more readily usable by other researchers when shared and reduce the burden of creating supporting metadata at the time of sharing.

Moreover, because data in many crowdsourced studies are collected by many different researchers, individual researchers may be less likely to feel that they “own” the data and therefore may be less reluctant to share them. Similarly, the materials used in crowdsourced studies must already have been developed and shared with the many researchers collecting data, so, like the data, materials from crowdsourced research are ready to be uploaded and shared. Most broadly, crowdsourcing and open science share an ethos of collaboration and sharing for the betterment of science, and we conjecture that most researchers involved in crowdsourced studies will want to make their research as open as possible. As such, we look forward to making our crowdsourced research through the Special Education Research Accelerator as open as possible.
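
As a purely hypothetical illustration of what such a shared data-management plan might look like in practice, the sketch below (in Python, using pandas) defines a minimal codebook and a check that each site’s data file conforms to it before files are merged. The column names, codes, and file name are invented for the example, not drawn from any SERA study.

    import pandas as pd

    # Hypothetical shared codebook: column name -> simple validation rule
    CODEBOOK = {
        "site_id":    {"type": str},
        "student_id": {"type": str},
        "condition":  {"allowed": {"treatment", "control"}},
        "pretest":    {"range": (0, 40)},   # hypothetical 40-item measure
        "posttest":   {"range": (0, 40)},
    }

    def check_site_file(path):
        """List missing columns and out-of-range values in a site's CSV file."""
        df = pd.read_csv(path)
        problems = []
        for col, rule in CODEBOOK.items():
            if col not in df.columns:
                problems.append(f"missing column: {col}")
                continue
            if "allowed" in rule:
                bad = ~df[col].isin(rule["allowed"])
                if bad.any():
                    problems.append(f"{col}: {bad.sum()} value(s) outside {rule['allowed']}")
            if "range" in rule:
                low, high = rule["range"]
                bad = ~df[col].between(low, high)
                if bad.any():
                    problems.append(f"{col}: {bad.sum()} value(s) outside {low}-{high}")
        return problems

    # Example usage with a hypothetical file name:
    # print(check_site_file("site_07_data.csv"))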


Bryan G. Cook, Ph.D.

Principal Investigator

Bryan G. Cook, Ph.D., is a Professor in Special Education at UVA with expertise in standards for conducting high-quality intervention research in special education, replication research in special education, and open science. He co-directs, with Dr. Therrien, the Consortium for the Advancement of Special Education Research (CASPER) and is an ambassador for the Center for Open Science. He is Past President of CEC’s Division for Research, chaired the working group that developed CEC’s (2014) Standards for Evidence-Based Practices in Special Education, coedits Advances in Learning and Behavioral Disabilities, and is coauthor of textbooks on special education research and evidence-based practices in special education. Cook plays an integral role in developing infrastructure and supports for SERA, conducting the pilot study, and assessing the usability and feasibility of using SERA to conduct future replication pilot studies.

Crowdsourcing Research Primer

In crowdsourced research, researchers combine their resources to conduct studies that none of them could accomplish on their own.

“Crowdsourcing flips research planning from ‘what is the best we can do with the resources we have to investigate our question?’ to ‘what is the best way to investigate our question, so that we can decide what resources to recruit?’” (Uhlmann et al., 2019, p. 713).

Crowdsourced research is new to special education, but it has been used in other fields for decades. Many aspects of research can be crowdsourced, such as deciding what ideas to research, analyzing data, and conducting peer review (Uhlmann et al., 2019). The most common application of crowdsourcing is data collection, in which many researchers collect data, resulting in much larger and more diverse samples of study participants. Examples of crowdsourced data collection outside of special education include:

  • Citizen data collectors: Fields such as astronomy and geology rely on volunteers to collect data that researchers would never be able to gather on their own. Many of these projects, and opportunities to become a data collector, are compiled on the federal government’s citizen science website.
  • Psychological Science Accelerator: The Psychological Science Accelerator is a network of over 1,000 research labs across more than 70 countries that conducts large-scale studies on topics such as gendered prejudice and stereotype threat.

Conducting crowdsourced research in special education, especially crowdsourced data collection, has many potential uses, such as:

  • Implementing large-scale observational studies so we can better understand how students with disabilities are provided services across the country
  • Validating evidence-based practices using nationally representative student samples
  • Enabling adequately powered group studies with low-incidence populations
  • Dramatically increasing the number of direct and conceptual replications so we can efficiently determine which interventions work, for whom, and under what conditions (Coyne et al., 2016).

Ultimately, we envision that crowdsourcing will democratize the research enterprise by enabling more, and more diverse, researchers to be involved in large-scale studies with large and diverse participant samples that ask and answer critical research questions aimed at improving services for children with disabilities. By facilitating crowdsourced data collection across many researchers, we hope the Special Education Research Accelerator (SERA) will play a core role in democratizing the research enterprise in our chosen field of special education.


William J. Therrien, Ph.D., BCBA

Co-Principal Investigator

William J. Therrien, Ph.D., BCBA, is a Professor in Special Education at UVA. He is an expert in designing and evaluating academic programming for students with disabilities, particularly in science and reading. Along with co-directing CASPER, Dr. Therrien co-edits Exceptional Children, the flagship journal in special education, and is Research in Practice Director for UVA’s Supporting Transformative Autism Research (STAR) initiative. Therrien assists Cook with developing infrastructure and supports for SERA, conducting the pilot study and assessing the usability and feasibility of using SERA to conduct future replication pilot studies.

Meet Chris J. Lemons!

Name: Chris J. Lemons, Ph.D.

Institution/Organization Affiliation: Stanford University

SERA Research Partner Bio:

Christopher J. Lemons, Ph.D., is an Associate Professor of Special Education at Stanford University. His research focuses on improving academic outcomes for children and adolescents with intellectual, developmental, and learning disabilities. His recent research has focused on developing and evaluating reading interventions for individuals with Down syndrome. His areas of expertise include reading interventions for children and adolescents with learning and intellectual disabilities, data-based individualization, and intervention-related assessment and professional development. Lemons has secured funding to support his research from the Institute of Education Sciences and the Office of Elementary and Secondary Education, both within the U.S. Department of Education, and from the National Institutes of Health.

What made you interested in partnering with SERA?

The idea to crowdsource special education research is innovative and well-aligned with other initiatives focused on open science and replication. This project is moving our field forward in creative, meaningful ways and I’m happy to be involved.


Meet Nathan A. Stevenson!

Name: Nathan A. Stevenson, Ph.D.

Institution/Organization Affiliation: Kent State University

SERA Research Partner Bio:

Dr. Stevenson is an Assistant Professor in the area of mild/moderate educational needs at Kent State University. He teaches graduate and undergraduate courses in core instruction, classroom management, inclusive practices, and instructional methods for struggling learners. He earned his doctorate in special education from Michigan State University. Dr. Stevenson began his career as an elementary classroom teacher with New York City Public Schools. His research interests include classroom behavior management, inclusive practices, and adoption of evidence-based instruction. 

What made you interested in partnering with SERA?

Being a part of SERA is truly a unique opportunity to draw on the collective expertise of scholars and speed the pace of research in critical areas. I am delighted to be working with such a distinguished group of scholars.


Meet Amelia K. Moody!

Pictured from left to right: Amelia K. Moody, Ph.D., James Stocker, Ph.D., and Sharon Richter, Ph.D.

Name: Amelia K. Moody, Ph.D.

Institution/Organization Affiliation: University of North Carolina-Wilmington

Additional Research Team Members: James Stocker, Ph.D., Associate Professor & Sharon Richter, Ph.D., Assistant Professor

SERA Research Partner Bio:

Amelia Moody received her Ph.D. in special education from the University of Virginia and currently works as a Professor in the Watson College of Education at the University of North Carolina Wilmington. Moody directs the Center for Assistive Technology and serves as a member of the Science, Technology, Engineering, and Mathematics (STEM) Learning Cooperative. Her current grant work centers on researching innovative technologies that enhance educational outcomes for children with disabilities, with a focus on students with autism spectrum disorder.

What made you interested in partnering with SERA?

Our team is interested in participating in SERA because we are dedicated to improving educational outcomes for students with ASD. This project allows for accelerated data collection on innovative educational interventions in an effort to determine how best to meet the needs of this population of students.


Welcome to the Special Education Research Accelerator

Bryan G. Cook, William J. Therrien, Vivian C. Wong, Christina Taylor

We are pleased to welcome you to the Special Education Research Accelerator (SERA). SERA is a platform for crowdsourcing data collection in special education research across multiple research teams. When we read about the Psychological Science Accelerator, which crowdsources data collection for massive psychology studies conducted throughout the world, we began to think about the potential benefits of crowdsourcing in special education research. In essence, instead of a single research team conducting a study, crowdsourced data collection involves a network of research teams collecting the data. Crowdsourcing allows researchers to flip “research planning from ‘what is the best we can do with the resources we have to investigate our question?’ to ‘what is the best way to investigate our question, so that we can decide what resources to recruit?’” (Uhlmann et al., 2019, p. 713).

Given that there are relatively few students with disabilities, especially low-incidence disabilities, in schools, it is often difficult for special education researchers to obtain large, representative samples for their studies. Moreover, given limited grant funding, relatively few researchers in the field have the resources to conduct studies with large, representative samples on their own. Crowdsourcing data collection across many research teams seemed to us to be well-suited to address these and other challenges faced in special education research.

However, implementing crowdsourcing in special education research presents many challenges. Can interventions be implemented with fidelity across many different research teams? How will data be managed? How will implementation fidelity be measured? How will IRB approvals be handled across multiple institutions? Fortunately, the National Center for Special Education Research (NCSER) funded an unsolicited grant for us to develop and pilot a platform for crowdsourcing special education research (i.e., SERA) so that we can examine these and other issues. By involving many different research teams in data collection, SERA can generate large and representative study samples, involve diverse sets of researchers, and examine whether and how study findings vary across researchers and research sites, in ways that studies conducted by a single research team could not.
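
One common way to examine how findings vary across sites in a crowdsourced study is a multilevel (mixed-effects) model with site as a grouping factor. The sketch below is a minimal illustration in Python using simulated data and statsmodels; the variable names (site, condition, posttest) and effect sizes are hypothetical and are not the pilot study’s actual analysis plan.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)

    # Simulate 20 hypothetical sites, 10 students per condition per site
    rows = []
    for site in range(20):
        site_effect = rng.normal(0, 2)        # site-level variation in outcomes
        for condition in (0, 1):              # 0 = comparison, 1 = intervention
            for _ in range(10):
                posttest = 20 + 3 * condition + site_effect + rng.normal(0, 5)
                rows.append({"site": site, "condition": condition, "posttest": posttest})
    df = pd.DataFrame(rows)

    # Random-intercept model: fixed effect of condition, random effect of site
    model = smf.mixedlm("posttest ~ condition", data=df, groups=df["site"])
    result = model.fit()
    print(result.summary())   # condition coefficient near 3; Group Var row reflects site variation

Extending the random effects to the condition slope (e.g., re_formula="~condition") would allow the treatment effect itself to vary by site, which is one of the questions a crowdsourced design makes it possible to ask.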

This website has two primary functions: (a) a public-facing site that provides information and resources related to SERA and crowdsourcing research in special education, and (b) a hub where research partners can access resources and interact with SERA staff about ongoing SERA projects. We hope all of you explore the website to learn more about SERA, the team behind SERA, and our SERA research partners. Please check back periodically as we provide updates. For our research partners, please be on the lookout for an email over the next few weeks that will contain your login credentials and additional information on navigating pilot study resources and materials.

Currently, we are preparing to conduct a crowdsourced randomized controlled trial that conceptually replicates Scruggs et al.’s (1994) study of science-fact acquisition. We will examine the effects of instructor-provided elaborations and student-generated elaborations on science-fact acquisition for elementary students with high-functioning autism across more than 20 research partners and sites throughout the US.

We couldn’t be more excited about SERA, this website, and our upcoming pilot study. We plan to develop and refine SERA for use in many different crowdsourced studies in special education. If you’re interested in potentially being involved in future studies, please send us a message using the contact form here. We hope that you’ll find the site interesting and helpful, and that (if you’re a special education researcher) you’ll consider being involved in a crowdsourced SERA study.