Call for Research
![Male backpacker standing at the edge of a lake shore looking at some mountains in the distance.](/content/dam/school/global/clinical/us/assets/research/research-engagement-portal-call-for-research-graphic-2-700x200.png)
Purpose
This annual Research Call is published by Pearson Clinical Assessment. Our mission is to ensure that our assessments comply with the highest standards of quality and improve outcomes for the customers we serve. The purpose of the Research Call is to obtain independent research that contributes to the validation and efficacy of Pearson Clinical Assessment products and offerings.
Who can apply
We invite faculty members, graduate students, and qualified researchers based in the United States to submit proposals of relevance to the research areas outlined below.
2024 Research areas
- Clinical validity
  Proposals evaluating the clinical validity of one or more Pearson Clinical Assessment products.
  For example, conducting a clinical study for a test with a special population not reported in the test’s manual.
- Efficacy
  Proposals investigating the use of Pearson Clinical Assessment products or offerings to improve outcomes for intended groups or settings.
  Examples of groups and settings include practitioners/clients or special populations in schools or clinics.
- Equitable use
  Proposals investigating the use of Pearson Clinical Assessment products with a special population.
  Special populations include groups that are underrepresented in, unaccounted for in, or excluded from the normative samples.
- Equivalence studies
  Proposals investigating the equivalence and reliability of Pearson Clinical Assessment products administered under nonstandard administration conditions.
  Examples of nonstandard conditions include telepractice and language translations.
- Methodological studies
  Proposals investigating new data collection, norming or psychometric methods, or scoring approaches with Pearson Clinical Assessment products (e.g., embedded performance validity measures, machine learning).
![Icon of a person with a question mark on top](/content/dam/school/global/clinical/us/assets/research/research-engagement-portal-research-areas-graphic-800x800.png)
Data, funding, and duration of projects
Pearson will provide researchers with Pearson Clinical Assessment test data (e.g., standardization data) and/or product materials, if required by the research proposal.
The maximum budget across all research projects accepted in 2024 is $15,000. Funding is available for one or more research projects and the amount of each award will be based on the scope and required budget of the accepted proposals.
If funding is requested, indirect costs for overhead may not exceed 6 percent of the project’s funding request.
The maximum duration of projects is one year from the date of the signed contract.
![Icon of a plant growing in a hand](/content/dam/school/global/clinical/us/assets/research/research-engagement-portal-data-funding-graphic-800x800.png)
How to apply
The principal investigator should complete the application and attach their curriculum vitae and a well-structured research proposal prepared using our template.
Questions regarding the Call for Research can be sent to CallForResearch@pearson.com.
![Icon of a pen and paper](/content/dam/school/global/clinical/us/assets/research/research-engagement-portal-how-to-apply-graphics-800x800.png)
Vetting process
Each proposal will be rated internally according to the following criteria:
- Match with the Research Call: Contributes to the validation or efficacy of Pearson products
- Quality of research design: Includes a sound theoretical foundation and clear methodology, and complies with data protection and ethics standards
- Feasibility of the research: Taking into account the budget, timeline, and availability of data requested (if applicable)
- Impact or benefit to the development and use of Pearson products
Shortlisted applicants will be notified and may be contacted to provide further details about their proposal. All applicants will be informed of any delays to the vetting timetable.
![Icon of a checklist and pen](/content/dam/school/global/clinical/us/assets/research/research-engagement-portal-vetting-process-graphics-800x800.png)
Notification to Applicants
Pearson will notify all applicants by the deadline provided in the timetable.
Accepted Proposals
Following the signing of an agreement between the researcher(s) and Pearson, the researcher will commit to sending Pearson regular reports on the project and sharing the final data with Pearson.
The researcher(s) will retain full rights to publish the research results. Pearson will retain rights to the data produced by the project and may use such data to enhance its products and services.
For applicants of accepted proposals:
- October–December 2024: Onboarding to finalize contracts and data requirements
- January 2025: Research activity begins
- December 2025: Research completed, data and research results/manuscript submitted to Pearson
![Icon of a stopwatch](/content/dam/school/global/clinical/us/assets/research/research-engagement-portal-notification-to-applicants-graphics-800x800.png)
2023 Awardees
Multivariate Examination of Performance Validity Tests for Neuropsychological Evaluation: An Ability-Focused Approach
Submitted by: John-Christopher Finley, MA & Michael Brook, PhD
Clinical Validity of the MABC-3 with Children and Adolescents Previously Diagnosed with DCD and ASD
Submitted by: Priscila Tamplain, PhD
2022 Awardees
Factor Analysis of Processing Speed Measures in a Sample of Normative and Clinically-Referred Children
Submitted by: Rachel K. Peterson, PhD, & Lisa A. Jacobson, PhD, ABPP-CN, at the Kennedy Krieger Institute and Johns Hopkins School of Medicine
Research Update: Over the quarter, we obtained the neuropsychological data from Kennedy Krieger Institute and Pearson Assessment for children between 6 and 16 years of age. We performed initial descriptive statistics, including the sample size for each neuropsychological measure as well as mean, standard deviation, and range for both the Kennedy Krieger Institute dataset and Pearson Assessment. We then ran a correlation analysis with the Kennedy Krieger Institute sample to determine the number of measures in common in our dataset.
Assessing the Predictive Validity of Autism Indicator Items on the Bayley Scales of Infant and Toddler Development in High-Risk Children (N = 600)
Submitted by: Jennifer Amato, PsyD; Nina Sand-Loud, MD; Jonathan Lichtenstein, PsyD, MBA; Dartmouth-Hitchcock Medical Center
Development of Novel Embedded Validity Tests (EVTs) within the WMS-IV Logical Memory Subtest
Submitted by: Ben K. Mokhtari, MS, graduate student at The University of Texas Southwestern Medical Center. Mentors: Laura Lacritz, PhD, ABPP; Caitlin Reese, PhD; Jeff Schaffert, PhD; Richard Robinson, PhD; Shawn McClintock, PhD
Research update: Toward the proposed development of novel embedded validity tests within the WMS-IV Logical Memory subtest, research to this point has focused on initial aims to develop an updated rarely missed index. Data from the ACS No Stimulus sample and standardization sample have been analyzed to identify 12 rarely missed items (in keeping with the general process used for the WMS-III rarely missed index items). Next steps will involve matching and cleaning of clinical and invalid performance groups to examine performance of the index. Discriminant function analysis will follow to help identify an optimal cut off for the new rarely missed index.
![Icon of a lightbulb with three figures of people sitting in front of it.](/content/dam/school/global/clinical/us/legacy-replace/site-scrape-flattened/research-engagement-portal-awards-graphic-800x800.png)