The results are in! About one year ago we introduced the Applicant Success Assessment (ASA) as a psychometrically validated tool for colleges and universities to measure non-cognitive skills of their applicants. Based on decades of scientific evidence and our own validation data, we estimated that the ASA could predict student performance (GPA) and save the typical institution millions of dollars in net tuition. Today, we are excited to announce that we have our first set of empirical results from two clients. This letter describes how these two clients are using the ASA and the results we have seen after just one year.
Both clients are small, private institutions in the northeastern United States. We sent the ASA to all enrolled incoming students at both institutions at the beginning of the fall 2017 semester. We received responses from 180 and 163 students at the two institutions, respectively. Using institutional data, we next set out to see how well the ASA predicted first semester GPA.¹
We predicted student performance in three ways. First, we examined the simple correlation between the ASA’s composite “Student Success” scale and actual student GPA. Second, we examined the multiple correlation using all seven of the non-cognitive factors of the ASA as predictors. Finally, we employed genetic algorithms to identify maximally predictive models of student performance.
Simple Correlations
The ASA Student Success scale is a composite scale of several ASA subscales and is designed to be the best single scale predictor of student success. That is, if you were only going to use a single non-cognitive predictor of student success (not something that we necessarily recommend), this is the one. Student Success scores predicted fall semester GPA at r = .10 for the first client and r = .17 for the second. Not too bad for a single scale.
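For institutions that want to run the same check on their own data, the computation is a straightforward Pearson correlation between the composite score and GPA. The sketch below is illustrative only; the file name and the student_success and fall_gpa columns are hypothetical placeholders, not our production pipeline.

    # Minimal sketch: correlate the ASA composite "Student Success" score with
    # first semester GPA. File and column names are hypothetical placeholders.
    import pandas as pd
    from scipy.stats import pearsonr

    df = pd.read_csv("asa_fall_2017.csv")  # hypothetical institutional export
    df = df.dropna(subset=["student_success", "fall_gpa"])

    r, p_value = pearsonr(df["student_success"], df["fall_gpa"])
    print(f"Simple correlation with fall semester GPA: r = {r:.2f} (p = {p_value:.3f})")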
Multiple Correlations
While our Student Success scale is the best single scale predictor of student success based on non-cognitive skills, the ASA measures seven distinct non-cognitive characteristics related to performance. As such, a better approach to predicting student performance is to take all seven of these non-cognitive characteristics into account. Multiple regression allows us to do just that – predict student GPA from the combination of all seven scales. The multiple R indicating the predictive validity of the full set of scales was .24 for the first client and .28 for the second!
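Computationally, the multiple R is simply the correlation between observed GPA and the GPA predicted by an ordinary least squares regression on all seven scales at once, which equals the square root of the model's R². Here is a minimal sketch under the same assumptions as above, with placeholder names standing in for the actual ASA scale columns.

    # Minimal sketch: predict fall semester GPA from all seven ASA scales and
    # report the multiple R. Scale column names are hypothetical placeholders.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    SCALES = ["scale_1", "scale_2", "scale_3", "scale_4",
              "scale_5", "scale_6", "scale_7"]  # placeholders for the seven scales

    df = pd.read_csv("asa_fall_2017.csv").dropna(subset=SCALES + ["fall_gpa"])

    X = sm.add_constant(df[SCALES])       # intercept plus the seven predictors
    model = sm.OLS(df["fall_gpa"], X).fit()

    multiple_r = np.sqrt(model.rsquared)  # multiple R is the square root of R-squared
    print(f"Multiple R from the seven scales combined: {multiple_r:.2f}")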
Predictive Modeling
Because the ASA consists of 55 items, we can employ sophisticated predictive modeling techniques (e.g., genetic algorithms) to build models that maximize prediction of student performance from students' responses to the entire set of 55 items. Responses to the ASA items predicted student GPA at a whopping R = .40 and R = .53 for our two clients, respectively! These results are displayed graphically below.
[Figure: Predictive Validity for Client One]
[Figure: Predictive Validity for Client Two]
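We cannot publish the production models here, but the sketch below conveys the general idea behind this step: a genetic algorithm evolves binary masks over the 55 items, scores each candidate model by the cross-validated correlation between predicted and observed GPA, and carries the fittest candidates forward to the next generation. Everything in it is a simplified stand-in, including the synthetic data; it does not reflect client data or our production tuning.

    # Simplified sketch of a genetic algorithm that searches for a predictive
    # subset of the 55 ASA items. Synthetic data stand in for real responses.
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in data: 180 students x 55 item responses, plus a GPA outcome.
    n_students, n_items = 180, 55
    items = rng.normal(size=(n_students, n_items))
    true_weights = rng.normal(scale=0.3, size=n_items) * (rng.random(n_items) < 0.2)
    gpa = items @ true_weights + rng.normal(scale=1.0, size=n_students)

    fold_ids = np.arange(n_students) % 5  # fixed 5-fold assignment for cross-validation

    def fitness(mask):
        """Cross-validated correlation between observed GPA and the GPA predicted
        by an OLS model built from only the items selected by the binary mask."""
        if mask.sum() == 0:
            return -1.0
        X = np.column_stack([np.ones(n_students), items[:, mask.astype(bool)]])
        preds = np.empty(n_students)
        for k in range(5):
            test = fold_ids == k
            beta, *_ = np.linalg.lstsq(X[~test], gpa[~test], rcond=None)
            preds[test] = X[test] @ beta
        return np.corrcoef(preds, gpa)[0, 1]

    # Standard GA loop: selection, one-point crossover, and mutation over item masks.
    pop_size, n_generations, mutation_rate = 40, 30, 0.02
    population = (rng.random((pop_size, n_items)) < 0.3).astype(int)

    for gen in range(n_generations):
        scores = np.array([fitness(ind) for ind in population])
        parents = population[np.argsort(scores)[::-1][: pop_size // 2]]  # keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_items)                    # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n_items) < mutation_rate        # random bit-flip mutation
            children.append(np.where(flip, 1 - child, child))
        population = np.vstack([parents] + children)

    best = population[np.argmax([fitness(ind) for ind in population])]
    print(f"Best cross-validated R: {fitness(best):.2f} using {best.sum()} of {n_items} items")

The population size, mutation rate, and fitness function above were chosen only to keep the example short; in practice these choices matter a great deal for how well the search generalizes.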
In real-world terms, these values are highly consequential. A test with predictive validity of r = .40 means that people who score above the median on the predictor have a 70% chance of scoring above the median on the outcome. Conversely, those who score below the median on the predictor have only a 30% chance of scoring above the median on the outcome. Thus, knowing someone's score on the ASA dramatically improves your ability to predict that person's college GPA.
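For readers wondering where the 70% and 30% figures come from: they match the binomial effect size display convention, under which a correlation of r corresponds to above-median "success" rates of 50% ± r/2 after a median split on both the predictor and the outcome. A short illustration:

    # Binomial effect size display: a correlation r maps to success rates of
    # 0.50 +/- r/2 after a median split on both the predictor and the outcome.
    r = 0.40
    above = 0.50 + r / 2  # chance of an above-median GPA, given an above-median ASA score
    below = 0.50 - r / 2  # chance of an above-median GPA, given a below-median ASA score
    print(f"{above:.0%} vs. {below:.0%}")  # prints "70% vs. 30%"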
Conclusion
Data from two clients at two separate institutions now definitively show what we predicted over one year ago: non-cognitive ability, as assessed by the ASA, predicts student performance. The ASA can be used to (a) improve admissions decisions at selective institutions, (b) assign financial aid more efficiently, and (c) identify students who are likely to struggle in the classroom and develop intervention strategies designed specifically for them. Overall, this is quite impressive for an assessment that takes less than 5 minutes to complete and costs only a fraction of the dollars it saves.
¹ The ASA can also be used to predict retention. Fortunately for these two clients, attrition among those who completed the ASA was extremely low. Thus, we could not use the ASA to predict first semester attrition for these two clients. However, completion of the ASA (vs. not) did predict attrition for both clients: students who did not complete the ASA were less likely to return for the spring semester. As more data become available, we will report on retention results.