Best Practices: Flee

Last week I did something that felt revolutionary, dangerous and bold.

I walked out of an academic conference.

I had every reason to stay. The theme of the conference was a topic I’ve spent the last 6 years of my professional life living. It was full of people who are former colleagues and likely future collaborators. I would have networked and maybe even been offered a job. But I sat and listened for perhaps 30 minutes and I couldn’t stand it anymore. I’ve heard those presentations before. I’ve given those presentations before. And what they say is, “We know we should study these areas more.” “These are the domains where we should be collecting data, but we don’t have good instruments.” “The gains are incremental, but statistically significant.” And I couldn’t stand it a moment more.

We’ve become slaves to the quantitative world. Outcomes, standardized tests, percentage growth and percentage change are what the non-profit sector, the granting agencies, the schools, and the statisticians care about. And it’s often meaningless. Garbage in, garbage out. I’ve shocked many a student by saying “It’s not enough” when they claim that they are doing fine in school because they’ve passed the Virginia SOLs. Employers don’t care that you can pass the SOLs if you can’t write an email asking for a job. Employers also don’t care about certifications if you can’t perform the work you’re asked to do. The ultimate test of workplace-readiness is whether or not an employee is hired, rehired or fired.

Similarly, college-readiness is truly measured by the coursework and grades that students attain in their first year. It is also measured by students’ ability to navigate scholarships and the FAFSA, complete applications by deadlines, and actually attend class. Passing the SOLs is not a skill valued anywhere outside the K12 arena, and it doesn’t guarantee any kind of college or job success. So, research documenting that an after-school intervention may increase an SOL score doesn’t interest me.

At this conference, a well-meaning professional spoke about 4 different ways that one could measure outcomes in after-school programs. Although the 4 ways were perfectly reasonable, she repeatedly said, “There isn’t a really great tool for this.” The reality is, most non-profits do not have the expertise or staff to write a measure that gets at the data desired. If they do create one or luck into one, there are most likely no reliability or validity data available, and the non-profit certainly does not have the sample size, budget or expertise to test the reliability or validity of its tool. And that’s not their job. It’s the job of universities, foundations and government to conduct studies of effectiveness. It’s the job of practitioners to follow best practices, implement effectively and ethically, and exercise their best professional judgment.

So that beautiful late fall day, rather than hear platitudes about the importance of measuring student engagement, I decided to use the time to check in with my students about how they were getting along with school, work and life. I figured they were pretty engaged with my process, because they all responded with pleas for help, updates on accomplishments, and inquiries into what was going on with me.

It felt a lot better than being reminded that we still don’t seem to have any reliable measures of engagement.
