One of the drawbacks of going to graduate school is finding out the limitations of what you thought you knew. Correlations just do not cut it after learning about hierarchical linear modeling! Thankfully, day-to-day program evaluation doesn’t require that level of sophistication, but careful attention to data collection design is key for equity in evaluation.
If you are new to thinking about data from an equity lens, here are baby steps to consider on your journey toward data equity:
- Consider your audience. Audience determines how information is reported. Take, for example, a standard metric requested by funders: how much does this program cost per client? Organizations try to minimize this amount to appear efficient to a funder, but when advertising the program to a client, the same organization might take the opposite approach to highlight the program's value.
- Presentation matters. Infographics are extremely popular, but how are you presenting your findings? When using charts and graphs, notice who is presented first. Are you creating categories and lists (ordered alphabetically or by size?), or are you illustrating relationships and theories of change?
- Checkbox or not a checkbox? When collecting demographic data, be clear about who is reporting and what is counted. Did clients self-report their identities, or did they select answers from a pre-populated list? Is the list inclusive? What might be missing?
- After you have the answer to your original question, don't stop! Reflect on the data and ask questions to uncover disparities. For example, know your rates (graduation, acceptance, suspension, employment, hospitalization, etc.) and examine who in the data is not part of that count. (Who didn't graduate, get suspended, or go to the hospital?) This might provide a clue or prompt more questions. One more step is to brainstorm who is not in the count at all (see the sketch after this list).
- Who has access to your data/evaluation? Format and styling will make a difference in who is able to access your reporting. Offering a non-digital version for those without access to technology and a digital version that works with a screen reader are just two of many considerations. Video and alternative languages might allow for greater uptake and understanding of the information.
- Did you get it right? Asking clients to review the conclusions made in any program evaluation is a best practice in research. Talking with clients about how and why you are using information collected about them (with their consent!) is likely to illuminate issues and spark new understandings.
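To make the "who is not in the count" step concrete, here is a minimal sketch in Python. Every client record, field name, and group label below is hypothetical, invented purely for illustration; the point is the three questions the code asks, not the particular columns.

```python
from collections import Counter

# Hypothetical client records; all names and values here are invented for illustration.
clients = [
    {"id": 1, "graduated": True,  "race_ethnicity": "Group A"},
    {"id": 2, "graduated": True,  "race_ethnicity": "Group B"},
    {"id": 3, "graduated": False, "race_ethnicity": "Group B"},
    {"id": 4, "graduated": True,  "race_ethnicity": "Group A"},
    {"id": 5, "graduated": False, "race_ethnicity": None},      # demographics never collected
    {"id": 6, "graduated": None,  "race_ethnicity": "Group C"}, # outcome never recorded
]

# 1. The headline number most reports stop at.
tracked = [c for c in clients if c["graduated"] is not None]
rate = sum(c["graduated"] for c in tracked) / len(tracked)
print(f"Graduation rate: {rate:.0%}")  # 60%

# 2. Who in the data is NOT part of that count? Break the non-graduates down.
non_grads = [c for c in clients if c["graduated"] is False]
print(Counter(c["race_ethnicity"] or "Not reported" for c in non_grads))

# 3. Who is missing from the count entirely (no outcome recorded at all)?
print([c["id"] for c in clients if c["graduated"] is None])  # [6]
```

Notice how the headline rate alone hides both the composition of the non-graduates and the clients for whom no outcome was ever recorded; those two follow-up questions are where the disparities tend to surface.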
I do not get enough practice with econometrics anymore to feel confident doing regression analysis, but I do enjoy reading research and thinking about evaluation design. If you have tips or examples of how you take equity into your data and research projects, send us a note at Hello@WeitzFamilyFoundation.org.
In the meantime, here are some resources on data equity: