How PMC Monitors, Evaluates, and Learns From Its Programs
Monitoring, evaluation, and learning (MEL) drive storytelling and program improvement at Population Media Center. These data-driven approaches are embedded in our work and are essential for understanding what we do. But what exactly are they? And how do they help us communicate our impact? This post highlights what each of these processes entails and why they are so crucial to our transformative work and overall mission. Each step in our MEL process provides us with unique data, and taken together these data sources help us implement and refine our Theory of Change, allowing us to make the shows that remake our world.
Monitoring: Holding Ourselves Accountable
Monitoring is the first part of the MEL process, and conveniently it is also the most intuitive. In the simplest sense, monitoring is checking in on our work (media campaigns, TV and radio shows, and more) to ensure everything is on track. For example, we need to verify that our broadcast schedule and plan are moving smoothly with our implementation partners, so we build relationships with studios and monitor progress to ensure our content is shared as planned. This may seem obvious, but good monitoring practices are also good communication and relationship-building tools, increasing transparency and credibility.
Monitoring also gives PMC a chance to interact with our audience early on and throughout the program. There are many ways to engage with our audience, including surveys, social media, and listener groups, and each provides us with uniquely valuable data.
Surveys allow us to collect data from a broad sample of people living and working in our broadcast communities. This breadth reduces bias and allows us to understand how many people have heard of a certain PMC show, how many people listen, and what ideas those listeners have for improving it. Surveys can also (and this is true of social media and listener groups, as well) tell us what changes might improve production, audio quality, storylines, or other features of our shows.
Social media and listener groups are especially useful because they provide unfiltered feedback directly from our audience and are essential for improving the authenticity of our characters and storylines. It is critical to receive audience feedback on the “realness” of our fictional characters and worlds.
PMC also uses monitoring to determine how our programs prompt behavior change over the course of a broadcast period. For example, a certain show may have a storyline that revolves around family planning, with information at the end of an episode connecting listeners with local family planning services. By carefully conducting exit interviews at family planning clinics, PMC can ascertain how our broadcast contributed to an increase in clinic visits.
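As a purely illustrative example of what tallying this kind of exit-interview data might look like, here is a minimal sketch in Python. The field names and figures are invented for illustration, not real program data.

```python
# Hypothetical exit-interview records collected at a partner clinic.
# Each record notes whether the client cites the radio drama as the way
# they heard about the clinic's services. All values are invented.
exit_interviews = [
    {"client_id": 1, "cited_show": True},
    {"client_id": 2, "cited_show": False},
    {"client_id": 3, "cited_show": True},
    {"client_id": 4, "cited_show": True},
]

cited = sum(record["cited_show"] for record in exit_interviews)
total = len(exit_interviews)
print(f"{cited} of {total} clients ({cited / total:.0%}) "
      "cited the drama as how they learned about the clinic.")
```

In practice, a tally like this would be read alongside the clinic's overall visit trends during the broadcast period to gauge the show's contribution.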
Evaluation: Answering Big Questions
Evaluation sounds more cumbersome than monitoring, which largely involves conducting surveys or sorting through social media posts. At its core, however, evaluation simply answers the big "did it work?" question.
To answer these types of questions PMC employs rigorous research methods. Our program evaluations are data-driven, methodical, and seek the answers to practical questions such as "are people who fully listened to PMC's program more likely to seek family planning services than those who didn't?" Our evaluations are contextualized to match local culture, the final scripts used in our shows, and the questions most relevant to the communities we serve. We also collect a variety of demographic data (e.g., education level, religion) during evaluation interviews, for two purposes:
1) these data allow us to see if there are fundamental differences between our listeners and non-listeners (or viewers and non-viewers),
2) we can control for these differences statistically. This quantitative approach minimizes differences due to chance or demographics, letting us more confidently attribute observed differences to exposure to our program (a simplified sketch of this kind of adjustment follows below).
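To make this concrete, here is a minimal, hypothetical sketch of the kind of adjusted comparison described above, using logistic regression on simulated survey data in Python. The variable names, model specification, and data are illustrative assumptions, not PMC's actual analysis pipeline.

```python
# Illustrative sketch only: simulated data and hypothetical variable names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500

# Hypothetical survey extract: one row per respondent.
survey = pd.DataFrame({
    "listener": rng.integers(0, 2, n),          # 1 = exposed to the show
    "education_years": rng.integers(0, 13, n),  # demographic covariate
    "age": rng.integers(18, 60, n),             # demographic covariate
})

# Simulated outcome: 1 = respondent reports seeking family planning services.
logit_p = -1.0 + 0.8 * survey["listener"] + 0.05 * survey["education_years"]
survey["sought_services"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Logistic regression: the exposure effect, adjusted for education and age.
model = smf.logit(
    "sought_services ~ listener + education_years + age", data=survey
).fit(disp=False)

print(model.summary())
# The coefficient (and confidence interval) on `listener` indicates whether
# exposure remains associated with seeking services after adjusting for the
# demographic covariates.
```

In a real evaluation, the covariates, sampling design, and model form would be chosen to match the local context and study questions.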
Whenever possible, PMC also collects qualitative data for evaluations. This mixed-methods approach (combining mutually reinforcing quantitative and qualitative data) allows us to better understand how and why our program facilitated change.
To do this, we conduct interviews with key stakeholders including people who worked on our broadcasts, local health officials, people who listened to our shows, people who were heavily engaged on social media, and others. Interviews provide great detail from someone who has unique expertise or experience with our program. To supplement interviews, we also conduct focus group discussions. Groups are often formed around demographic factors (e.g., separate groups for men and women, or youth-specific groups) as well as exposure to our content. We want to make sure that some groups have been fully exposed to our drama while others have not.
What we hope to see is that consensus builds in the exposed groups around the value of PMC's drama for increasing awareness, knowledge, and behavior change in key areas. Through careful facilitation, we can collect stories, viewpoints, and consensus on how our content was transformative.
Learning: Continuously Refining and Refocusing
At PMC, we are experienced scientists and lifelong learners, which means we are continually taking stock of our data—from monitoring efforts and evaluations, but also financial, programmatic, and contextual data—and applying lessons to new programs. This requires humility and open minds, and is one of the primary ways to keep ourselves committed to our mission.
Good monitoring and evaluation practices are essential for learning. Monitoring data (collected during a broadcast) can be used for immediate course corrections. Evaluation data (collected at the end of a broadcast) allow us to dig more deeply into the impact of our programs and design better programs.
Learning is not a zero-sum game and PMC uses both positive and negative evaluation findings to do more of what works and avoid effort that does not result in the change we anticipated. This approach keeps us honest and challenges us to adapt to new technologies, shifting norms around the world, and emerging evidence.