At ServeMinnesota, we talk a lot about the central role research plays in our work. Research informs every stage of a program’s development, from initial program design through evaluation. We also use our programs to research the very issues they’re designed to address. In this way, our research deeply affects our practice, and our practice also affects the larger research community.
This year, Results for America recognized Minnesota as one of the states dedicated to evidence-backed policy solutions. ServeMinnesota is one of our state’s leaders on this front, thanks to a dedication to research that has only grown over time. Last year, we established the National Service and Science Collaborative, which is ServeMinnesota’s engine for evidence-based program innovation and evaluation. In recent years, we’ve also built processes that help us bring in community members as partners in program design and implementation, so that the programs we build truly meet community needs.
There’s a lot to explore about the connection between research and practice – exactly how does research inform program design? How are community members involved? How do our programs affect the larger research community? We posed some of these key questions to three members of our leadership team: Vice President of Equity and Inclusion Dr. Sandy Pulles, Vice President of Research and Development Dr. David Parker, and Vice President of Impact and Innovation Dr. Peter Nelson. Here’s our conversation with them.
What do we mean when we say “research”? How do we define research and what does it mean to you?
Dr. Sandy Pulles: I’ve noticed how research has shifted from evaluations of program effectiveness and development to a real focus on the continuous improvement of equitable practices and engaging communities, which ties in with Pete’s role. It really has shifted over time: we started by trying to understand the effectiveness of our programs, and now we’re not only doing that, but also making changes that are truly in alignment with the communities our programs serve.
Dr. Peter Nelson: I’ve seen that change, too. For me, the two words that come to mind are evidence and improvement. One part of research is establishing evidence, and the other side is improvement – so research drives the process of constant, iterative improvement. Like Sandy touched on, we’ve continued to evolve what we think is evidence, and what we think is continuous improvement, because evidence isn’t always just a randomized controlled trial. Evidence can mean different things to different people, but it’s still research. The same with continuous improvement – you can think about continuous improvement in a traditional research sense, but there are other ways that we’re continuously improving our programs.
Dr. David Parker: In simple terms, “research” means using a systematic process (and tools) to understand something better. In AmeriCorps, research is incredibly important. It tells us what our well-intentioned, fully-committed AmeriCorps members can do to make a difference. Research answers questions like, “How should an AmeriCorps member support young children’s reading development?” And, “How can an AmeriCorps member make a meaningful difference within urban forestry?” Research has good, specific answers to these questions – and because research is a process that continually evolves, that means we can harness it to continually improve our impact.
Why are we so excited about connecting research to practice, and what does that mean?
Dr. David Parker: Academics often organize research into two types: basic and applied. Basic research is focused solely on new knowledge, whereas applied research is meant to benefit humanity in some way. We do applied research here, and that applied research is at its best when it has a reciprocal relationship with practice. That’s because practice is about doing things in the real world, and the real world is where problems – and solutions(!) – are best understood. What’s great about AmeriCorps is that it has an explicit charter to bring research and practice together. Researchers provide the tools of science – data, methods, analyses – and practitioners provide the knowledge and lived experiences that make research relevant. We’re incredibly fortunate to be at the center of that intersection!
Dr. Peter Nelson: It’s often difficult for practitioners and community members to access the body of research knowledge in a given area. But also, it can be hard for researchers in an academic setting to see how their research is living on in practice after the timeline of the funded project. It can also be difficult for community members and practitioners to see their learnings and expertise reflected in research projects. We believe that AmeriCorps can and does play an important role in connecting researchers to both community as well as practitioners.
Over the past five years or so, ServeMinnesota has introduced many new programs. How does ServeMinnesota decide to move forward with a new program, taking research into consideration?
Dr. Peter Nelson: First, we create a need statement that determines if there’s a need for programming in this area. Then there’s a real assessment of financial opportunity. We ask, is this a focus for AmeriCorps nationally, like in the case of Public Health Corps, or is there a lot of state or national momentum behind it? The third piece is whether the program is a good fit with AmeriCorps, because there are potential program areas that just don’t fit as well with AmeriCorps. We also want to know if there are clear best practices, or other people we can talk to, to inform the programming.
Dr. Sandy Pulles: We ask, is this an area of need? How is it contributing to improving opportunities for people in our community? And I think the new focus too is, how is it supporting community members as well as those who are serving as AmeriCorps members? That’s a shift that AmeriCorps the agency has been making to support AmeriCorps members themselves. So I think that’s another new approach to how we’re thinking of really maximizing opportunities, not only for those who are getting the AmeriCorps services, but also those providing the services.
So once we decide on a new program, we have a general practice of applying research to program design. How do we do that?
Dr. Peter Nelson: Most of our programs are informed by research. We review literature, best practice guides, and empirical work that might help the program create specific activities. That search also helps the program think about measurements, for example, how are people measuring the impact of these activities? We also talk to academic researchers. There are also a lot of people like community members who also have a body of knowledge that doesn’t always exist in peer reviewed journals. So we also pay attention to the knowledge that people and experts have, as well as what’s documented.
The research is borne out in many of our programs – one example is K-3 Math Corps. We reviewed the literature on K-3 math, including intervention, as well as best practice guides. We also connected with researchers in the field who focus on early elementary math intervention, and talked to site partners and community members. All of those contributed to the design and early implementation of K-3 Math Corps. Having a research-informed program design never really goes away, because we’re always trying to improve, especially during early implementation. We have feedback loops with the literature and with the people implementing the program – this helps ensure our programs are informed by research.
You mentioned that we collaborate with community members to inform program design. What has that process of working with community members looked like?
Dr. Sandy Pulles: The way that I’ve engaged with community members has been more for improvements and new ideas for existing programs. In the past couple of years, we’ve started with a program that was already designed, then identified the community members closest to the opportunity and asked them to voice their opinions on it, while compensating them for their time.
Right now, we’re considering how we can proactively engage with the community from the beginning to really design a program that is in alignment with community needs. I hope community members can really be full members of the research team – not seeing them as secondary, but really valuing what they have to say. We are taking a new approach to how we talk about experts and who we think experts are.
Dr. David Parker: In our work, there’s a big difference between what kind of knowledge exists and how to apply that knowledge. Historically, we’ve valued the knowledge that exists, for example, research on forestry or the science of reading, which is often more quantitative and highly controlled. But what we’re learning is that we can’t value that at the expense of valuing the “how” research, that is, how do we do something with this information? There’s an equally robust and valuable approach to research that involves the grounded expertise of community experts – people who have lived the scenarios or the issues or the opportunities that we’re trying to incorporate into the AmeriCorps program. I really love that we’re marrying the “how” and the “what,” because together they get us a lot closer to an actual, viable solution.
What specific examples are there of how we’ve incorporated community member expertise into programs?
Dr. Sandy Pulles: I think the clearest example is the caregiver app design process. We hired Natasha, who was a parent of students who had experienced the program. She also is very connected to other caregivers whose kids received services too. The caregivers didn’t know what the program was like, so she had that very unique perspective. She recruited families and did all the communication with them. She helped develop all the questions that we asked caregivers about how they would like to be communicated with and what has worked well in the past, what hasn’t. She also would come with us to the meetings with the app developers. I thought that was really cool because we could have easily just said we’d report back to her, but instead we had her at every single meeting. I think that created transparency. It showed that it’s important to be transparent with community members and involve them in the entire process, to really have them be a part of our team and not just tokens.
Now we’re continuing to do that. The Children’s Trust has a similar framework now. We’re also evaluating the Early Learning Corps program right now, and we hired community members to facilitate groups of caregivers and students who participated in the program as well as educators. We also collaborated with YouthPrise to hear from BIPOC high schoolers about their perceptions of AmeriCorps and what kind of programs they’d be interested in.
So we use research to inform programs, but then we also take what we learn in practice and then share that back with research colleagues across the country. How does that work?
Dr. David Parker: It can take a number of forms. It includes publishing papers, presenting at national and regional conferences, or sharing learnings through our National Service and Science Collaborative. It also includes establishing working partnerships with researchers across the country. We have collaborative, personal relationships with experts in education, recovery, forestry, public health, and more.
Dr. Peter Nelson: One of the unique opportunities that we have with AmeriCorps programs is that we have really good data infrastructures connected to the program and the program’s theory of change. We have information on outcomes and the relationship between outcomes. It’s a unique thing to have that much information, and you can use that for continuous improvement. And we’re also positioned to share that information back with the researchers in the broader field. For example, we’re advancing our understanding of not just Reading Corps, but of reading intervention and literacy support.
What have been some of the most impactful findings coming out of our programs that have affected the larger research community?
Dr. Peter Nelson: Through our research, we noticed that a lot of students in Reading Corps who graduate midyear actually regress. We used that to better understand how Reading Corps can serve students in the long-term, but it also advanced the broader literature. A lot of the research grants we get aren’t just about advancing a specific program, they’re about advancing the broader knowledge base.
Dr. David Parker: I think one of the most impactful things we’ve done is prove that AmeriCorps members are effective tutors – indeed, our work represents an outsized contribution to the research base on tutoring in general. But bigger than that is the fact that our work is part of proving AmeriCorps members are effective change agents, period. Equipping AmeriCorps members with the tools, training, and support to solve an issue actually works! That’s awesome.