
Canadian Knowledge Hub for Giving and Volunteering

Researching Volunteer Screening in Alberta


When asked how Volunteer Alberta has taken a data-driven approach to volunteer screening, I can’t help but stumble in my answer. Because, truthfully, my answer is, “How else would you do it?”

At Volunteer Alberta, we run a program called the Volunteer Screening Program (VSP), the primary purpose of which is to build the volunteer screening capabilities of Albertan social impact/nonprofit/voluntary organizations. To do this, we need to understand the issues affecting volunteer screening practices in the province, including all the pre-hiring, hiring, and post-hiring screening processes outlined in Volunteer Canada’s 10 Steps to Screening.

So, how do you build an understanding of something as broad and multi-faceted as volunteer screening? You ask people. You go out into the community and start collecting data on what people are doing. You set up multiple access points for people to talk to you, through surveys, focus groups, interviews, or feedback forms. And you come back to the drawing board with your data in hand, ready to dive deep and discover what people are up to in your province. I hope that volunteer managers and other social impact/nonprofit/voluntary sector professionals can take some inspiration from this article for their own data-driven projects.

That being said, if you’re planning a data-driven program or project any time soon, keep in mind a few simple things:

  1. The point is to get out into the community and test your assumptions. No effective program can start without community input.
  2. Create time for an iterative process where each data set builds on the last. Often, more data will lead to more questions and the need for more research.
  3. Research is an ongoing process. Whether it continues in your organization or the data takes on life in another organization, good research generates more opportunities for discussion and analysis.

The Origins of VSP

VSP started in its current form in 2017, and throughout the past seven years, its connection to the community has fluctuated. To kick off the program, we started with some research. We had to identify gaps in our knowledge of how organizations screened volunteers and what they wanted to learn about the volunteer screening process. To do so, the program held forums across the province to gather data. This led Volunteer Alberta into a very fruitful relationship with Volunteer Canada, and we actively promoted Volunteer Canada’s 10 Steps to Screening as a fundamental resource for organizations. We still promote that resource in all our screening educational programming. However, over the next three years of operations, the Volunteer Screening Program’s connection to the community shifted from actively surveying people to providing programming to them. That educational programming was successful for multiple years.

Then COVID-19 struck and threw many of the old processes for conducting educational programming out the window. At this point, the program team had to step back and ask, “Do we still know what people need? Are we designing content that people still want?”

A New Relationship with Research 

This brought us to our first effort to rekindle our connection to the community. In our 2020/2021 program year, we performed an environmental scan of the volunteer screening landscape, nationally and internationally, then conducted what we termed a “community assessment.” Throughout this process, our team explored contemporary issues in volunteer screening through literature reviews; then, we went out to the community with a series of surveys, focus groups, interviews, and phone calls with police services. Volunteer Alberta then compiled this data into a single report, breaking it down into digestible summaries that pointed to questions and possible next steps. We realized a few key things about screening during this process:

  1. People wanted to connect with each other about volunteer screening in a peer-to-peer learning arrangement. 
  2. People were facing numerous technical and ethical barriers to receiving police information checks. 
  3. People still believed in the power of VSP to advocate for changes to volunteer screening while still providing accessible, quality programming to social impact/nonprofit/voluntary organizations.  

This data suggested some next steps for the program. There were some shifts in focus we could make, but generally, there was still a demand for our program. However, the first thing this initial research directed us to was a revamp of our program logic model, which was sorely outdated. We spent the better part of three months having internal discussions about evaluation, what we wanted to measure, and what kind of data we wanted to capture while delivering our regular programming online during the pandemic.  

The next step after determining a new logic model was to benchmark the indicators in the model. This meant getting baseline data to judge the initial state of the sector and its relationship to volunteer screening at the time of the model’s development. This, once again, called us to do more research into the current state of the sector. The research we performed previously was a great baseline to get us thinking about the program’s direction. Still, it was our job to get data that could be presented back to the sector as an accurate depiction of the state of screening in Alberta.  

We did this through multiple extensive surveys of, and focus groups with, both volunteers and organizations about their experiences with volunteer screening in 2022/2023. We then collected the data into a formal report delivered back to the social impact/nonprofit/voluntary sector to digest and learn from. The research design step was truly one of the most time-intensive parts of the process. From creating surveys and focus groups to mapping questions to outcomes in our logic model, it takes a lot of effort to develop benchmarks if you don’t already have recent data. Because of this, it was crucial to make the data we gleaned as accessible as possible. That way, anyone else with a project related to volunteer safety, engagement, screening, or accessibility would already have some data to work with; we wanted to save people some of the time we had already spent collecting it.

Generally, the feedback we received from the community about our screening research was very positive. Our presentation and report, “Changing the Conversation,” were described by attendees and readers from other social impact/nonprofit/voluntary organizations as informative. Most importantly, it helped people understand volunteer screening and related contemporary conversations more clearly, which is one of the key functions of VSP.  

Now that the research that led to “Changing the Conversation” has concluded, it’s our job to stay current. We are still planning and delivering surveys to track progress on our benchmarks and intend to continue this work going forward. As with most data-driven projects, you can never really say the work is simply “done.” People can continue to use data created by these projects, follow up to evaluate change and growth, and mobilize knowledge on an ongoing basis. That’s why VSP is now focused not just on screening education but also on screening research in the long term. To build effective educational programming, we need effective research and data to share with the public.


Graeme Dearden is the Manager of Learning and Resources at Volunteer Alberta. They manage the Volunteer Screening Program, multiple social research and design projects, and VA’s educational offerings. Graeme has been working in the social impact/nonprofit/voluntary sector, specifically doing capacity-building work, for 11 years. Their experience spans arts advocacy and professional development, as well as nonprofit and co-op program development.