Continuous Quality Improvement

Continuous Quality Improvement (CQI) is the process of continuously keeping track of how a program or project is being implemented. It is a process of creating an environment in which programs value and strive to constantly improve the quality, efficiency, and effectiveness of their programming and services. CQI is an important component of ensuring the success of your program and should be a focus from the start. Plans should include strategies to collect data that inform program improvement and, ultimately, client outcomes. Collecting data only once a year, waiting until after something has been implemented or the grant/project is over, or waiting until a problem occurs to make a change is not compatible with the CQI philosophy. CQI aligns with formative/process evaluation, and much of the same data can be used for both. 

All key constituents should be included from the start and should be part of the feedback loop. Program implementers (anyone who works for the program, at any level) should always be paying attention to the elements of their programs that work best and those that may be problematic. Agencies should build a culture of learning and innovation through CQI as a strategy to promote accountability, staff morale, and program effectiveness. To know that our services and programs are effectively meeting the needs of parent and child survivors, we must solicit feedback from participants on what is working and what might be improved. Intentionally creating buy-in from program staff members on the CQI data collection processes is an important part of ensuring ongoing CQI. 

[Image from the Office of Energy Efficiency & Renewable Energy]

When designing a CQI plan, think about what could realistically be changed about your program implementation, and focus your questions on those issues. (Note: when using evidence-based practices, be sure you know what kinds of adaptations are allowed.) What small changes could be made to make programming better? Examples:

  • Timing (number of times per week, amount of time you meet, duration of program, time of day, etc.) 

  • Delivery (openness of the facilitator, keeping order in a group, developmental appropriateness, location, staffing, etc.) 

  • Cultural adaptations (pictures used, word choices, acceptability by population served, etc.) 

  • Language (non-English, reading level, etc.) 

  • Measurement (instruments, follow-up timing, etc.) 

If the accumulation of small changes does not make a difference, then program implementers might consider changing the program entirely. For example, would a different program model/strategy or other evidence-based practice be more appropriate for the population served?  

How to Use CQI Data

Methods for reviewing data collected from multiple sources should be determined at the start of the program. Think about who can inform project implementation and which constituents can speak to which issues. Providers will have different expertise than program participants, who will have different priorities than the community at large. Make sure people know from the start of your project that they will be called on to provide feedback about program processes. Ask what is working and what is not, both in general and with focused questions about making services better for clients and improving client outcomes. CQI can also identify or reveal changes needed to support staff in improving program delivery, supervision issues, policy gaps, etc. 

Programs should also determine which methods would be best to collect CQI data. Data collection could be formal or informal, and almost anything could be considered a data source. Program implementers working on CQI could: 

  • Ask a direct question about how things are going at every meeting (with grant team, partners, providers, families, advisory committees, etc.) and take good notes:  

    • Have you received any feedback from participants about the length or frequency of meetings? 
  • In training surveys of providers: include questions about delivery, presentation, timing, utility, etc.:  

    • The workshop I attended today will be relevant to my future work with children who have witnessed or experienced violence (strongly disagree, disagree, agree, strongly agree). 
  • In training surveys of providers: include open-ended directed questions about what works and what does not:  

    • Please let us know what you thought about the overall presentation and the presenters. 
    • What is a topic, skill, or practice you would like to learn more about? 
  • In group interviews/focus groups with providers: ask questions specific to the audience and get recommendations for change:  

    • Did you find that this intervention was particularly effective with younger children or teens? Please explain. 
  • In group interviews/focus groups with clients/families: ask questions specific to the audience and get recommendations for change:  

    • How long did you see your therapist?  Do you think that was too long, too short, or just right? 
  • In participant satisfaction surveys: ask specific questions about what works best and what might be a challenge, and provide space for open-ended responses:  

    • What was the best part of today’s workshop? 
  • Conduct thematic analysis of all qualitative data sources: look for common responses and opportunities to make small changes (see the sketch below) 
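
The tallying behind this kind of thematic review can start very simply. Below is a minimal, illustrative sketch in Python; the responses and keyword codebook are hypothetical, and a real thematic analysis would rely on trained coders and an agreed-upon codebook rather than keyword matching. Even so, a rough tally can surface the most common themes.

```python
# Minimal sketch: tally how often pre-defined themes appear across
# open-ended responses. All data and "codes" here are hypothetical.
from collections import Counter

# Open-ended responses gathered from surveys, meeting notes, etc.
responses = [
    "The evening time worked well, but childcare was hard to arrange.",
    "Loved the facilitator; the location was far from the bus line.",
    "Childcare would help a lot, and the sessions felt a little long.",
]

# Simple keyword-based codebook (illustrative only).
codebook = {
    "childcare": ["childcare", "child care"],
    "transportation/location": ["bus", "location", "transportation"],
    "session length/timing": ["long", "time", "evening"],
}

theme_counts = Counter()
for text in responses:
    lowered = text.lower()
    for theme, keywords in codebook.items():
        if any(word in lowered for word in keywords):
            theme_counts[theme] += 1

for theme, count in theme_counts.most_common():
    print(f"{theme}: mentioned in {count} of {len(responses)} responses")
```

Even this rough count can help a team decide which small change (for example, adding childcare) to try first.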

Tip: ensure trauma-informed and ethical data collection methods are used in all CQI processes. 

Provide collected feedback to other key partners, including program implementers, administrators, supervisors, advisors, community partners, EBP developers, and program participants and their families. Feedback alone is not enough, though, so be ready with recommendations for change based on the data collected. Taking the feedback seriously and acting (quickly) to make changes shows program staff and participants that their voices and ideas matter, and can go a long way in building trust. 

How to Make Program Improvements

Use data to support change. Make small changes first. One way to do this is to use the Plan, Do, Study/Check, Act Cycle: 

  • (1) Using the data you have collected about program implementation, determine what might need to be adapted. For example, to address a low participant completion rate, you might generate several ideas for increasing participation:

    • Move the location of the service closer to public transportation 
    • Extend the first session to include more family engagement strategies 
    • Offer childcare and/or meals to participants 
    • Add additional culturally relevant activities to the program  
  • (2) Make a small time-limited change. Choose one of your ideas to try first. For example, move the location of the service closer to public transportation. 

  • (3) Check the efficacy of the change. For example, after three sessions in the new location closer to public transportation, determine whether participant numbers have changed compared to the sessions before the change (a simple before/after comparison sketch follows this list). 

  • (4) Decide if the change makes sense as an adaptation to your program. For example, if the number of participants doubled after being moved closer to public transportation, then this might be a permanent change. If numbers have not increased, then maybe try another idea for change. 

  • (5) Repeat the process with other small changes as necessary. For example, offer childcare to participants to see if the number of participants increases. After three sessions, determine if the number of participants has increased enough to make this a permanent change. 
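
For programs that track attendance in a simple spreadsheet or list, the Study/Check comparison in steps (3) and (4) can be done with a few lines of code. The sketch below is illustrative only: the attendance numbers and the threshold for calling an increase "substantial" are hypothetical, and the same comparison could just as easily be done by hand.

```python
# Minimal sketch of the Study/Check step: compare average attendance
# before and after one small change. All numbers are hypothetical.
before_change = [6, 5, 7]    # participants at the three sessions before the move
after_change = [11, 9, 12]   # participants at the three sessions after the move

avg_before = sum(before_change) / len(before_change)
avg_after = sum(after_change) / len(after_change)

print(f"Average attendance before the change: {avg_before:.1f}")
print(f"Average attendance after the change:  {avg_after:.1f}")

# Illustrative decision rule: a roughly 50% (or greater) increase would
# support keeping the change; otherwise, try the next idea on the list.
if avg_after >= 1.5 * avg_before:
    print("Attendance increased substantially; consider keeping the change.")
else:
    print("Little change in attendance; consider testing another idea.")
```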

Why do we include CQI in our programming?

  • To create a culture of learning and innovation 

  • To make data-driven program improvements that center the voices of survivors and program staff 

  • To show what the project is doing throughout the funding period 

  • To improve staff morale 

  • To build trust and increase accountability to the community or communities being served 

  • To inform periodic check-ins with the funder