
Using Data and Sustaining Your Program


Of course, JDTC teams should monitor and track program completion and termination. However, teams should also routinely collect other data that can drive ongoing program improvement and sustainability. For example, JDTC teams are encouraged to collect and aggregate data on family functioning, general recidivism, substance use, educational enrollment, sustained employment, involvement in prosocial activities, and youth-peer associations. Using data to inform service delivery is often called continuous quality improvement (CQI). CQI is distinct from evaluative research, in which an outside entity examines your program and assesses how well it achieves its goals and produces specific outcomes of interest. CQI, by contrast, is the process of collecting, aggregating, and analyzing data early, often, and in multiple ways to determine whether service delivery and other program activities are occurring as intended. JDTC teams can use agreed-upon benchmarks to track and monitor data, make key improvements, and report progress on the JDTC mission. Use the tips, questions, answers, and resources found here to use data effectively for quality improvement and sustainability. 

Tips for Implementation 

Using Data

  • Review “7 (Easy) Steps to Measuring Performance of Juvenile Drug Courts” in 7 Articles with 7 Easy Steps to Improving Your Juvenile Drug Court (pages 7-13) for information about drawing on JDTC mission statements to draft operational objectives and performance measures that can inform continuous quality improvement.
  • Use the “Develop goals” (pages 98-102) and “Write objectives” (pages 198-201) worksheets found in the Starting a Juvenile Drug Court: A Planning Guide to draft mission-driven goals and operational objectives that JDTC teams can monitor to ensure successful service delivery. 
  • Draft a process to determine whether the JDTC team is achieving its proposed goals and objectives by using the “Build a system to monitor the program” worksheet (pages 202-205) found in the Starting a Juvenile Drug Court: A Planning Guide.


Funding for JDTCs comes from many places. Some sources are stable (e.g., state supreme courts, the administrative office of the courts, or county court systems). Others are based on requests or applications (e.g., state block grants) to entities whose selected recipients and dollar amounts vary, so the funding cannot be relied upon every year. Relying on funding requests or applications can put JDTCs in a precarious position and affect the services and interventions they are able to provide to youth and families. 

  • JDTC teams should tap into support from a steering committee to map out funding sources, indicating which sources are stable and which vary year to year or by the amount awarded. This exercise can help the team strategically plan to augment funding sources that support treatment services, salaries, incentives, or prosocial activities. 
    • Even if your JDTC team is fully operational, use the “Develop a start-up budget” worksheets found in the Starting a Juvenile Drug Court: A Planning Guide to plan for specific budget categories like drug testing and transportation (pages 211-214). 
    • “Develop a five-year plan” using the worksheets found in the Starting a Juvenile Drug Court: A Planning Guide to help the team consider ways to institutionalize JDTCs in your state or community (e.g., advocating for legislation to fund JDTCs) (pages 215-217). 
  • JDTC teams should think about sustainability broadly, meaning it is more than finding funding to support salary and treatment services. Sustainability can include building strong community partnerships to provide services needed for youth and families. 

Frequently Asked Questions 

If we don’t have an evaluator on the team, how can we get help to collect and analyze our data?

Many teams do not have the funds or resources to engage an evaluator in their team planning or operations. Consider reaching out to a local college or university to see if there are graduate-level students in social science programs (e.g., criminal justice, social work) who need practical experience or internships. These students can provide ongoing data support to the JDTC team, even if they only work for a semester or two, as long as there are structured processes and procedures in place to follow. 

  • Use the Academic Partnerships technical assistance brief for specific guidance on working with local colleges and universities (Note: this technical assistance brief is currently being approved by OJJDP; check back soon for the approved version).
  • Request specific technical assistance to collect and analyze JDTC programmatic data. Many national providers have federal funding to support this type of request; visit TTA360 to submit a formal request to a technical assistance provider funded by the Office of Juvenile Justice and Delinquency Prevention. 

How can we share aggregated data with our steering committee, community partners, and youth and families? 

Consider developing a process to draft a quarterly report to easily share aggregated data with stakeholders, partners, funders, and youth and families. Think of these quarterly reports as “JDTC Report Cards,” which can be compiled based on the data that are already collected and used for program improvement and decision-making. 

  • NCJFCJ recommends working as a team to follow the guidance found in Using Report Cards to Share Juvenile Drug Treatment Court Program Data: A Technical Assistance Brief, which provides steps for producing quarterly report cards, tips on data points to collect, and guidance on analyzing the aggregated data (Note: this technical assistance brief is currently being approved by OJJDP; check back soon for the approved version).   
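To make the report-card idea concrete, here is a minimal sketch of aggregating participant records into a quarterly summary. The record fields (status, drug test counts) and values are invented for illustration; teams would substitute the data points they already collect.

```python
# Hypothetical sketch: rolling participant records up into a simple
# quarterly "report card" summary. Field names are illustrative only.
from collections import Counter

participants = [
    {"status": "active", "drug_tests_negative": 10, "drug_tests_total": 12},
    {"status": "completed", "drug_tests_negative": 18, "drug_tests_total": 20},
    {"status": "terminated", "drug_tests_negative": 3, "drug_tests_total": 9},
]

def quarterly_report_card(records):
    """Aggregate individual records into program-level numbers."""
    statuses = Counter(r["status"] for r in records)
    negative = sum(r["drug_tests_negative"] for r in records)
    total = sum(r["drug_tests_total"] for r in records)
    return {
        "enrolled": len(records),
        "completed": statuses["completed"],
        "terminated": statuses["terminated"],
        "negative_test_rate": round(negative / total, 2) if total else None,
    }

print(quarterly_report_card(participants))
```

Because the summary reports only aggregated counts and rates, it can be shared with the steering committee, community partners, and families without exposing individual-level information.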

Collecting information on family functioning seems very subjective. How can JDTC teams best go about collecting this data in an objective way? 

Yes, “family functioning” can be very subjective, and it can be difficult to collect quantitative data on progress toward a goal of improved family functioning. However, there are ways to collect and analyze both quantitative and qualitative data to determine whether the services and interventions provided in a JDTC are improving family functioning. Consider using the examples below to begin this analysis: 

Quantitative Data

  • Isolate the family-related domain in the risk/need assessment used during intake and again during reassessments to identify whether improvements have been made in that domain. For example, the Youth Level of Service/Case Management Inventory (YLS/CMI) has 8 domains that predict a youth’s risk to re-offend. The second domain, Family/Parenting, has a low (0-2), moderate (3-4), and high (5-6) range. This domain is dynamic, meaning it is potentially changeable if the right services or interventions are put in place to support positive outcomes (i.e., improved family functioning). Therefore, if a youth scores 6 (high) in the Family/Parenting domain at intake and evidence-based services that target family dynamics are provided through comprehensive case management, we can expect that score to go down at reassessment. This becomes a quantitative data point that can show improvements in family functioning. If reductions in that domain are not occurring as expected, this is a signal to figure out why: (1) meet with your treatment providers to determine whether they are using evidence-based modalities with fidelity; (2) aggregate and review attendance data to determine whether youth are consistently attending; or (3) conduct a survey or focus group with current and past participants to determine whether they are engaging with treatment providers in an authentic way.
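The intake-versus-reassessment comparison above can be sketched in a few lines. The youth identifiers and scores below are invented; a score change of zero or more flags a youth for the follow-up steps described above.

```python
# Illustrative sketch: change in the YLS/CMI Family/Parenting domain
# between intake and reassessment. All scores here are invented.
intake = {"A01": 6, "A02": 5, "A03": 3}
reassessment = {"A01": 4, "A02": 5, "A03": 1}

def domain_changes(before, after):
    """Per-youth score change; a negative value means improvement."""
    return {yid: after[yid] - before[yid] for yid in before if yid in after}

changes = domain_changes(intake, reassessment)
not_improving = [yid for yid, delta in changes.items() if delta >= 0]
print(changes)        # {'A01': -2, 'A02': 0, 'A03': -2}
print(not_improving)  # youth to follow up on: ['A02']
```

Aggregating these per-youth changes over time (e.g., average reduction per quarter) turns the domain into a CQI benchmark the team can monitor.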

Qualitative Data

  • Work with your treatment provider to develop questions and a methodology to survey current and past participants regarding their experiences with the services/interventions. Consider developing several types of questions to better understand these experiences: 
    • Use scales that gauge agreement with certain performance measures. For example: On a scale of 1 to 5 (5 being total agreement), how much do you agree with this statement: “The treatment provider’s approach made me feel comfortable.” 
    • Include questions related directly to the skills and concepts that youth and families were practicing during treatment sessions to gauge increases in knowledge and skills.  
    • Ask one or two open-ended questions regarding participants’ thoughts and feedback about the services/interventions provided, then code/theme the data to draw some qualitative conclusions. This type of qualitative feedback is important for gathering and using youth and family voices for improvement purposes. 
      • Consider using qualitative analysis software for this type of coding to support accuracy and reliability (e.g., NVivo, ATLAS.ti).