Assessing data capacity
Supporting data-informed decision-making to strengthen Ontario’s
child and youth mental health services

June 2017

Prepared by:
Evangeline Danseco
Kyle Ferguson
Nicole Summers

Acknowledgements
The authors of this report would like to acknowledge the input and feedback of members of the technical subgroup of
the child and youth mental health data strategy working group, project team members Blake Davey, Alejandra Dubois
and Jana Kocourek, colleagues at the Ministry of Children and Youth Services Angela Batra-Jodha and Eli Perrell, and Bill
Gardner, Senior Scientist at the Children’s Hospital of Eastern Ontario Research Institute.

Suggested citation
Danseco, E., Ferguson, K. & Summers, N. (2017). Assessing data capacity: Supporting data-informed decision-making to
strengthen Ontario’s child and youth mental health services. Report to the Government of Ontario, Ministry of Children
and Youth Services. Ottawa, ON: Ontario Centre of Excellence for Child and Youth Mental Health.

TABLE OF CONTENTS
Executive summary
    Background
    Framework for assessing data capacity
    Methods
    Results
    Recommendations and next steps
Background
Summary of the literature
    Initial framework for assessing data capacity of the child and youth mental health sector
        Figure 1. Key components of data capacity assessed among lead agencies
Method
Results
    Data capacity within lead agencies
        A. Infrastructure
        Figure 2. Frequency distribution for responses to Data capacity - Infrastructure (N=29)
        B. Human resources
        Figure 3. Frequency distribution for responses to Data capacity - Human resources (N=29)
        C. Data processes
        D. Decision-making
        Figure 4. Frequency distribution for responses to Data capacity - Data processes (N=29)
    Data capacity in the service area
        Figure 5. Frequency distribution for responses to Data capacity - Decision-making (N=29)
        Figure 6. Frequency distribution for responses to assessment of data capacity in the service area (N=29)
    An updated framework for a continuum of data capacity
        Figure 7. Framework for a continuum of data capacity
Recommended strategies for enhancing data capacity
Summary and next steps
Bibliography
Appendices
    Appendix A: Definitions and rationale for key components of data capacity
    Appendix B: Technical subgroup of the MCYS Data Strategy Working Group
    Appendix C: Data capacity assessment tool - English
    Appendix D: Data capacity assessment tool - French
    Appendix E: Scoring and guidelines for a continuum of data capacity

Executive summary
Background

Through Moving on Mental Health (MOMH), lead agencies are responsible for system planning, performance reporting
and measurement, and effective performance management in their service area. The capacity of the lead agency to use
data is critical in these functions. Core service agencies also need to have sufficient organizational capacity to collect and
report data, and to improve their services.
Data capacity refers to agencies’ ability to have reliable, accurate and timely information for effective decision-
making. For example, data capacity is needed to support the planning of services, to monitor and improve the programs
and services delivered, and to assess their effectiveness.
This report presents a summary of the literature and a framework for assessing data capacity. The report summarizes
results on the areas of strengths and needs on data capacity, and priority areas for improvement from 29 lead agencies
across 31 service areas.

Framework for assessing data capacity

Based on a review of the literature in information management and evaluation capacity building, we assessed data
capacity along four areas:
   1) infrastructure which refers to hardware, software and aspects of the physical environment that support the
        collection and analysis of data;
   2) human resources which refers to the number of dedicated technical staff focused on data, and resources for
        ongoing professional development;
   3) processes relating to data collection, data analysis, quality controls and reporting; and
   4) decision-making which refers to leadership support, organizational culture, the value and knowledge of staff and
        managers on data, and practices to support the use of data for making decisions.

Methods

With input from members of a technical group for the data strategy for Ontario’s child and youth mental health sector,
an assessment tool was developed and deployed to lead agency senior management. Each lead agency team convened
three to nine people from different levels or groups within the organization to provide various perspectives of data
capacity.
The data capacity assessment tool consisted of items in the proposed four areas using a 3-point rating scale (1 =
foundational, 2 = learning, 3 = excelling). Within each rating, specific descriptions were included to provide guidance on
what is considered foundational, learning and excelling. To obtain information on the context of lead agencies within the
four areas, open-ended questions on the challenges, effective strategies and priorities for improvement were included.

Items capturing overall perceptions of the data capacity of the core service agencies within each service area, and
priorities for the service area, were also included.

Results

All 29 lead agencies provided responses to the data capacity assessment resulting in a 100% response rate. A revised
framework based on the qualitative and quantitative data shows five key areas and a continuum of data capacity:
leadership, infrastructure, human resources, data processes and decision-making.
The context of lead agencies defines what investments can be made for infrastructure and human resources: funding
from government and other sources, geographical settings and policies. Leadership support and value for data among
clinicians are essential in influencing an organizational culture of learning. Without these three essential components
(sufficient infrastructure, adequate staffing and leadership support), data processes and the use of data for decision-
making are severely limited. Despite funding challenges, the commitment of lead agencies towards data-driven decision-
making and proactive planning has helped in building data capacity.
Using this framework, agencies were classified along three stages in a continuum of data capacity: 1) strengthening
foundations, 2) enhancing processes and decision-making, and 3) ongoing learning and excelling.
The main priorities for enhancing data capacity include full implementation of the client information systems among
agencies that have transitioned to a new CIS, data integration to maximize automated processes, and enhancing
consistency through standardized definitions, quality controls, and staff training.

Recommendations and next steps

Recommendations were presented for consideration by lead agencies, MCYS and other provincial partners. Results will
need to be validated and the recommendations discussed with all relevant stakeholders to obtain consensus and to
identify concrete actions. These will also need to be integrated into the work currently being identified through the data
and information strategy and the work relating to the implementation of a business intelligence solution. Next steps
include prioritizing areas for improvement at a provincial level, building on effective strategies to scale up across the
province, and assessing the data capacity of core service agencies in collaboration with lead agencies.
The province has embarked on various initiatives to transform the child and youth mental health system. Enhancing the
data capacity of agencies in the child and youth mental health sector is essential in these efforts. Good quality data
matters for making effective decisions to improve services and to achieve optimal mental health outcomes.

Background
Through Moving on Mental Health (MOMH), the Ministry of Children and Youth Services (MCYS) and the agencies
providing mental health services have been working to improve mental health outcomes by enhancing access to care,
coordination of services and supporting youth and family engagement. Lead agencies are responsible for system
planning, performance reporting and measurement, and effective performance management in their service area. The
capacity of the lead agency to use data is critical in these functions. Core service agencies also need to have sufficient
organizational capacity to collect and report data, and to improve their services.
Data capacity refers to agencies’ ability to have reliable, accurate and timely information for effective decision-
making. For example, data capacity is needed to support the planning of services, to monitor and improve the programs
and services delivered, and to assess their effectiveness.
In December 2016, MCYS asked the Ontario Centre of Excellence for Child and Youth Mental Health (the Centre) to lead
an initiative on assessing the areas of strengths and needs of agencies relating to data capacity, and identifying supports
for priority areas for improvement. The project aimed to:
    1) develop a continuum that differentiates between various data capacity levels, and
    2) identify opportunities and strategies for enhancing data capacity in the sector.

This report presents a summary of the literature and a framework for assessing data capacity. The report summarizes
results on the areas of strengths and needs on data capacity, and priority areas for improvement from 29 lead agencies
across 31 service areas.

Summary of the literature
Research literature on evaluation capacity building and on information management capacity was reviewed to identify
areas and methods for assessing data capacity, including measures used to assess data capacity. Search terms included
evaluation capacity, information management capacity, nonprofit organizations, and data-driven or data-informed
decision-making. Literature was limited to those published in English primarily within the past 10 years.
Broadly, data capacity in the evaluation literature refers to the ability to perform high-quality evaluations and use
evaluation findings (e.g. Bourgeois & Cousins 2008; Cousins, Goh, Clark, & Lee, 2003; Stockdill, Baizerman, & Compton,
2002). This definition includes the capacity of an organization’s structures, processes, culture, human capital and
technology to produce evaluative knowledge (Stockdill et al., 2002). An important piece of improving organizational
effectiveness is the ability to sustain the capacity to attain and use evaluative knowledge across several operational
levels (Neilson, Lemire, & Skov, 2011).

The literature on evaluation capacity building includes several conceptual frameworks for assessing an organization’s
current capacity to conduct evaluation, identifying areas to improve, and gauging the ability to use the information
acquired through evaluation (e.g. Bourgeois & Cousins, 2008; Neilson et al., 2011). Despite the varying frameworks
proposed, there is a high degree of consistency between the concepts and components in the literature (Labin et al.,
2012).
Across the literature, barriers associated with enhancing information management and evaluation capacity fall into two
areas: individual and organizational (Labin et al., 2012). Individual barriers refer to the lack of participation from
staff and difficulty applying new knowledge and skills associated with evaluation; the most prevalent individual barrier
was staff attitude (positive or negative) towards evaluation.
The most commonly reported barriers to evaluation and data-informed decision-making are organizational barriers,
most notably the availability of resources for evaluation and leadership support (Bourgeois & Cousins 2013; Cousins et
al., 2008; Labin et al., 2012; Maxwell et al., 2016). The lack of resources and infrastructure often relates to fragmented
and inadequate information systems (Upadhaya et al., 2016). Hence, in the literature on measures to assess information
management or evaluation capacity, organizational elements have been integrated into data capacity assessment such
as leadership support, an organizational culture that supports learning, and a system to assist in disseminating the
evaluative knowledge (Cousins et al., 2008; Maxwell et al., 2016; Preskill & Boyle 2008).
Qualitative research using focus groups and key informant interviews methods (e.g. Andrews et al., 2006; Brandon &
Higga, 2004; Bourgeois & Cousins, 2008; Carman & Fredericks, 2010) has led to the development of measures primarily
around evaluation capacity (Labin et al., 2012; Taylor-Ritzler, Suarez-Balcazar, & Garcia-Iriarte, 2009). Ontario Public
Health had released a comprehensive review of the measures that have been used to assess evaluation capacity
(Ontario’s Public Health Units, 2015). These included tools developed by Bourgeois and colleagues (2008; 2013), Taylor-
Ritzler and colleagues (2013), and an evaluation capacity checklist developed by the Centre of Excellence for its
evaluation grants (Danseco, 2013). These tools focused primarily on capacity to conduct program evaluation rather than
a broader conception of data capacity.
Bourgeois and colleagues (2008; 2013) developed a self-assessment instrument composed of six dimensions, three of
which assess data capacity of the organization. One of these dimensions refers to the capacity of human resources (HR)
currently in the organization. This includes identifying challenges related to staffing resources (e.g. lack of staff, trained
staff, time, or staff resistance or lack of knowledge of the benefits of evaluation). Another dimension assesses current
organizational resources such as allotted financial resources, current ongoing data collection (performance
measurement data), and the organizational infrastructure (e.g. policies, supports). A third dimension examines the
evaluation planning and activities such as the inclusion of a risk assessment, inclusion of evaluation consultants, and
external supports.
Taylor-Ritzler et al. (2013) developed a framework that also included components related to assessing data capacity that
are consistent with those presented by Bourgeois et al. (2008; 2013). The first component examines individual factors
(awareness, motivation, and competence) and the second component examines organizational factors (leadership,
learning climate, and resources). In addition, Taylor-Ritzler and colleagues (2013) also examined the critical “Evaluation
Capacity Outcomes” which included capacity to mainstream evaluation practices into work processes and the intended
or current use of evaluation findings.
Neilson and colleagues (2011) developed a measure around a framework of four dimensions of data capacity. Two
dimensions relate to the need for evaluation: evaluation objectives (e.g., purpose, evaluation framework, and
formalization) and organizational structure and processes (e.g., organizational location and function of the evaluation
unit). The other two dimensions encompass evaluation supply capacity: (1) human capital (e.g., staff evaluation
experience, evaluation training) and (2) evaluation technology and models (e.g., available software, data collection
techniques).
Upadhaya and colleagues (2016) focused on the health information system and data processes utilized by organizations
delivering mental health services in low- and middle-income countries, using document reviews and key informant
interviews. The tool consisted of nine sections: (1) background of the health management information system (HMIS),
(2) plans and policies relating to the HMIS, (3) process of recording and collecting data, (4) monitoring, evaluation and
feedback procedures, (6) dissemination and utilization of data, (7) human resources, (8) availability of mental health
indicators, and (9) coordination and linkages.
Notably, in the study conducted by Maxwell and colleagues (2016), the variability of responses within an organization
was similar to that observed across organizations. Hence, an assessment of organizational characteristics is best
obtained through the input of various members of the organization, preferably from various levels or functions
(Maxwell et al., 2016; Taylor-Ritzler et al., 2013; Upadhaya et al., 2016).
Consideration must be given to the limitations of using self-reports of data capacity. Self-reports alone do not provide
the in-depth information necessary to understand the underlying factors influencing data capacity. It is recommended
that the use of self-assessment data be coupled with key informant interviews or focus groups to gain a deeper
understanding of an organization’s data capacity strengths and challenges (Neilson et al., 2011; Oliva, Rienks & Chavez,
2007).
Qualitative data obtained through third-party evaluators can help mitigate cognitive biases. For example,
organizations with lower capacity do not necessarily know what they don’t know and will tend to overestimate their
capacity, while organizations with higher data capacity will tend to underestimate their relative competence (Critcher &
Dunning, 2009; Kruger & Dunning, 1999). Hence, when assessing data capacity, evaluators must be aware of such biases,
as well as the contextual and cultural factors that greatly affect the data-informed decision-making process.

Initial framework for assessing data capacity of the child and youth mental health sector

Based on the above literature, we assessed data capacity along four areas (see Figure 1):
   1) infrastructure which refers to hardware, software and aspects of the physical environment that support the
        collection and analysis of data;
2) human resources which refers to the number of dedicated technical staff focused on data, and resources for
   ongoing professional development;
3) processes relating to data collection, data analysis, quality controls and reporting; and
4) decision-making which refers to leadership support, organizational culture, the value and knowledge of staff and
   managers on data, and practices to support the use of data for making decisions.

Figure 1. Key components of data capacity assessed among lead agencies

Data capacity
    • Infrastructure: information systems; automated processes; hardware; physical environment
    • Human resources: number and type of technical staff; technical knowledge; organizational structure; professional development
    • Processes: data collection; use of standardized tools; data analysis; data quality controls; data reporting
    • Decision-making: leadership support; value by clinical staff; organizational culture; use of data
For the current project, a measure was developed to assess capacity in these areas and items tailored to the context of
lead agencies. Appendix A summarizes the definition for the areas, rationale from the literature, and subsequent items
in the assessment tool.

Method
From January 2017 to May 2017, the technical subgroup of the Data Strategy Working Group provided MCYS with input
and feedback from lead agency representatives on the work relating to the business intelligence solution (see Appendix
B). During this time, members of this technical group were consulted on the various phases and activities of the data
capacity assessment project such as the areas of data capacity, the definitions and rationale for the assessment tool
items, questions to include in the assessment tool, and methods for obtaining responses.
Appendices C (English version) and D (French version) show the data capacity assessment tool with items in the four
areas using a 3-point rating scale (1 = foundational, 2 = learning, 3 = excelling). Within each rating, specific descriptions
were included to provide guidance on what is considered foundational, learning and excelling. To obtain information on
the context of lead agencies within the four areas, open-ended questions on the challenges, effective strategies and
priorities for improvement were included. Items capturing overall perceptions of the data capacity of the core service
agencies within each service area, and priorities for the service area, were also included.
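Appendix E lays out scoring and guidelines for the continuum of data capacity. As a rough illustration of how such ratings can be rolled up, the sketch below averages hypothetical item ratings within each of the four areas and maps an overall score onto the three continuum stages; the ratings, cut-off values and function names here are illustrative assumptions, not the tool’s actual scoring rules:

```python
from statistics import mean

# Hypothetical item ratings for one lead agency, on the tool's
# 3-point scale (1 = foundational, 2 = learning, 3 = excelling).
ratings = {
    "infrastructure": [2, 3, 2, 2],
    "human_resources": [1, 2, 2],
    "data_processes": [2, 2, 3, 2, 2],
    "decision_making": [3, 2, 2, 3],
}

def area_scores(ratings):
    """Average the item ratings within each of the four areas."""
    return {area: mean(items) for area, items in ratings.items()}

def continuum_stage(overall):
    """Map an overall mean rating to one of the three continuum
    stages. The cut-offs below are illustrative assumptions only."""
    if overall < 1.67:
        return "strengthening foundations"
    if overall < 2.34:
        return "enhancing processes and decision-making"
    return "ongoing learning and excelling"

scores = area_scores(ratings)
overall = mean(scores.values())
print(scores)
print(round(overall, 2), continuum_stage(overall))
```

Because all agencies rated items on a shared 3-point scale, simple averages like these make area profiles comparable across agencies, while the open-ended responses supply the context behind the numbers.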
Data collection began in April 2017 and ended in May 2017. Contact persons from the 29 identified lead agencies in the
31 service areas included the executive director (or identified senior executive in the organization) and the senior
management lead for evaluation, research and/or quality improvement in the agency. Lead agencies in two service
areas have not been identified to date and were therefore not included for this assessment.
Lead agency contacts were instructed to convene a team and respond to the assessment tool as a group. Each lead
agency team had three to nine people from different levels or groups within the organization to provide various
perspectives of data capacity in the organization and in the service area. For example, managers or directors involved in
reporting data, and managers or directors who use and see internal reports or data participated. Staff who collect,
compile or analyze data were also included to provide perspectives from different levels of the organization.
We initially planned to conduct focus groups to supplement information from the assessment tool but this was
discontinued. Responses to open-ended questions were extensive and provided a wealth of information on effective
strategies and priorities for improvements.

Results
All 29 lead agencies provided responses to the data capacity assessment, resulting in a 100% response rate. Results for
each area of data capacity are summarized from both the quantitative and qualitative data, followed by a summary of
results on data capacity in the service area.
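The frequency distributions reported in Figures 2 through 6 are straightforward tallies of the 3-point ratings across the 29 responding teams. A minimal sketch of that tabulation (the ratings below are invented for illustration, not actual survey data):

```python
from collections import Counter

# Invented ratings from 29 agencies for a single item, on the
# 3-point scale (1 = foundational, 2 = learning, 3 = excelling).
item_ratings = [2, 3, 1, 2, 2, 3, 2, 1, 3, 2, 2, 2, 3, 1, 2,
                2, 3, 2, 2, 1, 3, 2, 2, 3, 2, 1, 2, 2, 3]

counts = Counter(item_ratings)
labels = {1: "foundational", 2: "learning", 3: "excelling"}
for rating in sorted(counts):
    n = counts[rating]
    print(f"{labels[rating]:>12}: {n:2d} ({n / len(item_ratings):.0%})")
```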
Respondents noted that the process of conducting the assessment as a group allowed for good discussions and
reflections. They indicated that the discussion confirmed their current capacity, needs and priorities. Respondents also
noted that the assessment tool was lengthy and repetitive, with some questions lacking clarity or focus, and some
ratings not being mutually exclusive. Including questions around privacy and consent was suggested by one lead agency
team.

Data capacity within lead agencies

        A. Infrastructure
Regarding lead agencies’ infrastructure to support data-informed decision-making, about two-thirds of lead agencies
consider their client information system, hardware and connectivity to be sufficient for the present time (see Figure 2).
This is largely due to strategic investments and proactive planning that agencies have made despite ongoing challenges
in funding. The recent financial supports from the Ministry have also contributed to upgrading the infrastructure of
various lead agencies. Respondents noted that investing in infrastructure brings expenses related to acquiring staff
with technical expertise and resources for staff training. Other ongoing challenges include the infrastructure in
agencies providing services in remote, rural and/or geographically dispersed areas, and the delivery of home-based
clinical services.

    “Additional infrastructure resources for [agency] would expedite the work of the entire MOMH initiative. A number
    of our priorities relate to bringing together data systems and access systems, validating and using system-wide
    data more effectively, developing system-wide waitlist management capacity, inter-agency scheduling, using shared
    data to improve service delivery, development of quality improvement plans, etc. These all rest on ensuring
    sufficient data capacity is in place across our service area, and will take significant work in 2017-18 and beyond”

One of the top priorities for improvement among several lead agencies is data integration, so that information from
finance, HR and clinical data can be automated and seamlessly analyzed. Having
a robust business intelligence solution, considering data requirements for multi-service and multi-funded agencies, and
methods for incorporating data from standardized measures and financial data are other priorities for improvement.
Some lead agencies are transitioning into a new client information system, with a priority of full implementation and
integration of the system into their operations this fiscal year. Other top priorities include working towards a secure and
common data platform, replacing or upgrading equipment, enhancing data consistency and integrity, and improving/
integrating reports from various sources.

    Figure 2. Frequency distribution for responses to Data capacity - Infrastructure (N=29)

        B. Human resources
About two-thirds of lead agencies have management positions with technical knowledge, and about two-thirds are
working towards having sufficient technical staff. This area encompasses a broad range of technical staff, including
evaluation, research and information technology (IT) roles. More information is required to identify gaps in specific
technical expertise.
The most commonly cited human resources (HR) priority as it pertains to data capacity was staff training. Specific topics such as training on client information systems (CIS), data analysis and data-driven decision-making were identified as areas of focus, with staff trained on these topics to spread the workload related to managing data. Another priority was staffing levels: agencies are looking to increase either the number of people or the amount of time that can be dedicated to supporting data-informed decision-making. Specifically, lead agencies report difficulties in having enough staff to meet the demands of data work, finding staff time for training, and balancing workloads.

“We are prioritizing increased staff/management time in the area of human resources. Additionally we are going to learn how to better use our new HR system and look at ways that it might improve our quality assurance.”

A related challenge was the recruitment of staff with the required data skills. Finding data-trained staff with experience in health care systems is a particular challenge, as is attracting qualified staff at current salary levels and hiring bilingual or Francophone staff. Staff and supports from the Centre of Excellence were often cited as helpful alternatives to relying on paid external consultants.

Agencies are looking to develop processes related to managing data, such as developing data definitions, setting
standards for data analysis, reviewing and updating existing policies, and ensuring policies and procedures align with
best practices and current regulations. Finally, a noted challenge relates to funding and financial constraints, which
make it difficult to increase staffing levels, hire qualified technical staff, pay for training sessions, and backfill staff who
are attending training.
The biggest barrier to achieving the above priorities will be managing the people-related aspects that accompany them.
Ensuring that all staff are trained and comfortable with using data and the accompanying technology is crucial to
success. Getting everyone trained, and having enough time to do so properly, will be a major challenge, as will winning
people over to a data-driven way of operating.

    Figure 3. Frequency distribution for responses to Data capacity - Human resources (N=29)

        C. Data processes
Data processes relate to data collection, analysis, quality controls and reporting. About two-thirds of agencies consider
their client information systems to provide reliable and accessible information for their managers. For the majority of
the items, lead agencies rated their data processes as in the learning phase (middle range). The only exception is the
item relating to the analysis of program outcomes, where around 40% of lead agencies indicated being at a foundational
phase (i.e. using only frequencies and seldom reporting on pre-post data).
“[Our challenges are] varying levels of technical capability among staff, varying commitment to regular reporting and data input, varying understanding of the importance of data quality and accountability among frontline staff. Increasing data demands make it difficult to have data processes in place prior to reporting.”

The top priority for lead agencies is to develop standardized processes to ensure greater consistency within the agency. The aim is to have higher-quality data that can then be used to drive decision-making. Another priority among agencies is to get their client information systems and databases up and operational. This will provide a repository to collect data and a centralized location from which data can be pulled and queried. One final priority around data processes is managing the culture shift that will need to take place as agencies move to become more data-driven. Explaining the importance of quality data to staff, providing appropriate training, and following through with using data to drive decisions are all areas where the current agency culture will need to shift as part of the transition.
Another significant barrier will be achieving the level of consistency needed to maintain complete and reliable sets of
data. Developing consistent definitions, standards, and processes is crucial to success, but ensuring alignment and
having everyone agree poses a challenge. Once the definitions, standards, and processes are agreed upon, ensuring they
are adhered to will be the next task.

        D. Decision-making
Items in this area referred to leadership support, value of data by clinical staff, organizational orientation to using data,
and data-driven decision-making. Close to 70% of lead agencies indicated that a majority of their management and key
staff championed the value of data, while 7% of lead agencies indicated that they had only a minority of champions.
Most agencies have clinical staff who value data and are collecting data, and most agencies have a strategic focus that
guides improvements. The use of data for system planning was evenly distributed among the three categories of
foundational, learning and excelling.
The top priority lead agencies have regarding using data to drive decision-making is improving data quality. Initiatives
such as stricter data entry requirements, creating efficient procedures, and having a common understanding and
interpretation are the top ways agencies are hoping to increase data quality.

    Figure 4. Frequency distribution for responses to Data capacity - Data processes (N=29)

Another priority is improving the ability to
pull and access data in a timely manner. Client information systems (CIS) will play a key role in accumulating the
necessary data. Nevertheless, agencies have identified that in order to commit to data-driven decision-making, they will
need better ways to access the data being collected. Accessing raw data can be quite complicated: it can involve
additional costs depending on the vendor, and it depends on the expertise available within the agency and ultimately on
the funds available for the vendor and/or staff.

A final priority is developing better tools so data can be used more intuitively to drive decision-making. Specifically, this
means identifying which key performance indicators would be most useful for decision-making and devising ways to
present information in a manner that facilitates it (such as a dashboard).
The biggest challenge to achieving these priorities will be getting staff to buy in to making decisions based on data, and training staff to be able to use data in this manner. Careful analysis is required to ensure the correct conclusions are drawn, so it is imperative that staff understand the value of what they are being asked to do and are sufficiently trained. Training, however, takes time, and finding the time for staff not only to attend training but also to dedicate to the analysis required to use data for decision-making remains a challenge.

“When staff feel like data is informing what they are doing, they are good at collecting it (e.g. BASC2 for group work; MASC for evidence-informed anxiety intervention); where it is a challenge is when they must go into a variety of databases in addition to [the client information system] and separately enter their assessment and treatment plan, and again a separate place to obtain client intake profile information; these multiple databases dilute the commitment to accuracy; there needs to be a closer connection to the data adding value to their work.”

Another challenge regards the quality of existing data, which is often incomplete or inconsistent. Getting good quality data that has been standardized across the organization is seen as an important challenge to overcome in order to use data effectively to drive decisions. A final challenge relates to data literacy among MCYS staff: some lead agencies noted the lack of feedback on data submitted, the lack of information from MCYS on how their data is used for decision-making, and the lack of input on improvements in reporting.

Data capacity in the service area

Lead agency teams with core service agencies were asked to rate the overall data capacity in their service area along the
four major areas of data capacity (infrastructure, human resources, processes and decision-making). Lead agencies with
no core service provider agencies indicated “not applicable” in their responses to these items.
In general, results show that data capacity is in the middle range and lead agencies are working towards sufficient
capacity (“learning”) in these areas. Close to 40% of lead agencies are in the foundational stage in terms of their human
resource capacity in their service area. A similar percentage of lead agencies (41%) indicated a basic understanding of
data and limited organizational supports among core service agencies.
None of the lead agencies indicated having sufficient human resources for technical staff in their service area. Similarly,
none of the lead agencies indicated having sufficient data processes within their service area (relating to data collection,
quality control, data analysis and reporting).

Figure 5. Frequency distribution for responses to Data capacity - Decision-making (N=29)

Figure 6. Frequency distribution for responses to assessment of data capacity in the service area (N=29)

An updated framework for a continuum of data capacity
The quantitative and qualitative results both showed the interconnections among the four components of data capacity
that we initially identified. Figure 7 presents a revised framework based on the results, with five rather than four areas:
    •   Funding from the government and other sources is essential, and was often mentioned among facilitators and
        barriers. Data governance, and contextual factors such as policies and geographical settings, are also important.
    •   This context strongly influences an agency’s capacity to invest in infrastructure and human resources.
        Leadership support and clinicians’ valuing of data are key, and influence an organizational culture for learning.
    •   When the above elements are in place, consistent and reliable data processes can be enhanced, which in turn
        supports data-informed decision-making.

                Figure 7. Framework for a continuum of data capacity

    (Layered diagram: funding, data governance and context form the base; infrastructure, leadership and human
    resources sit above this base; data processes build on these; and data-informed decision-making is at the top.)

Based on this revised framework, we updated the scoring to reflect scores in five areas. A leadership score was
calculated from three of the original six items in the decision-making area, and the decision-making score was
re-calculated without these items. In addition, responses to the open-ended question on the main strengths in data
capacity were coded into each of the five areas (leadership, infrastructure, HR, data processes and use of data for
decision-making); respondents who mentioned concepts relating to leadership support and organizational learning
were coded as having leadership strength. Appendix E describes the scoring procedures, analysis and the range of
ratings.
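As a rough illustration, the re-scoring described above could be implemented along the following lines. This is a sketch under stated assumptions only: the item labels (`dm1`–`dm6`), the choice of which three items form the leadership score, and the 1–3 rating scale are hypothetical; the actual items and scoring rules are those documented in Appendix E.

```python
# Illustrative sketch of the re-scoring described above.
# Item names and the 1-3 rating scale (foundational/learning/excelling)
# are hypothetical; the actual procedure is documented in Appendix E.

DECISION_MAKING_ITEMS = ["dm1", "dm2", "dm3", "dm4", "dm5", "dm6"]
LEADERSHIP_ITEMS = ["dm1", "dm2", "dm3"]  # assumed subset of three items

def rescore(agency_ratings: dict) -> dict:
    """Split the original six decision-making items into a leadership
    score and a recalculated decision-making score (mean of item ratings)."""
    lead = [agency_ratings[i] for i in LEADERSHIP_ITEMS]
    dm = [agency_ratings[i] for i in DECISION_MAKING_ITEMS
          if i not in LEADERSHIP_ITEMS]
    return {
        "leadership": sum(lead) / len(lead),
        "decision_making": sum(dm) / len(dm),
    }

# Example: an agency rated 1 (foundational) to 3 (excelling) on each item
ratings = {"dm1": 3, "dm2": 2, "dm3": 3, "dm4": 1, "dm5": 2, "dm6": 2}
print(rescore(ratings))  # leadership ≈ 2.67, decision_making ≈ 1.67
```

Averaging item ratings is only one plausible aggregation; the report's actual procedure may weight or sum items differently.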
A continuum of data capacity emerged, based primarily on ratings in the three basic areas (infrastructure, HR and
leadership, the bottom part of Figure 7), followed by ratings in data processes and decision-making:
    •   Strengthening foundations – ratings reflect capacity at the foundational stage in two of the three basic areas
        (infrastructure, HR and leadership). There is a tendency to focus on strengthening the organization’s capacity
        to have expert staff.
    •   Enhancing processes and decision-making – ratings reflect strong foundations in infrastructure, HR and
        leadership (i.e. ratings in the learning and/or excelling phase), and the focus is on strengthening data processes
        and/or the use of data for decision-making (i.e. the top part of Figure 7). Either data processes or decision-making
        was rated as foundational.
    •   Ongoing learning and excelling – ratings also reflect strong foundations in infrastructure, HR and leadership; in
        addition, data processes and decision-making were rated at the learning or excelling level.
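The decision rule behind this continuum can be sketched as follows. The sketch assumes each of the five areas has already been summarized into one of the three phase labels; the actual scoring and cut-offs are those described in Appendix E, and the area and phase names used here are illustrative.

```python
# Illustrative sketch of the continuum classification described above.
# Assumes each area has been summarized as one of three phase labels;
# the actual scoring and cut-offs are documented in Appendix E.

FOUNDATIONAL, LEARNING, EXCELLING = "foundational", "learning", "excelling"
BASIC_AREAS = ("infrastructure", "human_resources", "leadership")

def continuum_stage(phases: dict) -> str:
    """Classify an agency into one of the three continuum stages from
    its phase label in each of the five data capacity areas."""
    basic_foundational = sum(phases[a] == FOUNDATIONAL for a in BASIC_AREAS)
    if basic_foundational >= 2:
        return "Strengthening foundations"
    if FOUNDATIONAL in (phases["data_processes"], phases["decision_making"]):
        return "Enhancing processes and decision-making"
    return "Ongoing learning and excelling"

stage = continuum_stage({
    "infrastructure": LEARNING,
    "human_resources": EXCELLING,
    "leadership": LEARNING,
    "data_processes": FOUNDATIONAL,
    "decision_making": LEARNING,
})
print(stage)  # Enhancing processes and decision-making
```

A rule-based classification such as this is transparent and auditable, which matters when results are fed back to agencies for validation.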

        Limitations
A sample size of 29 does not lend itself to sophisticated data analytic methods such as cluster analysis to empirically
determine these types of data capacity. The reliability indices of the five areas varied considerably and the assessment
tool will need to be revised to enhance clarity and minimize repetitiveness. A larger sample size is also needed to
validate these areas using factor analysis and reliability analyses. As noted in the literature review section, some
agencies may have either under- or over-estimated their capacities due to cognitive biases (Critcher & Dunning, 2009;
Kruger & Dunning, 1999). Adding items that reflect more specific uses of data for decision-making may be needed (e.g.
capacity to report and analyze average number of sessions for a program), as well as potential document reviews (e.g.
examples of reports or information).
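For readers unfamiliar with the reliability indices mentioned above, a common choice for internal consistency is Cronbach's alpha, which can be computed directly from the item ratings within an area. The sketch below uses made-up ratings; it is not the report's data, and the actual reliability analysis is described in Appendix E.

```python
# Hedged sketch: Cronbach's alpha for the items in one data capacity
# area. The item ratings shown are invented for illustration; the
# report's actual reliability analysis is described in Appendix E.

def cronbach_alpha(items):
    """items: one list of scores per item, each of length n_respondents.
    Returns Cronbach's alpha using population variances."""
    k = len(items)
    n = len(items[0])
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_var_sum = sum(var(item) for item in items)
    totals = [sum(item[j] for item in items) for j in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Three hypothetical items rated 1-3 by five agencies
items = [
    [1, 2, 3, 2, 3],
    [1, 2, 3, 3, 3],
    [2, 2, 3, 2, 3],
]
print(round(cronbach_alpha(items), 2))  # 0.9
```

With only 29 respondents, such coefficients have wide confidence intervals, which is consistent with the limitation noted above.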

Recommended strategies for enhancing data capacity
“It is a pleasant surprise to know that our decisions about infrastructure have culminated into having built an adequate, stable system. It is affirming to know that our decision to have two positions dedicated to data and quality improvement will be key to growing our capacity. This process reinforced that our technology tools are strong while we fall short re: staffing, time, and training. Truly, a lack of human resources is the dominant theme regarding our data capacity.”

The recommendations below are based on the updated framework and the analysis of results from the data capacity assessment of the 29 lead agencies. These are primarily recommendations that lead agencies, MCYS and other provincial partners will need to reflect and act upon, rather than recommendations for individual lead agencies. They will need to be validated and discussed by all relevant stakeholders to obtain consensus and to identify how to move forward, and integrated into the work currently being identified through the data and information strategy (led by the lead agency technical and executive directors’ group, with Ministry and Centre representation).

       1. Consider funding directed to enhancing the complement of technical staff.
As MCYS supports the sector through the implementation of a business intelligence solution, including the involvement
of client information system vendors and the refinement of parameters for the business architecture, agencies will need
time and resources to fully implement changes and train staff. Resources are needed to ensure that lead agencies have
sufficient technical staff complement to support the use of data for lead agency functions in system planning, system
coordination and performance measurement. Several lead agencies noted the usefulness of system management funds
that helped in enhancing the infrastructure for the organization and the service area.
Examples from other sectors can be relevant in determining the models and staffing proportions needed for each service
area or region. Further investigation into what works well in the Ontario context will be needed. For example, a regional
or sub-regional model for data analysis coordinators may be a model that MCYS and the lead agencies can consider.
More information can be obtained from lead agencies as to specific human resource needs and levels of expertise
required (i.e. IT, data analysts, research assistants, quality improvement or evaluation personnel).

       2. Ensure leadership support and value for data are in place so that a majority of key managers and staff
          champion the use of data, particularly for those agencies in the foundational and learning phases.
The literature we reviewed and the results from the data capacity assessment underscored the central role of leadership
in setting the directions, policies and practices within the agency. Several lead agency respondents noted their proactive
planning and strategic decisions relating to investments towards infrastructure and hiring of staff. Leadership support
from all levels of the organization is important in setting the culture towards learning. A paradigm shift will be needed so
that staff and managers can use data to support decision-making. Enhancing data literacy is also important among
Ministry staff so that they can understand the value of data and be able to support lead agencies in improving their use
of data for decision-making.
The Centre of Excellence can assist in following-up with the lead agencies and developing tailored approaches to
enhance leadership and clinician support for using data. The Centre can also work with Ministry staff to identify ways of
enhancing data literacy in their work with lead agencies.

       3.   Establish consistent and robust data definitions, business rules and data processes.
A prominent theme related to the standardization and consistency of various processes, such as the collection, analysis
and reporting of outcome measures, the implementation of business rules within client information systems, data
audits and MCYS performance indicators. MCYS has already begun this work through its implementation of the business
intelligence solution and through CIS vendors’ updates to their systems this fiscal year.
There will likely be ongoing work on various processes that will need to be standardized across all lead agencies so that
aggregate data can be more accurate and useful. Communication to all agencies is also essential, and detailed
documents providing specific guidance on data definitions, business rules and data processes will need to be shared
broadly.

       4. Support staff training activities to enhance reliability, accuracy and consistency in processes.
Many lead agencies have already embarked on their priorities for enhancing the reliability and consistency of their
internal processes. As changes emerge from the work relating to the BI solution, there will be opportunities to leverage
these individual lead agency activities into more widespread common training across lead agency staff, to truly ensure
consistency across the system.
Relevant stakeholders within the Ministry, lead agencies and the Centre will need to develop and implement a plan,
considering the complexities within each service area. Specific processes for performance indicators can likely be
prioritized, with “quick wins” and less complicated indicators tackled first to gain momentum.

       5. Make good quality data matter.
Relevant and timely data should form the basis of decision-making at the program, agency, service area and system
levels. Accountability mechanisms should be designed around valid, easy-to-understand indicators that reflect real
system performance.
The ongoing engagement of clinicians, evaluators and researchers in lead agencies, as well as key stakeholders such as
developers of standardized measures, vendors and Ministry staff, needs to continue so that relevant indicators are
monitored. The work underway on consistent definitions of service types and operational definitions of specific
indicators is a step in the right direction.

Summary and next steps
This report presents results from a data capacity assessment of 29 lead agencies across 31 service areas as of spring
2017. The assessment was structured around four areas, based on a review of the literature on information
management and evaluation capacity building, and on input from various technical experts in Ontario’s child and youth
mental health sector.

A revised framework based on the qualitative and quantitative data shows five key areas: leadership, infrastructure,
human resources, data processes and decision-making. Without sufficient infrastructure and adequate staffing, data
processes and the use of data for decision-making are severely limited. Despite funding challenges, the commitment of
lead agencies towards data-driven decision-making and proactive planning have helped in building data capacity. Main
priorities for enhancing data capacity include full implementation of the client information systems among agencies that
have transitioned to a new CIS, data integration to maximize automated processes, and enhancing consistency through
standardized definitions, quality controls, and staff training.

Data governance, which includes data sharing agreements and the handling of privacy and consent, was not included in
the current assessment. The assessment of data capacity in core service agencies and in the service area in general was
based on lead agency respondents and will need to be further examined and validated against core service agency and
Ministry perspectives. The assessment tool itself will need to be revised to minimize repetitiveness, enhance clarity and
improve reliability.
Other next steps include:
    •   Validate the summary of each agency’s results and make needed adjustments
    •   Conduct knowledge mobilization activities on the use of results of this report (e.g., webinars), focusing on
        prioritizing areas for improvement at a provincial level
    •   Consult with the data and information management strategy working group, lead agency consortium, the
        partnership table and MCYS on the results and next steps
    •   Analyze the data based on the region and/or service area size, and conduct further analysis of effective
        strategies and identify opportunities for scaling up, and
    •   Assess data capacity of core service agencies, in collaboration with lead agencies.

The province has embarked on various initiatives to transform the child and youth mental health system. Enhancing the
data capacity of agencies in the child and youth mental health sector is essential in these efforts. Good quality data
matters for making effective decisions to improve services and to achieve optimal mental health outcomes.

Bibliography
Bourgeois, I., & Cousins, J. B. (2008). Evaluation capacity building through profiling organizational capacity for
evaluation: An empirical examination of four Canadian federal government organizations. Canadian Journal of Program
Evaluation, 23(3), 127–146. Retrieved from http://www.eric.ed.gov/ERICWebPortal/recordDetail?accno=EJ935340
Bourgeois, I., & Cousins, J. B. (2013). Understanding dimensions of organizational evaluation capacity. American Journal
of Evaluation, 34(3), 299–319. https://doi.org/10.1177/1098214013477235
Brandon, P. R., & Higa, T. A. F. (2004). An empirical study of building the evaluation capacity of K-12 site-managed
project personnel. The Canadian Journal of Program Evaluation, 19(1), 125–142.
Carman, J. G., & Fredericks, K. A. (2010). Evaluation capacity and nonprofit organizations: Is the glass half-empty or half-
full? American Journal of Evaluation, 31(1), 84–104. https://doi.org/10.1177/1098214009352361
Critcher, C., & Dunning, D. (2009). How chronic self-views influence (and mislead) self-assessments of task performance:
Self-views shape bottom-up experiences with the task. Journal of Personality and Social Psychology, 97, 931–945.
https://doi.org/10.1037/a0017452
Danseco, E. (2013). Evaluation capacity checklist. Ottawa, ON: Ontario Centre of Excellence for Child and Youth Mental
Health. (Available from the author at edanseco@cheo.on.ca.)
Hanlon, C., Luitel, N. P., Kathree, T., Murhar, V., Shrivasta, S., Medhin, G., … Prince, M. (2014). Challenges and
opportunities for implementing integrated mental health care: A district level situation analysis from five low- and
middle-income countries. PLoS ONE, 9(2), e88437. https://doi.org/10.1371/journal.pone.0088437
Information and management capacity check: Tool and methodology. (n.d.). Library and Archives Canada. Retrieved from
http://collectionscanada.ca/obj/007002/f10/007002-2008-e.ppt
Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence
lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134.
Labin, S. N. (2014). Developing common measures in evaluation capacity building: An iterative science and practice
process. American Journal of Evaluation, 35(1), 107–115. https://doi.org/10.1177/1098214013499965
Labin, S. N., Duffy, J. L., Meyers, D. C., Wandersman, A., & Lesesne, C. A. (2012). A research synthesis of the evaluation
capacity building literature. American Journal of Evaluation, 33(3), 307–338.
https://doi.org/10.1177/1098214011434608
Maxwell, N. L., Rotz, D., & Garcia, C. (2016). Data and decision making: Same organization, different perceptions;
different organizations, different perceptions. American Journal of Evaluation, 1–23.
https://doi.org/10.1177/1098214015623634


Nielsen, S. B., Lemire, S., & Skov, M. (2011). Measuring evaluation capacity: Results and implications of a Danish study.
American Journal of Evaluation, 32(3), 324–344. https://doi.org/10.1177/1098214010396075
Oliva, G., Rienks, J., & Chavez, G. F. (2007). Evaluating a program to build data capacity for core public health functions in
local maternal child and adolescent health programs in California. Maternal and Child Health Journal, 11(1), 1–10.
https://doi.org/10.1007/s10995-006-0139-2
Ontario’s Public Health Units. (2015). Building evaluation capacity in Ontario’s public health units: A locally driven
collaborative project. Retrieved from
https://www.publichealthontario.ca/en/eRepository/Building_Evaluation_Capacity_Final_LDCP_2015.pdf
Preskill, H., & Boyle, S. (2008). A multidisciplinary model of evaluation capacity building. American Journal of Evaluation,
29(4), 443–459. https://doi.org/10.1177/1098214008324182
Preskill, H., & Mack, K. (2013). Building a strategic learning and evaluation system for your organization. FSG.
Stockdill, S., Baizerman, M., & Compton, D. (2002). Toward a definition of the ECB process: A conversation with the ECB
literature. New Directions for Evaluation, 93, 7–26. https://doi.org/10.1002/ev.39
Taylor-Ritzler, T., Suarez-Balcazar, Y., Garcia-Iriarte, E., Henry, D. B., & Balcazar, F. E. (2013). Understanding and
measuring evaluation capacity: A model and instrument validation study. American Journal of Evaluation, 34(2), 190–
206. https://doi.org/10.1177/1098214012471421
