Underserviced Area Program - June, 2010 - Report


Recommendations for Baseline Data Collection Requirements for the Development of Specific Performance Measurements


Final Report

Prepared by
Elizabeth F. Wenghofer, PhD
&
John C. Hogenbirk, MSc

30 June 2010



1. Introduction

The Centre for Rural and Northern Health Research (CRaNHR) at Laurentian University, with the involvement of the Ontario Health Human Resources Research Network (OHHRRN), conducted a research study to provide the Ontario Ministry of Health and Long‐Term Care (MOHLTC) with a list of baseline metrics and preliminary performance measures suitable for an evaluation of the revised Underserviced Area Program (UAP). This document focuses on identifying the data elements that need to be collected at the rollout of the revised UAP so as to facilitate future evaluation activities.

 

2. Goals of the Revised Underserviced Area Program (UAP)

  • To meet the unique needs of northern and rural communities that face chronic problems recruiting and retaining physicians.
  • To help all Ontario communities that experience physician shortages.

The revised UAP comprises two programs: (1) the Northern and Rural Physician Incentive Fund (NaRPIF) and (2) the new Return of Service (ROS) program. The study identified baseline data and data sources that can be used to develop measures of program performance and success. These baseline data, metrics and indicators were chosen to allow for the future evaluation of the NaRPIF and new ROS programs with regard to the following research questions:

  1. How have the NaRPIF and new ROS changed: (a) the number of physicians providing services in eligible communities;*† and (b) the nature of physician services in eligible communities?
  2. Have the NaRPIF and new ROS encouraged participating physicians to stay in these communities?


3. Guiding Principles for the Choice of Data and Metrics

The choice of baseline data and metrics is based on background information on retention derived from a comprehensive 2009 literature synthesis1 and study2 completed by the Australian Primary Health Care Research Institute. Choices were also informed by a recently published Cochrane review5 that assessed “interventions for increasing the proportion of health professionals practising in rural and other underserved areas” and looked at both recruitment and retention. These references, and the literature cited therein, provided the theoretical basis for the choice of data and performance measures.

Choices were also informed by standard evaluation practices such as those outlined in Ardal and colleagues (2008).3 Specific use is made of the Results‐Based Management (RBM) evaluation approach.7 The RBM framework advocates:

  1. Defining the expected results of different program components and of the program as a whole (ensuring that there is clarity about what success will look like);
  2. Identifying ways to measure progress towards achieving expected results;
  3. Measuring actual results, and progress towards longer term results; and
  4. Using information to redirect implementation/operations, and to provide lessons for future decision making.

Dimensions of the RBM evaluation approach are described in subsequent sections. Please note that the use of RBM does not preclude the use of other specific evaluation approaches, such as the Health Impact Assessment approach outlined in Appendix C of Ardal and colleagues (2006).4 The proposed metrics and approach are also consistent with the alignment of strategies and measurements described in a recent paper by Veillard and colleagues (2010).6

It is important that performance measurements take into consideration the potential covariate effects of other factors (i.e., physician professional and social factors, organizational factors and community factors), which may affect our interpretation of the success of the UAP initiatives. To address this, our evaluation approach focuses on examining the effects of the professional and personal characteristics of individual physicians, organizational/practice structures, and community characteristics.

3.1. Guiding Questions for Structuring Evaluation Planning

Key UAP Outcomes:

  • Recruitment—How did the UAP programs affect physician recruitment to communities? Recruitment involves the attraction and selection of physicians to a particular community, organization or role and is a necessary prerequisite for retention.2
  • Turnover—How did the UAP programs affect physician turnover in communities? Turnover refers to the number of physician departures in a specified time period divided by the number of active physicians in the same category.1
  • Retention—How did the UAP programs affect physician retention in communities? Retention refers to the time between engagement to a service and departure from that service, and thus is a measure of the length of service in a specific location.1


Factors Affecting UAP Outcomes:

  • What physician professional factors, social factors, organizational factors and community factors affect the outcomes of the UAP programs and what is the magnitude of this effect?

Professional factors are those attributes that a physician “brings with them” to any practice environment (e.g., years in practice, international medical graduate status, certifications).8

Social factors include social, economic and demographic factors such as age, gender, marital status, etc.1,2

Organization factors include characteristics of the immediate practice setting in which the physician works. These features may change if a physician moves from one location to another (e.g., solo practice, FHT, episodic care practice/walk‐in clinic, expected work hours per week). 8

Organizational factors also include available health care infrastructure (e.g., hospitals, after hours clinics), supporting services (e.g., diagnostic, pharmaceutical and rehabilitation services) and consultation opportunities (e.g., access to specialists) as well as opportunities for professional development, locum relief, etc.2

Community factors are intended to provide a snapshot of characteristics associated with the broader community in which a physician’s practice is situated. These include factors such as physician‐population ratio, the level of rurality of the community (e.g., RIO score), and community resources (e.g., recreational and cultural infrastructure).2,8

3.2. Level of Analysis

The proposed performance measures for the revised UAP focus on the community and physician levels.
The revised UAP comprises two programs:

  1. Northern and Rural Physician Incentive Fund (NaRPIF), and;
  2. The new Return of Service (ROS) program.

For the sake of simplicity, we refer to both programs collectively as the UAP, differentiating between the NaRPIF and new ROS as necessary. However, we recommend that the proposed evaluation be completed for each program separately so that the impact of each program can be measured.

At the community level, there are four study groups of interest:

  1. Communities that were designated as an underserviced area (UA) under the former UAP and are still designated as a UA under the revised UAP (UA ALWAYS)
  2. Communities that were designated as a UA under the former UAP but are no longer designated as a UA under the revised UAP (UA FORMER)
  3. Communities that were not designated as a UA under the former UAP but are newly designated as a UA under the revised UAP (UA NEW)
  4. Communities that were not designated as a UA under either the former UAP or the revised UAP (UA NEVER)

At the physician level, there are two§ study groups of interest:

  1. Physicians participating in the UAP incentives (including both NaRPIF and ROS)
  2. Physicians not participating in the UAP incentives (including both NaRPIF and ROS)

Wherever possible, the four community types and two physician groups should be compared to determine the impact of the changes in UAP eligibility on physician recruitment and retention.
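The assignment of communities to the four study groups can be made mechanical. The sketch below, with hypothetical community names and designation flags, maps each community's two designation statuses to its study group:

```python
def community_group(former_ua, revised_ua):
    """Map a community's designation under the former and revised UAP
    to one of the four study groups defined above."""
    if former_ua and revised_ua:
        return "UA ALWAYS"
    if former_ua:
        return "UA FORMER"
    if revised_ua:
        return "UA NEW"
    return "UA NEVER"

# Hypothetical (former_ua, revised_ua) flags for four communities
communities = {"A": (True, True), "B": (True, False),
               "C": (False, True), "D": (False, False)}
groups = {name: community_group(*flags) for name, flags in communities.items()}
```

Deriving the group from the two designation flags, rather than storing it directly, keeps the grouping consistent if a community's designation is later corrected.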

3.3. Possible Evaluation Approach and Data Collection Strategies

A combination of formative and summative evaluation approaches should provide information throughout the course of the UAP programs. Formative evaluation strategies are ongoing, providing information on program activities from implementation onward, throughout the administration of the program.3 Summative evaluation strategies are primarily used to evaluate program success at the end of the program or at specific milestones during a program’s administration. To gain the important information that both formative and summative evaluation approaches provide, both qualitative and quantitative data are required and need to be collected at different points throughout the life of a program. We recommend that the key data collection points include:

  1. Application to the UAP programs
  2. During the program at
    1. 2‐6 months
    2. End of Years 1, 2 and 3
    3. 2 months prior to end of program
  3. Follow‐up after completion of program
    1. 2‐6 months after program completed
    2. 2‐5 years after program completed
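The schedule above can be generated per participant from the program start and end dates. The sketch below is illustrative only: the exact day chosen within each window (e.g., 2‐6 months) is an assumption, and a real system would also handle edge cases such as a February 29 start date.

```python
from datetime import date, timedelta

def collection_points(start, end):
    """Generate recommended data collection dates for one participant,
    keyed by milestone label (offsets within windows are illustrative)."""
    points = {
        "application": start,
        "early_check": start + timedelta(days=90),        # within 2-6 months
        "pre_exit": end - timedelta(days=60),             # 2 months before end
        "followup_short": end + timedelta(days=90),       # 2-6 months after
        "followup_long": end.replace(year=end.year + 3),  # within 2-5 years after
    }
    for n in (1, 2, 3):                                   # end of Years 1, 2 and 3
        anniversary = start.replace(year=start.year + n)
        if anniversary < end:
            points[f"end_year_{n}"] = anniversary
    return points

# Hypothetical four-year program term
schedule = collection_points(date(2011, 7, 1), date(2015, 7, 1))
```

Generating the dates up front allows the tracking database to issue automatic notifications for each milestone (see Section 4, item III).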

In order to facilitate the process of data collection, we strongly recommend that: (1) physicians participating in the UAP programs be notified at application that they will be asked to provide data required for the tracking and ongoing evaluation of the program; and (2) physicians participating in the UAP programs be asked to provide consent at application for evaluators/researchers to contact them directly regarding ministry evaluation activities.

3.4. Evaluation Issues Addressed at Start of Program

3.4.1. Effectiveness:

Effectiveness is concerned with immediate changes in activities/practices resulting from program changes or interventions. These questions do not examine long‐term impacts, but rather expected changes that are immediate in nature. For the purposes of this project, effectiveness questions should focus on:

  1. Program participation by physicians and communities
  2. Recruitment success per community
  3. Physician and community satisfaction with the program
  4. Factors influencing physician community choice, including role of UAP
  5. Physician intention of staying within the community post‐UAP


Below (Section 4) we specifically outline the evaluation questions and performance measures that relate to examining program effectiveness and that will require certain data to be collected at the launch of the program and throughout the program’s administration.

3.4.2. Access and Equity:

Access and equity are concerned with whether the program and its benefits are equally accessible to all eligible physicians and communities. Access and equity questions should focus on:

  1. Systematic differences in physician, organizational and community characteristics with regard to application results (i.e., those who are successful vs. those who are not) or program administration (e.g., eligibility waivers).


3.5. Evaluation Issues Addressed Later During Program

3.5.1. Impact:

Impacts examine the long‐term changes, successes and experiences as a result of the program, and can be positive or negative and anticipated or unanticipated. Impact questions should focus on:

  1. Retention in each community, factors influencing retention and relationship between intentions to stay and actual retention
  2. Perceived impact on patient access to physician services
  3. Actual impact on physician services provided

There are many existing data sources that will prove invaluable in developing performance measures for evaluating the impact of the program. Fortunately, very few data elements must be collected at the launch of the program. Linkages between UAP data and existing provincial administrative and/or physician databases, as well as primary data collection via physician and community surveys, key informant interviews and focus groups, may all provide important data for program evaluation, but these may be collected at later stages of the programs’ administration.

For example, a key question evaluating program impact might be: What are the retention rates of physicians participating in the UAP? One performance measure that could indicate this is the average length of stay per individual physician (rather than total community head count) in the first 5 years after completion of the NaRPIF and ROS funding. The data needed to develop this performance metric are largely available from other provincial data sources, requiring no unique data collection at the launch of the program, with one exception: a unique physician identifier is the one key data element that will be essential to effectively utilize and link external data sources to those data collected by the UAP.
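For instance, the average length-of-stay measure could be computed from linked service records along the following lines; all record fields and values here are hypothetical.

```python
from datetime import date

# Hypothetical post-funding service records: (physician_id, start, departure);
# departure is None when the physician is still practising in the community.
records = [
    ("MD001", date(2013, 7, 1), date(2015, 7, 1)),
    ("MD002", date(2013, 7, 1), None),
]

def avg_stay_years(records, window_end):
    """Average length of stay per individual physician (not a community
    head count), censored at the end of the observation window."""
    stays = []
    for _pid, start, departure in records:
        horizon = min(departure, window_end) if departure else window_end
        stays.append((horizon - start).days / 365.25)
    return sum(stays) / len(stays)

# 5-year observation window after funding completion
mean_stay = avg_stay_years(records, date(2018, 7, 1))
```

Censoring at the window end treats physicians who are still in the community as having stayed at least the full window, which avoids understating retention for continuing practitioners.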

We strongly recommend that (1) each physician be given a unique case or program identifier (i.e., one linked to the administration of a specific incentive) and (2) each physician’s College of Physicians and Surgeons of Ontario license number (i.e., CPSO number), Medical Identification Number of Canada (MINC) and OHIP billing number be collected to enable linkages between existing databases. Although we do not specifically outline all the long‐term performance measures that will be needed to evaluate program impact, we feel that all the key data elements that need to be collected from program inception are listed in Section 4 – UAP Tracking Database Recommendations.
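As a sketch of why a shared identifier matters, linking UAP records to an external dataset on CPSO number might look like this; the field names and values are hypothetical, and real datasets would carry many more fields.

```python
# Hypothetical minimal program and external billing records
uap_records = [
    {"cpso": "12345", "case_id": "UAP-001", "incentive": "NaRPIF"},
    {"cpso": "67890", "case_id": "UAP-002", "incentive": "ROS"},
]
external_records = [
    {"cpso": "12345", "fiscal_year": 2011, "services_billed": 4200},
]

def link_on_cpso(uap, external):
    """Attach matching external rows to each UAP record, keyed on the
    CPSO number (the shared unique physician identifier)."""
    by_cpso = {}
    for row in external:
        by_cpso.setdefault(row["cpso"], []).append(row)
    return [dict(u, linked=by_cpso.get(u["cpso"], [])) for u in uap]

linked = link_on_cpso(uap_records, external_records)
```

Without a common identifier collected at application, this join would have to fall back on error-prone matching by name and address.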

3.5.2. Rationale:

Rationale is concerned with examining the logic of the program; the questions are intended to determine whether a program’s recommended set of interventions are the most logical given the stated purpose or objectives, resources and the context in which the program is being implemented. For the purposes of this project rationale questions should focus on the logic of using incentive programs to encourage physician recruitment and improve physician retention in underserviced rural areas.

Although we feel that this is an essential issue to address, the data required to develop performance measurements in this area may be collected throughout the administration of the program and no specific data collection is required at program initiation.

 

4. UAP Tracking Database Recommendations

It is essential that the UAP Tracking Database include certain data elements to enable the calculation of the metrics indicated above. The data specifications for the UAP Tracking Database outlined below are based on a February 5, 2010 draft “Specifications for Physicians Incentive Program Database” provided by MOHLTC staff.

The new northern and rural physician incentive program database should have the capacity to track and/or generate the following information (at minimum):

I. Data to be collected from Physicians upon Application

  • Participant Information
    • Name (Surname, Given Name, Middle name, any previous names used professionally)
    • Primary (current) practice address (full street address including six‐character postal code, essential for geo‐tracking)
    • CPSO # (serves as a unique physician identifier)
    • MINC number (if available; serves as a unique physician identifier)
    • OHIP billing number (serves as a unique physician identifier)
    • Citizenship details including country, dates of landed immigrant status, work permits etc.
    • Specialty Certification (CFPC, RCPSC (plus specialty area in which certification received), neither) – (NOTE: Specialty categories should be based on CFPC and RCPSC official specialty areas to allow for standardization).
    • Year Certification received
    • Current Functional Practice Area (e.g., full primary care practice, family medicine restricted to psychotherapy, family medicine including obstetrics, family medicine including emergency, internal medicine hospitalist, cardiology) – this variable should reflect what the physician actually DOES in his/her practice regardless of certification/education. (NOTE: To our knowledge, there is no nationally standardized list.)
    • Medical education source (i.e., school and country) and dates (i.e., start year, year of graduation) – (NOTE: It may be possible for physicians to list more than one school for undergraduate education (e.g., started in one location but finished in another), therefore requiring the database to accommodate multiple entries).
    • Postgraduate education source (i.e., functional practice area, organization, country) and dates (i.e., start and end (graduation) year) – (NOTE: The database should accommodate multiple entries for postgraduate education).
    • Previous northern/rural experience (location) – including detailed data regarding location, length and type of exposure (i.e., locum, education, practice) **
    • Demographic information:
      • Year of birth, gender, marital status (NOTE: Marital status categories should be based on Statistics Canada definitions to allow for standardization.)
      • Applicant’s place of residence during pre‐school, primary school, secondary school and college/university ** above
      • Number and age of children/dependents
      • Spouse/partner’s occupation/profession – (NOTE: Occupation categories could be drawn from a standardized list from Statistics Canada (e.g., the National Occupational Classification) or another comparable source to allow for standardization).
      • Spouse/partner has offer of employment (yes/no/not applicable) or anticipates opportunities for employment (yes/no/not applicable)
      • Spouse/partner’s place of residence during pre‐school, primary school, secondary school and college/university ** above
  • UAP Practice Structure Information/Practice Plan Information (NOTE: the database needs the ability to record more than one set of UAP Practice information as physicians may MOVE to another UAP community at some point in their participation in the program):
    • UAP Practice Address (full street address including six‐character postal code, essential for geo‐tracking).
    • UAP Functional Practice Area (e.g., full primary care practice, family medicine restricted to psychotherapy, family medicine including obstetrics, family medicine including emergency, internal medicine hospitalist, cardiology) – this variable should reflect what the physician actually DOES in his/her practice regardless of certification/education. (NOTE: To our knowledge, there is no nationally standardized list; however, the list should be the same as was used above for the variable “Current Functional Practice Area”).
    • UAP Practice Structure Details:
      • Practice organization (e.g., FHT, Hospital, solo practice, etc.) (NOTE: Practice organization categories should be based on a MOHLTC generated list indicating all current practice models with flexibility in the database structure should NEW models be added during the life of the programs).
      • Expected work hours per week (at application)
      • Expected to provide “on call” (yes/no/not sure) or “after‐hours” services (yes/no/not sure)
      • Electronic Medical records (yes/no/unsure)
      • Electronic health records (yes/no/unsure)
      • Remuneration (FFS, salary, mixed/blended, other – specify). (NOTE: Remuneration categories should be based on a MOHLTC‐generated list indicating all current remuneration models, with flexibility in the database structure should NEW models be added during the life of the programs).
      • Expected proportions of practice which are OHIP insured services vs. uninsured services (e.g., cosmetic procedures).
    • Reasons why physician chose practice type (Open text format)
    • Reasons why physician chose community.** above

II. Data to be Auto‐generated by Database or Manually Entered by UAP Personnel

  • File Information
    • Unique application/case identifier
    • Application date
    • UAP Incentive (ROS or NaRPIF)
    • Application status – denied (with reason for denial), accepted, withdrawn, incomplete or other potential status (e.g., on hold, deferred)
    • RIO of community
    • Program start date
    • Program end date
    • Program completed (Yes/No) – amount completed
    • Reasons for leaving community before end of program. (NOTE: This variable could be formatted as open text, or a categorized list could be developed from the primary reasons outlined in the literature, e.g., personal reasons (self, spouse/partner, children, extended family), professional reasons (conflicts with colleagues; unsatisfactory practice characteristics such as patients, on‐call, facilities or support services; CME opportunities) and a better offer (more $, more opportunities).)

III. System should have the following Audit or Report Generation Capacity

  • Audit capacity
    • # of Applications
      • Denied (including reasons for denial)
      • Application Withdrawn/Not completed
      • Accepted –
        • Eligibility waived – no
        • Eligibility waived – yes – reason waived
    • Participants, active and historic:
      • Number by:
        • CSD
        • LHIN
        • Community RIO Scores
        • Sponsoring community/organization contact
        • Year
          • Start date
          • End date
    • Communities: counts available by old UAP designation and new UAP designation
    • Financial
      • Total payments made
        • By fiscal year
        • By participant
        • By CSD
        • By LHIN
        • By RIO
      • “Accounts payable”
        • By fiscal year
        • By participant
    • Record of waivers of physician eligibility requirements – (NOTE: Also need to record reasons why waiver granted for each physician at application).
      • By participant
      • By CSD
      • By LHIN
      • By RIO
      • By location (sponsoring community/organization)
  •  Automatic notification capacity
    • Payment and communications – quarterly
    • Confirmation of continued eligibility – annually
  • Communications log
    •  Method of inquiry (phone, email, fax, letter, in person)
    • Reason for inquiry (application, process status, payment, etc.)
    • Details – open text to allow entry of details of the inquiry via direct entry from a phone call or email, or to allow appending of scanned documents to the record.

 

5. References

1. Humphreys J, Wakerman J, Pashen D, Buykx P. Retention strategies and incentives for health workers in rural and remote area: What works? Canberra, Australia: Australian Primary Health Care Research Institute; 2009.

2. Humphreys J, Wakerman J, Kuipers P, Wells B, Russell D, Siegloff S, Homer K. Improving workforce retention: Developing an integrated logic model to maximise sustainability of small rural and remote health care services. Canberra, Australia: Australian Primary Health Care Research Institute; 2009.

3. Ardal S, Butler J, Hohenadel J, Olsen D. The health planner's tool kit: Module 6‐evaluation. 2008. Available at: http://www.health.gov.on.ca/transformation/providers/information/im_resources.html#health.

4. Ardal S, Butler J, Edwards R, Lawrie L. The health planner's toolkit: Module 1‐the planning process. 2006. Available at: http://www.health.gov.on.ca/transformation/providers/information/im_resources.html#health.

5. Grobler L, Marais BJ, Mabunda SA, Marindi PN, Reuter H, Volmink J. Interventions for increasing the proportion of health professionals practising in rural and other underserved areas. Cochrane Database Syst Rev. 2009;(1)(1):CD005314.

6. Veillard J, Huynh T, Ardal S, Kadandale S, Klazinga NS, Brown AD. Making Health System Performance Measurement Useful to Policy Makers: Aligning Strategies, Measurement and Local Health System Accountability in Ontario. Healthcare Policy‐ Politiques de Santé. 2010;5(3):49‐65.

7. Harry Cummings and Associates. A framework for evaluating the quality assurance programs of the colleges of health professions in Ontario. Toronto, ON: Health Professions Regulatory Advisory Council; 1997.

8. Wenghofer EF, Williams AP, Klass DJ. Factors Affecting Physician Performance: Implications for Performance Improvement and Governance. Healthcare Policy‐ Politiques de Santé. 2009;42(8):141‐160.

 

Footnotes

*Eligible communities for NaRPIF are those in northern and rural Ontario with RIO (Rurality Index for Ontario) scores of 40 or more plus the northern referral communities of North Bay, Sault Ste. Marie, Sudbury, Thunder Bay and Timmins.

† Eligible communities for the new ROS program include most communities, but exclude the City of Toronto and adjacent municipalities of Mississauga, Brampton, Vaughan, Markham and Pickering as well as the City of Ottawa.

‡ Changes may be compared among different groups of communities (defined below)—communities that differ in past and present eligibility.

§ It may be that there will be additional physician comparison groups depending on whether there are differences in eligibility requirements for physicians between the previous UAP incentives and the new UAP incentives.

** CRaNHR has a template to capture this information.