Home Visiting Career Trajectories

OMB: 0970-0512

OMB Information Collection Request

New Collection



Supporting Statement


Part B


January 2018


Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


330 C Street, S.W.

Washington, D.C. 20201


Project Officer:


Tia Zeno


Contents

Part B. Collections of Information Employing Statistical Methods

B.1 Respondent Universe and Sampling Methods

B.2 Procedures for Collection of Information

B.3 Methods to Maximize Response Rates and Deal with Nonresponse

B.4 Tests of Procedures or Methods to be Undertaken

B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data



Part B. Collections of Information Employing Statistical Methods

B.1 Respondent Universe and Sampling Methods


Survey


Program managers at all Maternal, Infant, and Early Childhood Home Visiting (MIECHV)-funded local implementing agencies (LIAs) will be invited to complete the survey (an estimated 705 LIAs receiving funding from HRSA and 19 tribal grantees receiving funding from ACF).


The research team will implement a two-stage design in which program managers in all LIAs receiving MIECHV funding are recruited, surveyed, and asked to share email addresses for their home visitors and supervisors. Then, home visitors and supervisors will be recruited and surveyed. Who counts as part of the home visiting workforce will be clearly defined before staff are recruited.


The goal is to achieve completed surveys from a minimum of 2,000 home visiting staff nationally, including home visitors and supervisors, but we estimate burden for up to 3,000 potential respondents (see Supporting Statement A). This number is large enough to support reliable approximations of home visiting workforce characteristics. To achieve an effective sample size of 2,000, given a 70% minimum response rate, the team will need to recruit at least 2,860 eligible staff. HRSA has calculated that MIECHV currently funds 3,074.4 home visitor full-time equivalents (FTEs) and 711.1 supervisor FTEs. However, these estimates do not reveal the actual number of individual workers who could respond to the survey. LIAs with MIECHV funding may also have other home visitors whose positions are funded by another source and who would also be eligible to participate. Because of the expected universe size, home visitors and supervisors will not be sampled; rather, all staff in these positions (whose contact information is shared) will be invited to participate.
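
The recruitment arithmetic above can be expressed directly. The following is a minimal sketch in Python using the figures from this paragraph; the round-up-to-the-nearest-10 step is an assumption about how the 2,860 figure was derived, not a documented study rule.

    import math

    target_completes = 2000    # minimum completed staff surveys sought
    min_response_rate = 0.70   # assumed minimum response rate

    # Required recruits = target completes / response rate, rounded up.
    required = math.ceil(target_completes / min_response_rate)   # 2858

    # Rounding up to the nearest 10 gives the 2,860 figure cited above.
    print(math.ceil(required / 10) * 10)                         # 2860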


Case Study


The team will identify eight sites that vary in terms of geography, population demographics, labor markets, and home visiting models being implemented. Each case study will be organized by state and sub-state region and will involve visiting multiple LIAs within the selected region. A selected site could be a major metropolitan area, a large county with a mix of urban, suburban, and/or rural zones, or a cluster of small counties or towns. This approach is designed to maximize the study’s inclusion of different program settings, with the understanding that a home visiting program and its employees cannot operate independently of the broader state and local context.


As a first step, the team will create a state selection matrix that includes information on the following criteria, to the extent they are available:


  • counties served through MIECHV, including urbanicity and population size;

  • the number and type of home visiting programs being implemented in a particular state (including evidence-based programs, tribal home visiting programs, and promising approaches);

  • the proximity and penetration of home visiting programs in a given state;

  • the number of clients being served in the state;

  • state estimates of the number of potential home visiting beneficiaries pulled from the National Home Visiting Resource Center 2017 Yearbook;

  • population demographics, including racial, ethnic, and linguistic composition of population;

  • unemployment rates;

  • college graduation rates; and

  • other constructs that may be beneficial to understanding workforce considerations (e.g., existence of state or local home visiting training programs).

Based on the matrix information, the team will propose a preliminary set of 10 states and a list of MIECHV-supported LIAs within those states that could be included in each site visit (which could include tribal home visiting grantees). Input from OPRE, HRSA, and technical working group (TWG) members will help refine the list of states. The team will prioritize the list to select eight states for inclusion, with the remaining two serving as alternates. Selection decisions for the eight states will be made systematically to achieve balance and variation along the factors above.


Once states have been finalized, the team will identify up to five LIAs that can be accessed within a weeklong site visit. Geographic limitations may play a role in this selection, since the small research teams will need to travel to each state. In states where there are tribal grantees, ensuring our ability to include them in our visit will inform our case study planning. Within each LIA, the team will conduct up to two key informant interviews, typically with the two most senior staff members. The team will work with each LIA to determine whether it is feasible to conduct a focus group at the site, and if so, will work with program staff to facilitate recruitment and logistics. Focus groups will be open to all home visitors at each selected agency, and participation is voluntary.


Key Informant Interviews on Professional Development System


The case study described above includes key informant interviews with supervisory and program director staff at each LIA. In a separate component of the study, the researchers will conduct additional key informant interviews with individuals who are experts on the professional development system that supports home visitors. These key informants will be selected by the research team and may be employed at universities, policy research institutes, non-profit organizations, or home visiting agencies.

B.2 Procedures for Collection of Information


Survey


1) E-mail recruitment letters.

The research team will send a pre-invitation email to potential survey respondents for the program manager survey. The survey invitation will come from the project email account ([email protected]) with the sender name “Urban Institute Home Visiting Workforce Study,” rather than as an automated message (ATTACHMENT F).


LIAs can respond directly via phone or email to find out more information about the study. To respond to questions immediately and effectively, the HVCT team has set up a Help Desk with a toll-free phone number and a project email inbox ([email protected]) with constant coverage Monday-Friday, 9am–5pm EST. Project assistants will coordinate coverage to ensure responses to all inquiries. They will be trained to answer the phone with, “This is [NAME] at the National Survey of the Home Visiting Workforce” and to set up a voicemail away-message reflecting the same. When one person is out of the office, he or she will forward the line to another team member.


2) Send a survey invitation to program managers.

The program manager survey (INSTRUMENT 1) will be programmed into Qualtrics web-based software. The research team will generate unique survey links for all potential survey respondents in Qualtrics and then email a survey invitation through Qualtrics to each person with the generated unique link.
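
As a generic sketch of how per-respondent unique links work: Qualtrics generates and manages these links internally, so the code below illustrates only the underlying idea; the domain, token format, respondent list, and output file are illustrative assumptions.

    import csv
    import uuid

    # Hypothetical respondent list; in practice the contact list lives in Qualtrics.
    respondents = ["manager1@example.org", "manager2@example.org"]

    # One unguessable token per respondent ties each response to a specific
    # invitee without exposing anyone else's link.
    links = {email: f"https://survey.example.org/s/{uuid.uuid4().hex}"
             for email in respondents}

    # Save the email-to-link mapping so completion can be tracked per invitee.
    with open("survey_links.csv", "w", newline="") as f:
        csv.writer(f).writerows(links.items())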


3) Monitor survey completion and target follow-up recruitment efforts.

The program manager survey (INSTRUMENT 1) will remain open over a four-week period, which can be extended as needed to improve response rates. The survey team will use a customizable survey management system to monitor who completes the survey. Every Monday morning during the fielding of the survey, the survey team will send a customized follow-up email reminder to anyone who has not yet completed the survey. The team will modify the subject line and content of the reminders each time, emphasizing the value of respondents’ views, and will keep the content as short and direct as possible.


After several weeks in the field, the survey team lead will check for nonresponse bias. Specifically, the lead will examine the proportion of LIAs in each state that have completed the survey and the proportion of LIAs implementing each model that have completed the survey. For a subset of LIAs, project assistants will follow up directly with the program manager to encourage participation. The assistant will call the program and ask to speak with the program manager, share the goals of the study and what participation would require, answer any questions the program manager might have, and obtain the program manager’s verbal consent to join the study (ATTACHMENT G).
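
A minimal sketch of this check, in Python using pandas; the input file and column names (state, model, completed) are hypothetical, and the 10-point lag threshold is an illustrative assumption rather than a study rule.

    import pandas as pd

    # One row per MIECHV-funded LIA, with a 0/1 'completed' flag (hypothetical frame).
    lias = pd.read_csv("lia_frame.csv")

    # Completion rates by state and by home visiting model.
    by_state = lias.groupby("state")["completed"].mean()
    by_model = lias.groupby("model")["completed"].mean()

    # Flag strata whose completion rate lags the overall rate by a wide margin;
    # these are candidates for targeted follow-up calls.
    overall = lias["completed"].mean()
    print(by_state[by_state < overall - 0.10])
    print(by_model[by_model < overall - 0.10])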


Project assistants will maintain a contact log that they update after each contact attempt, including the date and time of the attempted contact, whether they made contact and with whom, and the outcome of the recruitment call. The contact schedule will be designed to ensure the team does not overly burden LIAs that may not be interested in participating in the research. The contact points include the initial email (ATTACHMENT F), three reminder emails (ATTACHMENT H), the initial phone call (ATTACHMENT G), and three follow-up call attempts: one 48 hours after the first call, one 7 days after, and one 14 days after. Additional attempts will be made in cases where contact was made with the program but the manager was not available to talk and the project assistant was encouraged to call back. When feasible, the project assistant will look up an alternative phone number or contact person for a subset of hard-to-reach programs, as needed to reduce nonresponse bias.


In the case of a refusal, the project assistant will notify the survey task lead and make a note in data collection records so no further contact will be made with the program.


4) Gather staff email lists for home visitor and supervisor survey.


The research team will design a feature in Qualtrics that will allow program managers to enter a list of staff email addresses. (Only work email addresses are needed, not workers’ names, phone numbers, or other identifiers.) At the end of the question portion of the program manager survey, the program manager will click a link to close out their survey and be transferred to a new survey at a unique link (separate from their survey data) where they will be prompted to enter email addresses, either copying and pasting a list or manually entering addresses. This feature will create a secure method for information sharing and streamline the process. The Qualtrics links will be unique to the program, so the research team will know who did and did not upload information to the second survey and to which LIA the staff email addresses belong.


The second survey where email addresses are entered will be kept “open” until the information is entered. Follow-up reminder emails will be generated for program managers who are unable to complete that step at the time of the survey and opt to return later to the site.


The process of collecting staff email addresses is described in the recruitment email to program managers (ATTACHMENT F). The recruitment email language will emphasize to employers that the staff survey is voluntary and that, even if the agency provides email addresses for staff, individual employees can decline; if an employee chooses to respond, the survey responses will be kept private (and not shared with the employer).


5) Send a pre-invitation email to home visitors and supervisors.

The research team will send a pre-invitation email to the email addresses of potential survey respondents (ATTACHMENT I). Each staff member will choose whether or not to participate, but engaging employers to highlight the value of the study should encourage strong overall participation.


6) Send a survey invitation to home visitors and supervisors.

The survey for home visitors and supervisors will be programmed into Qualtrics web-based software similar to the program manager survey. The survey team will generate unique survey links for all potential survey respondents in Qualtrics and then email a survey invitation through Qualtrics to each person with the generated unique link.


7) Field web-based staff survey and send reminders.

The survey team will monitor staff survey completion over a four-week period. Email reminders will be automatically generated in Qualtrics each week and sent to those who do not respond. After this initial data collection period ends, the survey team leader will review the response rate and check for nonresponse bias. Project assistants will then follow up by email with program managers in LIAs with very low participation rates to encourage staff to participate (ATTACHMENT H).


Case Study


Each site visit will last approximately three days and consist of up to five focus groups with home visitors (one focus group per LIA), each lasting two hours, and up to 10 key informant interviews, each lasting about 90 minutes.


The case study team will include three senior researchers, three research associates, and two research assistants with qualitative research experience. Each individual site visit will be staffed with three researchers: one senior lead, one research associate in a support role, and a research assistant. The senior lead will meet with the research assistant to plan the visit and will be the point person for that site, but the lead and the associate will alternate leading interviews and focus groups to reduce the burden on any single person during a multi-day visit. The research assistant will take typed notes during interviews/focus groups and be responsible for organizing logistics. The interviewer will audio record all interviews (with interviewee consent) and focus groups (with focus group participant consent) to help complete the typed notes and check the accuracy of statements.


Before each interview and focus group, participants will receive a printed consent form including information about the purpose of the study, the interview/focus group procedures, risks and benefits, privacy protections, and the voluntary nature of participation. Participants will sign and return one copy of the consent form and keep one for their own records. Protocols have been written in English only under the assumption that the majority of staff will be native English speakers or bilingual. If a program’s staff is majority Spanish-speaking with limited English proficiency, the research team will make arrangements whenever possible to conduct the focus group in Spanish. Similarly, the research team will make arrangements whenever possible to enable participation by American Sign Language users.


Interviewers will follow a semi-structured interview guide so data can be collected consistently across interviews. Focus group facilitators will follow a moderator’s guide with introductory language, topical questions and probes, and instructions to moderators to best guide the conversation. The in-depth interview protocol is included in INSTRUMENT 3 and the focus group protocol is included in INSTRUMENT 4. A voluntary, anonymous questionnaire for focus group participants is included in INSTRUMENT 5.


Key Informant Interviews


In addition to the key informant interviews with LIA staff described above as part of the case study, the research team will conduct a separate set of key informant interviews with experts in the field of professional development for home visitors. Key informant interviews will be conducted by a senior researcher and attended by a junior researcher who will take notes. Each interview will take up to 90 minutes. The majority of key informant interviews will take place by phone, but some may take place in person if the research team can arrange a meeting at the key informant’s location while on a site visit.


Interviewers will follow a semi-structured interview guide (INSTRUMENT 6) that will ensure that the same topic areas are addressed in each interview. However, the interview guide will be customized for each interviewee, depending on his or her position, background, and experience with the topic of professional development in the home visiting field.


The interviewer will audio record all interviews (with interviewee consent) to help complete the typed notes and check for accuracy of statements.

B.3 Methods to Maximize Response Rates and Deal with Nonresponse

Expected Response Rates


Based on past experience with similar studies, we anticipate being able to successfully recruit the number and type of respondents described in section B.1. We anticipate a minimum response rate of 70 percent for each survey, based on other national surveys of this nature (e.g., the Urban Institute’s Nurse Workforce Study, the Evaluation of the Health Professions Opportunity Grants, and the National Sample Survey of Registered Nurses). This means that at least 70% of the individuals invited to participate are expected to complete the web-based survey. The number of home visitors invited to participate will depend on how many program managers complete the survey and list contact information for their staff. If program manager response rates fall below 70 percent, our research team will make phone calls to program managers explaining the importance of the study and encouraging their participation. Priority for this personalized nonresponse follow-up will be given to programs with larger staffs, to help us recruit more staff to complete the home visitor survey.


All potential respondents are employees of home visiting programs that receive MIECHV funding. We anticipate this fact will motivate participation since grantees will likely want to be represented in this workforce survey.

Dealing with Nonresponse


Nonresponse bias analyses will test for differences between LIAs that complete the program manager survey and the universe of LIAs receiving MIECHV funding, in terms of geographic distribution and home visiting models implemented. Survey weights will adjust for any detected nonresponse bias at the LIA level. Some home visitors and supervisors may decline to participate even if their program managers participate. Additional survey weights may adjust for nonresponse within an LIA.
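
One common form such LIA-level weights could take is the inverse of the response rate within an adjustment cell. The following is a minimal sketch in Python using pandas, assuming cells defined by state and model; the input file and column names are hypothetical, and the study's actual weighting scheme may differ.

    import pandas as pd

    # One row per LIA, with a 0/1 'completed' flag (hypothetical frame).
    lias = pd.read_csv("lia_frame.csv")

    # Within each state-by-model cell, weight = 1 / cell response rate, so
    # responding LIAs stand in for nonrespondents with similar characteristics.
    # (Each cell needs at least one respondent for the rate to be nonzero.)
    cell_rate = lias.groupby(["state", "model"])["completed"].transform("mean")
    lias["nr_weight"] = (1.0 / cell_rate).where(lias["completed"] == 1, 0.0)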


For the case study, if an LIA being considered for inclusion in the study indicates it is likely to decline to participate in the research, the team will discuss the case and the LIA’s concerns about participating, and attempt to address those concerns directly with the LIA. If the selected LIA still cannot or will not participate in the research, we will attempt to select a replacement LIA in the same state and with similar characteristics (e.g., implementing the same model).

Maximizing Response Rates


All MIECHV grantees (i.e., states, territories, and tribal grantees) will receive a letter from HRSA announcing the study and describing the survey, which grantees can share with the LIAs they oversee. A similar letter will be sent to the 10 model developers whose models are being implemented with MIECHV funding, which they can share with their local affiliates. Then, the research contractor will send an email directly to LIAs inviting survey participation. The survey invitation will come from the project email account ([email protected]) with the sender name “Urban Institute Home Visiting Workforce Study,” rather than as an automated message from the Qualtrics survey management system, so that respondents will be more likely to respond and the message will not be mistaken for spam.


The recruitment letter will be followed by an automatically generated email from Qualtrics with a unique link to the survey. That email invitation will be followed by weekly automated reminder emails. The survey for program managers will be fielded for four weeks, at which point a nonresponse bias analysis will identify potential bias and where targeted recruitment efforts should be directed. For example, the sample should reflect the distribution of home visiting programs nationally and across program models. To improve response rates and reduce nonresponse bias, the research contractor will call program managers to confirm that the initial emails were received, answer questions, and encourage participation.


Similar to the program manager survey, the home visitor/supervisor survey pre-invitation will come from the project email account ([email protected]) with the sender name “Urban Institute Home Visiting Workforce Study,” rather than as an automated message from Qualtrics, so that respondents will be more likely to respond and the message will not be mistaken for spam. The survey team will monitor staff survey completion over a four-week period. Email reminders will be automatically generated in Qualtrics each week and sent to those who do not respond. After this initial data collection period ends, the survey team leader will review the response rate and check for nonresponse bias. Project assistants will then follow up with program managers in LIAs with very low participation rates to encourage staff to participate. Because the survey is voluntary and responses are private, the research team cannot disclose who has or has not participated, but it can make one attempt to notify the LIA that participation is low and to ask that staff be reminded to complete the survey by the deadline if they plan to participate.


For the case study, program managers will receive an email with information about the opportunity to participate in an in-person interview (ATTACHMENT J). The email aims to answer any questions they may have about the research, obtain agreement to participate in the study, and arrange a phone call to discuss interview scheduling. The phone call will be individualized, so no script for the conversation will be developed. On that call, we will discuss with the program manager whether it is feasible to conduct focus groups with home visitors within their programs. If so, we will work with the program manager to determine the best way to conduct focus group recruitment. Once potential focus group participants are identified, we will call them to finalize recruitment (ATTACHMENT K).


Our experience suggests that most program administrators are willing to participate in this type of data collection when the burden on their staff is not too heavy, the research does not disrupt normal program activities, and they are able to learn from our findings. We will work with program managers to schedule interviews at times convenient for respondents. We also plan to share written products resulting from this research with all interested respondents.

B.4 Tests of Procedures or Methods to be Undertaken


The interview protocol for home visiting program leaders was pretested with fewer than 10 people in advance of submitting this OMB package and refined to optimize the wording and flow of questions. Additionally, priority questions were identified, so that researchers can make sure to address the most critical questions if the interview cannot be completed in its entirety within 90 minutes.


The first focus group will serve as a pilot; the research team will make note of instructions that need to be improved and topics that need to be addressed more deeply or lightly in subsequent groups.


Similarly, the first key informant interview with training and technical assistance experts will help to modify probes for subsequent interviews. Because the interview will be qualitative and tailored for each key informant, the interview guide contains broad sets of questions.


The two survey instruments were pretested with fewer than 10 people at two distinct home visiting programs not receiving MIECHV funding and therefore not eligible to participate in the study. Since the survey for home visitors and supervisors does not contain any questions specific to MIECHV, the research team concluded it was not necessary to pretest with a MIECHV-funded agency. The program manager survey asks how many staff are funded fully and partially by MIECHV, and the overall share of the budget supported by MIECHV, but no other questions pertain specifically to the MIECHV program.


The pretest examined survey length, flow and order of topics, and items participants felt were unclear or confusing, sensitive, not applicable to their positions, or missing from the instrument. The pretest indicated that the home visitor and supervisor survey was substantially too long. Questions about “typical” work schedules and tasks were difficult to answer because the work tends to vary from week to week, and questions about training, previous employment, and satisfaction with pay and benefits were difficult to answer as phrased. The program manager survey was found to be of acceptable length and clarity but was missing some key program features that managers recommended including, such as whether the LIA had multiple sites and the background of the program manager. The instruments were revised in response to feedback from pretest participants. After revisions, the home visitor and supervisor survey took less time to complete and caused far less confusion about questions.


Several national home visiting model developers also volunteered to review and provide input on the two survey instruments. Feedback from four home visiting models (Early Head Start, Parents as Teachers, Healthy Families America, and Nurse-Family Partnership) guided final survey revisions.

B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


The study design plan and data collection instruments were developed by the following project staff at the Urban Institute with experience conducting quantitative and qualitative studies using similar data collection strategies:


  • Heather Sandstrom, Project Director, Center on Labor, Human Services, and Population

  • Timothy Triplett, Survey Data Collection Task Leader, Statistical Methods Group

  • Erica Greenberg, Survey Data Analysis Task Leader, Center on Labor, Human Services, and Population

  • Sarah Benatar, Case Study Task Leader, Health Policy Center

  • Ian Hill, Senior Advisor, Health Policy Center

  • Pamela Loprest, Senior Advisor, Income and Benefits Policy Center


These individuals, along with qualified junior staff, will analyze survey and case study data.
