
Child Support Noncustodial Parent Employment Demonstration (CSPED)

OMB: 0970-0439


370 L’Enfant Promenade SW, 4th Floor, Washington DC 20447







MEMORANDUM



TO: Jake Bournazian and Seth Silverman


THROUGH: Jennifer Burnszynski, Director, Division of State, Tribal, and Local Assistance, OCSE


FROM: Elaine Sorensen, Technical Advisor, OCSE


DATE: August 21, 2013

ICR #: 201304-0970-006


TITLE: Child Support Noncustodial Parent Employment Demonstration (CSPED)


AGENCY: HHS/Administration for Children and Families/Office of Child Support Enforcement (OCSE)


SUBJECT: Responses to OIRA Comments on CSPED OMB Package



This memorandum provides our responses to OIRA’s comments about the CSPED OMB Package (dated July 15, 2013). Below, we indicate the reviewer’s original comment and then present our response.


In order to fully respond to your comments, we have revised the following documents from the original OMB package: Part A – Justification; Part B – Statistical Methods; IC 2 – Focus Group Protocol; IC 6 – Introductory Script for Program Participants; IC 7 – Baseline Survey; Attachment A – Consent Statement; Attachment C – FAQs; and Attachment D – Reminders. These revised documents are attached to this memo. In addition, to respond to your questions about the follow-up survey, we have created and attached a new document, Attachment F – Overview of the 12-Month Follow-up Survey.


Based on recommendations from the University of Wisconsin’s Institutional Review Board (UWIRB), we have also made modest changes to the consent language contained in three of the attached documents: IC 6 – Introductory Script for Program Participants, IC 7 – Baseline Survey, and Attachment A – Consent Statement. These documents have now been approved by the UWIRB.


After submitting the baseline instrument as part of our initial submission, we obtained information from the CSPED grantees indicating that they are likely to serve a large number of military veterans. We would therefore like to add two questions on veteran status, adapted from the Current Population Survey, to IC 7 – Baseline Survey. These questions will ensure that we can identify this important demographic group and examine program effects separately for veterans. Because the questions are brief, adding them does not change our burden estimates. They are included in IC 7 – Baseline Survey as questions B7 and B8.


We would greatly appreciate receiving OMB approval for this ICR as soon as possible. We hope to start random assignment for this demonstration on October 1, 2013.


1. OIRA Comment (first page):

Recommendations: Responses to the following points highlighted below will be appreciated. In particular, it would be helpful to have more information about your plan for the 12-month follow up survey for the impact study. While this instrument is not a required element for review and approval of the study design at this time, it would be most efficient to review it now and, at least, a more detailed description of that instrument is needed at this time.


OCSE Response:

Given your request for additional information regarding the 12-month follow-up survey, we have added a new attachment to this ICR, Attachment F – Overview of the 12-Month Follow-up Survey. This attachment presents the plans for this survey and summarizes its likely content. The survey instrument is not yet finalized; thus, we plan to seek clearance for it in a future submission.


SUPPORTING STATEMENT A


A2. Purpose and Use of the Information


2. OIRA Question:

The instruments in this ICR are used to learn about the approaches that will be implemented by the eight CSPED grantees. Could (some of) this information be directly obtained from the grant applications?


OCSE Response:

CSPED grantees have modified and refined their plans for program implementation during the initial planning year for the study. In some cases, they have made substantial changes to these plans, including the specific services that will be provided, the locations where services will be offered, and the criteria for program eligibility. They have made these adjustments based on a clearer understanding of the conditions in their communities, as well as OCSE’s goals and expectations for this grant initiative. Therefore, the grant applications no longer reflect the latest implementation plans for these grantees. Moreover, actual implementation is likely to evolve over time as grantees begin implementing their demonstration programs and making adjustments based on early implementation experiences. Thus, it will be necessary to gather information on actual program implementation using the instruments included in this ICR. The CSPED implementation study will document actual implementation at two points in time: during early implementation (year 2 of the grant period, the first year of implementation) and after implementation reaches more of a “steady state” (year 4 of the grant period, the third year of implementation). Documenting implementation at two time points will enable the evaluation team to document changes in program operations made over time and the reasons grantees made those changes.


3. OIRA Comment:

It doesn’t appear that information collected in the ICR will cover the effects of the different programs. That questionnaire has not been developed or not submitted with this ICR.


OCSE Response:

The follow-up questionnaire is not the only data source that the evaluation team plans to use to measure the effects of the different programs. IC 9 describes the administrative data that the evaluation team plans to collect to measure the effects of the different programs. These data include child support data, unemployment insurance benefit and wage records, public assistance records, and criminal justice records.


4. OIRA Question:

What are the child support performance goals that are being evaluated? Wouldn’t this be available from the grant applications also?


OCSE Response:

The primary performance goal of the Title IV-D Child Support Program is to collect child support payments owed by noncustodial parents. The purpose of the CSPED demonstration is to test whether services provided to noncustodial parents through the demonstration result in increased child support payments, compared to a control group of noncustodial parents who did not receive the CSPED services. If the CSPED evaluation finds favorable impacts on child support payments, a thorough understanding of how the demonstrations were implemented is essential for replication in other states and communities and for informing decisions about programmatic and policy improvements to the IV-D program.


5. OIRA Question:

How will data from the baseline survey be used to adjust for potential bias from follow up non-response? Will this be used to impute? This issue is not addressed in Part B of the supporting statement.


OCSE Response:

We anticipate a high response rate to the 12-month follow-up survey (a separate OMB submission will seek clearance for that survey). Therefore, nonresponse bias should not be a major concern. Even so, all analyses of follow-up survey data will account for survey nonresponse using nonresponse weights, calculated with standard statistical techniques that estimate the probability of survey response as a function of baseline characteristics; respondents are then weighted by the inverse of that estimated probability. This weighting strategy will help ensure that impact estimates reflect the average effects of the program on the full research sample and not just those sample members who responded to the survey. Part B has been revised to provide additional information about the plans for using nonresponse weights (page 4). The evaluation team will also use the baseline data to adjust impact estimates for small differences in baseline characteristics between the research groups. In addition, the team will use the baseline data to define subgroups, in order to test whether program services are more or less effective for certain key subgroups, and to estimate propensity score models for analyzing impacts among participants who received larger doses of CSPED services. Part A has been revised to include these other justifications for the baseline data collection (pages 5-6).
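
To illustrate the weighting approach described above, the sketch below estimates response propensities with a logistic regression and weights respondents by the inverse of the estimated probability. It uses simulated data only, and all variable names (age, employed_at_baseline, arrears_amount) are hypothetical stand-ins for the actual baseline covariates, which the evaluation team will specify later.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 1000

    # Simulated stand-in for the baseline file; variable names are illustrative only.
    df = pd.DataFrame({
        "age": rng.integers(18, 60, n),
        "employed_at_baseline": rng.integers(0, 2, n),
        "arrears_amount": rng.gamma(2.0, 2.0, n),  # hypothetical arrears, in $1,000s
    })

    # Simulate response to the 12-month survey as a function of baseline traits.
    logit = -1.0 + 0.02 * df["age"] + 0.5 * df["employed_at_baseline"]
    df["responded"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

    # Model the probability of response given baseline characteristics.
    X = sm.add_constant(df[["age", "employed_at_baseline", "arrears_amount"]])
    fit = sm.Logit(df["responded"], X).fit(disp=0)
    p_respond = fit.predict(X)

    # Nonresponse weight: inverse of the estimated response probability,
    # applied to respondents so they represent the full research sample.
    df["nr_weight"] = np.where(df["responded"] == 1, 1.0 / p_respond, 0.0)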


6. OIRA Question:

A lot of PII (address, telephone numbers, social networking info etc.) is collected on three other persons who are not part of the study but simply are identified as friends or family of the study participants who may know how to contact the participant if the interviewer is not able to contact the participant after the first interview. When does this PII get deleted from the system? No mention of the record retention for this. Most forms ask for one person to contact in case of an emergency or if there is a need to contact the person. Three additional people seems like a high number, will this PII be destroyed once the participant completes their survey?


OCSE Response:

The population that will be served by CSPED grantees, noncustodial parents with little or no earnings who owe substantial child support arrearages, is highly transient, moving frequently and often changing phone numbers multiple times within a year. To ensure a high response rate to the follow-up survey, it is essential that the evaluation team collect detailed contact information at baseline. The current plan was developed based on the evaluation team’s extensive experience locating similar populations for other research studies. Based on that experience, not only is the target population highly mobile, but the friends and family members they identify are likely to be highly mobile as well. Asking the respondent to name up to three contacts increases the likelihood that the evaluation team will be able to reach at least one of these contacts at the time of the 12-month follow-up interview. The proposed strategy for collecting contact information—asking (but not requiring) respondents to provide up to three contacts—is identical to the strategy being implemented in the Parents and Children Together Demonstration (PACT) study, which is examining a similar noncustodial parent population and which was approved by OMB (#0970-0403).


The evaluation team takes the protection of PII very seriously. PII collected on contact persons will be used only for the purpose of locating respondents for follow-up surveys and will never be included on data files for analysis. PII on friends and family members will be stored on an encrypted server at Mathematica Policy Research with access permitted only to project team members on a need-to-know basis. All activity in the system will be logged. All PII gathered for locating purposes will be securely erased from Mathematica’s network at the end of the evaluation period.


A6. Consequences of Collecting the Information Less Frequently


7. OIRA Question:

This ICR proposes two staff interviews. The first interview is limited to understanding the program design. Can some or all of this information be obtained directly from the grant application? The second interview focuses on implementation experiences and seems more likely to capture information not available in the grant application. Why is this study interested in changes over time in staff composition or staff perceptions of the program if the main purpose of the study is to look at the different approaches and their effectiveness in helping participants?


OCSE Response:

As noted earlier, CSPED grantees have modified and refined their plans for program implementation during the initial planning year for the study. Therefore, the grant applications no longer reflect the latest implementation plans for these grantees. Moreover, actual implementation is likely to evolve over time as grantees begin implementing their demonstration programs and making adjustments based on early implementation experiences. Therefore, it is important to gather this information through staff interviews as part of the implementation study.


The CSPED implementation study will document actual implementation during two rounds of site visits: (1) the first conducted during early implementation in year 2 of the grant period (the first year of program implementation) and (2) the second conducted after implementation has reached more of a “steady state” in year 4 of the grant period (the third year of program implementation). Documenting implementation at two time points will enable the evaluation team to document changes in program operations made over time and the reasons grantees made those changes. Staff interviews conducted during the first site visit will cover program design, including the rationale for grantees’ design decisions, but will not focus on program design exclusively. These interviews will also explore early implementation experiences and adjustments grantees have made based on those experiences. Staff interviews conducted during the second site visit will focus on challenges faced with program implementation and strategies used to address them, as well as lessons learned from operating these programs.


We are interested in changes in staff composition over time for several reasons. First, staff turnover and gaps in staff coverage may affect the quality and consistency of implementation. The evaluation team will need this information to help interpret the patterns of impacts observed in the impact study. Second, turnover may occur due to mismatches between staff members’ skills and qualifications and the positions they fill. It is therefore important to capture grantee perceptions about the staff qualifications needed to implement the demonstration.


Staff perceptions of the program—including their sense of how well services match participants’ needs, how feasible the program is to implement, implementation challenges encountered, design changes that would improve implementation, and lessons learned—are important for understanding how to replicate and improve the program designs and implementation plans in the future. Staff perceptions are likely to change over time as grantees modify their approach in response to early implementation experiences. Collecting data on staff perceptions at two time points will help the evaluation team capture staff feedback about the lessons they have learned along the way.



A9. Explanation of Any Payment or Gift to Respondents


8. OIRA Question:

Is the incentive payment needed for grantee staff’s cooperation? If the purpose of the collection is to improve their programs, isn’t the grantee going to receive qualitative benefits at no cost in addition to the grant funding?


OCSE Response:

The original language in the ICR was confusing on this point, since no incentives will be paid to program staff. We have revised this text to avoid this confusion (page 12 of Part A).


9. OIRA Question:

The $25 incentive for the 12-month follow up survey to the participant baseline (see Attachment A, p. 3) is not listed here under incentives. Can you please make sure that all incentives (as well as burden hours, etc.) are captured in the Supporting Statement?


OCSE Response:

As requested, we have added the incentive amount and activity length for the follow-up survey to Table A.1 (page 11 of Part A). Also, as requested, the burden estimate has been added to Table A.4 (page 18), page 19, and Table A.5 on page 20. Please note that the evaluation team proposes a higher incentive for the follow-up interview ($25) than for the baseline survey ($10), because the follow-up interview is not tied to the enrollment process. Therefore, a larger incentive will be needed to encourage a timely response.


A10. Assurance of Confidentiality to Participants


10. OIRA Question:

What laws are intended to be used to protect this information? The study would benefit from a pledge that the responses will be kept confidential and reported in a manner that will not identify any individual. The privacy statement in the telephone interview would benefit from a statement that the results will be reported in a manner that will not identify any individual.


OCSE Response:

OCSE will protect and hold confidential the information it is requesting in accordance with 42 U.S.C. 1306, 20 CFR 401 and 402, 5 U.S.C. 552 (Freedom of Information Act), 5 U.S.C. 552a (Privacy Act of 1974), and OMB Circular No. A-130. In accordance with the Privacy Act, OCSE’s contractors will provide assurances that all information collected will remain confidential and will not be used in any way that would identify any participants. Before the baseline survey is administered, the interviewer will read a consent statement, which includes a pledge that responses will be kept confidential and reported in a manner that will not identify individual respondents (see pages ii - iii of Information Collection #7). The respondent will also be provided with a hard copy of this consent statement for reference (Attachment A), which includes the following statements:


“The information the research team collects about your employment and earnings, child support agreements, criminal background, credit rating, and other benefits or services you may receive from public programs, and everything you tell the program staff or the interviewers will be kept private and will not be shared with anyone. However, if you tell a person on the study team about child abuse or if you threaten to harm yourself or someone else, it must be reported by law.”


“The information from all study participants will be combined and written up in a report to the U.S. Department of Health and Human Services. Researchers might use information from this study in journals, books or presentations. However, nothing will be said about you as an individual. Instead, information about you will be combined with information about everybody else in the study, so the researchers can say things like ‘30 percent of parents in the program have two children.’”


In addition, the evaluation team will obtain a Certificate of Confidentiality from the National Institutes of Health (NIH) to further protect the privacy of participants’ responses to this study. The Certificate of Confidentiality will allow us to avoid compelled “involuntary disclosure” (e.g., subpoenas) of names and other identifying information about any individual who participates as a research subject. We have added the following text to Information Collection #7 to inform sample members of this additional protection (see page iii):


“The study also has a Certificate of Confidentiality from the National Institutes of Health. This means that we will not share information that could identify you, even if a court asks us to, unless the U.S. Government demands information to audit or evaluate federal projects or to meet the Food and Drug Administration’s requirements. This certificate does not stop you from choosing to share information about yourself or your part in this study.”


11. OIRA Question:

What is the privacy statement for the web survey that the program staff take?


OCSE Response:

The privacy statement for the web-administered program staff survey is provided in the “Introduction” section of Information Collection #3 (page 2). This text will be provided on the first page of the web survey after the respondent logs in. The applicable text reads:


“Participation in the survey is completely voluntary and you may choose to skip any question. Your responses will be kept private and used only for research purposes. They will be combined with the responses of other staff and no individual names will be reported. While there are no direct benefits to participants, your participation will help the U.S. Department of Health and Human Services learn how to better provide services to noncustodial parents who are having difficulty meeting their child support obligations. There is minimal risk related to taking part in this study. In the unlikely event of a data breach, your participation in the demonstration could become known.”


A11. Justification for Sensitive Questions


12. OIRA Question:

This study plans to ask participants about their criminal history, substance abuse, mental health questions, and their love life. There seems to be a lot of sensitive questions being asked. Why do you need the name of the boyfriend or girlfriend? Some States have laws against adultery and respondents may not want to implicate themselves by naming a person. Can you refer to the person as a boyfriend or girlfriend without recording their name? No justification in Table A.2 for sensitive questions on dating activities.


OCSE Response:

The baseline survey asks for the name of the respondent’s current partner (at D20), as well as the names of the respondent’s former partners who are the parents of any of the respondent’s children (at C11). These names are collected so that the interviewer can refer back to the person in subsequent questions and make clear to the respondent whom the interviewer is asking about. To ensure that the interviewer requests the minimum amount of data needed for this purpose, the baseline instrument now requests only first names for these partners, not last names. In addition, at D20 (about current romantic partners), respondents are prompted to provide only initials if they indicate that they do not wish to provide a first name.


Note that the series of questions concerning current romantic partners (questions D19 – D25) is skipped if respondents indicate at question B5 that they are married, or if they indicate at question D4 that they live with the other parent of any of their children. Thus, cohabiting and married respondents are not asked to report whether they also have another current romantic partner. Note also that there was an error in the skip logic of the original draft of IC 7 that was previously submitted (an “AND” should have been an “OR”), which might have created confusion concerning which respondents would be asked these questions. This error has been corrected (page 27, IC 7).
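
As an illustration of the corrected skip logic (a sketch, not the actual CATI specification; the function and argument names below are hypothetical), the series is administered only when neither condition holds:

    def ask_romantic_partner_series(married_at_b5: bool,
                                    lives_with_other_parent_at_d4: bool) -> bool:
        """Return True if questions D19 - D25 should be administered.

        The series is skipped when the respondent is married (B5) OR lives
        with the other parent of any of their children (D4). The original
        draft used AND, which would have skipped only respondents meeting
        both conditions.
        """
        return not (married_at_b5 or lives_with_other_parent_at_d4)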


As requested, we have provided justification for the baseline survey questions on romantic relationships (questions D19 – D25) in Table A.2 (page 15, Part A).


A12. Estimates of Annualized Burden Hours and Costs


13. OIRA Question:

Was there any pretesting on the Baseline Survey (Instrument No. 6), it is approx. 50 pages, the table uses 0.58 hours for a burden estimate, can this be administered in less than 40 minutes?


OCSE Response:

The CSPED baseline instrument is based closely upon the OMB-approved PACT baseline survey (#0970-0403). Our estimate of the burden for the CSPED baseline instrument (0.58 hours or 35 minutes) is based on the experience with the very similar PACT baseline survey, which has now been administered to over 2,000 noncustodial parents. The average length of PACT baseline surveys to date is 34 minutes. In addition, the CSPED team has conducted timing tests with the CSPED instrument using scenarios for family situations likely to be encountered with the CSPED population. These timings yield average survey lengths similar to those being found with the PACT baseline.


14. OIRA Question:

Table A.3 assumes there will be 6,000 participants in the CSPED program but Table A.4 uses 12,600 program applicants for the baseline survey. Why is the baseline survey administered to the applicants and not to the program participants?


OCSE Response:

We expect 12,600 program applicants during the study intake period. Those 12,600 applicants will hear program staff read the introductory script that provides information about the CSPED study (Table A.4, Introductory Script, page 18). We assume that about five percent of the program applicants will be found to be ineligible for the CSPED study or will not consent to participate. Thus, we assume that 12,000 noncustodial parents will agree to participate in the CSPED study and will complete the baseline survey (Table A.4, Baseline Survey, page 18). After completing the baseline survey, the 12,000 study participants will be randomly assigned to either a treatment group that will be offered CSPED program services or a control group that will not, with 50 percent of study participants assigned to each group. Thus, we expect that the 6,000 sample members assigned to the treatment group will be participating in CSPED program services (Table A.3, Study MIS to track program participation, page 17). To measure program impacts, we will be collecting baseline and follow-up information on the full sample of 12,000 study participants, including the 6,000 in the CSPED program group and the 6,000 in the control group.
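
For illustration only, the sketch below mimics the 50/50 random assignment step described above as a fair coin flip at study intake. In practice, random assignment will be performed within the study MIS, whose exact mechanism may differ (for example, it may block or stratify by site).

    import random

    def assign_group(rng: random.Random) -> str:
        """Flip a fair coin at study intake: treatment or control with equal probability."""
        return "treatment" if rng.random() < 0.5 else "control"

    rng = random.Random(2013)
    assignments = [assign_group(rng) for _ in range(12000)]
    # In expectation, about 6,000 of the 12,000 study participants land in each group.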


15. OIRA Question:

Is there a burden estimate for the follow up interviews with the 12,000 respondents who complete the baseline survey?


OCSE Response:

As requested, we have added the burden estimate for the 12-month follow-up interview in Table A.4 (page 18) and discussed the derivation of this estimate in section A.12 (page 19).


16. OIRA Question:

Each grantee will be visited in years 2 and 4 of this study after the initial interview (Instrument No. 3). Are these two supplemental interviews included in the burden estimate?


OCSE Response:

Yes. We plan to conduct site visits to all 8 sites in years 2 and 4 of this study. During each set of site visits, we expect to interview 120 grantee site staff members and community partners (15 per site across 8 sites). Thus, we expect to conduct a total of 240 interviews as part of these site visits. Our protocol for these interviews is described in IC 1 – Staff Interview Topic Guide. We expect these semi-structured interviews about experiences with the program to last approximately 1 hour each. Thus, the total burden for grantee site staff and community partners is 240 hours (120 staff members participating in 2 interviews of 1 hour each), and the total annualized burden over three years is 80 hours. This information is included in Table A.3 under Staff Interview Topic Guide (page 17) and discussed in section A.12 (page 16).


In addition, we expect to conduct the program staff survey (Instrument #3) with 200 grantee site staff and community partners (25 per site across 8 sites) at two points during the evaluation period. We expect the web-based survey to take approximately 30 minutes to complete per respondent. Thus, the total burden for grantee site staff and community partners participating in the program staff survey is 200 hours (200 staff members completing surveys of 0.5 hours at two points during the evaluation period). These two rounds of program staff surveys are included in the burden estimate in section A.12 (page 16) and in Table A.3 (page 16).


17. OIRA Question:

It appears that every person in the baseline study is interviewed twice within 12 months but burden table only shows the burden associated with the first initial contact. How many times will the person be contacted during a 12 month period and what is the burden associated with each contact after the baseline information is collected?


OCSE Response:

Sample members will be interviewed twice as part of the CSPED evaluation, at baseline and then again 12 months later. Since we are not currently requesting clearance for the 12-month follow-up survey, we did not originally include its burden estimate in this OMB package. As requested, we have now added the burden associated with the 12-month follow-up survey to Table A.4. We expect that 9,600 participants (80 percent of the 12,000 who are expected to complete the baseline survey) will complete the 12-month follow-up survey. We expect each survey to last 0.75 hours, for a total of 7,200 burden hours. The total and annualized burden estimates are presented in Table A.4 (page 18) and discussed in section A.12 (page 19).


A16. Plans for Tabulation and Publication and Project Time Schedule


18. OIRA Question:

What are the standard qualitative procedures to be used for analyzing and summarizing the staff interviews and focus groups?


OCSE Response:

The evaluation team will use standard qualitative procedures to analyze and summarize information from staff interviews and focus groups. Analysis will involve organization, coding, triangulation, and theme identification. For each qualitative data collection activity, the team will use standardized templates to document and organize the information collected and then code this documentation. They will search the coded text to gauge consistency and triangulate across respondents and data sources. This process will reduce large volumes of qualitative data to a manageable number of topics, themes, and categories (Yin 1994; Coffey, Holbrook, and Atkinson 1996), which can then be analyzed to address the research questions.


To code the qualitative data for key subtopics and themes, the evaluation team will first develop a coding scheme that builds from the interview or focus group questions. For example, for a father focus group the team might use the following codes to document participants’ experiences in CSPED:


  • Recruitment into CSPED

  • Motivation for enrolling in the program

  • Program participation

  • Barriers to participation

  • Experiences with case management services

  • Experiences with enhanced child support services

  • Experiences with parenting education services

  • Experiences with employment services

  • Perceived benefits of program participation

  • Satisfaction with program services

Senior members of the evaluation team will refine the initial coding scheme by reviewing codes and a preliminary set of data output to make adjustments and ensure alignment with the topics that emerge from the data. For each round of coding, two to three project team members will be trained to code the data using a qualitative analysis software package, such as Atlas.ti or NVivo. To ensure reliability across coders, all team members will code an initial document and compare codes to identify and resolve discrepancies. As coding proceeds, the lead team member will review a sample of coded documents from each coder to monitor reliability.
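
One common way to quantify the coder agreement check described above is Cohen’s kappa, which corrects observed agreement for chance. The sketch below is illustrative only; the example codes are drawn from the list above, and the evaluation team’s actual reliability procedure may use a different statistic or the built-in tools of the qualitative analysis software.

    from collections import Counter

    def cohens_kappa(codes_a, codes_b):
        """Cohen's kappa for two coders' code assignments on the same passages."""
        assert len(codes_a) == len(codes_b)
        n = len(codes_a)
        observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
        freq_a, freq_b = Counter(codes_a), Counter(codes_b)
        expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
        return (observed - expected) / (1 - expected)

    # Example: two coders label the same six focus group excerpts.
    coder1 = ["recruitment", "barriers", "barriers", "satisfaction", "barriers", "recruitment"]
    coder2 = ["recruitment", "barriers", "motivation", "satisfaction", "barriers", "recruitment"]
    print(round(cohens_kappa(coder1, coder2), 2))  # 0.76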


Coded data will enable the team to compare responses across respondents within and across grantees by searching on specific codes. The software will also allow the team to retrieve data on particular codes by type of respondent (for example, case manager or parenting services coordinator). To compare information, the evaluation team may retrieve data for subsets of programs, such as those using the same fatherhood curriculum or those located in rural areas.


We have added this discussion to Part A – Justification (page 21).


SUPPORTING STATEMENT B


B1. Respondent Universe and Sampling Methods


19. OIRA Question:

How will the child support staff be “selected purposively” using the organizational charts? Can you be more specific on certain job titles, position descriptions, responsibilities, critical roles in the process, etc.?


OCSE Response:

The evaluation team will use the CSPED grantees’ organizational charts and information about each staff person’s role in the demonstration to select individuals who are appropriate to address the topics to be covered in the staff interviews. At the grantee level, the team will interview the grantee director as well as key managers and coordinators. Some grantees are implementing CSPED in multiple counties or communities. For those grantees, the team will also interview the grantee’s lead staff member in each community. All grantees are required to partner with other organizations to provide employment and parenting services. The team will interview the lead staff person responsible for grant activities at each partner agency. In addition, they will interview a sample of frontline child support, employment, and parenting staff. If an agency has dedicated more than one frontline staff person to the demonstration, the team will randomly pick one of them to interview. We have added this information to section B1 in Part B: Statistical Methods (page 2).


20. OIRA Question:

The program staff survey will all be asked to complete the web-based survey, so it appears to be a census, but the methodology assumes 25 so is this a cutoff sampling approach where once you have 25 you stop or do you include all staff that work on CSPED programs even if it exceeds 25?


OCSE Response:

The evaluation team intends to include in the web-based staff survey sample all frontline staff members who provide direct services to noncustodial parents participating in the program. We do not anticipate that any grantee will have more than 25 frontline staff. If, however, any grantee does employ more than 25 staff working directly with program participants, the evaluation team will select a random sample of 25 of these staff members to complete the survey.


21. OIRA Question:

Impact study appears to be covering a census of all Non-custodial parents that participate in the program. If this is a sample, what are the selection criteria for eligibility in the sample? Does the individual need to have engaged in at least two program activities or any activities?


OCSE Response:

The reviewer is correct that the impact study is a census of all noncustodial parents who enroll in the study. The impact analysis will include all study participants in both research groups, regardless of their level of program service receipt. This approach ensures that differences in the outcomes of treatment and control group members can be attributed to the program, and not to factors that may be associated with program service receipt.


B2. Procedures for Collection of Information


22. OIRA Question:

What is the burden estimate for the follow up survey after the baseline surveys are done? Is it only one follow up survey or is it done every year for 5 successive years?


OCSE Response:

As described above, we are planning only one follow-up survey, to be conducted 12 months after random assignment. See our responses to questions #15 and #17 under section A12 above concerning the burden estimate for this follow-up survey.


23. OIRA Question:

There is a lot of Administrative record matching for earnings and criminal justice outcomes, and other benefits. Can the survey be reduced from the use of these Administrative records? What is the main reason for collecting the same information on the survey that you can obtain from administrative records seems to be for data editing and data quality checks (it looks duplicative in data collection burden)? How do you know what is more reliable, what the person tells you or what the administrative record shows? If one data source is more accurate, then recommend just using that one data source and reducing the duplication of data collection efforts.


OCSE Response:

For some key measures—such as earnings, criminal justice history, and child support order information—the evaluation team plans to collect information through both the baseline survey and administrative records. The rationale for this approach is discussed below.


Earnings History. Encouraging employment is one of the primary goals of CSPED. Therefore, it is critical that the evaluation have a complete picture of the employment history of study participants. Administrative and survey-based employment information have different strengths, and collecting information from both sources will allow the evaluation team to draw on the distinct strengths of each.


Unlike self-reported survey data, earnings measures based on UI administrative records are not subject to nonresponse or recall error. However, administrative data do not cover all jobs. Workers excluded from UI earnings records include self-employed workers, railroad employees, workers in service for relatives, most agricultural labor, some domestic service workers, part-time employees of nonprofit organizations, and some workers who are casually employed. Workers in these sectors comprise about 10 percent of workers in the U.S. economy (Hotz and Scholz 2002). Informal employment that is not covered in the UI system is likely to be more common for the low wage population that CSPED targets. For that reason, the baseline survey asks specifically about informal employment and earnings from all jobs (baseline survey items E1-E9). This information will allow us to identify participants employed in jobs not included in UI records.


Criminal Justice History. Involvement with the criminal justice system is a key factor associated with employment. Therefore, criminal justice history will be important for subgroup analysis and as a covariate in the impact analysis.


Administrative records are a strong source for criminal justice history since they do not rely on the accuracy of participant responses. Although the evaluation team will attempt to obtain administrative records related to the criminal justice involvement of study participants, in most states these records are not directly accessible by the agencies that administer child support services. State criminal justice agencies may or may not have data sharing relationships with state child support agencies; local and federal criminal justice agencies typically do not. Therefore, the evaluation team may encounter barriers to acquiring complete federal, state, and local criminal justice records for some sites. Collecting a modest amount of information on criminal justice involvement through the baseline survey (survey items F7-F11) provides an assurance that this key information is covered in the event that administrative data on this topic are unavailable or incomplete.


Child Support Order Information. Administrative records from site child support agencies will provide accurate information on established child support orders, including support payment amounts. However, it is important to assess noncustodial parents’ understanding of their responsibilities, which may differ from administrative records. Therefore, the baseline survey asks study participants for information on their child support orders (baseline survey items D9-D11). In addition, these questions serve as a lead-in to questions about informal child support contributions, which are not captured in administrative records. By asking about both formal and informal child support, we anticipate respondents will provide a more accurate accounting of their informal child support contributions.


B3. Methods to Maximize Response Rates and Deal with Nonresponse


24. OIRA Question:

How will you monitor and measure the quality of data reported by grantee sites?


OCSE Response:

The evaluation team will provide extensive training to grantee staff on the use of the Management Information System (MIS) to track program participation. The team will also monitor the quality of the entered data on a regular basis throughout the evaluation period. To assess data quality on an ongoing basis, the team will use a variety of standard reports, including reports showing the frequency of use by each MIS user (for example, user log-ins and data entry statistics) and reports of service use by client, caseworker, and site. The evaluation team will look for data anomalies (for example, 100 percent attendance at a workshop) and for indicators that data are being entered retrospectively (by comparing the dates data are entered to the reported dates of service contacts) to flag potential issues. If data are not being entered or there are suspect values, grantee managers will be asked to investigate more fully.
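
A schematic of the kinds of automated checks described above appears below. It is illustrative only; the field names (site, workshop_id, attended, service_date, entry_date) and the 30-day lag threshold are hypothetical, not the actual MIS schema or report logic.

    import pandas as pd

    # Hypothetical MIS extract: one row per recorded service contact.
    records = pd.DataFrame({
        "site": ["A", "A", "B", "B"],
        "workshop_id": [1, 1, 2, 2],
        "attended": [1, 1, 1, 0],
        "service_date": pd.to_datetime(["2013-10-02", "2013-10-02",
                                        "2013-10-03", "2013-10-03"]),
        "entry_date": pd.to_datetime(["2013-10-02", "2013-10-02",
                                      "2013-11-20", "2013-11-20"]),
    })

    # Flag workshops with 100 percent recorded attendance (a possible anomaly).
    attendance = records.groupby(["site", "workshop_id"])["attended"].mean()
    suspect_full_attendance = attendance[attendance == 1.0]

    # Flag records entered long after the service date (possible retrospective entry).
    lag_days = (records["entry_date"] - records["service_date"]).dt.days
    retrospective_entries = records[lag_days > 30]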


INSTRUMENTS AND CONSENT MATERIALS


Instrument No. 3 Program Staff Survey


25. OIRA Question:

Is Question B6 collecting the staff person’s opinion or the non-custodial parent’s opinion on how easy is it to access various services? Is it useful to know how easy the staff person perceives the access of services issue?



OCSE Response:

This question is about the staff member’s opinion about the accessibility of other services in the community. Staff members who work directly with noncustodial parents are likely to have a good knowledge of other community services available for the population they serve and their accessibility. Responses to this question will be helpful for documenting services that may be accessible to control group members.


26. OIRA Question:

The responses to Question D4 will benefit from a pledge of confidentiality.


OCSE Response:

Please see response to question #11 above.


Instrument No. 7. Baseline Survey –


27. OIRA Question:

This is a long survey for a burden estimate of 0.58 hours.

OCSE Response:

Please see the response to question #13 above.


28. OIRA Question:

Is question A7 needed?


OCSE Response:

The ability of the evaluation team to reach respondents for the 12-month follow-up survey is crucial to ensuring the quality of the data that will be used for the impact analysis. In many cases, sample members will be reached on their cell phones for the follow-up survey. Question A7 is designed to help the evaluation team understand the type of cell phone contract sample members have, so that they can use the most appropriate method for locating them for the follow-up survey.


Items A7_a and A7_b help assess the likelihood that the team will be able to reach the respondent at the cell phone number provided, since respondents with a cell phone contract are much more likely to have the same telephone number 12 months later. In contrast, respondents with ‘pay as you go’ plans may experience service interruptions at times when additional minutes need to be purchased for the phone. Items A7_c and A7_d help assess the burden that might be introduced by calling or texting respondents on their cell phone. Respondents with unlimited calls and texts are likely to be much more comfortable being contacted using these methods at the time of the follow-up survey.


29. OIRA Question:

Can Questions C2-C11 be completed from information already in the participant’s application?



OCSE Response:

Noncustodial parents do not fill out a separate application for the CSPED program. Therefore, the baseline interview is the only mechanism through which this information can be obtained. In addition, because these questions ask the respondent to provide information about all of their children, including children for whom the respondent does not have a child support order, these data are not available through administrative records or child support data. The respondent is the sole source of complete information on relationships with and support provided to each of his or her biological children.


30. OIRA Question:

Question E10, the categories “A Little” and “Somewhat” seem similar. Same issue with “Very” and “Extremely”


OCSE Response:

This scale coding was developed by Dr. Nora Cate Schaeffer at the University of Wisconsin-Madison, an expert in survey design, question development, and measurement. Her research indicates that these categories are more successful than typical scales at eliciting variation in responses to these sorts of questions and at avoiding large proportions of respondents at either the bottom or top end of the response categories.


31. OIRA Question:

This survey instrument seems to have some redundancy. Can Questions F7 – F11 be answered from administrative records? Attachment B explains the justification for collecting this information; however, do you need to collect it from both the individual and administrative records?


OCSE Response:

Please see the response to question #23 above.


32. OIRA Question:

Questions F9 and F10 are asking for the same information; one provides info in discrete units and the other provides categorical info. Do you need both?


OCSE Response:

The reviewer is correct that information from both of these items is not needed. An earlier version of the survey contained a mistake in the skip logic that did not reflect this fact. The current version of the baseline survey instrument asks for categorical information on the timing of release from jail or prison (survey item F10) only if the respondent does not answer the survey item asking for the date of release (survey item F9).


33. OIRA Question:

Question A1 and H1 are the same.


OCSE Response:

We believe that the reviewer is referring to questions A4 (rather than A1) and H1, both of which ask for the respondent’s address. We agree that item H1 can be deleted and H2 slightly revised to read “Would you like me to send your $10 gift card to the address you gave me at the beginning of the interview?” The address can be displayed for interviewers to verify if needed. IC 7 has been revised to reflect this change.


34. OIRA Question:

H9 – H11 collect a lot of PII on persons that know the respondent. Will this information be deleted after the follow up interview?


OCSE Response:

Please see the response to question #6 above.


35. OIRA Question:

Of course, it would be helpful to review the script for the follow up interview.


OCSE Response:

The follow-up survey has not yet been developed. Attachment F presents the plans for that instrument and its likely content.


36. OIRA Question:

At some point in the course of the surveys or introductory scripts, does the term “noncustodial parent” need to be defined for participants or will they already be sufficiently acquainted with the term by the time they come into contact with this IC?


OCSE Response:

Based on their prior interaction with the child support system, we believe that the term “noncustodial parent” will be familiar to respondents.


37. OIRA Question:

Please make sure it is clear that information will be kept private only to the extent permitted by law (Question #5, p. ii).


OCSE Response:

To make sure this point is clear, we have moved up the language that explains when the law would not allow this information to be kept private. In particular, we have moved the following sentence up to question #5: “If you tell a person on the study team about child abuse or if you threaten to harm yourself or someone else, it must be reported by law.”


38. OIRA Question:

Why are you collecting their names on social networking sites? Is there a way to capture the information desired in this study without accumulating that much personal information?


OCSE Response:

The evaluation team plans to ask respondents for their names on social networking sites to improve the team’s ability to locate program participants for the 12-month follow-up interview. Based on prior experience, the team anticipates that this population will be highly mobile and will likely change their telephone and address information one or more times between survey administrations. Obtaining social networking information provides another avenue for reaching participants if their telephone, address, and email information is no longer valid.


The evaluation team has used this approach successfully to locate hard-to-reach sample members on similar projects. This information will only be used for the purpose of locating participants at the time of the 12-month follow-up survey and will be stored on secure servers with access limited to project staff on a need-to-know basis. The data will be destroyed with all other PII at the end of the evaluation period.


As with all items on the survey, participants may decline to provide this information if they are not comfortable doing so. If a respondent provides this information and the evaluation team needs to use it to find the participant, contact would be initiated through the social networking site by sending a personal message directly to the name provided. Access to the site would need to be approved by the participant.


In addition, please note that in the revised IC 7 we have deleted the questions asking the respondent to provide social networking information for contact persons. We now ask respondents to provide this information only about themselves.


Attachment A: Hard copy of Consent Statement to be read


39. OIRA Question:

This is a long statement for a person to hear. Not sure if they will understand or retain much of this. Can there be a written statement that the person reads and signs to show their consent?


OCSE Response:

The consent language and consent process included in this revised package have been approved by the University of Wisconsin IRB, the IRB overseeing this project. The IRB requires that the evaluation team use this consent language and follow these consent procedures for the evaluation to proceed.


We believe that the consent procedures that have been developed by the evaluation team and approved by the IRB provide strong assurances that study consent will be understood by potential study participants. Oral consent administered by a small set of experienced interviewers with specialized training will help ensure that potential study participants receive consistent, accurate, and complete information about the study. In addition, if participants have questions during the consent process, they will hear uniform responses. Given the limited reading abilities of some potential sample members, having a trained interviewer read the consent language to them, pausing at specific points to ask if the respondent has questions, helps ensure that the consent language is understood by potential study participants. Intake workers in each study location will also provide a hard copy of the consent language that sample members can take with them after completing the intake process. We are following similar consent procedures as part of the OMB-approved PACT baseline survey (#0970-0403) and have not received negative feedback from sites or respondents about the consent process.


40. OIRA Question:

Also, please make sure that the discussion on protection of privacy (p. 3, p. 5 consent to participate) limits the privacy commitment to the extent permitted by law (currently it does not).


OCSE Response:

We have revised the discussion on protection of privacy in Attachment A in response to your concerns and to recommendations from the University of Wisconsin’s Institutional Review Board. Currently, the first paragraph on page 3 reads:


“This information the research team collects about your employment and earnings, child support agreements, criminal background, credit rating, and other benefits or services you may receive from public programs, and everything you tell the program staff or the interviewers will be kept private and will not be shared with anyone. However, if you tell a person on the study team about child abuse or if you threaten to harm yourself or someone else, it must be reported by law.”


Based on recommendations from the University of Wisconsin’s Institutional Review Board, we have also added the following language to page 3:


“This study also has a Certificate of Confidentiality from the National Institutes of Health. This means that we will not share information that could identify you, even if a court asks us to, unless the U.S. Government demands information to audit or evaluate federal projects or to meet the Food and Drug Administration’s requirements. This certificate does not stop you from choosing to share information about yourself or your part in this study.”


The fifth page of the Consent Statement is a review of the consent information; it is not intended to repeat verbatim what has already been said, but rather to summarize it.


Attachment C FAQs


41. OIRA Question:

The Response to the question, “when will I be notified about my group assignment,” seems to suggest that the respondent will be informed immediately after the interview. Is that correct or is there some lag? If the latter, it would probably be better to state that directly.


OCSE Response:

Yes, sample members will be notified of their group assignment immediately following the baseline interview. The respondent will hand the phone used to complete the baseline interview back to the caseworker. The interviewer will verify for the caseworker that consent has been obtained and the interview has been completed. The caseworker will then locate the participant in the MIS, use the system to perform random assignment, and immediately inform the client of the group to which he or she has been randomly assigned.



Attachment D Reminders


42. OIRA Question:

Is it worth specifying that we look forward to speaking with them again in six months? In the next few months? I don’t know if people may expect a follow up survey shortly after the reminder and then assume the study is over when they don’t hear from us for six months.


OCSE Response:

We agree with the suggestion to add this text. Attachment D has been revised to state that we will contact the respondent again a few months after the date of the text message.


IC2. Focus Group Guide


43. OIRA Question:

We often see more detailed guides (sometimes including text scripts) for focus groups. This is not necessary, but it might be worth at least suggesting prompts to probe on certain issues, particularly where the current outline asks a more general statement. (e.g. “Barriers to participation in each type of service.” – possible prompt: when you decided not to attend/take advantage of a certain resource, what were common reasons for not doing so?)


OCSE Response:

Given your suggestion, we have revised the focus group guide to include an introductory script and prompts to probe each question. The revised protocol for the noncustodial parent focus group is included as IC 2 - Noncustodial Parent Focus Group Protocol.


REFERENCES

Coffey, Amanda, Beverly L. Holbrook, and Paul Atkinson. “Qualitative Data Analysis: Technologies and Representations.” Sociological Research Online, vol. 1, no. 1, 1996. Available at http://www.socresonline.org.uk/index_by_issue.html.

Hotz, V. Joseph, and John Karl Scholz. “Measuring Employment Income for Low-Income Populations with Administrative and Survey Data.” In Studies of Welfare Populations: Data Collection and Research Issues, edited by M. Ver Ploeg, R. Moffitt, and C. Citro. Washington, DC: National Academy Press, 2002, pp. 275-315.


Yin, Robert K. Case Study Research: Design and Methods. 2nd ed. Thousand Oaks, CA: Sage Publications, 1994.


