Alternative Supporting Statement for Information Collections Designed for Research, Public Health Surveillance, and Program Evaluation Purposes





Next Steps for Rigorous Research on Two-Generation Programs



Formative Data Collections for ACF Research


0970-0356





Supporting Statement

Part A

JULY 2020


Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officer: Kathleen Dwyer







Part A



Executive Summary


  • Type of Request: This Information Collection Request is for a generic information collection under the umbrella generic, Formative Data Collections for ACF Research (0970-0356).


  • Description of Request: The purpose of the Next Steps for Rigorous Research on Two-Generation Programs (NS2G) project is to build the capacity of the two-generation field for future ACF research. NS2G will support formative evaluations designed to strengthen existing two-generation programs and inform the broader two-generation program field about approaches programs can take to improve their program models and readiness for evaluation. Data collection activities will occur in up to five sites and, within each site, will include phone calls, staff interviews, and focus groups to learn about the site’s two-generation programs and document their services, strengths, gaps, and participant needs. Data collection will also include a staff survey to inform program improvement efforts. We do not intend for this information to be used as the principal basis for public policy decisions.



  • Time Sensitivity: The project requires clearance by August 28, 2020, in order to secure programs’ participation and establish memoranda of understanding (MOUs) before the start of the formative evaluations, expected to begin in late September 2020.






A1. Necessity for Collection

The goal of the Next Steps for Rigorous Research on Two-Generation Programs (NS2G) project is to build the capacity of the two-generation field for future ACF research. Two-generation approaches intentionally combine adult education and employment-focused training for parents with accessible, high-quality early care and education for children. Such programs could improve family well-being and reduce the transmission of poverty across generations: offering parents education and job training that can lead to well-paying jobs, while providing their children with high-quality early care and education, can help move families toward economic security and support children’s development to promote their success in school and, ultimately, as adults. However, there is little evidence on the effectiveness of such programs.


NS2G builds on the earlier project, Exploration of Integrated Approaches to Supporting Child Development and Improving Family Economic Security, which demonstrated that to prepare for effectiveness evaluations, contemporary two-generation programs will first need to use formative evaluation, among other efforts, to refine their models (Sama-Miller et al. 2017). NS2G provides an opportunity to strengthen selected two-generation programs, extend the lessons from those activities to the broader two-generation program field, and ultimately prepare two-generation programs for summative evaluation.


There are no legal or administrative requirements that necessitate this collection. ACF is undertaking the collection at the discretion of the agency.


A2. Purpose

Purpose and Use

The proposed information collection will support formative evaluations and technical assistance (TA). The activities meet the following goals of ACF’s generic clearance for formative data collections for research and evaluation (0970-0356):

  • inform the development of ACF research

  • maintain a research agenda that is rigorous and relevant

  • inform the provision of TA.


Through a program confirmation telephone interview (Instrument 1), semi-structured interviews and design thinking activities with program staff and partners (Instrument 2), and a participant focus group (Instrument 3), we will collect qualitative data to develop an understanding of how two-generation programs are designed and implemented, how program participants interact with two-generation programs, and implementation challenges and opportunities to strengthen two-generation program design and implementation. While we are planning to administer Instruments 2 and 3 on-site and in person, we may need to administer them virtually given safety concerns around the COVID-19 pandemic. We will work with selected programs on TA activities designed to strengthen program design and implementation; a program staff survey (Instrument 4) will provide data for formative rapid-cycle evaluations (RCEs) intended to pilot-test and refine enhancements to two-generation program services and implementation.


Throughout the formative evaluations and TA, the NS2G project team will provide ongoing TA to the programs through regularly scheduled calls with program staff. The objective of TA calls is to coach program staff in strengthening their program models through activities such as refining a logic model, monitoring program data that the program collects, and pilot-testing improvements to the program model through formative rapid-cycle evaluation. At the mid- and endpoints of the project, we will use a short web survey to collect feedback from study participants on the process of participating in the formative evaluation (Instrument 5).


The information collected is meant to contribute to the body of knowledge on ACF programs. It is not intended to be used as the principal basis for a decision by a federal decision-maker, and is not expected to meet the threshold of influential or highly influential scientific information. The NS2G project team will summarize its insights from the formative evaluations and TA in a final report and up to three briefs. In these briefs, the team may share information on TA processes and experiences working with the programs to provide examples that may strengthen the field of two-generation programming.


Research Questions or Tests

The instruments included in this information collection request are designed to answer the following questions about each participating program’s formative evaluation:


  • Program confirmation protocol (Instrument 1):

    • What data do two-generation programs collect about program services and participants?

    • How do two-generation programs use the data they collect?

    • What challenges do two-generation programs face in collecting data and tracking participants?

    • What are two-generation programs’ experiences with continuous quality improvement?

  • Site visit topic guide (Instrument 2):

    • How are two-generation programs designed, operated, and staffed to meet the needs of low-income parents and children?

    • How do stakeholders partner with each other to design and implement two-generation programs?

    • How are services in a two-generation program aligned and integrated to intentionally serve parents and children from the same family?

    • How do two-generation programs monitor program services, ensure fidelity to a two-generation program model, and improve implementation?

    • What are the main challenges to implementing two-generation programs, and what opportunities exist to address them?

  • Participant focus group protocol (Instrument 3):

    • What are families’ experiences in two-generation programs?

  • Program staff survey (Instrument 4):

    • What are two-generation program staff members’ experiences implementing a new tool, strategy, or process designed to improve program services?

  • Formative evaluation feedback survey (Instrument 5):

    • How beneficial did two-generation program staff find the process of participating in formative evaluations?

    • What topics did two-generation programs make progress on addressing through the formative evaluations?

    • What topics were two-generation programs unable to address in the formative evaluations?

    • What facilitators and challenges did two-generation programs encounter in the formative evaluations?


Study Design

The NS2G project will identify and partner with up to five two-generation programs to conduct formative RCEs informed by the LI2 framework. LI2, developed by Mathematica and the Office of Planning, Research, and Evaluation (OPRE), supports collaboration between researchers and practitioners to create sustainable program change through activities in three phases (Learn, Innovate, and Improve). LI2 employs rigorous research techniques to develop actionable evidence for programs, build programs’ capacity for research and evaluation, and build knowledge for the broader research field (Derr et al. 2017). OPRE has contracted with Mathematica to conduct the information collections described in this request package.


The study team will consult with a group of experts (see A8) to select two-generation programs for the study. Because the programs are purposively selected (see B2), this study is not designed to be representative of or generalizable to a specific sub-population. The information collected in this study will be qualitative and based on the perceptions and self-reports of staff members at participating two-generation programs. This study does not collect outcome data on participants in two-generation programs and has a relatively short time frame. As a result, NS2G will be limited in what it can conclude about whether the program models developed and refined through the formative evaluations and TA are effective at improving outcomes for low-income families. The primary use of the findings from NS2G will be to provide recommendations for continued evidence-building. To provide those recommendations, NS2G will develop a summative report and up to three briefs about the formative evaluations and TA. The products will not share quantitative findings or hard data about the programs. In sharing findings, we will describe the study methods and the limitations of the findings, both for generalizability and as a basis for policy.


Each two-generation program in NS2G will participate in its own formative evaluation and receive its own TA from Mathematica. First, in collaboration with Mathematica TA liaisons, each participating two-generation program will begin its TA process by developing a formative evaluation and TA plan that specifies the types of activities the program will engage in (supported by Instrument 1). (Calls using Instrument 1 will begin before the formal start of the formative evaluations and TA, because one of the goals of the instrument is to secure programs’ participation; see A16.) As a part of the “Learn” phase, Mathematica will conduct a two-day site visit (supported by Instruments 2 and 3) to develop an understanding of program services and implementation challenges.

Following the site visit, as part of the “Innovate” phase, each program will develop a detailed logic model and design strategies to address implementation challenges. During the Innovate phase, each program will participate in monthly teleconferences with Mathematica TA liaisons. Once programs have developed strategies to address implementation challenges, they will move on to the “Improve” phase. During this phase, programs will conduct their own formative RCEs to pilot-test their strategies with support from their TA liaison, collect feedback (supported by Instrument 4), and use the feedback to refine the implementation of their strategies.

About two years after the start of their participation in the study, programs will participate in a briefing to share what they learned from their formative evaluations and TA and to develop action plans to continue their work through the end of the project. Following this briefing, Mathematica TA liaisons will hold teleconferences with each program every other month to check on the status of each program’s action plans. At the midpoint and end of the formative evaluations and TA, Mathematica will collect feedback about the process of participating in the formative evaluations and TA (supported by Instrument 5). See Supporting Statement B, section B4 for more information about data collection procedures.


Table 1. Study Design Components and Timeline

Data collection activity: Program confirmation call
Instrument: Program Confirmation Protocol (Instrument 1)
Respondents: Program leader
Content: Data collected by the program about families, how these data are used by the program, previous experience with TA, partnerships with other programs and services, and availability to participate in NS2G
Purpose: To prepare the program to sign an MOU and develop plans for formative evaluation and TA
Mode: Telephone
Duration: 90 minutes
Expected timeline: August 2020–October 2020

Data collection activity: Interviews
Instrument: Site Visit Topic Guide (Instrument 2)
Respondents: Program leaders, mid-level managers, frontline staff, and program partner directors
Day 1 content: Community context, program vision and goals, partners, intake, service delivery and case flow, program staffing, data management, program improvement, observation of program activity
Day 1 purpose: To deepen our understanding of the site’s TA needs and stage of development
Day 2 content: Program logic model and relationship between program inputs, activities, and expected outputs and outcomes; program stakeholder map; program implementation challenges and opportunities to strengthen the two-generation program model; feedback on design-thinking session
Day 2 purpose: To identify opportunities to develop and improve the program's two-generation approach and generate creative solutions to challenges the program is facing
Mode: In-person (Day 2 comment card on paper); if site visits are virtual, then activities will be conducted by videoconference and web
Duration: Up to 2 working days (Day 1: 90 minutes per respondent; Day 2: 8 hours for all participating staff)
Expected timeline: December 2020–March 2021

Data collection activity: Focus groups
Instrument: Participant Focus Group Protocol (Instrument 3)
Respondents: Program participants
Content: Activities their family participated in, what they find helpful about the program, and what they think can be improved
Purpose: To collect information on the participants’ experiences in the program
Mode: In-person; if site visits are virtual, then activities will be conducted by videoconference
Duration: 90 minutes
Expected timeline: December 2020–March 2021

Data collection activity: Survey
Instrument: Program Staff Survey (Instrument 4)
Respondents: Program staff
Content: Summary information about the implementation of a program improvement strategy that a site has developed or revised in partnership with the NS2G TA liaisons
Purpose: To understand how staff implement the improvement strategy that is being piloted and gather their feedback about the experience
Mode: Web
Duration: 10 minutes for each response, with each respondent answering up to 24 times over the RCE (once a week over three 8-week cycles)
Expected timeline: July 2021–December 2022

Data collection activity: Survey
Instrument: Formative Evaluation Feedback Survey (Instrument 5)
Respondents: Program staff members who attend regular TA calls
Content: Summary information about program staff experiences participating in the formative evaluations
Purpose: To understand whether and how programs improved capacity and readiness for evaluation as a result of participating in the formative evaluations
Mode: Web
Duration: 20 minutes for each response, with each respondent answering twice over the formative evaluation
Expected timeline: January 2022 and March 2023

Note: Timeline is dependent on the timing of OMB approval of this information collection request.


The data collection procedures for study activities are included in Supporting Statement B.


Other Data Sources and Uses of Information

During the formative evaluation, we will hold regular TA calls with program staff. We may use call notes from TA calls to inform the summative report and briefs.


At the end of the formative evaluations and TA, Mathematica will work with each program to develop and deliver a final briefing that describes their main findings and lessons from participating. We may use these final briefing materials to inform the summative report and briefs.


A3. Use of Information Technology to Reduce Burden

The web-based staff survey for the formative RCE and the formative evaluation feedback survey will be administered using SurveyMonkey. SurveyMonkey is an intuitive platform that is easy for respondents to navigate. It is also optimized for mobile use, so respondents will have the option to use a smartphone or tablet to complete the survey.


During the site visits, we may audio-record interviews and focus groups. The purpose of the audio recordings is to verify the accuracy of written notes, and participants will be given privacy assurances (See A10). If site visit activities are conducted remotely, we will provide participants with a toll-free conference line and videoconferencing software, such as WebEx, so that programs do not incur extra costs from their participation.


A4. Use of Existing Data: Efforts to reduce duplication, minimize burden, and increase utility and government efficiency

The study team is not collecting any information that is available elsewhere. None of the five instruments ask for information that can be reliably obtained through other sources.


A5. Impact on Small Businesses

Some of the programs included in the study will be part of small organizations, including community-based organizations and other nonprofits. The study team will minimize burden for respondents by scheduling the initial site visit at a time that is convenient for them. To minimize interference of the formative RCE with program staff members’ job responsibilities, the program staff surveys (Instruments 4 and 5) have been designed to collect feedback in a manner that is rapid, low burden, and actionable.


A6. Consequences of Less Frequent Collection

The program confirmation call (Instrument 1), program staff interviews (Instrument 2), and participant focus groups (Instrument 3) are one-time data collection activities. Each program staff respondent will complete the 10-minute RCE survey (Instrument 4) up to 24 times over the three-year formative evaluation and TA engagement, depending on how frequently this exercise is deemed useful to the program while piloting a new strategy or process. The short time frame and multiple responses facilitate rapid adaptation and refinement when program changes do not appear to be working as intended. Administering the survey less frequently would yield less actionable data for programs to use to refine their strategies. The formative evaluation feedback survey will be administered twice. The first administration will be used to assess progress in the formative evaluations and make adjustments to our approach. The second administration will contribute to an overall assessment of the success of NS2G and provide information to OPRE about the usefulness of formative evaluation and TA to strengthen two-generation programs and improve their readiness for evaluation.

A7. Now subsumed under 2(b) above and 10 (below)


A8. Consultation

Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of the overarching generic clearance for formative information collection. This notice was published on October 11, 2017, Volume 82, Number 195, page 47212, and provided a sixty-day period for public comment. During the notice and comment period, no substantive comments were received.



Consultation with Experts Outside of the Study

We have consulted with experts in the two-generation field from the onset of the study. Experts provided input on what the project team should aim to learn about the programs and which formative evaluation and TA topics would be beneficial for the programs and for the field. Their input informed the content included in the attached instruments, including the information the study team should collect during the program confirmation call prior to beginning any formative evaluation and TA. To date, the study team has engaged the following experts in these discussions: Christopher King (University of Texas-Austin), Allison Holmes (Annie E. Casey Foundation), Marjorie Sims (Ascend), and Sharon McGroder (Sharon McGroder Consulting).


A9. Tokens of Appreciation

We do not plan to offer tokens of appreciation for this information collection.


A10. Privacy: Procedures to protect privacy of information, while maximizing data sharing

Personally Identifiable Information

This information collection request includes the collection of minimal personally identifiable information (PII). In order to build rapport for the telephone and in-person conversations, we need to be able to address staff by name, particularly in the context of a group interview. Not only is it respectful to refer to staff by name, but it will also help to make the interview efficient by allowing the interviewer to direct specific questions to the relevant staff. Staff names will not be connected with interview responses. This study does not use an information system that uses personal identifiers to retrieve data.


The five instruments included in this request will collect the following personal information from respondents:

  • The program confirmation call (Instrument 1) includes a request for respondent staff members’ names, how long they have worked at the program, and a description of their role at the program.

  • The site visit topic guide (Instrument 2) includes a request for staff interviewees’ names, the number of years they have worked at the program, and a description of their role at the program.

  • The participant focus group protocol (Instrument 3) includes a request for participants’ names, how long they have participated in the program, and which family members participate in program services.

  • The program staff survey (Instrument 4) does not request any PII.

  • The formative evaluation feedback survey (Instrument 5) does not request any PII.

The project team will also have access to staff names and email addresses through ongoing TA activities. The collection of staff email addresses is necessary to administer the web-based program staff survey and formative evaluation feedback survey (Instruments 4 and 5) via SurveyMonkey.


Assurances of Privacy

Information collected will be kept private to the extent permitted by law. Before answering any questions included in the instruments, respondents will be informed of all planned uses of data, that their participation is voluntary, and that their information will be kept private to the extent permitted by law. These assurances will be provided verbally by the interviewers for Instruments 1, 2, and 3, and are included in the introductory interviewer script in each of these instruments. In Instruments 4 and 5, the assurances will be provided on the first screen, which respondents will read before proceeding to the questions included in the instruments.


As specified in the contract, the Contractor will comply with all Federal and Departmental regulations for private information. Mathematica’s Institutional Review Board (IRB) is reviewing the study and approach to protecting human subjects.


Data Security and Monitoring

This project will comply with Mathematica’s data security policies. Only staff from Mathematica will handle data collected under this clearance. All Mathematica staff involved in the project will receive training on (1) limitations of disclosure; (2) safeguarding the physical work environment; and (3) storing, transmitting, and destroying data securely. All Mathematica staff sign the Mathematica Confidentiality Agreement, complete online security awareness training when they are hired, and receive annual refresher training thereafter. Training addresses security policies and procedures found in the Mathematica Corporate Security Manual.


Physical copies of materials (e.g., poster paper) created using the Site Visit Topic Guide (Instrument 2) will be destroyed on site. We will photograph the materials to use for ongoing TA. These photographs will only be of handwritten notes from program staff. We will confirm that no names, likenesses, or other PII is included before taking the photographs (if the visit is virtual, then we will use screen capture or other recording supported by videoconferencing software). We will collect comment cards at the end of Day 2 of the site visit, enter the information into a spreadsheet when we return to the Mathematica office, and destroy physical copies securely (if the visit is virtual, then we will administer the survey via the web).


Audio files, images, and notes obtained during the site visit will be stored in an encrypted project folder on Mathematica’s network. Mathematica uses access control lists to restrict access to the encrypted project folders where sensitive and confidential project data are stored. Access to the project folder is explicitly authorized by the Project Director on need-to-know and least privilege bases. Mathematica staff are required to change their password for computer and network access every thirty days, and passwords must adhere to strict composition standards. Staff access rights to the project folder are revoked when they leave the project. If a staff member leaves Mathematica, his or her access to computing assets, including network access, is terminated. Staff are trained to retain digital files (such as recordings stored on a digital voice recorder) and hard copies of documents collected on site either on their persons or in a locked cabinet until such time that they can be safely transferred to Mathematica’s network.




A11. Sensitive Information 1

The information collection will not ask about sensitive information.


A12. Burden

Explanation of Burden Estimates

The current request includes burden estimates to cover the following activities:


Instrument 1. Program Confirmation Protocol: We will conduct the program confirmation call with up to five programs. We will conduct this 1.5-hour call with 20 total staff (four staff per program). This activity will occur once over the request period. The total burden is 30 hours, and the estimated annualized burden is 10 hours.

Instrument 2. Site Visit Topic Guide: We will conduct site visits to up to five programs (either in person or virtually). We will engage up to 100 total program and partner staff (20 per program) to participate in the site visits. During the first day of the site visit, we will conduct separate interviews with program leaders and managers, program supervisors, and frontline staff. Each interview will take no longer than 1.5 hours. During the second day of the site visit, staff will spend the day participating in activities to help the program define or update a theory of change, identify challenges to program implementation, and engage in creative problem solving. We estimate that participating in these activities will take 8 hours. This activity will occur once over the request period. The total burden is 950 hours, and the estimated annualized burden is 317 hours.

Instrument 3. Participant Focus Group Protocol: We will conduct the Participant Focus Group at up to five sites. We estimate that a total of 60 people (12 per program) will participate in a 1.5-hour focus group. This activity will occur once over the request period. The total burden is 90 hours, and the estimated annualized burden is 30 hours.

Instrument 4. Program Staff Survey: We will conduct the program staff survey with a total of 50 staff (10 per program) at up to five sites. We estimate that each staff member will spend 10 minutes on the survey each time they complete it. This activity will occur 24 total times (8 times during each formative evaluation cycle over three cycles during the request period). The total burden is 192 hours, and the estimated annualized burden is 64 hours.


Instrument 5. Formative evaluation feedback survey: We will conduct the formative evaluation feedback survey with a total of 15 staff (3 per program) at up to five sites. We estimate that each staff member will spend 20 minutes on the survey each time they complete it. This activity will occur twice (once at the midpoint of the formative evaluation and once at the end). The total burden is 10 hours, and the estimated annual burden is 3 hours.
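
For readers who want to verify the burden arithmetic above, the following is a minimal illustrative sketch (in Python; not part of the submitted materials) of how the total and annualized burden hours can be reproduced. The three-year annualization period is an assumption, consistent with the three-year formative evaluation and TA engagement described in A6 and with the ratio of total to annual burden hours in the table below.

```python
# Illustrative sketch only: reproduces the burden-hour arithmetic described above.
# Assumes a three-year request period for annualization (see A6).

instruments = [
    # (label, respondents, responses per respondent, avg. hours per response)
    ("Instrument 1: Program Confirmation Protocol", 20, 1, 1.5),
    ("Instrument 2: Site Visit Topic Guide", 100, 1, 9.5),
    ("Instrument 3: Participant Focus Group Protocol", 60, 1, 1.5),
    ("Instrument 4: Program Staff Survey", 50, 24, 0.16),
    ("Instrument 5: Formative Evaluation Feedback Survey", 15, 2, 0.33),
]

REQUEST_PERIOD_YEARS = 3  # assumption; matches the reported total-to-annual ratios

for label, respondents, responses, hours_per_response in instruments:
    total_hours = round(respondents * responses * hours_per_response)
    annual_hours = round(total_hours / REQUEST_PERIOD_YEARS)
    print(f"{label}: {total_hours} total burden hours; {annual_hours} annual burden hours")
```

Run as written, this sketch reproduces the total and annual burden hours reported in the burden table below.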


Estimated Annualized Cost to Respondents

We expect the total annual cost for respondents to be $15,776.56 for the information collection in the current request. For each instrument included in the burden table, we calculated the total annual cost by multiplying the annual burden hours by the average hourly wage.


For Instruments 1, 2, 4, and 5, we estimate the annualized cost to respondents using average hourly wage estimates derived from Current Population Survey data for the fourth quarter of 2019 (Bureau of Labor Statistics 2019). For these respondents, we used the median usual weekly earnings for full-time wage and salary workers ages 25 and older with a bachelor’s degree ($39.49 per hour). We divided weekly earnings by 35 hours (the number of hours the Current Population Survey defines as full time) to calculate hourly wages.


For Instrument 3, we use the federal minimum wage ($7.25) to estimate the annualized cost for respondents because two-generation programs serve predominantly low-income, unemployed, or underemployed populations.
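
As a further illustration, the sketch below (again, not part of the submitted materials) reproduces the respondent cost arithmetic using the annual burden hours and wage rates described above; the weekly earnings figure noted in the comments is implied by the stated $39.49 hourly rate rather than quoted directly from the BLS tables.

```python
# Illustrative sketch only: reproduces the respondent-cost arithmetic described above.

FULL_TIME_HOURS_PER_WEEK = 35     # how the Current Population Survey defines full time
hourly_wage_staff = 39.49         # implies median weekly earnings of about $1,382 (39.49 * 35)
hourly_wage_participants = 7.25   # federal minimum wage, used for Instrument 3 respondents

# Annual respondent cost = annual burden hours * average hourly wage
annual_costs = {
    "Instrument 1": 10 * hourly_wage_staff,         # $394.90
    "Instrument 2": 317 * hourly_wage_staff,        # $12,518.33
    "Instrument 3": 30 * hourly_wage_participants,  # $217.50
    "Instrument 4": 64 * hourly_wage_staff,         # $2,527.36
    "Instrument 5": 3 * hourly_wage_staff,          # $118.47
}
total_annual_cost = round(sum(annual_costs.values()), 2)
print(total_annual_cost)  # 15776.56
```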


Instrument 1: Program Confirmation Protocol
No. of respondents (total over request period): 20; No. of responses per respondent: 1; Avg. burden per response: 1.5 hours; Total burden: 30 hours; Annual burden: 10 hours; Average hourly wage: $39.49; Total annual respondent cost: $394.90

Instrument 2: Site Visit Topic Guide
No. of respondents (total over request period): 100; No. of responses per respondent: 1; Avg. burden per response: 9.5 hours; Total burden: 950 hours; Annual burden: 317 hours; Average hourly wage: $39.49; Total annual respondent cost: $12,518.33

Instrument 3: Participant Focus Group Protocol
No. of respondents (total over request period): 60; No. of responses per respondent: 1; Avg. burden per response: 1.5 hours; Total burden: 90 hours; Annual burden: 30 hours; Average hourly wage: $7.25; Total annual respondent cost: $217.50

Instrument 4: Program Staff Survey
No. of respondents (total over request period): 50; No. of responses per respondent: 24; Avg. burden per response: 0.16 hours; Total burden: 192 hours; Annual burden: 64 hours; Average hourly wage: $39.49; Total annual respondent cost: $2,527.36

Instrument 5: Formative Evaluation Feedback Survey
No. of respondents (total over request period): 15; No. of responses per respondent: 2; Avg. burden per response: 0.33 hours; Total burden: 10 hours; Annual burden: 3 hours; Average hourly wage: $39.49; Total annual respondent cost: $118.47

Total
Total burden: 1,272 hours; Annual burden: 424 hours; Total annual respondent cost: $15,776.56


A13. Costs

There are no additional costs to respondents.


A14. Estimated Annualized Costs to the Federal Government

The total cost for data collection under this current request will be $674,816.

Estimated costs by cost category:

Instrument Development and OMB Clearance: $62,011
Field Work: $477,677
Publications/Dissemination: $135,128
Total costs over the request period: $674,816
Annual costs: $134,964


A15. Reasons for changes in burden

This is for an individual information collection under the umbrella formative generic clearance for ACF research (0970-0356).


A16. Timeline

The study team will engage two-generation programs for potential inclusion in NS2G before finalizing a list of up to five programs by October 2020. The tentative timeline for activities related to collecting and reporting data is outlined below.

Data collection
  • Program confirmation calls: August 2020 to October 2020
  • Initial site visits: December 2020 to March 2021
  • Formative RCE with ongoing survey responses: July 2021 to December 2022
  • Formative evaluation feedback survey: January 2022 and March 2023

Reporting
  • Brief #1: August 2023
  • Brief #2: August 2023
  • Brief #3: August 2023
  • Final report: August 2023

Note: Timeline is dependent on the timing of OMB approval of this information collection request.


A17. Exceptions

No exceptions are necessary for this information collection.

Attachments

Instrument 1: Program Confirmation Protocol

Instrument 2: Site Visit Topic Guide

Instrument 3: Participant Focus Group Protocol

Instrument 4: Program Staff Survey

Instrument 5: Formative Evaluation Feedback Survey


References

Bureau of Labor Statistics. “Labor Force Statistics from the Current Population Survey.” Washington, DC: Bureau of Labor Statistics. Available at: https://www.bls.gov/cps/earnings.htm. Accessed February 20, 2020.

Derr, Michelle, Ann Person, and Jonathan McCay. “Learn, Innovate, Improve (LI²): Enhancing Programs and Improving Lives (Practice Brief).” Washington, DC: Mathematica Policy Research, 2017. Available at: https://www.mathematica.org/our-publications-and-findings/publications/learn-innovate-improve-li2-enhancing-programs-and-improving-lives. Accessed March 10, 2020.

Sama-Miller, Emily, Christine Ross, Teresa Eckrich Sommer, Scott Baumgartner, Lily Roberts, and P. Lindsay Chase-Lansdale. “Exploration of Integrated Approaches to Supporting Child Development and Improving Family Economic Security.” OPRE Report #2017-84. Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, 2017. Available at: acf.hhs.gov/sites/default/files/opre/two_gen_final_report_final_clean_b508.pdf.

U.S. Department of Labor. “Minimum Wage.” Washington, DC: U.S. Department of Labor. Available at: https://www.dol.gov/general/topic/wages/minimumwage. Accessed February 20, 2020.

1 Examples of sensitive topics include (but are not limited to): social security number; sex behavior and attitudes; illegal, anti-social, self-incriminating and demeaning behavior; critical appraisals of other individuals with whom respondents have close relationships, e.g., family, pupil-teacher, employee-supervisor; mental and psychological problems potentially embarrassing to respondents; religion and indicators of religion; community activities which indicate political affiliation and attitudes; legally recognized privileged and analogous relationships, such as those of lawyers, physicians and ministers; records describing how an individual exercises rights guaranteed by the First Amendment; receipt of economic assistance from the government (e.g., unemployment or WIC or SNAP); immigration/citizenship status.
