
TechHire and Strengthening Working Families Initiative Grant Programs Evaluation: 18-Month Follow-Up Survey

OMB: 1290-0027


Evaluation of Strategies Used in the TechHire and Strengthening Working Families Initiative Grant Programs

ICR REF 201905-1290-001

May 2019



OMB SUPPORTING STATEMENT PRA PART A


The U.S. Department of Labor’s Chief Evaluation Office (CEO) is undertaking the Evaluation of Strategies Used in the TechHire and Strengthening Working Families Initiative (SWFI) Grant Programs. The evaluation includes both implementation and impact components. Its purpose is to identify whether the grants help low-wage workers obtain and advance in employment in H-1B industries and occupations and, if so, which strategies are most helpful.


This supporting statement, for an 18-month participant follow-up survey, is the third in a series of OMB submissions that correspond to an array of data collection activities for the evaluation. In January 2018, OMB approved the baseline data collection for this evaluation (OMB 1290-0014), which includes a baseline information form (BIF), a 6-month follow-up participant survey, a participant tracking form, and a first round of site visit interviews. In April 2019, OMB approved the survey of grantees, semi-structured telephone interviews with grantees, a template for partner contact information, a survey of partners, semi-structured telephone interviews with partners, and a second round of site visit interviews (OMB 1290-0021).


CEO is now requesting OMB approval of the last instrument so that the evaluation can continue on schedule. The 18-month participant follow-up survey will provide critical information on job training and employment outcomes for both treatment and control group members.

A.1 Circumstances Necessitating the Information Collection


A user fee paid by employers to bring foreign workers into the United States under the H-1B nonimmigrant visa program provides funding for the TechHire and SWFI grants, as authorized by Section 414(c) of the American Competitiveness and Workforce Improvement Act of 1998 (ACWIA), as amended (codified at 29 USC 3224a). In September 2016, the Employment and Training Administration (ETA) competitively awarded 39 TechHire grants and 14 SWFI grants. These programs attempt to help U.S. residents access middle- and high-skill, high-growth jobs in H-1B industries. Broadly, the goals of TechHire and SWFI are to identify innovative training strategies and best practices for populations that face barriers to participating in skills training. Both programs emphasize demand-driven training strategies, including employer involvement in training, use of labor market data, work-based learning, and sectoral partnerships, among other priorities. A key goal of both programs is to bring the training system into better alignment with employer needs. This evaluation seeks to build knowledge about the implementation and effectiveness of the approaches used under these grant programs.

CEO undertakes a learning agenda process each year to identify Departmental priorities for program evaluations; this evaluation was prioritized as part of that process in FY 2016. Division H, Title I, Section 107 of Public Law 114-113, the “Consolidated Appropriations Act, 2016,” authorizes the Secretary of Labor to reserve not more than 0.75 percent from special budget accounts for transfer to and use by the Department’s Chief Evaluation Office for departmental program evaluation. Further, 29 USC 3224a(1) authorizes the Secretary of Labor to conduct ongoing evaluation of programs and activities to improve their management and effectiveness.


Overview of Evaluation

The evaluation research questions can be topically summarized as follows:


Grantee Program Descriptions:

  • What are the types and combinations of programs, approaches, or services provided under the TechHire and SWFI grant programs?

  • What are the characteristics of the target populations served?

  • How are services for the target population implemented?

  • What are the issues and challenges associated with implementing and operating the programs, approaches, and/or services studied?

Implementation Procedures and Issues:

  • How were the programs implemented?

  • What factors influenced implementation?

  • What challenges did programs face in implementation and how were those challenges overcome?

  • What implementation practices appear promising for replication?

Partnerships and Systems:

  • How were systems and partnerships built and maintained?

  • What factors influenced the development and maintenance of the systems and partnerships over the lifecycle of the grant?

  • What challenges did programs face in partnership and systems building and how were those challenges overcome?

  • How did partnership development and maintenance strategies evolve over the lifecycle of the grant?

Outputs and Outcomes and Effective Strategies for Overcoming Barriers:

  • How and to what extent did the customized supportive services and education/training tracks expand participant access to targeted employment, improve program completion rates, connect participants to employment opportunities, and promote innovative and sustainable program designs?

  • What strategies and approaches were implemented and/or appear promising for addressing systematic barriers individuals may face in accessing or completing training and education programs and gaining employment in H-1B industries?

Removal of Barriers and Coordination at the Systems Level:

  • How and to what extent did the programs both remove childcare barriers and address the individual job training needs of participants?

  • What were the changes in the coordination of program-level supports (training and support services) as well as the leveraging, connecting, and integrating at the systems level?

  • What was the program’s reach among, and interaction with, parents who receive other federal program supports?

To address each of the five research areas, the evaluation includes both implementation and impact components. The implementation study covers all 53 TechHire and SWFI grantees and serves several purposes: providing a thorough description of all of the TechHire and SWFI programs; documenting implementation barriers and facilitators; describing partnerships and systems change; and providing descriptive data on program outputs and outcomes. The impact study includes both a randomized controlled trial (RCT) study and a quasi-experimental design (QED) study. The RCT study includes 5 grantees, whereas the QED study includes all 53 grantees.

Overview of Data Collection

To address the research questions listed above, the evaluation will include the following data collection activities:

  1. Baseline Information Form (BIF) (clearance already obtained)

  2. 6-month follow-up survey (clearance already obtained)

  3. Round 1 site visit interviews with grantee staff (clearance already obtained)

  4. Round 1 site visit interviews with grantee partners (clearance already obtained)

  5. Participant tracking form (clearance already obtained)

  6. Grantee Survey (clearance already obtained)

  7. Semi-structured telephone interviews with grantees (clearance already obtained)

  8. Partner information template (clearance already obtained)

  9. Partner Survey (clearance already obtained)

  10. Semi-structured telephone interviews with partners (clearance already obtained)

  11. Round 2 site visit interviews (clearance already obtained)

  12. 18-month participant follow-up survey (clearance requested in this package)


With the submission of this justification, CEO requests clearance for the twelfth data collection component listed above (i.e., 18-month participant follow-up survey).

A.2 Purpose and Use of the Information


For the RCT study, the main source of data on employment and earnings outcomes is the National Directory of New Hires (NDNH), maintained by the Office of Child Support Enforcement (OCSE). The NDNH is a national database of quarterly wage and employment information collected from state unemployment insurance (UI) records. Available information includes whether a participant was employed in the quarter, the number of employers, earnings, and the industry of employment. The 18-month participant follow-up survey supplements the NDNH with critical information on hours worked, hourly wage, and job quality that is key to understanding career trajectories. In addition, the 18-month participant follow-up survey provides information on completion of training, educational progress, occupational credentials, and other economic and non-economic outcomes of interest not available in administrative records. By 18 months, many of the program participants will have completed the shorter-term training programs and entered employment. However, because some of the TechHire and SWFI grantees are using longer-term training along a career pathway, which can result in associate’s and bachelor’s degrees, the 18-month participant follow-up survey will also provide information about educational progress for these participants. The survey will also provide information on program services, household composition and income, health status, and criminal justice involvement.



A.3 Use of Information Technology


The evaluation team will conduct the 18-month participant follow-up survey using a sequential multi-mode approach that begins with the most cost-effective mode. Participants will first be invited to complete a web survey. Web surveys are low-cost and less burdensome: they offer easy access and submission and allow participants to complete the survey at a convenient time and at their own pace. A web survey has the additional advantage of reducing the potential for errors by checking for logical consistency across answers, accepting only valid responses, and enforcing automated skip patterns. Participants will be provided the URL and a unique PIN to access the survey. To increase response rates, automated weekly email reminders and text messages will be sent to all non-respondents. The evaluation team does not expect to achieve the desired response rate via the web alone. After three reminders have been sent, interviewers will contact all remaining non-respondents and invite them to complete the survey by telephone. If participants cannot be reached by telephone, interviewers will search for them using batch tracing and individualized tracing. As a last resort, field locators will attempt to locate non-respondents to complete the survey.
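To make the sequencing concrete, the sketch below encodes the escalation order described above (web invitation and reminders, then telephone, then tracing, then field locating). It is a minimal illustration only; the function and field names are hypothetical and do not describe the evaluation contractor’s actual survey management system.

```python
# Hypothetical sketch of the sequential multi-mode escalation described in
# Section A.3. All names (next_contact_step, case fields) are illustrative.

REMINDER_LIMIT = 3  # three automated web reminders before telephone follow-up

def next_contact_step(case):
    """Return the next outreach step for one sample member."""
    if case["completed"]:
        return "done"
    if case["reminders_sent"] < REMINDER_LIMIT:
        return "send_web_reminder"        # weekly email/text with URL and PIN
    if not case["phone_attempted"]:
        return "telephone_interview"      # interviewer-administered follow-up
    if not case["traced"]:
        return "batch_or_individual_tracing"
    return "field_locator"                # last resort

# Example: a participant who ignored all three reminders moves to telephone.
case = {"completed": False, "reminders_sent": 3,
        "phone_attempted": False, "traced": False}
print(next_contact_step(case))  # -> "telephone_interview"
```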


A.4 Identification of Duplication of Information Collection Efforts


There is no duplication with other data collection efforts. The survey asks questions about job characteristics that are not available in administrative data. In addition, the 18-month participant follow-up survey is the only source of information about training completion and credential attainment of control group members, which is necessary for interpreting the impact findings.

A.5 Impacts on Small Businesses or Other Small Entities


The data collection (a survey of program participants) does not involve small businesses or other small entities and will have no adverse impact on them.

A.6 Consequence to Federal Program or Policy if Collection is not Conducted


The evaluation will contribute to the body of literature about strategies to help low-wage workers obtain and advance in employment. Moreover, since DOL is funding other H-1B skills training programs, it is important to have rigorous information about the impact of the programs. If DOL does not collect the information, the evaluation cannot determine the effectiveness of the programs and program strategies. More specifically, the evaluation cannot determine the differences in services and credential receipt between the treatment and control groups that underlie any earnings impacts.


A.7 Special Data Collection Circumstances


There are no special circumstances for the proposed data collection. This request fully complies with 5 CFR 1320.5.


A.8 Federal Register Notice

A.8.1 Federal Register Notice and Comments


In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget regulations at 5 CFR Part 1320, DOL published a notice in the Federal Register on March 6, 2019, Volume 84, Number 44, pages 8119-8120 (84 FR 8119). DOL did not receive any public comments.


A.8.2 Consultations Outside the Agency

The following people were consulted in developing the study design.


Technical Working Group

  • Kevin M. Hollenbeck, Ph.D., Vice President, Senior Economist, W.E. Upjohn Institute

  • Jeffrey Smith, Professor of Economics, Professor of Public Policy, University of Michigan

  • Gina Adams, Senior Fellow, Center on Labor, Human Services, and Population at The Urban Institute

  • David S. Berman, MPA, MPH, Director of Programs and Evaluation for the NYC Center for Economic Opportunity, in the Office of the Mayor

  • Mindy Feldbaum, Principal at the Collaboratory


A.9 Payments/Gifts to Respondents


It is critical to maximize the cooperation of study participants with the 18-month follow-up survey data collection effort, thereby increasing the response rate and reducing potential nonresponse bias. Incentive payments are commonly used to lower attrition rates in longitudinal studies. Participants who complete the 18-month follow-up survey will receive $50 if they complete the survey on the web within the first four weeks and $40 if they complete it after that time on the web or by telephone. Incentives will be provided in the form of a Visa gift card. The use of incentives is supported by a number of studies. As has been documented elsewhere,1 it is increasingly difficult to achieve high response rates in surveys. In some instances, incentives have been found to be cost neutral because the price of the incentive is offset by the reduction in field time and contact attempts needed to garner participation.2 Incentives are a reliable way to increase the overall quality of a survey by maximizing the response rate and increasing the efficiency of survey operations. Several studies have found that incentives are particularly effective for minority and low-income groups as well as unemployed individuals.3 The use of an incentive for the 18-month survey is expected to reduce nonresponse bias for the difficult-to-follow sample in the study. Low response rates increase the danger of differential response rates between the program and control groups, leading to non-comparability between the two groups and potentially biased impact estimates.


The choice of an early response incentive is guided by the desire to encourage participants to complete the survey by the most cost-effective method possible, thereby increasing the efficiency of data collection. The literature suggests that early bird incentives can be effective at increasing early response and may even be more effective than prepaid incentives.4 One study found that an early bird incentive of $100 increased the response rate in the cutoff period from 20 percent to 29 percent compared with a $50 incentive.5 Similarly, DOL’s YouthBuild Evaluation included an early response incentive experiment and found that those offered a $40 incentive had 38 percent higher odds of completing their survey within the first four weeks than those offered a $25 incentive. An early response incentive has the potential to reduce overall data collection costs by shortening the data collection period and driving more responses into the more cost-effective web mode. An early response incentive was also approved by OMB for the 6-month follow-up survey; early results suggest that the response rate to the web mode in the first four weeks is 50 percent.


The 18-month participant follow-up survey incentives of $50 and $40 for early and late response, respectively, represent increases over the incentive amounts for the 6-month follow-up survey. For the 6-month follow-up survey, the early and late response incentives are $30 and $20, respectively. There are several reasons why the requested 18-month survey incentive amounts are higher:


  1. Respondent burden is an important consideration in determining incentive amounts. The 18-month follow-up survey takes more time to complete than the 6-month follow-up survey, and the incentive amounts requested for the 18-month follow-up survey are in line with amounts offered for surveys of similar length.

  2. As is well documented in other surveys, it becomes increasingly difficult to locate individuals as more time passes since the baseline data collection. The higher incentive amount should help to maintain response rates and reduce attrition, which will improve the precision of the impact estimates.

  3. Although the study is striving for 80-percent response rates to both follow-up surveys, the current cumulative response rate for the 6-month follow-up survey for sample members released in January 2019 is 66 percent as of March 11th, which is lower than expected. In addition, there is a small difference in response rates between the treatment and control groups, with a 70 percent response rate for the treatment group and a 62 percent response rate for the control group. Because the survey sample released has yet to be fully worked, it is too early to tell what the final response rate will be. Nevertheless, the lower-than-expected response rates to the 6-month follow-up survey so far suggest that a higher incentive amount for the 18-month follow-up survey is warranted to increase the likelihood that the survey can achieve the target 80-percent response rate.


There has been extensive research on the relative effectiveness of different incentive amounts. The research indicates that higher incentive amounts increase response rates and reduce panel attrition, although the relationship between incentive amounts and response rates is not linear.6 The marginal increase in response rate decreases as incentive amounts increase. An experiment conducted for DOL’s National Evaluation of the Trade Adjustment Assistance Program found that incentives of $50 and $75 resulted in significantly higher response rates than an incentive of $25; however, the $50 incentive was more cost-effective than the $75 incentive. We believe that the early and late incentives of $50 and $40, respectively, strike a good balance between encouraging cooperation and using project resources efficiently. The incentive amounts for which we seek approval are in line with other recent federal evaluations of programs serving similar populations, which have offered incentive amounts between $40 and $50. For example, the Green Jobs and Healthcare Impact Evaluation, conducted by DOL, offered a $45 incentive for its follow-up surveys. The Health Professions Opportunity Grant (HPOG) Impact Evaluation, conducted by the U.S. Department of Health and Human Services (HHS), Administration for Children and Families (ACF), offered a $40 incentive for its follow-up surveys.

Additionally, we seek approval to offer $5 with the advance letter for the 18-month survey. Considerable literature documents the effectiveness of prepaid incentives, especially in telephone surveys.7 Cantor, O’Hare, and O’Connor’s (2008) review of incentive experiments in telephone surveys found consistently significant effects for prepaid incentives between $1 and $5, with response rate increases of 2.2 to 12.1 percentage points.8 Most directly relevant to the current data collection, there is evidence that a prepaid incentive can increase response rates in sequential multimode designs, similar to the one used for this study, with hard-to-locate populations.9 An experiment conducted by DOL examined the use of a prepaid incentive in a survey of dislocated workers. The survey included web and telephone modes, with an early response incentive for completing the survey on the web. The control group was offered a $40 incentive to complete the survey on the web and a $30 incentive to complete it by telephone; the treatment group was offered $35 to complete it on the web and $25 to complete it by telephone, plus a $5 prepaid incentive. The $5 prepaid incentive significantly increased the response rate and reduced more costly locating efforts, making it cost-effective. The increase in response rate was largely due to increases in the response rate to the web mode. Given this evidence, we believe that offering a $5 prepaid incentive in addition to the early response incentive will increase the likelihood of reaching the 80 percent response rate target for the 18-month survey while maintaining the cost efficiency of the data collection.

Finally, we request permission to offer differential incentives to respondents at the end of the field period. As is common in RCT studies, differences in response rates by study group (treatment and control) can emerge during the course of data collection, and such differences can lead to biased impact estimates. It is critical to achieve balanced response rates across treatment and control groups in each site. The evaluation contractor will continuously monitor the response rates for the treatment and control groups within each site throughout the field period. We seek permission to increase the incentive amount on a strategic or differential basis to ensure high response rates in both the treatment and control groups in each site. The field period for the data collection includes four weeks of web data collection followed by four weeks of telephone follow-up. We propose to increase the incentive to $75 for participants who have not cooperated with the data collection effort by the last week of the telephone period. At that point, telephone interviewers will mention the $75 incentive amount when introducing the survey and when leaving an answering machine message. Participants who have not responded may face greater barriers to participation in the survey, justifying a larger incentive for these respondents. Differential incentives have a long history of use in evaluations, including the Moving to Opportunity (MTO) for Fair Housing demonstration.10 The use of differential incentives will increase the overall response rate and encourage participation in the control group, thereby reducing potential bias in the impact estimates. Differential incentives will also be used to assess nonresponse bias, on the assumption that less cooperative participants resemble those who never respond at all: we will be able to assess whether those who are motivated only by the larger incentive differ significantly from those who cooperate early.
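To illustrate the monitoring that would trigger the differential incentive, the sketch below computes treatment and control response rates within a site and flags a gap worth attention. It is a hypothetical illustration only: the function names, data layout, and the five-percentage-point threshold are assumptions for the sketch, not figures from this document or the contractor’s actual monitoring system.

```python
# Hypothetical sketch of per-site response-rate monitoring for an RCT survey.

def response_rates_by_group(cases):
    """Compute response rates for treatment and control within one site."""
    rates = {}
    for group in ("treatment", "control"):
        members = [c for c in cases if c["group"] == group]
        rates[group] = sum(c["responded"] for c in members) / len(members)
    return rates

def flag_for_differential_incentive(cases, gap_threshold=0.05):
    """Flag a site whose treatment/control gap exceeds the (illustrative)
    threshold, signaling that a higher incentive may be warranted."""
    rates = response_rates_by_group(cases)
    return abs(rates["treatment"] - rates["control"]) > gap_threshold

# Example using the 6-month survey's interim rates (70% vs. 62%).
cases = ([{"group": "treatment", "responded": i < 70} for i in range(100)] +
         [{"group": "control", "responded": i < 62} for i in range(100)])
print(response_rates_by_group(cases))         # {'treatment': 0.7, 'control': 0.62}
print(flag_for_differential_incentive(cases)) # True
```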


Research has documented that the use of differential incentives can increase response rates and reduce nonresponse bias. In a nonresponse study conducted as part of the Health and Retirement Study (HRS), offering a sample of initial refusers $100 per individual (or $200 per couple) to participate brought into the sample a group of people distinctly different from participants who were not offered the differential incentive.11 Similarly, Singer, Van Hoewyk, and Maher (2000) found that a $5 incentive paid to a random half of households for which an address could be located increased the number of low-education respondents in the sample.12



A.10 Assurance of Privacy


Information collected will be kept private to the extent permitted by law. Westat and MDRC are very cognizant of federal, state, and DOL data security requirements. All Westat and MDRC study staff will comply with relevant policies related to secure data collection, data storage and access, and data dissemination and analysis. All staff working with PII will sign data security agreements. The evaluation team will take the following precautions to ensure the privacy and anonymity of all data collected:

  • All project staff, including research analysts and systems analysts, will be instructed in the privacy requirements of the data and will be required to sign statements affirming their obligation to maintain privacy;

  • Only evaluation team members who are authorized to work on the project will have access to respondent contact information, completed survey instruments, and data files;

  • Data files that are delivered will contain no personal identifiers for respondents;

  • All data will be transferred via a secure file transfer protocol (FTP); and

  • Analysis and publication of project findings will be in terms of aggregated statistics only.


All respondents will be informed that the information collected will be reported in aggregate form only and that no reported information will identify any individual. In addition, the study has obtained a Confidentiality Certificate from the U.S. government, meaning that the study team cannot be required to identify individual participants, even under court order or subpoena. This information will be conveyed to participants in the Informed Consent Form.


Access to the online surveys will require a unique PIN provided to the respondent. Survey data collection will use Secure Sockets Layer (SSL) encryption technology to ensure that information is secure and protected.


Evaluation team members working with the collected data will have previously undergone background checks that may include filling out an SF-85 or SF-85P form, authorizing credit checks, or being fingerprinted.


A.11 Justification of Questions of a Sensitive Nature


The survey asks questions about respondents’ contact with the criminal justice system. Because grantees serve individuals who have criminal records, these questions are important for comprehensively evaluating the impact of the programs. Respondents will be reminded that all of their answers will be kept private. In addition, as discussed above, the study has obtained a certificate of confidentiality.


A.12 Estimate of Annualized Burden Hours and Costs


Table A.12 presents the estimated respondent hour and cost burden. Burden estimates are annualized over a three-year period and are based on the contractor’s experience conducting similar data collections; cost estimates use the hourly wage rates described in the note to Table A.12.


The evaluation team estimates an 80 percent response rate to the 18-month participant follow-up survey, which equates to 0.80 x 568 = 454 annual respondents. Completing the 18-month follow-up survey will take approximately 30 minutes, so the annual burden is 454 x 30/60 = 227 hours.


The cost represents the average hourly wage rate for the respondent multiplied by the corresponding number of hours, as shown in Table A.12. The annual cost to respondents for this data collection is $2,111.


Table A.12 Estimated Respondent Hour and Cost Burden

| Instrument | Number of Respondents | Number of Responses per Respondent | Total Number of Responses | Avg. Burden per Response (in Hrs.) | Total Hour Burden (Rounded) | Average Wage Rate (a) | Total Cost Burden |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 18-month follow-up participant survey | 454 | 1 | 454 | 30/60 | 227 | $9.30 | $2,111 |
| Total | 454 | | 454 | | 227 | | $2,111 |

(a) The hourly wage rate for participants is the weighted average of the 2018 minimum wage for each state where the RCT sites are located. Source: http://www.ncsl.org/research/labor-and-employment/state-minimum-wage-chart.aspx
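The burden and cost figures above follow directly from the stated inputs; a small worked sketch reproducing the arithmetic (values taken from Section A.12 and Table A.12) is shown below.

```python
# Worked check of the Section A.12 burden arithmetic (values from the text).
sample_size = 568          # annual sample members fielded
response_rate = 0.80       # target response rate
minutes_per_response = 30  # approximate survey length
wage_rate = 9.30           # weighted average 2018 state minimum wage (Table A.12)

respondents = round(sample_size * response_rate)        # 0.80 x 568 = 454
burden_hours = respondents * minutes_per_response / 60  # 454 x 30/60 = 227.0
cost_burden = burden_hours * wage_rate                  # 227 x $9.30 = $2,111.10

print(respondents, burden_hours, round(cost_burden))    # 454 227.0 2111
```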




A.13 Estimates of Annualized Respondent Capital and Maintenance Costs


There are no capital/start-up or ongoing operation/maintenance costs associated with this information collection.


A.14 Estimates of Annualized Cost to the Government

The total cost to conduct the information collection in this request is $559,298. The annualized cost is $559,298 / 3 ≈ $186,433.

The estimated cost to the federal government for the contractor to carry out this study, based on a detailed budget of contractor labor and other costs, is $500,000 for survey development, data collection, and analysis.

In addition, DOL expects the annual level of effort for federal government technical staff to oversee the contract will require 200 hours for one Washington, D.C.-based GS-14, Step 4 employee earning $61.77 per hour.13 To account for fringe benefits and other overhead costs, the agency applies a multiplication factor of 1.6, so the annual cost is $19,766 ($61.77 x 1.6 x 200 = $19,766). The data collection period covered by this justification is three years, so the estimated total oversight cost is $59,298 ($19,766 x 3 = $59,298). The total cost is $500,000 + $59,298 = $559,298.
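The cost arithmetic above can be reproduced directly; the sketch below uses only the figures stated in this section.

```python
# Worked check of the Section A.14 cost arithmetic (values from the text).
contractor_cost = 500_000  # survey development, data collection, and analysis
hourly_rate = 61.77        # GS-14, Step 4, Washington D.C. hourly rate
overhead = 1.6             # fringe benefits and overhead multiplier
hours_per_year = 200       # annual federal oversight effort
years = 3                  # data collection period

annual_oversight = round(hourly_rate * overhead * hours_per_year)  # $19,766
total_oversight = annual_oversight * years                         # $59,298
total_cost = contractor_cost + total_oversight                     # $559,298
annualized_cost = total_cost / years                               # ~$186,433

print(annual_oversight, total_oversight, total_cost, round(annualized_cost))
```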

A.15 Changes in Hour Burden


This is a new data collection.


A.16 Plans for Tabulation and Publication

The 18-month follow-up survey will support the final impact report, which will be submitted to DOL in 2021. The report will document the effects of participation on employment and earnings using the NDNH data, and on employment, wages, hours worked, and non-economic outcomes using the 18-month survey.

A.17 Approval to Not Display the Expiration Date


The expiration date for OMB approval will be displayed.



A.18 Exceptions to the Certification Statement


There are no exceptions to the Certification for Paperwork Reduction Act (5 CFR 1320.9) for this study.

1 For example, see Brick, J. M., & Williams, D. (2013). Explaining rising nonresponse rates in cross-sectional surveys. The ANNALS of the American Academy of Political and Social Science, 645(1), 36-59; and Curtin, R., Presser, S., & Singer, E. (2005). Changes in telephone survey nonresponse over the past quarter century. Public Opinion Quarterly, 69(1), 87-98.

2 Research Triangle Institute Evidence-based Practice Center, North Carolina Central University (Durham, North Carolina), & West, S. (2002). Systems to rate the strength of scientific evidence (pp. 51-63). Agency for Healthcare Research and Quality (AHRQ).

3 Singer, E. (2002). The use of incentives to reduce nonresponse in household surveys. Survey Nonresponse, 51, 163-177; and Jäckle, A., & Lynn, P. (2007). Respondent incentives in a multimode panel survey: Cumulative effects on nonresponse and bias (No. 32). Colchester: University of Essex.

4 LeClere, F., Plumme, S., Vanicek, J., Amaya, A., & Carris, K. (2012). Household early bird incentives: Leveraging family influence to improve household response rates. In American Statistical Association Joint Statistical Meetings, Section on Survey Research Methods.

5 Coopersmith, J., Vogel, L. K., Bruursema, T., & Feeney, K. (2016). Effects of incentive amount and type on web survey response rates. Survey Practice, 9(1).

6 Mercer, A., Caporaso, A., Cantor, D., & Townsend, R. (2015). How much gets you how much? Monetary incentives and response rates in household surveys. Public Opinion Quarterly, 79(1), 105-129.

7 Singer, E., Van Hoewyk, J., & Maher, M. P. (2000). Experiments with incentives in telephone surveys. Public Opinion Quarterly, 64, 171-188; and Singer, E., Groves, R. M., & Corning, A. D. (1999). Differential incentives: Beliefs about practices, perceptions of equity, and effects on survey participation. Public Opinion Quarterly, 63, 251-260.

8 Cantor, D., O’Hare, B., & O’Connor, K. (2008). The use of monetary incentives to reduce non-response in random digit dial telephone surveys. In J. M. Lepkowski, C. Tucker, J. M. Brick, E. de Leeuw, L. Japec, P. J. Lavrakas, M. W. Link, & R. L. Sangster (Eds.), Advances in Telephone Survey Methodology (pp. 471-498). New York: Wiley.

9 Hock, H., Anand, P., & Mendenko, L., with DiGiuseppe, R., & McInerney, R. (2015, May). The effectiveness of prepaid incentives in a mixed-mode survey. Presentation at the 70th Annual Conference of the American Association for Public Opinion Research, Hollywood, FL. http://www.aapor.org/AAPOR_Main/media/AnnualMeetingProceedings/2015/G2-3-Mendenko.pdf

10 Gebler, N., Gennetian, L. A., Hudson, M. L., Ward, B., & Sciandra, M. (2012). Achieving MTO’s high effective response rates: Strategies and tradeoffs. Cityscape, 14(2), 57-86.

11 Juster, F. T., & Suzman, R. (1995). An overview of the Health and Retirement Study. The Journal of Human Resources, 30 (Supplement), 49-.

12 Singer, E., Van Hoewyk, J., & Maher, M. P. (1998). Does the payment of incentives create expectation effects? Public Opinion Quarterly, 62, 152-164.

13 See Office of Personnel Management 2018 Hourly Salary Table: https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2018/DCB_h.pdf


