
Evaluation of Essentials for Parenting Toddlers and Preschoolers





SUPPORTING STATEMENT: PART A


OMB No. 0920-XXXX




Submitted by:



Department of Health and Human Services

Centers for Disease Control and Prevention

National Center for Injury Prevention and Control

Division of Violence Prevention


Government Project Officers:

Beverly Fortson, Ph.D., Science Officer

[email protected] 770-488-1284

Colby Lokey, M.S., Project Officer

[email protected] 770-488-3785




October 21, 2015




Table of Contents



x. SUMMARY TABLE

A. JUSTIFICATION

1. Circumstances Making the Collection of Information Necessary

2. Purpose and Use of Information Collection

3. Use of Information Technology and Burden Reduction

4. Efforts to Identify Duplication and Use of Similar Information

5. Impact on Small Business or other Small Entities

6. Consequences of Collecting Information Less Frequently

7. Special Circumstances Relating to the Guidelines of 5CFR 1320.5

8. Comments in Response to Federal Register Notice and Efforts to Consult Outside the Agency

9. Explanation of Any Payments or Gifts to Respondents

10. Assurance of Confidentiality Provided to Respondents

11. Justification for Sensitive Questions

12. Estimates of Annualized Burden Hours and Costs

13. Estimates of Other Total Annual Cost Burden to Respondents and Record-keepers

14. Annualized Cost to the Federal Government

15. Explanation for Program Changes

16. Plans for Tabulation, Publication, and Project Time Schedule

17. Reason(s) Display of OMB Expiration Date is Inappropriate

18. Exemptions to Certification for Paperwork Reduction Act Submissions




LIST OF ATTACHMENTS


Attachment A Authorizing Legislation

Attachment B 60-Day Federal Register Notice

Attachment C Public Comment Received on 60-Day Federal Register Notice

Attachment D Westat Institutional Review Board (IRB) Approval Memo

Attachment E Participant Phone Consent Script

Attachment F Post-Screening Introductory Email for Participants

Attachment G Post-Enrollment Follow-Up Email for Participants

Attachment H Sample Facebook Advertisements and Sample Text for Publishers

Attachment I1 Screening and Demographics Questionnaires

Attachment I2 Detailed Assessment Measures

Attachment I3 Core Assessment Measures (Rotating)

Attachment I4 Parental EFP Skills Knowledge Scale

Attachment I5 Parental EFP Skills Usefulness Scale

Attachment I6 Therapy Attitude Inventory and System Usability Scale

Attachment J Screenshots of All Web-Based Survey Measures

Attachment K Graphical Depiction of the Assessment Timeline for both the Guided Navigation (GN) and Natural Navigation (NN) Groups




SUMMARY TABLE


  • Goals of the study

The purpose of this project is to conduct an evaluation of the Essentials for Parenting Toddlers and Preschoolers web-based resource. The main goal is to examine changes in parent and child behaviors. Several subgoals will also be addressed: 1) determine whether parents’ outcomes are better when they are guided through the website than when they are not, 2) determine whether additional details are needed for any of the content areas, 3) determine whether parents find the information useful and applicable to everyday parenting challenges, and 4) determine whether changes in parent and child behaviors are consistent with those observed in the behavioral parent training literature.



  • Intended use of the results

The information and data gathered from this study will be used to revise the Essentials for Parenting Toddlers and Preschoolers web-based resource based on the outcomes observed among participants after exposure to each content area.



  • Methods to be used to collect data

We will conduct a single-subject, multiple baseline study of 200 parents of 2- to 4-year-old children. With this design, participants serve as their own controls. The design is comparable to experimental designs in that it allows for causal inference. The multiple baseline design demonstrates the effect of an intervention by showing that behavior change accompanies introduction of the intervention at different points in time.



  • How data will be analyzed 

A multilevel time series analysis will be used to analyze the data. A classic multiple baseline graphical approach will also be used for visualization and tracking of the effects of exposure to content.



A. Justification

1. Circumstances Making the Collection of Information Necessary

The Centers for Disease Control and Prevention (CDC) is seeking OMB approval to conduct a new information collection for a study entitled “Evaluation of Essentials for Parenting Toddlers and Preschoolers” over a period of two years (2015-2017). It is estimated that 1 in 58 U.S. children were abused or neglected (i.e., were victims of physical, sexual, or emotional abuse or neglect) in a 1-year period. Millions of other American children are exposed to abuse and neglect that does not meet thresholds for clinical significance but is nonetheless detrimental to child health. Parent training is arguably the single most effective child abuse and neglect prevention initiative developed to date. Multiple benefits have been found for a large number of parent training programs, such as Incredible Years, Parent-Child Interaction Therapy, Triple P, and Helping the Noncompliant Child, all of which emphasize very similar parenting skills (Webster-Stratton & Taylor, 2001).

Although parent training has potentially far-reaching public health impacts, empirically supported parent training is not widely available. Several factors may be at play. First, these programs are privately owned and are offered at considerable expense to public health agencies, communities, practitioners, and clients. Second, for American parents to engage in an empirically supported parent training program, they must find a credentialed service provider, select an empirically supported bibliotherapy or online resource from among the plethora of parenting books and Internet sites, or become a participant in a research project evaluating a parenting program. Third, these programs require individual, face-to-face, repetitive contact with parents, making them costly to disseminate on a wide scale. For many families, such as those at high risk for child abuse and neglect owing to socioeconomic conditions, individual, face-to-face parenting sessions are difficult to attend because of barriers such as an inability to get time off work, frequent housing moves, and a lack of child care when accessing services, among others. These barriers impede widespread dissemination. The public health challenge is how to make the content of these empirically supported parent training programs, which largely focus on the same parenting skills and approaches, accessible to the majority of American parents. Moreover, one of the goals of the Division of Violence Prevention (DVP) within CDC’s National Center for Injury Prevention and Control (NCIPC) is to generate an impact on child abuse and neglect at the population level; thus, it is critical to identify how best to promote positive parenting at a national level.


To meet this critical need, address barriers, and leverage the strength of empirically supported parent training as a broadly disseminated child abuse and neglect prevention tool, CDC has developed a universal prevention approach called Essentials for Parenting Toddlers and Preschoolers (EFP). This web-based resource, which was released to the public in May 2014, includes the typical content of empirically supported parent training programs and uses a psychoeducational approach that includes modeling (through its videos) and practice (through its activities). The five content areas are as follows: a) Communicating with Your Child; b) Creating Structure and Rules; c) Giving Directions; d) Using Discipline and Consequences; and e) Using Time-Out. Given its content, EFP is likely to improve parenting (e.g., discipline practices) and reduce child behavior problems, and it may ultimately reduce child abuse and neglect. Moreover, it is free for parents and can be disseminated at relatively low cost, as it can be accessed through any Internet-capable device, including computers, tablets, and smart phones. CDC has received positive feedback from federal and non-federal partners about the resource, and the development team has received several awards: two communicator awards for the videos (an Education Award of Excellence and a Social Responsibility Award of Excellence) and two CDC awards for Excellence in Domestic Program Delivery. In addition, EFP is featured as an “additional resource” for parents in a Sesame Street® Toolkit that encourages talking, singing, and reading to babies and young children.

Although EFP content is evidence-based, the web-based method of content delivery is without empirical support and will be examined with this project. This study uses a single-subject, multiple baseline design to examine effects on parent and child behaviors.


The proposed data collection fits into the National Center for Injury Prevention and Control Research Agenda Priorities in Preventing Child Maltreatment (http://www.cdc.gov/injury/ResearchAgenda/index.html) with regard to Tier 1 Parts C (“Evaluate the effectiveness of parenting-focused strategies for preventing child maltreatment and promoting safe, stable, nurturing relationships and environments”) and E (“Evaluate the dissemination and implementation of evidence-based strategies for preventing child maltreatment and promoting safe, stable, nurturing relationships and environments”).


Authority for CDC’s National Center for Injury Prevention and Control to collect these data is granted by Section 301 of the Public Health Service Act (42 U.S.C. 241) (Attachment A). This act gives federal health agencies, such as CDC, broad authority to collect data and conduct other public health activities, including studies of this type.


2. Purpose and Use of the Information Collection

The purpose of this data collection request is to determine whether Essentials for Parenting Toddlers and Preschoolers (EFP), a web-based platform for delivery of positive parenting information, yields changes in parent and child behaviors. If EFP is successful at increasing positive parenting and safe, stable, nurturing relationships and environments for children, then CDC has a resource that can be easily and freely disseminated to communities that can potentially impact rates of child abuse and neglect.


We will conduct a study of 200 parents of 2- to 4-year-old children. The main goal is to determine whether changes in parent and child behaviors result from exposure to the web-based content. Half of the parents (n = 100) will be guided in how and when they use specific intervention modules, using a single-subject, multiple baseline design (the Guided Navigation [GN] group). The other half (n = 100) will have access to the same EFP content but will use as much or as little of the intervention as they wish, on whatever timeline they wish (the Natural Navigation [NN] group). The NN group will simulate the type of “real” online experience parents have on the site as it is currently structured, and we want to determine whether that experience results in changes in parent and child behaviors. The NN group will help us determine whether parents need to access only the information most relevant to them for changes to be observed in parent and child behaviors, while the GN group will clarify whether receiving more of the content, or a specific ordering of it, is important for changes in behaviors. The overall effects from both groups will also be used to determine whether the observed changes are similar to changes observed when such programs have been implemented in one-on-one clinical settings. If parent and child behaviors do not change and the lack of effects can be pinpointed to specific content areas or modules, those modules can be revised to include additional details, activities, or other resources to enhance learning.


At the completion of this project, we hope to answer the following questions:

  1. Goal 1: What is the magnitude and direction of change in parenting thoughts (e.g., parental stress, parenting efficacy) and behaviors (e.g., use of praise, ignoring, redirection) and child externalizing behaviors (e.g., aggression, defiance) after exposure to the positive parenting skills (i.e., communicating with the child, creating structure, giving directions, using discipline and consequences, and using time-out)?

  2. Goal 1a: How do parents use the website? Do the outcomes differ for parents who are guided in their use of the website versus those who only access portions of the website they view as most applicable to their situation and needs?

  3. Goal 1b: Is the information provided on the website in sufficient detail for parents to implement the skills with their children or are additional details needed, as per parent reports?

  4. Goal 1c: How useful is the content and is it applicable to everyday parenting challenges, as per parent reports? Is the website easily navigable and usable?

  5. Goal 1d: Are the changes in parent and child behaviors in line with the changes observed in the behavioral parent training literature?


3. Use of Improved Information Technology and Burden Reduction

We will use web-based surveys to collect and process data to reduce respondent burden and make data processing and reporting more timely and efficient. In all data collections, the number of questions will be held to the absolute minimum required for the intended use of the data. All surveys will take place online using electronic survey forms. Screenshots of all questions to be administered electronically can be found in Attachment J.


4. Efforts to Identify Duplication and Use of Similar Information

No publicly available data on this topic exist and, as such, no other existing data may be used to assess the variables of interest in the current project.


Throughout the 6 years during which this resource was developed, project staff at CDC consulted with a wide range of individuals (both federal and non-federal partners) on the content, delivery mechanism, and other aspects of the resource. The non-federal partners who were consulted are included in the table in Section 8. The federal partners consulted during this process include individuals from various offices within the Department of Health and Human Services (HHS), including the Administration for Children and Families (ACF), the Assistant Secretary for Planning and Evaluation (ASPE), the Office of Planning, Research, and Evaluation (OPRE), the Administration on Intellectual and Developmental Disabilities (AIDD), and the Office of the Assistant Secretary for Public Affairs (ASPA), as well as other federal partners such as the Department of Education (DOE), the Health Resources and Services Administration (HRSA), and the Substance Abuse and Mental Health Services Administration (SAMHSA). The videos that comprise this resource were reviewed by the HHS ASPA office to ensure that they were not duplicative of other federal efforts, given a current focus on parenting; the videos were approved as being of high quality and not duplicative of other efforts. Our other federal partners have been consulted to provide information about the resource and determine its potential utility. We communicate frequently with each of these federal partners in various interagency parenting and early childhood workgroups.

As a result of our open communication with our federal and non-federal partners about this project and the resource in general, we are confident that no similar project or evaluation is currently being conducted that would make this work duplicative of existing efforts. At this point, the web resource has been promoted only among our federal and non-federal partners. As of March 2015, the website had received over 188,000 hits, which demonstrates that our partners are driving parents to the resource. Furthermore, we have received overwhelmingly positive feedback and support from our partners about the value of the resource in their work with parents.


5. Impact on Small Businesses or Other Small Entities

Small businesses are not a part of the respondent universe.


6. Consequences of Collecting the Information Less Frequently

Parents will complete repeated assessments of child externalizing behavior (e.g., refusal to follow rules, physical aggression), parenting behaviors (e.g., use of praise and time outs), parenting thoughts (e.g., perceived parenting competence and burden), and parent psychological adjustment (e.g., depression and anxiety), as well as knowledge and perceived usefulness of EFP intervention content. Parents will complete 18 weekly online assessments.


Less frequent data collection would not allow for appropriate measurement of parenting thoughts, parenting behaviors, and child externalizing behaviors, or of the changes that occur during the course of exposure to the web-based content. Furthermore, if this evaluation were not conducted, it would not be possible to determine the effects of the web-based platform on parent and child behaviors or its value and impact on the target audience. Failure to collect these data, or less frequent data collection, could preclude effective use of federal resources to benefit parents and children and prevent child abuse and neglect. In addition, collecting data less frequently would not be feasible given the study design: a single-subject, repeated measures, multiple baseline design, which is being implemented to maximize what can be determined about changes in parent and child behaviors after exposure to the Essentials for Parenting Toddlers and Preschoolers resource. This design was selected because it is stronger than a pre-/post-test design and it allows us to pinpoint specific strengths and weaknesses of the resource so that changes can be made as needed. With such designs, individuals serve as their own controls and comparisons are made between the different intervention time periods; replication across subjects is used to enhance control. Thus, there is no control group, but the quasi-experimental methodology employed with the design “may provide persuasive information that attests to the efficacy of treatment” (Kazdin, 1998, p. 449). The design is considered comparable to the more traditional group experimental design in that it allows for causal inference (Kazdin, 1998). Moreover, because individuals serve as their own controls, variation due to stable individual differences does not inflate the error term, making it easier to detect an intervention effect with a smaller sample size and/or effect size. A more detailed description of the advantages and limitations of single-subject multiple baseline designs is included in Attachment L.
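To make the analytic implications of this design concrete, the sketch below shows, under stated assumptions, how the weekly repeated measures could be analyzed with a multilevel (mixed-effects) model in which a phase indicator marks each parent’s staggered transition from baseline to intervention exposure. The simulated data, variable names (outcome, phase, week, parent_id), and model formulation are illustrative assumptions only, not the study’s final analytic specification of the multilevel time series analysis named in the summary table.

  import numpy as np
  import pandas as pd
  import statsmodels.formula.api as smf

  rng = np.random.default_rng(0)
  rows = []
  for parent in range(200):
      parent_level = rng.normal(0, 2)      # stable individual differences
      start = int(rng.integers(3, 8))      # intervention content introduced at a staggered week
      for week in range(1, 19):            # 18 weekly assessments
          phase = int(week >= start)       # 0 = baseline, 1 = after exposure begins
          outcome = 20 + parent_level - 3 * phase + rng.normal(0, 2)  # simulated weekly score
          rows.append({"parent_id": parent, "week": week, "phase": phase, "outcome": outcome})
  df = pd.DataFrame(rows)

  # Random intercept for each parent; the 'phase' coefficient estimates the average
  # within-parent change after the intervention content is introduced.
  model = smf.mixedlm("outcome ~ phase + week", data=df, groups=df["parent_id"])
  print(model.fit().summary())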


There are no legal obstacles to reduce the burden.


7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

This request fully complies with the regulation 5 CFR 1320.5.


8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

A. Federal Register

A 60-day Federal Register Notice was published in the Federal Register on Friday, January 23, 2015, Vol. 80, No. 15, pp. 3600-3601 (see Attachment B). One non-substantive public comment was received during this review time (Attachment C), and the standard CDC response was sent.


B. Efforts to consult with persons outside the agency

Formative consultation work was conducted to identify promising practices that may play a role in preventing child abuse and neglect.


Specifically, CDC staff consulted with the following individuals on the listed aspects of the resource and project:


Person | Agency/Affiliation | Dates of Involvement | Type of Consultation
Melissa Lim Brodowski, MSW, MPH | Office on Child Abuse and Neglect, Children’s Bureau, ACYF, ACF, HHS | 03/2010 – 04/2010; 04/2014 | Information only; overview of resource
Susan D. Kirby, Dr.P.H. | President, Kirby Marketing Solutions, Inc. | 09/2009 – 10/2010 | Audience segmentation; expert panel member
Shannon Self-Brown, Ph.D. | Associate Director, National SafeCare® Training and Research Center, Georgia State University | 09/2008 – present | Audience segmentation; expert panel member; content development
Danie Watson, MA | President, The Watson Group Marketing Communication | 09/2009 – 10/2010 | Audience segmentation; expert panel member
Daphne Terry Babrow, Ph.D., M.P.H., M.Ed. | ECCS Program Manager, Georgia Department of Community Health | 03/2010 – 08/2010 | Expert panel member
Renee J. Bator, Ph.D. | Associate Professor, State University of New York-Plattsburgh | 03/2010 – 08/2010 | Expert panel member
Jacqueline Moore Bowles, MBA | National President, Jack and Jill of America, Inc. | 03/2010 – 08/2010 | Expert panel member
Brian C. Castrucci, MA | Director, Maternal and Child Health Program, Georgia Department of Community Health | 03/2010 – 04/2010 | Information only
Robin Higa | Parent Leader Consultant, National Alliance of Children’s Trust and Prevention Funds | 03/2010 – 08/2010 | Expert panel member
Tammy Piazza Hurley, BS | Manager, Child Abuse and Neglect, Division of Safety and Health Promotion, American Academy of Pediatrics | 03/2010 – 08/2010 | Expert panel member
Rebecca Levin, MPH | Senior Manager, Injury, Violence, and Poison Prevention, American Academy of Pediatrics | 03/2010 – 08/2010 | Expert panel member
Stephanie Miles, MBA | Associate Director for Programming, WebMD | 03/2010 – 08/2010 | Expert panel member
Beth K. Rosen | Founder, 4 KEYS MEDIA, INC. | 03/2010 – 08/2010 | Expert panel member
Jane F. Silovsky, PhD | Associate Professor, University of Oklahoma Health Sciences Center | 03/2010 – 08/2010 | Expert panel member
Renee Wilson-Simmons, Dr.P.H. | Associate Director, Evidence-Based Practice Group, Annie E. Casey Foundation | 03/2010 – 08/2010 | Expert panel member
Lisa Witter | Chief Strategy Officer, Fenton Communications | 03/2010 – 08/2010; 07/2013 | Expert panel member; website design


The following list identifies those individuals consulted by Westat in the development of the specific aims and methods:

Person | Agency/Affiliation | Dates of Involvement | Type of Consultation
Amy Smith Slep, PhD | New York University, Family Translational Research Group | 2014 – present | Research design and method; study-specific aims; survey and measurement; sample, recruitment, and retention strategies
Michael Lorber, PhD | New York University, Family Translational Research Group | 2014 – present | Research design and method; study-specific aims; survey and measurement; sample, recruitment, and retention strategies
Nancy Weinfield, PhD | Westat | 2014 – present | Research design and method; study-specific aims; survey and measurement; sample, recruitment, and retention strategies
Nanmathi Manian, PhD | Westat | 2014 – present | Survey and data collection structure; sample, recruitment, and retention strategies
Crystal MacAllum, PhD | Westat | 2014 | Research design and method; study-specific aims; sample, recruitment, and retention strategies


Moreover, as noted in Section A.4 above, several federal partners were consulted to ensure that we were not duplicating the efforts of other federal entities. These included ACF, ASPA, ASPE, OPRE, AIDD, DOE, HRSA, and SAMHSA.


9. Explanation of Any Payment or Gift to Respondents

Nonresponse bias has two major components: rate of response and differences between respondents and nonrespondents (Bose & West, 2002). All studies face the possibility of nonresponse bias if target participants are differentially attracted to enroll based on relevant personal characteristics. Longitudinal evaluation studies are particularly at risk for nonresponse bias if nonrandom subsets of enrolled participants then discontinue participation. This study’s complex multiple-baseline longitudinal design requires participation of the same individuals over the course of 18 weeks to produce high quality data on longitudinal patterns of behavior in response to exposure to the parenting website. Participants will be asked to engage in weekly data collection activities, during specified and brief windows of time, and during a period in their lives when they face competing demands from work and as parents to young children. These parents will be exerting unusual effort, and therefore the potential for nonresponse bias among subsets of participants must be avoided proactively to ensure high quality data. Parents who are finding parenting a young child challenging, and who are experiencing high levels of stress and demands on their time, are both an important subgroup for this study and the parents most likely to attrite if they feel their effort is not valued by the study. Monetary tokens of appreciation will ensure that participants feel their burden is acknowledged and appreciated, and therefore they are more likely to remain engaged.


Studies have demonstrated that it is increasingly difficult to achieve high response rates in surveys (Brick & Williams, 2013; Curtin, Presser, & Singer, 2000), and it is therefore important to proactively combat the possibility of nonresponse bias. Nonresponse bias may become problematic when the research is conducted with hard-to-reach populations, when the material is sensitive, and when conducting longitudinal research (James, 1997). Research has demonstrated that providing a token of appreciation can improve recruitment and response rates (James, 1997; Singer & Couper, 2008). In a recent project in the CDC’s Division of Violence Prevention, parents from a high-risk community were asked to complete a survey assessing parenting and child factors that may contribute to or protect against teen dating violence. Although the outcome of interest in that study was teen dating violence, questions similar to those used in the currently proposed study were included (e.g., questions about parenting thoughts and behaviors, child behavior). In the first year of the project, parents were offered $2 for their participation. After almost no completed surveys were received from over 2,000 eligible parents, the incentive was increased to $25. While this improved participation, it was still difficult to recruit parents for the baseline survey, which was completed via paper and pencil in the families’ homes after researchers mailed it to the interested parents (OMB No. 0920-0941, exp. 06/30/2015). Other research has demonstrated that tokens of appreciation can significantly improve participant response rates for surveys focusing on sensitive topics without resulting in biased reporting or coercion (Carley-Baxter, Black, & Twiddy, 2007).


Achieving high recruitment and retention rates is critical to data quality for the current study, as parents will be asked to complete a baseline survey as well as repeated assessments for 18 weeks. Parent training programs, in general, are notorious for poor parental engagement and attendance in clinical settings. The research suggests that certain family characteristics, particularly single-parent status, socioeconomic disadvantage, and younger maternal age, are frequently associated with low levels of attendance (Lundahl, Risser, & Lovejoy, 2006; Reyno & McGrath, 2006; Spoth, Goldberg, & Redmond, 1999), while socioeconomic disadvantage, family distress, parental depression, and single-parent status have predicted lower quality of engagement and participation (Baydar et al., 2003; Dumas & Albin, 1986; Dumas, Nissley-Tsiopinis, & Moreland, 2007). Studies suggest that over half of all families who enroll in parent training programs may discontinue treatment prematurely (e.g., Barkley et al., 2001; Chacko et al., 2012), and the literature on engagement cited above suggests this discontinuation may not be random. The format for delivering the information in the proposed study attempts to address these limitations, as parents can access the material in their own homes at a time of their choosing; however, enrollment and continued engagement will likely remain challenges that could contribute to nonresponse bias.


Fathers also have been historically difficult to recruit for parenting research. In fact, much of the evidence-based parenting research has focused almost exclusively on mothers (Fabiano, 2007). A review of the literature on parenting programs illustrated that 87% of the studies included no information on fathers. Moreover, the studies that did provide information on fathers reported very low rates of father participation (e.g., less than 10% attendance in some studies; Fabiano, 2007). In a recent study on best practices for recruiting fathers, transportation and incentives were noted (by fathers) as good strategies for recruitment into parenting programs (Stahlschmidt, Threlfall, Seay, Lewis, & Kohl, 2013). For the current project, we are recruiting a range of parents to assess various parenting experiences; for example, mothers and fathers, as well as parents of different ethnicities and socioeconomic statuses, will be recruited. The proposed survey also covers sensitive topics (e.g., parenting thoughts, parenting behaviors), which adds to respondent burden and may differentially impact certain populations and increase bias if not addressed appropriately.


Tokens of appreciation are a reliable way to recognize this burden and the challenges associated with recruitment and engagement of parents, and to increase the overall quality of the survey by reducing the risk of nonresponse bias and increasing the efficiency of survey operations. Prior research has documented that incentives are effective at encouraging parents to enroll in face-to-face preventive interventions, particularly younger and socioeconomically disadvantaged participants (Dumas, Begle, French, & Pearl, 2010; Guyll, Spoth, & Redmond, 2003; Heinrichs, 2006). Furthermore, the current study will rely mostly on web-based interactions with respondents as opposed to in-person, face-to-face interactions. Research has demonstrated that tokens of appreciation motivate people to start web-based surveys and that, once those individuals have accessed the survey, they are more likely to complete it if a token of appreciation is offered (Collins, Ellickson, Hays, & McCaffrey, 2000; Goritz, 2006). Consequently, we believe that using monetary tokens of appreciation will promote survey engagement and completion and will reduce the likelihood of nonresponse bias.


If this research were attempted without a token of appreciation, the cost to the government for recruitment alone would be substantial. The token of appreciation not only encourages participants to sign up for the research but also helps keep project costs under control. Our approach to tokens of appreciation is based on the need to balance motivating respondents to participate without offering a coercive sum (i.e., a sum that a low-income individual would find difficult to refuse; Dillman, Smyth, & Christian, 2009). Although we considered alternative approaches for incentivizing parents, we determined that a low-cost, graduated token of appreciation approach would likely be the most effective design, based on the literature and the past experience of members of the current research group. Furthermore, the value is tied to the burden of the assessment, such that the more burdensome the assessment, the greater the value of the token of appreciation.


Given the study design, the documented difficulty of recruiting participants, and the harder-to-reach populations included in the current study, we propose the following incentive schedule: Parents will receive a $20 incentive for the first, detailed baseline assessment and a $10 incentive for each subsequent weekly assessment completed during weeks 2-17. Respondents will receive a $30 incentive for the longer, 1-month post-intervention follow-up. A bonus incentive of $40 will be provided to parents who complete each of the assessments within its designated timeframe, which will help increase data quality and decrease nonresponse bias. All incentives will be provided to the parents on a reloadable Payoneer MasterCard debit card, mailed to consented, enrolled parents once they complete the first baseline assessment. Funds will be automatically transferred to the card 24 hours after each assessment is completed. Payoneer thus affords two valuable advantages over other reimbursement systems: it reduces administrative burden and it expedites payment, helping to strengthen participant engagement.
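For illustration, and assuming the $40 bonus is a single end-of-study payment, a parent who completes the baseline assessment, all 16 weekly assessments in weeks 2-17, and the 1-month follow-up within the designated timeframes would receive at most $20 + (16 × $10) + $30 + $40 = $250.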


10. Assurance of Confidentiality Provided to Respondents

This submission has been reviewed by the National Center for Injury Prevention and Control, which determined that the Privacy Act does apply. The applicable System of Records Notice (SORN) is 09-20-0160, “Records of Subjects in Health Promotion and Education Studies” (Federal Register: November 24, 1986, Volume 51, Number 226, Pages 42484-42485).


10.1 Privacy Impact Assessment Information


i) Overview of the Data Collection System

Data collection will be conducted by qualified individuals employed by the contractor, Westat. Data will be collected from parents using online assessments. The following steps will be implemented by CDC to safeguard the objectivity of the evaluation: 1) all study staff with access to participants and personally identifying information (PII) will receive human subjects training; and 2) CDC will hold weekly or bi-weekly conference calls with the contractors to provide oversight and discuss data collection procedures.


Sample and Screening. We will recruit 400 parents to ensure enrollment of a diverse group and a final sample size of 200. Methods used to recruit participants are described below. A brief online screening questionnaire (see Attachments I1 and J) will be used to capture key demographic eligibility information. All demographic assessments are in accord with OMB guidelines. Parents who do not meet the screening criteria (e.g., they are over age 45, or their oldest child is 5 or older) will be informed immediately of their ineligibility and thanked for their interest. The following criteria will be used for eligibility screening (an illustrative sketch of the eligibility logic follows the list):

  • Parent is between ages 18-45 years.

  • Parent is the biological, adoptive, or step-parent of at least one child, the oldest of whom is between ages 2-4 years.

  • Parent has internet access at home.

  • Parent is willing to commit to intervention/assessment procedures.

  • Parent is comfortable answering questions in English for online surveys.
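
The following sketch illustrates how these screening criteria could be applied programmatically; the field names and data structure are hypothetical and are not part of the study’s actual online screener.

  from dataclasses import dataclass

  @dataclass
  class ScreenerResponse:
      parent_age: int
      oldest_child_age: int          # in years
      is_parent_or_stepparent: bool  # biological, adoptive, or step-parent of the child
      has_home_internet: bool
      willing_to_commit: bool        # willing to commit to intervention/assessment procedures
      comfortable_in_english: bool

  def is_eligible(r: ScreenerResponse) -> bool:
      # Return True only if every screening criterion listed above is met.
      return (
          18 <= r.parent_age <= 45
          and 2 <= r.oldest_child_age <= 4
          and r.is_parent_or_stepparent
          and r.has_home_internet
          and r.willing_to_commit
          and r.comfortable_in_english
      )

  # Example: a 30-year-old parent whose oldest child is 3 and who meets all other criteria
  print(is_eligible(ScreenerResponse(30, 3, True, True, True, True)))  # True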


Once eligibility is established, the following additional demographic information will be sought to ensure diversity of participants enrolled in the study and to inform recruitment:

  • Parent race

  • Parent ethnicity

  • Household income relative to poverty

  • Geographical region of the country

  • Parent gender


Parents will be assigned to intervention condition using block randomization with blocks of 20 parents. Within each block of 20, enrollees will be randomly assigned to either the Natural Navigation (NN) or the Guided Navigation (GN) arm. If a parent is randomized to the GN arm, the randomization will also determine the order in which modules are presented. Within each block, 10 parents will be assigned to NN and 10 to GN, and within the GN arm, two parents will be assigned to each possible module presentation order based on a Latin square design of module orders. Assignment to condition will be done after a parent consents to participation (see Attachment E: Participant Phone Consent Script) and is officially enrolled in the study. Parents will be informed of their assignment during the enrollment phone call and told to expect an email or text message containing login information. The Post-Screening Introductory Email for Participants and the Post-Enrollment Follow-Up Email for Participants can be found in Attachment F and Attachment G, respectively.
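For illustration only, the sketch below shows one way the block randomization described above could be implemented. The cyclic construction of the Latin square, the function names, and the use of Python’s random module are assumptions, not the study’s actual randomization procedure.

  import random

  MODULES = ["Communicating with Your Child", "Creating Structure and Rules", "Giving Directions",
             "Using Discipline and Consequences", "Using Time-Out"]

  def latin_square_orders(modules):
      # Cyclic Latin square: order i starts with module i and wraps around (5 orders in total).
      n = len(modules)
      return [[modules[(start + i) % n] for i in range(n)] for start in range(n)]

  def assign_block(block_of_20_ids, rng=random):
      # 20 enrollees per block: 10 to Natural Navigation (NN), 10 to Guided Navigation (GN);
      # within GN, 2 parents per Latin square module order (5 orders x 2 = 10).
      ids = list(block_of_20_ids)
      rng.shuffle(ids)
      nn_ids, gn_ids = ids[:10], ids[10:]
      orders = latin_square_orders(MODULES)
      assignments = {pid: ("NN", None) for pid in nn_ids}
      for i, pid in enumerate(gn_ids):
          assignments[pid] = ("GN", orders[i // 2])
      return assignments

  # Example with hypothetical participant IDs 1-20
  for pid, (arm, order) in assign_block(range(1, 21)).items():
      print(pid, arm, order)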


ii) Items of Information to be Collected

Given the nature of the research design, names, email addresses, and phone numbers will be collected so that participants can be contacted at each phase of the research. Other demographic information will be collected, including gender, race, ethnicity, and household income. To provide tokens of appreciation, participants’ mailing addresses will be collected. Participants will also complete measures assessing their thoughts and perceptions about parenting, their child’s behavior, and their understanding and use of the EFP resource.


Parents will complete 18 weekly online assessments, cued by on-screen messages as they log into EFP modules, as well as by email or text messages (per participant’s preference). The assessments are aggregated into four groups that reflect what is assessed and when: (1) core assessment, (2) content knowledge and usefulness assessment, (3) detailed assessment, and (4) rotating assessment. Table A.10.1.A below includes the constructs measured, the measures that will be utilized, the number of items associated with each measure, and the specific study goal to be addressed. Attachment K is a graphical representation of the assessment process; additional information about each of the assessments follows.







Table A.10.1.A – Constructs, Measures, and Goals

Construct | Measure | # of Items | Study Goal

Detailed Assessment/Rotating Assessment
Child Externalizing Behavior | Eyberg Child Behavior Inventory | 36 | Goal 1 & 1a
Harsh Discipline | Parenting Scale Short Form, Overreactivity Subscale | 5 | Goal 1 & 1a
Permissive Discipline | Parenting Scale Short Form, Laxness Subscale | 5 | Goal 1 & 1a
Corporal Punishment | Parent-Child Conflict Tactics Scale, Corporal Punishment Subscale | 6 | Goal 1 & 1a
Positive Parenting | Parent Behavior Inventory, Supportive/Engaged Subscale | 10 | Goal 1 & 1a
Burden in Parenting Role | Fragile Families Study Parenting Aggravation Scale | 4 | Goal 1 & 1a
Parental Sense of Competence | Parental Sense of Competence Scale, Efficacy Subscale | 7 | Goal 1 & 1a
Attitudes Toward Corporal Punishment | Attitudes toward Corporal Punishment Scale | 4 | Goal 1 & 1a
Dysfunctional Child-Centered Causal Attributions | Parent Cognition Scale, Child Responsible Subscale | 9 | Goal 1 & 1a
Parental Depression | PROMIS Emotional Distress-Depression – Short Form 4a | 4 | Goal 1 & 1a
Parental Anxiety | PROMIS Emotional Distress-Anxiety – Short Form 4a | 4 | Goal 1 & 1a
Parental Stress | Perceived Stress Scale Short Form | 4 | Goal 1 & 1a
Demographics | Demographics | 18 | Goal 1 & 1a

Core Assessment
Child Externalizing Behavior | Parent Daily Report Checklist Short Form | 5 | Goal 1 & 1a
Parental use of Praise | Praise Scale | 3 | Goal 1 & 1a
Parental use of Child Directed Play | Child Directed Play Scale | 3 | Goal 1 & 1a
Parental use of Commands and Consequences | Commands and Consequences Scale | 3 | Goal 1 & 1a
Parental use of Routines | Routines Scale | 3 | Goal 1 & 1a
Parental use of Time Out | Time Out Scale | 4 | Goal 1 & 1a

Knowledge/Usefulness Assessment
Parental Knowledge of EFP Skills | Parental EFP Skills Knowledge Scale | 3 per module | Goal 1b
Perceived Usefulness/Applicability of EFP Skills | Parental EFP Skills Usefulness Scale | 6 per module | Goal 1c
Consumer Satisfaction | Therapy Attitude Inventory | 8 | Goal 1c
System Usability | System Usability Scale | 5 | Goal 1c


Core Assessment (see Attachments I3 and J). For a multiple baseline design, the same measures must be completed each week. Thus, each parent will need to complete 18 core assessments. The core assessments are timed to allow the examination of change from pre-, to mid- (day 7), to post-completion (day 14) of each intervention module. To accommodate 3 assessments per module while allowing 2 weeks for parents to complete each module, the post assessment of one module is administered at the same time as the pre assessment for the next. The specific core assessment measures are listed below:

  • Child Externalizing Behavior. An appropriate measure of child externalizing behavior for this study is one that is age appropriate, has good coverage of the externalizing construct, has psychometrics robust to repeated administration, and has been shown to be sensitive to parenting intervention effects. Webster-Stratton’s Parent Daily Report Checklist fits those criteria (PDR; Webster-Stratton, Kolpacoff, & Hollinsworth, 1988). We plan to administer 5 PDR items that represent a fair cross-section of the externalizing construct: (1) hitting, kicking, or biting others; (2) being hyperactive or noisy, running around; (3) being non-compliant, defiant; (4) yelling, having temper tantrums; and (5) being destructive (damaging property). Answer choices will reflect the frequency of each behavior in the past 7 days. Other measures of child externalizing behavior were judged to be either too lengthy for weekly assessment or to have less complete construct coverage.

  • Parenting Behavior. We will measure parenting behaviors that are specific to each of the five EFP modules with a 16-item measure comprising the following (drawing from validated measures where possible):

  • A 3-item praise measure drawn from Webster-Stratton, Reid, and Hammond (2001) (e.g., “When my child behaved well or did a good job at something, I praised or complimented her/him.”). It corresponds to skills that are emphasized in the Communicating with Your Child unit and responded to a preschool parenting intervention in Webster-Stratton et al. (2001). Answer choices will reflect the frequency of each behavior in the past 7 days.

  • A 3-item child-directed play skills measure derived from Strayhorn’s Clinical Parent Questionnaire on Promoting a Positive Emotional Climate (e.g., “How many days last week did you have a special playtime with just you and your child?”). Although this measure has not been formally validated, it directly taps skills that are emphasized in the Communicating with Your Child module (e.g., using tracking and verbal labeling during child directed play) and has face validity. Answer choices will reflect the frequency of each behavior in the past 7 days.

  • A 3-item commands and consequences skills measure derived from Strayhorn’s Clinical Parent Questionnaire on Promoting a Positive Emotional Climate (e.g., “I used a consequence if my child refused to comply with a command.”). It directly taps skills that are emphasized in the Giving Directions and Using Discipline and Consequences modules. Answer choices will reflect the frequency of each behavior in the past 7 days.

  • A 3-item routines measure derived from the Parent Practices Scale (PPS; Strayhorn & Weidman, 1988), a formally validated measure that responds to parenting interventions (e.g., McMahon et al., 1999). The PPS has three items that tap the regularity of children’s schedules (e.g., “How many days a week does your child go to bed at one particular time, known as his or her official bedtime?”), a behavior that is emphasized in the Creating Structure and Rules module. Answer choices will reflect the frequency of each behavior in the past 7 days.

  • A 4-item study specific time out measure (e.g., “How many times did you use a time-out with your child in the past 7 days?”), tapping the use of time out according to EFP intervention guidelines emphasized in the Time-Out module. Answer choices will reflect the frequency of each behavior in the past 7 days.


Content Knowledge and Usefulness Assessment (see Attachments I4, I5, and J). We will assess module-specific knowledge twice per module, at days 1 (pre) and 14 (post) of the unit, and perceived usefulness of content once per module, at day 14 of the unit (post). Knowledge items from all modules will also be part of both the week 1 and week 18 assessments. Usefulness items from all modules will also be part of the week 18 assessment. Each knowledge measure will be a 3-item, module-specific true/false quiz tapping the major points of the skill being taught in the module the parent is working on. To illustrate, parents will be asked three questions about the features of effective commands during the Giving Directions module and three questions about the features of age-appropriate time-outs during the Time-Out module. Each usefulness measure will comprise 3 module-specific items tapping usefulness, everyday applicability to parenting challenges, and whether the information is sufficiently detailed to be applied.


Detailed Assessment (see Attachments I2 and J). The detailed assessment will be administered at weeks 1 and 18 and is designed to measure child externalizing behavior, parenting behaviors, parenting stress, parenting thoughts, and parent psychological symptoms with validated measures. Demographic items not asked during the screening and enrollment process will also be added to the week 1 questionnaire. The specific detailed assessment measures are listed below:

  • Child Externalizing Behavior. Parents will complete the 36-item Eyberg Child Behavior Inventory (ECBI; Boggs, Eyberg, & Reynolds, 1990). It is a widely used and validated measure of externalizing behavior (e.g., “Acts defiant when told to do something.”) for children as young as two and has been shown to respond to parenting interventions (e.g., Schuhmann, Foote, Eyberg, Boggs, & Algina, 1998).

  • Harsh and Permissive Discipline. Parents will complete the overreactivity (5 items; e.g., “I get so frustrated or angry that my child can see that I’m upset.”) and laxness (5 items; e.g., “When I say my child can’t do something, I let my child do it anyway.”) short form subscales of the Parenting Scale (Arnold, O’Leary, Wolff, & Acker, 1993), a widely used measure that responds to parenting interventions (Sanders, Markie-Dadds, Tully, & Bor, 2000). Parents will also complete the 6-item corporal punishment (e.g., “Have you spanked him/her on the bottom with your bare hand?”) subscale of the Parent-Child Conflict Tactics Scale (CTS-PC; Straus, Hamby, Finkelhor, Moore, & Runyan, 1998). The CTS-PC corporal punishment scale has established concurrent and predictive validity (Lorber & Slep, 2014; Mahoney, Donnelly, Lewis, & Maynard, 2000).

  • Positive Parenting. Parents will complete the 10-item supportive/engaged subscale (e.g., “I hold or touch my child in an affectionate way.”) of the Parent Behavior Inventory (PBI; Lovejoy, Weis, O'Hare, & Rubin, 1999). PBI supportive/engaged subscale scores are associated with observations of positive parenting and questionnaire measures of child externalizing behavior, and they exhibit significant test-retest stability (Lovejoy et al., 1999).

  • Burden in Parenting Role. A key aspect of the parenting stress construct, parental burden (e.g., “I feel trapped by my responsibilities as a parent.”), will be tapped by 4 items from the Fragile Families Study Parenting Aggravation scale (MacKenzie, Nicklas, Brooks-Gunn, & Waldfogel, 2011). It exhibits acceptable internal consistency and stability and predicts physically aggressive parenting (MacKenzie et al., 2011; Wilson, Fritz, & Lorber, 2014).

  • Parental Sense of Competence. Parents will complete the 7-item efficacy subscale (e.g., “I honestly believe I have all the skills necessary to be a good parent.”) of the Parental Sense of Competence Scale (Johnston & Mash, 1989). It has replicable factorial validity and convergent validity with other measures, is associated with parenting, and responds to interventions (Coleman & Karraker, 2000; Ohan, Leung, & Johnston, 2000; Sanders et al., 2000).

  • Attitudes Toward Corporal Punishment. The 4-item Attitudes toward Corporal Punishment Scale (Lorber, O’Leary, & Slep, 2011) taps the extent to which parents believe spanking and slapping are justified and efficacious responses to misbehavior (e.g., “Is it justified for a mother to spank her child on the bottom with a bare hand?”). It is internally consistent and associated with parent-child physical aggression (Lorber et al., 2011; Slep & O’Leary, 2007).

  • Dysfunctional Child-Centered Causal Attributions. The 9-item child responsible subscale of the Parent Cognition Scale (PCS; Snarr, Slep, & Grande, 2009) reflects parents’ beliefs that their children’s negative behaviors are intentional and done with hostile intent (e.g., “My child tries to get my goat or push my buttons.”). The child responsible subscale has strong internal consistency, test-retest reliability, and associations with parenting (Snarr et al., 2009).

  • Parent Psychological Symptoms. Parents will complete 4-item depression (e.g., “I felt depressed.”) and anxiety (e.g., “My worries overwhelmed me.”) short forms from the NIH PROMIS version 1.0 item bank (Pilkonis et al., 2011). PROMIS measures are new and exceptionally well validated via item response theory techniques. Parents will also complete the 4-item Perceived Stress Scale (PSS; e.g., “In the last month, how often have you felt that you were unable to control the important things in your life?”) (Cohen, Kamarck, & Mermelstein, 1983; Cohen & Williamson, 1988). The PSS is the most widely used subjective measure of stress and has been repeatedly validated in several studies and countries (Monroe, 2008).

  • Demographics. At the week 1 assessment, we will administer remaining demographic questions that were not already asked during the screening process (see Attachment I1 and J).


Rotating Assessment (see Attachments I3 and J). The rotating assessments will be administered during weeks 2-17. It would be advantageous to administer the detailed assessment weekly to examine intervention effects, but doing so would be too burdensome for parents. Thus, we will administer only some of the measures from the detailed assessment each week, using a planned missingness design (Graham, Taylor, & Cumsille, 2001). The measures can be grouped into 4 blocks of 9 to 11 items as follows: (1) short forms (Lorber, Xu, Slep, Bulling, & O’Leary, 2014) of the PS overreactivity and laxness subscales (11 items), (2) the CTS-PC corporal punishment and Fragile Families Parenting Aggravation subscales (10 items), (3) the Attitudes toward Corporal Punishment Scale and the Parental Sense of Competence Scale efficacy subscale (11 items), and (4) the Parent Cognition Scale child responsible subscale (9 items). The order of administration will be counterbalanced with a 4 × 4 Latin square to ensure an equal number of administrations of each measure block with no order confounding.
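As an illustration of the counterbalancing idea, the sketch below assumes that one measure block is administered per weekly rotating assessment and that the 4 × 4 Latin square is generated cyclically. The block labels, scheduling function, and assignment of sequence rows to parents are assumptions; the study’s actual assignment of measure blocks to weeks may differ.

  BLOCKS = [
      "PS overreactivity and laxness short forms (11 items)",
      "CTS-PC corporal punishment and Parenting Aggravation subscales (10 items)",
      "Attitudes toward Corporal Punishment and PSOC efficacy subscales (11 items)",
      "Parent Cognition Scale child responsible subscale (9 items)",
  ]

  def rotating_schedule(sequence_row, weeks=range(2, 18)):
      # sequence_row (0-3) selects a row of a cyclic 4 x 4 Latin square; over weeks 2-17
      # each block is administered 4 times, and across the four sequence rows each block
      # appears once in every weekly position.
      return {week: BLOCKS[(sequence_row + i) % 4] for i, week in enumerate(weeks)}

  # Example: the block administered in week 2 under each of the four sequences
  for row in range(4):
      print(row, rotating_schedule(row)[2])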


Website Usage Assessment (see Attachments I6 and J). Currently, the CDC website collects metrics via Google Analytics, which gathers user information from the website to provide reporting on usage such as page views and average time spent. The same metrics will be applied to the study-specific mock website housed and managed by Westat. However, to answer the research question, “How do parents perceive the website?” we plan to administer the System Usability Scale. The items (e.g., “I would recommend that others use this website”; “I thought the website was easy to use”) tap perceived ease of use as well as usefulness of the website. Because system usability is not module-specific, the website usage assessment will be administered only during the week 18 assessment.


Comparison to past literature (Goal 1d). The Eyberg Child Behavior Inventory (ECBI), Parent Daily Report Checklist, Parenting Scale (PS), and Parent-Child Conflict Tactics Scale are among the most common measures of child externalizing behavior and related parental discipline practices used as outcomes in dozens of parenting-focused prevention and clinical intervention trials (e.g., Chen & Chan, 2015; Conduct Problems Prevention Research Group, 2011; Dodge et al., 2004; Feil et al., 2011; Gross et al., 2003; Kazdin, Siegel, & Bass, 1992; Melhuish et al., 2007; Oveisi et al., 2010; Patterson, Forgatch, & DeGarmo, 2010; Sanders, Markie-Dadds, Tully, & Bor, 2000; Schuhmann, Foote, Eyberg, Boggs, & Algina, 1998; Webster-Stratton, Kolpacoff, & Hollinsworth, 1988; Zubrick et al., 2005). The Parent Behavior Inventory has also been shown to respond to parent-focused interventions in two studies (Berkowitz et al., 2011; Sheeber et al., 2012). The Praise Scale has been used by Webster-Stratton’s group to evaluate parenting interventions (Webster-Stratton, Reid, & Hammond, 2001). The Parent Practices Scale has been used to evaluate the effects of the FAST TRACK program (e.g., McMahon et al., 1999). The Parental Sense of Competence Scale has been used in several evaluations of parenting interventions (e.g., Chen & Chan, 2015; Sanders et al., 2000; Sheeber et al., 2012). We will thus be able to easily compare the amount of pre- to post-intervention change in the above measures to previously published results. To illustrate, Sanders, Dittman, Farruggia, and Keown (2014) studied an on-line parenting intervention similar to Essentials for Parenting Toddlers and Preschoolers (EFP) and found pre- to post-intervention effect sizes (standardized mean differences) of d = 1.54 for the ECBI Intensity subscale and d = 1.29 for the PS Overreactivity subscale scores. Because we are using the same measures, we will be able to compute directly parallel effect sizes to those reported by Sanders et al. (2014). We will be able to use this strategy for each of the above measures.
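For reference, the standardized mean difference cited above is computed as the change in means divided by a standard deviation, for example

  d = (M_pre − M_post) / SD

where SD is typically the baseline or pooled standard deviation (the exact variant used for comparison would follow the cited studies). As a purely hypothetical illustration, a mean ECBI Intensity score falling from 130 (SD = 25) at baseline to 95 at post-intervention would yield d = (130 − 95) / 25 = 1.40, directly comparable to the d = 1.54 reported by Sanders et al. (2014).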


We are also using the Therapy Attitude Inventory (TAI) as a measure of “consumer satisfaction.” The TAI was specifically developed for use in conjunction with parenting interventions and has been used in several studies (Fabiano et al., 2009; Gardner, Burton, & Klimes, 2006; Schuhmann, Foote, Eyberg, Boggs, & Algina, 1998). Thus, we will be able to compare our participants’ satisfaction with the program to that of participants in prior parenting intervention research.


A second group of measures, which includes the Parenting Aggravation Scale, Attitudes Toward Corporal Punishment Scale, Parent Cognition Scale, Time Out Scale, and PROMIS Depression and Anxiety Scales, has not been used in prior evaluations of parenting interventions; however, the constructs have been measured in evaluations of parenting interventions with different questionnaires (e.g., Bugental et al., 2002; Chen & Chan, 2015; Costin & Chambers, 2007; Drugli, Larsson, Fossum, & Mørch, 2010; Palusci, Crum, Bliss, & Bavolek, 2008; Reich, Penner, & Duncan, 2012; Roberts & Powers, 1990; Sanders et al., 2004; Zubrick et al., 2005). We have chosen the current measures to reduce the burden on participants, as they have psychometric properties (i.e., reliability and validity) that are equivalent to or better than those used in prior research. Moreover, although we are using different measures, we will still be able to compare effect sizes (d) obtained in our research to those from the above studies, as d is a standardized (i.e., scale-unifying) metric.


A third group of measures comprises study-specific measures. We designed the Child Directed Play and Commands and Consequences Scales because child-directed play and the appropriate use of commands and consequences are key parenting competencies emphasized in the EFP web resource for which no pre-existing measures were available. Finally, the Parental EFP Skills Knowledge and Usefulness Scales, as well as the System Usability Scale, are specific to the EFP evaluation and will guide changes to the web resource.


iii) How information will be shared and for what purpose

Survey data at the individual level will never be shared. The survey data will be housed in a database on encrypted, password protected electronic storage files. All information shared will be in an aggregate form for the scientific community. The data will be translated for practitioners and others engaged in parent training work. Ultimately, the results of the work will be disseminated to researchers, states, and the public. In addition, knowledge and feedback gained from the data collected in this study will be used to improve and update the EFP resource, as needed.


iv) Impact of the proposed collection on respondents’ privacy

The proposed collection will have a minimal impact on respondents’ privacy. The respondents’ names, email addresses, and phone numbers will be collected and used to facilitate survey responses. All data collection and data management staff will be well trained in maintaining information security at all stages of the data collection and data management process. Protocols for data collection will ensure that names, addresses, and all other personally identifying information are kept secure during all stages of data collection. Recruitment lists and survey data will be kept in locked, secure facilities by the contractor.


Data will be stored in encrypted databases on password-secured data platforms. Data will be linked only with a unique identifier code and will be kept in a separate database from personally identifying data, and a third database with limited personnel access will contain information linking participants with their unique identifier codes. Identified data will be stored and maintained by the evaluation contractor, and only de-identified data will be given to CDC at the conclusion of the contract. The contractor will be required to destroy all data within 6 months of the end of the contract, provided that the data have been safely and successfully handed over to CDC and CDC has had an opportunity to verify the accuracy and completeness of the data.


v) Whether individuals are informed that providing the information is voluntary or mandatory.

Participants will be informed that the surveys are voluntary and that they may choose to discontinue participation in the survey at any time for any reason.


vi) Opportunities to consent, if any, to sharing and submission of information.

Participants will be required to provide consent via phone prior to taking part in any study-related activities. Participants who do not wish to take part may decline to consent. The script for the phone consent process can be found in Attachment E.


vii) How the information will be secured.

Data that are collected will be stored physically and electronically by the contractors collecting the respective data at their offices. De-identified electronic database(s) will be transferred to CDC. Any hard copies of data will be destroyed after the data have been successfully entered, cleaned, and backed up in the database.

Based on the current collection design (pre-OMB approval), Westat intends to de-identify the final data delivery to CDC as follows (an illustrative sketch of these steps appears after the list):

  1. Assign a unique, serial/random identifier (ParentID) to facilitate data linking. 

  2. Categorize household income into agreed-upon categories rather than retaining it as a continuous variable.

  3. Remove all administrative data (e.g., incentive payment data, electronic records of contact).

  4. Remove identifying demographic variables from the collected data. The variables would include:

  • Parent name

  • Child Name

  • Any collected physical or mailing addresses

  • Any collected telephone numbers

  • Any collected social media usernames

  • Any contact notes
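
For illustration only, the sketch below shows how steps 1 through 4 might be applied to a participant data file using pandas. The column names, income cut points, and random (rather than serial) ParentID generation are assumptions for the sketch, not specifications of Westat's actual data dictionary or procedures.

```python
import uuid

import pandas as pd

# Hypothetical column names and income cut points; the real data dictionary
# and agreed-upon categories are defined by the study team, not this sketch.
PII_COLUMNS = ["parent_name", "child_name", "mailing_address", "phone_number",
               "social_media_username", "contact_notes"]
ADMIN_COLUMNS = ["incentive_payments", "contact_records"]
INCOME_BINS = [0, 25_000, 50_000, 75_000, float("inf")]
INCOME_LABELS = ["<$25k", "$25k-$50k", "$50k-$75k", "$75k+"]


def deidentify(raw: pd.DataFrame) -> pd.DataFrame:
    """Apply de-identification steps 1-4 to a raw participant data frame."""
    df = raw.copy()
    # Step 1: assign a unique random identifier to support data linking.
    df["ParentID"] = [uuid.uuid4().hex for _ in range(len(df))]
    # Step 2: replace continuous household income with income categories.
    df["income_category"] = pd.cut(df["household_income"],
                                   bins=INCOME_BINS, labels=INCOME_LABELS)
    df = df.drop(columns=["household_income"])
    # Steps 3 and 4: drop administrative data and identifying demographic fields.
    to_drop = [c for c in ADMIN_COLUMNS + PII_COLUMNS if c in df.columns]
    return df.drop(columns=to_drop)
```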


The system will be hosted at Westat’s corporate campus in Rockville, Maryland. Westat maintains a secure, environment-monitored and controlled facility. The facility has generator backup, video monitors, and fire-suppression equipment. As a requirement of the contract with CDC, Westat must ensure that its information systems meet CDC certification and accreditation standards. This project has been assigned a security category and Westat has developed a System Security Plan (SSP) to ensure protocols are in place to meet this designation. Westat has received approval of an SSP for a full moderate security categorization for this project. Per conversations that have taken place between CDC, Westat, and the NCIPC ISSO, we anticipate Westat’s system will receive the Authority to Operate (ATO) by September 2015.


Access to servers, workstations, and other equipment containing sensitive or valuable data is limited to those personnel required to use these systems as part of their jobs. Data will not be hosted at any other location, other than encrypted off-site backups, which are stored at a nationally recognized third-party commercial storage facility that conforms to government requirements for off-site backup of systems hosted at Westat.


As needed, analytic data files will be created on secure Westat network areas. The files will draw data from the system, accessing only ID-based data. No PII will be visible to the analytic process.


At the study end, de-identified data will be provided to CDC. After CDC acceptance, Westat will delete all databases and data files related to the study. Encrypted off-site backups will be purged after one year.


viii) Whether a system of records is being created under the Privacy Act.

This project is subject to the Privacy Act. The applicable System of Records Notice (SORN) is 09-20-0160, “Records of Subjects in Health Promotion and Education Studies.”


10.2. IRB Approval

The project contractor, Westat, has obtained local IRB approval to collect data from study participants. The IRB Approval Letter can be found in Attachment D. As indicated in the letter, the Westat IRB conducted a review of the study and procedures and exempted the study from further review on the grounds that the study is a program evaluation.


11. Justification for Sensitive Questions

Some questions included in survey instruments might be considered sensitive by some respondents. The surveys include questions on sensitive issues such as parental use of and attitudes toward corporal punishment. Table 11.A below identifies the sensitive questions, explains the justification for their inclusion in the surveys, and describes how the data will be used. The informed consent protocol apprises participants that these topics will be covered during the surveys. Participants will be permitted to skip questions that they do not feel comfortable answering. All sensitive questions have been used previously in research and are from various validated assessment tools (e.g., Attitudes Toward Corporal Punishment, Conflict Tactics Scale-Parent-Child). As with all information collected, these data will be presented with all identifiers removed.





Table 11.A

Description of Questions: Parental use of and attitudes toward corporal punishment

Justification for Inclusion: Necessary to determine effects of the project in preventing or reducing clinical and subclinical levels of child abuse and neglect

Use of Data: Used in the multilevel time series analysis to determine the impact of the intervention in increasing positive parenting skills


12. Estimates of Annualized Burden Hours and Costs

Burden estimates were derived from the number and nature of the questions (e.g., whether they are open-ended) and the methods of administration.


A.12.A. Burden

Table A-12 details the annualized number of respondents, the average response burden per survey/measure/questionnaire, and the total response burden. Estimates of burden for the surveys are based on simulated runs with staff answering each survey, in addition to the estimated completion times provided in the manuals that accompany the validated measures being used in this study. We anticipate that the surveys will take between 15 and 45 minutes to complete, depending on which survey is being completed. All surveys/measures will be completed by approximately 200 participants, with the exception of the Screening and Demographics Questionnaires, which will be completed by approximately 400 participants. Some of the surveys/measures will be completed once, while others will be completed up to 18 times. The total estimated burden for this project is 2,050 hours.


Table A.12.A. – Estimate of Annual Burden Hours

Type of Respondents: Parents (both Natural Navigation [NN] and Guided Navigation [GN] groups)

Form Name | No. of Respondents | No. of Responses per Respondent | Avg. Burden per Response (in hrs) | Total Burden (in hrs)
Form 1 - Screening and Demographics Questionnaires (Attachment I1) | 400 | 1 | 15/60 | 100
Form 2 - Detailed Assessment Measures (Attachment I2) | 200 | 2 | 45/60 | 300
Form 3 - Core Assessment Measures (Rotating) (Attachment I3) | 200 | 18 | 15/60 | 900
Form 4 - Parental EFP Skills Knowledge Scale (Attachment I4) | 200 | 8 | 15/60 | 400
Form 5 - Parental EFP Skills Usefulness Scale (Attachment I5) | 200 | 6 | 15/60 | 300
Form 6 - Therapy Attitude Inventory and System Usability Scale (Attachment I6) | 200 | 1 | 15/60 | 50
Total | | | | 2,050


A.12.B. Estimated Annualized Burden Cost

The hourly wage used to calculate the Respondent Cost is $7.25, which is the minimum wage under the Fair Labor Standards Act (FLSA). Total Respondent Cost for this evaluation is $14,862.50.


Table A.12.B. – Estimate of Annualized Burden Costs

Type of Respondents: Parents (both Natural Navigation [NN] and Guided Navigation [GN] groups)

Form Name | No. of Respondents | No. of Responses per Respondent | Total Burden (in hrs) | Hourly Wage Rate | Respondent Cost
Form 1 - Screening and Demographics Questionnaires (Attachment I1) | 400 | 1 | 100 | $7.25 | $725.00
Form 2 - Detailed Assessment Measures (Attachment I2) | 200 | 2 | 300 | $7.25 | $2,175.00
Form 3 - Core Assessment Measures (Rotating) (Attachment I3) | 200 | 18 | 900 | $7.25 | $6,525.00
Form 4 - Parental EFP Skills Knowledge Scale (Attachment I4) | 200 | 8 | 400 | $7.25 | $2,900.00
Form 5 - Parental EFP Skills Usefulness Scale (Attachment I5) | 200 | 6 | 300 | $7.25 | $2,175.00
Form 6 - Therapy Attitude Inventory and System Usability Scale (Attachment I6) | 200 | 1 | 50 | $7.25 | $362.50
Total | | | 2,050 | | $14,862.50
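
As a simple cross-check, the short calculation below reproduces the totals in Tables A.12.A and A.12.B from the per-form figures; it introduces no data beyond what the tables already contain.

```python
# Per-form figures from Tables A.12.A and A.12.B:
# (respondents, responses per respondent, avg. burden per response in hours)
forms = {
    "Form 1": (400, 1, 15 / 60),
    "Form 2": (200, 2, 45 / 60),
    "Form 3": (200, 18, 15 / 60),
    "Form 4": (200, 8, 15 / 60),
    "Form 5": (200, 6, 15 / 60),
    "Form 6": (200, 1, 15 / 60),
}
HOURLY_WAGE = 7.25  # federal minimum wage used for respondent cost

total_hours = sum(n * k * hrs for n, k, hrs in forms.values())
total_cost = total_hours * HOURLY_WAGE
print(total_hours)  # 2050.0 burden hours
print(total_cost)   # 14862.5 dollars
```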


13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

Respondents will incur no capital or maintenance costs.

14. Annualized Cost to the Federal Government

Contractual costs:

This is a contracted data collection, led by Westat under contract for CDC. The total cost of the contract over the 2 years of data collection is $236,766.00.


Budget Line Item | Budget
Personnel Costs | $133,654
Other Direct Costs (Computing, Web Ads, Web Server, Text Messaging Fees, etc.) | $25,490
Participant Tokens of Appreciation | $41,500
G&A + Fixed Fee | $36,122
Contractual Cost | $236,766


Federal employee costs:

NCIPC has assigned a Project Officer and Science Officer to assist with and oversee this data collection. A CDC project officer (GS-12) and science officer (GS-13) devote 20% of their FTE for an estimated cost of $35,000 per year for 2 years (for a total of $70,000).


Year | Budget
Year 1 | $35,000
Year 2 | $35,000
TOTAL | $70,000


The total project cost to the Federal Government is $306,766 (Year 1 and Year 2 contract costs of $236,766 plus Year 1 and Year 2 CDC labor of $70,000).


15. Explanation for Program Changes or Adjustments

This is a new data collection.


16. Plans for Tabulation and Publication and Project Time Schedule

A.16.A. Tabulation and Analysis Plan:

A multilevel time series analysis (Bolger & Laurenceau, 2013) will be used to analyze data. All participants who complete at least the Week 1 assessment will be included in each analysis, with missing data handled via full information maximum likelihood (FIML), following the intent to treat (ITT) principle. Effects will be measured for the core, rotating, content knowledge and content usefulness, and detailed assessment measures. More information on analyses is included below.


For the core measures, each parent-child dyad (Level-2 unit) has 18 repeated observations (Level-1 units) of each core measure and intervention status. The intervention effect on a core measure is estimated for each parent-child dyad — a Level-2 “random effect” — by regressing the core measure on intervention status at the immediately prior lag (i.e., 1 week prior). The overall intervention effect across all dyads is estimated by comparing the mean random effect (intervention→core measure) against zero. Intervention status will be modeled in three different ways, as (a) 1=intervention vs. 0=baseline, (b) 1=post-intervention vs. 0=baseline, and (c) 1=intervention/post-intervention vs. 0=baseline. This will allow us to assess (a) the concurrent impact of the intervention, (b) the short-term impact of the intervention upon its completion, and (c) the total (concurrent/short-term) impact of the intervention. An autoregressive process in the multilevel models’ residuals, reflecting “carryover effects” of the core measure from one week to the next, will also be modeled. Failure to account for autocorrelated residuals results in inaccurate estimates of intervention effects (Baek & Ferron, 2013). A similar process will be used to examine effects for the rotating, content knowledge and usefulness, and detailed assessment measures, except that there will be fewer data points, as these measures are repeated less often.
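
A simplified sketch of this estimation strategy appears below, using Python's statsmodels package. The file and variable names are placeholders, and the sketch fits only the random intervention slope per dyad; it omits the autoregressive residual structure described above, which the planned analysis will include using software with richer residual-covariance options. It is an approximation for orientation, not the analysis itself.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per dyad per week, with columns
# dyad_id, week, core_score, and intervention_lag1 (intervention status at
# the immediately prior week, coded per one of the three schemes above).
data = pd.read_csv("core_measures_long.csv")  # placeholder file name

# Random intercept and random intervention slope for each dyad (Level 2);
# the fixed effect of intervention_lag1 is the mean of the dyad-specific
# intervention effects, tested against zero.
model = smf.mixedlm(
    "core_score ~ intervention_lag1",
    data=data,
    groups=data["dyad_id"],
    re_formula="~intervention_lag1",
)
result = model.fit()
print(result.summary())  # fixed-effect coefficient = overall intervention effect

# Note: autocorrelated (carryover) residuals are not modeled in this sketch;
# the planned analysis accounts for them per Baek & Ferron (2013).
```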


A classic multiple baseline graphical approach will also be used for visualization and tracking of the effects of exposure to content. Graphs will be created across participants. The item average composite of each measure (e.g., child directed play, parental competence, etc.) will be computed for each participant at each assessment. These item average scores will then be averaged across all participants and plotted for each of the time points, in the form of a line graph. The graphs will include plots separated by condition (NN vs. GN). Because the measures within the Rotating Assessments are conducted intermittently, each participant will have a varying schedule for these measures. Within each condition, we will produce a single histogram per unit and block for Usefulness measures to capture variation between the modules, and a single histogram per block for Consumer Satisfaction and System Usability.
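
The sketch below illustrates how the across-participant line graphs could be produced with pandas and matplotlib. The input file and column names are placeholders, and the actual graphs will follow the multiple baseline conventions described above.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical long-format data with columns: participant_id, condition
# ("NN" or "GN"), week, and item-level columns for one measure (item1..itemK).
data = pd.read_csv("core_measures_long.csv")  # placeholder file name
item_cols = [c for c in data.columns if c.startswith("item")]

# Item-average composite per participant per assessment, then averaged
# across participants within each condition at each time point.
data["composite"] = data[item_cols].mean(axis=1)
group_means = (data.groupby(["condition", "week"])["composite"]
                   .mean()
                   .reset_index())

fig, ax = plt.subplots()
for condition, sub in group_means.groupby("condition"):
    ax.plot(sub["week"], sub["composite"], marker="o", label=condition)
ax.set_xlabel("Assessment week")
ax.set_ylabel("Mean item-average composite")
ax.legend(title="Condition (NN vs. GN)")
plt.show()
```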


Other analyses may also be conducted, depending on the outcomes observed. For example, we may use regression analyses to predict intervention response from child characteristics (e.g., externalizing behavior) and parental characteristics (e.g., depression, stress, demographics).
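
As one example of such an analysis, the sketch below regresses a hypothetical pre- to post-intervention change score on baseline child and parent characteristics; the file and variable names are placeholders, and the model is illustrative rather than pre-specified.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical wide-format data: one row per parent-child dyad with a
# pre-to-post change score on a core measure and baseline characteristics.
dyads = pd.read_csv("dyad_level_summary.csv")  # placeholder file name

# Ordinary least squares regression predicting intervention response from
# child externalizing behavior and parental depression, stress, and age.
model = smf.ols(
    "core_change ~ baseline_externalizing + parent_depression"
    " + parent_stress + parent_age",
    data=dyads,
)
print(model.fit().summary())
```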


A.16.B. Publications

The results of the evaluation will be reported in peer-reviewed journal articles, conference presentations, research briefs, and Web-based papers for dissemination to researchers, states, and the public.


Table A.16-1. Time Schedule

Activity

Time schedule

  • Recruitment of study participants

1 – 19 months after OMB approval (ongoing)

  • Participants complete study and measures

1 – 24 months after OMB approval (ongoing)

  • Data entry and cleaning

1 – 24 months after OMB approval (ongoing)

17. Reason(s) Display of OMB Expiration Date is Inappropriate

The display of the OMB expiration date is not inappropriate.


18. Exceptions to Certification for Paperwork Reduction Act Submissions

There are no exceptions to the certification.

References


Arnold, D. S., O’Leary, S. G., Wolff, L. S., & Acker, M. M. (1993). The Parenting Scale: A measure of dysfunctional parenting in discipline situations. Psychological Assessment, 5, 137-144.


Barkley, R. A., Edwards, G., Laneri, M., Fletcher, K., & Metevia, L. (2001). The efficacy of problem-solving training alone, behavior management training alone, and their combination for parent-adolescent conflict in teenagers with ADHD and ODD. Journal of Consulting and Clinical Psychology, 69, 926-941.


Baek, E. K., & Ferron, J. M. (2013). Multilevel models for multiple-baseline data: Modeling across participant variation in autocorrelation and residual variance. Behavior Research Methods, 45, 65-74.


Baydar, N., Reid, M. J., & Webster-Stratton, C. (2003). The role of mental health factors and program engagement in the effectiveness of a preventive parenting program for Head Start mothers. Child Development, 74, 1433-1453.


Boggs, S. R., Eyberg, S., & Reynolds, L. A. (1990). Concurrent validity of the Eyberg Child Behavior Inventory. Journal of Clinical Child Psychology, 19, 75-78.


Bolger, N., & Laurenceau, J-P. (2013). Intensive longitudinal methods: An introduction to diary and experience sampling research. New York: Guilford.


Bose, J., & West, J. (2002). Examining additional nonresponse bias introduced through attrition. In Proceedings of the Survey Research Methods Section, American Statistical Association (pp. 278-283).


Brick, J. M., & Williams, D. (2013). Explaining rising nonresponse rates in cross-sectional surveys. The ANNALS of the American Academy of Political and Social Science, 645(1), 36-59.


Carley-Baxter, L. R., Black, M., & Twiddy, S. (2007). The impact of incentives on survey participation and reports of intimate partner and sexual violence. Section on Survey Research Methods, 2954-2961.


Chacko, A., Wymbs, B. T., Chimiklis, A., Wymbs, F. A., & Pelham, W. E. (2012). Evaluating a comprehensive strategy to improve engagement to group-based behavioral parent training for high-risk families of children with ADHD. Journal of Abnormal Child Psychology, 40, 1351-1362.


Cohen, S., Kamarck, T., & Mermelstein, R. (1983). A global measure of perceived stress. Journal of Health and Social Behavior, 24(4), 385-396.


Cohen, S., & Williamson, G. (1988). Perceived stress in a probability sample of the United States. In S. Spacapan & S. Oskamp (Eds.), The social psychology of health: Claremont Symposium on applied social psychology (pp. 31-67). Newbury Park, CA: Sage.


Coleman, P. K., & Karraker, K. H. (2000). Parenting self-efficacy among mothers of school-age children: Conceptualization, measurement, and correlates. Family Relations, 49, 13-24.


Collins, R. L., Ellickson, P. L., Hays, R. D., & McCaffrey, D. F. (2000). Effects of incentive size and timing on response rates to a follow-up wave of a longitudinal mailed survey. Evaluation Review, 24(4), 347-363.


Curtin, R., Presser, S., & Singer, E. (2000). The effects of response rate changes on the index of consumer sentiment. Public Opinion Quarterly, 64(4), 413-428.


Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Mail and Internet surveys: The tailored design method, third edition. New York: John Wiley and Sons.


Dumas, J. E., & Albin, J. B. (1986). Parent training outcome: Does active parental involvement matter? Behaviour Research and Therapy, 24, 227-230.


Dumas, J. E., Begle, A. M., French, B., & Pearl, A. (2010). Effects of monetary incentives on engagement in the PACE parenting program. Journal of Clinical Child and Adolescent Psychology, 39(3), 302-313.


Dumas, J. E., Nissley-Tsiopinis, J., & Moreland, A. D. (2007). From intent to enrollment, attendance, and participation in preventive parenting groups. Journal of Child and Family Studies, 16, 1-26.


Fabiano, G. A. (2007). Father participation in behavioral parent training for ADHD: Review and recommendations for increasing inclusion and engagement. Journal of Family Psychology, 21, 683-693.


Goritz, A. S. (2006). Incentives in web studies: Methodological issues and a review. International Journal of Internet Science, 1(1), 58-70.

Graham, J. W., Taylor, B. J., & Cumsille, P. E. (2001). Planned missing-data designs in analysis of change. In L. M. Collins & A. G. Sayer (Eds.), New methods for the analysis of change (pp. 335-353). Washington, DC: American Psychological Association.

Guyll, M., Spoth, R., & Redmond, C. (2003). The effects of incentives and research requirements on participation rates for a community-based preventive intervention research study. Journal of Primary Prevention, 24, 25-41.


Heinrichs, N. (2006). The effects of two different incentives on recruitment rates of families into a prevention program. Journal of Primary Prevention, 27, 345-365.


James, T. L. (1997). Results of the wave I incentive experiment in the 1996 Survey of Income and Program Participation. Report to the U.S. Bureau of the Census, 834-839.


Johnston, C., & Mash, E. J. (1989). A measure of parenting satisfaction and efficacy. Journal of Clinical Child Psychology, 18, 167-175.


Kazdin, A. E. (1998). Research design in clinical psychology (3rd ed.). Boston: Allyn and Bacon.


Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M. & Shadish, W. R. (2010). Single-case designs technical documentation. Retrieved from http://ies.ed.gov/ncee/wwc/pdf/wwc_scd.pdf.


Lorber, M. F., O’Leary, S. G., & Slep, A. M. S. (2011). An initial evaluation of the role of emotion and impulsivity in explaining racial/ethnic differences in corporal punishment. Developmental Psychology, 47, 1744-1749.


Lorber, M. F., & Slep, A. M. S. (2014). Are persistent early onset child conduct problems predicted by the trajectories and initial levels of discipline practices? Unpublished manuscript.


Lorber, M. F., Xu, S., Slep, A. M., Bulling, L., & O’Leary, S. G. (2014). A new look at the psychometrics of the Parenting Scale through the lens of item response theory. Journal of Clinical Child and Adolescent Psychology, 43(4), 613-626.


Lovejoy, M. C., Weis, R., O'Hare, E., & Rubin, E. C. (1999). Development and initial validation of the Parent Behavior Inventory. Psychological Assessment, 11(4), 534-545.


Lundahl, B., Risser, H. J., & Lovejoy, M. C. (2006). A meta-analysis of parent training: Moderators and follow-up effects. Clinical Psychology Review, 26(1), 86-104.


MacKenzie, M. J., Nicklas, E., Brooks-Gunn, J., & Waldfogel, J. (2011). Who spanks infants and toddlers? Evidence from the Fragile Families and Child Well-being Study. Children and Youth Services Review, 33, 1364-1373.


Mahoney, A., Donnelly, W. O., Lewis, T., & Maynard, C. (2000). Mother and father self-reports of corporal punishment and severe physical aggression toward clinic-referred youth. Journal of Clinical Child Psychology, 29, 266-281.


McMahon, R. J., Bierman, K. L., Coie, J. D., Dodge, K. A., Greenberg, M. T., Lochman, J. E., & Pinderhughes, E. E. (1999). Initial impact of the Fast Track prevention trial for conduct problems: I. The high-risk sample. Journal of Consulting and Clinical Psychology, 67, 631-647.


Monroe, S. M. (2008). Modern approaches to conceptualizing and measuring human life stress. Annual Review of Clinical Psychology, 4, 33-52.


Nock, M. K., Michel, B. D., & Photos, V. (2007). Single-case research designs. In D. McKay (Ed.), Handbook of research methods in abnormal and clinical psychology (pp. 337–350). Thousand Oaks, CA: Sage Publications.


Ohan, J. L., Leung, D. W., & Johnston, C. (2000). The Parenting Sense of Competence scale: Evidence of a stable factor structure and validity. Canadian Journal of Behavioural Science, 32, 251.


Pilkonis, P. A., Choi, S. W., Reise, S. P., Stover, A. M., Riley, W. T., & Cella, D. (2011). Item banks for measuring emotional distress from the patient-reported outcomes measurement information system (PROMIS®): Depression, anxiety, and anger. Assessment, 18, 263-283.


Reyno, S. M., & McGrath, P. J. (2006). Predictors of parent training efficacy for child externalizing behavior problems—A meta-analytic review. Journal of Child Psychology and Psychiatry, 47, 99-111.


Sanders, M. R., Markie-Dadds, C., Tully, L. A., & Bor, W. (2000). The Triple P-Positive Parenting Program: A comparison of enhanced, standard, and self-directed behavioral family intervention for parents of children with early onset conduct problems. Journal of Consulting and Clinical Psychology, 68, 624-640.


Schuhmann, E. M., Foote, R. C., Eyberg, S. M., Boggs, S. R., & Algina, J. (1998). Efficacy of Parent-Child Interaction Therapy: Interim report of a randomized trial with short-term maintenance. Journal of Clinical Child Psychology, 27, 34-45.


Singer, E., & Couper, M. P. (2008). Do incentives exert undue influence on survey participation? Experimental evidence. Journal of Empirical Research on Human Research Ethics, 3(3), 49-56.


Slep, A. M. S., & O’Leary, S. G. (2007). Multivariate models of mothers’ and fathers’ aggression toward their children. Journal of Consulting and Clinical Psychology, 75, 739-751.


Snarr, J. D., Slep, A. M. S., & Grande, V. P. (2009). Validation of a new self-report measure of parental attributions. Psychological Assessment, 21, 390-401.


Spoth, R., Goldberg, C., & Redmond, C. (1999). Engaging families in longitudinal preventive research: Discrete time survival analysis of socioeconomic and social-emotional risk factors. Journal of Consulting and Clinical Psychology, 67, 157-163.


Stahlschmidt, M. J., Threlfall, J., Seay, K. D., Lewis, E. M., & Kohl, P. L. (2013). Recruiting fathers to parenting programs: Advice from dads and fatherhood program providers. Children and Youth Services Review, 35(10), 1734-1741.


Straus, M. A., Hamby, S. L., Finkelhor, D., Moore, D. W., & Runyan, D. (1998). Identification of child maltreatment with the Parent–Child Conflict Tactics Scales: Development and psychometric data for a national sample of American parents. Child Abuse and Neglect, 22, 249-270.


Strayhorn, J. M., & Weidman, C. S. (1988). A parent practices scale and its relation to parent and child mental health. Journal of the American Academy of Child & Adolescent Psychiatry, 27, 613-618.


Taylor, B. J., & Cumsille, P. E. (2001). Planned missing-data designs in analysis of change. In L. M. Collins & A. G. Sayer (Eds.), New methods for the analysis of change (pp. 335-353). Washington, DC: American Psychological Association.


Webster-Stratton, C., Kolpacoff, M., & Hollinsworth, T. (1988). Self-administered videotape therapy for families with conduct-problem children: Comparison with two cost-effective treatments and a control group. Journal of Consulting and Clinical Psychology, 56, 558.


Webster-Stratton, C., Reid, M. J., & Hammond, M. (2001). Preventing conduct problems, promoting social competence: A parent and teacher training partnership in Head Start. Journal of Clinical Child Psychology, 30, 283-302.


Webster-Stratton, C., & Taylor, T. (2001). Nipping early risk factors in the bud: Preventing substance abuse, delinquency, and violence in adolescence through interventions targeted at young children (0-8 years). Prevention Science, 2, 165-192.


Wilson, L. M., Fritz, P. A. T., & Lorber, M. F. (2014). The development of parent-child physical aggression in early childhood. Unpublished manuscript.




