Instrument 2 CONSORT diagram template

Local Evaluations as part of the Personal Responsibility Education Program (PREP): Promising Youth Programs (PYP)



OMB: 0970-0504


Instructions

The information collected will be used for internal purposes to 1) assess the sample build-up and compare it to the target sample sizes on which power calculations were based, and 2) assess the likelihood that the final analytic sample for key follow-up time periods might have rates of overall or differential attrition that exceed the HHS evidence standard threshold.
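The counts recorded in the CONSORT diagrams below are what feed these attrition checks. As a minimal sketch of how overall and differential attrition might be computed from the randomized and analyzed counts (the function, variable names, and numbers are illustrative only and are not part of this instrument):

```python
# Illustrative attrition calculation from CONSORT counts.
# All inputs are placeholder values, not real study data.

def attrition_rates(randomized_t, analyzed_t, randomized_c, analyzed_c):
    """Return (overall, differential) attrition as fractions of the randomized sample."""
    attr_t = 1 - analyzed_t / randomized_t   # treatment-arm attrition
    attr_c = 1 - analyzed_c / randomized_c   # comparison-arm attrition
    overall = 1 - (analyzed_t + analyzed_c) / (randomized_t + randomized_c)
    differential = abs(attr_t - attr_c)      # gap between the two arms
    return overall, differential

overall, diff = attrition_rates(randomized_t=200, analyzed_t=160,
                                randomized_c=200, analyzed_c=170)
print(f"overall = {overall:.3f}, differential = {diff:.3f}")
# prints: overall = 0.175, differential = 0.050
```

The resulting pair of rates would then be compared against the applicable HHS evidence standard boundary for the study design.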

All CONSORT diagrams require a “time stamp.” The diagram should provide information about the number of clusters (or individuals) randomized and the number of clusters (or individuals) still participating (i.e., that have not dropped out) as of that point in time. Because participants may be at different stages of the study at any point in time due to rolling enrollment or multiple cohorts of implementation, not all participants will have had an opportunity to contribute data at all data collection points.

CONSORT diagrams that track clusters as the unit of assignment

For cluster random assignment designs (e.g., random assignment of clinics, community-based organizations, teachers, or schools) and quasi-experimental designs, the following pieces of information are also necessary as documentation of the pooled sample flow in cluster-level CONSORT diagrams:

  1. A brief paragraph (3-4 sentences) describing what makes a cluster eligible for the evaluation; the number of clusters screened and the screening criteria used; the number of clusters determined to be eligible and the counts and reasons for those screened out; and whether and how any clusters were prioritized for inclusion in the study sample.

  2. The number of clusters (randomly) assigned in total, to each condition (i.e., treatment and comparison), and the start and end dates of cluster random assignment.

  3. The number of clusters still participating (i.e., retained), by study condition, at each data collection time point. A cluster is defined as “participating” if at least one individual in the cluster completed the data collection effort.

    1. Grantees should also note any reasons for clusters dropping out, and the number of clusters to which each reason applies.

  4. In addition to tracking cluster flow, it is necessary to complete a CONSORT diagram for the youth in participating clusters.

CONSORT diagrams that track individuals

For both individual-level and cluster-level designs, the following pieces of information are also necessary as documentation of sample flow in individual-level CONSORT diagrams:

  1. A brief paragraph (3-4 sentences) describing what makes a youth eligible for the evaluation; the number of youth screened and determined to be eligible and the counts and reasons for those screened out; and the process for selecting the pool to be evaluated among those eligible.

  2. The number of youth (randomly) assigned in total, the number assigned to each condition, and the start and end dates of assignment.

  3. The number of youth that have provided data, by study condition, at each data collection time point (baseline and subsequent follow-ups).

    1. If the evaluation uses a cluster design, then the number of youth in each condition at any time point should reflect the number of youth only in participating clusters at that time point. Youth in clusters that have dropped entirely from the study should be excluded from these counts.

    2. At a given time point, a subset of individuals may not have had the opportunity to contribute data for a particular data collection event. For example, individuals who have not yet completed the program would not have had the opportunity to contribute follow-up data. Therefore, it is important to document the number of youth that are eligible (i.e., the number of youth that could have contributed data) at a given time point, in addition to the number of youth that actually did provide data.

    3. Grantees should also note any reasons for individual non-response or drop-out, and the number of youth to which each reason applies.

  4. The program start and end dates for the study period.


According to the Paperwork Reduction Act of 1995, an agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a valid OMB control number. The valid OMB control number for this collection is 0970-XXXX; this number is valid through XX/XX/XXXX.  Public reporting burden for this collection of information is estimated to average 60 minutes, including the time for reviewing instructions, gathering and maintaining the data needed, and reviewing the collection of information. This collection of information is voluntary for individuals, but the information is required from Grantees.










CONSORT Diagram for Clusters

*Complete based on pooled sample to date. Also complete the diagram for youth, using only retained clusters as the starting point.


Data included in report are current as of (Date) __________

Describe what makes a cluster eligible for the evaluation; the number of clusters screened and the screening criteria used; the number of clusters determined to be eligible and the counts and reasons for those screened out; and whether and how any clusters were prioritized for inclusion in the study sample.






Clusters Randomized (n = __)

Date(s) of Cluster Random Assignment ______

Not randomized:

  • Did not agree to be in study (n = __)

  • Did not pass screening criteria (n = __)

  • Other (n = __)


Treatment

Assigned to Treatment (n = __)

Completed baseline data collection (n = __)

List reason(s) for cluster(s) dropping out

  • ___ (n=__)

  • ___ (n=__)

Retained at immediate post (n = __)

List reason(s) for cluster(s) dropping out

  • ___ (n=__)

  • ___ (n=__)

Retained at first follow-up (n = __)

List reason(s) for cluster(s) dropping out

  • ___ (n=__)

  • ___ (n=__)

Retained at second follow-up (n = __)

List reason(s) for cluster(s) dropping out

  • ___ (n=__)

  • ___ (n=__)

Analyzed (n = __)

List reason(s) for cluster(s) being excluded

  • ___ (n=__)

  • ___ (n=__)


Comparison

Assigned to Comparison (n = __)

Completed baseline data collection (n = __)

List reason(s) for cluster(s) dropping out

  • ___ (n=__)

  • ___ (n=__)

Retained at immediate post (n = __)

List reason(s) for cluster(s) dropping out

  • ___ (n=__)

  • ___ (n=__)

Retained at first follow-up (n = __)

List reason(s) for cluster(s) dropping out

  • ___ (n=__)

  • ___ (n=__)

Retained at second follow-up (n = __)

List reason(s) for cluster(s) dropping out

  • ___ (n=__)

  • ___ (n=__)

Analyzed (n = __)

List reason(s) for cluster(s) being excluded

  • ___ (n=__)

  • ___ (n=__)


CONSORT Diagram for Youth

*Complete based on pooled sample to date.


Describe what makes a youth eligible for the evaluation; the number of youth screened and determined to be eligible and the counts and reasons for those screened out; and the process for selecting the pool to be evaluated among those eligible.



Data included in report are current as of (Date) __________




Randomized (n = __)

Date(s) of Random Assignment ______

Not randomized:

  • Did not agree to be in study (n = __)

  • Unable to contact family (n = __)

  • Did not pass screening criteria (n = __)

  • Other (n = __)


Program start date(s):

Program end date(s):


Treatment

Assigned to Treatment (n = __)

Completed baseline (n = __)

Date(s) of data collection:

List reasons for non-completes

  • ___ (n=__)

  • ___ (n=__)

Eligible for immediate post (n = __)

Completed immediate post (n = __)

Date(s) of data collection:

List reasons for non-completes

  • ___ (n=__)

  • ___ (n=__)

Eligible for first follow-up (n = __)

Completed first follow-up (n = __)

Date(s) of data collection:

List reasons for non-completes

  • ___ (n=__)

  • ___ (n=__)

Eligible for second follow-up (n = __)

Completed second follow-up (n = __)

Date(s) of data collection:

List reasons for non-completes

  • ___ (n=__)

  • ___ (n=__)

Analyzed (n = __)

List reason(s) for exclusion

  • ___ (n=__)

  • ___ (n=__)


Comparison

Assigned to Comparison (n = __)

Completed baseline (n = __)

Date(s) of data collection:

List reasons for non-completes

  • ___ (n=__)

  • ___ (n=__)

Eligible for immediate post (n = __)

Completed immediate post (n = __)

Date(s) of data collection:

List reasons for non-completes

  • ___ (n=__)

  • ___ (n=__)

Eligible for first follow-up (n = __)

Completed first follow-up (n = __)

Date(s) of data collection:

List reasons for non-completes

  • ___ (n=__)

  • ___ (n=__)

Eligible for second follow-up (n = __)

Completed second follow-up (n = __)

Date(s) of data collection:

List reasons for non-completes

  • ___ (n=__)

  • ___ (n=__)

Analyzed (n = __)

List reason(s) for exclusion

  • ___ (n=__)

  • ___ (n=__)


File type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
Author: jknab
File created: 2021-01-22
