OME’s Responses to OMB’s Questions
Q1. The only submitted data collection instrument and narrative appear to be regarding collecting aggregate counts from State MEPs. Please clarify why there are no procedures or instruments for these entities to use in collecting data directly from students. How does ED plan to ensure consistency in methodology, consent, data quality, etc?
A1. States will already know the aggregate counts of binational students from the data collected through the annual survey; therefore, States do not need to administer this survey directly to students. To ensure the accuracy and reliability of the information collected, the persons responsible for collecting data will receive professional development that includes basic information and practical, hands-on experience with the survey. The content will cover definitions of terms, sampling procedures, completion of the survey instrument, and submission of survey results. Training may be delivered through traditional means or through any combination of online web seminars, phone conferencing, and PowerPoint presentations sent electronically to data collection staff.
Q2. Please provide an elaborated discussion of the coverage/quality characteristics of the various lists to be used in constructing a study frame. How will these lists be unduplicated? What is the expected coverage of the final list? On what basis?
A2. We expect States to respond by providing, in the cells of the tables, summarized counts of the children they identify as binational. The count a State provides is meant to give us a gross indicator of the number of binational children in that State. We recognize that there may be some duplication across States, but we do not currently have a system to correct for it. Each State has its own system for producing unduplicated counts of migrant children; therefore, we do expect the count of binational children within a State to be unduplicated.
Q3. The Supporting Statement does not clearly differentiate procedures and uses of data from year 1, year 2 and year 3. It appears that year 1 is a convenience sample-based pilot and therefore results are not generalizable but will be used to design year 2 and beyond. Please confirm this understanding and submit a clearer discussion of procedures and data tabulation and analysis plans for each year.
A3. That understanding is correct: year one is a convenience sample-based pilot, so its results will not be generalizable. After conducting the survey with just eight States in the first year, we anticipate being more knowledgeable and efficient when conducting the survey with 21 States in the subsequent two years. We do not expect the analysis plan to differ from year one to year three; although the content of the survey may be adjusted, a change in the analysis plan is highly unlikely.
Q4. Is cross-state tracking of individual students a component of this study? If not, how does the study inform MEPs over and above what the state programs can learn examining their own data without a federal role?
A4. While cross-state tracking is not a component of this study, ED’s role in conducting this survey is to document the needs of binational children nationwide.