
High School Longitudinal Study of 2009 (HSLS:09) First Follow-up Field Test 2011

ED Response to OMB Comments

OMB: 1850-0852


November 21, 2011

MEMORANDUM



To: Shelly Martinez, OMB

From: Laura LoGerfo, NCES

Through: Kashka Kubzdela, NCES

Re: Response to OMB Passback on HSLS:09 First Follow-up 2012 (OMB# 1850-0852 v.9)



Passback of October 21st, 2011:


  1. Please summarize for us the extent to which the content changes post field test and post cog lab have been in order to address identified problems during testing versus other types of changes.  We want to be sure that any “other” changes and any significant departures from what was tested are subjected to additional cognitive labs if appropriate. 


Part D of this submission indicates which items have been dropped, revised, or added. These edits were driven by data from the field test, the April 2011 cognitive interview report, and the June 2011 Technical Review Panel (TRP) meeting (as well as other interactions with the research and policy communities). Throughout this process the substantive themes have remained stable, but review identified numerous opportunities to sharpen the focus and efficiency of the questionnaires. The revisions concentrated on simplifying language and eliminating potential sources of confusion. We do not believe, however, that any of the added items need or would benefit from further cognitive interviews. A few examples from the student questionnaire illustrate how we borrowed from other surveys in which the strengths of these items have already been established.


For example, items S22OTHSCH and S2OTHSCHM from ELS:2002 were included at the request of the TRP to capture any enrollment at high schools other than the base-year school. S2MEFFORT, a school participation scale used successfully with 11th graders in studies by TRP member Jeremy Finn, was added at the TRP's recommendation. Item S2NOSCI, recommended by NCES longitudinal data users to determine why students were not taking a science course (there is a parallel question for math), is based on an item from the NELS:88 second follow-up. The TRP recommended dual/concurrent enrollment questions, and three such questions have been added from PEQIS (S2DUALWHERE, S2HSCREDIT, S2CLGCREDIT). Because these items have worked well on other surveys with similar populations, a separate program of cognitive research is not needed.


Please note that since our original submission, timing simulations conducted over the last month, as the survey was programmed into the telephone interview protocol, established the need to reduce the student questionnaire further in order to keep it within the stated respondent burden. The additional reductions to the student survey instrument are itemized at the end of this memo, and we are including an appropriately revised version of the student questionnaire (Appendix 2) and justification grid (Part D).


  2. Please provide a clearer explanation of the change and the reason for it in the out of school student incentive from field test to that proposed for full scale.


The incentive section of Part A (pp.13-17) has been revised to clarify the proposed incentive strategy. In short, in the field test, the incentive amount for out-of-school student cases was based on propensity modeling: $40 for the 53 lowest-propensity cases (less than 20% of the out-of-school cases) and $15 for all other cases.1 The response rate was 85% for students still at the base-year school. However, the field test showed that the most challenging students to recruit are those identified by school personnel as having left the base-year school. The response rate among out-of-school students in the field test was 29%.


Students no longer at the base-year school include transfer students, dropouts, early graduates, and those with unknown status. Given the analytic value of such cases combined with their low field test response, it is crucial to encourage participation for this group. The 85% response rate for those still at their base-year schools demonstrates that the field test approach was effective for them and that none of these cases needs to be identified as having “low propensity to respond.” Relying on the school-provided enrollment status to determine the incentive amounts for out-of-school student data collection simplifies the process while focusing efforts and resources appropriately: $40 for students no longer at the base-year school; $15 for other out-of-school student cases; plus an extra $10 for all students who complete the out-of-school assessment (please refer to Exhibit A-1 and accompanying text in Part A of this submission).
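To make the proposed rule concrete, the following is a minimal illustrative sketch in Python of how the full-scale out-of-school incentive amounts described above would be assigned; the function and argument names are hypothetical and are not drawn from the study's data collection systems.

def out_of_school_incentive(left_base_year_school, completed_assessment):
    """Illustrative only: proposed full-scale incentive amounts for
    out-of-school student cases, keyed to school-reported enrollment status."""
    # $40 if school personnel report the student has left the base-year school
    # (transfer, dropout, early graduate, or unknown status); $15 otherwise.
    amount = 40 if left_base_year_school else 15
    # Extra $10 for any student who also completes the out-of-school assessment.
    if completed_assessment:
        amount += 10
    return amount

# Example: a transfer student who completes both the survey and the assessment
# would receive $50; another out-of-school case completing only the survey
# would receive $15.
print(out_of_school_incentive(True, True))    # 50
print(out_of_school_incentive(False, False))  # 15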


  3. Please also provide a clearer explanation of the reason that a late prepaid incentive would be the best strategy. For example, what do we know about the quality of locating data versus just simple nonresponse characteristics of those cases?


The prepaid $5 incentive, combined with the abbreviated hardcopy survey, should encourage response from "difficult" parent cases (15 or more calls, initial refusals, and/or tracing dead ends) who have not responded during the first six months of data collection. Such an incentive is particularly important given the relatively low parent response rates in the base year (67%) and the first follow-up field test (54%). Only "difficult" parent cases for which there is presumed-good address information would receive the $5 prepaid incentive; any case for which all available address information has been deemed obsolete or invalid would not.


In the base year, 77% of the “difficult” parent cases in the incentive experiment were identified as exceeding the number-of-calls limit; 18% were refusal conversion cases; and the other 5% were telephone tracing dead-end cases. Adding the prepaid incentive to the last-month mailing of the hardcopy parent survey is designed to raise the participation level above that experienced in the base year, given the documented success of prepaid incentives on the B&B:08/12 field test. No additional incentive will be provided to hardcopy-only respondents. However, the $20 incentive will be offered to the “difficult” cases if the parent completes the full interview rather than the abbreviated hardcopy survey included in the mailing. The full interview would provide valuable information that cannot be obtained from other sources.


1st Passback of November 18th, 2011:


  1. This material is fine but please also submit an updated Part B that includes the details about the abbreviated survey, including precisely to whom this survey would be administered, how they are sampled, and how these data fit into the larger analysis plan.


Edits were made, including:

Edits in Part A:

  • Added the cost for the transfer administrator effort ($225,000) to the cost table

  • Modified the data collection schedule and dates for subsequent products to account for data collection through October 2012

Edits in Part B:

  • Described the transfer school administrator collection and its analytic value in the sampling section

  • Mentioned the transfer school administrator survey in the section on maximizing response rates


2nd Passback of November 18th, 2011:


  1. Please explain the basis for the 70% response rate from the transfer administrators sample.  Given the estimate, we'd like to see a brief explanation of the non-response bias analysis plan for the administrator survey in Part B.


The text of Part B has been revised accordingly on pages 2-3 and 10-11.


The estimated 70 percent response rate was based on a number of factors, including: past experience recruiting sampled schools for the study, in which 50 percent agreed to participate; the high level of cooperation with the administrator survey once a school agreed to participate (a 94 percent response rate among HSLS:09 school administrators in the base year); and the lower level of cooperation with the administrator survey from non-participating schools (a 61 percent response rate in the base year).


Because the administrator questionnaire request will be the first introduction to HSLS:09 for these transfer school administrators, some reluctance on their part is anticipated. In addition, the data collection period will be compressed relative to the time allocated to gaining cooperation from administrators at sampled schools. As a result, the participation level for transfer school administrators is not expected to be as high as that for administrators at sampled schools.
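As a rough illustration of how these figures could combine (a back-of-envelope sketch, not a calculation stated in Part B), weighting the base-year administrator response rates equally across participating and non-participating sampled schools gives

\[
0.50 \times 0.94 \;+\; 0.50 \times 0.61 \;\approx\; 0.78 .
\]

Discounting that roughly 78 percent figure for the factors just described (the questionnaire serving as the first contact with the study, and a compressed data collection period) yields the more conservative 70 percent planning assumption.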


In-school data collection produces high student response rates relative to other HSLS:09 interview modes such as telephone and Web. For this reason, permission to conduct in-school sessions will be sought from transfer school administrators when 4 or more sampled students now attend the transfer school. However, if an administrator declines participation, student responses will be collected through modes outside of the school setting, and the principal (or a designee) will still be asked to complete the administrator survey.


The approach for gaining cooperation includes a series of mailings and e-mails, with a single follow-up telephone call to prompt participation from a subset of cases deemed of particular importance. Particular attention, including the more intensive telephone prompting effort, will be given to transfer school administrators who represent multiple sampled student respondents (i.e., schools to which more than one student respondent has transferred), since the administrator data will be linked to each associated student at the given school. Additionally, we will monitor response rates and precision estimates during data collection, and adjust case prioritization if applicable, with the intent of achieving stable estimates and low bias.


A non-response bias analysis for the administrator questionnaire generally (not just among the transfer school administrators) will be conducted, using the methodology detailed in Part B, section 2.e, should the contextual response rate fall below 85 percent. The contextual response rate is defined as the (weighted) proportion of responding students linked to the questionnaire responses from their school administrator. By including administrator data from transfer schools as well as sampled schools, the HSLS:09 data will provide a more comprehensive representation of the school environment for the student cohort as of 2012. Weighting and imputation procedures will be investigated in the event that item non-response rates for key administrator questionnaire variables exceed 15 percent.
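For reference, the contextual response rate defined above can be written compactly as follows; the notation (student analysis weight $w_i$ and administrator-linkage indicator $a_i$) is illustrative and not taken verbatim from Part B:

\[
RR_{\text{contextual}} \;=\; \frac{\sum_{i \in R} w_i \, a_i}{\sum_{i \in R} w_i},
\qquad
a_i =
\begin{cases}
1 & \text{if responding student } i \text{ is linked to a completed administrator questionnaire,}\\
0 & \text{otherwise,}
\end{cases}
\]

where $R$ is the set of responding students. The non-response bias analysis described above would be triggered when $RR_{\text{contextual}} < 0.85$.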

The additional reductions to the student questionnaire referenced in the response to the October 21 passback are itemized below.

Dropped to reduce burden (the survey will collect all schools attended across years and eliminate year-by-year information):

S2ATT0910 (HSLS F1 FT Student): Did you attend high school during the 2009-2010 school year?

S2PRVS (HSLS F1 FT Student): During the last school year (2009-2010), what school did you attend?
*School Name
*City
*State

S2PRVSYN (HSLS F1 FT Student): During the last school year (2009-2010), were you attending [school name], attending [S2CURSCH1], attending another school, or were you homeschooled?
1=attending [school name]
2=attending [S2CURSCH1]
3=attending another school
4=homeschooled

S2INFLU1 (HSLS F1 FT Student): Who has had the most influence in your choice of high school courses?
1=Your family
2=Your friends
3=A teacher
4=A school counselor
5=A coach
6=Someone at your work
7=People you admire in music, sports, TV
8=No one in particular
9=Other

S2CRJBWHLV (HSLS F1 FT Student): In what month and year did you leave this job?

S2MONTHLK (HSLS F1 FT Student): Which of these months were you looking for work? {months unemployed were listed}


Dropped to reduce burden (the item was recommended by the TRP to measure engagement; TRP panelist Jeremy Finn used it on a questionnaire for 11th graders; it had been changed from subject-specific to general, with items already included in S2BEHAVIOR eliminated and response options revised to avoid vague quantifiers):

S2EFFORT (Finn Effortful Participation Scale): Over the last 6 months [you were in school], how often did you do these things?
*You paid attention to the teacher.
*You turned in your assignments and projects on time.
*When an assignment was very difficult, you stopped trying.
*You did as little work as possible; you just wanted to get by.
1=Never
2=Less than half of the time
3=Half of the time
4=More than half of the time
5=Always


Planned course-taking and test-taking items dropped to reduce burden:

S2APIB (HSLS F1 FT Student): Do you plan to enroll in any Advanced Placement (AP) courses in the future?
1=Yes
2=No
3=You haven't decided yet

S2APSUBJ2 (HSLS F1 FT Student): Have you taken, or do you plan to enroll in, any of the following courses?
*An Advanced Placement (AP) math course
*An Advanced Placement (AP) science course
*Another Advanced Placement (AP) course
*An International Baccalaureate (IB) math course
*An International Baccalaureate (IB) science course
*Another International Baccalaureate (IB) course

S2IBPLAN (HSLS F1 FT Student): Have you taken, or do you plan to enroll in, any of the following courses?
*An Advanced Placement (AP) math course
*An Advanced Placement (AP) science course
*Another Advanced Placement (AP) course
*An International Baccalaureate (IB) math course
*An International Baccalaureate (IB) science course
*Another International Baccalaureate (IB) course

S2IBSUBJ2 (HSLS F1 FT Student): Have you taken, or do you plan to enroll in, any of the following courses?
*An Advanced Placement (AP) math course
*An Advanced Placement (AP) science course
*Another Advanced Placement (AP) course
*An International Baccalaureate (IB) math course
*An International Baccalaureate (IB) science course
*Another International Baccalaureate (IB) course

S2TEST (HSLS F1 FT Student): Are you planning to take these tests in the future?
*PSAT
*SAT
*ACT
*Any Advanced Placement (AP) test
*Any International Baccalaureate (IB) test


Dual enrollment items dropped to reduce burden (they had been newly added for completeness; the first follow-up field test asked only about dual enrollment courses taken in the past):

S2DUALPLAN (HSLS F1 FT): While you are still in high school, do you plan to enroll in any courses for college credit other than AP and IB courses?
1=Yes
2=No
3=You haven't decided yet

S2DUALSUBJ2 (HSLS F1 FT): In which of the following subject areas do you plan to take these courses for college credit?
*Math?
*Science?
*Another subject?


ELS efficacy items dropped to reduce burden (they had been added on the TRP's recommendation):

S2EFFICACY (ELS BY): How often do these statements apply to you?
*When you sit yourself down to learn something really hard, you can learn it.
*If you decide to get all good grades, you can really do it.
*If you decide to get all answers correct, you can really do it.
*If you want to learn something well, you can.
1=Never
2=Less than half of the time
3=Half of the time
4=More than half of the time
5=Always


Eliminated half of these items to reduce burden (the items had originally been revised to specify which math/science teacher is being rated when a student is taking two math or science courses, rather than asking the student to choose a "main" teacher, and items taken from the Measures of Effective Teaching (MET) questionnaire had been added):

S2MTCHQ (HSLS F1 FT Student): How much do you agree or disagree with the following statements about your math teacher for this course? Remember, none of your teachers or your principal will see any of the answers you provide. Would you say your math teacher...
*values and listens to students' ideas.
*treats students with respect.
*thinks every student can be successful.
*thinks mistakes are okay as long as all students learn.
*treats some kids better than other kids.
*makes math interesting.
*treats males and females differently.
*makes math easy to understand.
*wants students to think, not just memorize things.
*doesn't let people give up when the work gets hard.
1=Strongly agree
2=Agree
3=Disagree
4=Strongly disagree

S2STCHQ (HSLS F1 FT Student): How much do you agree or disagree with the following statements about your science teacher for this course? Remember, none of your teachers or your principal will see any of the answers you provide. Would you say your science teacher...
*values and listens to students' ideas.
*treats students with respect.
*thinks every student can be successful.
*thinks mistakes are okay as long as all students learn.
*treats some kids better than other kids.
*makes science interesting.
*treats males and females differently.
*makes science easy to understand.
*wants students to think, not just memorize things.
*doesn't let people give up when the work gets hard.
1=Strongly agree
2=Agree
3=Disagree
4=Strongly disagree


To reduce burden, respondents will first be asked whether they have ever participated in any of these programs; specific programs will be asked about only if the response is affirmative (the item had originally been revised to capture any participation, not just current participation). A "don't know" option was added since some 11th graders will not be familiar with these programs:

S2PGRM (HSLS F1 FT Student): Have you ever participated in any of the following programs?
*Talent Search
*Upward Bound
*Gear Up
*AVID (Advancement Via Individual Determination)
*MESA (Mathematics, Engineering, Science Achievement)
1=Yes
2=No
3=You don't know what this is



1 There were too few out-of-school student cases in the field test to allow for experimentation. An additional $10 is offered to student-survey respondents who also complete the assessment outside of the school setting. The in-school student incentive is $10 for the interview and assessment combined.


