OMB Control Number: 0704-0506

SUPPORTING STATEMENT

How Differences in Pedagogical Methods Impact ChalleNGe Program Outcomes


A. JUSTIFICATION


1. Need for Information Collection


The National Guard Youth ChalleNGe Program (ChalleNGe) is a unique residential program for "at-risk" youth ages 16 to 18 who have dropped out of high school. Enrollment in the program is voluntary. Participants must be high school dropouts or expellees, unemployed, and drug free. Those on probation or parole, as well as those awaiting sentencing or indictment, are not eligible to enroll. The program is open to both men and women, though roughly 80 percent of participants are men. There are no income-based eligibility requirements; however, participants must be residents of the state in which the program they attend is located.


The ChalleNGe program is residential and 22 weeks in length. The environment is perhaps best described as "quasi-military"; participants (referred to as "cadets") form platoons, drill and march, and engage in intensive physical training. The program also includes classroom instruction on both academic topics and noncognitive "life skills," including financial management, drug avoidance, and health and sexual education. The academic portion of the program is designed to help cadets attain GED (General Educational Development) credentials. Participants also perform volunteer work in the communities where the programs are located. The ChalleNGe programs include an important mentoring component designed to last beyond the end of the residential phase: cadets are matched with volunteer mentors who assist them in meeting their post-ChalleNGe goals.


Congress first authorized the ChalleNGe program in FY93 as a pilot program and made it permanent in the FY98 National Defense Authorization Act. The program is operated jointly by the states and the state National Guard units, with federal funding covering a portion of the program's costs. For each state in which the program operates, the Governor and the National Guard Bureau enter into a cooperative agreement charging the State Adjutant General with administering the program in that state.


The ChalleNGe program has grown over time. In 1993, 10 states established ChalleNGe programs; today there are 34 programs in 29 states (plus Puerto Rico and the District of Columbia). While most programs prepare students to earn a GED, several award high school diplomas or alternate credentials to some or all graduates, either through agreements with a local high school or through designation as a high school in their own right. Overall, the ChalleNGe programs have graduated over 100,000 youth. For the 2010 program year, over 50 percent of graduates received either a GED or a high school diploma.


Programs publicize and recruit through advertising, by building relationships with a variety of adults who come into contact with young people, and through word of mouth. Each program typically receives more applications than it can accommodate in each class. In addition to being turned away for ineligibility, applicants may be denied admission because of space or funding limitations. However, programs do not impose test-score requirements for admission.


Teachers at ChalleNGe programs come from a wide variety of backgrounds; some are certified teachers, while others are not. Some teachers at ChalleNGe have substantial experience in traditional classrooms, some come from a background of working with disadvantaged youth, and some have taught primarily in GED preparatory programs or other adult education programs. The classroom setup varies across ChalleNGe programs as well. At some programs, teachers specialize in a single subject, while at others teachers work with the same group all day, covering all subjects. Most classrooms are single-sex, but some are not. Finally, many programs focus on GED preparation, while others focus on credit recovery and sending cadets back to their home high schools to graduate.


The purpose of this study is to examine how differences in the classroom affect program outcomes. The study involves administering an online survey to all classroom teachers at the 34 ChalleNGe programs. The survey asks about various pedagogical topics, including classroom management, curriculum development, and teaching strategies and methods, as well as the amount of time spent on specific topics within each subject.


In traditional classrooms, the influence of pedagogy on student outcomes is fairly well established, while the influence of a teacher's characteristics, education, and experience is far less clear.1 However, virtually no research ties the elements of classroom preparation to GED success, mainly because most GED test-takers undertake only minimal preparation, generally not in a classroom setting.2 While there is evidence that better GED test scores are tied to better labor market outcomes, we know very little about how various aspects of preparation affect young GED test-takers. This study offers the potential to learn about the link between preparation and GED performance.


This study also offers the possibility of learning about classroom methods that are especially effective with disadvantaged youth who perform below grade level. Past research shows that high school students performing below grade level are the most likely to experience "drill-and-kill" instruction in the classroom.3 Finally, we will be able to link classroom practices to longer-term outcomes, such as completion of the ChalleNGe program.


2. Use of the Information


The data collected will be used by the Office of the Assistant Secretary of Defense (Reserve Affairs) to evaluate how differences in pedagogical methods affect program outcomes. Research questions include whether there is a link between curriculum elements and test scores, whether there are differences in pedagogy and outcomes related to the focus of the program (GED preparation versus credit recovery), and whether classroom management strategies affect program outcomes. The data collected will be used only for the purposes of this study. The outcomes of the study will be used to advise the ChalleNGe sponsors and program directors on ways to improve the program. For example, based on the results of this study, ChalleNGe program sponsors and directors may decide to incorporate certain teaching methods with the goal of improving cadets’ GED pass rates.


3. Use of Information Technology


The survey will be administered online. Each ChalleNGe teacher has an email account and access to a computer, so conducting the survey online is feasible and is the most cost-effective method of gathering data from the 34 ChalleNGe sites. The ChalleNGe program will provide the researchers with the names and work email addresses of all teachers in the program. Each respondent will be issued a unique user ID and password to prevent unauthorized access to the survey. This information will be emailed to each respondent along with information on the study and a link to the study homepage, which will contain the consent information. Survey respondents will be asked to click boxes to indicate agreement with the consent statements and then to click a link indicating their agreement to proceed with the survey; this link will direct the respondent to the first survey question. As part of the follow-up plan, two reminder emails will be sent to non-respondents one week and two weeks after the original email.
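
For illustration only, the credentialing and reminder workflow described above could be scripted along the following lines. This is a minimal sketch in Python; the roster format, the email text, and the send_email helper are hypothetical and are not part of any actual survey platform.

    # Minimal sketch of the invitation/reminder workflow described above.
    # The send_email() callable and the roster format are hypothetical.
    import secrets

    def issue_credentials(teacher_emails):
        """Assign each respondent a unique user ID and password."""
        return {
            email: {"user_id": secrets.token_hex(4),
                    "password": secrets.token_urlsafe(10)}
            for email in teacher_emails
        }

    def send_invitations(credentials, homepage_url, send_email):
        """Email each teacher the study link plus unique credentials."""
        for email, cred in credentials.items():
            body = (
                "You are invited to complete the ChalleNGe teacher survey.\n"
                f"Study homepage (consent information): {homepage_url}\n"
                f"User ID: {cred['user_id']}  Password: {cred['password']}"
            )
            send_email(to=email, subject="ChalleNGe teacher survey", body=body)

    def send_reminders(credentials, completed_emails, send_email):
        """Follow-up plan: re-contact non-respondents (run at weeks 1 and 2)."""
        for email in credentials:
            if email not in completed_emails:
                send_email(to=email,
                           subject="Reminder: ChalleNGe teacher survey",
                           body="Please complete the teacher survey.")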


4. Nonduplication


The ChalleNGe program is run jointly by the states and the state National Guard units. The program is federally funded, and oversight is provided by the Office of the Assistant Secretary of Defense for Reserve Affairs, which has no forms or information collections that duplicate this one. Consultation with the various ChalleNGe program locations determined that the information sought is not currently available at the program locations either. There is no other way to collect this information.


5. Burden on Small Business


Collection of this information does not have a significant impact on small businesses.


6. Less Frequent Collection


This information collection has been developed in support of a one-time research effort. The study methodology calls for the survey to be given once to each respondent. Less frequent collection is not possible.


7. Paperwork Reduction Act Guidelines


There are no special circumstances that require this collection to be conducted in a manner inconsistent with the guidelines in 5 CFR 1320.5(d)(2).


8. Consultation and Public Comments


A notice was published in the Federal Register on June 5, 2012, soliciting comments on the information collection prior to submission to OMB (77 FR 33201). No comments were received.

9. Gifts or Payment


No payment or gift will be provided to the respondents.


10. Confidentiality


Participation in the data collection will be optional for respondents, and all survey responses will be anonymous. No personally identifiable information (PII) will be collected. To protect the data, all collected responses will reside on CNA's Scientific Computing Operations (SCO) system. SCO is a separate computing environment where all of the company's sensitive data are stored and processed. SCO allows CNA staff members secure access to statistical analysis and mathematical applications, in addition to other job-related information resources, for studies requiring the use of sensitive data. SCO users can access SCO systems only from thin clients in their offices, in commuter offices, or remotely using an SCO-supplied thin-client laptop. Access to all SCO computing resources is permitted on an as-needed, need-to-know basis only.


11. Sensitive Questions

There are no sensitive questions asked in this information collection.


12. Respondent Burden and its Labor Costs


a. Estimation of Respondent Burden


The survey was tested twice. First, it was tested on three teachers at the Fort Gordon ChalleNGe site, who took an average of 19 minutes to complete the survey. Second, it was tested on three teachers at the Washington State ChalleNGe site, who took an average of 18 minutes. Based on these tests, we expect each survey to take approximately 20 minutes to complete. At any point in time, there are approximately 190 ChalleNGe teachers. Based on a response rate of 80 percent,4 we expect 152 teachers to complete the survey, which equates to approximately 51 total burden hours.
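
As a worked check of this arithmetic (all figures taken directly from the paragraph above):

    # Worked check of the burden estimate above.
    teachers = 190
    response_rate = 0.80
    minutes_per_survey = 20

    respondents = round(teachers * response_rate)         # 152
    total_hours = respondents * minutes_per_survey / 60   # 50.67
    print(respondents, round(total_hours))                # 152 51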


b. Labor Cost of Respondent Burden


The salary of ChalleNGe teachers varies, but the average is approximately $40,000 per year, or about $20 per hour. At this rate, the labor cost of each 20-minute survey is approximately $6.67. With 152 teachers completing the survey, the total labor cost of respondent burden is $1,013.84.
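
The cost arithmetic, for reference: the stated $20 hourly rate implies roughly 2,000 work hours per year (an inference, not stated in the text), and rounding the per-survey cost to the cent before multiplying reproduces the reported total.

    # Worked check of the labor-cost estimate above.
    hourly_wage = 40_000 / 2_000                       # ~$20/hour (assumes ~2,000 work hours/year)
    cost_per_survey = round(hourly_wage * 20 / 60, 2)  # $6.67 for a 20-minute survey
    total_cost = round(cost_per_survey * 152, 2)       # 152 expected respondents
    print(cost_per_survey, total_cost)                 # 6.67 1013.84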


13. Respondent Costs Other Than Burden Hour Costs


Respondents will incur no other costs.


14. Cost to the Federal Government


This survey is part of a larger study funded by the Office of the Assistant Secretary of Defense for Reserve Affairs (Resources). Other than the funding for the study ($190,000), there will be no additional cost to the federal government associated with this survey.


15. Reason for Change in Burden


This is a new collection. There is no change in burden.


16. Publication of Results


We plan to publish the results of the survey as a CNA report, and possibly also in a peer-reviewed journal. We will publish only averages; no individual results will be published. We will field the survey during the middle weeks of each program's cycle beginning in 2014 (consultation with the programs suggests that the middle weeks are the best time to survey classroom teachers) and expect to have completed all teacher surveys by mid-2014. Based on this timeframe, we expect to publish the final report in late 2014.


17. Nondisplay of OMB Expiration Date


Approval to not display an expiration date is not being sought.


18. Exceptions to “Certification for Paperwork Reduction Submissions”


No exceptions to the certification statement are being sought.



B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS


See Supporting Statement Part B.

1 For discussions of classroom instruction and student outcomes, see (among many others) Christopher B. Swanson and David Lee Stevenson, "Standards-based Reform in Practice: Evidence on State Policy and Classroom Instruction from the NAEP State Assessments," Educational Evaluation and Policy Analysis 24, no. 1 (March 2002): 1-27; and James E. Tarr et al., "The Impact of Middle-Grades Mathematics Curricula and the Classroom Learning Environment on Student Achievement," Journal for Research in Mathematics Education 39, no. 3 (2008): 247-80. The first reference is particularly relevant because our surveys include some of the same measures of classroom pedagogy and practice as the NAEP surveys. For some of the disparate results on teacher characteristics and student outcomes, see Handbook of the Economics of Education, vol. 4, ed. Eric Hanushek, Stephen Machin, and Ludger Woessmann (Elsevier, 2011).

2 John Tyler, "The General Educational Development (GED) Credential: History, Current Research, and Directions for Policy and Practice," in Review of Adult Learning and Literacy, vol. 5, ed. J. Comings, B. Garner, and C. Smith (Mahwah, NJ: Lawrence Erlbaum Associates, 2005).

3 See, for example, John Ogbu, Black American Students in an Affluent Suburb: A Study of Academic Disengagement (Mahwah, NJ: Lawrence Erlbaum Associates, 2003).

4 We expect this high response rate based on several factors. First, the program is relatively small, and there is widespread support and enthusiasm for this project among the program directors and classroom teachers. Second, as described above, we plan to send two follow-up emails to non-respondents to remind them about the survey and encourage their participation. Lastly, when the survey was tested, respondents found it easy to complete, which we expect will encourage potential respondents to participate.


