
21st Century Community Learning Centers (21st CCLC): Early Childhood Best Practices Project

OMB: 1810-0707




PART B: COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS



B1. Respondent Universe and Sampling Methods


In Phase 1, there is no sampling plan for the site coordinator survey; all 8,900 site coordinators providing services to young children will be invited to participate. The project team will seek a minimum of 1,500 completed surveys to ensure a margin of error of approximately +/- 2 percent. The proposed Phase 1 data collection activities will enable an initial assessment of currently funded 21st CCLC centers serving young children across the United States. Site coordinators will be invited to complete a survey instrument covering the range of services offered by their programs, program facilities and structures, staffing, and staff supervision and training. The collected information is expected to support the study team as it compiles an overall score or measure of self-reported program quality and prepares profiles of the highest- and lowest-performing programs using the statistical procedure of cluster analysis.1

Using the profiles generated by the cluster analysis, the study team will identify smaller subsets of programs at each end of the quality spectrum; a total of 30 of these programs will be studied and evaluated in greater depth through site visits conducted by trained observers in Phase 2 of the study. While it would be possible to choose a random sample of 30 programs for further study, the research team assumes that a purposive sample will allow it to observe the greatest variation across program styles, curriculum and academic support, staffing, and health and safety measures. A statistical power analysis conducted by the study team demonstrated that a sample size of 30 programs would allow for adequate comparison among programs on a number of key quality indicators, including teacher-child ratios, group size, early childhood development and learning standards, and staff training and credentials.
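As a point of reference, the quoted precision can be reproduced with a standard margin-of-error calculation. The sketch below is illustrative only: the document does not state the confidence level or assumed proportion, so it assumes a 95 percent confidence level (z = 1.96), the conservative proportion p = 0.5, and a finite population correction based on the universe of 8,900 site coordinators and the target of 1,500 completed surveys.

import math

def margin_of_error(n_responses, population, p=0.5, z=1.96):
    # Margin of error for a proportion under simple random sampling,
    # with a finite population correction (assumed; not specified in the plan).
    se = math.sqrt(p * (1 - p) / n_responses)
    fpc = math.sqrt((population - n_responses) / (population - 1))
    return z * se * fpc

# Universe of 8,900 site coordinators; minimum of 1,500 completed surveys.
moe = margin_of_error(1500, 8900)
print(f"Margin of error: +/- {moe:.1%}")   # approximately +/- 2.3 percent

Under these assumptions the result is roughly +/- 2.3 percent, consistent with the figure cited in section B3; without the finite population correction it would be closer to +/- 2.5 percent.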


B2. Information Collection Procedures


a. Statistical Methodology and Stratification


The site coordinator survey will be sent to all program sites. Cluster analysis will be used to identify programs performing at the higher and lower ends of the quality spectrum, as discussed in section B1 above.


b. Estimation Procedures/Analysis Methods


Descriptive analyses of all survey items will be completed using a statistical package for the social sciences. Inferential analyses will include cluster analysis to sort programs into groups offering similar service configurations. These groups may differ based on the self-reported information provided by site coordinators about program styles, curriculum and academic support activities, staffing, and health and safety policies and procedures. The resulting clusters will be used to identify candidates for Phase 2 of the study, in which programs of potentially high and low quality will undergo site observations for further study.
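To illustrate the clustering step, the sketch below shows one way programs could be sorted into groups with similar self-reported service configurations. It is a minimal example only: the document does not specify the clustering algorithm, software, number of clusters, or variables, so the k-means method, the file name, and the indicator column names below are hypothetical.

import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical Phase 1 survey data, one row per program site.
survey = pd.read_csv("site_coordinator_survey.csv")
indicators = survey[[
    "teacher_child_ratio",   # staffing
    "group_size",            # program structure
    "curriculum_score",      # curriculum and academic support
    "staff_training_hours",  # staff supervision and training
    "health_safety_score",   # health and safety policies and procedures
]]

# Standardize the indicators so no single variable dominates the distance metric.
scaled = StandardScaler().fit_transform(indicators)

# Sort programs into a small number of groups (four is an arbitrary choice here).
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
survey["cluster"] = kmeans.fit_predict(scaled)

# Average indicator values by cluster, used to flag candidate high- and
# low-quality groups for the Phase 2 site visits.
print(survey.groupby("cluster")[indicators.columns].mean())

In practice, the cluster profiles would be reviewed against the quality indicators described in section B1 before any programs are nominated for site visits.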




c. Degree of Accuracy Needed


Response rates and attrition. We will do everything possible to ensure that data are collected from as many programs as possible. Specific actions to obtain the maximum response are described in item 2 of section B3 below. Nevertheless, there are likely to be some limits on our ability to collect data given the relatively short period of time available for data collection.


B3. Methods to Maximize Response Rates


An important challenge in conducting the Phase 1 Site Coordinator Survey will be obtaining a response rate high enough for the findings to be valid and reliable. To address this challenge, we will administer the survey as follows:


  1. Prior to contacting potential respondents, the U.S. Department of Education (ED) will send State Education Agency (SEA) liaisons a letter describing the study and its importance for the field. This letter will also include information on how to access and complete the survey.

  2. Using the contact information provided by ED, the Synergy Enterprises, Inc. (SEI) and Children’s Institute team will send all site coordinators an email with an explanation of the study, a link to the electronic survey, and a phone number to call with questions. Site coordinators will be asked to complete the survey online within two weeks but will be given the opportunity to complete it by mail by calling SEI to request a paper copy of the survey. Emails returned as undeliverable will be corrected and resent when possible. If a respondent has not completed the survey within one week, a reminder email will be sent, and reminders will continue weekly until the respondent completes the survey or the field period ends, whichever occurs first.


Using this approach, we anticipate a response rate sufficient to yield a low margin of error (approximately +/- 2.3 percent).


Centers and site coordinators chosen for the Phase 2 site visits will be contacted by telephone by their SEA liaison and by the study’s project director. The U.S. Department of Education will also host a webinar to discuss the study and answer any questions that site coordinators may have. Selected site coordinators will be sent a packet of detailed information via email and will be given the opportunity to accept or decline the invitation to participate. Site coordinators, along with the grantee program director, will be asked to support the project team’s efforts to maximize survey responses from parents and staff. If a site coordinator does not respond to the invitation within two calendar weeks, the project director or other designated staff members will follow up by telephone until the site coordinator has made a decision.


B4. Pretesting of Surveys


We have conducted limited pretesting of the items designed specifically for these surveys to ensure clarity, and we have administered the full surveys to nine respondents whose roles are similar to those of respondents in the full administration to ensure that the respondent burden does not exceed our estimates. This pretest confirmed that our burden estimates are conservative.


B5. Individuals Consulted on Statistical Aspects of Design


Contact Information: Synergy Enterprises

Roy Walker, M.B.A., Project Director

Synergy Enterprises Inc.

8757 Georgia Ave. Suite 1440

Silver Spring, MD 20910

[email protected]

(240) 485-1985


Sherri Lauver, Ph.D., Principal Investigator

Synergy Enterprises Inc.

8757 Georgia Ave. Suite 1440

Silver Spring, MD 20910

[email protected]

(585) 355-8506


Contact Information: Children’s Institute

Dirk Hightower, Ph.D., Subcontract Manager

Children’s Institute

274 N. Goodman Street

Rochester, NY 14607

(585) 295-1000

[email protected]



1 Cluster analysis is an exploratory data analysis tool used to sort cases (people, things, events, etc.) into groups, or clusters, so that the degree of association is strong between members of the same cluster and weak between members of different clusters. Each cluster thus describes, in terms of the data collected, the class to which its members belong; and this description may be abstracted through use from the particular to the general class or type. (Source: http://www.clustan.com/what_is_cluster_analysis.html)

