
Case Studies of the Implementation of Kindergarten Entry Assessments


State Level Participants

OMB: 1875-0273


State: _______________  Date: _______________

Kindergarten Entry Assessment Case Studies

State Version Interview Protocol


Introduction and Purpose of Interview

Thank you for your willingness to participate in this interview. We are talking with state and district administrators, principals, kindergarten teachers, preschool administrators, and professional development staff in your state about the Kindergarten Entry Assessment, including the development and implementation of the assessment and how the data are used. We’d like to learn what is working well in addition to the challenges and how people are overcoming them. We believe information about your experiences will be valuable for other states and districts interested in implementing KEAs. Your experience also can help inform federal policy and help the U.S. Department of Education build technical assistance that is useful for all states tackling this important effort. A public report will be available at the end of the study to promote sharing of lessons learned but it will not identify any individual districts, schools, or people.


This interview will take approximately 90 minutes. We might ask questions that do not apply to your state, district, or schools. If so, please just let us know that this is the case.

Consent Process

Email the consent form to the respondent before the interview so he or she can read it, sign it, and send it back to you. Ask verbal permission to audio-record the conversation.


In our first set of questions, we are trying to learn details about the [selection and modification or development] of your kindergarten entry assessment. We understand that your state has [selected and modified XX measure OR developed its own measure XX]. We are interested in understanding the process you went through to [select, modify, or develop] [local KEA name], such as the factors you needed to consider, stakeholders that were involved, and how much time it took to reach a decision.


  1. What has been your role in terms of the selection and implementation of [the KEA measure]?

    a. Are you knowledgeable about the history of the [selection and modification] [or development] process for the kindergarten entry assessment in your state? (If not, skip to question 8.)


  2. Can you tell me a little about the history of kindergarten entry assessments in [state]? Let’s start with your timeline:


    a. When did the initiative to implement a kindergarten entry assessment begin?

    b. What prompted your state’s development or adoption of a kindergarten entry assessment?

    c. About how long overall did it take to reach a decision about the [selection, modification, or development] of [local KEA name]?

    d. Who were the key drivers in initiating a kindergarten entry assessment for your state (e.g., state legislature, governor’s office, Early Learning Council, DHHS or Head Start, the research community, foundations)?

    e. Is there a particular agency, office, or committee with oversight of the selection and implementation of the kindergarten entry assessment (e.g., Early Learning Council, governor’s office, Department of Education, Department of Health and Human Services)?

    f. Is conducting the kindergarten entry assessment mandatory? If yes, is it built into legislation? How did you decide whether to make the kindergarten entry assessment mandatory or optional?

    g. How did the state try to create buy-in among school administrators, teachers, parents, and the public for the kindergarten entry assessment?

    h. How does [local KEA name] fit in with other assessments your state is using (e.g., Comprehensive Assessment System for early learning in the RTT-ELC states)?

    i. How does [local KEA name] fit with other early learning initiatives or efforts your state has underway (e.g., QRIS, preservice or inservice PD, expansion of PreK or full-day kindergarten)?

    j. How did federal policies affect planning and decisions?

    k. How did the availability of federal funding affect planning and decisions?


  3. For what purposes was the kindergarten entry assessment selected or developed in your state?

Let them first answer the open-ended question.


In addition to the purposes you mentioned, are any of the following uses of the kindergarten entry assessment key purposes for your state? (Mark all that are confirmed):


  • Screening: To identify children who may need additional supports or further assessment

  • Data-informed instruction: To identify specific skills where students need improvement to inform teaching of individual students or groups of students, or curriculum planning

  • Family engagement: To inform parents about children’s strengths and needs and ways to support learning

  • Professional development: To inform what professional development is offered to teachers based on common student needs

  • Professional development: To help teachers understand where children should be on a comprehensive set of domains and how to assess children's progress

  • Benchmark: A baseline to benchmark learning in order to track progress throughout the year

  • EC system improvement: A snapshot to inform improvement of early learning programs/policies by understanding the needs of children

  • Longitudinal data tracking: To follow cohorts and subgroups of students over time to inform progress on closing achievement gaps

  • Alignment around school readiness: To promote dialogue between early childhood and elementary educators

  • Other (specify): __________________________________________________________



  4. Who was involved in the selection of your state’s kindergarten entry assessment measure and process?


In addition to the groups you mentioned, were any of the following groups engaged in the selection of [local KEA name] measure and process? (Mark all that are confirmed):


  • District representatives

  • Kindergarten teachers

  • First through third grade teachers

  • PreK teachers

  • Parents

  • Foundation representatives

  • Contractors/expert consultants, including researchers from higher education

  • Early childhood advocacy groups

  • Culturally and linguistically diverse groups

  • Teacher unions

  • Other (specify): __________________________________________________________


    a. What was the range of engagement of each of these stakeholder groups?

Probe: For example, some may have provided review and approval of materials, while others may have been active in decision making.

    b. Were any costs associated with including these stakeholders, such as stipends, fees, or release time? If so, what funding source supported these costs?


  5. Now we’d like to ask about the final decision: What were the primary reasons you chose to [use existing, modify existing, or develop a new measure]?

  • Probes for modification: How did you modify the [measure]? Why did you feel you needed to modify the [measure]? Were there particular aspects of the [measure(s)] that did not fit your population, purpose(s), or approach?

  • Probes for development: Why did you choose to develop a new measure?


    a. Did you have a set of criteria that the assessment needed to meet? If so, what were the criteria? How were they weighted? How were they used?

    b. What criteria and/or evidence were used to determine the validity and reliability of [local KEA name] for its intended purposes and populations (e.g., low to high performers)?

      i. Were there any purposes for which your state originally wanted to use its kindergarten entry assessment, but [local KEA name] was not valid for that purpose?

    c. Was the match to early learning standards considered? If so, how? Was the match to K-12 standards considered? If so, how?

    d. Was burden considered? If so, how?

    e. In what ways did particular aspects of the [measure] fit your population, purpose(s), or desired approach? What is the developmental range of the [measure]?

    f. How were the needs of English learners considered when selecting the assessment? What, if any, criteria and/or evidence were used to determine the validity and reliability of [local KEA name] for English learners?

    g. How were the needs of students with disabilities considered when selecting the assessment? What, if any, criteria and/or evidence were used to determine the validity and reliability of [local KEA name] to assess the skills of children with disabilities?

    h. How was the cost considered?

    i. Was a connection to a PreK and/or K–3 assessment considered? If so, how?

    j. What other measures for your kindergarten entry assessment did you consider? What factors led to them not being selected?


Is there a document that summarizes the selection process? If so, could you share a copy?


  6. What were the most important factors in supporting the selection or development of [local KEA name]?


  7. What were the biggest challenges with selecting or developing [local KEA name]?


    a. What solutions have you identified for addressing those challenges?


  8. Was [local KEA name] pilot- or field-tested? If yes:


    a. Can you briefly describe the pilot- or field-testing process?

Probe: How many people/locations participated in the testing? How was this sample created? Were English learners and students with disabilities included in the testing? What was the purpose of the pilot or field test? What did you measure? Who analyzed the data? How was feedback collected? How were results from the pilots and field tests shared?

    b. What did you learn from the results?

      i. What did you learn about reliability, validity, sensitivity of items, and differential item functioning (if applicable)?

    c. How were the results used to modify the content or procedures of your kindergarten entry assessment?


Is there a copy of the pilot and/or field test findings you could share with us?


  9. Are all the districts in the state using [local KEA name]?


    a. If not, who is opting to use the assessment?


Do you have a list of districts that are participating that you can provide to me?


    b. Are there any patterns to the participation (e.g., low-income districts, rural districts)?

    c. Why do you think districts that are not participating have opted out?

    d. What incentives are there to use the assessment?


    e. How many children were assessed in fall 2014? What percentage of the state’s kindergarten children does this represent? Probe: If you do not have this information now, could you email it to me?


  10. What have been the approximate overall and per-student costs to implement [local KEA name] so far?


    a. What were the approximate upfront costs of developing or selecting [local KEA name]?

    b. What are the approximate costs to implement and sustain [local KEA name] each year? (Please include training costs.)

    c. How is [local KEA name] funded?

    d. How will [local KEA name] be sustained in the future (beyond federal grant funding)?


Now we want to learn a bit more about [local KEA name] itself—the skills and abilities it assesses and what you learn from it.


  11. First, according to our review of documents, it appears [local KEA name] is assessing the following domains: [list them]. Please tell me if these are incorrect or if we are missing any domains. (Mark all that apply.)


  • Language and literacy development

  • Cognition and general knowledge, including early mathematics and early scientific development

  • Approaches toward learning

  • Physical well-being and motor development, including adaptive skills

  • Social and emotional development

  • English language development (for students whose home language is NOT English)

  • Other (specify): __________________________________________________________


  12. How does [local KEA name] align with the domains of your state’s early learning and/or kindergarten standards? Are there other domains that are not assessed?

  13. How does [local KEA name] fit in with existing statewide PreK and kindergarten assessments?


Is there a document that maps [local KEA name] to your early learning and/or K-12 standards? If so, could you share a copy?


  14. What components and/or measures make up your kindergarten entry assessment?


    a. For each domain, what types of evidence are collected as part of the assessment?

Probe: For example, are knowledge, skills, and abilities assessed through observation, checklists, direct assessments, and/or parent reports?


In the next series of questions, we want to learn a bit more about the nuts and bolts of administering [local KEA name].


  15. How is [local KEA name] administered? [Probe about each component.]


    a. Who administers or provides information about the student for [local KEA name]?

    b. How are families involved in providing information?

    c. To what degree is technology used to support the administration or scoring of [local KEA name]?


  16. What did the state communicate to parents, teachers, and administrators about the purpose of [local KEA name] and what it is trying to assess? How was this information communicated?


Can you share any relevant documents?


  17. What training, supports, or resources do teachers and administrators receive on [local KEA name]? Please describe the trainings. Probe: What training, supports, or resources do they receive on the administration of [local KEA name]? What training, supports, or resources do they receive for using the data from [local KEA name]? Do you receive the same training as the principals and teachers?

    a. So would you say your training and support model uses [read the following list]? (Mark all that apply.)


  • Workshops

  • Webinars

  • Coaching

  • Online modules (e.g., PPTs on websites)

  • Resource documents such as administration manuals

  • Other (specify): ________________________________________


    b. Is the training or support offered the same for all schools and districts statewide, or is it school or district specific? If different, how do trainings differ and who tailors them?

    c. Who provides the training, supports, or resources (the state, the district, a contractor)? If using a train-the-trainer model, who are the local trainers and who provides their training?

    d. What plans are there to continue training after the initial introduction (e.g., booster training, coaching, professional learning communities)?


  18. When districts administer [local KEA name], is it a standard process across all participating districts in the state? Or can districts or schools adapt their administration or implementation procedures? If so, what can vary and who can choose?

Probe: For example, can districts choose which measure(s) or components of measures (e.g., a particular domain) they administer? Can they sample children (versus universally assessing all children in their classroom)? Can they change administration procedures (e.g., opt in or out of parent reports, home observations)?


    a. If districts use different measures or processes to collect data, does the state collect information on the measures or processes that districts choose to use? Does the state compare results across districts using different measures or data collection processes to inform implementation?


Is there a document that summarizes the implementation guidelines that you can share with me?


  19. How do you and your districts monitor and evaluate whether [local KEA name] is being implemented according to the state's recommended guidelines?


    a. What roles do districts and schools play in the monitoring process?

    b. What issues have been uncovered? How have they been addressed?

    c. What does the state do if there are issues with implementation?

    d. In what ways does the state modify the kindergarten entry assessment based on information learned from monitoring or evaluating [local KEA name] implementation?


  20. Please describe the data submission process and requirements for districts.


    a. Who is involved in submitting the data?

    b. How do schools and districts submit data to the state?

    c. What role does technology play in the process?

    d. Are [local KEA name] results entered into a statewide longitudinal data system? If yes, what data are entered? If no, do they become part of another student record, such as the permanent academic record?


Finding one assessment that can accurately identify young children’s developmental status across multiple learning domains is challenging. Ensuring that the measure is also linguistically appropriate for children whose primary language is not English adds another layer of complication, as does the need to accurately assess the strengths and needs of young children with developmental delays or disabilities.


Within this context, we’d like to ask about your considerations of these particular populations in the selection and implementation of [local KEA name].


  21. Was [local KEA name] designed or selected with English learners in mind? If so, how?


    a. Does implementation of [local KEA name] differ for students who are English learners (e.g., children are assessed in their home language, use of nonverbal stimuli, observations conducted by bilingual staff or adults)? If so, how?

    b. What future plans are there to make [local KEA name] more appropriate for English learners?


  22. Was [local KEA name] designed or selected with children with developmental delays or disabilities in mind? If so, how?


    a. In what ways are those who administer the kindergarten entry assessment prepared/trained to use [local KEA name] with children with developmental delays or disabilities?

    b. What accommodations are in place for use with children with developmental delays or disabilities? For example, are children given augmentative or alternative communication or written systems, visual or sensory support, or assistive equipment or devices? Is input collected from special education staff?

    c. Are there particular disability categories for which [local KEA name] is not appropriate?

    d. Are alternate assessments used with these children? If so, please describe.

    e. In what ways are the results used for screening and referral or IEP purposes?

    f. In what ways are expectations or interpretations of the findings modified or adapted for children with developmental delays or disabilities?

    g. To what degree do you believe [local KEA name] accurately assesses the knowledge, skills, and abilities of children with developmental delays or disabilities?

    h. What future plans are there to make [local KEA name] more appropriate for children with developmental delays or disabilities?


  23. What have been the most important factors in supporting the implementation of the KEA?


  24. What have been the biggest challenges with implementing the KEA?

(Use probes below.)

  • Funding? If so, how?

  • Volunteerism/participation of districts? If so, how?

  • Administrator buy-in? If so, how?

  • Teacher buy-in or participation? If so, how?

  • The [local KEA name] instrument itself? If so, how?

  • Lack of training? If so, how?

  • Not enough or unclear administration directions? If so, how?

  • Political challenges? If so, how?

  • Timing challenges? If so, how?

  • Lack of shared vision among stakeholders about the purpose and use of [local KEA name], or communication about its purpose and use with districts and schools? If so, how?

  • Technology difficulties (e.g., website for scoring and data entry not ready or functioning correctly)? If so, how?

  • Other (specify): __________________________________________________________


    a. What solutions have you identified for addressing those challenges?


Now we’d like to switch gears and ask you a set of questions about how [local KEA name] results are shared and used to inform policy and practice. Let’s begin with the results of [local KEA name] and how people are informed of them.


  25. What information do you receive on [local KEA name] results?


    a. When and how often do you receive information?

    b. In what format is information shared (e.g., access to database, printed or online reports, parent conferences)?

    c. How is the information broken out (e.g., individual students or aggregated by student demographic characteristics, districts, schools, classrooms, or by preschool program)?

    d. What does the information tell you for each of the groups?

    e. What supporting information do you receive to interpret results and consider implications for learning supports at school and at home?


  26. Who else receives the results of [local KEA name]? (Mark all that apply.) Probe for any groups not mentioned; do not probe for the respondent’s own affiliation.


  • State Education Agency administration

  • State agencies that oversee preschool programs and services (e.g., DHHS and Head Start)

  • District administration

  • Principals

  • Kindergarten teachers

  • First through third grade teachers

  • Preschool educators

  • Parents (If so, are results translated into families’ languages?)

  • Policymakers/Legislators

  • Other (specify): ________________________________________


  27. What information do others receive about the results? Let’s take it one at a time and I will ask you some detailed questions about what the different roles receive.


    a. What is the schedule for sharing [local KEA name] results? In other words, when and how often does [each recipient checked above] receive information?

    b. In what format is information shared (e.g., access to database, printed or online reports, parent conferences)?

    c. How is the information broken out (e.g., individual students or aggregated by student demographic characteristics, districts, schools, classrooms, or by preschool program)?

    d. What supporting information does [each recipient checked above] receive to interpret results and consider implications for learning supports at school and at home?

    e. What challenges are there with sharing information with [each recipient checked above]?


Is there a report with aggregate results from 2014 available that you can share with me?

Do you also have sample reports that are tailored for district staff, principals, teachers, or parents you could share with me?


Now let’s move to how [local KEA name] results are used to inform policy and practice.


  28. How do state education and other state-level administrators (e.g., DHHS or Dept. of Early Learning) use [local KEA name] results to inform decisions about issues statewide or those concerning specific districts or preschool programs?

After getting a description, probe: Do you use results to make decisions to (mark all that apply):


  • Identify students in need of extra services or further assessment? If so, how?

  • Identify specific skills where students need improvement to inform teaching of individual students or groups of students or curriculum planning? If so, how?

  • Engage families in knowing children’s strengths and needs and ways to support learning at home? If so, how?

  • Promote dialogue between early childhood and elementary educators about ways to increase children’s readiness (e.g., align expectations and/or instruction)? If so, how?

  • Help teachers understand where children should be on a comprehensive set of domains and how to assess children's progress? If so, how?

  • Benchmark learning to track progress in kindergarten throughout the year? If so, how?

  • Follow cohorts and subgroups of students over time to inform progress on closing achievement gaps? If so, how?

Make decisions about [at state, district, school, or program levels]:

  • Budget allocations or targeting of investments? If so, how?

  • Curriculum? If so, how?

  • Professional development for PreK or K-12 teachers? If so, how?

  • Early childhood policies and practices? If so, how?

  • K-12 policies and practices? If so, how?

  • Other (specify): ________________________________________


    a. After thinking about how you and other state administrators use the data, what information from [local KEA name] do you think is most useful?

    b. What other information would have been useful?


  29. How do you expect districts to use results? For example, do you expect them to use results to make decisions about [read list]? (Mark all that apply):


  • Identify students in need of extra services or further assessment? If so, how?

  • Identify specific skills where students need improvement to inform teaching of individual students or groups of students or curriculum planning? If so, how?

  • Engage families in knowing children’s strengths and needs and ways to support learning at home? If so, how?

  • Promote dialogue between early childhood and elementary educators about ways to increase children’s readiness (e.g., align expectations and/or instruction)? If so, how?

  • Help teachers understand where children should be on a comprehensive set of domains and how to assess children's progress? If so, how?

  • Benchmark learning to track progress in kindergarten throughout the year? If so, how?

  • Follow cohorts and subgroups of students over time to inform progress on closing achievement gaps? If so, how?

Make decisions about [at district, school, or program levels]:

  • Budget allocations or targeting of investments? If so, how?

  • Curriculum? If so, how?

  • Professional development for PreK or K-12 teachers? If so, how?

  • Early childhood policies and practices (e.g., identify changes needed across PreK programs throughout the district)? If so, how?

  • K-12 policies and practices (e.g., district-wide planning for kindergarten)? If so, how?

  • Other (specify): ________________________________________


a. How do you support districts to use the results in this way? What training, supports, or resources do regional and district administrators receive to help them use [local KEA name] results?


  30. Describe any policies regarding how [local KEA name] data should not be used.
    Listen for, but do not say, the uses below; let the respondent name them. (Mark all that apply.)


  • Determine promotion or prevent kindergarten entry in any way

  • Principal evaluation, teacher evaluation

  • Evaluation of preschool programs

  • Screenings of children

  • Other (specify): __________________________________________________________


    a. How are these policies communicated to districts, schools, and teachers?

    b. How are these policies enforced? Please give me a specific example.


  31. What have been the most important factors in supporting use of [local KEA name] results to improve students’ educational experience?


  32. What have been the biggest challenges with using [local KEA name] results?

(Use probes below.)


  • Lack of training? If so, how?

  • Reports hard to interpret? If so, how?

  • Lack of alignment with standards? If so, how?

  • Timing of when results are shared? If so, how?

  • Lack of perceived reliability? If so, how?

  • Lack of buy-in? If so, how?

  • Other (specify): __________________________________________________________


    a. What solutions have you identified for addressing those challenges?


This is our final set of questions! In closing, we’d like you to reflect on how your state, districts, and schools administered and used kindergarten entry assessment data and try to offer lessons learned.


  33. Looking back over the [local KEA name] implementation, how well do you think it has gone? Has [local KEA name] served the purposes it was intended to serve? Probe: Refer back to the general purpose they say has been communicated.


    a. What concerns did you have going into implementation? Do these concerns still remain now that you have begun implementing [local KEA name]? Do you have new concerns now that you are implementing [local KEA name]?

    b. What parts of implementation have gone especially well?

    c. How could implementation be improved?


  34. Are there lessons learned or recommendations you can share with other states about kindergarten entry assessment selection, implementation, and use of kindergarten entry assessment results to inform policy and practice?


Thank you for your time and thoughtful responses.

