OMB comments and responses to comments


Evaluation of the Carol M. White Physical Education Program


OMB: 1875-0258



Comments on 201207-1875-003: Evaluation of the Carol M. White Physical Education Program (PEP)

 

OMB: This seems to be an improper submission. In May 2011, OMB cleared ED's request to conduct a grantee-survey-based evaluation of the PEP program (control number 1875-0258). The current submission is described as a revision under the same control number, but it reads like a new collection (related to the "old" one) that adds a case study component. Very few sections of the supporting statement discuss the surveys and the case studies together (the burden sections being an exception). Section A8's description of this as "the case study component" of the broader evaluation is the most explicit acknowledgment in the document, though that passage, on Federal Register postings, is also troubling for saying that notices on this "case study component" "will be published" (future tense). The statement includes several references to the "OMB-approved" surveys, and clearly wants to assume their continued clearance, even though much of this "revision" package drops mention of them.


ED: This is a revision to a previously approved ICR, requesting approval to conduct interviews with 75 individuals from five PEP projects. ED revised Supporting Statements A and B to make clear that the full ICR includes both the previously approved surveys and the case study interview protocols. ED changed the tense of the Federal Register notice discussion in A8 to past tense.

 


OMB: The burden calculations in this "revision" seem to be an odd fusion of the survey and case study components, combining the number of respondents to the surveys (77; all grantee project directors) with the number of hours associated with the case studies (also 77; an 80-minute commitment from 5 selected grantee project directors plus a 60-minute commitment from 70 "project and partner personnel" across the selected sites). If this is strictly a case-study-focused package, the discrepancy is minor: 77 hours but 75 respondents. But if it is actually intended to cover the burden of the whole evaluation, the table-reported numbers are off.


ED: The previously approved collection was cleared for 77 hours in each of three years. However, the surveys actually occur in only two of those years, so the annual burden for the previously approved collection should have been 51 hours (= 77 hours × 2 survey years ÷ 3 years). A13 clarifies the total burden for the entire study (230.5 hours), which annualizes to approximately 77 hours.


 

OMB: ED posted a 60-day notice in the FR on 7/3/12, under the same generic "Evaluation of the Carol M. White Physical Education Program" title as the ICR. As "promised" by Section A8, it focuses squarely on the topic areas of the case study work (formation of partnerships and use of body mass index measurements) and mentions, at the very end, that "ED will conduct five case studies." Yet that posting claims 25 respondents and 77 hours of burden, with no indication of where the 25 comes from.


ED: ED corrected this language in a Federal Register Notice on XXXX [talk with Kathy]. The corrected language is attached.  

 


In the interest of strengthening a case-study-centric submission:

OMB: The supporting statement provides very little justification for pursuing the case-study format. "Establishment of official partnerships and the collection and use of body mass index (BMI) measurements" are described as "two new program competitive preference priorities." What that means is unclear: how "new" is new, whether and how the changes are binding on the FY10 awardees used as the universe for this collection, and why those two topics are so new or different that they need special treatment rather than, say, additional questions on the Year 3 survey. Section B1 suggests that findings from the Year 1 Survey were used to determine the "purposive" sample of case study grantees by helping ensure that "grantees who proposed both priorities actually established the partnerships and implemented BMI data collection." If the topics are already covered by the Year 1 and Year 3 surveys, why the case studies?


ED: ED revised Supporting Statement A to make it clear why we propose a case study. Specifically, ED revised A2 and A16.



OMB: The supporting statement also contains no explicit recognition that, because the collection is based on a purposive sample of grantees and an unspecified sample of "project and partner personnel" within those chosen sites, the findings from the case study interviews are not generalizable to any particular population and cannot be taken as definitive or comprehensive. Section A4's statement that "data from the surveys and the interviews will be triangulated to capture more detailed information about PEP projects' partnerships and BMI data" is difficult to understand, but seems to hint at more being made of the interview data than is really justifiable.


ED: ED revised Supporting Statement A to make it clear that the case study findings will not be generalizable. Specifically, ED revised A2 and A16.



OMB: The package does not appear to attempt a definition of who the "project and partner personnel" are, and Part B lacks an explanation of how they will be identified and selected (or, at least, how up to 14 of them per site will be selected). The attached case study protocols include a lengthy list of the people who might be considered "project personnel" but only a vague reference to "key stakeholders at the community partners."


ED: ED revised Supporting Statement B to explain why the exact project and partner personnel cannot be identified in advance. Specifically, ED revised B1.

 


Interview protocol (applies in two places):

OMB: The interview protocol asks "What are your perceptions of the impact of these activities on the behaviors and/or health of the youth in your community?" of both the PEP Project Personnel and the Community Partners. The question appears on pp. 15 and 23 of both the CBO and LEA surveys.

We need a version that is not so open to subjectivity. I suggest rewording it to ask something like "What evidence do you have of the impact of these activities...?"


ED: As the "case study method inherently requires subjective and judgmental elements" and "case studies are always subjective and nongeneralizable" (GAO/PEMD-91-10.1.9), the study team and technical working group (TWG) experts took care to ensure that the wording of the questions reflects this subjectivity. The question under consideration emphasizes the individual's own personal "perceptions" of the impact of the PEP activities, and the follow-up protocol procedure ("If necessary, ask for specific examples/details: e.g., changes in teaching methods/pedagogical approach, changes in policies, changes in youth eating habits and food choices, increased activity in youth and/or families, increased teacher, youth, and/or parent knowledge of healthy eating and/or physical activities") was included to elicit specific details (i.e., what the respondents view as evidence) behind the individuals' responses. In addition, by wording the question in this manner, the respondent can indicate whether they believe there was a positive impact, a negative impact, both, or no impact at all. We believe that asking for "evidence," as opposed to personal perceptions, is leading and more vulnerable to an assumption of accuracy or objectivity, since such reported evidence cannot be confirmed under the current study design and remains subject to the respondents' own biases. As reflected in the revised OMB materials and the approved case study design, all case study data will be treated and reported as illustrative and not generalizable to the population of grantees.





