National Weatherization Assistance Program Evaluation
Supplemental Information for the Office of Management and Budget
November 2009
Data Collection Instrument: S1 All States Program Information Survey
Purpose: Collect information to describe state-level weatherization program operations, leveraging activities, support for weatherization training, and quality assurance monitoring.
Sequence (among state instruments): 2
Sample Frame: 50 state WAP agencies + District of Columbia
Sampling: census
Anticipated Respondent: Initial contact with state weatherization director who may assign parts of the survey to staff.
Survey Mode: Currently planning a mixed mode approach whereby respondents have a choice of (1) completing an online form (preferred), (2) providing the data via telephone, (3) completing a formatted, electronic copy of the instrument, or (4) filling in a paper version of the instrument.
Communications: Prior to the survey, state weatherization directors will have received an introductory package consisting of a letter from ORNL that introduces the evaluation’s history, purpose, and implementing team; a list of the data collection requests we will be making; a description of the process we will follow; and the name and contact information for that state’s case manager on the evaluation team for any questions.
The request for completion of this instrument will be sent to the state weatherization director or his/her designee via e-mail with a simple message, such as:
Dear [name]:
This is the second data request for [insert state name] from the national WAP evaluation team. In this survey, we are asking for information about how your state weatherization program worked in PY 2008, what types of funds were received to leverage DOE weatherization funds, what types of weatherization training were required and supported, and how your state monitored the quality of weatherization work performed in your state. Please provide this information at [insert url] by [requested completion date – about 2 weeks after the e-mailing]. If you would prefer, we can also take the information via telephone or on a form we can send you in electronic or paper form.
Feel free to contact me with any questions you may have about this information request. Thank you in advance for your assistance with this part of the evaluation.
Sincerely,
[case manager]
[signature line with contact information]
Non-Respondent Follow-Up: Non-responding states will receive formal follow-up inquiries by telephone and/or e-mail from the evaluation team case manager assigned to that state. The first formal follow-up will occur three weeks after a state receives the survey. The state will then be contacted two more times, each after a two-week interval. If the state has not responded after the third contact, the project manager will contact the state to inquire about any difficulties the state might be having in completing the survey. If appropriate, the project team will dispatch a staff member to help the state compile materials to complete the survey.
Burden Estimate: ORNL estimated an average burden of 16 hours per responding state. This burden estimate was developed based on expert judgment. ORNL has over three decades of experience in working with states, local weatherization agencies, and utilities in collecting household weatherization and billing data. ORNL has worked especially closely with agencies and utilities over the years. ORNL understands how much effort a typical respondent needs, on average, to fill out this form.
Survey Testing: This survey was sent to several states for review and was also reviewed in draft form by an external review panel. Feedback from these two reviews indicated that the questions were appropriate. Because of the effort required by a state to fully complete this survey, the survey was not formally tested.
Response to Specific OMB Questions:
OMB Question |
Response |
Program Characterization Section |
|
i. Was question 2 used previously and/or tested? We find it confusing and are dubious that respondents will complete it accurately. |
The figure was deleted and a more straightforward question was added. |
ii. Q 10 and 11, a “d” is missing from “supposed.” |
This revision was made. |
iii. Q 11 the heading of “data format” should be something like “reporting frequency.” |
This revision was made. |
iv. Q12 and 26, what does “average” mean? Of what? Need better midpoint label. |
The word ‘average’ was replaced by the word ‘medium’ in all appropriate places. |
v. Q14 and Q20, what is the reference period? The program year? Ever? |
PY 2008 was added. |
vi. Q16, a “triple barreled” question. Tends to be cognitively burdensome and unreliable. Need to simplify and/or split. |
This question was improved. |
vii. Q16 and others – is it okay to admit that you don’t follow the department’s rules? Why wouldn’t there be concern about social desirability/lack of candor in this and other responses? |
States do not have to follow DOE rules when they weatherize homes in which no DOE dollars are invested. We are interested in what other rules the states may have to follow. |
viii. Q17b, 18 and others – does DoE know that states and agencies keep data in these formats, thereby enabling extracting data from records or is this going to be an estimate? If the latter, what testing demonstrates the reliability and consistency of these estimates? |
We believe that states have these data. |
ix. Q30, shouldn’t these categories be mutually exclusive? If not, what does multiple selection mean? Unclear. |
These categories follow the jargon of the field. We expect a state to choose one federal guideline and then maybe the state poverty level guideline. |
k. Program Operations |
|
i. Q7 and 7a, what does “flexible” mean in this regard? Define/clarify. |
We have added an explanation. |
ii. Q9-15, were these questions tested? How? Results? We are concerned about the ability to differentiate among some of these items as well as the reliability of “opinions.” It may be more reliable and less burdensome to ask respondents to rank the items in this list. |
These questions were reviewed by several states and by an external peer review panel. We believe that they are reasonable. |
l. Training |
|
i. Q1, what does “moderate” training mean? Minimally acceptable? Please define/consider alternative midpoint on scale. |
We now use the term ‘moderately well trained’ in all appropriate places. |
ii. Again, why change formats from mark one to fill in blank when not necessary? This is cognitively burdensome. Consider response options such as “Circle one number….1 2 3 4 5.” |
We added the circle-the-number option in all appropriate questions. |
m. Monitoring |
|
i. Q4, “relative” to what? Each other, agencies overall training budget? Define/clarify. |
We clarified these types of questions. |
ii. Q7, meaning of “average?” For the field? For the state? Objectivity? |
We replaced this term. |
National Weatherization Assistance Program Evaluation
Supplemental Information for the Office of Management and Budget
November 2009
Data Collection Instrument: S2 All Agencies Program Information Survey
Purpose: Collect information to describe agency-level weatherization program operations and leveraging activities.
Sequence (among local agency instruments): 2 for non-sampled agencies, 4 for sampled agencies
Sample Frame: 904 local WAP agencies
Sampling: census
Anticipated Respondent: Initial contact with the local agency weatherization director, who may assign the survey to the staff who work most closely with the agency’s WAP database.
Survey Mode: Currently planning a mixed mode approach whereby respondents have a choice of (1) completing an online form (preferred), (2) providing the data via telephone, (3) completing a formatted, electronic copy of the instrument, or (4) filling in a paper version of the instrument.
Communications: Prior to the survey, local agency weatherization directors will have received an introductory package consisting of a letter from ORNL that introduces the evaluation’s history, purpose, and implementing team; a list of the data collection requests we will be making; a description of the process we will follow; and the name and contact information for that agency’s case manager on the evaluation team for any questions.
The request for completion of this instrument will be sent to the local agency weatherization director or his/her designee via e-mail with a simple message, such as:
Dear [name]:
This is the first survey request for [insert agency name] from the national WAP evaluation team. In this survey, we are asking for information about how your agency weatherization program worked in PY 2008, and what types of funds were received to leverage DOE weatherization funds. Please provide this information at [insert url] by [requested completion date – about 2 weeks after the e-mailing]. If you would prefer, we can also take the information via telephone or on a form we can send you in electronic or paper form.
Please feel free to contact me with any questions you may have about this information request. Thank you in advance for your assistance with this part of the evaluation.
Sincerely,
[case manager]
[signature line with contact information]
Non-Respondent Follow-Up: Non-responding agencies will receive formal follow-up inquiries by telephone and/or e-mail from the evaluation team case manager assigned to that agency. The first formal follow-up will occur three weeks after an agency receives the survey. The agency will then be contacted two more times, each after a two-week interval. If the agency has not responded after the third contact, the project manager will contact the agency to inquire about any difficulties it might be having in completing the survey. If appropriate, the project team will dispatch a staff member to help the agency compile materials to complete the survey.
Burden Estimate: ORNL estimated an average burden of 8 hours per responding agency. This burden estimate was developed based on expert judgment. ORNL has over three decades of experience in working with states, local weatherization agencies, and utilities in collecting household weatherization and billing data. ORNL has worked especially closely with agencies and utilities over the years. ORNL understands how much effort a typical agency needs, on average, to fill out this form.
Survey Testing: This survey was sent to several agencies for review and was also reviewed in draft form by an external review panel. Feedback from these two reviews indicated that the questions were appropriate. Because of the effort required by an agency to fully complete this survey, the survey was not formally tested.
Response to Specific OMB Questions:
OMB Question |
Response |
|
The survey will be sent to the agency director, but we expect that agency staff will complete the survey. |
|
Included. |
|
Several agencies and an external review panel reviewed the survey. No one was asked to complete it in its entirety. |
|
We prefer agencies to complete the web version but will allow agencies to complete paper versions and will also conduct the survey over the phone as requested. |
|
This is a pre-programming draft. |
|
Explained above. |
|
The web version will allow respondents to save information input one day and continue to work on succeeding days. |
|
We know that the agencies keep records for ‘non-DOE’ weatherized homes. |
|
Local weatherization agencies are not required to follow DOE rules when weatherizing homes in which no DOE funds are invested. We wish to find out what kinds of rules guided non-DOE weatherization efforts. |
National Weatherization Assistance Program Evaluation
Supplemental Information for the Office of Management and Budget
November 2009
Data Collection Instrument: S3 Subset of Agencies Detailed Program Information Survey
Purpose: Collect more detailed agency-level information describing weatherization program operations, leveraging, weatherization staff and training, weatherization activities, and quality assurance.
Sequence (among state & local agency instruments): 5
Sample Frame: 400 local WAP agencies
Sampling: It was decided to implement a two-stage sampling approach, with the first stage encompassing the sampling of local weatherization agencies (also known as sub-grantees) and the second stage encompassing the sampling of units weatherized by these agencies (see DF 2/3 note). This approach follows the sampling plan developed by Tommy Wright and implemented for the first WAP evaluation two decades ago. It reduces the burden on the population of local weatherization agencies and the overall project costs for collecting weatherization data.
The sampling approach was designed to meet several criteria. First, the number of agencies to be included in the sample was balanced against the number of weatherized units needed to produce statistically defensible results for energy savings in weatherized units heated with natural gas and electricity. Fewer agencies in the sample would mean that more units from each sampled agency would need to be sampled; it would also mean less program diversity and climate representation. More agencies would mean that fewer units from each sampled agency would need to be sampled; with fewer units per agency, it would also reduce the ability to correlate agency characteristics with energy savings and other program outcomes. The balance chosen was to sample 400 agencies, with the provision that one-third of their weatherized units that heat with natural gas or electricity would be randomly chosen for inclusion in the unit sample.
Second, larger agencies needed to have a higher chance of being included in the sample than smaller agencies. This provision would help ensure that the one-third sample rate for units would yield the required number of units nationwide for analysis. This provision also helps ensure that the project will collect data from those agencies that are responsible for most weatherization done in the country. Third, along these lines, it was decided that the sample ought to include with certainty the two biggest local weatherization agencies. Fourth, it was decided that the sample of local weatherization agencies needed to include at least one from each state.
To produce a sample of local weatherization agencies, probability proportional to size (PPS) sampling was used. The size of the agencies was measured by their DOE WAP funding in 2008 (see original Supporting Statement, Part B for more details). The PPS procedure in SAS was used to draw the sample, given the criteria listed above.
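For illustration only, the sketch below (Python, with hypothetical agency names and funding amounts) shows one common way to carry out a PPS draw: any agency whose size is at least the sampling interval is taken with certainty, and the remainder are selected by systematic PPS along the cumulative funding totals. The actual sample was drawn with the PPS procedure in SAS, and the additional constraints described above (for example, at least one agency per state) are not implemented in this sketch.

```python
import random

def pps_sample(agencies, n):
    """Systematic PPS sample of n agencies from (agency_id, size) pairs,
    where size is the agency's PY 2008 DOE WAP funding.  Agencies whose
    size is at least the sampling interval are taken with certainty."""
    certainties, rest = [], list(agencies)
    while len(certainties) < n:
        interval = sum(size for _, size in rest) / (n - len(certainties))
        big = [a for a in rest if a[1] >= interval]
        if not big:
            break
        certainties.extend(big)                        # e.g., the largest agencies
        rest = [a for a in rest if a[1] < interval]

    remaining = n - len(certainties)
    if remaining == 0:
        return [agency_id for agency_id, _ in certainties]

    # Systematic PPS over the remaining agencies: random start, fixed skip.
    interval = sum(size for _, size in rest) / remaining
    threshold = random.uniform(0, interval)
    cumulative, selected = 0.0, []
    for agency_id, size in rest:
        cumulative += size
        if cumulative > threshold:
            selected.append(agency_id)
            threshold += interval
    return [agency_id for agency_id, _ in certainties] + selected

# Hypothetical example: ten agencies with PY 2008 DOE funding in dollars.
funding = [1_200_000, 950_000, 400_000, 380_000, 300_000,
           250_000, 220_000, 180_000, 150_000, 120_000]
agencies = [(f"agency_{i}", f) for i, f in enumerate(funding)]
print(pps_sample(agencies, 4))
```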
Anticipated Respondent: Local agency weatherization director and/or deputy director and other key and experienced staff.
Survey Mode: Currently planning a mixed mode approach whereby respondents have a choice of (1) completing an online form (preferred), (2) providing the data via telephone, (3) completing a formatted, electronic copy of the instrument, or (4) filling in a paper version of the instrument.
Communications: Prior to the survey, local agency weatherization directors will have received an introductory package consisting of a letter from ORNL that introduces the evaluation’s history, purpose, and implementing team; a list of the data collection requests we will be making; a description of the process we will follow; and the name and contact information for that agency’s case manager on the evaluation team for any questions.
The request for completion of this instrument will be sent to the local agency weatherization director or his/her designee via e-mail with a simple message, such as:
Dear [name]:
This is the second survey request for [insert agency name] from the national WAP evaluation team. In this survey, we are asking for additional information about how your agency weatherization program worked in PY 2008, and what types of funds were received to leverage DOE weatherization funds. We are also asking for information about other agency activities, for example, related to training and quality assurance. Please provide this information at [insert url] by [requested completion date – about 3 weeks after the e-mailing]. If you would prefer, we can also take the information via telephone or on a form we can send you in electronic or paper form.
Please feel free to contact me with any questions you may have about this information request. Thank you in advance for your assistance with this part of the evaluation.
Sincerely,
[case manager]
[signature line with contact information]
Non-Respondent Follow-Up: Non-responding agencies will receive formal follow-up inquiries by telephone and/or e-mail from the evaluation team case manager assigned to that agency. The first formal follow-up will occur three weeks after an agency receives the survey. The agency will then be contacted two more times, each after a two-week interval. If the agency has not responded after the third contact, the project manager will contact the agency to inquire about any difficulties it might be having in completing the survey. If appropriate, the project team will dispatch a staff member to help the agency compile materials to complete the survey.
Burden Estimate: ORNL estimated an average burden of 16 hours per responding agency. This burden estimate was developed based on expert judgment. ORNL has over three decades of experience in working with states, local weatherization agencies, and utilities in collecting household weatherization and billing data. ORNL has worked especially closely with agencies and utilities over the years. ORNL understands how much effort a typical agency needs, on average, to fill out this form.
Survey Testing: This survey was sent to several agencies for review and was also reviewed in draft form by an external review panel. Feedback from these two reviews indicated that the questions were appropriate. Because of the effort required by an agency to fully complete this survey, the survey was not formally tested.
Response to Specific OMB Questions:
OMB Question |
Response |
|
The survey will be sent to the agency director. We expect the director and/or other key staff to complete the survey. |
|
Included. |
|
Several agencies and an external review panel reviewed the survey. No one was asked to complete it in its entirety. |
|
We prefer agencies to complete the web version but will allow agencies to complete paper versions and will also conduct the survey over the phone as requested. |
|
This is a pre-programming draft. |
|
Explained above. |
|
The web version will allow respondents to save information input one day and continue to work on succeeding days. |
|
Done. |
|
This will be the second survey sent to agencies. It will be the fifth information request (please see timeline). |
|
These questions were reviewed by several states and by an external peer review panel. We believe that they are reasonable. |
ii. Q8: Again, why change formats from mark one to fill in the blank when not necessary? This is cognitively burdensome. Consider response options such as “Circle one number ….1 2 3 4 5.” |
i. A priority list is a list of weatherization measures prioritized by the expected cost-effectiveness of their energy savings. When a house is audited, the auditor uses the list to determine which measures should be installed in the unit. For example, if the home does not have ceiling insulation and ceiling insulation is at the top of the priority list, then that home will receive this measure. If the unit already has ceiling insulation, the auditor continues down the list to find the next measure not already present in the home. Priority lists are used in place of computer audits by many states. We are interested in comparing experiences with priority lists versus computer audits. ii. We added the circle-the-number option in all appropriate questions. |
|
We know that the agencies keep accounting records that can provide these types of data. |
National Weatherization Assistance Program Evaluation
Supplemental Information for the Office of Management and Budget
October 2009
prepared by Ingo Bensch, Energy Center of Wisconsin
Data Collection Instrument: DF1 All States Agencies Information Data Form
Purpose: Collect basic information about each state’s local agencies for use in sampling the 400 agencies to be studied in more detail, as well as special studies such as the bulk fuel studies (fuel oil and propane) and the case studies of high-performing and innovative agencies.
Sequence (among state & local agency instruments): 1
Sample Frame: 50 state WAP agencies + District of Columbia
Sampling: census
Anticipated Respondent: Initial contact with the state weatherization director, who may assign the form to the two staff people who work most closely with (1) the agency’s WAP database and (2) local agencies.
Survey Mode: Currently planning a mixed mode approach whereby respondents have a choice of (1) completing an online form (preferred), (2) providing the data via telephone, (3) completing a formatted, electronic copy of the instrument, or (4) filling in a paper version of the instrument.
Communications: Prior to the survey, state weatherization directors will have received an introductory package consisting of a letter from ORNL that introduces the evaluation’s history, purpose, and implementing team; a list of the data collection requests we will be making; a description of the process we will follow; and the name and contact information for that state’s case manager on the evaluation team for any questions. (Our earlier draft of this package would need to be updated once OMB clearance is obtained. At OMB’s option, we could provide that earlier draft or rewrite and provide a new version now.)
The request for completion of this instrument will be sent to the state weatherization director or his/her designee via e-mail with a simple message, such as:
Dear [name]:
This is the first data request for [insert state name] from the national WAP evaluation team. In this request, we are asking for a list of all program year 2008 subgrantees in your state, their DOE funding amounts, and a few additional characteristics to help us know who is active in your state and assist with sampling agencies for subsequent parts of the evaluation. Please provide this information at [insert url] by [requested completion date – about 2 weeks after the e-mailing]. If you would prefer, we can also take the information via telephone or on a form we can send you in electronic or paper form.
Please feel free to contact me with any questions you may have about this information request. Thank you in advance for your assistance with this part of the evaluation.
Sincerely,
[case manager]
[signature line with contact information]
Non-Respondent Follow-Up: Non-responding states will receive follow-up inquiries by telephone and/or e-mail from the evaluation team case manager assigned to that state. The case manager will determine the appropriate number, timing, and type of follow-ups based on prior communications.
A state that is simply slow to respond to a data request will receive friendly reminders from the case manager approximately a week after the initial deadline with an inquiry concerning the agency’s expected timeline. Thereafter, the case manager would continue to follow up approximately a week after any agency-specified timelines have passed. If needed, a higher-level member of the evaluation team would contact the weatherization director to inquire what the evaluation team could do to help the state provide the data in the most convenient manner possible.
On the other hand, a state that is not responding at all may receive three inquiries over the course of a few weeks from the case manager before a higher-level member of the evaluation team calls the state weatherization director to inquire about any reasons for non-response, to overcome objections or barriers, and to obtain cooperation.
Burden Estimate: ORNL estimated an average burden of four hours per responding agency. This estimate was based on an item-by-item review of the questions in the instruments by two ORNL staff to determine which questions could be answered “top of mind” by responding agencies and which would require the retrieval of information from files or databases. For each question, ORNL staff developed an estimate of the amount of time required by an agency whose record-keeping system required an above-average amount of time to assemble the requested information or whose number of local agencies was above average. The intent was to develop a conservative estimate. The average burden per agency reflects the sum of the estimated burden for each question.
If needed, the burden estimate could be verified (and revised) through a pretest of states with varied numbers of local agencies and different recordkeeping systems.
Response to Specific OMB Questions:
OMB Question |
Response |
Q2: What does substantial mean? |
Revised to request agencies that were expected to perform 50 or more projects in any of the dwelling types (or the 10 agencies with the highest caseload in a dwelling type if there are many states that exceed 50 projects). |
Q3: How does this question fit into the sampling plan? |
Used to define the sample frame for the case studies of high-performing and innovative agencies. |
National Weatherization Assistance Program Evaluation
Supplemental Information for the Office of Management and Budget
October 2009
Data Collection Instrument: DF2 Housing Unit Information Survey
DF3 Building Information Survey
Purpose: Collect details about weatherization activities in a sample of weatherized units and buildings to characterize the dwellings treated and to conduct analyses of measure effectiveness and cost-effectiveness. (The housing unit and building information surveys will be used as part of field studies as well. We will discuss purpose, sampling, and approach for those efforts separately.)
Sequence (among state & local agency instruments): 3 (concurrent with several other instruments). Previously, we will have implemented DF1 (the All States Agencies Information Form) and DF4 (Electric & Natural Gas Bills Information from Agencies). This survey will be implemented concurrently with the All States Program Information Survey, the All Agencies Program Information Survey, and the Subset of Agencies Detailed Program Information Survey. However, we will work with the sampled agencies to establish a sequence for the instruments requested of them and a timeline that is feasible for each sampled agency.
Sample Frame: DOE weatherization jobs conducted by the subset of 400 local agencies during Program Years 2007-2009. (A “weatherization job” can refer to the weatherization of a single-family or mobile home, a single unit in a multi-family structure, or an entire multi-family building.)
Sampling: The evaluation team will sample weatherization jobs after implementing the first part of DF4 (Electric & Natural Gas Bills Information from Agencies). Sampling approaches vary based on:
the primary heating fuel of the units and buildings in the sample frame [We will sample housing units and buildings at a higher rate if they are heated electrically or with natural gas because those fuels will be included in the billing analysis.]
the record keeping practices of the agencies [For any agencies that keep the data we are requesting electronically or can otherwise furnish them easily, we will request data on ALL electrically and natural gas-heated units and buildings (rather than sampling them).]
the size of the utilities serving the electrically and natural gas-heated units and buildings in our sample frame [We will sample small utilities in an effort to cluster billing data requests for utilities that would otherwise receive requests for very small numbers of customers. See below for more detail.]
Electrically and Natural Gas-Heated Units and Buildings: We will collect housing unit/building information for at least one third of jobs performed in electrically and natural gas-heated units and buildings by the subset of agencies during Program Years 2007 through 2009. For agencies that keep those lists and project details electronically (or in an otherwise easy-to-provide format), we will request housing unit and building information forms for all weatherization projects (units and buildings) within the sample frame. For agencies that do not keep these lists electronically or in a format that makes it easy for them to provide data for all units within the sample frame, we will sample one-third of projects, but a minimum of seven per agency.
Units and Buildings Heated Primarily with Other Fuels: To allow characterization of a full range of weatherized dwellings, we will also sample a quarter of weatherization projects performed by the subset of agencies during Program Years 2007 through 2009 in units and buildings that are heated primarily with fuels other than electricity and natural gas.
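The per-agency sampling rules just described can be summarized in a short sketch (Python; the agency job counts and the record-keeping flag are hypothetical inputs, and the actual draw will be made from the job lists collected through DF4):

```python
import math
import random

def sample_agency_jobs(gas_electric_jobs, other_fuel_jobs, keeps_electronic_records):
    """Apply the unit-sampling rules for one local agency.

    gas_electric_jobs / other_fuel_jobs are lists of PY 2007-2009 job IDs
    from the agency's job list (collected via DF4); keeps_electronic_records
    indicates whether the agency can furnish data for all jobs easily."""
    if keeps_electronic_records:
        # Request DF2/DF3 for all electrically and natural gas-heated jobs.
        gas_electric_sample = list(gas_electric_jobs)
    else:
        # One-third sample, with a floor of seven jobs per agency.
        n = max(7, math.ceil(len(gas_electric_jobs) / 3))
        n = min(n, len(gas_electric_jobs))
        gas_electric_sample = random.sample(gas_electric_jobs, n)

    # One-quarter sample of jobs heated primarily with other fuels.
    m = math.ceil(len(other_fuel_jobs) / 4)
    other_fuel_sample = random.sample(other_fuel_jobs, m)
    return gas_electric_sample, other_fuel_sample

# Hypothetical agency with 120 gas/electric jobs and 40 other-fuel jobs.
ge_jobs = [f"job_{i}" for i in range(120)]
of_jobs = [f"job_{i}" for i in range(120, 160)]
ge_sample, of_sample = sample_agency_jobs(ge_jobs, of_jobs, keeps_electronic_records=False)
print(len(ge_sample), len(of_sample))   # 40 gas/electric jobs, 10 other-fuel jobs
```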
Anticipated Respondent: Initial contact with the sampled agency will be with the agency contact person listed by state agencies in their quarterly reports to the Department of Energy through the WINSAGA system. We expect that most of these contacts are agency directors or weatherization directors for the local agencies who will delegate our information request to one or more administrative or IT staff (depending on how the agency stores project data) with some involvement by auditors and crew chiefs for sampled projects. We anticipate that someone – either the contact person or an administrative staff person – will oversee the gathering of the data internally.
Survey Mode: Currently planning a mixed mode approach whereby respondents have a choice of the following:
(1) completing an online form,
(2) providing the data via telephone,
(3) completing a formatted, electronic copy of the instrument (such as a template we create in Excel),
(4) filling in a paper version of the instrument.
Communications: Prior to this data collection effort, we will have implemented several components of a communications plan that addresses outreach to intended respondents in a holistic fashion. Components of this plan include:
sending an introductory package of information about the evaluation projects to state agencies;
making informal contact with state weatherization directors and encouraging them to let their network of local agencies know about the impending evaluation and requesting their timely cooperation;
sending an introductory letter to local agency contacts reported to DOE by the states;
posting an updated Q&A page about the project on the Weatherization Assistance Program Technical Assistance Center (WAPTAC) web page; and
(depending on timing of the data collection) reaching out to various associations that serve local agencies to inform them about the impending evaluation, with the goals of creating goodwill within their networks, developing a broad understanding of how this evaluation provides useful information for the weatherization community, and having the associations encourage cooperation by their members when contacted by the evaluation team.
[We will provide a copy of this plan with relevant materials to OMB shortly, so the communications for any one data collection effort can be seen in context of the broader outreach efforts.]
In addition, we will already have requested a list of weatherization jobs and some detail about the sampled jobs (see DF4) from the 400 sampled local agencies. As a result, the request for housing unit and building information forms will follow multiple communications with the point person handling our data requests at each sampled local agency. In other words, we will already have established a relationship with our local agency contact person.
Depending on the nature of the communications with the agency contact person up to that point and his or her preferred means of communicating with us, our request for completion of the housing unit and building information survey may be via e-mail, telephone, or a more formal letter if the agency has been generally unresponsive. Please see below for the template we would use for any requests via e-mail.
Dear [local agency contact name]:
We appreciate your help with the list of weatherization jobs that [agency name] completed in program years 2007 through 2009. As noted earlier, we have sampled the jobs completed by your agency and 399 other weatherization providers from around the country for more detailed analysis, including a study of energy savings before and after weatherization.
For this analysis, we will need detailed information about measures completed, diagnostic measurements, and funding used for each of these jobs. The attached Excel workbook lists the applicable jobs for your agency and identifies the data we will need. Depending on how you store these data, it might be easiest for you to complete and return the attached workbook or enter the data into an online form at [include link and username/password here]. We can discuss other possibilities as well.
I will call you to discuss this data request in the next few days. I can explain the data transfer options we have prepared, discuss the difference between the housing unit and building information surveys, and answer any questions you might have when we talk.
Ideally, we would like the data by [date that is about 3 weeks in the future] to keep the national WAP evaluation on track. Please let me know when we talk whether you foresee any difficulties in providing this data by then.
Thank you in advance for your assistance with this part of the evaluation.
Sincerely,
State & Local Agency Outreach Specialist
[signature line with contact information]
Non-Respondent Follow-Up: Local agencies will be asked for DF2 and DF3 only after they have completed DF4, so non-respondents will have dropped out of the process at this point already. However, it is possible that some agencies will refuse to provide the requested data or be slow to do so.
Sampled agencies that refuse to provide the information will receive a call from the state & local agency liaison manager in charge of agency data collection at the Energy Center of Wisconsin to inquire about the reason for the refusal and to seek to overcome any obstacles or objections. At the manager’s discretion, the Energy Center may refer any additional refusing agencies to the ORNL project manager overseeing the project for an additional effort to overcome agency concerns.
Sampled agencies that are slow at providing the requested data will receive periodic follow ups from the evaluation team’s state & local agency outreach specialist assigned to that agency (i.e., their case manager). We will begin with reminder calls or e-mails approximately a week after the initial agreed-upon deadline. After approximately three follow-up contacts, the case manager would refer the agency to a supervisor or the manager in charge of agency data collection for additional follow up.
Burden Estimate: ORNL estimated an average burden of 40 hours per responding agency for the housing unit information survey and 8 hours per responding agency for the building information survey. This burden estimate was developed based on expert judgment. ORNL has over three decades of experience in working with states, local weatherization agencies, and utilities in collecting household weatherization and billing data. ORNL has worked especially closely with agencies and utilities over the years. ORNL understands how much effort a typical agency needs, on average, to fill out these forms. Based on ORNL’s experience, it will take an agency about 45 minutes per housing unit or building to complete this form for weatherized homes (Program Years 2007 and 2008) and 10 minutes per waitlisted or control group housing unit or building (Program Year 2009). We estimate that an average agency will be asked to provide DF-2 or DF-3 for 45 sampled units and buildings per year and that these 45 sample points will comprise 24 individual units and 5 complete buildings (representing the remaining 21 units). This yields an average burden of 40 hours per agency for DF-2 and 8 hours for DF-3 across the three years.
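As a check on this arithmetic, a short sketch of the calculation under the assumptions stated above (24 individual units and 5 whole buildings per agency per year; 45 minutes per PY 2007-2008 weatherized unit or building; 10 minutes per PY 2009 waitlisted or control unit or building):

```python
# Worked version of the DF-2/DF-3 burden arithmetic stated above.
MIN_WEATHERIZED = 45   # minutes per PY 2007/2008 weatherized unit or building
MIN_CONTROL = 10       # minutes per PY 2009 waitlisted/control unit or building

units_per_year = 24        # individual units -> DF-2
buildings_per_year = 5     # whole buildings (covering ~21 units) -> DF-3

# Two program years of weatherized records plus one year of control records.
df2_hours = (units_per_year * MIN_WEATHERIZED * 2 + units_per_year * MIN_CONTROL) / 60
df3_hours = (buildings_per_year * MIN_WEATHERIZED * 2 + buildings_per_year * MIN_CONTROL) / 60

print(round(df2_hours), round(df3_hours))   # approximately 40 and 8 hours
```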
Response to Specific OMB Questions:
OMB Question |
Response |
a. Clarify who (by position/title) is completing this form. |
Varies, but mostly administrative and IT staff. See above. |
b. Please provide the advance and cover letter and other communications materials. |
Communication process described above; will submit a complete communications plan that shows how the request to local agencies for this form fits within the larger flow of information and data requests to states and agencies. |
c. What is the survey mode(s)? |
Multiple modes available to respondents. Likely emphasis on online form and spreadsheet template. See above. |
d. Is this the format in which the survey will be presented to the respondent (or is this eg a pre-programming draft)? If so, why does the question format differ so dramatically across the form? This is not good survey design. |
We anticipate that most respondents will provide data in an electronic format and not on the paper version. However, we believe we have fixed most formatting issues in the paper version. See accompanying file. |
e. What is the source of the burden estimate? |
We have revised the burden estimate after reviewing the updated instruments and clarifying our sampling approach. See above for an explanation of the source of the burden estimate. |
f. Given length, what is anticipated strategy for respondents to complete (hand off to others; one person does over several days)? |
The internal process will be up to the responding local agency. We will design online data forms so that multiple individuals can easily enter batches of data in separate sessions and easily see what fields still require completion. Online forms and electronic files will both be formatted to allow completion of data by housing unit or question type (i.e., vertically or horizontally). Paper forms would need to be routed internally. We do not anticipate telephone completions except for agencies with very few sampled units. We would complete an electronic copy of the survey for them while on the telephone. |
g. How many of these forms will each agency (average and range) be expected to complete? Over what time period? |
The average number of units sampled will be approximately 45 per agency per sampled year. The range will be 7 to 600 units per sampled year (but some of the units will be reported as a smaller number of buildings using DF-3). We will work with the sampled agencies to establish a timeline that is feasible for them. |
h. How are these HUs selected? By whom? |
The evaluation team will sample the housing units from the list of projects provided by local agencies in DF4. |
i. Where in the sequence of questionnaires does this one fall? |
This questionnaire will be in the third wave of data collection. See above. |
j. Fix confidentiality pledge. |
Done. |
k. Household section – the “ethnicity” question does not meet OMB standards. Please see: http://www.whitehouse.gov/omb/fedreg_1997standards/ for the complete standard. The concepts are “race” and “ethnicity,” although it is not necessary to use those terms in your question. If the data are coming from records, a combined format is acceptable. The instruction should read check or mark “one or more”; you cannot use an “other” category. If you have evidence that offices may not have this data available, we will permit you to use a category like “don’t know.” |
Done. |
National Weatherization Assistance Program Evaluation
Supplemental Information for the Office of Management and Budget
November 2009
Data Collection Instrument: DF4 Electric & Natural Gas Bills Information from Agencies Form
Purpose: (1) Collect lists of units and buildings weatherized by sampled agencies. (2) Collect the details needed to make billing data requests from utilities for a sample of those units and buildings.
Sequence (among state & local agency instruments): 1
Sample Frame: DOE weatherization jobs conducted by the subset of 400 local agencies during Program Years 2007-2009. (A “weatherization job” can refer to the weatherization of a single-family or mobile home, a single unit in a multi-family structure, or an entire multi-family building.)
Sampling: There are two parts to DF-4. For part 1, there is no sampling. We are requesting a list of all jobs within the sample frame from the subset of 400 local agencies (see S3 and the original Supporting Statement on how those 400 local agencies were sampled).
For part 2 of DF-4, we will randomly sample one-third of homes and buildings that heat with natural gas or electricity for inclusion in the major billing analysis study. We will also randomly select one-quarter of homes and buildings that heat with bulk fuels (fuel oil and propane) in order to characterize weatherization activities in these homes. We will ask agencies to complete part 2 (and also DF-2 or DF-3) for these samples. For agencies that keep those lists and project details electronically (or in an otherwise easy-to-provide format), we will request housing unit and building information forms for all weatherization projects (units and buildings) within the sample frame. {Note that we will offer agencies the option to complete both parts 1 and 2 when we first approach them about DF-4 if it is more efficient for them to access their files once but provide a greater amount of data for all units and buildings within the sample frame.}
Anticipated Respondent: Initial contact with the sampled agency will be with the agency contact person listed by state agencies in their quarterly reports to the Department of Energy through the WINSAGA system. We expect that most of these contacts are agency directors or weatherization directors for the local agencies who will delegate our information request to one or more administrative or IT staff (depending on how the agency stores project data) with some involvement by auditors and crew chiefs for sampled projects. We anticipate that someone – either the contact person or an administrative staff person – will oversee the gathering of the data internally.
Survey Mode: Currently planning a mixed mode approach whereby respondents have a choice of the following:
(1) completing an online form,
(2) providing the data via telephone,
(3) completing a formatted, electronic copy of the instrument (such as a template we create in Excel),
(4) filling in a paper version of the instrument.
We intend to expend a good deal of effort to produce an Excel tool for use by the agencies.
Communications: Prior to this data collection effort, we will have implemented several components of a communications plan (please see separate document) that addresses outreach to intended respondents in a holistic fashion. Components of this plan include:
sending an introductory package of information about the evaluation projects to state agencies;
making informal contact with state weatherization directors and encouraging them to let their network of local agencies know about the impending evaluation and requesting their timely cooperation;
sending an introductory letter to local agency contacts reported to DOE by the states;
posting an updated Q&A page about the project on the Weatherization Assistance Program Technical Assistance Center (WAPTAC) web page; and
(depending on timing of the data collection) reaching out to various associations that serve local agencies to inform them about the impending evaluation, with the goals of creating goodwill within their networks, developing a broad understanding of how this evaluation provides useful information for the weatherization community, and having the associations encourage cooperation by their members when contacted by the evaluation team.
We anticipate making our data request via e-mail to our agency contact person when possible. In some cases, we may contact an agency via a formal letter if we have been unable to obtain the needed e-mail address or established that a letter is a more effective mode of communication with the agency. Please see below for the text of our data request.
Dear [local agency contact name]:
The national Weatherization Assistance Program evaluation team is beginning data collection from local agencies in [state], and I am writing to request your assistance with our first data request from [agency name].
One of our tasks is to characterize the units and buildings being weatherized nationally and to analyze the energy impacts on those dwellings. To do this, we are requesting from [agency] a list of:
DOE weatherization jobs you completed and reported to your state for Program Years 2007, 2008, or so far in 2009; and
potential weatherization jobs that are currently on your waiting list.
In the coming weeks, we will work with you to obtain additional details about a subset of these units and buildings. This process will occur in two steps. First, we will ask for weatherization dates, utility account information, and billing releases for the subset of units and buildings so we can request billing data from applicable utilities. Second, we will ask for detailed information about these weatherization jobs.
The data we request today – the list of DOE weatherized and waitlisted jobs – might be easiest to provide using the attached workbook or online at [URL]. Please note that there are two parts to the spreadsheet. We are only asking for Part 1 right now and will be requesting Part 2 for a sample of your projects in a few weeks. However, if it would be easier for you to complete the entire spreadsheet now rather than in two stages, please feel free to complete both parts for all PY 2007 – 2009 jobs.
We would need this list of projects in the next two weeks if at all possible. I will call you in the next couple of days to follow up and try to answer any questions you may have. I can also outline other ways you could provide this data at that time.
I look forward to speaking with you.
Sincerely,
State & Local Agency Outreach Specialist
[signature line with contact information]
Non-Respondent Follow-Up: Our data request comprises a written request and a follow-up telephone call to answer questions and verify the timeline for submission of the data. If this process does not result in affirmative contact with the sampled agency, we will continue to follow up as follows:
1st follow-up call: If the agency contact does not respond within two weeks, the outreach specialist will make a first follow-up call. If unable to reach our contact person directly when following up on the data request, the outreach specialist will leave a message and callback number.
2nd follow-up call: If the agency contact does not respond within another two weeks, the outreach specialist will make a second follow-up call, attempting several calls at varying times of the day to reach the contact person directly. If this effort is unsuccessful, the outreach specialist’s supervisor will leave a message.
3rd follow-up call: If the agency contact does not respond within another two weeks, the outreach specialist’s supervisor will make a third follow-up call. The outreach specialist’s supervisor will make several more calls at varying times of the day to reach the contact person directly or an alternate contact at the agency who may be able to assist with the data collection. If this effort is unsuccessful, the state and local agency liaison manager will call and leave one more message for the agency contact person. If deemed helpful for that agency’s state (based on the state agency’s prior support of the evaluation and assistance with encouraging agency responsiveness to the data requests), the manager may also contact the state agency for assistance in contacting the local agency.
Sampled agencies that refuse to provide the information will receive a call from the state & local agency liaison manager in charge of agency data collection at the Energy Center of Wisconsin to inquire about the reason for the refusal and to seek to overcome any obstacles or objections. At the manager’s discretion, the Energy Center may refer any additional refusing agencies to the ORNL project manager overseeing the project for an additional effort to overcome agency concerns.
Sampled agencies that are slow at providing the requested data will receive periodic follow ups from the evaluation team’s state & local agency outreach specialist assigned to that agency (i.e., their case manager). We will begin with reminder calls or e-mails approximately a week after the initial agreed-upon deadline. After approximately three follow-up contacts, the case manager would refer the agency to a supervisor or the manager in charge of agency data collection for additional follow up.
Burden Estimate: ORNL estimated an average burden of 10 hours per responding agency. This burden estimate was developed based on expert judgment. ORNL has over three decades of experience in working with states, local weatherization agencies, and utilities in collecting household weatherization and billing data. ORNL has worked especially closely with agencies and utilities over the years. ORNL understands how much effort a typical agency needs, on average, to fill out this form. Based on ORNL’s experience, it will take an agency with paper records about 15 to 20 hours to gather and provide the requested data for all three years, and an agency with electronic records about two to four hours. Assuming half of the agencies have paper records and half have electronic records, this amounts to about 10 hours per agency.
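The averaging behind the 10-hour figure can be written out as a short sketch (the midpoints of the stated ranges and the 50/50 split are the assumptions):

```python
# Sketch of the DF-4 burden averaging described above (midpoints are assumptions).
paper_hours = (15 + 20) / 2        # agencies with paper records: 15-20 hours
electronic_hours = (2 + 4) / 2     # agencies with electronic records: 2-4 hours
share_paper = 0.5                  # assumed half of agencies keep paper records

average_burden = share_paper * paper_hours + (1 - share_paper) * electronic_hours
print(average_burden)              # 10.25 hours, rounded to about 10 per agency
```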
Response to Specific OMB Questions:
OMB Question |
Response |
Confidentiality statement |
Fixed. |
About how many “jobs” will the average agency include in this submission? |
The average number of units weatherized by the sample of 400 agencies is 140 per agency per year (or 420 over a three-year period). Some of these units will represent individual single-family buildings, mobile homes, shelters, or individually weatherized units in multi-family buildings. Other units will be represented by multi-family buildings that were weatherized as entire buildings. |
Why isn’t this document formatted more like a spreadsheet to facilitate more automation? Is manual entry of the same job across multiple tables really the only way to collect this information? |
We have modified the paper instrument to function more like a spreadsheet. We agree that electronic transfer of this data will be easiest for most agencies. We will make that choice available to them, so they can choose the paper form we are providing you or an electronic spreadsheet that will be structured in a very similar way. |
National Weatherization Assistance Program Evaluation
Supplemental Information for the Office of Management and Budget
November 2009
Data Collection Instrument: DF5 Electric and Natural Gas Bills: Information from Utilities
Purpose: Obtain monthly electric and gas usage and charge data for weatherized and control homes for the period from January 1, 2006 through April 30, 2010. Used in conjunction with weather data for the same time period, these data will facilitate measurement of changes in the way homes use energy over the time frame of the evaluation.
Sequence: 1 for utilities. {Note: Data from DF-4 will be used to identify homes weatherized by sampled agencies, to select a sample of homes that heat with utility gas or electricity, and to obtain household information and utility account numbers for the sampled homes. Once we have retrieved DF-4 data for all of the agencies in a particular electric or gas utility’s service territory, we will administer the DF-5 instrument to that utility.}
Sample Frame: The population is all households weatherized in PY 2007, weatherized in PY 2008, or weatherized or on the waiting list for weatherization in PY 2009. The sample frame is all such housing units weatherized by the subsample of 400 WAP agencies.
Sampling: There is no sampling associated with collecting utility billing data. As set out in the cover letters for S3 and DF4 and in the original Supporting Statement, there are plans to sample local weatherization agencies and homes weatherized by those agencies (as well as control homes). The project team plans to contact every electric and gas utility that served every sampled treatment home and building identified by DF4 to collect the appropriate billing histories. We expect that a large majority of treatment and control homes will have been served by relatively few large utilities, mainly in urban areas. Treatment and control homes located in rural areas are served by a relatively large number of small utilities and co-ops. This project will expend a significant amount of resources to collect billing histories from these small utilities and co-ops in order to ensure that we have an unbiased sample of rural homes.
Anticipated Respondent: The initial contact will be made with the manager in charge of billing at the electric or natural gas company. Prior experience with this population indicates that this individual will usually have the best understanding of the data request and will have the authority to allocate resources to complete the data request. In most cases, the initial contact will delegate the responsibility to a lower level manager or to an Information Technology (IT) staff person. In cases where the utility has not previously supplied data for such studies, this individual may ask the survey contractor to furnish information that can be supplied to upper management and/or corporate legal departments.
Survey Mode: Over twenty years of experience with Energy Supplier surveys has demonstrated that the following approach yields the highest response rates for such surveys. For the 1984, 1987, 1990, 1993, and 1997 RECS Energy Supplier Surveys, these procedures resulted in a 100% response rate for electric and gas utility companies and over a 95% response rate for delivered fuel vendors (fuel oil and LP companies). For the 2001 RECS Energy Supplier Survey, the company response rate was 100% for all types of companies. While the RECS survey has legislation that allows EIA to levy fines on Energy Suppliers that do not respond, the proposed approach rarely even required reference to that authority.
The proposed approach uses systematic and consistent follow-up to signal to the respondent that this is an important research activity and that we are going to pursue these data even if we are met with initial rejection from the targeted respondent.
The following data collection procedure will be implemented.
Initial Contact – The first contact will be a special mailing (Express Mail, Fed Ex, or other carrier) to the identified respondent. That mailing will include an introductory letter from DOE (legitimacy of data request) and an operational letter (substance of data request, number of accounts, and next steps) from the Contractor.
Initial Contact Follow-Up – The first follow-up will be by phone (if initiated by the Contractor) or e-mail/phone (if initiated by the respondent) within two days of when the mailing is received. There are several purposes for this quick follow-up. First, it reduces cycle time when the wrong respondent was selected. Second, it signals to the respondent that this is a short-term, quick-turnaround activity. During the first follow-up, contractor staff will verify that the correct respondent has been identified and will address any respondent concerns. If a different respondent is recommended, the first step will be repeated. If additional supporting documentation is required to obtain respondent cooperation, that additional documentation will be supplied.
Nonresponse Conversion – This is the point at which nonresponse is most likely to occur. The following options are available to convert a nonrespondent.
Supervisor Contact – The first approach will be to have a supervisor contact the targeted respondent. The supervisor may be able to give the targeted respondent additional information that is persuasive. Having a supervisor call also signals to the respondent that this is an important research activity.
Contact with Higher-Level Company Executive – If the targeted respondent is nonresponsive to the contact, we will identify a higher-level company executive for the next refusal conversion contact. Often, public affairs or community relations executives can see the benefit of participating in this research and will help to convert the targeted respondent.
Contact from State WAP or LIHEAP Manager – Managers of State WAP and/or LIHEAP offices often have direct relationships with targeted respondents. At a recent meeting, one LIHEAP manager said, “When I really need something from a utility company, I just call them up and remind them that I furnish $5 million in bill payments each year. That usually gets their attention.”
Contact from ORNL – For the most important utility respondents (i.e., those with 500 or more accounts), the data collection team may need to make use of the support of ORNL. ORNL will be asked to make contact with a small number of nonrespondents to request their assistance.
Data Request – A data request will be sent to the respondent on a password protected CD or will be made available through a secure FTP site. The data request will contain information on the name, service address, and account number for weatherized units. The password for the CD will be delivered separately from the CD.
Data Request Follow-Up – Within two days of the receipt of the CD, a phone or email contact will be made with the designated respondent to confirm that the CD was received, to furnish the password, to ask whether the respondent has questions, and to obtain a target date for completion.
Data Response – The data will be furnished by the respondent in electronic format by sending a password protected CD or via a secure FTP site.
Nonresponse Identification & Conversion – At the time of the data request follow-up, the target date for receipt of the data will be recorded in the contact database. Two days after the target date, the utility will be identified as a nonrespondent if the data have not been received. The designated respondent will be contacted within one business day of being identified as a nonrespondent. Daily contacts will be made until a new target date is obtained. If no response is received after five contacts, the nonresponse conversion techniques described above will be employed. (A minimal illustration of this flagging rule follows the list.)
Data Checking / Follow-Up – The data will be reviewed for completeness within 5 business days of submission. Any data problems and/or missing data will be referred back to the designated respondent within 10 business days of data submission.
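For clarity, the following is a minimal sketch, not part of the data collection system, of the nonrespondent flagging rule described in the Nonresponse Identification & Conversion step above. The function names, example dates, and the treatment of weekends are assumptions made only for illustration.

```python
from datetime import date, timedelta

def next_business_day(d: date) -> date:
    """Return the next weekday after d (weekends skipped; holidays ignored)."""
    d += timedelta(days=1)
    while d.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
        d += timedelta(days=1)
    return d

def nonresponse_status(target_date: date, today: date, data_received: bool) -> str:
    """Apply the flagging rule: two days past the target date with no data -> nonrespondent."""
    if data_received:
        return "responded"
    if today >= target_date + timedelta(days=2):
        follow_up_by = next_business_day(today)  # contact within one business day
        return f"nonrespondent; contact designated respondent by {follow_up_by.isoformat()}"
    return "pending"

# Example: target date of June 1, 2010, and no data received as of June 4, 2010
print(nonresponse_status(date(2010, 6, 1), date(2010, 6, 4), data_received=False))
```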
Overarching Communications: Prior to this data collection effort, we will have implemented several components of a communications plan that will help to establish the appropriateness of this data collection. Components of this plan include:
Utility Associations – Sending information to state and national electric and gas utility associations regarding the study and the importance of responding to the study.
Regulatory Bodies – Sending information to state and national regulatory bodies informing them of the data collection activity and describing the procedures that will be used to ensure the confidentiality of the consumption and expenditures data.
State WAP and LIHEAP Managers – Sending information to each State WAP and LIHEAP manager to inform them of the study, to identify the utilities who will be asked to participate, and to let them know how they can support the data collection process.
Direct Communications: The initial mail request for completion of DF-5 will be sent to the manager of billing for the target utility. The mail request will include a letter from ORNL describing the purpose of and need for the data collection (see Communications Plan) and a letter from the contractor describing the substance of the data request, the number of households, and the next steps (see Communications Plan).
Burden Estimate: ORNL estimated an average burden of 24 hours per responding utility. This burden estimate was developed based on expert judgment, drawing on ORNL's previous utility data collections and on reports from State WAP directors who have conducted state-level evaluations. This information is consistent with reports from evaluation team members, who received estimates of between 8 hours and 60 hours from the utilities they have worked with for potentially responding to this data request.
Response to Specific OMB Questions:
OMB Question (a): The first portion of the form is text that should appear in a cover letter and not be part of the questionnaire. It should be clearly addressed to the respondent and signed by a cognizant DOE official.
Response: The Communications Plan includes two letters, a letter from ORNL establishing the survey's legitimacy and a letter from the contractor describing the survey logistics. Note that some of this information is repeated on the form, since the initial contact at the utility often passes only the data collection form (not the letters) to the designated respondent.
OMB Question (b): Given response rate difficulties in the past, please provide a robust plan for pre-notices, follow-ups, and other activities designed to achieve acceptable response rates from the utilities.
Response: We have outlined a survey plan that was successfully used to achieve a 100% response rate for the 2001 RECS Energy Supplier Survey. Note that the RECS has the authority to levy a fine on nonrespondents and the WAP evaluation does not. However, in 20 years of conducting the RECS Energy Supplier Survey, we rarely found it necessary to reference that authority when using this systematic approach to data collection.
National Weatherization Assistance Program Evaluation
Supplemental Information for the Office of Management and Budget
November 2009
Data Collection Instrument: DF-6 Metered Fuels and Air Conditioning Studies
Purpose: (a) collect a sample frame of pre-treatment housing units for recruiting into several on-site metering studies; (b) collect contact information for sampled housing units for recruitment into the studies.
Sequence (among state & local agency instruments): 2 for a subset of the 400 sampled agencies (see table below)
Sample Frame: DOE weatherization jobs that are queued for participation in the program by a subsample of 400 local agencies during Program Years 2009 and 2010. (A “weatherization job” can refer to the weatherization of a single-family or mobile home, a single unit in a multi-family structure, or an entire multi-family building.) Only a small number of agencies from the sample of 400 will be asked to participate in the special studies discussed immediately below.
Sampling: This data form is designed to collect information that allows the project to identify weatherized and control single-family and mobile homes that heat with two bulk fuels (fuel oil and propane). As explained in the original Supporting Statement, the samples were designed to test the null hypothesis that energy use and savings in these bulk fuel homes are the same as in the larger sample of homes heated with electricity and natural gas. This approach allows the project to collect much less data from the bulk fuel homes, which, in turn, lowers project costs substantially, because these data cannot be collected via regular monthly bills; instead, expensive meters need to be installed in these homes.
The form is also designed to identify a relatively small number of large multi-family buildings that heat with fuel oil. We expect that these buildings will be located in a few states in the Northeast. Lastly, the form is designed to help identify a small number of homes for a special metered air conditioning study. Control units will be identified for all of these special studies except the large multi-family building fuel oil study, where we believe it would be quite difficult to find appropriate matching control buildings.
Two-stage sampling will be employed. First, local agencies will be randomly sub-sampled from the 400 local agencies that are sampled for the natural gas and electricity study (see explanation in S3 and the original Supporting Statement). The first part of DF-6 will then be administered to the various subsets of sampled agencies (see table below) and used as a sampling frame to select eligible jobs that have not yet been weatherized. This will allow submeters to be installed for a period prior to weatherization. The table below shows the geographic scope, number of agencies, and number of jobs for each metering study; a minimal sketch of this two-stage selection appears after the table.
Study | Geographic scope | Sampled agencies | Sampled jobs
Fuel oil heat, Single-family | Northeast Census Region | 16 | 128
Fuel oil heat, Multifamily | New York, New Jersey, Vermont | 8 | 24
Propane heat, Single-family | National | 16 | 128
Propane heat, Mobile home | National | 16 | 116
Air conditioning, Single-family and mobile home | 14 hot-climate states | 33 | 264
Note that the two propane studies will be implemented with the same 16 agencies. Also, 4 of the propane-study agencies and jobs will be overlapped with the air conditioning study in the 14 hot-climate states. In all, we expect to administer DF-6 to 69 agencies.
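As a point of reference, the following is a minimal sketch of the two-stage selection described above, assuming the single-family fuel oil study as the example. The agency identifiers, job identifiers, and the eight-jobs-per-agency figure (128 jobs divided across 16 agencies, per the table) are illustrative only and not part of any instrument.

```python
import random

# Hypothetical identifiers; the real frame is the 400 agencies sampled for the
# natural gas and electricity study (see S3).
all_sampled_agencies = [f"agency_{i:03d}" for i in range(400)]

# Stage 1: randomly sub-sample agencies for one metering study
# (e.g., 16 agencies for the single-family fuel oil study, per the table above).
fuel_oil_sf_agencies = random.sample(all_sampled_agencies, 16)

# Stage 2: each selected agency returns its DF-6 list of queued, not-yet-weatherized
# jobs, from which eligible jobs are sampled (128 jobs / 16 agencies = 8 per agency).
def sample_jobs(df6_job_list, jobs_per_agency=8):
    return random.sample(df6_job_list, min(jobs_per_agency, len(df6_job_list)))

example_df6_response = [f"job_{i}" for i in range(25)]  # hypothetical DF-6 return from one agency
print(sample_jobs(example_df6_response))
```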
Anticipated Respondent: Initial contact with the sampled agency will be with the agency contact person listed by state agencies in their quarterly reports to the Department of Energy through the WINSAGA system. We expect that most of these contacts are agency directors or weatherization directors for the local agencies who will delegate our information request to one or more administrative or IT staff. We anticipate that someone – either the contact person or an administrative staff person – will oversee the gathering of the data internally.
Survey Mode: Currently planning a mixed mode approach whereby respondents have a choice of the following:
(1) completing an online form,
(2) providing the data via telephone,
(3) completing a formatted, electronic copy of the instrument (such as a template we create in Word),
(4) filling in a paper version of the instrument.
Communications: Prior to this data collection effort, we will have implemented several components of a communications plan (please see separate document) that addresses outreach to intended respondents in a holistic fashion.
We anticipate making our data request via e-mail to our agency contact person when possible. In some cases, we may contact an agency via a formal letter if we have been unable to obtain the needed e-mail address or have established that a letter is a more effective mode of communication with the agency. Please see below for the text of our data request.
Dear [local agency contact name]:
As part of the national Weatherization Assistance Program evaluation, we are conducting several studies that involve on-site metering of propane and fuel oil used for space heating and electricity consumption for air conditioning.
[local agency] has been sampled for our study of [study]. I am writing to request your assistance with identifying a sample of scheduled and wait-listed [home type] housing units that meet the following study criteria:
[study criteria]
Once we have identified and recruited jobs for the study, we would appreciate your assistance in helping us with field installation and (later) removal of monitoring equipment, along with gathering of additional on-site data for the study. We will be providing a technician for this work, but would like to have an agency staff person on-site at all times. Oak Ridge National Laboratory or one of its subcontractors will arrange to reimburse your agency for staff time spent on this project.
Note that our study requires sufficient pre- and post-weatherization time periods to allow us to measure the change in heating or cooling energy associated with weatherization. This may require delaying weatherization work for some recruited sites. We will provide a cash incentive for participating households to compensate for any such delays in service.
The data we request today – the list of wait-listed DOE weatherization jobs – might be easiest to provide using the attached workbook or online at [URL].
We would need this list of projects in the next three weeks if at all possible. I will call you in the next couple of days to follow up and try to answer any questions you may have. I can also outline other ways you could provide this data at that time.
I look forward to speaking with you.
Sincerely,
State & Local Agency Outreach Specialist
[signature line with contact information]
Non-Respondent Follow-Up: Our data request comprises a written request and a follow-up telephone call to answer questions and verify the timeline for submission of the data. If this process does not result in affirmative contact with the sampled agency, we will continue to follow up as follows:
1st follow-up call: If the agency contact does not respond within three weeks, the outreach specialist will make a first follow-up call. If unable to reach our contact person directly when following up on the data request, the outreach specialist will leave a message and callback number.
2nd follow-up call: If the agency contact does not respond within another two weeks, the outreach specialist’s supervisor will make a second follow-up call. The outreach specialist will make several calls at varying times of the day to reach the contact person directly. If this effort is unsuccessful, the outreach specialist’s supervisor will leave a message.
3rd follow-up call: If the agency contact does not respond within another two weeks, the outreach specialist’s supervisor will make a third follow-up call. The outreach specialist’s supervisor will make several more calls at varying times of the day to reach the contact person directly or an alternate contact at the agency who may be able to assist with the data collection. If this effort is unsuccessful, the state and local agency liaison manager will call and leave one more message for the agency contact person. If deemed helpful for that agency’s state (based on the state agency’s prior support of the evaluation and assistance with encouraging agency responsiveness to the data requests), the manager may also contact the state agency for assistance in contacting the local agency.
Sampled agencies that refuse to provide the information will receive a call from the state & local agency liaison manager in charge of agency data collection at the Energy Center of Wisconsin to inquire about the reason for the refusal and to seek to overcome any obstacles or objections. At the manager’s discretion, the Energy Center may refer any additional refusing agencies to the ORNL project manager overseeing the project for an additional effort to overcome agency concerns.
Sampled agencies that are slow at providing the requested data will receive periodic follow ups from the evaluation team’s state & local agency outreach specialist assigned to that agency (i.e., their case manager). We will begin with reminder calls or e-mails approximately a week after the initial agreed-upon deadline. After approximately three follow-up contacts, the case manager would refer the agency to a supervisor or the manager in charge of agency data collection for additional follow up.
Burden Estimate:
ORNL estimated an average burden of sixteen hours per completion of the metered fuels and air conditioning studies form. This burden estimate was developed based on expert judgment. ORNL has over three decades of experience in working with states, local weatherization agencies, and utilities in collecting household weatherization and billing data. ORNL has worked especially closely with agencies and utilities over the years. ORNL understands how much effort is needed to fill out this form, on average, by a typical agency.
Response to Specific OMB Questions:
OMB Question: Confidentiality statement.
Response: Fixed.
OMB Question: Who are respondents? Are they the same people completing the other agency surveys? Clarify sequence and timing.
Response: Agency directors and their designated staff. Since the purpose of DF-6 is to obtain a sample frame for heating and cooling metering studies that are seasonally dependent, and because agency wait lists change quickly, the form needs to be administered within two months of the anticipated installation of equipment.
National Weatherization Assistance Program Evaluation
Supplemental Information for the Office of Management and Budget
November 2009
Data Collection Instrument: DF10 All Agencies Overview Data Form
Purpose: Gather some basic data about local agency practices for measure selection, client education, staff training, and quality control to allow sampling of agencies for process evaluation studies.
Sequence (among state & local agency instruments): 1 for non-sampled agencies; 3 for sampled agencies
Sample Frame: All local agencies reported by states as active providers of DOE weatherization jobs in Program Year 2008 (n=904)
Sampling: Census
Anticipated Respondent: Initial contact with the local agency will be with the agency contact person listed by state agencies in their quarterly reports to the Department of Energy through the WINSAGA system. We expect that the contact person will complete the form or delegate it to someone on staff.
Survey Mode: Currently planning a mixed mode approach whereby respondents have a choice of the following:
(1) completing an online form,
(2) providing the data via telephone,
(3) completing a formatted, electronic copy of the instrument (such as a template we create in Word),
(4) filling in a paper version of the instrument.
Communications: Prior to this data collection effort, we will have implemented several components of a communications plan that addresses outreach to intended respondents in a holistic fashion. Components of this plan include:
sending an introductory package of information about the evaluation projects to state agencies;
making informal contact with state weatherization directors and encouraging them to let their network of local agencies know about the impending evaluation and requesting their timely cooperation;
sending an introductory letter to local agency contacts reported to DOE by the states;
posting an updated Q&A page about the project on the Weatherization Assistance Program Technical Assistance Center (WAPTAC) web page; and
(depending on timing of the data collection) reaching out to various associations that serve local agencies to inform them about the impending evaluation, with the goals of creating goodwill within their networks, developing a broad understanding of how this evaluation provides useful information for the weatherization community, and having the associations encourage cooperation by their members when contacted by the evaluation team.
[Please see Communications Plan for further discussion.]
Prior to this data request, local agency weatherization directors will have received an introductory package consisting of a letter from ORNL that introduces the evaluation’s history, purpose, and implementing team; a list of the data collection requests we will be making; a description of the process we will follow; and the name and contact information for that agency’s case manager on the evaluation team for any questions.
The request for completion of this instrument will be sent to the local agency weatherization director or his/her designee via e-mail with a simple message, such as:
Dear [name]:
As part of the national evaluation of the Weatherization Assistance Program, we are asking your agency to complete a data collection form. In the All Agencies Program Overview Data Form, we are asking for very abbreviated information about the way you choose measures, your client education processes, your staff training, and your quality control process.
Please provide this information at [insert url] by [requested completion date – about 1 week after the e-mailing]. If you would prefer, we can also take the information via telephone or on a form we can send you in electronic or paper form.
Please feel free to contact me with any questions you may have about this information request. Thank you in advance for your assistance with this part of the evaluation.
Sincerely,
[case manager]
[signature line with contact information]
Non-Respondent Follow-Up: Non-responding agencies will receive formal follow-up inquiries by telephone and/or e-mail from the evaluation team case manager assigned to that agency. The first formal follow-up will occur three weeks after an agency receives the survey. Then, the agency will be contacted two more times, each time after a two-week interval. If the agency has not responded after the third contact, the project manager will contact the agency to inquire about any difficulties it might be having in completing the survey. If appropriate, the project will dispatch a staff member to help the agency compile materials to complete the survey.
Burden Estimate: ORNL estimated an average burden of one hour per responding agency. This burden estimate was developed based on expert judgment. ORNL has over three decades of experience in working with states, local weatherization agencies, and utilities in collecting household weatherization and billing data. ORNL has worked especially closely with agencies and utilities over the years. ORNL understands how much effort is needed to fill out this form, on average, by a typical agency.
Response to Specific OMB Questions:
OMB Question (a): Please clarify how this form relates to the “All Agencies Program Information Survey.”
Response: This form is similar in nature to the All Agencies Program Information Survey (S-2), but it is used for a different purpose. It is designed to collect a small amount of information quickly that will be used to guide several other project tasks.
National Weatherization Assistance Program Evaluation
Supplemental Information for the Office of Management and Budget
July 14, 2010
Data Collection Instrument: DF9 Occupant Survey Data Form and S4 Occupant Survey
Purpose: To collect information that addresses client energy use behavior, non-energy impacts of weatherization, and client satisfaction with weatherization service.
Sequence (among state & local agency instruments): 13 and 14
Anticipated Respondents: Clients that will receive audits and home weatherization services and a control group composed of LIHEAP recipients that did not receive weatherization assistance.
Sample Frame: Lists of clients waiting to have audits will be collected from each of the 400 sub-sampled agencies. Clients for the occupant survey will be randomly selected as discussed below. Lists of up to 20 households that are receiving LIHEAP but not weatherization services will also be collected from each of the 400 sub-sampled agencies. Households will be randomly selected from these lists to form the control group. This is an appropriate control group because this low-income group is assumed to be comparable to the treatment group (pre-weatherization) with respect to energy education and health. We also assume that, post-weatherization, the treatment group will be more comparable to the control group with respect to energy consumption and issues related to paying energy bills, because households that apply for weatherization services tend to have higher energy use than low-income households that do not apply.
Sampling: Clients, one per sampled household (e.g., the head of household or an appropriate proxy), will be sampled through agencies for identification and contact information (phone numbers). After the clients have been sampled, the client lists will be destroyed. Although the primary sampling unit is the agency (and the client lists will be obtained from the 400 sampled agencies as noted above), the sample size calculation is based on the approximation that the sampling is simple random (agency sampling weights will be used in the analysis, however). The weatherization group sample size n_w will be determined to ensure, at a 90% level of confidence, a maximum error of .03 (three percentage points) in a binary (e.g., yes/no) response probability. This implies a weatherization group sample size of 752 clients: where Z_.95 = 1.645 is the 95th percentile of the standard normal distribution, the maximum error is Z_.95[p(1 – p)/n_w]^(1/2) ≤ Z_.95[.5(1 – .5)/n_w]^(1/2) = 1.645(.5)/n_w^(1/2), which equals .03 when n_w = 752. The control group sample size will be determined to ensure, at a 90% level of confidence, a maximum error of .05 (five percentage points) in estimating p_w – p_c, the difference between the weatherization and control group binary response probabilities. This implies a control group sample size of 423: Z_.95[p_w(1 – p_w)/752 + p_c(1 – p_c)/n_c]^(1/2) ≤ Z_.95[.5(1 – .5)(1/752 + 1/n_c)]^(1/2), which equals .05 when n_c = 423.
Because this occupant survey will be conducted in longitudinal installments over several months, attrition is possible, but is not expected to exceed 25% (based on four-year average occupancy). Therefore, an additional 25% of subjects, 188 weatherized and 106 control subjects, will be sampled at the initial stage of the survey to compensate for subjects who move. Thus, a total of 940 (752 + 188) weatherized and 529 (423 + 106) control occupants will be sampled.
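For transparency, the sample-size arithmetic above can be reproduced with the following minimal sketch. It is illustrative only; the variable names are ours and the code is not part of any instrument.

```python
import math

z = 1.645  # 95th percentile of the standard normal distribution
p = 0.5    # worst-case binary response probability

# Weatherization group: solve z * sqrt(p(1-p)/n_w) = 0.03 for n_w
n_w = math.ceil((z * math.sqrt(p * (1 - p)) / 0.03) ** 2)          # 752

# Control group: solve z * sqrt(p(1-p) * (1/n_w + 1/n_c)) = 0.05 for n_c
n_c = math.ceil(1 / ((0.05 / z) ** 2 / (p * (1 - p)) - 1 / n_w))   # 423

# 25% oversample to allow for attrition
total_w = n_w + math.ceil(0.25 * n_w)   # 752 + 188 = 940
total_c = n_c + math.ceil(0.25 * n_c)   # 423 + 106 = 529
print(n_w, n_c, total_w, total_c)
```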
Survey Mode: A CATI (computer-assisted telephone interview) survey will be conducted with clients prior to audits and with a control group composed of non-weatherized households that received LIHEAP grants.
It should be noted that the Occupant Survey has several sections and will be administered three separate times. Part 1 of the Occupant Survey will be administered to weatherization clients just before their homes are weatherized. This part collects information about demographics, energy consumption behavior, and non-energy impacts, including questions related to occupant health. Part 1 will also be administered to clients one year after weatherization. The control group will be administered Part 1 twice, during these same time periods. Lastly, just after weatherization, Part 2 (Customer Satisfaction) will be administered to the clients.
Communications: Prior to the survey, and with respect to the first stage of this data collection, a request for completion of DF9 will be sent to the director of each of the sub-sampled agencies or his/her designee via e-mail with a simple message, such as:
Dear [name]:
This is the ninth data request for [insert agency name] from the national WAP evaluation team. In this request, we are asking for a list of clients in the queue for energy audits. Please provide this information at [insert url] by [requested completion date – about 2 weeks after the e-mailing]. If you would prefer, we can also take the information via telephone or on a form we can send you in electronic or paper form.
Please feel free to contact me with any questions you may have about this information request. Thank you in advance for your assistance with this part of the evaluation.
Sincerely,
[case manager]
[signature line with contact information]
Communications: To potential respondent
Dear [name]:
I am calling you today to ask you to participate in a national survey of recipients of weatherization program services. This survey is being funded by the U.S. Department of Energy as part of a larger national evaluation of the Weatherization Assistance Program. I work with The Energy Center of Wisconsin, which is part of a team of independent evaluators under subcontract to Oak Ridge National Laboratory to conduct this evaluation.
The information collected from the survey will help us evaluate how satisfied you have been with the weatherization services received by your household. The survey will also help us evaluate a range of benefits related to weatherization, such as those related to improvements in your health and the health of any others living with you in your home. The survey will also help us better understand how recipients of weatherization services use energy in their homes.
The survey will be given to you three separate times. Each time, we will pay you $25 if you participate and complete that part of the survey. The first part of the survey will be administered before your home receives its energy audit. This part will take about 45 minutes. The second part of the survey will be administered a few days after your home is weatherized. The second part will take only 15 minutes. The third part of the survey will take place approximately one year after your home is weatherized and will also be 45 minutes in length.
All of the information that we obtain from this survey will remain confidential and will be analyzed in such a way that your answers cannot be associated with your name. Your answers will not be shared with or reported back to anyone within your agency or state.
Would you like to take the first part of the survey now?
Sincerely,
[case manager]
[signature line with contact information]
Response rates and non-Respondent Follow-Up: Nonresponse is expected to be small for this survey because (1) weatherized clients will have benefited from weatherization assistance, (2) control group clients will have benefited from another federal assistance program (LIHEAP), and (3) both groups will be identified from contact information maintained by the agencies that serve them. Thus, we should have valid phone numbers for both landlines and cell phones. For households whose phone service has been disconnected, we will mail a postcard with the invitation to participate and a phone number to call. Furthermore, to minimize attrition, a $25 incentive will be provided to respondents for each completed administration of Part 1 of the survey and $10 for Part 2.
Additional Use of Occupant Survey: The Oak Ridge National Laboratory team is conducting two other tasks that will make use of this survey. The first task, known as the Field Process Study, entails field observation of the weatherization process. Teams of weatherization experts and social scientists will observe weatherization intake procedures, home audits, the installation of weatherization measures by crews, and final inspections. The teams will not ask questions of weatherization staff or clients during observational periods. However, occupants observed by the teams will be asked to complete a reduced version of Part 1 of the occupant survey just once, after all observations have been completed. The Field Process Study will follow weatherization activities in 20 agencies across the country, including approximately 8 different homes per agency. Occupants will be observed in three contexts: at the time of the audit, during the installation of weatherization measures, and at the time of the final inspection. Ten percent of the homes will be observed from beginning to end; ninety percent of the observations will involve conceptual homes. To reduce project costs, our observation teams will observe as many audits, measure installations, and final inspections as possible during a week spent with a local agency. Thus, instead of observing (and surveying) a single client from start to finish, the team will observe (and survey) three different clients to complete a conceptual home. It is estimated that the occupant survey will be administered to 448 occupants who are observed during this study. Respondents will not be offered incentives to participate in the survey.
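The 448-respondent estimate is consistent with the figures above under the assumption that each fully observed home contributes one survey respondent and each conceptual home combines observations (and surveys) of three different clients. The short sketch below is illustrative only and is not drawn from the study protocol beyond those stated figures.

```python
agencies = 20
homes_per_agency = 8
homes = agencies * homes_per_agency      # 160 homes observed across the study

full_homes = round(0.10 * homes)         # 16 homes observed from beginning to end
conceptual_homes = homes - full_homes    # 144 "conceptual" homes

# Assumption: one survey respondent per fully observed home, and three respondents
# (one per observation context) per conceptual home, as described above.
respondents = full_homes * 1 + conceptual_homes * 3
print(respondents)                       # 448
```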
The second task is the Indoor Air Quality study task. In this task, the ORNL team will monitor indoor air quality pre- and post-weatherization in 309 treatment homes and 59 control homes. Data will be collected on radon, carbon monoxide, formaldehyde, indoor temperature, and indoor humidity. Only those questions that address health issues contained in Part 1 of the Occupant Survey will be administered to participants in this study. Respondents will be offered a $10 incentive to participate.
Burden Estimate: The Occupant Survey is being designed so that Part 1 can be completed in 45 minutes and Part 2 in 15 minutes. Also, it is estimated that it will take one hour for each of the 400 sampled local weatherization agencies to complete DF9. The burdens on the following groups of respondents are as follows (a worked tally appears after the list):
National sample, treatment group – 1.75 hours x 940 = 1645 hours
National sample, control group – 1.50 hours x 529 = 793 hours
Field Process Study – .5 hours x 448 = 224 hours
IAQ Study, treatment and control groups – .67 hours x (309+59) = 247 hours
Local weatherization agencies – 1 hour x 400 = 400 hours
Total = 3,309 hours
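For reference, a minimal sketch reproducing the arithmetic in the list above (the labels and code are ours and illustrative only; each line item is hours per respondent multiplied by the number of respondents):

```python
burden_hours = {
    "National sample, treatment group": 1.75 * 940,         # 1,645
    "National sample, control group":   1.50 * 529,         # ~793
    "Field Process Study":              0.50 * 448,         # 224
    "IAQ Study (treatment + control)":  0.67 * (309 + 59),  # ~247
    "Local weatherization agencies":    1.00 * 400,         # 400
}
print(round(sum(burden_hours.values())))  # approximately 3,309 hours
```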
National Weatherization Assistance Program Evaluation
Supplemental Information for the Office of Management and Budget
August 24, 2010
INTRODUCTION – This cover letter provides the implementation details of the Weatherization Staff Survey that is a component of the retrospective evaluation of the Weatherization Assistance Program. This survey is conceived as having two parts. Part I focuses on a nationally representative sample of weatherization staff. Part II focuses on individuals who will receive weatherization training services from the 34 DOE-funded training centers.
PART I – Nationally Representative Sample of Weatherization Staff
Data Collection Instrument: DF11 Weatherization Staff Survey Data Form and S5 Weatherization Staff Survey (Part I)
Purpose: To collect information that addresses weatherization staff workforce issues.
Sequence (among state & local agency instruments): 11 and 12
Anticipated Respondents: For DF11, the anticipated respondent is an agency staff person. For the survey S5, the anticipated respondents are those who perform weatherization services. The respondents are stratified into three categories: auditors, weatherization crew chiefs, and weatherization crew members. It is expected that the majority of auditors will work for local weatherization agencies (i.e., the subgrantees), although some states have directed agencies to hire auditors under contract. Local weatherization agencies differ on their approaches towards in-house weatherization crews versus contractor crews. Thus, it is anticipated that weatherization crew chief and weatherization crew member respondents will be a mix of those who are employees of local weatherization agencies and contractor crews.
Sample Frame: Auditors, weatherization crew chiefs, and weatherization crew members who provided weatherization services to the 400 sub-sampled local weatherization agencies during PY 2010.
Sampling: Sampling will be implemented by the three-stage sample approach described below. As an approximation in reckoning sample sizes, we ignore the agency sampling weights, though they will be accounted for in the data analysis. The proportion of correct responses in each sampling stratum (auditor, weatherization crew chief, weatherization crew member) will be estimated to within five percentage points with 90% confidence. The standard error of the combined proportion of correct responses can be no greater than the standard error of an individual (correct/incorrect) response, which cannot exceed .5/n^(1/2) (the maximum standard error of a binomial proportion). This will be achieved if 1.645(.5)/n^(1/2) = .05, that is, if n = 271, where n is the sample size in each stratum. Nonresponse in this survey is expected to be negligible because the survey will be of weatherization staff whose contact information has been provided by the agencies and weatherization staff. The total sample size will be 271 x 3 = 813.
A three-stage sample approach will be used. The first stage entails asking each of the 400 agencies to complete the DF11 Weatherization Auditor Data Form. This simple data form requests agencies to provide the names and contact information for all their auditors who completed audits up to that point during PY2010. A sample of 271 will be randomly selected from the pool of auditors compiled through DF11. At the completion of the survey S5, each auditor respondent will be asked to provide the names and contact information of the weatherization crew chiefs who led the weatherization crews that weatherized homes that the auditor audited during PY 2010. The auditor will be asked to identify the crew chiefs associated with the last six completed jobs in homes that the auditor audited.
Thus, the second stage entails surveying the ensuing list of weatherization crew chiefs. A sample of 271 will be randomly selected from the pool of weatherization crew chiefs. As noted above, these individuals could be employees of the local weatherization agencies or contractors. At the completion of the survey S5, each weatherization crew chief will be asked for the names and contact information of the members of his or her crew that worked on their last completed job.
Finally, the third stage entails surveying the ensuing list of weatherization crew members. A sample of 271 will be randomly selected from the pool of weatherization crew members.
The three-stage sampling approach has several benefits. First, it places minimal burdens on the agencies. Through DF11, they are only asked to provide the names and contact information of their auditors. This should take less than 30 minutes. Second, it effectively deals with both in-house crews and contractor crews. We anticipate that it might be very difficult for agencies to provide the names of all the contractor crew chiefs, and especially contractor crew members, since they have no reason to have this information in the first place. We believe that the auditors should be able to provide the names of the weatherization crew chiefs regardless of whether they are in-house or contractor staff. Providing at most six names and contact information is a small burden on the auditors. Then, the crew chiefs are only burdened a small amount to provide the names of their crews. Third, this approach captures data of high quality. Auditors are asked about recently completed jobs. The other respondents worked on recently completed jobs, and the probability is high that they are still working in the weatherization field. No access to old records is required.
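The following is a minimal sketch of the per-stratum sample size and the three-stage chain described above. The pool sizes and identifiers are hypothetical stand-ins; the real pools come from the DF11 returns and from the auditor and crew chief survey responses.

```python
import math
import random

# Per-stratum sample size: solve 1.645 * 0.5 / sqrt(n) = 0.05 for n.
z, margin = 1.645, 0.05
n = math.ceil((z * 0.5 / margin) ** 2)   # 271 per stratum; 3 * 271 = 813 in total
print(n, 3 * n)

# Hypothetical contact pools, one per stage of the chained sample.
auditor_pool = [f"auditor_{i}" for i in range(1200)]
sampled_auditors = random.sample(auditor_pool, n)        # stage 1: auditors from DF11

chief_pool = [f"chief_{i}" for i in range(900)]          # named by sampled auditors
sampled_chiefs = random.sample(chief_pool, n)            # stage 2: crew chiefs

member_pool = [f"member_{i}" for i in range(1500)]       # named by sampled crew chiefs
sampled_members = random.sample(member_pool, n)          # stage 3: crew members
```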
Survey Mode: A web-based survey will be conducted of the sampled staff. We believe that this is the most cost and time efficient manner to administer this survey. The survey will be included in the web-based system already developed for this project. Respondents will be mailed paper surveys if requested. Phone surveys can also be implemented if requested.
Communications: Prior to the survey, and with respect to the first stage of this survey, a request for completion of DF11 will be sent to the director of each of the sub-sampled agencies or his/her designee via e-mail with a simple message, such as:
Dear [name]:
This is the eighth data request for [insert agency name] from the national WAP evaluation team. In this request, we are asking for a list of all staff who audited homes during PY2010. Please provide this information at [insert url] by [requested completion date – about 2 weeks after the e-mailing]. If you would prefer, we can also take the information via telephone or on a form we can send you in electronic or paper form.
Please feel free to contact me with any questions you may have about this information request. Thank you in advance for your assistance with this part of the evaluation.
Sincerely,
[case manager]
[signature line with contact information]
Communications: To potential respondent
Dear [name]:
I am emailing (writing, calling) you today to ask you to participate in a national survey of low-income weatherization auditors, crew chiefs, and crew members. This survey is being funded by the U.S. Department of Energy as part of a larger national evaluation of the Weatherization Assistance Program. I work with The Energy Center of Wisconsin, which is part of a team of independent evaluators under subcontract to Oak Ridge National Laboratory to conduct this evaluation.
The information collected from the survey will help us evaluate weatherization staff training and other workforce issues. We estimate that the survey will take 30 minutes of your time. The Energy Center of Wisconsin will pay you $50 if you participate and complete the survey. All of the information that we obtain from this survey will remain confidential and will be analyzed in such a way that your answers cannot be associated with your name. Your answers will not be shared with or reported back to anyone within your agency or state.
To participate in the survey, please click on this link [http://????????], which will take you to the survey site. Please feel free to contact me with any questions you may have about this information request. The survey can also be conducted over the phone, or we can send you a paper copy in the mail at your request. Thank you in advance for your assistance with this part of the evaluation.
Sincerely,
[case manager]
[signature line with contact information]
Response rates and non-Respondent Follow-Up: Several methods will be employed to maximize response rates. As noted above, a web-based survey will be used. The three-stage approach ensures that up-to-date contact information will be provided. Also, a $50 incentive will be provided to respondents who complete the survey.
Potential respondents will be called at least four times to solicit their participation. Agency directors will be contacted about their auditors, crew chiefs and crew members who are non-responsive.
Additional Use of Weatherization Staff Survey: The Oak Ridge National Laboratory team is conducting another task that will make use of this survey. This task, known as the Field Process Study, entails field observation of the weatherization process. Teams of weatherization experts and social scientists will observe weatherization intake procedures, home audits, the installation of weatherization measures by crews, and final inspections. The teams will not ask questions of weatherization staff or clients during observational periods. However, weatherization staff observed by the teams will be asked to complete the weatherization staff survey after all observations have been completed. The Field Process Study will follow weatherization activities in 20 agencies across the country, including approximately 8 different homes per agency. It is anticipated that, on average, the teams will observe 4 different auditors, 2 different crew chiefs, and 4 different crew members in each agency. Thus, the weatherization staff survey will be administered to an additional 200 weatherization staff (10 staff per agency x 20 agencies).
Burden Estimate: ORNL estimated an average burden of less than 0.25 hours per agency respondent for DF11 and approximately 0.5 hours for S5 (Part I) per respondent. This is based on pre-testing of the instrument by the Energy Center of Wisconsin conducted during Spring 2010. The total burden is expected to be a maximum of 606 hours.
Part II. Survey of Training Center Participants
Data Collection Instrument: S5 Weatherization Staff Survey (Part II)
Purpose: To collect information that addresses weatherization staff workforce issues.
Sequence (among state & local agency instruments): N/A
Anticipated Respondents: Individuals who complete training at one of DOE’s 34 weatherization training centers. These individuals may or may not be directly employed by local agencies or contractors to conduct low-income weatherization. The individuals will be surveyed at the training centers upon completion of their training. Completion of the survey will constitute the final requirement of their training.
Sample Frame: Individuals receiving training at DOE-funded weatherization training centers during WAP PY 2011 (April 2011 to June 2012).
Sampling: DOE will ask all individuals who receive training to complete Part II of the survey. It is anticipated that ORNL will follow-up with a sample of these respondents two years hence. The sampling approach for this follow-up project will be included in a separate ICR to OMB.
Survey Mode: A web-based survey will be conducted. We believe that this is the most cost and time efficient manner to administer this survey. The survey will be included in the web-based system already developed for this project. Training centers will be mailed paper surveys if requested.
Communications: To potential respondent
Dear [name]:
As a final component of this training, we are asking that you participate in a national survey. This survey is being funded by the U.S. Department of Energy as part of a larger national evaluation of the Weatherization Assistance Program. I work with The Energy Center of Wisconsin, which is part of a team of independent evaluators under subcontract to Oak Ridge National Laboratory to conduct this evaluation.
The information collected from the survey will help us evaluate weatherization staff training and other workforce issues. We estimate that the survey will take 20 minutes of your time. All of the information that we obtain from this survey will remain confidential and will be analyzed in such a way that your answers cannot be associated with your name. Your answers will not be shared with or reported back to anyone within your agency or state.
To participate in the survey, please click on this link [http://????????], which will take you to the survey site. Please feel free to contact me with any questions you may have about this information request. Thank you in advance for your assistance with this part of the evaluation.
Sincerely,
[case manager]
[signature line with contact information]
Response rates and non-Respondent Follow-Up: DOE has agreed to require completion of Part II of the survey as part of the training services provided. We therefore expect very high response rates and do not have a non-response follow-up plan.
Burden Estimate: ORNL estimated an average burden of approximately 0.33 hours for S5 (Part II) per respondent. It is estimated that approximately 3,000 individuals will receive training during the period of this survey. Thus, the expected burden is approximately 1,000 hours.