60D Notice Comment Response Summary


Gainful Employment Recent Graduates Employment and Earnings Survey Pilot Test


OMB: 1845-0136


Responses to Public Comment on the RGEES Pilot Test

Docket # ED-2015-ICCD-0063


Donna Stelling-Gurnett

Association of Proprietary Colleges


General: Please note that this information collection is for the National Center for Education Statistics to conduct a pilot test of the proposed survey items for the RGEES; it is not a field test of an institutional administration of the RGEES institutional survey that will be used to appeal graduate earnings under the gainful employment regulations. The earnings survey items on the pilot test version of the RGEES and the institutional version of the RGEES are the same; the pilot test survey language about purpose, confidentiality, and incentives is determined by its sponsorship (NCES) and operational procedures. The public has the opportunity to comment on the institutional version of the RGEES and the standards required for its administration at Docket ID: ED-2015-ICCD-0085 (http://www.regulations.gov/#!docketDetail;D=ED-2015-ICCD-0085). After public comment and pilot testing, the Department will finalize the institutional survey and publish it in both English and Spanish on its website in December of 2015. Because this is a pilot test of survey items and not a field test of institutional survey administration and reporting procedures, comments on how institutions can and should administer the RGEES survey and evaluate its results are not relevant for this pilot study information collection.


However, the Department recognizes that institutions have concerns about how to plan and implement RGEES survey operations as a basis for an appeal. To support institutions in conducting the RGEES according to required standards, the Department will be providing two additional resources—a web-based survey platform and a guide to best practices for its administration. The RGEES web-based survey platform is designed as a free, user-friendly, high-functioning data collection system that can be downloaded, installed, and administered by the institution with minimal support. The RGEES Platform contains the survey, has the ability to send e-mails, can be used to monitor response rates during data collection, can perform the requisite statistical analyses, and can produce many components of the survey appeal. The RGEES Best Practices Guide (BPG) is designed to help institutions adhere to the required standards by providing non-prescriptive recommendations about how to plan the survey, locate graduates, collect the data, conduct prescribed statistical analyses, and document the survey for auditing. The appendices include suggestions for letters, postcards, and emails that institutions can adapt for their own use in contacting graduates. The RGEES Platform and BPG will be released to the public in December of 2015; an initial draft of the BPG will be available in August of 2015.


Comments from Association of Proprietary Colleges


Bullet 1: The Department's cognitive testing information indicated that a number of test graduates responded that they would not complete the survey and disclose information about their earnings if the information was to be shared with others. The Department was able to address this by including a disclosure that the individual student information would be kept confidential and would not be shared with any other third party.


However, an institution sending out the survey could not include that same disclosure since the information provided to the college must be disclosed to other parties, including independent auditors as well as the Department. Additionally, it is unclear what level of personally identifiable student information will be required to verify and audit the survey results, which only further muddies the school's ability to ensure confidentiality. These issues are sure to depress survey responses.


Reply to bullet 1: To clarify the record with respect to the results of the cognitive interviews, please note that the report from the first round of cognitive interviews indicated that most of the respondents “had no privacy concerns about any of the questions and would leave none blank, but several emphasized that they would participate only if the survey looked official and was clearly from their prior institution or the U.S. Department of Education.” (Appendix 4, RGEES results of cognitive testing, page 5) In the second round, 3 of the 15 respondents said that they would not be willing to fill out the survey due to privacy concerns.


For the conduct of the pilot test, NCES is using language derived from legislation (Education Sciences Reform Act of 2002 (ESRA 2002) (20 U.S. Code, Section 9573)) and approved by the Office of Management and Budget to assure pilot survey respondents of the confidentiality of the data collected. The same laws and language do not apply to institutions that conduct the survey as part of the alternate earnings appeal process. Comments on the language that institutions should use are not relevant to the conduct of the pilot test. Recommendations for language that institutions may use to assure graduates of the privacy of their data will be included in the RGEES Best Practices Guide. Furthermore, institutions will only report survey results to the Department in the aggregate (e.g. mean and median earnings for the cohort) and will not submit individual graduate-level data. The institution is responsible for hiring the RGEES auditor and establishing procedures and protections to ensure data privacy in accordance with applicable federal and state law.


Bullet 2: The pilot test also revealed graduates' confusion about, or unwillingness to report earnings information for "off the books" or unreported income. This ultimately begs the question whether the survey results will provide a true picture of graduate earnings. If students have not reported earnings to the IRS, it is unlikely that they would then willingly and truthfully inform their school that they earned more than reported when the information they provide will be supplied to another federal agency. The Department's survey questions and procedures fail to address this reality.


Reply to bullet 2: The pilot test has not yet been conducted so it is unclear why the comment would refer to its results unless the reference was instead to the responses to the cognitive interviews. In the second round of cognitive interviews, one respondent did express “hesitance to include ‘off the books’ income.” (Appendix 4, RGEES results of cognitive testing, p. 9) One of the purposes of the RGEES is to give graduates an opportunity to report earnings from all sources, whether or not they have reported those earnings to the IRS. The RGEES specifically asks graduates to report earnings from all sources, uses well-tested and researched earnings items from federal surveys to do so, and probes for all sources of earnings to elicit the fullest picture of earnings possible. For these reasons, the RGEES offers institutions the opportunity to collect earnings data as an alternative to the records from the Social Security Administration.


The Census Bureau and the Bureau of Labor Statistics have done a number of studies comparing survey-based estimates of earnings to administrative data sources. For more information, please consult the following sources:


http://www.census.gov/hhes/www/income/publications/asa2002.pdf


http://www.census.gov/hhes/www/income/publications/assess1.pdf


http://www.census.gov/acs/www/Downloads/library/2007/Evaluation_of_Income_Estimates31207.pdf



Bullet 3: The Department offered an incentive of $25 to each individual graduate to respond to their pilot survey test. If all 3,400 graduates in the Department's selected population had responded, the total incentive would have amounted to $85,000.


It is not clear if the Department will allow institutions to offer the same or similar incentive. The success of the surveys will depend, among other things, on the ability of institutions to obtain responses, so the Department should be clear what sort of incentives are allowed for this purpose.


Reply to bullet 3: Pilot test respondents will be offered a $25 incentive for completion of the RGEES to encourage response for the reasons cited in Supporting Statement A, Section 9. Comments on the ability of institutions to use incentives are not relevant to the conduct of the pilot test; however the RGEES BPG will recommend that institutions consider using various types of respondent encouragements to help obtain graduate cooperation.


Bullet 4: The pilot survey test included the ability for graduates to make a phone call to a specific person and verbally answer the questions. Schools must be allowed this same option; otherwise the Department's pilot survey test will not be representative of the response volume that institutions can expect to receive.


Reply to bullet 4: Pilot test procedures include the provision of a toll-free number so that graduates can call and ask questions during data collection. This is in accordance with best practices for survey administration. The ability of institutions to provide a telephone number to answer questions is not relevant to the conduct of the pilot test; however the RGEES BPG recommends that institutions consider providing this service to encourage survey completion. In addition, the RGEES Platform allows institutional administrators to customize the survey with their own contact information.


Bullet 5: The proposed survey form includes three separate questions about earnings which is confusing and will inevitably lead to duplicative or conflicting information. Rather than asking the question three different ways, the question should be clearly and simply written to ask a graduate to provide all earnings from any employment for the calendar year, including freelance and self-employment. The survey needs to be much clearer that all employment income should be reported on a gross basis, and not as net income or net of any expenses.


Reply to bullet 5: The RGEES asks about earnings using a series of items designed to elicit total earnings by asking separately about 1) wages, salary, tips, overtime pay, bonuses or commission, 2) earnings from self-employment, and 3) earnings from other work including freelancing, consulting, moonlighting, or other casual jobs. Based on standard practice for earnings/income items in federal surveys, the multiple item approach is designed to help the respondent remember all of their sources of earnings. These items were derived from existing person-level surveys conducted regularly by the United States Census Bureau (Census) and the Bureau of Labor Statistics (BLS):


Question 2. This leading “No/Yes” item summarizes several pages of employment questions that precede earnings questions in the March CPS. This is the same approach as used in the National Longitudinal Survey of Youth (NLSY97) income questions (please see https://www.nlsinfo.org/content/cohorts/nlsy97/other-documentation/questionnaires, under round 16 income, item YINC-1400).


Question 2a asks about earnings from the job held the longest during a calendar year and was derived from March CPS items Q47a (instruction to focus on the job held the longest), and Q48aa and Q48aad (request for all earnings from the longest job) (please see http://www2.census.gov/programs-surveys/cps/techdocs/cpsmar14.pdf, appendix D).


Question 2b asks about earnings from all other jobs and was derived from March CPS items Q49b1d and Q49B1A.


Questions 3 and 3a ask specifically about self-employment earnings. The items were derived from parallel items in the NLSY, particularly items YINC-2000 and YINC-2100. The approach to asking about earnings is also similar to item Q48b in the March CPS. Note that in concordance with recommendations from expert reviewers, the concept of ‘net earnings’ was clarified for item 3a in the current survey.


Questions 4 and 4a ask about earnings from other jobs outside of regular jobs. They are based on items Q73A1 T and Q731 in the March CPS. Wording modifications were needed to fit the population of interest for the current earnings survey.
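
To illustrate how these separate items fit together, the short sketch below shows one way the three earnings categories described above (wages and salary from the longest and all other jobs, self-employment earnings, and earnings from other work) could be summed into a single annual earnings figure for a respondent. This is only an illustrative sketch; the field names and data layout are hypothetical and are not the actual RGEES variable names or part of the RGEES Platform.

```python
# Minimal, hypothetical sketch of combining the three RGEES earnings
# categories into one annual total. Field names are invented for illustration.

def total_annual_earnings(response: dict) -> float:
    """Sum wage/salary earnings (longest job plus other jobs),
    self-employment earnings, and earnings from other work."""
    wages = response.get("longest_job_earnings", 0) + response.get("other_jobs_earnings", 0)
    self_employment = response.get("self_employment_net_earnings", 0)
    other_work = response.get("other_work_earnings", 0)
    return wages + self_employment + other_work

# Example: a graduate with one main job and a small amount of freelance income.
example = {"longest_job_earnings": 28000, "other_jobs_earnings": 0,
           "self_employment_net_earnings": 0, "other_work_earnings": 1500}
print(total_annual_earnings(example))  # 29500
```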


Bullet 6: Survey response rates can vary widely. Even when engaging all the optimization methods standard in surveying students (multiple contacts, monetary or other incentive, paid return envelopes) the response rate generally does not exceed 32%.2 Will this return rate be statistically valid? Will ED accept results at this level or below as sufficiently viable to appeal the published D/E rates? This is also an issue that has not been addressed.


2 Public Opinion Quarterly, Vol. 68, No. 1, pp. 94-101, © American Association for Public Opinion Research 2004; all rights reserved


Reply to bullet 6: Based on its extensive experience conducting numerous surveys of students and graduates, NCES estimates a response rate of 60% for the RGEES pilot test. The NCES standards require a nonresponse bias analysis for any surveys that have a response rate below 85% (Standard 4.4.1, http://nces.ed.gov/statprog/2012/pdf/Chapter4.pdf). NCES plans to conduct a nonresponse bias analysis on the pilot test results in order to conform to this requirement. The draft RGEES standards (Docket ID: ED-2015-ICCD-0085) require institutions to achieve a 50% response rate for a successful appeal.
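
As a rough illustration of the figures discussed above, the sketch below computes a unit response rate and a simple comparison of respondents and nonrespondents on a characteristic known for the whole cohort, which is one common building block of a nonresponse bias analysis. The data and the comparison are invented for illustration and do not represent the NCES procedures or standards.

```python
# Illustrative only: unit response rate and a basic respondent/nonrespondent
# comparison of the kind often used in a nonresponse bias analysis.
# The cohort records below are invented.

cohort = [
    # (graduate_id, responded, completion year known from institutional records)
    ("G01", True, 2013), ("G02", False, 2013), ("G03", True, 2012),
    ("G04", True, 2013), ("G05", False, 2012), ("G06", True, 2013),
]

respondents = [g for g in cohort if g[1]]
nonrespondents = [g for g in cohort if not g[1]]

response_rate = len(respondents) / len(cohort)
print(f"Unit response rate: {response_rate:.0%}")  # 67% in this toy cohort

# Compare a characteristic known for everyone in the frame; a large difference
# between respondents and nonrespondents suggests potential nonresponse bias.
mean_resp = sum(g[2] for g in respondents) / len(respondents)
mean_nonresp = sum(g[2] for g in nonrespondents) / len(nonrespondents)
print(f"Respondent mean: {mean_resp:.2f}; nonrespondent mean: {mean_nonresp:.2f}")
```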


Bullet 7: Many institutions already conduct graduate surveys, often based on specific accreditation requirements. In order to avoid duplication of efforts, or the perception of students that the RGEES survey is a disposable repeat of an already completed survey, the Department should work with accreditors to determine what processes might be combined and streamlined to ensure better response rates.


Reply to bullet 7: Graduate surveys conducted by institutions for accreditation purposes are not relevant to the conduct of the RGEES pilot test. The draft RGEES standards state that “if the survey provided by ED is administered in conjunction with another survey of program completers, the RGEES questions must be used intact. That is, no alteration of the wording of the survey questions is permitted, and the order of individual items must be preserved.” This is to ensure comparability in the collection of earnings data across institutions for the purposes of the appeal.


Responses to Public Comment on the RGEES Pilot Test

Docket # ED-2015-ICCD-0063



Richard Them

Education Management Corporation


General: Please note that this information collection is for the National Center for Education Statistics to conduct a pilot test of the proposed survey items for the RGEES; it is not a field test of an institutional administration of the RGEES institutional survey that will be used to appeal graduate earnings under the gainful employment regulations. The earnings survey items on the pilot test version of the RGEES and the institutional version of the RGEES are the same; the pilot test survey language about purpose, confidentiality, and incentives is determined by its sponsorship (NCES) and operational procedures. The public has the opportunity to comment on the institutional version of the RGEES and the standards required for its administration at Docket ID: ED-2015-ICCD-0085 (http://www.regulations.gov/#!docketDetail;D=ED-2015-ICCD-0085). After public comment and pilot testing, the Department will finalize the institutional survey and publish it in both English and Spanish on its website in December of 2015. Because this is a pilot test of survey items and not a field test of institutional survey administration and reporting procedures, comments on how institutions can and should administer the RGEES survey and evaluate its results are not relevant for this pilot study information collection.


However, the Department recognizes that institutions have concerns about how to plan and implement RGEES survey operations as a basis for an appeal. To support institutions in conducting the RGEES according to required standards, the Department will be providing two additional resources—a web-based survey platform and a guide to best practices for its administration. The RGEES web-based survey platform is designed as a free, user-friendly, high-functioning data collection system that can be downloaded, installed, and administered by the institution with minimal support. The RGEES Platform contains the survey, has the ability to send e-mails, can be used to monitor response rates during data collection, can perform the requisite statistical analyses, and can produce many components of the survey appeal. The RGEES Best Practices Guide (BPG) is designed to help institutions adhere to the required standards by providing non-prescriptive recommendations about how to plan the survey, locate graduates, collect the data, conduct prescribed statistical analyses, and document the survey for auditing. The appendices include suggestions for letters, postcards, and emails that institutions can adapt for their own use in contacting graduates. The RGEES Platform and BPG will be released to the public in December of 2015; an initial draft of the BPG will be available in August of 2015.


Comments from Education Management Corporation


Bullet 1: The survey asks the graduate to report specific calendar year income for multiple jobs from a period that is roughly two years earlier, and that is difficult to recall. Moreover, the survey does not seem to allow for the graduates to consult their records to come up with accurate figures. Based on the Department's background material, many graduates were clearly guessing at what they thought their income was for the relevant year. The survey questions even include prompts to ask the student to "think about their main job for the year," which seems to recognize that students have difficulty remembering this information and, moreover, suggests that the students might not recall their income from other "non-main jobs" of that year. This method of earnings reporting is inadequate to produce accurate results, which is critical to determine the livelihood, even existence, of the affected educational programs.


Reply to bullet 1: The results of the second round of cognitive testing found that “Fourteen interviewees confirmed that they were recalling information from the 2013 year, and that recall was not difficult.” (Appendix 4, RGEES results of cognitive testing, page 8) A panel of experts consulted in the development of the survey items encouraged NCES to anchor the respondents’ recollection of their earnings by asking first about earnings for the “longest job” and then about “all other jobs”. Based on the results of the second round of cognitive interviews, this approach seems to have worked. While respondents in the cognitive interviews did not have the opportunity to consult records because of the nature of that kind of testing, pilot test respondents who fill out the survey at home will have the opportunity to do so.


Bullet 2: The Department's cognitive testing information indicated that a number of test graduates responded that they would not complete the survey and disclose information about their earnings if the information was to be shared with others. The Department was able to address this by including a disclosure that the individual student information would be kept confidential and would not be shared with any other third party.


However, an institution sending out the survey could not include that same disclosure since the information provided to the college must be disclosed to other parties, including independent auditors as well as the Department. Additionally, it is unclear what level of personally identifiable student information will be required to verify and audit the survey results, which only further muddies the school’s ability to ensure confidentiality. These issues are sure to depress survey responses. Moreover, it renders the Department’s survey results as non-representative since the Department was able to provide the graduates with an assurance of confidentiality that no institution will be able to give.


Reply to bullet 2: To clarify the record with respect to the results of the cognitive interviews, please note that the report from the first round of cognitive interviews indicated that most of the respondents “had no privacy concerns about any of the questions and would leave none blank, but several emphasized that they would participate only if the survey looked official and was clearly from their prior institution or the U.S. Department of Education.” (Appendix 4, RGEES results of cognitive testing, page 5) In the second round, 3 of the 15 respondents said that they would not be willing to fill out the survey due to privacy concerns.


For the conduct of the pilot test, NCES is using language derived from legislation (Education Sciences Reform Act of 2002 (ESRA 2002) (20 U.S. Code, Section 9573)) and approved by the Office of Management and Budget to assure pilot survey respondents of the confidentiality of the data collected. The same laws and language do not apply to institutions that conduct the survey as part of the alternate earnings appeal process. Comments on the language that institutions should use are not relevant to the conduct of the pilot test. Recommendations for language that institutions may use to assure graduates of the privacy of their data will be included in the RGEES Best Practices Guide. Furthermore, institutions will only report survey results to the Department in the aggregate (e.g. mean and median earnings for the cohort) and will not submit individual graduate-level data. The institution is responsible for hiring the RGEES auditor and establishing procedures and protections to ensure data privacy in accordance with applicable federal and state law.


Bullet 3: The pilot test also revealed graduates’ confusion about, or unwillingness to report earnings information for “off the books” or unreported income. This ultimately begs the question whether the survey results will provide a true picture of graduate earnings. If students have not reported earnings to the IRS, it is extraordinarily unlikely that they would then willingly and truthfully inform their school that they earned more than reported when the information they provide will be supplied to another federal agency. The Department’s survey questions and procedures fail to address this reality.


Reply to bullet 3: The pilot test has not yet been conducted so it is unclear why the comment would refer to its results unless the reference was instead to the responses to the cognitive interviews. In the second round of cognitive interviews, one respondent did express “hesitance to include ‘off the books’ income.” (Appendix 4, RGEES results of cognitive testing, p. 9) One of the purposes of the RGEES is to give graduates an opportunity to report earnings from all sources, whether or not they have reported those earnings to the IRS. The RGEES specifically asks graduates to report earnings from all sources, uses well-tested and researched earnings items from federal surveys to do so, and probes for all sources of earnings to elicit the fullest picture of earnings possible. For these reasons, the RGEES offers institutions the opportunity to collect earnings data as an alternative to the records from the Social Security Administration.


The Census Bureau and the Bureau of Labor Statistics have done a number of studies comparing survey-based estimates of earnings to administrative data sources. For more information, please consult the following sources:


http://www.census.gov/hhes/www/income/publications/asa2002.pdf


http://www.census.gov/hhes/www/income/publications/assess1.pdf


http://www.census.gov/acs/www/Downloads/library/2007/Evaluation_of_Income_Estimates31207.pdf


Bullet 4: It is not clear how the purpose of the survey will be conveyed to graduates who are asked to provide their information. Unless the survey serves some concrete purpose they care about, they are all the more likely to view it as one more piece of "junk mail" that they can ignore. The survey should include a statement regarding the importance of reporting the information and explaining that the program from which the student graduated could lose Title IV funding, and be closed down, without this important information.


Reply to bullet 4: The pilot test includes a cover letter and nonresponse follow up letters explaining to graduates that the purpose of the pilot test is to help develop a survey to support institutional appeals under the gainful employment regulations. It also includes a purpose statement at the top of the survey form itself (“The purpose of the Recent Graduates Employment and Earnings Survey is to collect information about the average earnings of graduates from programs whose students receive federal financial aid and which are subject to regulations on gainful employment.”) The Department does not agree that the pilot survey should include a statement saying that the program the student graduated from could lose Title IV funding. The purpose of the pilot test is to evaluate the performance of the survey items used to collect earnings and not the introductory text. This pilot will not result in any programs potentially losing funding so changing the purpose to include that statement would be misleading to respondents. The RGEES BPG will include examples of graduate contact letters that help explain the purpose of the institutional survey to respondents.


Bullet 5: The Department apparently offered an incentive of $25 to each individual graduate to respond to their pilot survey test. If all 3,400 graduates in the Department’s selected population had responded, the total incentive would have amounted to $85,000.


It is not at all clear if the Department will allow institutions to offer the same or similar incentive. This is very important to clarify in the final guidance. The success of the surveys will depend, among other things, on the ability of institutions to obtain responses, so the Department needs to be clear what sort of incentives are allowed for this purpose.


Of course, the incentives may also represent a large expense that many institutions would not be able to afford. Thus, the potential results of ED's pilot survey test will not be representative of a school's results if the school doesn't offer a similar incentive. In addition, institutions with limited resources will be directly disadvantaged by a system in which their ability to demonstrate their qualifications for a federal program will depend on the size of their pocketbook, an outcome that is both inequitable and entirely unfair.


Reply to bullet 5: Pilot test respondents will be offered a $25 incentive for completion of the RGEES to encourage response for the reasons cited in Supporting Statement A, Section 9. Comments on the ability of institutions to use incentives are not relevant to the conduct of the pilot test; however the RGEES BPG will recommend that institutions consider using various types of respondent encouragements to help obtain graduate cooperation. The Department does not claim that the results of the pilot test should be representative of a school’s results.


Bullet 6: The pilot survey test included the ability for graduates to make a phone call to a specific person and verbally answer the questions. Schools must be allowed this same option; otherwise the Department’s pilot survey test will not be representative of the response volume that institutions can expect to receive. The proposed survey form includes three separate questions about earnings which is confusing and will inevitably lead to duplicative or conflicting information. Rather than asking the question three different ways, the question should be clearly and simply written to ask a graduate to provide all earnings from any employment for the calendar year, including freelance and self-employment. The survey needs to be much clearer that all employment income should be reported on a gross basis, and not as net income or net of any expenses.


Reply to bullet 6: Pilot test procedures include the provision of a toll-free number so that graduates can call and ask questions during data collection. This is in accordance with best practices for survey administration. The ability of institutions to provide a telephone number to answer questions is not relevant to the conduct of the pilot test; however the RGEES BPG recommends that institutions consider providing this service to encourage survey completion. In addition, the RGEES Platform allows institutional administrators to customize the survey with their own contact information.


Bullet 7: The Department fails to recognize that many institutions already conduct graduate surveys, often based on specific accreditation requirements. In order to avoid duplication of efforts, or the perception of students that the RGEES survey is a disposable repeat of an already completed survey, the Department should work with accreditors to determine what processes might be combined and streamlined to ensure better response rates.


Reply to bullet 7: Graduate surveys conducted by institutions for accreditation purposes are not relevant to the conduct of the RGEES pilot test. The draft RGEES standards state that “if the survey provided by ED is administered in conjunction with another survey of program completers, the RGEES questions must be used intact. That is, no alteration of the wording of the survey questions is permitted, and the order of individual items must be preserved.” This is to ensure comparability in the collection of earnings data across institutions for the purposes of the appeal.


Bullet 8: Survey response rates can vary widely. Even when engaging all the optimization methods standard in surveying students (multiple contacts, monetary or other incentive, paid return envelopes) the response rate generally does not exceed 32%.*2 Will this return rate be statistically valid? Will ED accept results at this level or below as sufficiently viable to appeal the published D/E rates? This is also an issue that has not been addressed.


*2 Public Opinion Quarterly, Vol. 68, No. 1, pp. 94-101, © American Association for Public Opinion Research 2004; all rights reserved


Reply to bullet 8: Based on its extensive experience conducting numerous surveys of students and graduates, NCES estimates a response rate of 60% for the RGEES pilot test. The NCES standards require a nonresponse bias analysis for any surveys that have a response rate below 85% (Standard 4.4.1, http://nces.ed.gov/statprog/2012/pdf/Chapter4.pdf). NCES plans to conduct a nonresponse bias analysis on the pilot test results in order to conform to this requirement. The draft RGEES standards (Docket ID: ED-2015-ICCD-0085) require institutions to achieve a 50% response rate for a successful appeal.


Responses to Public Comment on the RGEES Pilot Test

Docket # ED-2015-ICCD-0063



Deborah Magruder

Full Sail University


General: Please note that this information collection is for the National Center for Education Statistics to conduct a pilot test of the proposed survey items for the RGEES; it is not a field test of an institutional administration of the RGEES institutional survey that will be used to appeal graduate earnings under the gainful employment regulations. The earnings survey items on the pilot test version of the RGEES and the institutional version of the RGEES are the same; the pilot test survey language about purpose, confidentiality, and incentives is determined by its sponsorship (NCES) and operational procedures. The public has the opportunity to comment on the institutional version of the RGEES and the standards required for its administration at Docket ID: ED-2015-ICCD-0085 (http://www.regulations.gov/#!docketDetail;D=ED-2015-ICCD-0085). After public comment and pilot testing, the Department will finalize the institutional survey and publish it in both English and Spanish on its website in December of 2015. Because this is a pilot test of survey items and not a field test of institutional survey administration and reporting procedures, comments on how institutions can and should administer the RGEES survey and evaluate its results are not relevant for this pilot study information collection.


However, the Department recognizes that institutions have concerns about how to plan and implement RGEES survey operations as a basis for an appeal. To support institutions in conducting the RGEES according to required standards, the Department will be providing two additional resources—a web-based survey platform and a guide to best practices for its administration. The RGEES web-based survey platform is designed as a free, user-friendly, high-functioning data collection system that can be downloaded, installed, and administered by the institution with minimal support. The RGEES Platform contains the survey, has the ability to send e-mails, can be used to monitor response rates during data collection, can perform the requisite statistical analyses, and can produce many components of the survey appeal. The RGEES Best Practices Guide (BPG) is designed to help institutions adhere to the required standards by providing non-prescriptive recommendations about how to plan the survey, locate graduates, collect the data, conduct prescribed statistical analyses, and document the survey for auditing. The appendices include suggestions for letters, postcards, and emails that institutions can adapt for their own use in contacting graduates. The RGEES Platform and BPG will be released to the public in December of 2015; an initial draft of the BPG will be available in August of 2015.


Comments from Full Sail University


Bullet 1: The survey asks the graduate to report specific calendar year income for multiple jobs from a period that is roughly two years earlier, and that is difficult to recall. Moreover, the survey does not seem to allow for the graduates to consult their records to come up with accurate figures. Based on the Department's background material, many graduates were clearly guessing at what they thought their income was for the relevant year. The survey questions even include prompts to ask the student to "think about their main job for the year," which seems to recognize that students have difficulty remembering this information and, moreover, suggests that the students might not recall their income from other "non-main jobs" of that year. This method of earnings reporting is inadequate to produce accurate results, which is critical to determine the livelihood, even existence, of the affected educational programs.


Reply to bullet 1: The results of the second round of cognitive testing found that “Fourteen interviewees confirmed that they were recalling information from the 2013 year, and that recall was not difficult.” (Appendix 4, RGEES results of cognitive testing, page 8) A panel of experts consulted in the development of the survey items encouraged NCES to anchor the respondents’ recollection of their earnings by asking first about earnings for the “longest job” and then about “all other jobs”. Based on the results of the second round of cognitive interviews, this approach seems to have worked. While respondents in the cognitive interviews did not have the opportunity to consult records because of the nature of that kind of testing, pilot test respondents who fill out the survey at home will have the opportunity to do so.


Bullet 2: The Department's cognitive testing information indicated that a number of test graduates responded that they would not complete the survey and disclose information about their earnings if the information was to be shared with others. The Department was able to address this by including a disclosure that the individual student information would be kept confidential and would not be shared with any other third party.


However, an institution sending out the survey could not include that same disclosure since the information provided to the college must be disclosed to other parties, including independent auditors as well as the Department. Additionally, it is unclear what level of personally identifiable student information will be required to verify and audit the survey results, which only further muddies the school's ability to ensure confidentiality. These issues are sure to depress survey responses. Moreover, it renders the Department's survey results as non-representative since the Department was able to provide the graduates with an assurance of confidentiality that no institution will be able to give.


Reply to bullet 2: For the conduct of the pilot test, NCES is using language derived from legislation (Education Sciences Reform Act of 2002 (ESRA 2002) (20 U.S. Code, Section 9573)) and approved by the Office of Management and Budget to assure pilot survey respondents of the confidentiality of the data collected. The same laws and language do not apply to institutions that conduct the survey as part of the alternate earnings appeal process. Comments on the language that institutions should use are not relevant to the conduct of the pilot test. Recommendations for language that institutions may use to assure graduates of the privacy of their data will be included in the RGEES Best Practices Guide. Furthermore, institutions will only report survey results to the Department in the aggregate (e.g. mean and median earnings for the cohort) and will not submit individual graduate-level data. The institution is responsible for hiring the RGEES auditor and establishing procedures and protections to ensure data privacy in accordance with applicable federal and state law.


Bullet 3: The pilot test also revealed graduates' confusion about, or unwillingness to report earnings information for "off the books" or unreported income. This ultimately begs the question whether the survey results will provide a true picture of graduate earnings. If students have not reported earnings to the IRS, it is extraordinarily unlikely that they would then willingly and truthfully inform their school that they earned more than reported when the information they provide will be supplied to another federal agency. The Department's survey questions and procedures fail to address this reality.


Reply to bullet 3: The pilot test has not yet been conducted so it is unclear why the comment would refer to its results unless the reference was instead to the responses to the cognitive interviews. In the second round of cognitive interviews, however, one respondent did express “hesitance to include ‘off the books’ income.” (Appendix 4, RGEES results of cognitive testing, p. 9) One of the purposes of the RGEES is to give graduates an opportunity to report earnings from all sources, whether or not they have reported those earnings to the IRS. The RGEES specifically asks graduates to report earnings from all sources, uses well-tested and researched earnings items from federal surveys to do so, and probes for all sources of earnings to elicit the fullest picture of earnings possible. For these reasons, the RGEES offers institutions the opportunity to collect earnings data as an alternative to the records from the Social Security Administration.


The Census Bureau and the Bureau of Labor Statistics have done a number of studies comparing survey-based estimates of earnings to administrative data sources. For more information, please consult the following sources:

http://www.census.gov/hhes/www/income/publications/asa2002.pdf


http://www.census.gov/hhes/www/income/publications/assess1.pdf


http://www.census.gov/acs/www/Downloads/library/2007/Evaluation_of_Income_Estimates31207.pdf


Bullet 4: It is not clear how the purpose of the survey will be conveyed to graduates who are asked to provide their information. Unless the survey serves some concrete purpose they care about, they are all the more likely to view it as one more piece of "junk mail" that they can ignore. The survey should include a statement regarding the importance of reporting the information and explaining that the program from which the student graduated could lose Title IV funding, and be closed down, without this important information.


Reply to bullet 4: The pilot test includes a cover letter and nonresponse follow up letters explaining to graduates that the purpose of the pilot test is to help develop a survey to support institutional appeals under the gainful employment regulations. It also includes a purpose statement at the top of the survey form itself (“The purpose of the Recent Graduates Employment and Earnings Survey is to collect information about the average earnings of graduates from programs whose students receive federal financial aid and which are subject to regulations on gainful employment.”) The Department does not agree that the pilot survey should include a statement saying that the program the student graduated from could lose Title IV funding. The purpose of the pilot test is to evaluate the performance of the survey items used to collect earnings and not the introductory text. This pilot will not result in any programs potentially losing funding so changing the purpose to include that statement would be misleading to respondents. The RGEES BPG will include examples of graduate contact letters that help explain the purpose of the institutional survey to respondents.


Bullet 5: The Department apparently offered an incentive of $25 to each individual graduate to respond to their pilot survey test. If all 3,400 graduates in the Department's selected population had responded, the total incentive would have amounted to $85,000.


It is not at all clear if the Department will allow institutions to offer the same or similar incentive. This is very important to clarify in the formal guidance. The success of the surveys will depend, among other things, on the ability of institutions to obtain responses, so the Department needs to be clear what sort of incentives are allowed for this purpose.


Of course, the incentives may also represent a large expense that many institutions would not be able to afford. Thus, the potential results of ED's pilot survey test will not be representative of a school's results if the school doesn't offer a similar incentive. In addition, institutions with limited resources will be directly disadvantaged by a system in which their ability to demonstrate their qualifications for a federal program will depend on the size of their pocketbook, an outcome that is both inequitable and entirely unfair.


Reply to bullet 5: Pilot test respondents will be offered a $25 incentive for completion of the RGEES to encourage response for the reasons cited in Supporting Statement A, Section 9. Comments on the ability of institutions to use incentives are not relevant to the conduct of the pilot test; however the RGEES BPG will recommend that institutions consider using various types of respondent encouragements to help obtain graduate cooperation. The Department does not claim that the results of the pilot test should be representative of a school’s results.


Bullet 6: The pilot survey test included the ability for graduates to make a phone call to a specific person and verbally answer the questions. Schools must be allowed this same option; otherwise the Department's pilot survey test will not be representative of the response volume that institutions can expect to receive.


Reply to bullet 6: Pilot test procedures include the provision of a toll-free number so that graduates can call and ask questions during data collection. This is in accordance with best practices for survey administration. The ability of institutions to provide a telephone number to answer questions is not relevant to the conduct of the pilot test; however the RGEES BPG recommends that institutions consider providing this service to encourage survey completion. In addition, the RGEES Platform allows institutional administrators to customize the survey with their own contact information.


Bullet 7: The proposed survey form includes three separate questions about earnings which is confusing and will inevitably lead to duplicative or conflicting information. Rather than asking the question three different ways, the question should be clearly and simply written to ask a graduate to provide all earnings from any employment for the calendar year, including freelance and self-employment. The survey needs to be much clearer that all employment income should be reported on a gross basis, and not as net income or net of any expenses.


Reply to bullet 7: The RGEES asks about earnings using a series of items designed to elicit total earnings by asking separately about 1) wages, salary, tips, overtime pay, bonuses or commission, 2) earnings from self-employment, and 3) earnings from other work including freelancing, consulting, moonlighting, or other casual jobs. Based on standard practice for earnings/income items in federal surveys, the multiple item approach is designed to help the respondent remember all of their sources of earnings. These items were derived from existing person-level surveys conducted regularly by the United States Census Bureau (Census) and the Bureau of Labor Statistics (BLS):


Question 2. This leading “No/Yes” item summarizes several pages of employment questions that precede earnings questions in the March CPS. This is the same approach as used in the National Longitudinal Survey of Youth (NLSY97) income questions (please see https://www.nlsinfo.org/content/cohorts/nlsy97/other-documentation/questionnaires, under round 16 income, item YINC-1400).


Question 2a asks about earnings from the job held the longest during a calendar year and was derived from March CPS items Q47a (instruction to focus on the job held the longest), and Q48aa and Q48aad (request for all earnings from the longest job) (please see http://www2.census.gov/programs-surveys/cps/techdocs/cpsmar14.pdf, appendix D).


Question 2b asks about earnings from all other jobs and was derived from March CPS items Q49b1d and Q49B1A.


Questions 3 and 3a ask specifically about self-employment earnings. The items were derived from parallel items in the NLSY, particularly items YINC-2000 and YINC-2100. The approach to asking about earnings is also similar to item Q48b in the March CPS. Note that in concordance with recommendations from expert reviewers, the concept of ‘net earnings’ was clarified for item 3a in the current survey.


Questions 4 and 4a ask about earnings from other jobs outside of regular jobs. They are based on items Q73A1 T and Q731 in the March CPS. Wording modifications were needed to fit the population of interest for the current earnings survey.


Bullet 8: The Department fails to recognize that many institutions already conduct graduate surveys, often based on specific accreditation requirements. In order to avoid duplication of efforts, or the perception of students that the RGEES survey is a disposable repeat of an already completed survey, the Department should work with accreditors to determine what processes might be combined and streamlined to ensure better response rates.


Reply to bullet 8: Graduate surveys conducted by institutions for accreditation purposes are not relevant to the conduct of the RGEES pilot test. The draft RGEES standards state that “if the survey provided by ED is administered in conjunction with another survey of program completers, the RGEES questions must be used intact. That is, no alteration of the wording of the survey questions is permitted, and the order of individual items must be preserved.” This is to ensure comparability in the collection of earnings data across institutions for the purposes of the appeal.


Bullet 9: Survey response rates can vary widely. Even when engaging all the optimization methods standard in surveying students (multiple contacts, monetary or other incentive, paid return envelopes) the response rate generally does not exceed 32%.*2 Will this return rate be statistically valid? Will ED accept results at this level or below as sufficiently viable to appeal the published D/E rates? This is also an issue that has not been addressed.


*2 Public Opinion Quarterly, Vol. 68, No. 1, pp. 94-101, © American Association for Public Opinion Research 2004; all rights reserved


Reply to bullet 9: Based on its extensive experience conducting numerous surveys of students and graduates, NCES estimates a response rate of 60% for the RGEES pilot test. The NCES standards require a nonresponse bias analysis for any surveys that have a response rate below 85% (Standard 4.4.1, http://nces.ed.gov/statprog/2012/pdf/Chapter4.pdf). NCES plans to conduct a nonresponse bias analysis on the pilot test results in order to conform to this requirement. The draft RGEES standards (Docket ID: ED-2015-ICCD-0085) require institutions to achieve a 50% response rate for a successful appeal.


Responses to Public Comment on the RGEES Pilot Test

Docket # ED-2015-ICCD-0063



Marc Jerome

Monroe College


Comment 1 - 60 Days to Conduct and Report the Results of the Survey is Not Sufficient


If the Debt/Earnings rates are appealed, the GE regulation allows for 60 days to conduct and return the survey to the Department of Education (DOE). The National Center for Education Statistics (NCES) conducted a pilot survey and indicated it took 60 days to complete the survey once the survey was developed and the participants were identified. Institutions are required to follow the rigorous guidelines and assemble similar information. They will need more time as they must:

  • Identify a sample set of students

  • Obtain contact information

  • Prepare, provide and disseminate the survey

  • Communicate with respondents: explain the survey and commitment to confidentiality

  • Follow up with non-respondents (four suggested communications)

  • Gather results of the survey

  • Prepare a nonresponse bias analysis (not prepared by NCES)

  • Summarize and communicate results back to the DOE.


It should also be noted that institutions will need to accomplish all of this with a "learning curve" that NCES did not have.


Recommendation: The institution should have the ability to submit data from a survey previously gathered by a third party. There are organizations that collect actual wage data by individual and can report this data to the institution for the identified students in a program's cohort. These surveys are far more accurate and timely than the proposed standards.


Recommendation: If an institution employs its best efforts to complete the survey process and is unable to complete it within the 60 day timeframe, the institution should be given a minimum of a 90 day extension of time to complete.


Reply to comment 1: Please note that this information collection is for the National Center for Education Statistics to conduct a pilot test of the proposed survey items for the RGEES; it is not a field test of an institutional administration of the RGEES institutional survey that will be used to appeal graduate earnings under the gainful employment regulations. The earnings survey items on the pilot test version of the RGEES and the institutional version of the RGEES are the same; the pilot test survey language about purpose, confidentiality, and incentives is determined by its sponsorship (NCES) and operational procedures. The public has the opportunity to comment on the institutional version of the RGEES and the standards required for its administration at Docket ID: ED-2015-ICCD-0085 (http://www.regulations.gov/#!docketDetail;D=ED-2015-ICCD-0085). After public comment and pilot testing, the Department will finalize the institutional survey and publish it in both English and Spanish on its website in December of 2015. Because this is a pilot test of survey items and not a field test of institutional survey administration and reporting procedures, comments on how institutions can and should administer the RGEES survey and evaluate its results are not relevant for this pilot study information collection.


However, the Department recognizes that institutions have concerns about how to plan and implement RGEES survey operations as a basis for an appeal. To support institutions in conducting the RGEES according to required standards, the Department will be providing two additional resources—a web-based survey platform and a guide to best practices for its administration. The RGEES web-based survey platform is designed as a free, user-friendly, high-functioning data collection system that can be downloaded, installed, and administered by the institution with minimal support. The RGEES Platform contains the survey, has the ability to send e-mails, can be used to monitor response rates during data collection, can perform the requisite statistical analyses, and can produce many components of the survey appeal. The RGEES Best Practices Guide (BPG) is designed to help institutions adhere to the required standards by providing non-prescriptive recommendations about how to plan the survey, locate graduates, collect the data, conduct prescribed statistical analyses, and document the survey for auditing. The appendices include suggestions for letters, postcards, and emails that institutions can adapt for their own use in contacting graduates. The RGEES Platform and BPG will be released to the public in December of 2015; an initial draft of the BPG will be available in August of 2015.


Comment 2 - The 60% Threshold for Responses is Not Reasonable


If colleges had 100% correct contact information for graduates years after graduation, a 60% response rate would be difficult. Given that this is not the case and the percentage of correct addresses may very well be below 60%, the prescribed threshold is unreasonable.


In addition, the time frame given does not allow for adequate time to obtain updated contact information. The administration of a survey will begin 3-5 years after the student has graduated from the institution. Students rarely provide updated contact information to their college after they graduate. This will further diminish the ability of the institution to receive an adequate sample size.


Further, a significant number of participants indicated that they probably would not complete the survey if received by mail. They indicated that they were not comfortable with the maintenance of confidentiality. It is unclear if those who indicated that they would complete the survey, only said "yes" because they had been engaged in the focus group and were given assurance of the confidentiality. Again, this diminishes the sample size.


NCES indicated that a participant had responded if only one question was completed. This implies that the completed survey response rate is actually lower than indicated. This further diminishes the ability to obtain an adequate sample size.


Recommendation: The acceptable response rate threshold should be reduced to 25%.


Reply to comment 2: The 60% response rate referenced in the pilot test supporting materials reflects the response rate that NCES expects to achieve based upon the pilot test procedures and processes and is an estimate based upon extensive experience conducting surveys of students and graduates. It is not a “prescribed threshold”, but rather an expectation for the results of the pilot test.

The report from the first round of cognitive interviews indicated that most of the respondents “had no privacy concerns about any of the questions and would leave none blank, but several emphasized that they would participate only if the survey looked official and was clearly from their prior institution or the U.S. Department of Education.” (Appendix 4, RGEES results of cognitive testing, page 5) In the second round, 3 of the 15 respondents said that they would not be willing to fill out the survey due to privacy concerns.


The pilot test procedures include a provision that the survey will be considered “complete” if the respondent fills out at least one of the earnings items. This is consistent with survey procedures used in other NCES studies and is particularly appropriate for the RGEES pilot since respondents are highly likely to have earnings in only one of the categories. A decision rule that required explicit answers to all of the earnings items would actually depress response rates because more of the surveys would be considered “incomplete” and therefore the graduate would be considered a nonresponse.
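
The decision rule described above can be stated concretely. The sketch below treats a returned survey as complete when at least one earnings item is answered and computes the unit response rate over the full cohort; it is an invented illustration, not the NCES processing specification, and the item names are hypothetical.

```python
# Illustration of the completion rule described above: a case counts as a
# respondent if at least one earnings item was answered. Data are invented.

EARNINGS_ITEMS = ["wage_earnings", "self_employment_earnings", "other_work_earnings"]

def is_complete(survey: dict) -> bool:
    """A returned survey is treated as complete if any earnings item is answered."""
    return any(survey.get(item) is not None for item in EARNINGS_ITEMS)

returned_surveys = [
    {"wage_earnings": 31000},             # complete: one earnings item answered
    {"self_employment_earnings": 12000},  # complete
    {},                                   # returned blank: not complete
]
cohort_size = 10  # graduates surveyed, including those who never returned a form

completes = sum(is_complete(s) for s in returned_surveys)
print(f"Unit response rate: {completes / cohort_size:.0%}")  # 20% in this example
```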


Comment 3 - The 60% Threshold is Inconsistent with the RGEES Expectations


The response by NCES contains inconsistencies. The expected yield as defined in Statement Part B assumed a 60% response rate but the unit response rate anticipated by the Recent Graduates Employment and Earnings Survey (RGEES) indicates at least 50% is expected. Analyzing the data based on a 60% response rate is arbitrary.


Recommendation: As stated above, the acceptable response rate threshold should be reduced to 25%.


Reply to comment 3: The reference to the 50% response rate appears in Supporting Statement Part A Section 2 and Supporting Statement Part B Section 1 and says the following: “Overall unit response rates for the RGEES are expected to be at least 50 percent of an identified cohort and, in keeping with the draft standards for its administration, a nonresponse bias analysis (NRBA) will be required when unit response rates are less than 80 percent.” This sentence refers to the draft standards for the administration of the RGEES by institutions as part of the appeal process, and not to the pilot test. The expected response rate for the pilot test is 60% based on NCES experience with surveys of similar type.


Comment 4 - Burdensome to Prepare and Respond


The response indicated that a student would complete the survey within five minutes. While it may take 5 minutes to complete the survey there is significantly more time involved to complete it accurately. The overall impression of the survey highlighted the confusion regarding confidentiality and where to report income. The revised survey has not yet been piloted. Without verbal communication it is impossible to eliminate the confusion regarding the completion of the form or the assurance of confidentiality. In order to ensure accurate completion, the institution will be required to reach out to all respondents to ensure their understanding of the questions. This process will take longer than 5 minutes for the respondent. This process is quite burdensome to the student and institution. It is estimated that it will take a student approximately 90 minutes to complete the survey. The breakdown is as follows:


5 minutes to review the survey when received and decide if he/she will complete based on the incentive


20 minutes to review the questionnaire with the institution


45 minutes to retrieve the salary information for the time frame indicated

10 minutes to decide if unreported income will be disclosed on the survey

10 minutes to deliver the envelope to the post office for mailing


This translates to 183,600 burden hours using the same assumptions used for the RGEES.


Recommendation: As stated above, the institution should have the ability to submit data from a survey previously gathered by a third party. There are organizations that collect actual wage data by individual and can report this data to the institution for the identified students in a program's cohort. These surveys are far more accurate and timely than the proposed standards.


Reply to comment 4:

The estimate of 5 minutes for respondents to complete the RGEES pilot test survey is based on the amount of time it took respondents in the cognitive interviews to complete the survey and includes, as stated in the Paperwork Reduction Act statement at the end of the survey, “time to review instructions, gather the data needed, and complete and review the survey.” Because respondents to the RGEES pilot test will have the opportunity to consult their records to gather the data needed, the actual time to complete the survey may be longer. As part of the pilot test, the Department is giving respondents the opportunity to comment on the burden estimate. Based upon comments received, the Department may adjust its estimate at the conclusion of the pilot test.
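
For context on the two burden figures discussed above, the arithmetic below applies each per-respondent estimate to the 3,400-graduate pilot sample cited earlier in this document. The comparison is illustrative only, and the commenter's 183,600-hour figure evidently applies the 90-minute estimate to a much larger population than the pilot sample.

```python
# Illustrative burden arithmetic using figures that appear in this document:
# a 3,400-graduate pilot sample, the Department's 5-minute estimate, and the
# commenter's 90-minute estimate. The comparison itself is not from the reply.

pilot_sample = 3400

dept_hours = pilot_sample * 5 / 60        # Department's 5-minute estimate
commenter_hours = pilot_sample * 90 / 60  # commenter's 90-minute estimate

print(f"Pilot burden at 5 minutes per response:  {dept_hours:,.0f} hours")       # ~283 hours
print(f"Pilot burden at 90 minutes per response: {commenter_hours:,.0f} hours")  # 5,100 hours

# The commenter's 183,600-hour total implies roughly this many respondents
# at 90 minutes (1.5 hours) each:
print(f"Implied respondents: {183600 / 1.5:,.0f}")  # 122,400
```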

