This is the accessible text file for GAO report number GAO-12-860R entitled 'Veterans' Reemployment Rights: Department of Labor and Office of Special Counsel Need to Take Additional Steps to Ensure Demonstration Project Data Integrity' which was released on September 10, 2012. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. GAO-12-860R: [End of section] September 10, 2012: The Honorable Patty Murray: Chairman: The Honorable Richard Burr: Ranking Member: Committee on Veterans' Affairs: United States Senate: The Honorable Jeff Miller: Chairman: The Honorable Bob Filner: Ranking Member: Committee on Veterans' Affairs: House of Representatives: Subject: Veterans' Reemployment Rights: Department of Labor and Office of Special Counsel Need to Take Additional Steps to Ensure Demonstration Project Data Integrity: Congress enacted the Uniformed Services Employment and Reemployment Rights Act of 1994 (USERRA)[Footnote 1] to protect the employment and reemployment rights of federal and nonfederal employees when they leave their employment to perform military or other uniformed service and return to civilian employment after that service.[Footnote 2] Among other rights, servicemembers who meet the statutory requirements are entitled to reinstatement to the positions they would have held if they had never left their employment or to positions of similar seniority, status, and pay. With the drawdown in Iraq complete and the drawdown in Afghanistan underway, thousands of current and former military servicemembers are undergoing a transition from their military service back to their civilian employment, thereby increasing the importance of USERRA to help facilitate this transition. Under USERRA, an employee or applicant for employment who believes that his or her USERRA rights have been violated may file a claim with the Department of Labor's (DOL) Veterans' Employment and Training Service (VETS), which investigates and attempts to resolve the claim. If DOL's VETS cannot resolve the claim and the servicemember is a federal government employee or applicant to a federal agency, DOL is to inform the claimant of the right to have his or her claim referred to the Office of Special Counsel (OSC)[Footnote 3] for further review and possible OSC representation before the Merit Systems Protection Board (MSPB) or that they may file a complaint directly with the MSPB. 
Under a demonstration project established by the Veterans Benefits Improvement Act of 2004 (VBIA),[Footnote 4] from February 8, 2005, through December 31, 2007, OSC was authorized to receive and investigate certain USERRA claims, while DOL continued its investigative role for others. In 2007, we evaluated the demonstration project and made recommendations to DOL to help establish internal controls for claims review, claimant notification, and data management.[Footnote 5] The Veterans' Benefits Act of 2010 (VBA) directed DOL and OSC to establish a second demonstration project (36-month duration) for receiving, investigating, and resolving USERRA claims filed against federal executive agencies.[Footnote 6] As in the first demonstration project, DOL and OSC each receive claims and are each authorized to investigate and seek corrective action for those claims.[Footnote 7] The VBA also required that we evaluate how DOL and OSC designed the demonstration project and assess their relative performance during and after the demonstration project. In June 2011, we reported on the methods and procedures that DOL and OSC had agreed to establish for the demonstration project and recommended that both agencies take a number of steps to ensure a comparable process and sufficiently reliable data.[Footnote 8] DOL neither agreed nor disagreed with our recommendations but discussed actions underway to address them. OSC generally concurred with our recommendations. This first interim assessment of the demonstration project (1) determines the number of USERRA demonstration project claims DOL and OSC have received and resolved from August 9, 2011 (start of the demonstration project), to May 9, 2012, and additional DOL and OSC data reportable for the demonstration project to date, and (2) assesses DOL and OSC implementation of our recommendations on the demonstration project's design, including ensuring data reliability, and their adherence to other requirements in the VBA. To determine the number of USERRA demonstration project claims that DOL and OSC received and resolved and what additional data are reportable to date, we reviewed and analyzed data from DOL's and OSC's case tracking systems for cases opened between August 9, 2011, and May 9, 2012. We also interviewed DOL and OSC staff on the data they plan to report for the demonstration project. To assess the extent to which DOL and OSC have implemented our recommendations on the demonstration project's design, we reviewed the recommendations from our report on the design of the project, interviewed DOL and OSC staff on the steps taken to implement the demonstration project, and reviewed supporting documentation and the requirements of the demonstration project set forth in the VBA. To assess the reliability of the data systems used to collect and track performance data for the demonstration project, we reviewed relevant documentation and interviewed DOL and OSC staff. We also tested the data by reviewing it for errors, missing entries, and duplicate entries and performing other logic tests, and we selected a random sample of 12 cases that had been opened and closed at each agency and traced certain data elements from those cases in DOL's and OSC's case tracking systems to source case files. We conducted this performance audit from April 2012 to September 2012 in accordance with generally accepted government auditing standards.
Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. We shared the findings of our interim assessment of the status to date of DOL and OSC implementation of the demonstration project with both Senate and House committee staff. This report transmits the briefing slides provided to Senate and House committee staff, which are included in enclosure I. Summary of Findings: DOL and OSC began the USERRA demonstration project on August 9, 2011, meeting the time frame (within 60 days of our report on the project's design) required by the VBA. From August 9, 2011, to May 9, 2012, DOL received 87 USERRA demonstration project cases and OSC received 123 cases. See table 1 for cases received and resolved (i.e., claim granted or claim resolved in claimant's favor) and average processing times in the investigation phase between August 9, 2011, and May 9, 2012. Table 1: Cases Received and Resolved Favorably and Average Processing Times in the Investigation Phase between August 9, 2011, and May 9, 2012: Department of Labor; Number of cases received: 87; Number of cases closed: 78; Number of cases resolved favorably: 15; Average processing time of closed cases: 36.6 days; Average processing time of cases resolved favorably: 51.1 days. Office of Special Counsel; Number of cases received: 123; Number of cases closed: 46; Number of cases resolved favorably: 8; Average processing time of closed cases: 79.1 days; Average processing time of cases resolved favorably: 83.9 days. Source: GAO analysis of DOL and OSC data. [End of table] The data reported in this study cover only 9 months of the demonstration project and do not represent the overall results of the 36-month project, nor do we draw any conclusions about the relative performance of either agency. As both agencies continue to collect and track data, we will be able to provide an in-depth evaluation of relative performance. We did not report customer satisfaction survey data in this assessment due to the short amount of time the survey has been available to claimants and the low survey response rate. Also, while both agencies track time spent on cases on an ongoing basis, OSC compiles cost data only on those cases that have been closed, while DOL compiles cost data on open and closed cases. Therefore, we plan to evaluate and compare the relative cost data during later assessments of the demonstration project. In June 2011, we made five recommendations to DOL and OSC on the design of the demonstration project that they had jointly agreed upon. Since then, we have continuously reviewed the steps both agencies have taken to implement the five recommendations. In August 2011, DOL and OSC provided documentation to us on the steps both agencies had taken since June 2011 to implement the recommendations. We reviewed the documentation provided by both agencies in August 2011 and determined that two of the recommendations had been implemented at that time, while the remaining three had not yet been fully implemented prior to the start of the demonstration project on August 9, 2011.
The two recommendations implemented by DOL and OSC prior to the start of the demonstration project in August 2011 were to (1) establish a comparable two-phase process at both agencies and (2) establish a common set of case outcomes. To satisfy the first recommendation, OSC had developed a plan to establish and implement a two-phase process, and DOL officials agreed that OSC's process is comparable to DOL's process. To satisfy the second recommendation, OSC and DOL provided us with a crosswalk that identified similar case outcomes at each agency for USERRA demonstration project cases. Although we would have preferred that all five recommendations be fully implemented prior to the start of the demonstration project, we did not believe that the remaining actions needed on the three outstanding recommendations warranted delaying the August 2011 start of the demonstration project. This first interim assessment of the demonstration project assesses the steps DOL and OSC have taken since August 2011 to implement the three outstanding recommendations that we determined were not fully implemented when we reviewed DOL and OSC documentation in August 2011. The outstanding recommendations not fully implemented prior to the start of the demonstration project in August 2011 were to (1) establish comparable methods for administering a customer satisfaction survey; (2) establish comparable methods for tracking the time spent on and costs of USERRA demonstration project cases; and (3) agree upon a controls plan and implementation strategy for ensuring the integrity, reliability, and accuracy of performance data for the USERRA demonstration project. Based on this interim assessment of the steps taken by DOL and OSC since August 2011, we have determined that DOL and OSC have now fully implemented the three outstanding recommendations from our assessment of the demonstration project's design, in line with the requirements in the VBA. However, while this interim assessment found that DOL and OSC fully implemented the three outstanding recommendations, both agencies could take additional steps to improve data integrity beyond what we recommended in our assessment of the demonstration project's design in June 2011. DOL and OSC actions to implement the three outstanding recommendations are described in detail in the following paragraphs. * DOL and OSC have established, and administer on an ongoing basis, a customer satisfaction survey that provides comparable information and includes a survey plan and protocols for contacting respondents, in line with the recommendation from our assessment of the demonstration project's design. To implement the survey, the agencies entered into an interagency agreement with the Office of Personnel Management (OPM) to serve as survey administrator. The customer satisfaction survey was first sent out on April 19, 2012, 8 months after the start of the demonstration project, to all claimants whose cases had been closed from August 9, 2011, to April 19, 2012. Since then, DOL and OSC have sent the survey on an ongoing basis after cases are closed. As of May 22, 2012, DOL and OSC had achieved response rates of 28.2 percent and 46.3 percent, respectively, for their customer satisfaction surveys. While the survey has been established and deployed, the response rates for the surveys account for less than half of claimants whose cases have been closed.
In addition, the survey response rates appear to present a response bias in the results, as claimants who indicated in the survey that they had a favorable case outcome responded at higher rates than exist in the total population of closed cases, making the survey results generally more positive. However, DOL and OSC have no plans to assess the reasons that a claimant did not respond to the survey. OPM stated that it may be able to conduct such an analysis for DOL and OSC, but neither agency has requested it. In guidance to executive branch agencies administering surveys, the Office of Management and Budget states that agencies should try to achieve the highest practical rates of response and recommends an analysis of nonresponse bias if the overall survey response rate is less than 80 percent. While the response rate achieved is low, the results of the survey may still provide information indicating opportunities for DOL and OSC to improve their USERRA claims processing. For example, the customer satisfaction survey asked respondents to indicate which statement best describes the outcome of their complaint and, as of May 22, 2012, 5 of the 22 DOL claimants and 3 of the 18 OSC claimants said they did not know the outcome of their claim after their claim was investigated by DOL or OSC.[Footnote 9] This may indicate that DOL and OSC can take additional steps to communicate the outcome of USERRA claims. The customer satisfaction survey asks claimants 18 questions about their customer experience with DOL and OSC, including the thoroughness of the investigation and the clarity of written and oral communication. Both agencies said they would consider modifying their USERRA claims processing if they identify areas or trends from reviewing the survey results that require changes. * DOL and OSC also established cost accounting systems by the start of the demonstration project on August 9, 2011, to collect and track actual time spent processing USERRA demonstration project cases, implementing the recommendation from our assessment of the demonstration project's design. The cost accounting systems allow both agencies to compute the average cost of USERRA cases across the investigation and legal review phases, as well as in the aggregate. While the cost accounting systems developed at each agency vary somewhat in the way they track time spent, both systems track actual salary, benefits, and indirect cost components by applying an hourly rate that includes those components for each specific employee who works on and tracks time spent on demonstration project cases. * Both agencies have documented the steps they take to ensure the validity and reliability of the performance data to be reported during the demonstration project, in line with the recommendation from our assessment of the demonstration project's design. Based on our assessment of the data--namely customer satisfaction survey data, cost data, and case-tracking data--we found that the performance data that both agencies report are sufficiently reliable for the purposes of evaluating relative performance during the demonstration project. When reviewing DOL and OSC cost data, we found a number of errors related to compiling the data that the agencies had to address. While DOL and OSC described the steps they take to review the cost data entered by staff, neither agency has established and documented procedures for checking the compilation of the data when reporting it during the demonstration project.
Such procedures may help DOL and OSC identify errors when reporting cost data in the future, as the cost accounting systems at DOL and OSC are relatively new and both agencies reported the cost data to us for the first time during this assessment. In addition, when reviewing data from OSC's case tracking system, we found a number of discrepancies and errors in the data, including cases that were missing certain data elements or were entered incorrectly. Both DOL and OSC identified the issues causing these errors, corrected them in their data systems to ensure reliable data going forward, and provided us with updated and corrected data during our assessment. See enclosure I for a more detailed discussion of our analysis. Conclusions: Both DOL and OSC have established methods and procedures that should allow them to report comparable and reliable performance data for the demonstration project, as required by the VBA and in accordance with our recommendations on the demonstration project's design. However, DOL and OSC could take additional steps to improve data integrity beyond what we recommended in our assessment of the demonstration project's design in June 2011. For example, after reviewing the customer satisfaction survey data, DOL and OSC may want to consider additional steps to increase the response rate and address any potential survey response bias. In addition, for the cost accounting systems, DOL and OSC could establish and document procedures for compiling and reporting the cost data during the demonstration project. These changes would increase the value of the data when evaluating relative performance and would help ensure data reliability going forward with the demonstration project. In subsequent assessments of the demonstration project, we plan to evaluate the performance data in depth as DOL and OSC continue to collect and track data. Recommendations for Executive Action: We recommend that the Secretary of Labor direct the Assistant Secretary for Veterans' Employment and Training, and that the Special Counsel, take the following two actions: * To ensure that customer satisfaction survey data provide value when reviewing the relative performance of DOL and OSC during the demonstration project, DOL and OSC should, working with OPM, (1) consider additional efforts to increase the response rate, such as, but not limited to, additional follow-ups, contacting the claimant via other modes, or notifying the claimant of the survey initially when investigating the claim, and (2) conduct a nonresponse analysis to account for any response bias in the survey data. * To ensure that both agencies present reliable cost data for USERRA demonstration project cases going forward, DOL and OSC should establish and document procedures for checking the compilation of cost data when they report it during the demonstration project. Agency Comments and Our Evaluation: We provided a draft of this report to the Special Counsel and the Secretary of Labor for their review and comment. In written comments, which are included in enclosure II, the Special Counsel neither agreed nor disagreed with our recommendations but discussed actions that it is taking to address the recommendations. In commenting on our recommendation to ensure that customer satisfaction survey data provide value, OSC said it is collaborating with DOL on efforts to increase the response rate for the customer satisfaction survey and to conduct a nonresponse analysis.
In commenting on our recommendation to establish and document procedures for checking the compilation of cost data when they report it, OSC said it is reviewing its procedures for compiling and reporting cost data during the demonstration project. OSC also stated that it is committed to making any necessary changes to ensure the demonstration project satisfies Congress's goals. In addition to providing comments on our two recommendations, OSC shared its views on the relative resources available at each agency and noted that our report did not address relative resources or staffing levels at DOL and OSC. OSC stated that for the first 6 months of the period covered by our report, it did not have funding to support its increased USERRA mission requirements and that it is important to consider resources when assessing the relative performance of DOL and OSC. As we describe in this report, the performance data presented in this interim assessment cover only 9 months and not the overall results of the 36-month demonstration project. The data presented cover the number of cases received and resolved favorably and average processing times in the investigation phase between August 9, 2011, and May 9, 2012, but we do not draw any conclusions about the relative performance of DOL and OSC from the data in this report. In subsequent assessments of the demonstration project, we plan to evaluate the performance data, including capacity at each agency, in depth as DOL and OSC continue to collect and track data. In written comments, which are included in enclosure III, the Deputy Assistant Secretary for Veterans' Employment and Training neither agreed nor disagreed with our recommendations but discussed actions that DOL plans to take to implement the recommendations. In commenting on our recommendation to consider additional efforts to increase the response rate of the customer satisfaction survey and to conduct a nonresponse analysis to account for any response bias in the survey data, DOL said it has discussed with OPM options for increasing the response rate of the customer satisfaction survey and conducting an analysis of the characteristics of the claimants who did not respond to the survey to determine if there is a nonresponse bias. DOL also said that since the customer satisfaction survey methodology must be consistent at both DOL and OSC, DOL will coordinate with OSC regarding any changes it makes to increase the response rate and to account for nonresponse bias. In commenting on our recommendation to establish and document procedures for checking the compilation of cost data, DOL said it will initiate internal audits on a quarterly basis when compiling the cost data into report format. In addition, each quarter, management and investigative staff will review the report for any inconsistent or questionable data, and any identified data issues will be addressed, corrected, and reported each quarter as necessary. We will send copies of this report to the Secretary of Labor, the Special Counsel, and other interested parties. This report will also be available at no charge on GAO's website at [hyperlink, http://www.gao.gov]. If you have any questions on this report, please contact me at (202) 512-2717 or jonesy@gao.gov. Contact points for our offices of Congressional Relations and Public Affairs may be found on the last page of this report. Key contributors to this report are listed in enclosure IV. Signed by: Yvonne D. Jones: Director, Strategic Issues:
Enclosures: 4: [End of section] Enclosure I: Briefing Slides: Veterans' Reemployment Rights: Department of Labor and Office of Special Counsel Need to Take Additional Steps to Ensure Demonstration Project Data Integrity: Briefing for the Senate and House Committees on Veterans' Affairs: Overview: * Objectives; * Scope and Methodology; * Results in Brief; * Background; * Findings; * Conclusions; * Recommendations for Executive Action. Objectives: Our objectives for the interim assessment were: 1. to determine the number of Uniformed Services Employment and Reemployment Rights Act of 1994 (USERRA) demonstration project claims the Department of Labor (DOL) and Office of Special Counsel (OSC) have received and resolved from August 9, 2011 (start of the demonstration project), to May 9, 2012, and additional DOL and OSC data reportable for the demonstration project to date, and; 2. to assess DOL and OSC's implementation of our recommendations on the demonstration project's design, including ensuring data reliability, and their adherence to other requirements in the Veterans' Benefits Act of 2010 (VBA).[Footnote 1] Scope and Methodology: To determine the number of USERRA demonstration project claims that DOL and OSC received and resolved and what additional data are reportable to date, we reviewed and analyzed data from DOL's case tracking system (the USERRA Information Management System, or UIMS) and OSC's case tracking system (OSC 2000) for cases opened between August 9, 2011, and May 9, 2012. We also interviewed DOL and OSC staff on the data they plan to report for the demonstration project. To assess the extent to which DOL and OSC have implemented our recommendations on the demonstration project's design, we reviewed the recommendations from our report on the design of the demonstration project,[Footnote 2] interviewed DOL and OSC staff on the steps they have taken to implement the demonstration project, and reviewed supporting documentation. We also reviewed the requirements of the demonstration project set forth in the VBA. To assess the reliability of the data systems used to collect and track performance data for the demonstration project, we reviewed relevant documentation of internal controls and quality checks. We also interviewed DOL and OSC staff on the steps they take to ensure data validity and reliability. We tested the data collected at DOL and OSC by reviewing the data for errors, missing entries, and duplicate entries and performing other logic tests. We also selected a random sample of 12 cases at each agency that had been opened and closed and traced certain data elements from those cases in DOL's and OSC's respective case tracking systems to the source case files. We conducted this performance audit from April 2012 to September 2012 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. Results in Brief: DOL and OSC Data Reportable for Interim Assessment: DOL and OSC began the USERRA demonstration project on August 9, 2011, meeting the time frame (within 60 days of our report on the project's design) required by the VBA.
See table 1 in the following slide for cases received and resolved (i.e., claim granted or claim resolved in claimant's favor) and average processing times in the investigation phase between August 9, 2011, and May 9, 2012. Table 1: Cases Received and Resolved Favorably and Average Processing Times in the Investigation Phase between August 9, 2011, and May 9, 2012: Department of Labor: Number of cases received: 87; Number of cases closed: 78; Number of cases resolved favorably: 15; Average processing time of closed cases: 36.6 days; Average processing time of cases resolved favorably: 51.1 days. Office of Special Counsel: Number of cases received: 123; Number of cases closed: 46; Number of cases resolved favorably: 8; Average processing time of closed cases: 79.1 days; Average processing time of cases resolved favorably: 83.9 days. Source: GAO analysis of DOL and OSC data. [End of table] As both agencies continue to collect and track data, we will be able to provide an in-depth evaluation of relative performance. We did not report customer satisfaction survey data in this assessment due to the short amount of time the survey has been available to claimants and the low survey response rate. Also, while both agencies track time spent on cases on an ongoing basis, OSC compiles cost data only on closed cases while DOL compiles cost data on open and closed cases. We plan to evaluate and compare the relative cost data during later assessments. Results in Brief: Recommendations on the USERRA Demonstration Project's Design: In June 2011, we made five recommendations to DOL and OSC on the design of the demonstration project that they had jointly agreed upon.[Footnote 3] In August 2011, DOL and OSC provided documentation to us on the steps both agencies had taken since June 2011 to implement the recommendations. We reviewed the documentation provided by both agencies in August 2011 and determined that two of the recommendations had been implemented at that time, while the remaining three had not yet been fully implemented prior to the start of the demonstration project on August 9, 2011. Two recommendations implemented prior to the start of the demonstration project were to: * establish a comparable two-phase process (investigation and legal review phases) at both agencies and; * establish a common set of case outcomes. To satisfy the first recommendation, OSC had developed a plan to establish and implement a two-phase process, and DOL officials agreed that OSC's process is comparable to DOL's process. To satisfy the second recommendation, DOL and OSC provided us with a crosswalk that identified similar case outcomes at each agency for USERRA demonstration project cases. Although we would have preferred that all five recommendations be implemented prior to the start of the demonstration project, we did not believe that the remaining actions needed on the three outstanding recommendations warranted delaying the start of the demonstration project. This first interim assessment of the demonstration project assesses the steps DOL and OSC have taken since August 2011 to implement the three outstanding recommendations that we determined were not fully implemented when we reviewed DOL and OSC documentation in August 2011.
Three recommendations not fully implemented prior to the start of the demonstration project in August 2011 were to: * establish comparable methods for administering a customer satisfaction survey; * establish comparable methods for tracking the time spent on and costs of USERRA demonstration project cases; and; * agree upon a controls plan and implementation strategy for ensuring the integrity, reliability, and accuracy of performance data for the USERRA demonstration project. Results in Brief: DOL and OSC Implementation of the USERRA Demonstration Project's Design: Based on this interim assessment of the steps taken by DOL and OSC since August 2011, we have determined that DOL and OSC have now fully implemented the three outstanding recommendations from our assessment of the demonstration project's design, in line with the requirements in the VBA. However, we found that DOL and OSC could take steps to improve data integrity beyond what we recommended in our assessment of the demonstration project's design. DOL and OSC have established a customer satisfaction survey by entering into an interagency agreement with the Office of Personnel Management (OPM); the survey provides comparable information and includes a survey plan and protocols for contacting respondents. DOL and OSC also established cost accounting systems by the start of the demonstration project on August 9, 2011, to collect and track actual time spent processing USERRA demonstration project cases and to allow both agencies to compute the average cost of USERRA cases across the investigation and legal review phases, as well as in the aggregate. Both agencies have documented the steps they take to ensure the validity and reliability of the performance data to be reported during the demonstration project. Based on our assessment of the data, we found that the performance data both agencies report are sufficiently reliable for the purposes of evaluating relative performance during the demonstration project, after both agencies identified and addressed some issues in compiling the data. Background: Veterans' Reemployment Rights: Congress enacted USERRA to protect the employment and reemployment rights of federal and nonfederal employees when they leave their employment to perform military or other uniformed service.[Footnote 4] Under USERRA, an employee or applicant for employment who believes that his or her USERRA rights have been violated may file a claim with DOL's Veterans' Employment and Training Service (VETS), which investigates and attempts to resolve the claim. If DOL's VETS cannot resolve the claim and the servicemember is a federal government employee or applicant to a federal agency, DOL is to inform the claimant of the right to have his or her claim referred to OSC, an independent investigative and prosecutorial agency, for further review and possible OSC representation before the Merit Systems Protection Board (MSPB), or that the claimant may file a complaint directly with the MSPB.
Background: Veterans' Benefits Act of 2010: Congress included language in the VBA[Footnote 5] to establish a second demonstration project (36-month duration) at DOL and OSC for receiving, investigating, and resolving USERRA claims filed against federal executive employers.[Footnote 6] Under the demonstration project, DOL is authorized to investigate and seek corrective action for those claims filed against federal executive agencies if the servicemember's Social Security number (SSN) ends in an even number, and OSC is authorized to investigate and seek corrective action for USERRA claims against federal executive agencies if the servicemember's SSN ends in an odd number, as well as USERRA claims that involve a prohibited personnel practice (PPP).[Footnote 7] See the background figure on the next slide for a depiction of USERRA claims processing at DOL and OSC under the demonstration project, and the illustrative sketch following this slide for the routing rule. Background Figure: USERRA Claims Processing under the Demonstration Project: [Refer to PDF for image: process illustration] Claimant submits claim (Form 1010) electronically or hard copy: DOL/VETS: 1) Odd-numbered claim[A]? If yes: Refer odd-numbered claims to OSC[A]. If no: continue. 2) Even-numbered claim investigated: Claim resolved: If yes: Claimant is notified of resolution; If no: continue. 3) Claimant is notified that claim is unresolved and of right to referral to OSC. Total time for #s 1, 2, and 3: 90 days. #s 1, 2, and 3 are part of the investigative process under the demonstration project. 4) If claimant requests referral to OSC, VETS investigator prepares memorandum of referral (MoR). VETS regional office reviews MoR. DOL Solicitor conducts legal review and prepares analysis and representation recommendation. Total time for #4: 60 days. #4 is part of the referral phase under USERRA. OSC: 5) Odd-numbered (or PPP[A]) claims investigated. Claim resolved: If yes: Claimant is notified of resolution; If no: continue. 6) Claimant is notified of investigation results and of right to have OSC consider claim for possible representation before MSPB. Total time for #s 5 and 6: 90 days. #s 5 and 6 are part of the investigative process under the demonstration project. 7) OSC reviews investigative file (from step #4). 8) OSC determines claim has merit? If yes: OSC attempts resolution, including offering representation before MSPB; If no: Claimant informed of OSC decision and of option to file claim with MSPB without OSC representation. Total time for #s 7 and 8: 60 days. #s 7 and 8 are part of the referral phase under USERRA. Source: GAO (data); Art Explosion (images). [A] If, during initial processing or investigation phase, VETS personnel identify a possible PPP case, VETS and OSC will jointly determine at what point, if at all, the case should be transferred to OSC for investigation. [End of figure] The VBA also requires that DOL and OSC jointly establish methods and procedures for the demonstration project to facilitate the review of their relative performance, which included: * Defining the following performance measures: - customer satisfaction, - cost, - timeliness, - capacity (e.g., staffing levels, education, grade level, and training received), and, - case outcomes; * Defining case outcomes; * Data collection methods and timing of collection; * Data quality assurance process.
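The jurisdictional rule described above can be expressed compactly. The following short sketch, written in Python, is an illustration only and is not drawn from either agency's systems; the function name and inputs are hypothetical:

    def route_claim(ssn_last_digit, involves_ppp):
        # Illustrative sketch of the demonstration project's routing rule:
        # claims involving a prohibited personnel practice go to OSC (per the
        # figure's note [A], PPP transfers are actually determined jointly by
        # VETS and OSC); otherwise an even last SSN digit goes to DOL/VETS
        # and an odd last digit goes to OSC.
        if involves_ppp:
            return "OSC"
        return "DOL" if ssn_last_digit % 2 == 0 else "OSC"

    assert route_claim(4, False) == "DOL"   # even SSN, no PPP: DOL/VETS investigates
    assert route_claim(7, False) == "OSC"   # odd SSN: OSC investigates
    assert route_claim(2, True) == "OSC"    # possible PPP claim: OSC investigates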
The VBA requires that we provide the following assessments to Congress: * A report assessing the methods and procedures jointly established by both agencies to facilitate the review of their relative performance during the demonstration project and provide recommendations for improving the methods and procedures if necessary. - In June 2011, we reported on the methods and procedures that DOL and OSC had agreed to establish for the demonstration project and recommended that both agencies take a number of steps to ensure a comparable process and sufficiently reliable data.[Footnote 8] * Interim assessments of the demonstration project (no later than 1 year after the start of the demonstration project and annually thereafter). * A final report on the relative performance of both agencies during the 36-month demonstration project (within 90 days after the conclusion of the demonstration project). USERRA Demonstration Project Cases Received and Resolved and Average Processing Times during Investigation Phase (August 9, 2011, to May 9, 2012): From August 9, 2011 (start of the demonstration project), to May 9, 2012, DOL received 87 cases and OSC received 123 cases.[Footnote 9] Of the 87 cases received by DOL, 78 had been closed in the investigation phase as of May 9, 2012. * Fifteen of the closed cases were resolved in the claimant's favor and had an average processing time of 51 days. * The 63 cases that were not resolved in the claimant's favor had an average processing time of 33 days. * Three cases were referred to DOL's legal review phase to prepare them for referral to OSC. Of the 123 cases received by OSC, 46 had been closed in the investigation phase as of May 9, 2012. * Eight of the closed cases were resolved in the claimant's favor and had an average processing time of 84 days. * The 38 cases that were not resolved in the claimant's favor had an average processing time of 78 days. * Nine cases were referred to OSC's legal review phase for possible legal representation before the MSPB. DOL and OSC Have Implemented Customer Satisfaction Survey but Conduct Limited Follow-up: DOL and OSC established a customer satisfaction survey that provides comparable information and includes a survey plan and protocols for contacting respondents. To implement the survey, they entered into an interagency agreement with OPM, which helped develop the survey and now administers it, compiles the survey data, and provides summary results of the data to both agencies. The customer satisfaction survey was first sent out on April 19, 2012, 8 months after the start of the demonstration project, to all claimants whose cases had been closed from August 9, 2011, to April 19, 2012. Since then, DOL and OSC have sent the customer satisfaction survey, on an ongoing basis, in conjunction with the closing letters being sent to claimants. DOL and OSC e-mail the initial customer satisfaction survey and follow it with a reminder e-mail 1 week later.[Footnote 10] A hard copy reminder that includes a URL to the survey is sent 2 weeks after the initial e-mail. DOL and OSC said they have no plans to send more than two reminders to claimants, even if the response rate for the customer satisfaction survey is below 40 to 50 percent. The customer satisfaction survey includes 18 questions covering the customer experience, overall satisfaction, and comments on the USERRA claims process. Two of the questions are open-ended.
Responses to the two open-ended questions are provided to both agencies in a list and can be matched with other survey responses, but they are not analyzed by DOL or OSC to identify common responses. Response Rate of Customer Satisfaction Survey May Require DOL and OSC to Take Additional Steps: The survey response rates appear to present a response bias, as claimants who indicated in the survey that they had a favorable case outcome responded at higher rates than exist in the total population of closed cases, making the survey results generally more positive. See table 2. Table 2: Customer Satisfaction Survey Data, as of May 22, 2012: Department of Labor: Response rate: 28.2%; Number of survey responses: 22; Number of claimants who indicated claim was decided or partially decided in the claimant's favor: 13; Number of overall cases resolved or partially resolved in the claimant's favor (from case tracking data)[Footnote 11]: 15; Number of cases closed (from case tracking data): 78. Office of Special Counsel: Response rate: 46.3%; Number of survey responses: 19; Number of claimants who indicated claim was decided or partially decided in the claimant's favor: 7; Number of overall cases resolved or partially resolved in the claimant's favor (from case tracking data)[Footnote 11]: 8; Number of cases closed (from case tracking data): 46. Source: GAO analysis of DOL and OSC data. [End of table] However, neither agency has plans to assess the reasons that a claimant did not respond to the survey. OPM stated that it may be able to conduct such an analysis, but DOL and OSC have not requested it. The Office of Management and Budget (OMB) issued guidance in 2006 that provides standards and guidelines for executive branch agencies that are administering surveys.[Footnote 12] OMB states that agencies should try to achieve the highest practical rates of response and recommends an analysis of nonresponse bias if the overall survey response rate is less than 80 percent. Results of Customer Satisfaction Survey Provide Insight into Additional Opportunities for Improving USERRA Claims Process: While the response rate achieved for the customer satisfaction survey as of May 22, 2012, is low, the results of the survey may still provide information suggesting opportunities for DOL and OSC to improve their USERRA claims processing. The customer satisfaction survey asks claimants a number of questions about their customer experience with DOL and OSC, including the thoroughness of the investigation and the clarity of written and oral communication. For example, the customer satisfaction survey asked respondents to indicate which statement best describes the outcome of their complaint and, as of May 22, 2012, 5 of the 22 DOL claimants and 3 of the 18 OSC claimants[Footnote 13] said they did not know the outcome of their claim after their claim was investigated by DOL or OSC. This may indicate that DOL and OSC can take additional steps to communicate the outcome of USERRA claims. Both agencies said they would consider modifying their USERRA claims processing if they identify areas or trends from reviewing the survey results that require changes.
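The response bias pattern can be seen by comparing the share of favorable outcomes among survey respondents with the share among all closed cases. The short Python sketch below illustrates this comparison using the DOL figures from table 2; it assumes the survey was sent to all 78 claimants with closed DOL cases (consistent with the reported 28.2 percent rate) and is not the formal nonresponse analysis that OPM would conduct:

    def share(part, whole):
        # Fraction of a group meeting a condition, e.g., favorable outcomes.
        return part / whole

    responses, surveys_sent = 22, 78    # assumes all 78 closed DOL cases were surveyed
    favorable_respondents = 13          # respondents reporting a (partially) favorable outcome
    favorable_closed, closed = 15, 78   # favorable outcomes among all closed cases

    print(f"response rate: {share(responses, surveys_sent):.1%}")                  # 28.2%
    print(f"favorable, respondents: {share(favorable_respondents, responses):.1%}")  # 59.1%
    print(f"favorable, closed cases: {share(favorable_closed, closed):.1%}")         # 19.2%
    # Claimants with favorable outcomes are heavily overrepresented among
    # respondents (59.1 percent versus 19.2 percent of all closed cases),
    # which skews the survey results positive, consistent with the response
    # bias described above.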
DOL and OSC Have Implemented Cost Accounting Systems and Can Provide Comparable Cost Measures: Both DOL and OSC established cost accounting systems by the start of the demonstration project on August 9, 2011, to collect and track actual time spent processing USERRA demonstration project cases and to compute the average cost of USERRA cases across the investigation and legal review phases, as well as in the aggregate. DOL's cost accounting system does not track time spent on individual project cases, while OSC's system does. Although not a VBA requirement, the ability to track time and costs for individual cases will create more depth in OSC's data, in addition to providing average cost and time spent per case. Both DOL and OSC cost accounting systems track actual salary, benefits, and indirect cost components by applying an hourly rate that includes those components for each specific employee who works on and tracks time spent on demonstration project cases. DOL also applies travel costs and shipping expenses, as it has staff in regional and state offices. All but one of OSC's USERRA staff are located in its Washington, D.C., office, and OSC said that it includes travel costs and shipping expenses if they are incurred. Data Reliability: Customer Satisfaction Survey: Since OPM administers the survey and collects the data, we assessed the reliability of the customer satisfaction survey data by reviewing OPM documentation; interviewing OPM, DOL, and OSC staff; and testing the data by reviewing it for missing entries and errors and performing other logic tests. OPM described and provided supporting documentation of the procedures it has in place to ensure data reliability and validity, including running checks on the data for completeness and illegible responses and having multiple internal reviewers assess the quality of the data. Based on the collective results of our data reliability assessment, we consider data provided by OPM on the customer satisfaction survey to be sufficiently reliable for the purposes of evaluating the relative performance of DOL and OSC during the demonstration project. Data Reliability: Cost Accounting Data: We assessed the reliability of DOL's and OSC's USERRA cost accounting systems by reviewing DOL and OSC documentation; interviewing DOL and OSC staff; and testing the data by reviewing it for missing entries and errors and performing other logic tests. Both DOL and OSC have developed steps for ensuring the reliability of cost data, including updating USERRA operations manuals, providing instructions to staff entering the data, and describing the steps for reviewing the data after it is entered by staff. When reviewing DOL and OSC cost data, we found a number of errors related to compiling and reporting the data that the agencies had to address. For DOL, we identified a number of cases that continued into the next fiscal year but were not carried over in DOL's cost accounting system and one case in which the costs and time spent were tracked by DOL in the wrong fiscal year. For OSC, we identified one case that was closed but was not included in its cost data. DOL and OSC identified the issues causing these errors, corrected them in their data systems to ensure reliable data going forward, and provided updated and corrected data to us during our assessment.
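Reconciliation checks of the kind sketched below in Python could catch the compilation errors described above, such as a closed case missing from the cost data or costs booked to a fiscal year in which a case was not open; the record layout and field names are hypothetical and are not those of either agency's system:

    def check_cost_compilation(closed_cases, cost_records):
        # Flag (case_id, problem) pairs for the two error patterns noted above.
        issues = []
        costed_ids = {rec["case_id"] for rec in cost_records}
        cases_by_id = {case["case_id"]: case for case in closed_cases}
        for case in closed_cases:
            if case["case_id"] not in costed_ids:
                issues.append((case["case_id"], "closed case absent from cost data"))
        for rec in cost_records:
            case = cases_by_id.get(rec["case_id"])
            if case and not (case["opened_fy"] <= rec["fiscal_year"] <= case["closed_fy"]):
                issues.append((rec["case_id"], "costs booked outside the case's open fiscal years"))
        return issues

    # Example: case 101 has no cost record; case 102's costs sit in the wrong year.
    closed = [{"case_id": 101, "opened_fy": 2011, "closed_fy": 2012},
              {"case_id": 102, "opened_fy": 2012, "closed_fy": 2012}]
    costs = [{"case_id": 102, "fiscal_year": 2011, "amount": 450.00}]
    print(check_cost_compilation(closed, costs))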
While DOL and OSC described the steps they take to review the cost data entered by staff, neither agency has established and documented procedures for checking the compilation of the data when reporting it during the demonstration project. Such procedures may help DOL and OSC identify errors when reporting cost data in the future, as the cost accounting systems at DOL and OSC are relatively new and both agencies reported the cost data to us for the first time during this assessment. Based on the collective results of our data reliability assessment, we consider the DOL and OSC cost accounting data to be sufficiently reliable for the purposes of evaluating the relative performance of DOL and OSC during the demonstration project. Data Reliability: Assessment of DOL and OSC Case-Tracking Data: We assessed the reliability of DOL and OSC USERRA case tracking databases by comparing data elements from each database to a random sample of 12 closed case files at each agency, opened between August 9, 2011, and May 9, 2012. We determined the following data elements were reliable: case number, last four digits of SSN at DOL, federal agency, investigation open date, investigation closed date, legal review open date, legal review closed date, and case outcome code. We did identify two cases at OSC where the last digit of the SSN was not entered into its case tracking system. However, both cases had SSNs ending in an odd number and therefore correctly went to OSC. In addition to tracing electronic data to case files, we interviewed DOL and OSC staff, reviewed previous GAO reports, and reviewed the data for missing entries, errors, and duplicate entries and performed other logic tests. We also reviewed DOL and OSC documentation. DOL said that it is using its existing USERRA operations manual during the demonstration project to ensure data reliability and validity, while OSC drafted a data reliability plan specifically for the demonstration project. Based on the collective results of our data reliability assessment, we consider the data elements we assessed in DOL and OSC case tracking databases to be sufficiently reliable for the purposes of evaluating the relative performance of DOL and OSC during the demonstration project. However, when reviewing data from OSC's case tracking system, we found a number of discrepancies and errors in the data. For example, we identified cases that had a legal review phase open date prior to the investigation closed date or cases that had a legal review phase open date but no investigation phase closed date. OSC identified the issues causing these errors, corrected them in its system to ensure reliable data going forward, and provided corrected and updated data to us during our assessment. Conclusions: Both DOL and OSC have established methods and procedures that should allow them to report comparable and reliable performance data for the demonstration project, as required by the VBA and in accordance with our recommendations on the demonstration project's design. * However, DOL and OSC could take additional steps to improve data integrity beyond what we recommended in our assessment of the demonstration project's design in June 2011. For example, after reviewing the customer satisfaction survey data, DOL and OSC may want to consider additional steps to increase the response rate and address any potential survey response bias.
In addition, for the cost accounting systems, DOL and OSC could establish and document procedures for compiling and reporting the cost data during the demonstration project. * These changes would increase the value of the data when evaluating relative performance and would help ensure data reliability going forward with the demonstration project. In subsequent assessments of the demonstration project, we plan to evaluate the performance data in depth as DOL and OSC continue to collect and track data. Recommendations for Executive Action: We recommend that the Secretary of Labor direct the Assistant Secretary for Veterans' Employment and Training, and that the Special Counsel, take the following two actions: * To ensure that customer satisfaction survey data provide value when reviewing the relative performance of DOL and OSC during the demonstration project, DOL and OSC should, working with OPM, (1) consider additional efforts to increase the response rate, such as, but not limited to, additional follow-ups, contacting the claimant via other modes, or notifying the claimant of the survey initially when investigating the claim, and (2) conduct a nonresponse analysis to account for any response bias in the survey data. * To ensure that both agencies present reliable cost data for USERRA demonstration project cases going forward, DOL and OSC should establish and document procedures for checking the compilation of cost data when they report it during the demonstration project. GAO on the Web: Website: [hyperlink, http://www.gao.gov/] Contact: Chuck Young, Managing Director, Public Affairs, youngc1@gao.gov, (202) 512-4800, U.S. Government Accountability Office, 441 G Street NW, Room 7149, Washington, D.C. 20548. Copyright: This is a work of the U.S. government and is not subject to copyright protection in the United States. The published product may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. Briefing slides Footnotes: [1] GAO, Veterans' Reemployment Rights: Steps Needed to Ensure Reliability of DOL and Special Counsel Demonstration Project's Performance Information, [hyperlink, http://www.gao.gov/products/GAO-11-312R] (Washington, D.C.: June 10, 2011). [2] [hyperlink, http://www.gao.gov/products/GAO-11-312R]. [3] [hyperlink, http://www.gao.gov/products/GAO-11-312R]. [4] USERRA, as amended, is codified at 38 U.S.C. §§ 4301-4335. [5] Pub. L. No. 111-275, § 105, 124 Stat. 2864, 2868-70 (Oct. 13, 2010). [6] GAO, Military Personnel: Improved Quality Controls Needed over Servicemembers' Employment Rights Claims at DOL, [hyperlink, http://www.gao.gov/products/GAO-07-907] (Washington, D.C.: July 20, 2007). [7] There are 12 prohibited personnel practices, including discrimination, retaliation, and unauthorized preference or improper advantage. 5 U.S.C. § 2302. [8] [hyperlink, http://www.gao.gov/products/GAO-11-312R]. [9] The data reported in this study cover only 9 months of the demonstration project and do not represent the overall results of the 36-month project, nor do we draw any conclusions about the relative performance of either agency. [10] For those claimants who did not provide an e-mail address or who provided an incorrect e-mail address, the initial contact and first follow-up about the availability of the survey are sent in hard copy form. [11] DOL and OSC case tracking data are as of May 9, 2012.
[12] OMB, Standards and Guidelines for Statistical Surveys (Washington, D.C.: September 2006). [13] One OSC claimant who responded to the customer satisfaction survey did not respond to this question. [End of Enclosure I] Enclosure II: Comments from the Office of Special Counsel: U.S. Office of Special Counsel: The Special Counsel: 1730 M Street, N.W., Suite 300: Washington, D.C. 20036-4505: August 9, 2012: Yvonne D. Jones: Director, Strategic Issues: U.S. Government Accountability Office: 441 G St., NW: Washington, DC 20548: Re: Response to GAO Draft Report GAO-12-860R: Dear Ms. Jones: Thank you for the opportunity to comment on the Government Accountability Office's (GAO) Draft Report GAO-12-860R, Veterans' Reemployment Rights: Department of Labor and Office of Special Counsel Need to Take Additional Steps to Ensure Demonstration Project Data Integrity (Draft Report). This report is the first interim assessment of the 36-month USERRA Demonstration Project established by the Veterans' Benefits Act of 2010 (VBA; Pub. L. No. 111-275), which began on August 9, 2011. Under the VBA, GAO must report to Congress on the relative performance of the Office of Special Counsel (OSC) and the Department of Labor (DOL) in investigating and resolving federal USERRA claims. OSC enforces USERRA and protects the employment rights of service members by prosecuting USERRA claims before the Merit Systems Protection Board (MSPB). OSC's representation of service members has strengthened the law and the protections afforded to service members. OSC also provides technical assistance to employees and conducts outreach and training for federal agencies on USERRA rights and responsibilities. The Demonstration Project significantly increased OSC's role in USERRA enforcement. In addition to OSC's existing USERRA responsibilities, outlined above, the project also requires OSC to investigate and seek resolution of claims for service members in over half of all federal sector USERRA cases. Prior to the Demonstration Project, DOL was responsible for all investigations. During the period covered by the Draft Report, OSC received 123 USERRA cases and DOL received 87. In assessing the relative performance of OSC and DOL, it is important to take into account the respective resources available to each agency. For the first six months of the ten-month period covered by the Draft Report, OSC did not have funding to support its increased USERRA mission requirements.[Footnote 1] During this period, OSC's USERRA Unit consisted of four full-time employees, who handled all aspects of the Demonstration Project as well as OSC's existing USERRA enforcement functions. During the same period, DOL employed approximately 17 investigators to handle its reduced share of federal USERRA claims. GAO's Draft Report contains no information about the relative resources or staffing levels of OSC and DOL. Despite staffing challenges, OSC favorably resolved 17.4% of the cases it completed during the reporting period. This included reinstatement and restoration of employment benefits for returning veterans from Iraq and Afghanistan, as well as relief for service members who suffered employment discrimination based upon their military service. During the same period, OSC also completed its legal review of 31 additional USERRA cases referred from DOL.
Regarding the "Recommendations for Executive Action" on page 7 of the Draft Report, OSC is collaborating with DOL on efforts to increase the response rate for the customer satisfaction survey and to conduct a non-response analysis, and we are reviewing our methods and procedures for compiling and reporting cost data during the project to ensure accurate and reliable cost data. We are committed to making any necessary changes to improve our service to veterans and to ensure that the USERRA Demonstration Project satisfies Congress's goals. We look forward to continuing to work with you on these important matters. Sincerely, Signed by: Carolyn N. Lerner: Special Counsel: Enclosure II Footnote: [1] OSC and DOL are now operating under an inter-agency reimbursement agreement. [End of Enclosure II] Enclosure III: Comments from the Department of Labor: U.S. Department of Labor: Office of the Assistant Secretary for Veterans' Employment and Training: Washington, D.C. 20210: August 3, 2012: Ms. Yvonne D. Jones: Director, Strategic Issues: United States Government Accountability Office (GAO): Washington, D.C. 20548: Re: GAO Draft Report: GAO-12-860R: Dear Ms. Jones: Thank you for the opportunity to review and comment on the Government Accountability Office (GAO) draft Report GAO-12-860R, your first interim assessment of the demonstration project involving complaints filed against Federal government agencies under the Uniformed Services Employment and Reemployment Rights Act of 1994 (USERRA), 38 U.S.C. §§ 4301-4335. The Veterans' Benefits Act of 2010, Pub. L. No. 111-275, established a 36-month USERRA Demonstration Project. Over the course of this project, the Department of Labor (DOL) Veterans' Employment and Training Service (VETS) investigates complaints against Federal agencies filed by individuals with even-numbered social security numbers, and the Office of Special Counsel (OSC) investigates claims filed by individuals with odd-numbered social security numbers, as well as claims involving allegations of prohibited personnel practices mixed with USERRA complaints. The demonstration project began on August 9, 2011. In its report, GAO made two recommendations. First, GAO recommended that DOL and OSC find ways to increase the response rate for the statutorily required customer service survey (CSS). The survey was designed to measure Federal-sector USERRA claimants' overall satisfaction with the handling of their USERRA cases. The recommendation further suggested that the agencies conduct a non-response analysis to account for any response bias in the survey data. DOL and OSC entered into a contract with the U.S. Office of Personnel Management (OPM) to conduct the CSS. In response to GAO's recommendation, DOL staff contacted OPM to discuss options for increasing the response rate for the CSS as well as conducting an analysis of the characteristics of the cases of claimants who did not respond to the survey, to determine whether there is a non-response bias. In addition, VETS and OPM discussed the feasibility of advising claimants of the CSS at the outset of the investigation, possibly by including additional language in the opening case letter to claimants. Because the survey methodology must be consistent across the two agencies, any such notification by DOL would need to be mirrored by OSC. DOL will coordinate with OSC to pursue both the non-response analysis and methods to encourage an increased response rate. 
GAO's second recommendation was that DOL and OSC establish procedures to ensure accurate and reliable cost data. To address that recommendation, VETS will initiate internal audits on a quarterly basis that will compile and use demonstration project time and cost data to generate the "Time & Cost Accounting Master Report" through that quarter (similar to the report generated through May 5, 2012, at GAO's request). Each quarter's report will be reviewed by VETS' USERRA management team as well as each region's Senior Investigator(s) to look for any inconsistent or questionable data elements. Any identified data issues will be addressed, corrected, and reported each quarter, as necessary. VETS believes that, moving forward, all cost data submitted in connection with the demonstration will be accurate and correct. DOL again appreciates the opportunity to provide its views on the subject draft GAO report, and looks forward to implementing GAO's recommendations in the manner detailed above. Sincerely, Signed by: Ismail Ortiz, Jr.: Deputy Assistant Secretary for Veterans' Employment and Training: [End of Enclosure III] Enclosure IV: GAO Contact and Staff Acknowledgments: GAO Contact: Yvonne D. Jones, (202) 512-2717 or jonesy@gao.gov: Staff Acknowledgments: In addition to the contact listed above, key contributors to this report were Trina V. Lewis, Assistant Director; Jason Vassilicos, Analyst-in-Charge; Gerard Burke; Dean Campbell; Karin Fangman; Shannon Finnegan; Sharon Miller; Melanie Papasian; Cynthia Saunders; and Gregory Wilmoth. [End of Enclosure IV] Related GAO Products: Veterans' Reemployment Rights: Steps Needed to Ensure Reliability of DOL and Special Counsel Demonstration Project's Performance Information. [hyperlink, http://www.gao.gov/products/GAO-11-312R]. Washington, D.C.: June 10, 2011. Servicemember Reemployment: Agencies Are Generally Timely in Processing Redress Complaints, but Improvements Needed in Maintaining Data and Reporting. [hyperlink, http://www.gao.gov/products/GAO-11-55]. Washington, D.C.: October 22, 2010. Military Personnel: Improvements Needed to Increase Effectiveness of DOD's Programs to Promote Positive Working Relationships between Reservists and Their Employers. [hyperlink, http://www.gao.gov/products/GAO-08-981R]. Washington, D.C.: August 15, 2008. DOD Financial Management: Adjudication of Butterbaugh Claims for the Restoration of Annual Leave or Pay. [hyperlink, http://www.gao.gov/products/GAO-08-948R]. Washington, D.C.: July 28, 2008. Military Personnel: Federal Agencies Have Taken Actions to Address Servicemembers' Employment Rights, but a Single Entity Needs to Maintain Visibility to Improve Focus on Overall Program Results. [hyperlink, http://www.gao.gov/products/GAO-08-254T]. Washington, D.C.: November 8, 2007. Military Personnel: Considerations Related to Extending Demonstration Project on Servicemembers' Employment Rights Claims. [hyperlink, http://www.gao.gov/products/GAO-08-229T]. Washington, D.C.: October 31, 2007. Military Personnel: Improved Quality Controls Needed over Servicemembers' Employment Rights Claims at DOL. [hyperlink, http://www.gao.gov/products/GAO-07-907]. Washington, D.C.: July 20, 2007. Office of Special Counsel Needs to Follow Structured Life Cycle Management Practices for Its Case Tracking System. [hyperlink, http://www.gao.gov/products/GAO-07-318R]. Washington, D.C.: February 16, 2007. Military Personnel: Additional Actions Needed to Improve Oversight of Reserve Employment Issues. 
[hyperlink, http://www.gao.gov/products/GAO-07-259]. Washington, D.C.: February 8, 2007. Military Personnel: Federal Management of Servicemember Employment Rights Can Be Further Improved. [hyperlink, http://www.gao.gov/products/GAO-06-60]. Washington, D.C.: October 19, 2005. U.S. Office of Special Counsel's Role in Enforcing Law to Protect Reemployment Rights of Veterans and Reservists in Federal Employment. [hyperlink, http://www.gao.gov/products/GAO-05-74R]. Washington, D.C.: October 6, 2004. [End of section] FOOTNOTES [1] Pub. L. No. 103-353, 108 Stat. 3149 (Oct. 13, 1994) (codified at 38 U.S.C. §§ 4301-4335). USERRA is the most recent in a series of laws protecting veterans' employment and reemployment rights going back to the Selective Training and Service Act of 1940. Pub. L. No. 783, 54 Stat. 885, 890 (Sept. 16, 1940). [2] In addition to those serving in the armed forces and the Army and Air National Guards (when engaged in active duty for training, inactive duty training, or full-time National Guard duty), USERRA covers the commissioned corps of the Public Health Service and other persons designated by the President in time of war or national emergency. [3] OSC is an independent investigative and prosecutorial agency with the primary mission of protecting the employment rights of federal employees and applicants for federal employment. [4] Pub. L. No. 108-454, § 204, 118 Stat. 3598, 3606-08 (Dec. 10, 2004). Under VBIA, the demonstration project was originally scheduled to end on September 30, 2007, but through a series of extensions ran through December 31, 2007. [5] See GAO, Military Personnel: Improved Quality Controls Needed over Servicemembers' Employment Rights Claims at DOL, [hyperlink, http://www.gao.gov/products/GAO-07-907] (Washington, D.C.: July 20, 2007). [6] Pub. L. No. 111-275, § 105, 124 Stat. 2864, 2868-70 (Oct. 13, 2010). [7] DOL is authorized to investigate and seek corrective action for those claims filed against federal executive agencies if the servicemember's Social Security number (SSN) ends in an even number, and OSC is authorized to investigate and seek corrective action for USERRA claims against federal executive agencies if the servicemember's SSN ends in an odd number. If a claim does not contain an SSN, VETS will assign a claim number based on the date of the month the claim is received. For example, claims filed on an odd-numbered date will be assigned an odd case number and forwarded to OSC; claims filed on an even-numbered date will be assigned an even case number and be investigated by VETS. Also, under the demonstration project, OSC is authorized to handle any "mixed claims" in which a claimant files a USERRA claim against a federal executive agency and also brings a related prohibited personnel practice claim. There are 12 prohibited personnel practices, including discrimination, retaliation, or unauthorized preference or improper advantage. 5 U.S.C. § 2302. (A brief illustrative sketch of this routing rule appears after these footnotes.) [8] See GAO, Veterans' Reemployment Rights: Steps Needed to Ensure Reliability of DOL and Special Counsel Demonstration Project's Performance Information, [hyperlink, http://www.gao.gov/products/GAO-11-312R] (Washington, D.C.: June 10, 2011). [9] One OSC claimant who responded to the customer satisfaction survey did not respond to this question. 
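For illustration only, the following minimal sketch (in Python) shows the case-routing rule described in footnote 7. The function name, parameter names, and return labels are assumptions introduced here for clarity; they do not represent either agency's actual case tracking system:

    # Hypothetical sketch of the demonstration project's claim-routing
    # rule described in footnote 7; all names are illustrative only.
    def route_claim(ssn_last_digit=None, day_received=None, mixed_claim=False):
        # Mixed claims (a USERRA claim combined with a related prohibited
        # personnel practice claim) go to OSC regardless of parity.
        if mixed_claim:
            return "OSC"
        # If the claim has no SSN, the day of the month on which the claim
        # was received sets the parity of the assigned case number.
        parity_digit = ssn_last_digit if ssn_last_digit is not None else day_received
        return "OSC" if parity_digit % 2 == 1 else "VETS"

    # Examples: an SSN ending in 4 routes to DOL's VETS; a claim without
    # an SSN received on the 13th of the month routes to OSC.
    assert route_claim(ssn_last_digit=4) == "VETS"
    assert route_claim(day_received=13) == "OSC"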
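Similarly, the nonresponse analysis recommended for the customer satisfaction survey could, under one simple approach, compare observable case characteristics of survey respondents and nonrespondents. The sketch below is a hypothetical illustration only; the column names and sample values are assumptions, not actual survey or case tracking data, and it is not the procedure DOL, OSC, or OPM have adopted:

    import pandas as pd

    # Hypothetical case records; column names and values are invented
    # solely to illustrate a basic nonresponse analysis.
    cases = pd.DataFrame({
        "agency": ["DOL", "OSC", "DOL", "OSC", "DOL", "OSC"],
        "days_to_close": [45, 60, 120, 30, 90, 150],
        "responded_to_survey": [True, True, False, False, False, True],
    })

    # Overall survey response rate.
    print(f"Response rate: {cases['responded_to_survey'].mean():.0%}")

    # Compare respondents and nonrespondents on observable characteristics;
    # large differences would suggest potential nonresponse bias that the
    # agencies would need to account for when interpreting survey results.
    print(cases.groupby("responded_to_survey")["days_to_close"].mean())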
[End of section] GAO's Mission: The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability. Obtaining Copies of GAO Reports and Testimony: The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's website [hyperlink, http://www.gao.gov]. Each weekday afternoon, GAO posts on its website newly released reports, testimony, and correspondence. To have GAO e-mail you a list of newly posted products, go to [hyperlink, http://www.gao.gov] and select "E-mail Updates." Order by Phone: The price of each GAO publication reflects GAO's actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO's website, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information. Connect with GAO: Connect with GAO on Facebook, Flickr, Twitter, and YouTube. Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts. Visit GAO on the web at [hyperlink, http://www.gao.gov]. To Report Fraud, Waste, and Abuse in Federal Programs: Contact: Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]; E-mail: fraudnet@gao.gov; Automated answering system: (800) 424-5454 or (202) 512-7470. Congressional Relations: Katherine Siggerud, Managing Director, siggerudk@gao.gov, (202) 512-4400, U.S. Government Accountability Office, 441 G Street NW, Room 7125, Washington, DC 20548. Public Affairs: Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800, U.S. Government Accountability Office, 441 G Street NW, Room 7149, Washington, DC 20548. [End of document]