NRS NPRM Language requesting approval |
Language from sections 462.11, 462.12, 462.13, and 462.14 (an illustrative reliability computation, keyed to §462.11(g), follows the quoted sections).
§462.11 What must an application contain?

(a) Application content and format. In order for the Secretary to determine whether a standardized test is suitable for measuring the gains of participants in an adult education program required to report under the NRS, a test publisher must--
(1) Include with its application information listed in paragraphs (b) through (i) of this section, and, if applicable, the information listed in paragraph (j) of this section;
(2) Provide evidence that it holds a registered copyright of a test or is licensed by the copyright holder to sell or distribute a test.
(3)(i) Arrange the information in its application in the order it is presented in paragraphs (b) through (j) of this section; or
(ii) Include a table of contents in its application that identifies the location of the information required in paragraphs (b) through (j) of this section.
(4) Submit to the Secretary three copies of its application.

(b) General information. (1) A statement, in the technical manual for the test, of the intended purpose of the test and how the test will allow examinees to demonstrate the skills that are associated with the NRS educational functioning levels in §462.44.
(2) The name, address, e-mail address, and telephone and fax numbers of a contact person to whom the Secretary may address inquiries.
(3) A summary of the precise editions, forms, levels, and, if applicable, sub-tests and abbreviated tests that the test publisher is requesting that the Secretary review and determine to be suitable for use in the NRS.

(c) Development. Documentation of how the test was developed, including a description of--
(1) The nature of samples of examinees administered the test during pilot or field testing, such as--
(i) The number of examinees administered each item;
(ii) How similar the sample or samples of examinees used to develop and evaluate the test were to the adult education population of interest to the NRS; and
(iii) The steps, if any, taken to ensure that the examinees were motivated while responding to the test; and
(2) The steps taken to ensure the quality of test items or tasks, such as--
(i) The extent to which items or tasks on the test were reviewed for fairness and sensitivity; and
(ii) The extent to which items or tasks on the test were screened for the adequacy of their psychometric properties.
(3) The procedures used to assign items to--
(i) Forms, for tests that are constructed prior to being administered to examinees; or
(ii) Examinees, for adaptive tests in which items are selected in real time.

(d) Maintenance. Documentation of how the test is maintained, including a description of--
(1) How frequently, if ever, new forms of the test are developed;
(2) The steps taken to ensure the comparability of scores across forms of the test;
(3) The steps taken to maintain the security of the test;
(4) A history of the test’s use, including the number of times the test has been administered; and
(5) For a computerized adaptive test, the procedures used to--
(i) Select subsets of items for administration;
(ii) Determine the starting point and termination conditions;
(iii) Score the test; and
(iv) Control for item exposure.

(e) Match of content to the NRS educational functioning levels (content validity). Documentation of the extent to which the items or tasks on the test cover the skills in the NRS educational functioning levels in §462.44, including--
(1) Whether the items or tasks on the test require the types and levels of skills used to describe the NRS educational functioning levels;
(2) Whether the items or tasks measure skills that are not associated with the NRS educational functioning levels;
(3) Whether aspects of a particular NRS educational functioning level are not covered by any of the items or tasks;
(4) The procedures used to establish the content validity of the test;
(5) The number of subject-matter experts who provided judgments linking the items or tasks to the NRS educational functioning levels and their qualifications for doing so, particularly their familiarity with adult education and the NRS educational functioning levels; and
(6) The extent to which the judgments of the subject matter experts agree.

(f) Match of scores to NRS educational functioning levels. Documentation of the adequacy of the procedure used to translate the performance of an examinee on a particular test to an estimate of the examinee’s standing with respect to the NRS educational functioning levels in §462.44, including--
(1) The standard-setting procedures used to establish cut scores for transforming raw or scale scores on the test into estimates of an examinee’s NRS educational functioning level;
(2) If judgment-based procedures were used--
(i) The number of subject-matter experts who provided judgments, and their qualifications; and
(ii) Evidence of the extent to which the judgments of subject-matter experts agree;
(3) The standard error of each cut score, and how it was established; and
(4) The extent to which the cut scores might be expected to differ if they had been established by a different (though similar) panel of experts.

(g) Reliability. Documentation of the degree of consistency in performance across different forms of the test in the absence of any external interventions, including--
(1) The correlation between raw (or scale) scores across alternate forms of the test or, in the case of computerized adaptive tests, across alternate administrations of the test;
(2) The consistency with which examinees are classified into the same NRS educational functioning levels across forms of the test. Information regarding classification consistency should be reported for each NRS educational functioning level that the test is being considered for use in measuring;
(3) The adequacy of the research design leading to the estimates of the reliability of the test, including--
(i) The size of the sample(s);
(ii) The similarity between the sample(s) used in the data collection and the adult education population; and
(iii) The steps taken to ensure the motivation of the examinees; and
(4) Any other information explaining the methodology and procedures used to measure the reliability of the test.

(h) Construct validity. Documentation of the appropriateness of a given test for measuring educational gain for the NRS, i.e., documentation that the test measures what it is intended to measure, including--
(1) The extent to which the raw or scale scores and the educational functioning classifications associated with the test correlate (or agree) with scores or classifications associated with other tests designed or intended to assess educational gain in the same adult education population as the NRS;
(2) The extent to which the raw or scale scores are related to other relevant variables, such as teacher evaluation, hours of instruction, or other measures that may be related to test performance;
(3) The adequacy of the research designs associated with these sources of evidence (see paragraph (g)(3) of this section); and
(4) Other evidence demonstrating that the test measures gains in educational functioning resulting from adult education and not from other construct-irrelevant variables, such as practice effects.

(i) Other information. (1) A description of the manner in which test administration time was determined, and an analysis of the speededness of the test.
(2) Additional guidance on the interpretation of scores resulting from any modifications of the tests for an individual with a disability.
(3) The manual provided to test administrators containing procedures and instructions for test security and administration.
(4) A description of the training or certification required of test administrators and scorers by the test publisher.
(5) A description of retesting (e.g., re-administration of a test because of problems in the original administration such as the test taker becomes ill during the test and cannot finish, there are external interruptions during testing, or there are administration errors) procedures and the analysis upon which the criteria for retesting are based.
(6) Such other evidence as the Secretary may determine is necessary to establish the test’s compliance with the criteria and requirements the Secretary uses to determine the suitability of tests as provided in §462.13.

(j) Previous tests. (1) For a test used to measure educational gain in the NRS before the effective date of these regulations that is submitted to the Secretary for review under this part, the test publisher must provide documentation of periodic review of the content and specifications of the test to ensure that the test continues to reflect NRS educational functioning levels.
(2) For a test first published five years or more before the date it is submitted to the Secretary for review under this part, the test publisher must provide documentation of periodic review of the content and specifications of the test to ensure that the test continues to reflect NRS educational functioning levels.
(3) For a test that has not changed in the seven years since the Secretary determined, under §462.13, that it was suitable for use in the NRS that is again being submitted to the Secretary for review under this part, the test publisher must provide updated data supporting the validity of the test for use in classifying adult learners with respect to the NRS educational functioning levels and the measurement of educational gain as defined in §462.43 of this part.
(4) If a test has been substantially revised--for example by changing its structure, number of items, content specifications, item types, or sub-tests--from the most recent edition reviewed by the Secretary under this part, the test publisher must provide an analysis of the revisions, including the reasons for the revisions, the implications of the revisions for the comparability of scores on the current test to scores on the previous test, and results from validity, reliability, and equating or standard-setting studies undertaken subsequent to the revisions.

(Approved by the Office of Management and Budget under control number 1830- )

(Authority: 20 U.S.C. 9212)
§462.12 What procedures does the Secretary use to review the suitability of tests?

(a) Review. (1) When the Secretary receives a complete application from a test publisher, the Secretary selects experts in the field of educational testing and assessment who possess appropriate advanced degrees and experience in test development or psychometric research, or both, to advise the Secretary on the extent to which a test meets the criteria and requirements in §462.13.
(2) The Secretary reviews and determines the suitability of a test only if an application--
(i) Is submitted by a test publisher;
(ii) Meets the deadline established by the Secretary;
(iii) Includes a test that--
(A) Has two or more secure, parallel, equated forms of the same test--either traditional paper and pencil or computer-administered instruments--for which forms are constructed prior to administration to examinees; or
(B) Is an adaptive test that uses computerized algorithms for selecting and administering items in real time; however, for such an instrument, the size of the item pool and the method of item selection must ensure negligible overlap in items across pre- and post-testing;
(iv) Includes a test that samples one or more of the major content domains of the NRS educational functioning levels of ABE, ESL, or ASE with sufficient numbers of questions to represent adequately the domain or domains; and
(v) Includes the information prescribed by the Secretary, including the information in §462.11 of this part.

(b) Secretary’s determination. (1) The Secretary determines whether a test meets the criteria and requirements in §462.13 after taking into account the advice of the experts described in paragraph (a)(1) of this section.
(2) For tests that contain multiple sub-tests measuring content domains other than those of the NRS educational functioning levels, the Secretary determines the suitability of only those sub-tests covering the domains of the NRS educational functioning levels.

(c) Suitable tests. If the Secretary determines that a test satisfies the criteria and requirements in §462.13 and, therefore, is suitable for use in the NRS, the Secretary--
(1) Notifies the test publisher of the Secretary’s decision; and
(2) Annually publishes in the Federal Register and posts on the Internet at www.nrsweb.org a list of the names of tests and the educational functioning levels the tests are suitable to measure in the NRS. A copy of the list is also available from the U.S. Department of Education, Office of Vocational and Adult Education, Division of Adult Education and Literacy, 400 Maryland Avenue, SW., room 11159, Potomac Center Plaza, Washington, DC 20202-7240.

(d) Unsuitable tests. (1) If the Secretary determines that a test does not satisfy the criteria and requirements in §462.13 and, therefore, is not suitable for use in the NRS, the Secretary notifies the test publisher of the Secretary’s decision and of the reasons why the test does not meet those criteria and requirements.
(2) Within 30 days after the Secretary notifies a test publisher that its test is not suitable for use in the NRS, the test publisher may request that the Secretary reconsider the Secretary’s decision. This request must be accompanied by--
(i) An analysis of why the information and documentation submitted meet the criteria and requirements in §462.13, notwithstanding the Secretary’s earlier decision to the contrary; and
(ii) Any additional documentation and information that address the Secretary’s reasons for determining that the test was unsuitable.
(3) The Secretary reviews the additional information submitted by the test publisher and makes a final determination regarding the suitability of the test for use in the NRS.
(i) If the Secretary’s decision is unchanged and the test remains unsuitable for use in the NRS, the Secretary notifies the test publisher, and this action concludes the review process.
(ii) If the Secretary’s decision changes and the test is determined to be suitable for use in the NRS, the Secretary follows the procedures in paragraph (c) of this section.

(e) Revocation. (1) The Secretary’s determination regarding the suitability of a test may be revoked if the Secretary determines that--
(i) The information the publisher submitted as a basis for the Secretary’s review of the test was inaccurate; or
(ii) A test has been substantially revised--for example, by changing its structure, number of items, content specifications, item types, or sub-tests.
(2) The Secretary notifies the test publisher of the--
(i) Secretary’s decision to revoke the determination that the test is suitable for use in the NRS; and
(ii) Reasons for the Secretary’s revocation.
(3) Within 30 days after the Secretary notifies a test publisher of the decision to revoke a determination that a test is suitable for use in the NRS, the test publisher may request that the Secretary reconsider the decision. This request must be accompanied by documentation and information that address the Secretary’s reasons for revoking the determination that the test is suitable for use in the NRS.
(4) The Secretary reviews the information submitted by the test publisher and makes a final determination regarding the suitability of the test for use in the NRS.
(5) If the Secretary revokes the determination regarding the suitability of a test, the Secretary publishes in the Federal Register and posts on the Internet at www.nrsweb.org a notice of that revocation along with the date by which States and local eligible providers must stop using the revoked test. A copy of the notice of revocation is also available from the U.S. Department of Education, Office of Vocational and Adult Education, Division of Adult Education and Literacy, 400 Maryland Avenue, SW., room 11159, Potomac Center Plaza, Washington, DC 20202-7240.

(Approved by the Office of Management and Budget under control number 1830- )

(Authority: 20 U.S.C. 9212)
§462.13 What criteria and requirements does the Secretary use for determining the suitability of tests?

In order for the Secretary to consider a test suitable for use in the NRS, the test or the test publisher, if applicable, must meet the following criteria and requirements:

(a) The test must measure the NRS educational functioning levels of members of the adult education population.

(b) The test must sample one or more of the major content domains of the NRS educational functioning levels of ABE, ESL, or ASE with sufficient numbers of questions to adequately represent the domain or domains.

(c)(1) The test must meet all applicable and feasible standards for test construction and validity provided in the 1999 edition of the Standards for Educational and Psychological Testing, prepared by the Joint Committee on Standards for Educational and Psychological Testing of the American Educational Research Association, the American Psychological Association, and the National Council on Measurement in Education incorporated by reference in this section. The Director of the Federal Register approves this incorporation by reference in accordance with 5 U.S.C. 552(a) and 1 CFR part 51. You may obtain a copy from the American Psychological Association, Inc., 750 First Street, NE., Washington, DC 20002. You may inspect a copy at the Department of Education, room 11159, 550 12th Street, SW., Washington, DC 20202 or at the National Archives and Records Administration (NARA). For information on the availability of this material at NARA, call (202) 741-6030, or go to: http://www.archives.gov/federal_register/code_of_federal_regulations/ibr_locations.html.
(2) If requested by the Secretary, a test publisher must explain why it believes that certain standards in the 1999 edition of the Standards for Educational and Psychological Testing were not applicable or were not feasible to meet.

(d) The test must contain the publisher’s guidelines for retesting, including time between test-taking, which are accompanied by appropriate justification.

(e) The test must--
(1) Have two or more secure, parallel, equated forms of the same test--either traditional paper and pencil or computer administered instruments--for which forms are constructed prior to administration to examinees; or
(2) Be an adaptive test that uses computerized algorithms for selecting and administering items in real time; however, for such an instrument, the size of the item pool and the method of item selection must ensure negligible overlap in items across pre- and post-testing. Scores associated with these alternate administrations must be equivalent in meaning.

(f) For a test that has been modified for individuals with disabilities, the test publisher must--
(1) Provide documentation that it followed the guidelines provided in the Testing Individuals With Disabilities section of the 1999 edition of the Standards for Educational and Psychological Testing;
(2) Provide documentation of the appropriateness and feasibility of the modifications relevant to test performance; and
(3)(i) Recommend educational functioning levels based on the information obtained from adult education students who participated in the pilot or field test and who have the disability for which the test has been modified; and
(ii) Provide documentation of the adequacy of the procedures used to translate the performance of adult education students with the disability for whom the test has been modified to an estimate of the examinees’ standing with respect to the NRS educational functioning levels.

(Approved by the Office of Management and Budget under control number 1830- )

(Authority: 20 U.S.C. 9212)
§462.14 How often and under what circumstances must a test be reviewed by the Secretary?

(a) The Secretary’s determination that a test is suitable for use in the NRS is in effect for a period of seven years from the date of the Secretary’s written notification to the test publisher, unless otherwise indicated by the Secretary. After that time, if the test publisher wants the test to be used in the NRS, the test must be reviewed again by the Secretary so that the Secretary can determine whether the test continues to be suitable for use in the NRS.

(b) If a test that the Secretary has determined is suitable for use in the NRS is substantially revised--for example, by changing its structure, number of items, content specifications, item types, or sub-tests--and the test publisher wants the test to continue to be used in the NRS, the test publisher must submit, as provided in §462.11(j)(4), the substantially revised test or version of the test to the Secretary for review so that the Secretary can determine whether the test continues to be suitable for use in the NRS.

(Approved by the Office of Management and Budget under control number 1830- )

(Authority: 20 U.S.C. 9212)
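The reliability evidence described in §462.11(g) centers on two statistics: the correlation of scores across alternate forms and the consistency with which the forms classify examinees into the same NRS educational functioning level. The sketch below is illustrative only and is not part of the regulatory language; the sample size, score distributions, and cut scores are hypothetical placeholders rather than actual NRS values, and an actual submission would use the publisher's own pilot or field-test data.

```python
# Minimal sketch of the reliability statistics described in Sec. 462.11(g).
# All data here are simulated placeholders; the cut scores are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scale scores for the same 200 examinees on two parallel forms.
true_ability = rng.normal(500, 50, size=200)
form_a = true_ability + rng.normal(0, 15, size=200)
form_b = true_ability + rng.normal(0, 15, size=200)

# (g)(1) Alternate-forms reliability: correlation of scores across forms.
alternate_forms_r = np.corrcoef(form_a, form_b)[0, 1]

# (g)(2) Classification consistency: proportion of examinees assigned to the
# same educational functioning level by both forms, given level cut scores.
cut_scores = [450, 500, 550]            # hypothetical level boundaries
levels_a = np.digitize(form_a, cut_scores)
levels_b = np.digitize(form_b, cut_scores)
classification_consistency = np.mean(levels_a == levels_b)

print(f"Alternate-forms correlation: {alternate_forms_r:.2f}")
print(f"Classification consistency:  {classification_consistency:.2f}")
```

In an actual application, figures of this kind would be reported for each form pair and for each educational functioning level the test is intended to measure, as §462.11(g)(2) requires.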
Expected # of respondents | 50

Frequency of the Response |
Once a test is determined suitable, that determination remains in effect for seven years.
Test publishers whose tests are approved would resubmit an application every seven years to renew that approval.
New test publishers may also seek approval from the Department on an annual basis.
Response time |
40 hrs per response. This includes time to review instructions, search for existing data sources, gather and organize the data needed, complete and review the application as described in section 462.11, and respond to questions the Secretary may raise.
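As a rough illustration of total burden under these figures, and assuming each of the 50 respondents submits one application per seven-year approval cycle: 50 respondents × 40 hours = 2,000 burden hours per cycle, or approximately 286 hours on an annualized basis.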