U.S. Department of Labor
Employment and Training Administration

O*NET® Data Collection Program
Office of Management and Budget Clearance Package Supporting Statement
Part B: Statistical Methods

with Table of Contents, List of Exhibits, and References

February 15, 2012
Table of Contents

A. Justification
   A.1  Circumstances of Information Collection
        A.1.1  What Is the O*NET Program?
        A.1.2  The O*NET Data Collection Approach
        A.1.3  Summary of the O*NET Data Collection Process
        A.1.4  Summary of Response Rate Experience to Date
        A.1.5  Statutory and Regulatory Information
        A.1.6  Federal Register Notice
   A.2  Uses, Products, and Services Based on the O*NET Program
        A.2.1  The O*NET Database, O*NET OnLine, My Next Move, O*NET Career Tools, O*NET Training Academy, and O*NET Code Connector
        A.2.2  O*NET Web Site Statistics
        A.2.3  Examples of O*NET Data and Products in Use
        A.2.4  Examples of the O*NET Program in Published Literature
        A.2.5  Looking to the Future with Competency Models
   A.3  Uses of Information Technology
        A.3.1  Web Questionnaires
        A.3.2  Project Web Site
        A.3.3  The Case Management System and Data Collection Utilities
        A.3.4  Additional Uses of the Internet for Data Collection
   A.4  Efforts to Identify Duplication
   A.5  Efforts to Minimize Burden on Small Establishments
   A.6  Consequences of Collecting the Information Less Frequently
   A.7  Special Circumstances
   A.8  Consultation Outside the Agency
   A.9  Payments or Gifts to Respondents
        A.9.1  Incentives for the Point of Contact and the Employer
        A.9.2  Incentives for the Employee
        A.9.3  Incentives for Occupation Experts
   A.10 Assurance of Confidentiality
   A.11 Questions of a Sensitive Nature
   A.12 Estimates of Annualized Hour Burden
   A.13 Annual Reporting Burden Cost
   A.14 Estimates of Annualized Cost to Government
   A.15 Reasons for Program Changes or Adjustments Reported in Sections A.13 and A.14
   A.16 Time Schedule, Publication, and Analysis Plans
        A.16.1 Data Analysis Tasks Conducted for Each Cycle
        A.16.2 Creation of the Occupation Database
   A.17 Display of Expiration Date
   A.18 Exceptions to Certification Statement

B. Collection of Information Employing Statistical Methods
   B.1  Sampling Universe, Sampling Methods, and Expected Response Rates
        B.1.1  Establishment Method
        B.1.2  Occupation Expert Method
   B.2  Procedures for the Collection of Information
        B.2.1  Establishment Method
        B.2.2  Occupation Expert Method
   B.3  Methods to Maximize Response Rates
   B.4  Tests of Procedures
        B.4.1  1999 Pretest Experiments
        B.4.2  Wave 1.1 Experiment
        B.4.3  Point-of-Contact Incentive Experiment
        B.4.4  Experiments in Weight Trimming
        B.4.5  Experiments in Model-Aided Sampling
        B.4.6  Alternative Measures of Uncertainty
        B.4.7  Suppression of Estimates with Poor Precision
        B.4.8  Dual-Frame Sampling for Hard-to-Find Occupations
        B.4.9  Alternative Levels of Population Coverage
        B.4.10 Adaptive Total Design
        B.4.11 Analysis of Unit and Item Nonresponse
        B.4.12 Additional Tests of Procedures
   B.5  Statistical Consultants

C. References
List of Exhibits

1.  O*NET Content Model
2.  O*NET Data Collection Program Questionnaires
3.  Establishment Method Data Collection Results
4.  Occupation Expert Method Data Collection Results
5.  O*NET Citations in Code of Federal Regulations
6.  Database Updates
7.  Main Organization Types Submitting O*NET Certifications
8.  O*NET Product Downloads
9.  Distribution of Frame and Sample Establishments by Employment Size
10. Estimate of Hour and Cost Burden by Year
11. Comparison of Hour and Cost Burden Between 2009–2011 and June 2012–May 2015
12. Data Analysis and Publication Schedule
13. Summary of Sample Selection Process
14. Half-Width of 95% Confidence Intervals
15. Classification of Establishments by Occupation Model-Aided Sampling Status
16. Overlap of Full and Supplemental Frames
17. Establishment Method Data Collection Flowchart
18. Occupation Expert Method Data Collection Flowchart
19. Statistical Consultants
B. Collection of Information Employing Statistical Methods

B.1 Sampling Universe, Sampling Methods, and Expected Response Rates
A multiple-method data collection approach for creating and updating the Occupational
Information Network (O*NET) database has been developed to maximize the information for
each occupation while minimizing data collection costs. The primary source of information for
the database is a survey of establishments and sampled workers from within selected
establishments, which is referred to as the Establishment Method of data collection. Under this two-stage sample design, employees are sampled in their workplaces: establishments are selected in the first stage, and employees are selected in the second stage.
Although the Establishment Method provides the best approach for most occupations, a
special frame (e.g., a professional association membership list) is sometimes used to supplement
the Establishment Method in a dual-frame approach when additional observations are required.
When this supplementation to the Establishment Method is used, a dual-frame adjustment is
made to the sampling weights to account for the coverage overlap between the two sources of
collected data.
A second method entails recruitment of appropriate occupation experts who can supply
the information required for an occupation (Occupation Expert Method, or OE Method). An
occupation expert is someone who has worked in the occupation for at least one year and has 5
years of experience as an incumbent, trainer, or supervisor. Additionally, an occupation expert
must have had experience with the occupation within the most recent 6 months. The OE Method
is used for occupations as necessary to improve sampling efficiency and avoid excessive respondent burden, as when it is difficult to locate industries or establishments with occupation incumbents;
employment is low; or employment data are not available, as is the case for many new and
emerging occupations.
B.1.1 Establishment Method
Establishment Method Sampling Universe
The central goal of the O*NET Data Collection Program is to provide data for each of the
O*NET Standard Occupational Classification (SOC) occupations, which are prevalent to varying
degrees in different industries in the United States. Estimates from this program are designed to
assist users in distinguishing among occupations and are not necessarily designed to capture all
of the subtle differences between jobs in different industries. Nonetheless, the O*NET sampling
universe for each occupation is generally a subset of all employees in the occupation who are
working in the United States. This subset, or target population for the occupation, is defined by
two criteria: (1) its workers represent a majority of job incumbents in the occupation, and
(2) data among this set of establishments can be gathered with reasonable efficiency.
Previous O*NET experience has shown that trying to build a sampling frame that covers
100% of an occupation is inefficient and poses undue burden for some establishments. For
example, the occupation-by-industry matrix data suggested that a very small number of
bricklayers could be found in establishments in the hospital industry; however, asking a point of
contact (POC) in a hospital about bricklayers led to some difficulties. Besides being unduly burdensome, such questioning often cost the Business Liaison (BL) credibility when a POC was asked about occupations not likely to be associated with his or her establishment, as with
bricklayers in hospitals. Moreover, the establishment POC may give some false negative
responses because the POC simply does not know whether some rare occupations exist in his or
her establishment. This situation would be particularly likely for larger establishments. To
address these concerns, the target population is defined so that it includes establishments in
industries and size categories in which the occupation is most prevalent.
When less than complete population coverage is allowed, it is possible that some bias
may be introduced into the study estimates if the covered and noncovered population members
would give substantially different responses to the survey questions. To evaluate this potential
bias in the O*NET estimates, an investigation was conducted that considered 18 randomly
selected occupations for which at least 80% population coverage had been achieved. The
linkages of these 18 occupations to industries were then reconsidered, and reduced sets of
industries were determined that covered only 50% of workers in each occupation. Estimates for a
selected set of outcomes were then computed from the reduced data set, which simulated
estimates at the 50% level of population coverage. When the original data with at least 80%
coverage were compared with the reduced data with 50% coverage, no systematic differences in
the estimates were observed. The great majority of the differences between the two sets of
estimates were very small and symmetrically distributed around zero. The observed pattern could
be explained by random sampling error alone and provided no evidence of bias due to reduced
frame coverage. The investigation concluded that no systematic bias is introduced with the use of
a population coverage minimum of 50% for each occupation compared with a minimum
coverage of 80%. On the basis of these results, O*NET Establishment Method data collection
now maintains a population coverage of at least 50% for each occupation.
Sampling Waves
To help identify industries in which particular occupations are employed, the O*NET
sampling method uses employment statistics published by the U.S. Bureau of Labor Statistics
(BLS) and supplemented by empirical information learned during O*NET data collection.
Groups of approximately 50 occupations each, called primary waves, are formed so that the
occupations in a primary wave are employed in a similar set of industries. For example,
carpenters and plumbers are both employed by establishments in construction-related industries.
Thus, when establishments are selected from the industries associated with a primary wave of
occupations, a selected establishment is much more likely to employ one or more of the
occupations in the wave than it would have been to employ one or more occupations not grouped
this way. This method minimizes the number of establishments that must be contacted for
selection of the required number of employees for an occupation. For example, when
construction trades, such as those of carpenters and plumbers, are grouped together in a primary
wave of occupations, it is much more likely that an establishment selected from construction-related industries will employ at least one of the 50 related occupations in the wave than would
be the case if sampling had been from a broader set of industries associated with a group of
unrelated occupations.
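The document does not prescribe the algorithm used to form these groups, so the following Python sketch is only one plausible illustration of the idea: occupations whose industry employment profiles (e.g., occupation-by-industry counts from BLS employment statistics) are most similar are greedily grouped into waves of about 50. The function name, the cosine-similarity criterion, and the greedy strategy are assumptions, not the program's actual procedure.

```python
import numpy as np

def group_into_waves(emp_matrix, occ_ids, wave_size=50):
    """Greedily group occupations with similar industry employment
    profiles into primary waves of roughly wave_size occupations.

    emp_matrix: (n_occupations x n_industries) array of estimated
    employment counts, e.g., drawn from the BLS OES survey.
    """
    # Normalize each occupation's industry profile to unit length so
    # cosine similarity compares industry mix rather than total size.
    norms = np.linalg.norm(emp_matrix, axis=1, keepdims=True)
    profiles = emp_matrix / np.where(norms == 0, 1, norms)

    remaining = list(range(len(occ_ids)))
    waves = []
    while remaining:
        seed = remaining.pop(0)              # start a wave from the next occupation
        sims = profiles[remaining] @ profiles[seed]
        # Take the (wave_size - 1) most similar remaining occupations.
        order = np.argsort(-sims)[: wave_size - 1]
        members = [seed] + [remaining[i] for i in order]
        remaining = [r for r in remaining if r not in members]
        waves.append([occ_ids[i] for i in members])
    return waves
```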
Each primary wave of occupations is scheduled to be fielded in three subwaves of
establishment samples. The subwaves are identified as X.1, X.2, and X.3, where X represents the
set of primary occupations and where the accompanying number represents the order in which
the subwaves of establishment samples occur. For example, Subwave 3.1 denotes the first
sample of establishments for the occupation set known as Wave 3, and 3.3 denotes the third
sample of establishments for the occupation set. Any occupation that requires additional
respondents is included in the next subwave. The first subwave of establishments uses the
Occupational Employment Statistics (OES) data to indicate those industries most likely to
employ the occupations. It is designed to include a wide range of industries and to cover at least
50% of the target population. As each subwave establishment sample is selected, the experience
gained from the previous subwaves is used to more effectively target the sample to industries in
which the occupations have been demonstrated to be found.
If, after being fielded in its X.3 subwave, an occupation lacks a sufficient number of
completed respondents, then it is fielded in a completion wave. Completion waves combine the
difficult-to-complete occupations from several waves and are designed to target industries with a
high probability of employing the occupations. The goal of a completion wave is to ensure that
the number of establishments selected for each occupation is sufficient to complete all
occupations in the wave. Statistically, a completion wave is no different from the X.1, X.2, and
X.3 subwave sampling process, with the same sampling, weighting, and estimation methods
being used to conduct the completion wave. Essentially, a completion wave adds a fourth
subwave of sampling for some difficult-to-complete occupations. Packaging together some of
these occupations in a combined wave maintains operational efficiency.
Sampling steps are carried out for the primary occupations associated with a wave. The
primary occupations are those occupations selected for a wave as a result of the clustering of
occupations likely to be employed in the same industries. Once the sets of industries to be
targeted in a wave are identified, additional secondary occupations likely to be found in these
industries are added to the wave and allocated to the selected establishments. To improve
efficiency, if a selected establishment employs fewer than the maximum number of allowed
primary occupations, secondary occupations are included for that establishment.
The method described above for sampling waves yields two major benefits. First, by
prescribing sampling from industries that have been determined by empirical data to employ the
occupation of interest, it maximizes the efficiency of the sample. Second, it minimizes the
oversampling of any one occupation. Because the establishment sample size for a particular set
of occupations is spread over three subwaves, if an occupation is found more easily than
expected, the sample for future subwaves can be used to find other occupations in the wave
rather than to continue searching for this particular occupation.
To minimize both the cost of conducting the O*NET Data Collection Program and the
burden placed on the establishments, the number of employees selected into the sample and the
number of returned questionnaires are monitored carefully on a daily basis. Once it becomes
clear that at least 15 of the goal of 20 completed respondents will be available for each of the
three domain questionnaires for an occupation, consideration is given to terminating further
sampling of employees for that occupation. This step is taken because of the difficulty of
estimating the rate at which employees will be encountered during employee sampling. In some
cases, employees from an occupation are much easier to locate than anticipated, and the desired
number of responding employees is quickly exceeded; continuing to sample employees for such
an occupation would use resources inefficiently and burden the establishments unnecessarily.
The method used to control employee sample selection is called Model-Aided Sampling
(MAS). With this method, target numbers of sampled employees are defined, before data
collection begins for each occupation, by Census region, business size, and industry division.
MAS ensures that the resulting sample of employees is distributed across the target cells
approximately in proportion to the population distribution across the target cells. MAS sample
size targets are based on information from the OES survey conducted by BLS and from
establishment information provided by Dun & Bradstreet (D&B). Once data collection begins,
daily progress toward the targets is monitored closely for each occupation. Once a cell’s MAS
target is achieved, the selection of employees in that cell is stopped. This cessation allows future
data collection subwaves to focus on the cells that have not yet reached the MAS targets.
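As a concrete illustration of this cell-level monitoring, the minimal Python sketch below tracks employee selections against prespecified MAS targets and closes a cell once its target is met. The class name, cell keys, and target values are hypothetical; the actual system's implementation details are not described in this document.

```python
from collections import Counter

class MASMonitor:
    """Minimal sketch of Model-Aided Sampling (MAS) cell monitoring.

    Target cells are (Census region, business size, industry division)
    combinations; targets are set before data collection begins, in
    proportion to the estimated population distribution (OES/D&B based).
    """

    def __init__(self, targets):
        self.targets = targets        # {(region, size, industry): target n}
        self.counts = Counter()       # employees sampled so far, per cell

    def cell_open(self, cell):
        """Employees may be selected in a cell only until its target is met."""
        return self.counts[cell] < self.targets.get(cell, 0)

    def record_selection(self, cell):
        self.counts[cell] += 1

# Hypothetical usage: selection in a cell stops once its target is reached.
mas = MASMonitor({("South", "10-49", "Construction"): 6})
cell = ("South", "10-49", "Construction")
while mas.cell_open(cell):
    mas.record_selection(cell)
print(mas.counts[cell])  # 6: further selections in this cell are skipped
```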
The use of MAS to enhance the efficiency of the O*NET sample is considered one of the
most important sample design innovations in the history of the program. This approach has
dramatically reduced data collection costs, with minimal effects on the accuracy of the
estimators. Development of the MAS methodology began in early 2004 and continued through
the end of 2006. As part of this research effort, various sample cutoff or stopping rules were
investigated by means of Monte Carlo simulation. A cutoff rule determines the point at which
efforts to interview certain types of establishments and employees are discontinued because the
prespecified cell sample size criteria have been satisfied. These studies showed that, under a
fairly wide range of cutoff rules, MAS does not bias the O*NET estimates or their standard
errors. At the same time, MAS dramatically reduces the number of establishment contacts
required to satisfy the random sample allocation for an occupation. This finding has resulted in
substantial reductions in respondent burden, data collection costs, and time required to complete
an occupation (Berzofsky, Welch, Williams, & Biemer, 2006).
Sampling Steps for the Primary Occupations Associated with a Wave
As mentioned, the Establishment Method involves multiple sample selection steps.
Establishments are selected during the first steps of selection, and employees are selected during
the later steps. This sample selection process is diagrammed in Exhibit 13 and is further detailed
in the text that follows.
Step 1: Create establishment sampling frame. Two major sources of information are
used to create the establishment sampling frame. First, a list covering nearly 12 million
establishments in the United States is constructed from the Dun & Bradstreet list of U.S.
establishment locations. Use of the D&B frame is based on an evaluation that showed it to be the
least costly data source that had the essential establishment-level information, including industry
type and location-specific employment. The frame is updated quarterly to ensure that the most
current and accurate establishment information possible is used for selecting the sample for each
subwave. Additional information from the OES survey, conducted by BLS, is merged with the
D&B list of establishments. From the combined file a matrix is created for each occupation; the
matrix lists the industries in which the occupation is found and, for each industry, the number of
employees and associated establishments by Census region and by each of four size categories.26
[Footnote 26: The establishment size category has four levels based on the number of employees working at an establishment: unknown number of employees (none) or 1–9 employees; 10–49 employees; 50–249 employees; and 250 or more employees.]
Exhibit 13. Summary of Sample Selection Process

Step 1: Create Establishment Sampling Frame.
Step 2: Determine Industries to Target for Each Occupation in Subwave.
Step 3: Select Initial Sample of Establishments from Frame.
Step 4: Select Final Set of Establishments for Subwave.
Step 5: Assign Occupations to Selected Establishments.
Step 6: Specify Employee Sample Selection Algorithm for Each Occupation/Establishment.
Step 7: Specify Occupation-Specific Algorithm for Random Assignment of Questionnaire Types to Sampled Employees.
Step 8: Randomly Assign Selected Establishments to Business Liaisons for Data Collection.
Step 2: Determine industries to target for each occupation in subwave. Using the matrix
developed in Step 1, the industry–region–employee-size categories for each occupation are
classified into one of four concentration groups: high, medium, low, or none. This classification
helps to target industries in which an occupation is likely to be found. For example, if it is
believed that establishments in a particular industry or industries have a good chance of
containing a particular occupation, then the industry is classified as high. Similarly, industries
with a low chance of having an occupation are classified as low. To increase efficiency in the
sampling process, the sample is designed to oversample the high industries and undersample the
low industries for each occupation. None denotes those industries that were not expected to
contain any employees in the occupation of interest or that had a negligible proportion of the
employees.
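The document does not give the numeric cutoffs that separate the concentration groups, so the sketch below uses illustrative thresholds on an occupation's share of cell employment purely to show the shape of the classification; the threshold values are assumptions.

```python
def concentration_group(occ_employment, total_employment,
                        high=0.02, medium=0.005, low=1e-4):
    """Classify an industry-region-size cell for one occupation into
    high/medium/low/none. Threshold values are illustrative only; the
    actual cutoffs used by the program are not stated in this document.
    """
    if total_employment == 0 or occ_employment == 0:
        return "none"
    share = occ_employment / total_employment
    if share >= high:
        return "high"
    if share >= medium:
        return "medium"
    if share >= low:
        return "low"
    return "none"
```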
Step 3: Select initial sample of establishments from frame. First, for each industry, the
number of establishments to be selected is determined. However, because only limited access to
some D&B list data fields is possible before sample selection of establishments, a stratified
simple random sample of establishments is first selected from the list, with a sample size larger
than the number that will ultimately be contacted. For this large simple random sample of
establishments, the O*NET team has access to the full D&B establishment information that will
be used in Step 4 of selection. Within each stratum, the selection probability for the ith selected
establishment is
$p_{1i} = n_1 / N_1$,   (1)

where $n_1$ and $N_1$ are the number of establishments selected and the number of establishments in the population, respectively. The associated sampling weight for this step is

$w_{1i} = 1 / p_{1i}$.   (2)
Step 4: Select final set of establishments for subwave. A subsample of establishments is
selected with probability proportionate to a composite size measure (Folsom, Potter, & Williams,
1987) from the simple random sample of establishments selected in Step 3. If a single occupation
were being studied, then common sampling practice would be to select a sample of
establishments with probabilities proportional to the number of employees working in the
occupation at each establishment. Because several occupations are to be sampled from the
selected establishments, a cross-occupation composite size measure is used to lessen the
variation in selection probabilities and in the final analysis weights. The composite size measure
accounts for the estimated number of employees in the occupations of interest within an industry
group, as well as for the overall sampling rates. The composite size measure for the ith establishment selected in Step 3 is

$S_i = w_{1i} \sum_j f_j M_{ij}$,   (3)

where the summation is over the occupations (j), $f_j = m_j / M_j$ is the overall sampling fraction for the jth occupation, $m_j$ is the desired national sample size of employees from the jth occupation, $M_j = \sum_i w_{1i} M_{ij}$ is the estimated total number of employees in the jth occupation on the frame, and $M_{ij}$ is the estimated number of employees in the jth occupation from the ith establishment.
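Equation (3) translates directly into array arithmetic. The following sketch assumes the frame quantities are available as NumPy arrays; the variable names and the illustrative inputs are assumptions made for the example.

```python
import numpy as np

def composite_size_measures(w1, M, m_desired):
    """Composite size measure S_i = w1_i * sum_j f_j * M_ij (Equation 3).

    w1        : (n_estab,) Step 3 sampling weights w_1i
    M         : (n_estab, n_occ) estimated employees M_ij per establishment/occupation
    m_desired : (n_occ,) desired national employee sample sizes m_j
    """
    # Estimated frame totals per occupation: M_j = sum_i w1_i * M_ij
    M_j = (w1[:, None] * M).sum(axis=0)
    f = m_desired / M_j                  # overall sampling fractions f_j = m_j / M_j
    return w1 * (M * f).sum(axis=1)      # S_i for every establishment

# Illustrative call with three establishments and two occupations:
w1 = np.array([10.0, 10.0, 20.0])
M = np.array([[5, 0], [2, 3], [0, 8]])
print(composite_size_measures(w1, M, m_desired=np.array([40.0, 40.0])))
```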
For each occupation, the sampling rate is generally greatest for those establishments
classified in the high industry group for the occupation and successively lower for each of the
remaining medium, low, and none industry groups. Once the composite size measure is
calculated, all nongovernmental industries are sorted in descending order by industry so that
similar industries are proximate to each other before they are split into four equal groups based
on the composite size measure. These four groups form the nongovernmental industry strata.
Governmental establishments are not classified into separate industries on the D&B frame.
Consequently, all governmental establishments constitute a fifth industry stratum in the design.
In addition, some occupations are highly concentrated in only one or two industries.
These industries are split from the previously defined industry strata groupings to form separate
"special" industry strata. These special strata often prove valuable for particular occupations because otherwise a small number of establishments would be selected from these industries. Forming the special strata for occupation-valuable industries with comparatively small size measures ensures that a minimum number of establishments from the "special" industry strata are selected into the final sample. Establishments are further stratified by number of employees in the establishment and by Census region. Establishments with a large number of employees are oversampled, while establishments with few employees are undersampled. The degree of over- or undersampling varies for each subwave.
Chromy’s (1979) probability minimum replacement (PMR) selection procedure is then
used to select a probability-proportional-to-size sample of establishments within each stratum,
with probabilities of selection
$p_{2i} = n_2 S_i / S$,   (4)

where $n_2$ is the number of establishments selected from the stratum and $S$ is the sum of the composite size measures for establishments from the stratum. The sampling weight associated with this step is

$w_{2i} = 1 / p_{2i}$.   (5)
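Chromy's PMR procedure itself is sequential and more involved than can be shown briefly. The sketch below substitutes a plain systematic probability-proportional-to-size selection that satisfies the same expected-hit property implied by Equation (4); it is a simplified stand-in, not Chromy's algorithm.

```python
import numpy as np

def pps_systematic(S, n2, rng):
    """Select n2 establishments within a stratum with probability
    proportional to size (Equation 4): p_2i = n2 * S_i / sum(S).

    A plain systematic PPS sketch; Chromy's (1979) probability minimum
    replacement procedure adds sequential controls not reproduced here.
    """
    S = np.asarray(S, dtype=float)
    expected = n2 * S / S.sum()          # expected hits per establishment
    cum = np.cumsum(expected)
    start = rng.uniform(0, 1)
    points = start + np.arange(n2)       # one selection point per unit interval
    idx = np.searchsorted(cum, points)   # establishment hit by each point
    return idx                           # indices (repeats if expected > 1)

rng = np.random.default_rng(7)
hits = pps_systematic([4.0, 1.0, 2.0, 1.0], n2=4, rng=rng)
# The weight for each selected establishment is w_2i = 1 / p_2i (Equation 5).
```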
Step 5: Assign occupations to selected establishments. To limit the burden for a
particular establishment, each establishment selected in Step 4 is assigned a maximum of 10
occupations randomly selected with probability proportional to size. Here the size measure is the
product of the sampling rate for the occupation (fj) and the establishment’s estimated number of
employees within the occupation, Mij. Before selection of a sample of the occupations,
occupations certain to be selected because of their large sizes are included in the sample and
removed from the frame, and the number of times they would have been "hit" (which, by the
PMR method, can exceed 1) is recorded. Then the remaining (noncertainty) occupations are
sampled and placed in a random order. The certainty occupations are listed first, followed by the
randomly ordered noncertainty units. For each establishment, both the set of up to 10 occupations and the number of times each occupation was selected (which could be greater than 1 for certainty occupations) are entered into the Case Management System (CMS).
As before, Chromy’s (1979) PMR selection method is used to select occupations with
probability proportional to size. To understand how his method is applied here, one may suppose
the ith establishment has Ji occupations associated with it. The size measure for the jth occupation
is defined as
$O_{ij} = f_j M_{ij}$,

so that

$O_i = \sum_j O_{ij} = \sum_j f_j M_{ij} = S_i / w_{1i}$.   (6)

A sample of up to 10 occupations for each establishment will be selected, with the expected number of times that the jth occupation is selected being $10 O_{ij} / O_i$, which may be greater than 1 for some occupations. For an occupation j where it is greater than 1, the occupation is selected with certainty and assigned an O*NET point value equal to $v_{ij}$ by randomly rounding $10 O_{ij} / O_i$ to one of its two adjacent integers. That is,

$v_{ij} = \mathrm{Int}(10 O_{ij} / O_i)$ with probability $1 - \mathrm{Frac}(10 O_{ij} / O_i)$, and
$v_{ij} = \mathrm{Int}(10 O_{ij} / O_i) + 1$ with probability $\mathrm{Frac}(10 O_{ij} / O_i)$,   (7)
where Int and Frac are the integer and fractional parts of a decimal number. This rounding
provides an integer number of selections associated with each selected establishment while
retaining the correct noninteger expected value for the number of selections. The certainty
occupations appear at the top of the list used by BLs to inquire about occupations at the
establishment. From among the remaining occupations, a probability-proportional-to-size sample
is selected. If $C_i$ is the number of certainty-occupation sampling hits from the ith establishment, $C_i = \sum_j v_{ij}$ (summation over the certainty occupations), then the remaining occupations are selected with probabilities $(10 - C_i) O_{ij} / \sum_j O_{ij}$ (summation over noncertainty occupations). The selected occupations are assigned an O*NET point value $v_{ij} = 1$. As noted previously, the
selected noncertainty occupations are then placed in a random order and follow the certainty
occupations on the list of occupations used by the BLs.
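The randomized rounding in Equation (7) preserves the expected number of selections while producing integer point values. A minimal sketch, with an illustrative expected-hit value:

```python
import math, random

def randomized_round(x, rng=random):
    """Randomly round x to an adjacent integer, preserving its expected
    value (Equation 7): Int(x) with probability 1 - Frac(x), else Int(x) + 1.
    """
    base = math.floor(x)
    frac = x - base
    return base + (1 if rng.random() < frac else 0)

# E.g., an occupation with expected hits 10*O_ij/O_i = 2.3 receives an
# O*NET point value of 2 with probability 0.7 and 3 with probability 0.3.
print(randomized_round(2.3))
```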
When a POC is identified within each establishment, the RTI BL reviews the list of
occupations with the POC, asking the POC to estimate the number of employees at that
establishment in each occupation. Each time the BL receives a response that is greater than zero,
a counter within the CMS is increased by the associated O*NET point value, the randomly
rounded number of times the occupation was selected. If the counter reaches 5 before the BL has
completed the list of occupations, the BL stops. After the maximum 5 occupations are identified,
the POC is asked to roster all individuals in the selected occupations.
To determine the final occupation selection probabilities, one must adjust for the
occupations remaining on the establishment’s sampling list at the point where the BL stopped as
a result of having found the maximum number of occupations to be included in data collection. It
is assumed that the resulting sample of up to 5 occupations is a random sample of the originally
selected occupations for an establishment. This assumption is supported by the random ordering
of the noncertainty occupations. Let ai be the total number of sampling hits among all of the
occupations about which the BL inquired before stopping; then $a_i = \sum_j v_{ij}$, summation over the occupations inquired about by the BL. The final selection probability for the jth occupation from the ith establishment is

$p_{3ij} = \frac{a_i}{10} \cdot \frac{10 O_{ij}}{O_i} = \frac{a_i O_{ij}}{O_i}$.   (8)

The associated sampling weight is

$w_{3ij} = 1 / p_{3ij}$.   (9)
This method accomplishes two important goals:
- It results in an approximate random sample of occupations with known probabilities of selection.
- It limits POC and establishment burden. This goal is achieved because the number of positive POC responses is limited to a maximum of 5. If the company is large and happens to have employees in all 10 occupations, then stopping after 5 occupations minimizes the perceived burden on the POC, as opposed to the alternative of asking for employment estimates for all 10 occupations and then subselecting 5.
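A minimal sketch of the inquiry-and-stop logic and the resulting Equation (8) probability follows; the data structures are hypothetical stand-ins for the CMS records, not the production implementation.

```python
def inquire_until_five(occupation_list, present_at_establishment):
    """Walk the ordered occupation list (certainty occupations first,
    then the randomly ordered noncertainty occupations), adding each
    positive POC response's point value to a counter and stopping once
    the counter reaches 5.

    occupation_list: [(occupation_id, point_value v_ij), ...]
    present_at_establishment: set of occupation_ids the POC confirms.
    Returns the selected occupations and a_i, the total sampling hits
    among all occupations inquired about before stopping.
    """
    selected, counter, a_i = [], 0, 0
    for occ, v in occupation_list:
        a_i += v                          # hits among occupations asked about
        if occ in present_at_establishment:
            selected.append(occ)
            counter += v
        if counter >= 5:
            break
    return selected, a_i

# Final selection probability (Equation 8): p_3ij = a_i * O_ij / O_i,
# with analysis weight w_3ij = 1 / p_3ij (Equation 9).
```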
Step 6: Specify employee sample selection algorithm for each occupation/
establishment. In this step of selection, the algorithm is specified for randomly selecting
employees from an employee roster provided by the POC. The resulting number of employees
selected from each occupation is proportional to the number of times the occupation was selected
in Step 5, its O*NET point value. However, to further minimize burden on an establishment, the
total number of employees selected within any single establishment never exceeds 20, and the
total number of employees selected in each occupation (within each establishment) never
exceeds 8. If fewer than 20 employees are rostered and fewer than 8 are rostered for each
occupation, then all rostered employees are selected. Otherwise, a random sample of employees
is selected, subject to the constraints just described. If $n_{ij}$ and $N_{ij}$ are the number of employees selected and the number of employees listed, respectively, from the jth occupation at the ith establishment, then the selection probability for an employee k from the jth occupation is

$p_{4ijk} = n_{ij} / N_{ij}$,   (10)

and the associated sampling weight is

$w_{4ijk} = 1 / p_{4ijk}$.   (11)
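One plausible reading of this allocation rule is sketched below. The exact way the program apportions the 20-employee establishment cap across occupations is not spelled out here, so the proportional-rounding step is an assumption made for illustration.

```python
import random

def select_employees(rosters, point_values, max_total=20, max_per_occ=8,
                     rng=random):
    """Subsample rostered employees for one establishment (Step 6).

    rosters      : {occupation: [employee ids listed by the POC]}
    point_values : {occupation: O*NET point value from Step 5}
    Allocation is proportional to point values, capped at 8 employees
    per occupation and 20 per establishment; a simplified sketch only.
    """
    total_points = sum(point_values.values())
    selected = {}
    budget = max_total
    for occ, roster in rosters.items():
        share = round(max_total * point_values[occ] / total_points)
        n = min(len(roster), max_per_occ, max(1, share), budget)
        selected[occ] = rng.sample(roster, n)   # SRS within the occupation
        budget -= n
    # Per-employee selection probability: p_4ijk = n_ij / N_ij (Equation 10);
    # sampling weight w_4ijk = 1 / p_4ijk (Equation 11).
    return selected
```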
Step 7: Specify occupation-specific algorithm for random assignment of questionnaire
types to sample employees. In this step, the algorithm is specified for assigning each selected
employee to a domain questionnaire type. The survey is designed to collect data for each
occupation from 20 respondents to each of three different domain questionnaires (Generalized
Work Activities, Work Context, and Knowledge). At this step of selection, all employees
selected in Step 6 are randomly assigned to one of the three questionnaire types. The
questionnaire assignments are made in proportion to the number of employee respondents
required for each questionnaire type in a subwave. To implement this algorithm, a queue is
established that lists the order in which questionnaire types are assigned to employees from a
specific occupation. The questionnaire types are listed in random order, with each type occurring
in the queue at a rate proportional to the number of completed questionnaires required from that
type. When an employee is selected, he or she is assigned the questionnaire type at the head of
the queue. The next listed questionnaire type is then moved to the top of the queue for use with
the next selected employee.
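The queue mechanism can be sketched in a few lines. The domain labels below stand for the three questionnaire types named above, and the per-type requirement of 20 is the stated design goal; the function name is illustrative.

```python
import random

def build_questionnaire_queue(required, rng=random):
    """Build the randomized assignment queue for one occupation (Step 7).

    required: completed questionnaires needed per domain type, e.g.
    {"GWA": 20, "Work Context": 20, "Knowledge": 20}. Each type appears
    in the queue in proportion to its requirement, in random order;
    sampled employees take types from the head of the queue.
    """
    queue = [qtype for qtype, n in required.items() for _ in range(n)]
    rng.shuffle(queue)
    return queue

queue = build_questionnaire_queue({"GWA": 20, "Work Context": 20, "Knowledge": 20})
next_employee_type = queue.pop(0)   # assigned to the next selected employee
```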
Although employees are randomly assigned to a questionnaire, analysis weights are
computed for each occupation, not separately for each of the three O*NET questionnaires used
within each occupation. Many of the questions are the same across the three questionnaires; the
estimates for these questions are produced from all the respondents for the occupation. Selected
incumbents are assigned randomly, with equal probabilities, to each of the three questionnaires,
with approximately the same number of incumbents responding to each of the questionnaires.
Consequently, bias is not introduced for estimates of means and proportions specific to a single
questionnaire. Producing the analysis weights for the entire occupation effects greater stability
because the entire occupation offers a larger available sample size than each of the three
individual questionnaires offers.
Step 8: Randomly assign selected establishments to Business Liaisons for data
collection. The final step of sample selection randomly assigns selected establishments to a BL.
To make this process more efficient and ensure that the BL workloads are equal, establishments
are stratified by industry grouping, and assignments are made within strata. To do so, the
Operations Center manager assigns up to three BLs to each stratum and indicates the number of
establishments to be assigned to each BL. The specified number of establishments is then
assigned to each BL. The only exceptions to the randomization process occur when multiple
establishments are sampled from the same company, in which case all establishments are assigned to
the same BL, or when observed trends inform decisions about matching specific BLs to
industries for future case assignments. Although establishments are randomly assigned to BLs,
this step does not enter into the overall selection probabilities or analysis weights: it is an
administrative step to reduce any potential bias associated with the BLs.
The various weighting components associated with many of the Establishment Method
sampling steps are combined to produce the final analysis weights, as will be described in the
subsection on weighting.
Supplemental Frames
If the sample yield for an occupation proves to be far less than expected from the
Establishment Method, the use of a special frame is considered for completing an occupation
when additional Establishment Method subwaves would likely be nonproductive or inefficient.
In this situation, if a suitable supplemental frame can be obtained, then the occupation is
removed from the Establishment Method wave and sampled separately. Supplemental frames
were used for 2.4% of the O*NET-SOC occupations published to date. A supplemental frame
may be either a listing of establishments highly likely to employ a particular occupation, or a
listing of incumbents working in an occupation. For example, a list from a trade association of business establishments highly likely to employ an occupation would be appropriate when the occupation
is highly concentrated in a particular type of establishment. In addition, some occupations’
workers tend to be members of associations or are licensed. In such situations, it is often possible
to obtain a frame listing either establishments or incumbents working in the occupation.
When a listing of establishments is obtained, a random sample of the establishments is
selected from the frame. The sample usually will be stratified by geographic location and any
other available establishment information that may be related to type of incumbents working in
the occupation. Simple random samples are usually selected, because often little information is
known about the number of incumbents the establishments employ. Consequently, if n and N are
the number of establishments selected and the number on the frame, respectively, within a
stratum, then the selection probability for the ith selected establishment is
$p_{5i} = n / N$,   (12)

and the associated weight is

$w_{5i} = 1 / p_{5i}$.   (13)
The selected establishments are then included in Step 6 of the sampling process, with the single
target occupation associated with each establishment.
On the other hand, when the supplemental frame lists incumbents working in an
occupation, then a simple random sample of incumbents is selected, usually stratified by
geographic location and, if available, subspecialty of the occupation. If n and N are the number
of incumbents selected and the number on the frame, respectively, within a stratum, then the
selection probability for the kth selected incumbent working in the jth occupation from the ith
establishment is27

$p_{6ijk} = n / N$,   (14)

and the associated weight is

$w_{6ijk} = 1 / p_{6ijk}$.   (15)

[Footnote 27: Subscripts corresponding to occupation and establishment are added here for ease of notation when the supplemental samples are combined with the Establishment Method samples in the subsection on weighting.]
The selected incumbents are then directly contacted, and their participation is solicited. Those
who agree to participate are entered into sampling process Step 7, by which a questionnaire type is assigned to each incumbent.
The supplemental frame weights are adjusted for nonresponse and combined with the
Establishment Method weights, as will be described in the subsection on weighting.
Employee Sample Size
A key issue in sample design is the level of precision required in the resulting data and
the cost of producing a particular level of precision, in terms of both dollars and respondent
burden. The O*NET sample design has been developed to provide results with a level of
precision that should be adequate to meet the needs of general-purpose users (those seeking
information at the occupational level). Consistent with the procedures used by the O*NET
Program since 2001, an occupation is considered complete and ready for inclusion in the final
O*NET database when at least 15 valid completed questionnaires (after data cleaning) have been
obtained for each of the three questionnaire domains.
The current sample size goal is based on the final technical report of Peterson et al.
(1997), which presents means and standard deviations for both 5- and 7-point Likert scales, with
consecutive integer scores, for the descriptors within Skills, Knowledge, Generalized Work
Activities, Abilities, and Work Styles. Statistics were computed separately with the reported data
for each of six occupations. The data in these tables indicate that when 15 responses per
descriptor are obtained, the mean values for virtually all of the 5-point and the 7-point
descriptors have 95% confidence intervals (CIs) that are no wider than plus or minus 1 to 1.5
scale points for all occupations.
Exhibit 14 displays the half-width of 95% CIs for means of 5- and 7-point scales asked of
all respondents, by sample size, from Analysis Cycles 9 through 12 for all incumbent
occupations. The items are summarized in Exhibit 2 as those with a data source of job
incumbents and are presented as part of the questionnaires in Appendix A. The scales were given
consecutive integer scores, and estimates were produced as described in the "Estimation"
subsection of B.1.1. Across all sample sizes, nearly all of the scale means have 95% CIs that are
no wider than plus or minus 1.5 scale points. For those scale means based on sample sizes of
between 15 and 25 respondents, more than 95% of the 5-point scales and more than 90% of the
7-point scales have 95% CIs no wider than plus or minus 1.5 scale points. In addition, 90% of
the 7-point scales have 95% CIs no wider than plus or minus 1.4 scale points.
Furthermore, Mumford, Peterson, and Childs (1997) have cited Fleishman and Mumford
(1991) as support that variation of 1 to 1.5 scale points on a 7-point scale "is typical of that found for well-developed level scales." Setting a minimum employee sample size of 15 (with many
occupations achieving a larger sample size) therefore will generally satisfy this requirement.
Additionally, Peterson and colleagues (2001) state that 15–30 incumbents typically provide
sufficient interrater reliability for describing occupations, given the types of measures the
O*NET Program uses to describe occupations.
Exhibit 14. Half-Width of 95% Confidence Intervals

                 5-Point Scales                  7-Point Scales
Percentile   Sample Sizes   All Sample     Sample Sizes   All Sample
             of 15 to 25    Sizes          of 15 to 25    Sizes
95th         ±1.0           ±0.9           ±1.6           ±1.5
90th         ±0.9           ±0.8           ±1.4           ±1.3
75th         ±0.7           ±0.6           ±1.2           ±1.0
50th         ±0.5           ±0.5           ±0.9           ±0.8

Note: Data are taken from 135 5-point scales and 74 7-point scales measured on each of 322 occupations.
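For intuition about these magnitudes, the half-width of a 95% CI for a simple random sample mean is z·sd/√n. The sketch below ignores the survey's complex design (the program's variance estimation reflects weights and clustering) and uses an assumed item standard deviation of 1.5 scale points purely as an illustration, not a figure from Exhibit 14.

```python
import math

def ci_half_width(sd, n, z=1.96):
    """Half-width of a 95% CI for a simple random sample mean: z * sd / sqrt(n)."""
    return z * sd / math.sqrt(n)

# With an illustrative standard deviation of 1.5 scale points (an
# assumption) and the 15-respondent minimum:
print(round(ci_half_width(1.5, 15), 2))   # about 0.76 scale points
```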
Weighting
After the raw data are edited and cleaned, weights are constructed for each establishment
and employee respondent to reduce estimate bias and variance due to factors such as
nonresponse, undercoverage, and the complex sample design. The weighting process for the
basic Establishment Method is described first. Subsequently, weighting for the supplemental-frame samples is described, together with weighting methods for combining the Establishment
Method and supplemental-frame samples.
Estimates generated from O*NET survey data are computed with analysis weights to
reflect the combined effects of the following:
- probabilities of establishment selection;
- probabilities of occupation selection;
- early termination of employee sampling activities for particular occupations, because of higher-than-expected yields;
- probabilities of employee selection;
- multiple-sample adjustments;
- nonresponse at both the establishment and the employee levels; and
- under- and overcoverage of the population, caused by frame omissions and undetected duplications.
The starting point for each of these stages is the inverse of the probabilities of selection at
each stage (establishment, occupation, and employee)—called the base sampling weight for the
stage. The base sampling weight accounts for the unequal probabilities with which
establishments, occupations, and employees are selected at each stage and is presented in

- Equation (2), $w_{1i}$, associated with the initial simple random sample of establishments from the D&B frame;
- Equation (5), $w_{2i}$, associated with the probability-proportional-to-size sample of establishments from the initial simple random sample;
- Equation (8), $w_{3ij}$, associated with the selection of an occupation at an establishment; and
- Equation (11), $w_{4ijk}$, associated with the selection of an employee from an occupation at an establishment.28
[Footnote 28: As noted in conjunction with Establishment Method sampling Step 7, analysis weights are computed for each occupation, not separately for each of the three O*NET questionnaires used within each occupation. Many of the questions are the same across the three questionnaires; the estimates for these questions are produced with use of all the respondents for the occupation. Selected employees are assigned randomly, with equal probabilities, to each of the three questionnaires, with approximately the same number of employees responding to each of the questionnaires. Consequently, bias is not introduced for estimates of means and proportions specific to a single questionnaire. Producing the analysis weights for the entire occupation effects greater stability because the entire occupation offers a larger available sample size than each of the three individual questionnaires offers.]
The product of these four weights would be the appropriate analysis weight if effects due
to such issues as nonresponse and under- or overcoverage were negligible; however, weight
adjustments likely will improve the accuracy of the estimates. The weight adjustments are
implemented in three weighting steps corresponding to the three main steps of Establishment
Method sampling:
- Weighting Step 1, by which establishment weights are computed to account for the probabilities of selecting establishments and to account for adjustments for establishment nonresponse;
- Weighting Step 2, by which occupation weights are computed to account for the probabilities of selecting specific occupations from each establishment and to account for adjustments for the early termination of sampling for some occupations under MAS; and
- Weighting Step 3, by which employee analysis weights are computed to account for the probabilities of selecting employees within each occupation and to account for adjustments for employee nonresponse and for overlapping frames across the subwaves.
The weights are calculated separately for each subwave in Weighting Steps 1 and 2, and then
they are combined into an overall analysis weight in Weighting Step 3. The specific methods
used in each of these weighting steps are described here after the unit-nonresponse adjustment
method is described.
Nonresponse adjustment. The sampling weights are adjusted for nonresponse with use of
a generalized exponential model (GEM). RTI has used the GEM method to create sampling
weight adjustments for the 1999 through 2010 annual National Household Survey on Drug Use
and Health conducted for the Substance Abuse and Mental Health Services Administration and
for several other surveys conducted by RTI, including the 2000 National Postsecondary Student
Aid Study and the 2000 Beginning Postsecondary Student Longitudinal Study, both sponsored
by the U.S. Department of Education.
The GEM calibration is a generalization of the well-known weighting class approach, the
iterative proportional fitting algorithm that is generally used for poststratification adjustments,
Deville and Särndal’s (1992) logit method, and Folsom and Witt’s (1994) constrained logistic
and exponential modeling approach. The GEM calibration process causes the weighted
distribution of the respondents to match specified distributions simultaneously for all of the
variables included in the model. One advantage of the GEM method over simpler weighting
class or poststratification adjustments is that the adjustment model can use a larger and more
diverse set of control variables because main effects and lower-order interactions can be used in
the model, rather than complete cross-classifications. Folsom and Singh (2000) described the
GEM method in a paper presented to the American Statistical Association.
To summarize, a set of predictor, or adjustment, variables is specified, together with the
control total for each variable to which the weighted sample is expected to match. The GEM
method is designed to determine a weight adjustment factor for each respondent, such that
$\sum_k x_k w_k a_k = T_x$,

where the summation is over the respondents, $x_k$ is an adjustment variable in the model, $w_k$ is the base sampling (or unadjusted) weight, $a_k$ is the adjustment factor, and $T_x$ is the control total for the variable x. $T_x$ may be either a nonresponse adjustment control total estimated by the sum of base sampling weights for both respondents and nonrespondents, or an external control total to adjust for under- or overcoverage of the frame. The adjustment factors, $a_k$, are determined to
match the control totals for all of the variables in the model simultaneously. Furthermore, upper
and lower bounds on the weight adjustment factors can be set to reduce the influence of
observations that otherwise might have received a very large weight adjustment. The upper and
lower bounds also reduce the effect of unequal weighting that may result from uncontrolled
weight adjustments.
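The GEM method's generalized exponential model is more elaborate than can be shown briefly. The sketch below substitutes a simple bounded raking loop to illustrate the calibration idea of matching weighted totals to controls while capping the adjustment factors; it is a stand-in for illustration, not the GEM algorithm.

```python
import numpy as np

def bounded_raking(X, w, totals, lower=0.5, upper=8.0, iters=200):
    """Iterative calibration sketch: find factors a_k so that
    sum_k x_k * w_k * a_k approximately matches each control total,
    with the adjustment factors clipped to [lower, upper].

    X      : (n, p) 0/1 adjustment-variable matrix for respondents
    w      : (n,) base sampling weights
    totals : (p,) control totals T_x
    """
    a = np.ones(len(w))
    for _ in range(iters):
        for j in range(X.shape[1]):
            mask = X[:, j] > 0
            current = (w[mask] * a[mask]).sum()
            if current > 0:
                a[mask] *= totals[j] / current   # rake this margin to its control
        a = np.clip(a, lower, upper)             # bound the adjustment factors
    return a

# The adjusted weight for respondent k is w_k * a_k.
```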
Weighting Step 1: Establishment weights. The base sampling weight, $w_i^{(1)}$, for the selected establishments in a subwave is the product of the weights in Equations (2) and (5):

$w_i^{(1)} = w_{1i} w_{2i}$.   (16)
The establishment sampling weights are adjusted for nonresponse, by subwave, with use of the
GEM method with a model that contains different combinations of the following variables29:
- sampling industry division
- U.S. Census division
- establishment size
- headquarters/branch type
- number of occupations assigned to an establishment
- urban or rural location
- time zone
- zip code information from the 2000 U.S. Census (quartile distribution of owner-occupied housing)

[Footnote 29: Early empirical evidence showed that these characteristics had disproportionate response rates within them.]
Variable selection proceeds by first fitting a model containing only main effects and tightening
the upper and lower bounds so that all upper bounds are less than 8 and a minimal increase in the
unequal weighting effect (UWE) is achieved.30 Two-way interactions among the variables are
then added to the model. Cells that do not contain any respondents or that are collinear with other
cells are removed from the model. If a convergent model cannot be obtained, some covariate
levels are collapsed together; for example, U.S. Census divisions are collapsed to regions. Other
variables or interactions may be removed from the model until a convergent model is obtained
(i.e., a solution is found given all constraints) that maintains as many of the covariates and their
two-way interactions as possible.
Variable selection and testing are conducted for each sampling subwave to determine the
final model for a subwave. Extremely large weights are trimmed back to smaller values. Even
though the GEM method provides control over the size of the adjustment factors, it is still
possible for large weights to result, though at a rate lower than that from an uncontrolled
adjustment process. The total amount trimmed within a subwave is proportionally reallocated
across the responding establishments to maintain the same estimated total number of
establishments for each subwave. The adjusted establishment weights are denoted as $w_i^{(1a)}$.

Weighting Step 2: Occupation weights. The base occupation weight, $w_{ij}^{(2)}$, for the jth occupation selected from the ith establishment is

$w_{ij}^{(2)} = w_{3ij} w_i^{(1a)}$,   (17)

which is the product of $w_{3ij}$, defined by Equation (8), times the adjusted establishment weight for the subwave defined in Weighting Step 1. For most occupations, no further nonresponse
adjustments are necessary, because once an establishment agrees to participate, all of its selected
occupations are available. However, MAS is used to terminate incumbent sampling early for
some occupations with higher-than-expected numbers of sampled incumbents. For such
occupations, the rate at which an occupation is encountered is estimated from the establishments
contacted before the early termination; the estimated rate is then used to predict the number of
additional establishments that would have reported employing the occupation. The occupation
weights for the establishments that complete employee sampling are then adjusted to account for
the predicted additional establishments. To understand the adjustment for early termination of
sampling for some occupations, consider the classification of establishments shown in
Exhibit 15.
30 An upper bound of 8 equates to a response rate of 12.5% and is based on early empirical evidence within potential nonresponse characteristics. Note that upper bounds are adjusted to be as small as possible to help minimize changes in weights.
Exhibit 15. Classification of Establishments by Occupation Model-Aided Sampling Status

Group   Description
A       Inquired about occupation, and it is present at establishment
B       Inquired about occupation, but it is not present at establishment
C       Did not inquire about occupation because of early termination of incumbent sampling for occupation
Groups A and B are those establishments where the presence or absence of an occupation
is known and can be used to estimate the rate at which the jth occupation is present, or the
presence rate, by
p_j = \sum_A w_{ij}^{(2)} / ( \sum_A w_{ij}^{(2)} + \sum_B w_{ij}^{(2)} ) ,    (18)
where the summations are over the establishments in Group A or Group B. Next, the additional
number of establishments where the jth occupation would have been found if sampling had not
been terminated early is estimated by applying the presence rate to the number of establishments in Group C. Thus, the estimated total number of establishments where the jth occupation would have been found is given by

T_j = \sum_A w_{ij}^{(2)} + p_j \sum_C w_{ij}^{(2)} ,    (19)
where the summations are over the establishments in Group A or Group C. It is tacitly assumed
in Equation (18) that the establishments where occupations are inquired about approximate a
random sample from all three groups listed in Exhibit 15. This assumption is consistent with the
random assignment of establishments to the BLs and the random order in which establishments
are initially contacted. The base occupation weights for the establishments in Group A are then
adjusted to sum to Tj for the jth occupation.
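A small Python sketch may help make Equations (18) and (19) concrete; the function and argument names are hypothetical, and the inputs are the base occupation weights for the three groups of Exhibit 15.

    import numpy as np

    def early_termination_adjustment(w_A, w_B, w_C):
        # w_A, w_B, w_C: base occupation weights w_ij^(2) for the Group A,
        # B, and C establishments of Exhibit 15 for occupation j.
        w_A, w_B, w_C = (np.asarray(v, dtype=float) for v in (w_A, w_B, w_C))
        p_j = w_A.sum() / (w_A.sum() + w_B.sum())   # presence rate, Eq. (18)
        T_j = w_A.sum() + p_j * w_C.sum()           # estimated total, Eq. (19)
        # ratio-adjust the Group A weights so they sum to T_j
        return p_j, T_j, w_A * (T_j / w_A.sum())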
To make this adjustment more sensitive, the process for estimating the number of
establishments where the jth occupation would have been found is completed separately by
Census regions, by the business size groups, and by industry divisions—as with the process for
defining the MAS target cells. This process yields three sets of estimated marginal totals
corresponding with the three variables. The GEM method is then used with a model containing
the marginal, or main, effects of Census regions, business size groups, and industry divisions to
adjust the base occupation weights from Equation (17) for those establishments in Group A. The
adjusted weight for the jth occupation from the ith establishment is denoted by w_{ij}^{(2a)}.
Weighting Step 3: Employee analysis weights. The base weights for the responding
employees in a subwave are
w_{ijk}^{(3)} = w_{4ijk} w_{ij}^{(2a)} ,    (20)

which is the product of w_{4ijk}, defined by Equation (11), and the adjusted occupation weight for
the subwave, defined in Weighting Step 2. At this point the responding employees from all
subwaves are combined. The overlap of target
populations among the subwaves for each occupation is determined, and a multiple-frame
adjustment is made, as described by Korn and Graubard (1999), using the sample sizes in the
overlaps. For example, if two subwaves overlap for an occupation within a set of industries, then
the adjustment factors for the subwaves are
a_l = t_l / ( t_1 + t_2 ) ,    (21)

where t_l is the sample size from the lth subwave in the overlap between the subwaves. Then, the multiple-frame adjusted employee weights are

w_{ijk}^{(3*)} = a_l w_{ijk}^{(3)} ,    (22)

where the adjustment factor, a_l, is selected to correspond with the industry overlap portion
associated with the ijkth employee. This adjustment process is completed separately for each
combination of overlapping subwaves.
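As an illustration of Equations (21) and (22), the sketch below scales each overlapping subwave's employee weights by its share of the combined sample size; the function name and the list-of-arrays input are assumptions made for the example, taking t_l as the number of responding employees a subwave contributes to the overlap.

    import numpy as np

    def multiframe_adjust(weights_by_subwave):
        # weights_by_subwave: one array of employee weights w_ijk^(3) per
        # overlapping subwave. Returns the adjusted weights of Eq. (22),
        # scaling each subwave by a_l = t_l / (t_1 + ... + t_L), Eq. (21).
        t = np.array([len(w) for w in weights_by_subwave], dtype=float)
        factors = t / t.sum()
        return [a * np.asarray(w, dtype=float)
                for a, w in zip(factors, weights_by_subwave)]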
Next, the employee weights in Equation (22) are further adjusted for nonresponse, using
the GEM method with a model that contains different combinations of the following variables:
- sampling industry division
- U.S. Census division
- establishment size
- sampling wave
- questionnaire type
- headquarters or branch type
- number of occupations asked about in an establishment
- number of occupations assigned to an establishment
- total number of selected employees in an establishment
- primary or secondary occupation
- whether POC has ever heard of O*NET
- expected sampling yield (high, medium, or low)
- quintile distribution of percentage of occupation within industry
- quintile distribution of percentage of industry within occupation
- urban or rural location
- time zone
- zip code information from the 2000 U.S. Census (quartile distribution of owner-occupied housing)
As before, variable selection and testing are conducted to determine the final model. Indicator
variables for the occupations are included in the final model so that the adjustment maintains the
correct sum of weights for each occupation; at the same time, to improve the adjustment, the data
across occupations are used for the other variables.
At this point, an examination for extreme weights is conducted for each occupation by
domain questionnaire. To prevent a few respondents from being too influential, weight trimming
is performed according to two rules. Weights are deemed too extreme for a particular occupation
by domain group if
- any weight exceeds the mean weight of the group plus 1.5 standard deviations of the weights, or
- a single weight accounts for more than 50% of the weight sum of the group.
Extreme weights are trimmed to the smaller of the two values bulleted above; the total
amount trimmed for an occupation by domain group is proportionally allocated to all
respondents in the domain group. The 50% check on the contribution of a single respondent is
repeated until no single respondent exceeds the proportion limit.
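The two rules and the repeated 50% check can be expressed compactly, as in the following illustrative Python sketch (the function name, the iteration cap, and the use of a simple proportional rescaling for the reallocation are assumptions):

    import numpy as np

    def trim_extreme_weights(w, max_iter=100):
        # Trim to the smaller of (mean + 1.5 SD) and 50% of the group sum,
        # reallocating the trimmed amount proportionally; repeat the 50%
        # check until no single weight exceeds the limit.
        w = np.asarray(w, dtype=float)
        total = w.sum()
        cap = min(w.mean() + 1.5 * w.std(), 0.5 * total)
        w = np.minimum(w, cap)
        w *= total / w.sum()            # proportional reallocation
        for _ in range(max_iter):
            if w.max() <= 0.5 * w.sum():
                break
            w = np.minimum(w, 0.5 * w.sum())
            w *= total / w.sum()
        return w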
The employee weights are then ratio-adjusted to match the distribution of workers in an
occupation by industry sector as estimated from BLS’s OES data. If the OES indicates employment in a specific industry division for an occupation for which the O*NET data lack respondents, the OES total employment for that industry division is proportionally distributed across the industries for which O*NET respondents exist.
Occasionally, this final ratio adjustment will induce a large UWE,31 inflating the
variances of O*NET estimates or producing undesirably large employee weights. Accordingly,
final checks are conducted to determine whether (1) an occupation has a UWE greater than 10.0,
or (2) the weight of a single respondent accounts for more than 30% of the total weight of all
respondents within a domain questionnaire. If either of these criteria is met, then this final ratio
adjustment is repeated; however, the distribution of workers for the subject occupation is allowed
to deviate from the OES estimated distribution in order to satisfy these criteria.
31 The UWE measures the increase in the variance of an estimate—an increase due to unequal weighting above the variance that a sample of the same size would yield if the weights were equal. The UWE is estimated by n \sum_i w_i^2 / ( \sum_i w_i )^2 .
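In code, the UWE of this footnote reduces to a one-line computation; the sketch below is offered for concreteness.

    import numpy as np

    def uwe(w):
        # Unequal weighting effect: n * sum(w_i^2) / (sum(w_i))^2.
        # Equal weights give 1.0; more dispersed weights give larger values.
        w = np.asarray(w, dtype=float)
        return len(w) * np.sum(w ** 2) / np.sum(w) ** 2

For example, uwe([1.0, 1.0, 1.0]) returns exactly 1.0, while uwe([1.0, 1.0, 8.0]) returns 3 × 66 / 100 = 1.98.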
The resulting weights are denoted by w_{ijk}^{(3a)}. If a supplemental frame was not used to
complete the sample of an occupation, then these weights are the final employee analysis
weights used to produce the estimates for the O*NET Data Collection Program. If a
supplemental frame was used for an occupation, then an additional, supplemental-frame
weighting step completes the calculation of the analysis weights.
Supplemental-frame weighting. As noted in the discussion of sampling methods,
locating a sufficient number of respondents under the Establishment Method occasionally
requires sampling from a supplemental frame. This situation usually occurs for rare occupations
or occupations difficult to locate through Establishment Method sampling. When it does occur,
additional weighting steps are necessary to account for the dual frames used to select the
samples. Two situations must be considered: one in which the supplemental frame consists of
establishments likely to employ the occupation of interest, and another in which the
supplemental frame consists of employees likely to be working in the occupation. Described here
are the steps taken in each situation.
First, consider the situation in which a supplemental frame of establishments is used, as
illustrated in Exhibit 16. In this figure the dashed rectangle represents the full frame; the oval, the
supplemental frame. The full frame, denoted by F, is the D&B establishment listing used for the
usual Establishment Method sampling, which includes N establishments. The supplemental
frame, denoted by FS, includes NS establishments.
Exhibit 16. Overlap of Full and Supplemental Frames

[Diagram: the full frame F, with N establishments, shown as a dashed rectangle, contains the supplemental frame FS, with NS establishments, shown as an oval; the remainder of the rectangle is the frame Fo, with No establishments, excluding the supplemental establishments.]
For a supplemental-frame sample of establishments, Equation (16) of Weighting Step 1 is
modified to start with a base sampling weight, w_i^{(1s)}, given by

w_i^{(1s)} = w_{5i} ,    (23)
where w_{5i} is defined in Equation (13). The supplemental-frame establishment sampling weights are then adjusted for nonresponse with use of the GEM method under a model containing the
stratification variables used to select the supplemental-frame sample. To determine the final
model with the GEM software, variable selection and testing are conducted for each
supplemental-frame sample. The nonresponse-adjusted supplemental-frame establishment
weights are denoted as w_i^{(1sa)}.
A supplemental frame is developed to target a small number of related occupations—and
often only one occupation—because of the difficulty of locating them. For a supplemental-frame
sample, occupations are not randomly assigned to establishments from a large set of occupations,
as is done for the subwaves in Establishment Method sampling. Consequently, the base sampling
weight for the jth occupation selected from the ith establishment for Weighting Step 2, w_{ij}^{(2)} in Equation (17), is modified to be

w_{ij}^{(2s)} = w_i^{(1sa)} ,    (24)
which excludes the weighting factor related to the random assignment of an occupation to an
establishment. The subscript j is added, however, to recognize that a specific occupation is
associated with each selected establishment in the supplemental-frame sample. The
supplemental-frame sample then proceeds through the rest of Weighting Step 2. The adjusted
weight for the jth occupation from the ith establishment is denoted by w_{ij}^{(2sa)}.
As part of Weighting Step 3, the base sampling weights for the responding employees in
a supplemental-frame sample are defined as in Equation (20),
w_{ijk}^{(3s)} = w_{4ijk} w_{ij}^{(2sa)} ,    (25)

which is the product of w_{4ijk}, defined in Equation (11), and the adjusted occupation weight for the supplemental-frame sample, defined in Weighting Step 2 as w_{ij}^{(2sa)}.
Next, the occupation’s employee weights from the supplemental-frame sample must be
combined with the same occupation’s weights from the Establishment Method sample. The
employee weights from each of the Establishment Method subwaves are first combined as shown
in Equation (22) into a single set of weights for all employees from an occupation selected by the
Establishment Method. An extra step is added to Weighting Step 3 to combine the Establishment
Method employee weights from Equation (22) with the supplemental-frame employee weights
from Equation (25).
At this point it must be assumed that the establishments listed on the supplemental frame,
FS in Exhibit 16, are equivalent to a random sample of the establishments on the full frame, F.
This assumption is made because of the inability to determine which establishment on the full
D&B-derived frame, F, links to which establishment on the supplemental frame. Under this
assumption, a multiple-frame situation again emerges because data come from two samples: the
Establishment Method sample and the supplemental-frame sample. A multiple-frame adjustment
is made, as described by Korn and Graubard (1999, sec. 8.2), using the sample sizes from the
two samples. Similarly, as was described for Equation (21), let t_F be the sample size for an occupation from the Establishment Method full sample, and let t_S be the sample size for the occupation from the supplemental sample. The multiple-frame adjustment factors are then given by

a_F = t_F / ( t_F + t_S )  and  a_S = t_S / ( t_F + t_S ) ,    (26)

which correspond with the full frame and the supplemental frame, respectively. Finally, the weight for the kth employee from the jth occupation in the ith establishment is given by

w_{ijk}^{(3**)} = a_F w_{ijk}^{(3*)}  for the full sample and  w_{ijk}^{(3**)} = a_S w_{ijk}^{(3s)}    (27)

for the supplemental sample, where w_{ijk}^{(3*)} and w_{ijk}^{(3s)} are defined in Equations (22) and (25), respectively, and a_F and a_S correspond with the jth occupation. The weights w_{ijk}^{(3**)} are then used in the adjustment steps after Equation (22) to complete Weighting Step 3, which yields the final analysis weights w_{ijk}^{(3a)}.
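The factors in Equation (26) follow the same pattern as Equation (21); a small illustrative helper (names are hypothetical):

    def dual_frame_factors(t_full, t_supp):
        # Multiple-frame adjustment factors a_F and a_S of Eq. (26), given
        # the full-frame and supplemental-frame sample sizes for an occupation.
        total = float(t_full + t_supp)
        return t_full / total, t_supp / total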
Next, consider the situation in which a supplemental frame consists not of establishments
but of employees working in the occupation. In this situation, no actions equivalent to those in
Weighting Steps 1 and 2 exist for the supplemental-frame sample. For Weighting Step 3, the
supplemental-frame sample weighting starts with the base sampling weight
w_{ijk}^{(3s)} = w_{6ijk} ,    (28)
where w_{6ijk} was defined in Equation (15). These weights are then combined with the Establishment Method employee weights for an occupation, as was described in connection with Equations (26) and (27), to yield w_{ijk}^{(3**)}. Again, the key assumption is made that the employees on the supplemental frame are equivalent to a random sample of all employees working in the occupation who might have been selected through the Establishment Method. This assumption is necessary because it is not possible to link the employees on the supplemental frame back to the D&B frame of establishments used for the Establishment Method. The weights w_{ijk}^{(3**)} are then used in the adjustment steps after Equation (22) to complete Weighting Step 3, which yields the final analysis weights w_{ijk}^{(3a)}.
Estimation
The estimates produced for each occupation consist of scale means and percentage
estimates. The number and type of scales are those listed in Exhibit 2, with a data source of job
incumbents. Each of these scales is a 3-, 5-, or 7-point Likert scale. The final estimates are the
means of the scales for each occupation. No subgroup or domain estimates are produced or
released, both to protect against disclosure and because small sample sizes are not conducive to
reliable estimation. The standard deviation will be available for each item mean as a measure of
response variation among an occupation’s respondents. Finally, there are several percentage
estimates produced for items concerning work context, education and training, background
items, and occupation tasks. Again, the final estimates are the percentages for each occupation,
and no subgroup or domain estimates are produced or released.
For each item, if respondents do not provide an answer to a particular question, they are
excluded from both the numerator and the denominator of the estimated mean. Because item
nonresponse tends to be very low for this study (see Appendix E), no item imputation is
conducted, and no value for missing items is assumed for estimation.
Variances are estimated with the first-order Taylor series approximation of deviations of
estimates from their expected values. These design-based variance estimates are computed with
SUDAAN® software (RTI International, 2004). These estimates properly account for the
combined effects of clustering, stratification, and unequal weighting—all of which are present in
the O*NET data. The variance estimation clusters are the establishments; the stratification is by
industry grouping and establishment size as used in selection of the establishment samples.
These estimated variances are used to estimate both the standard errors associated with the mean
or percentage and the confidence intervals (CIs). Standard error estimates and 95% CIs are
included with all estimates of means and proportions.
The estimate of a mean or a proportion is given by the formula

\hat{y} = \sum_{hik} w_{hik}^{(3a)} y_{hik} / \sum_{hik} w_{hik}^{(3a)} ,    (29)

where w_{hik}^{(3a)} is the final analysis weight for the kth respondent from the ith establishment in the hth stratum, and y_{hik} is the response variable. For a scale mean, the response variable, y_{hik}, is the scale value reported by the respondent; for a proportion, the response is one for a positive response and zero for a negative response. The Taylor series linearized values for the estimated mean or proportion are given by

z_{hik} = w_{hik}^{(3a)} ( y_{hik} - \hat{y} ) / \sum_{hik} w_{hik}^{(3a)} .    (30)

The variance of \hat{y} is estimated by

var( \hat{y} ) = \sum_h [ n_h / ( n_h - 1 ) ] \sum_i ( z_{hi} - \bar{z}_h )^2 ,    (31)

where n_h is the number of variance estimation clusters from the hth stratum, z_{hi} = \sum_k z_{hik}, and \bar{z}_h = \sum_i z_{hi} / n_h .
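The following Python sketch mirrors Equations (29) through (31) for a single occupation, with establishments as the variance estimation clusters. It illustrates the design-based computation only; production estimates are computed with SUDAAN, as noted above, and the function and argument names here are assumptions.

    import numpy as np

    def mean_and_taylor_variance(strata, clusters, w, y):
        # strata, clusters: stratum and establishment IDs per respondent;
        # w: final analysis weights w_hik^(3a); y: response values y_hik.
        strata, clusters = np.asarray(strata), np.asarray(clusters)
        w, y = np.asarray(w, dtype=float), np.asarray(y, dtype=float)
        y_hat = np.sum(w * y) / np.sum(w)                  # Eq. (29)
        z = w * (y - y_hat) / np.sum(w)                    # Eq. (30)
        var = 0.0
        for h in np.unique(strata):
            in_h = strata == h
            ids = np.unique(clusters[in_h])
            z_hi = np.array([z[in_h & (clusters == i)].sum() for i in ids])
            n_h = len(ids)
            if n_h > 1:                                    # Eq. (31)
                var += n_h / (n_h - 1) * np.sum((z_hi - z_hi.mean()) ** 2)
        return y_hat, var

The standard error is the square root of the returned variance, and an approximate 95% CI is the estimate plus or minus 1.96 standard errors.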
Expected Response Rates
Data collection had been completed for 104 subwaves as of December 31, 2011. These
subwaves consisted of 203,266 sampled establishments and 248,507 selected employees. The
overall response rate was 76.1% for establishments and 65.0% for employees. Although these
response rates compare favorably with those of similar studies (see Section A.1.4), methods to
further enhance response rates are continually being evaluated and implemented (see Section B.3).
B.1.2 Occupation Expert Method
The OE Method is used for occupations as necessary to improve sampling efficiency and
avoid excessive burden, as when it is difficult to locate industries or establishments with
occupation incumbents, employment is low, or employment data are not available, as is the case
for many new and emerging occupations. To determine which sampling method should be used
for an occupation, a comparison is made of the advantages and disadvantages of the
Establishment and OE Methods. For each occupation, information on the predicted establishment
eligibility rate and the predicted establishment and employee response rates is used to quantify
the efficiency of sampling the occupation by means of the Establishment Method. The OE
Method is used for an occupation when the Establishment Method of data collection is not
feasible and an appropriate source of occupation experts is available, as when a membership list
of a professional or trade association exists and provides good coverage of the occupation. A
random sample is selected from provided lists to prevent investigator bias in the final selection of
occupation experts. Sample sizes are designed to ensure that at least 20 completed questionnaires
are available for analysis after data cleaning. A goal of 20 questionnaires was set as a reasonable
number to enable adequate coverage of experts, occupation subspecialties, and regional
distribution.
Through December 31, 2011, the OE Method was used to collect data for 236
occupations. Of these, 208 were completed and 28 were still in process as of that date. A total of
10,081 occupation experts were sampled, of which 7,580 were found to be eligible. Of these,
6,050 occupation experts participated, for an overall OE response rate of 79.8%.
For the OE Method, the same means and percentages as for the Establishment Method are reported, but as unweighted estimates, together with the estimated standard deviation of the
mean estimates. Because occupation experts are not selected as a random sample from all
incumbents working in an occupation, the weights and weighting steps used for Establishment
Method occupations are not appropriate, and weights are not calculated for OE Method
occupations.
B.2 Procedures for the Collection of Information
Data collection operations are conducted at the contractor’s Operations Center in Raleigh,
North Carolina, and its Survey Support Department in Research Triangle Park, North Carolina.
For the Establishment Method, the Operations Center’s BLs contact sampled business
establishments, secure the participation of a POC, and work with the POC to carry out data
collection in the target occupations. The data are provided by randomly selected employees
within the occupations of interest. All within-establishment data collection is coordinated by the
POC; the BLs do not contact employees directly.32 After the POC agrees to participate,
informational materials and questionnaires are mailed to the POC, who distributes the
questionnaires to the sampled employees. Completed questionnaires are returned directly to the
survey contractor for processing. Respondents also have the option of completing the survey
online.
Survey Support Department staff mail materials to POCs, job incumbents, and
occupation experts, and they receive and process completed questionnaires that are returned by
32 The BLs contact occupation experts (the OE Method) directly, as well as job incumbents when sampling from a professional membership list in a dual-frame approach; no POC is involved.
respondents. Both the telephone operations of the BLs and the mailing and questionnaire-receipt
operations of the support staff are supported by the CMS. Data-entry software supports the
keying and verification of incoming survey data.
B.2.1 Establishment Method
As described in Section B.1.1, the Establishment Method uses a two-stage design
involving a statistical sample of establishments expected to employ workers in the target
occupations, followed by a sample of the workers in the target occupations within the sampled
establishments. The sampled workers are asked to complete the survey questionnaires.
The Establishment Method works well for most occupations. Occasionally, however, the
standard protocol is supplemented with a special frame, such as a professional association
membership list, when additional observations are required to complete data collection for an
occupation. The primary difference with this approach is that the supplemental respondents are
sampled directly from the special frame and contacted directly by the BLs, without involvement
of a sampled establishment or a POC.
O*NET Operations Center
Data collection activities are housed in the O*NET Operations Center, located in Raleigh,
North Carolina, and covering 3,581 square feet. The Operations Center staff includes BLs, Team
Leaders, a Monitoring Coordinator, and the Operations Center Manager, who reports to the Data
Collection Task Leader. Usual operating hours for the Operations Center are Monday through
Friday, 8:45 a.m. to 5:15 p.m., Eastern Time. Operating hours are extended during periods of
unusually high workloads or when necessary to contact a high concentration of Pacific time zone
businesses.
The BLs form the nucleus of the Operations Center staff. The number of BLs fluctuates
somewhat, ranging from 8 to 20, depending on workload. New BLs are recruited and hired at
various intervals to compensate for attrition and increases in workload. BL job candidates are
carefully screened and evaluated by Operations Center management, who use a job description
and a set of criteria that include a minimum of 2 years of work experience in a call center or
related work experience in a human resources department.
Case Management System
The O*NET CMS is a Web-based control system that supports and monitors the data
collection activities of the BLs, the mailing of informational materials and questionnaires, and
the receipt of completed paper and Web questionnaires.
Questionnaires and Information Materials
The Establishment data collection protocol calls for each sampled worker to receive one
of three randomly assigned domain questionnaires—Knowledge (which includes Education and
Training as well as Work Styles), Generalized Work Activities, and Work Context. Each domain
questionnaire also includes a Background section that asks a standard set of 11 demographic
questions about the respondent. In addition, each worker receives a Task Questionnaire specific
to his or her occupation.
Task Questionnaires are developed initially through the extraction of task information
from multiple sources located on the Internet. Each questionnaire includes a definition of the
occupation, a list of tasks, and space for the respondent to write in additional tasks. The
respondent is instructed to indicate whether or not each task is relevant to his or her occupation
and to rate each relevant task’s frequency and importance. In subsequent updating efforts, task
inventories are revised to reflect the new and most current information from respondents,
including write-in tasks.
For all occupations, sampled workers also receive an occupation-specific Association
Membership Questionnaire. The questionnaire provides a list of associations related to the
worker’s occupation and asks the respondent to indicate whether he or she belongs to any of
them. The respondent is also asked to write in any other associations to which he or she belongs.
This information is collected in case it becomes necessary to complete the occupation with use of
the dual-frame approach.
Each sampled employee receives an integrated questionnaire consisting of the randomly
assigned domain questionnaire and the Task and Association Membership Questionnaires
applicable to the employee’s occupation. Questionnaires are custom-printed on demand for each
sampled worker. In addition, workers are given the option of completing their questionnaire
online at the project’s Web site instead of completing and returning the paper questionnaire.
Spanish versions of the questionnaires are available for occupations with high
proportions of Hispanic workers. The Spanish questionnaires are sent to workers who are
identified as Spanish-speaking by their POC. In addition, an employee who has been sent an
English questionnaire can request a Spanish version by calling the survey contractor at a toll-free number.
Examples of the English questionnaires are included in Appendix A.33 The Spanish
versions are available on request.
33 The only change made to the O*NET questionnaires since the last OMB clearance package was submitted to OMB in December 2008 was the addition of a respondent comment box at the end of the paper version of the questionnaires. This change is described at the beginning of Appendix A.
In addition to the questionnaires, the Establishment Method data collection protocol
includes a variety of letters, brochures, and other informational materials mailed to POCs and
sampled workers. Spanish versions of the materials addressed to workers are available for
occupations with high proportions of Hispanic workers. Appendix B contains examples of the
English versions of these materials.34 The Spanish versions are available on request.
Data Collection Procedures: Establishment Method
Described here are the steps of the Establishment Method standard data collection
protocol. A summary of this protocol is shown in Exhibit 17.
Exhibit 17. Establishment Method Data Collection Flowchart

Step 1: Make Verification Call to Receptionist
Step 2: Make Screening Call to the Point of Contact (POC)
Step 3: Send Advance Package
Step 4: Make Recruiting Call to POC
Step 5: Make Sampling Call to POC
Step 6: Send Questionnaire Package
Step 7: Send Toolkit
Step 8: Make 7-Day Follow-Up Call to POC
Step 9: Send Thank You/Reminder Postcards
Step 10: Make 21-Day Follow-Up Call to POC
Step 11: Make 31-Day Follow-Up Call to POC
Step 12: Send Replacement Questionnaires
Step 13: Make 45-Day Follow-Up Call to POC
Step 1: Verification call to the receptionist. The BLs call each sampled business to
determine whether the business is eligible (i.e., whether it is still in operation at the sampled
address). The other component of the verification call is to identify the anticipated POC, who
must be knowledgeable about the types of jobs present and who is the recipient of the screening
call.
Step 2: Screening call to the point of contact. The BLs next call (or are transferred to)
the anticipated POC to ascertain whether the business has at least one employee in at least one of
the occupations targeted for that establishment. If so, the following POC information is obtained:
34 A few minor changes have been made to the letters and other materials mailed to survey participants since the last OMB clearance package was submitted to OMB in December 2008. These are listed in a table at the beginning of Appendix B.
- name and title of the POC,
- U.S. Postal Service delivery address,
- telephone number,
- e-mail address (if available), and
- fax number.
None of the BLs’ conversations with the POC is scripted in advance. Instead, “talking points” are provided to guide the BLs’ interactions with POCs. BLs are trained to listen and
interact effectively and in a comfortable style, rather than to read from a prepared script;
therefore, reading off a computer screen is discouraged. The BLs enter all information gathered
during each conversation with a POC into the CMS.
Step 3: Send information package. The information package, which is sent to the POC
after the completion of the screening call, contains more detailed information about the O*NET
Program. The following information is included in the information package:
- lead letter from the U.S. Department of Labor (DOL);
- O*NET brochure;
- “Who, What, and How” brochure;
- Selected Occupations List, providing titles and descriptions of target occupations;
- list of endorsing professional associations;
- brochure describing the business-, POC-, and employee-level incentives; and
- POC incentive (i.e., the O*NET desk clock).
Step 4: Recruiting call to the point of contact. To give the POC adequate time to receive,
read, and process the information, the next call to the POC is made approximately 7 days after
the information package is shipped. During the recruiting call, the BL
- verifies that the information package was received;
- confirms that the POC is qualified to serve in the POC role;
- reviews with the POC the titles and descriptions from the Selected Occupations List for the target occupations, to determine whether the establishment has any employees in those occupations;
- (if one or more target occupations are present) explains the O*NET Program in greater detail, answers questions, and attempts to secure the POC’s commitment to participate;
- for participating establishments, explains the need for the POC to prepare a numbered list of employees’ names for each selected occupation, for use in selecting a sample of employees; and
- sets an appointment for the sampling call, allowing sufficient time for the POC to compile the occupation rosters (in smaller businesses, the sampling call is sometimes combined with the recruiting call).
Step 5: Sampling call to the point of contact. During this call, the BL obtains from the
POC the number of names on each roster and enters the counts into the CMS, which selects the
sample according to preprogrammed random sampling algorithms. The BL then informs the
POC which employees are selected for each occupation. The POC is asked to note for later
reference the line numbers of the selected employees on his or her list (or lists) when the
questionnaires are distributed. For designated O*NET-SOC occupations with a high percentage
of Hispanic employees, the BL also asks the POC if any of the selected employees should
receive a Spanish version of the questionnaire instead of the English version. The language
preference of each employee is then indicated in the CMS.
Step 6: Send questionnaire package. After completion of the sampling call, the
employee packets are shipped to the POC for subsequent distribution to the sampled employees.
As part of the same mailing, the POC receives a thank-you letter and a framed Certificate of
Appreciation from DOL, personalized with the name of the POC and signed by a high-ranking
DOL official. Each questionnaire packet contains a letter from the contractor’s project director,
the assigned questionnaire (including the domain questionnaire and the Task and Association
Questionnaires integrated into a single booklet), a return envelope, an information sheet for
completing the Web questionnaire (including the respondent’s user name and password), and a
$10 cash incentive. In addition, a label is affixed to the cover of the questionnaire to remind the
respondent of the option to complete the questionnaire online. A Spanish questionnaire is sent to
any Hispanic employees who the POC indicated during the sampling call should receive this
version. In addition, all employees in these O*NET-SOC occupations are informed through a
bilingual notice included in the mailing that they have a choice between English and Spanish
versions, and they are provided with a toll-free number to call if they would like to receive the
alternate version.
Step 7: Send toolkit. Approximately 3 days after mailing the Questionnaire Package, the
contractor also mails the POC the O*NET Toolkit for Business—a packet of information about
the O*NET Program, which managers can use for human resource planning and preparation of
job descriptions.
Step 8: 7-day follow-up call to the point of contact. Approximately 7 days after the
shipment of the original questionnaire package to the POC, the BL calls to verify receipt of the
mailing and to review the process for distributing the questionnaires to the selected employees.
The BL also informs the POC of a forthcoming shipment of thank you/reminder postcards and
asks him or her to distribute them to all sampled employees.
Step 9: Send thank you/reminder postcards. After the 7-day follow-up call, the BL
places an order for thank you/reminder postcards to be sent to the POC for distribution to all
sampled employees.
Step 10: 21-day follow-up call to the point of contact. Approximately 21 days after the
shipment of the original questionnaire package, the BL calls to thank the POC for his or her
ongoing participation and to provide an update on any employee questionnaires received to date.
The BL asks the POC to follow up with nonrespondents.
Step 11: 31-day follow-up call to the point of contact. Approximately 31 days after the
shipment of the original questionnaire package to the POC, the BL calls to again thank the POC
for his or her ongoing participation and to provide an update on any employee questionnaires
received to date. At this time, the BL informs the POC of a forthcoming shipment of replacement
questionnaires, which are to be distributed to any employees who have not yet returned the
original questionnaire.
Step 12: Send replacement questionnaires. After the 31-day follow-up call, the BL
places an order for the shipment of replacement questionnaires. These packages are ordered for
any employees who have not yet responded. The replacement questionnaire package is like the
original one, except for a slightly different cover letter and the absence of the $10 cash incentive.
Using roster line information or employee initials provided by the BL during the 31-day follow-up call, the POC then distributes the appropriate replacement questionnaire package to each
nonresponding employee and encourages the employee to complete and return the questionnaire.
Step 13: 45-day follow-up call to the point of contact. Approximately 45 days after the
shipment of the original questionnaire package to the POC, the BL places one final follow-up
call to the POC to thank the POC for his or her assistance and to provide one final status report
regarding employee questionnaires. If all questionnaires have been received at this point, the BL
thanks the POC for his or her organization’s participation. If questionnaires are still outstanding,
the BL confirms receipt and distribution of the replacement questionnaire packets. The BL asks
the POC to follow up with nonrespondents. This step is usually the final one in the data
collection protocol.35
Mailout Operations, Questionnaire Receipt, and Processing
Orders for mailings of questionnaires and informational materials to support data
collection are placed by the BLs and processed by data preparation staff. The CMS supports and
35 If no employee questionnaires have been received at the time of the last scheduled follow-up call, the case is referred to a Team Leader, who reviews the history notes for the case to try to determine if the POC actually distributed the questionnaires; if necessary and appropriate, the Team Leader will make an additional follow-up call to the POC.
monitors the entire process, including placing the order, printing on-demand questionnaires and
other order-specific materials, shipping the order, and interacting with the U.S. Postal Service to
track delivery of the order. Staff members follow written procedures in fulfilling orders,
including prescribed quality control checks. They are also responsible for maintaining an
adequate inventory of mailing materials and for inventory control.
Completed questionnaires returned by mail are delivered to the contractor, where they are
opened and batched and the barcodes are scanned to update the CMS for receipt. The batches are
then delivered to data-entry staff, where the survey data are keyed and 100% key verified. The
questionnaire batches are then stored in a secure storage area. Data from the paper questionnaires
are merged with the Web questionnaire data and readied for data cleaning routines.
Establishment Method Data Collection Results
Establishment data collection, which began in June 2001, is still under way. As of
December 31, 2010, 97 waves of data collection had been completed and more than 119,000 establishments and 154,000 employees had participated, resulting in an establishment response
rate of 76% and an employee response rate of 65%.36
B.2.2 Occupation Expert Method
The OE Method is an alternate method of collecting information on occupational
characteristics and worker attributes that is used to improve sampling efficiency and avoid
excessive burden for problematic occupations. This situation occurs when occupations have low employment scattered among many industries or when employment data do not yet exist (e.g., for new and emerging occupations). With this method, persons considered experts in
the target occupation, rather than job incumbents, are surveyed. Occupation experts are sampled
from lists provided by source organizations that can include professional associations,
certification organizations, industry associations, and other organizations that can identify
qualified experts in a given occupation. The sampled occupation experts are contacted directly
by the BLs, without involvement of a sampled establishment or a POC. Unlike the standard
Establishment Method, under which workers complete only one questionnaire, the OE Method
requires that occupation experts complete all three domain questionnaires, as well as a
Background Questionnaire and a Task Questionnaire. Because of the increased burden,
occupation experts receive a $40 cash incentive instead of the $10 incentive offered to
Establishment Method respondents.
The same facility used for Establishment Method data collection—the Operations Center
in Raleigh, North Carolina—is also used for the OE Method work and uses a CMS, a Web-based
36 See Section A.1.4 for details on the Establishment Method response rate experience and a comparison of these response rates with other surveys.
control system that supports and monitors the data collection activities of the BLs, the mailing of
informational materials and questionnaires, and the receipt of completed paper questionnaires.
Questionnaires and Information Materials
With the exception of a few additional items in the Background Questionnaire, the OE
Method questionnaires are the same as those used for Establishment Method data collection.
Occupation experts are asked to complete all three domain questionnaires (as well as a
Background Questionnaire and Task Questionnaire), whereas Establishment Method respondents
complete only one domain questionnaire (as well as Background, Task, and Association
Questionnaires bound together with the domain questionnaire). Paper questionnaires are bundled
before shipping, with the order of the domain questionnaires randomized at the respondent level.
As with the Establishment Method, occupation experts are given the option of completing their
questionnaires online at the project Web site.
OE Method information materials resemble the Establishment Method materials but are
modified to reflect how the OE Method differs from the Establishment Method (direct contact
with the respondent, identification through a named source organization, reference to only one
occupation, multiple questionnaires, and a higher incentive).
Examples of OE Method questionnaires are presented in Appendix A; information
materials are presented in Appendix B.
Data Collection Procedures
The steps in the OE Method data collection protocol closely follow those for
establishments. The primary differences are the absence of the verification and sampling calls.
Verification calls are inapplicable because a specific individual is contacted instead of an
establishment. Sampling calls are inapplicable because the individual is not sampled from a
larger group of employees. All other steps follow the Establishment Method protocol. The OE
Method data collection protocol is shown in Exhibit 18.
Mailout Operations, Questionnaire Receipt, and Processing
OE Method mailing operations and questionnaire receipt and processing follow the same
procedures as those described for the Establishment Method in Section B.2.1.
Occupation Expert Method Data Collection Results
Data collection was completed for 182 occupations as of December 31, 2010. As
described in Section A.1.4, of the 6,768 eligible occupation experts identified, 5,400 participated,
for an occupation expert response rate of 80%.
Exhibit 18. Occupation Expert Method Data Collection Flowchart

Step 1: Make Screening Call to Occupation Expert
Step 2: Send Advance Package
Step 3: Make Recruiting Call
Step 4: Send Questionnaires
Step 5: Make 7-Day Follow-Up Call
Step 6: Send Thank You/Reminder Postcard
Step 7: Make 21-Day Follow-Up Call
Step 8: Make 31-Day Follow-Up Call
Step 9: Send Replacement Questionnaires
Step 10: Make 45-Day Follow-Up Call

B.3 Methods to Maximize Response Rates
The O*NET Data Collection Program is committed to achieving the highest possible
response rates. This section summarizes some of the key features of the data collection protocol
that are designed to maximize response rates and reduce nonresponse:37
37 See Section B.2 for a full description of the data collection protocol.

- Multiple Contacts—As described in Section B.2, the Establishment Method protocol consists of up to 13 separate telephone and mail contacts with POCs (see Exhibit 17), and the OE Method consists of up to 10 contacts with occupation experts. In addition, supplemental contacts are made via e-mail and fax as appropriate. The multiple contacts are designed to establish rapport with the POC/occupation expert, facilitate their participation, and help ensure the best possible questionnaire completion rate.
- Multi-Mode Study—O*NET offers employees the choice of completing the survey online or filling out a paper version and returning it via mail. In their questionnaire packet, sampled employees receive both the hardcopy survey and a customized flyer providing their unique user ID and password as well as some general instructions on how to access the questionnaire. To encourage participation via Web, related study
materials sent to the POC and the employees highlight the Web survey option; talking
points have been added in the CMS for BLs to remind POCs and occupation experts
of the Web option; and reminders are communicated via e-mail and are posted on the
project Web site.
- Incentives—As described in Section A.9, the O*NET Data Collection Program offers
incentives to employers, POCs, job incumbents, and occupation experts. Participating
employers receive the ToolKit for Business; POCs receive a desk clock and, if they
agree to participate, a framed Certificate of Appreciation from DOL; job incumbents
receive a prepaid incentive of $10; and occupation experts receive a desk clock and, if
they agree to participate, a cash incentive of $40 and a Certificate of Appreciation.
See Section A.9 for a full discussion of the incentives and their rationale.
- Refusal Conversions—At least one conversion attempt is made on every refusal
encountered by a BL. When a POC or occupation expert refuses to participate, the
case is transferred to a specially trained Converter for a refusal conversion contact.
Refusal rates for BLs and conversion rates for the Converters are tracked and
monitored for quality control.
- Quality of Staff—Because of the unscripted nature of the calls conducted on this study and the challenges of securing participation from POCs and occupation experts, BL job candidates are carefully screened and evaluated by Operations Center management. Candidates are selected on the basis of a track record of successful work experience (minimum of 2 years) in a call center, customer service, or human resources department setting; educational attainment (a majority have college degrees and several have advanced degrees); computer proficiency; and research skills. BLs receive competitive salaries, and attrition is extremely low.
- Staff Training—Newly hired BLs must participate in and successfully complete a 4-day intensive training program that includes presentations by key management staff
on the data collection steps, situational role play, hands-on practice in the CMS, and
coaching on overcoming objections. Once hired, BLs routinely participate in ongoing
training in such topics as skills enhancement, refusal conversion, protocol
refreshment, and occupation briefing.
- Supervision and Call Monitoring—The BLs are closely supervised by the
Operations Center supervisors, who use silent monitoring equipment to monitor a
random sample of each BL’s calls and who provide ongoing training and coaching to
the BLs as needed.
- Adaptive Total Design—O*NET data collection managers use a system called
Adaptive Total Design to help monitor and control the implementation of the data
collection protocol. As described in Section B.4.10, this system uses an online
dashboard of survey metrics such as BL phone time, BL caseload, and BL response
rate by industry that help the O*NET survey managers monitor BL performance,
detect potential problem areas, and observe trends that inform decisions about
matching BLs to industries when making future case assignments.
These and other enhancements have had a positive effect on the O*NET Program’s
ability to secure the participation of both establishments and employees. For example, the
establishment response rate for data collection waves completed through December 31, 2011,
was 76%, compared with 64% for the initial data collection wave completed during 2001–2002.
Improvement in the employee response rate was less dramatic, increasing from 63% in the initial
2001–2002 wave to 65% for waves completed through December 31, 2011. The O*NET
Program will continue to explore ways to enhance response rates still further through its
continuous improvement program.
B.4 Tests of Procedures
Continuous improvement of survey quality and efficiency has been a constant focus of
the O*NET Data Collection Program. The survey design specified in this document has evolved
over years of testing and evaluating alternative procedures. The O*NET Program team believes
that this design reflects the best combination of design parameters and features for maximizing
data quality while reducing data collection costs. Summarized here are some of the tests of
procedures that have been conducted for the O*NET Data Collection Program.
B.4.1 1999 Pretest Experiments
The initial design of the O*NET Program was based on rigorous testing and evaluation of
alternative design protocols. Tests of seven different cash incentive plans were carried out
between June 1999 and January 2000 on a sample of about 2,500 eligible businesses and 3,800
employees. In addition, the use of stamped return envelopes and various options for contents of
the Toolkit for Business incentive were tested. These tests found that the best design appeared to
be the combination of the $10 prepaid incentive to employees, various material incentives to the
POC, and the use of first-class stamped return envelopes. These experiments also determined
that a videotape describing the O*NET database should not be included in the Toolkit for
Business. A report documenting the pretest activity and results was included in the 2002 O*NET
Office of Management and Budget (OMB) submission and can be found at
http://www.onetcenter.org/ombclearance.html.
B.4.2 Wave 1.1 Experiment
A split-sample, randomized assignment experiment was conducted in Wave 1.1 to
compare the benefits of using business-reply envelopes with the benefits of using envelopes with
first-class stamps. Preliminary results from the 1999 O*NET pretest provided mixed results
regarding the benefits of using real stamps on the return envelopes. Effects on response rates
were insignificant; however, there was some anecdotal evidence from the BLs that the first-class
stamps yielded shorter response times than the business-reply envelopes. The Wave 1.1
experiment was undertaken to explore this issue further. The results of the experiment supported
the pretest finding that the use of first-class stamps had no effect on either response rates or
response time. As a result, the use of stamps was discontinued in favor of the less expensive
business-reply envelopes.
B.4.3 Point-of-Contact Incentive Experiment
The POC incentive experiment considered the effects on establishment and employee
response rates of offering the POC a $20 incentive in addition to the other incentives that the
POC receives for O*NET participation (i.e., the desk clock and framed Certificate of
Appreciation). About 80% of the approximately 10,500 establishments and 30,000 employees
involved in the experiment were assigned to the treatment group, and the remaining 20% were
assigned to the control group. With this large sample size, statistical power of the experiment
was very high.
The results provided no evidence that the incentive had any effect on establishment
cooperation rates: the POC appeared equally likely to initially agree to participate with the
$20 incentive as without it. Nor was there evidence of any benefit for employee response rates.
Given these results and the considerable cost of providing monetary incentives to the POC, in
December 2004 it was decided that the $20 POC incentive should be discontinued for all newly
recruited POCs. Detailed results can be found in Biemer, Ellis, Robbins, and Pitts (2006).
B.4.4 Experiments in Weight Trimming
A substantial component of the sampling error for O*NET estimates is due to the survey
weights. This component, known as the UWE, can be quite large because of the disproportionate sampling
methods that must be employed to achieve data collection efficiency. The UWE can be reduced
through weight trimming but only at the risk of increasing selection bias. Alternative methods for
weight trimming were investigated from 2005 to 2007. This investigation involved assessing the
effect of successively aggressive weight-trimming plans for a wide range of estimates and
population domains. The weight-trimming analysis was comprehensive, including
- comparison of UWEs,
- graphical and tabular displays of current weight estimates compared with aggressively trimmed weight estimates,
- evaluation of the effect of weights on suppression of estimates,
- evaluation of statistical differences between current weight estimates and aggressively trimmed weight estimates, and
- evaluation of substantive differences between current weight estimates and aggressively trimmed weight estimates.
The method and results of the evaluation are described in an internal report (Penne &
Williams, 2007, July 30). The evaluation resulted in the implementation of a more aggressive
weight-trimming plan that provides an optimal balance of sampling variance and bias.
B.4.5 Experiments in Model-Aided Sampling
The use of MAS to enhance the efficiency of the O*NET sample is considered one of the
most important sample design innovations in the history of the program. This approach
dramatically reduced data collection costs with minimal effects on the accuracy of the estimators.
Work on the development of the MAS methodology began in early 2004 and continued through
the end of 2007. As part of this research effort, various sample cutoff or stopping rules were
investigated by means of Monte Carlo simulation. A cutoff rule determines the point at which
efforts to interview certain types of establishments and employees are discontinued because the
prespecified quota cell criteria have been satisfied. These studies showed that, under a fairly
wide range of cutoff rules, MAS does not bias the O*NET estimates or their standard errors.
Furthermore, MAS substantially reduced the number of establishment contacts required to satisfy
the random sample allocation for an occupation. This innovation resulted in substantial
reductions in respondent burden, data collection costs, and time required to complete an
occupation (Berzofsky et al., 2006).
B.4.6 Alternative Measures of Uncertainty
The standard error of an estimate is a measure of statistical precision that is inversely
proportional to the sample size. However, for many O*NET data users, the interrater agreement
for an average scale rating is very important for their applications. Therefore, beginning in 2005,
alternative measures of uncertainty of scale estimates were investigated to supplement the
current use of standard errors. Three alternatives—the standard deviation, kappa, and weighted
kappa—were analyzed and contrasted, using actual O*NET data as well as simulation.38 This
work led to the decision to make available the standard deviation as a second measure of
uncertainty.
38 To define kappa and the weighted kappa, consider the cross-classification of ratings from two raters, A and B. For a standard 5-point Likert scale, the expected proportion of entries in cell $(k, k')$ of the AB table is $n_k n_{k'} / n^2$ for $k, k' = 1, \ldots, 5$, where $n_k$ is the number of raters in the sample that select category $k$. The kappa statistic is defined as

$$\kappa = \frac{P_e - P_0}{1 - P_0},$$

where $P_e$ is the agreement rate (the sum of the diagonal elements of the AB table) and $P_0$ is the expected agreement rate under completely random ratings (i.e., the two raters' choices are statistically independent). The weighted kappa is similar, except the agreement rates, $P_e$ and $P_0$, include some fraction, $f_d$, of disagreements, where $d$ is the distance of a disagreement from the diagonal (see L. Johnson, Jones, Butler, & Main, 1981). In the O*NET application, the $f_d$ weights proposed by Cicchetti and Allison (1971) were used, which have also been implemented in SAS PROC FREQ.
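As a concrete companion to the footnote's definitions, the sketch below computes kappa and weighted kappa for a hypothetical 5-point rating table. The chance-agreement term is computed from the marginal totals, and the Cicchetti-Allison weights are taken in their linear form, w_ij = 1 - |i - j|/(K - 1), the form implemented in SAS PROC FREQ; the table itself is invented.

import numpy as np

def kappa_statistics(table):
    """Kappa and weighted kappa for a K x K cross-classification of two
    raters' ratings.  P_e is the observed agreement rate, P_0 the chance
    agreement rate from the margins; the weighted version credits partial
    agreement via Cicchetti-Allison weights w_ij = 1 - |i-j|/(K-1)."""
    p = np.asarray(table, dtype=float)
    p = p / p.sum()
    row, col = p.sum(axis=1), p.sum(axis=0)
    K = p.shape[0]
    i, j = np.indices((K, K))
    w = 1.0 - np.abs(i - j) / (K - 1)      # 1 on the diagonal, less off it

    p_e = np.trace(p)                      # observed agreement rate
    p_0 = float(row @ col)                 # chance agreement rate
    kappa = (p_e - p_0) / (1.0 - p_0)

    pw_e = float((w * p).sum())            # weighted observed agreement
    pw_0 = float(row @ w @ col)            # weighted chance agreement
    return kappa, (pw_e - pw_0) / (1.0 - pw_0)

# Hypothetical agreement table for two raters on a 5-point Likert scale.
table = [[10, 3, 1, 0, 0],
         [ 2, 12, 4, 1, 0],
         [ 1,  3, 9, 3, 1],
         [ 0,  1, 2, 8, 2],
         [ 0,  0, 1, 2, 6]]
print("kappa = %.3f, weighted kappa = %.3f" % kappa_statistics(table))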
B.4.7 Suppression of Estimates with Poor Precision
Before the O*NET data are published, a set of suppression rules is applied to identify
estimates that may have extremely poor precision due to very large variances or inadequate
sample sizes. Estimates that fail these rules are flagged to caution data users that the estimates
are unreliable and should be interpreted as such. Ideally, estimates that are sufficiently reliable
are not flagged (i.e., suppressed). An optimal set of suppression rules balances the need to
provide as much data as possible to the data users with the need to duly warn users about
estimates that are extremely unreliable. In 2004 and 2005 alternative methods for suppression
were investigated. The study also evaluated the current methodology for estimating the standard
errors of the estimates—the generalized variance function (GVF) method. Using Monte Carlo
simulation techniques, as well as actual O*NET data, the study found that the GVF method
produced standard error estimates that were severely positively biased. As a result, many
estimates that were sufficiently reliable were erroneously suppressed because their standard
errors were overstated. Use of the GVF method was discontinued in favor of a direct method of
variance estimation. Consequently, estimates published before this change were republished
under the direct method of variance estimation. No other changes in the method of estimate
suppression were required.
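The program's suppression thresholds are not restated here, but such rules generally take the form sketched below: flag an estimate when its relative standard error (RSE) or respondent count falls outside set limits. The thresholds max_rse and min_n are illustrative placeholders, not the published O*NET rules.

def suppress_flag(estimate, std_error, n_respondents,
                  max_rse=0.30, min_n=10):
    """Flag an estimate as unreliable when its relative standard error
    exceeds max_rse or its respondent count falls below min_n.
    Thresholds are illustrative, not the program's actual rules."""
    if n_respondents < min_n:
        return True
    if estimate == 0:
        return True                 # relative SE is undefined at zero
    return (std_error / abs(estimate)) > max_rse

# A mean rating of 3.2 with SE 0.4 from 15 respondents passes (RSE = 0.125);
# the same mean with SE 1.2 is flagged (RSE = 0.375).
print(suppress_flag(3.2, 0.4, 15), suppress_flag(3.2, 1.2, 15))

An overstated standard error, such as the GVF method produced, inflates the computed RSE and trips the flag on estimates that are in fact reliable, which is exactly the failure mode the study identified.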
B.4.8 Dual-Frame Sampling for Hard-to-Find Occupations
Some occupations are rare and dispersed across many Standard Industrial Classification (SIC) codes, the classification system Dun & Bradstreet uses for drawing samples of establishments from its database. This can make certain occupations difficult to find: identifying an employee in one of these occupations can require calling scores of establishments over many weeks of data collection. To address this inefficiency, a dual-frame approach for sampling employees in hard-to-find occupations was tested. As the term implies, the dual-frame sampling approach involves
two sampling frames: (1) the usual SIC frame, which has very good coverage of all occupations,
regardless of their size, and (2) a smaller, more targeted frame (such as a professional association
list), which may have much lower coverage of all employees in the occupation but contains a
very high concentration of them, making them more efficient to sample and contact than a
multitude of establishments in many industries. Using these two frames in combination provides
the benefits of good population coverage with reduced data collection costs and reduced
establishment burden. Testing and evaluation of the dual-frame sampling option began in 2004 with the selection of college teachers. This test showed the dual-frame approach to be an effective method of efficiently completing difficult-to-find occupations. Details of the weighting and estimation for the approach were further developed and refined, and the approach was eventually expanded to other hard-to-find occupations for which suitable second frames of employees are available.
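The summary above leaves the weighting and estimation details aside; one standard way to combine two such frames, shown here purely as an illustrative sketch, is a Hartley-style composite estimator that blends the overlap domain as estimated from each frame. The mixing parameter lam and all data below are invented, and the actual O*NET procedure may differ.

import numpy as np

def dual_frame_total(y_a_only, w_a_only, y_ab_a, w_ab_a, y_ab_b, w_ab_b,
                     lam=0.5):
    """Composite estimate of a population total when the list frame (B)
    is nested in the full SIC frame (A).  Units reachable only through
    frame A contribute directly; the overlap domain is estimated from
    both frames and blended with mixing parameter lam."""
    t_a_only = float(np.dot(w_a_only, y_a_only))   # A-only domain
    t_ab_a = float(np.dot(w_ab_a, y_ab_a))         # overlap via frame A
    t_ab_b = float(np.dot(w_ab_b, y_ab_b))         # overlap via frame B
    return t_a_only + lam * t_ab_a + (1.0 - lam) * t_ab_b

# Toy example with made-up responses and design weights.
rng = np.random.default_rng(3)
est = dual_frame_total(
    y_a_only=rng.normal(3, 1, 20), w_a_only=np.full(20, 50.0),
    y_ab_a=rng.normal(3, 1, 10),   w_ab_a=np.full(10, 40.0),
    y_ab_b=rng.normal(3, 1, 30),   w_ab_b=np.full(30, 12.0),
)
print(f"composite estimate of the total: {est:,.1f}")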
B.4.9 Alternative Levels of Population Coverage
In constructing a sampling frame for an occupation, the sampling statistician has a
choice: include all possible establishments where the occupation could be found, or include only
those where the probability of finding the target occupation is high. The former strategy will
generate a sampling frame having 100% coverage of the population but with many
nonpopulation units. The latter approach will reduce the number of nonpopulation units (and the
sampling and data collection costs) but at a lower rate of population coverage and, thus, with
greater potential coverage bias. The optimal level of frame coverage is one that maximizes data
collection efficiency while maintaining an acceptable level of coverage bias. In 2004,
experiments were conducted to determine the optimum coverage level. Frames were constructed
having a range of coverage levels (from 50% to 100%), and sampling from each of these frames
was simulated with existing data. The estimates, their standard errors, and the associated costs of completing the occupations were compared across the frames for many occupations.
These studies clearly indicated that frame coverage could be safely reduced to as low as 50%
with essentially no risk of coverage bias and that data collection costs could be substantially
reduced. These results led to the adoption of a minimum coverage rate of 50% for all occupation
sampling frames.
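The 2004 experiments used actual O*NET data; the fragment below only illustrates the logic of such a simulation with a made-up population. Establishments are ranked by a hypothetical probability of containing the occupation, the frame is truncated at a target coverage level, and the truncated-frame estimate is compared with the full-frame value. Because the ratings here are generated independently of that probability, the estimates barely move as coverage drops, which is the condition under which reduced coverage is safe.

import numpy as np

rng = np.random.default_rng(11)

# Hypothetical population of establishments: each has some probability of
# containing the target occupation and a mean scale rating.
n_estab = 2000
p_has_occ = rng.beta(1, 4, size=n_estab)        # occupation density
ratings = rng.normal(3.0, 1.0, size=n_estab)    # mean rating if found

full_mean = np.average(ratings, weights=p_has_occ)
order = np.argsort(p_has_occ)[::-1]             # richest establishments first
cum = np.cumsum(p_has_occ[order]) / p_has_occ.sum()
for coverage in (1.00, 0.75, 0.50):
    keep = order[: np.searchsorted(cum, coverage) + 1]
    est = np.average(ratings[keep], weights=p_has_occ[keep])
    print(f"coverage {coverage:.0%}: frame size {keep.size:4d}, "
          f"estimate {est:.3f} (full frame {full_mean:.3f})")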
B.4.10 Adaptive Total Design
The increasing reluctance of sampled establishments and persons to participate in surveys,
and the growing difficulty of gaining their cooperation, challenge researchers' control over
data collection costs, timeliness, and quality. In response to these challenges, the O*NET Data
Collection Program implemented Adaptive Total Design, a system based on the notion of
responsive design, which tracks data collection metrics to help the survey manager monitor and
control survey operations.
In 2010, O*NET data collection staff worked with programming staff to develop an
online Adaptive Total Design dashboard linked to the CMS, from which the Operations Center
Manager can monitor several metrics in tandem. The initial version of the O*NET dashboard
tracks BL telephone time, BL caseload, and BL response rate by industry. The system has been
designed to be dynamic, allowing for modifications to be made in the interest of continuous
quality improvement.
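The dashboard is an internal CMS application whose schema is not described here; as a minimal sketch of the kind of aggregation behind the metrics named above, the fragment below rolls hypothetical case records up to caseload, BL telephone time, and response rate by industry. The record layout and field names are invented.

from collections import defaultdict

# Hypothetical case records; the real data come from the CMS.
cases = [
    {"industry": "Manufacturing", "bl_minutes": 14, "complete": True},
    {"industry": "Manufacturing", "bl_minutes": 22, "complete": False},
    {"industry": "Health Care",   "bl_minutes":  9, "complete": True},
    {"industry": "Health Care",   "bl_minutes": 11, "complete": True},
]

totals = defaultdict(lambda: {"cases": 0, "minutes": 0, "completes": 0})
for c in cases:
    t = totals[c["industry"]]
    t["cases"] += 1
    t["minutes"] += c["bl_minutes"]
    t["completes"] += int(c["complete"])

for industry, t in sorted(totals.items()):
    rate = t["completes"] / t["cases"]
    print(f"{industry}: caseload={t['cases']}, BL minutes={t['minutes']}, "
          f"response rate={rate:.0%}")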
B.4.11 Analysis of Unit and Item Nonresponse
Nonresponse in the O*NET Data Collection Program can occur at the establishment POC
level during the screening/verification, recruiting, or sampling stages of selection. Within-unit
nonresponse occurs at the employee level when a selected employee fails to complete and return
a questionnaire. In addition, employees who return their questionnaires may inadvertently or
intentionally skip one or more question items on the questionnaire. This type of missing data is
known as item nonresponse. The effects of all three types of nonresponse on the estimates have
been continually investigated since 2003 and reported to OMB; the nonresponse analysis for
Analysis Cycles 9–12 appears as Appendix E.39 Such analyses have shown that nonresponse
errors, whether unit, within unit, or item, do not appear to be a significant source of error in the
O*NET program. In addition, the sampling weights are adjusted for nonresponse, which further
minimizes any potential bias (Section B.1.1).
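The program's actual adjustment (Section B.1.1) uses a more elaborate calibration model; as a simple illustration of the principle, the sketch below applies a weighting-class adjustment, inflating respondents' weights by the inverse of the weighted response rate within each class so that respondents also represent their class's nonrespondents. The classes and weights are invented.

import numpy as np

def weighting_class_adjustment(weights, responded, classes):
    """Within each class, multiply respondents' weights by
    (class weight total) / (respondents' weight total); nonrespondents
    get weight zero, so each class's weight total is preserved."""
    w = np.asarray(weights, dtype=float)
    r = np.asarray(responded, dtype=bool)
    cls = np.asarray(classes)
    adj = np.where(r, w, 0.0)
    for c in np.unique(cls):
        in_c = cls == c
        resp_total = w[in_c & r].sum()
        if resp_total > 0:
            adj[in_c & r] *= w[in_c].sum() / resp_total
    return adj

# Two classes with different response rates: A keeps 2 of 3, B keeps 1 of 2.
w = np.array([10.0, 10.0, 10.0, 20.0, 20.0])
responded = [True, False, True, True, False]
classes = ["A", "A", "A", "B", "B"]
print(weighting_class_adjustment(w, responded, classes))
# -> [15.  0. 15. 40.  0.]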
B.4.12 Additional Tests of Procedures
This list of the O*NET tests of procedures is far from complete; many other, smaller tests
and evaluations have been conducted for the O*NET Program in the name of continuous quality
improvement. Still, the list indicates the breadth and depth of the investigations and highlights
some of the major results and design innovations. Work continues on quality improvement to
this day because optimum quality for an ongoing survey of a dynamic, ever-changing population
is a quest, not a destination.
B.5 Statistical Consultants
The DOL/Employment and Training Administration (ETA) official responsible for the
O*NET Data Collection Program is Pam Frugoli (202-693-3643). Through a DOL grant, the
National Center for O*NET Development in Raleigh, North Carolina, is responsible for
managing O*NET-related projects and contracts and for providing technical support and
customer service to users of the O*NET data and related products. Under contract to the Center,
RTI International is responsible for providing sampling, data collection, data processing, and
data analysis services. Additional analyses are provided by HumRRO, Inc., in Alexandria,
Virginia, and by North Carolina State University in Raleigh, North Carolina. The statistical
consultants listed in Exhibit 19 reviewed this OMB Supporting Statement.
The primary authors of Section B.1 of the Supporting Statement are Michael Penne and
Paul Biemer. Mr. Penne is a senior research statistician and Dr. Biemer is a Distinguished
Statistical Fellow at RTI International.
Draft versions of the Supporting Statement were reviewed by Summit Consulting on
behalf of BLS. Summit’s review appears in Appendix F. ETA was notified in November 2011 of
BLS’s approval of the OMB package.
39 The OMB Supporting Statement dated September 10, 2008, included nonresponse analyses for Analysis Cycles 4–8. The OMB Supporting Statement dated September 2, 2005, included nonresponse analyses for Analysis Cycles 1–3.
Exhibit 19. Statistical Consultants

Name                                    Organization                                   Telephone Number

Nonfederal Statisticians and Researchers
James B. Rounds                         University of Illinois at Urbana-Champaign     217-333-8519
Michael Campion                         Purdue University                              765-494-5909

Federal Government
Independent statistical review by       On behalf of the U.S. Bureau of                202-407-8300
Summit Consulting                       Labor Statistics

Data Collection/Analysis Contractor (RTI International)
Paul Biemer                             RTI International                              919-541-6056
Michael Penne                           RTI International                              919-541-5988
Marcus Berzofsky                        RTI International                              919-316-3752
C. References
Abel, J. R., & Gabe, T. M. (2008, July). Human capital and economic activity in urban America.
Federal Reserve Bank of New York Staff Reports, 332.
ACT. (2006). Ready for college and ready for work: Same or different? Retrieved February 6,
2008, from http://www.act.org/research/policymakers/pdf/ReadinessBrief.pdf
Berzofsky, M. E., Welch, B., Williams, R. L., & Biemer, P. P. (2006). Using a model-assisted
sampling paradigm instead of a traditional sampling paradigm in a nationally
representative establishment survey. Proceedings of the American Statistical Association,
Section on Survey Research Methods (pp. 2763–2770). Washington, DC: American
Statistical Association. Retrieved from
https://www.amstat.org/Sections/Srms/Proceedings/y2006/Files/JSM2006-000811.pdf
Biemer, P., Ellis, C., Pitts, A., & Robbins, K. (2006). Effects on response rates and costs of a monetary incentive for the point of contact in an establishment survey. Proceedings of the American Association for Public Opinion Research, Phoenix, AZ.
Central Michigan University. (2004). Leadership competency model [O*NET database].
Retrieved January 2, 2008, from http://www.chsbs.cmich.edu/leader_model/model.htm
Chromy, J. R. (1979). Sequential sample selection methods. Proceedings of the American
Statistical Association, Section on Survey Research Methods (pp. 401–406). Washington,
DC: American Statistical Association.
Cicchetti, D. V., & Allison, T. (1971). A new procedure for assessing reliability of scoring EEG sleep recordings. American Journal of EEG Technology, 11, 101–109.
Couper, M. (2008). Designing effective Web surveys. New York, NY: Cambridge University
Press.
Deville, J. C., & Särndal, C. E. (1992). Calibration estimation in survey sampling. Journal of the
American Statistical Association, 87(418), 376–382.
Dierdorff, E. C., Norton, J. J., Drewes, D. W., Kroustalis, C. M., Rivkin, D., & Lewis, P. (2009).
Greening of the world of work: Implications for O*NET-SOC and new and emerging
occupations. Raleigh, NC: National Center for O*NET Development. Retrieved from
http://www.onetcenter.org/dl_files/Green.pdf
Dillman, D. (1978). Mail and telephone surveys: The total design method. New York: Wiley.
Dillman, D. (2000). Mail and Internet surveys: The tailored design method. New York: Wiley.
Elliott, S. W. (2007, May). Projecting the impact of computers on work in 2030. Paper presented
at the Workshop on Research Evidence Related to Future Skill Demands Center for
Education, National Research Council, Washington, DC.
Fleishman, E. A., & Mumford, M. D. (1988). The ability requirements scales. In S. Gael (Ed.),
The job analysis handbook for business, industry, and government. New York: Wiley.
Fleishman, E. A., & Mumford, M. D. (1991). Evaluating classifications of job behavior: A
construct validation of the ability requirements scales. Personnel Psychology, 44(3), 523–
575. doi:10.1111/j.1744-6570.1991.tb02403.x
Folsom, R. E., Potter, F. J., & Williams, S. R. (1987). Notes on a composite size measure for
self-weighting samples in multiple domains. Proceedings of the American Statistical
Association, Section on Survey Research Methods (pp. 792–796). Washington, DC:
American Statistical Association.
Folsom, R. E., & Singh, A. C. (2000). A generalized exponential model of sampling weight
calibration for extreme values, nonresponse and poststratification. Proceedings of the
American Statistical Association, Section on Survey Research Methods (pp. 598–603).
Washington, DC: American Statistical Association. Available from
https://www.amstat.org/Sections/Srms/Proceedings/
Folsom, R. E., & Witt, M. B. (1994). Testing a new attrition nonresponse adjustment method for
SIPP. Proceedings of the American Statistical Association, Social Statistics Section.
Washington, DC: American Statistical Association.
Hanna, J. (2008, December). How many U.S. jobs are "offshorable"? Harvard Business School Working Knowledge newsletter. Retrieved December 8, 2008, from http://hbswk.hbs.edu/item/6012.html
Hanser, L. M., Campbell, J., Pearlman, K., Petho, F., Plewes, T., & Spenner, K. (2008). Final
report of the panel on the Department of Defense human capital strategy. Report
prepared for the Office of the Secretary of Defense by the RAND Corporation.
Johnson, L., Jones, A., Butler, M., & Main, D. (1981). Assessing interrater agreement in job
analysis ratings. San Diego, CA: Naval Health Research Center.
Johnson, R. W., Mermin, B. T., & Resseger, M. (2007, November). Employment at older ages
and the changing nature of work (Urban Institute Report No. 2007-20). Washington, DC:
AARP.
Klein, R. J., Proctor, S. E., Boudreault, M. A., & Tuczyn, K. M. (2002). Healthy People 2010
criteria for data suppression. Hyattsville, MD: National Center for Health Statistics.
Korn, E. L., & Graubard, B. I. (1999). Analysis of health surveys. New York: Wiley.
Landrum, R. E. (2009). Finding jobs with a psychology bachelor’s degree: Expert advice for
launching your career. Washington, DC: American Psychological Association.
Leeuw, A. (2006, December). The butcher, the baker and the candlestick-maker revisited: Indiana's new skills-based career clusters. INcontext, 7(12). Retrieved from http://www.incontext.indiana.edu/2006/december/2006.asp
Mumford, M. D., Peterson, N. G., & Childs, R. A. (1997). Basic and cross-functional skills:
Evidence for the reliability and validity of the measures. In N. G. Peterson, M. D.
Mumford, W. C. Borman, P. R. Jeanneret, E. A. Fleishman & K. Y. Levin (Eds.), O*NET
final technical report. Salt Lake City, UT: Utah Department of Workforce Services
through a contract with American Institutes of Research.
Paxon, M. C., Dillman, D. A., & Tarnai, J. (1995). Improving response to business mail survey.
In B. G. Cox (Ed.), Business survey methods. New York: Wiley.
Penne, M., & Williams, R. (2007, July 30). Aggressive weight trimming evaluation. Research Triangle Park, NC: RTI International.
Peterson, N. G., Mumford, M. D., Borman, W. C., Jeanneret, P. R., & Fleishman, E. A. (1995).
Development of prototype Occupational Information Network (O*NET) content model.
Salt Lake City: Utah Department of Workforce Service through a contract with the
American Institutes of Research.
Peterson, N. G., Mumford, M. D., Borman, W. C., Jeanneret, P. R., Fleishman, E. A., & Levin,
K. Y. (Eds.). (1997). O*NET final technical report. Salt Lake City: Utah Department of
Workforce Services through a contract with American Institutes for Research.
Peterson, N. G., Mumford, M. D., Borman, W. C., Jeanneret, P. R., Fleishman, E. A., Levin, K.
Y., Campion, M. A., Mayfield, M. S., Morgeson, F. P., Pearlman, K., Gowing, M. K.,
Lancaster, A. R., Silver, M. B., & Dye, D. M. (2001). Understanding work using the
Occupational Information Network (O*NET): Implications for practice and research.
Personnel Psychology, 54(2), 451–492. doi:10.1111/j.1744-6570.2001.tb00100.x
Peterson, N. G., Mumford, M. D., Levin, K. Y., Green, J., & Waksberg, J. (1997). Research
method: Development and field testing of the content model. In N. G. Peterson, M. D.
Mumford, W. C. Borman, P. R. Jeanneret & K. Y. Levin (Eds.), O*NET final technical
report. Salt Lake City: Utah Department of Workforce Services, through a contract with
the American Institutes for Research.
Peterson, N. G., Mumford, M. D., Levin, K. Y., Green, J., & Waksberg, J. (1999). Research
method: Development and field testing of the content model. In N. G. Peterson, M. D.
Mumford, W. C. Borman, P. R. Jeanneret & E. A. Fleishman (Eds.), An occupational
information system for the 21st century: The development of O*NET (pp. 31–47).
Washington, DC: American Psychological Association.
Peterson, N. G., Owens-Kurtz, C., Hoffman, R. G., Arabian, J. M., & Whetzel, D. C. (1990).
Army synthetic validation project. Alexandria, VA: U.S. Army Research Institute for the
Behavioral Sciences.
Petroni, R., Sigman, R., Willimack, D., Cohen, S., & Tucker, C. (2004). Response rates and
nonresponse in BLS and Census Bureau establishment surveys. Proceedings of the
Survey Research Methods Section (pp. 4159–4166). Alexandria, VA: American
Statistical Association.
Postlethwaite, B. E., Wang, X., Casillas, A., Swaney, K., McKinniss, T. L., Allen, J., et al. (2009, April). Person-occupation fit and integrity: Evidence for incremental validity. Paper presented at the annual conference of the Society for Industrial and Organizational Psychology, New Orleans, LA.
Roth, P. L., & BeVier, C. (1998). Response rates in HRM/OB survey research: Norms and
correlates, 1990–1994. Journal of Management, 24(1), 97–117.
doi:10.1177/014920639802400107
RTI International (Producer). (2004). SUDAAN language manual (Release 9.0).
Russell, T. L., Sinclair, A., Erdheim, J., Ingerick, M., Owens, K., Peterson, N., et al. (2008). Evaluating the O*NET occupational analysis system for army competency development (Contract for Manpower, Personnel, Leader Development, and Training for the U.S. Army Research Institute). Alexandria, VA: HumRRO.
Thompson, K. R., & Koys, D. J. (2010). The management curriculum and assessment journey:
Use of Baldridge criteria and the Occupational Network Database. Journal of Leadership
and Organizational Studies, 17(2), 156–166.
Tracey, J. B., Sturman, M. C., & Tews, M. J. (2007). Ability versus personality: Factors that
predict employee job performance. Cornell Hotel and Restaurant Administration
Quarterly, 48(3), 313-322.
Treasury Board of Canada Secretariat. (2009, May). Public servants on the public service of
Canada: Summary of the results of the 2008 Public Service Employee Survey. Retrieved
from http://www.tbs-sct.gc.ca/pses-saff/2008/report-rapport-eng.asp#appa
Tsacoumis, S., & Van Iddekinge, C. H. (2006). A comparison of incumbent and analyst ratings
of O*NET skills. Alexandria, VA: Human Resources Research Organization.
Tulp, D. R., Jr., Hoy, E., Kusch, G., & Cole, S. (1991). Nonresponse under mandatory vs. voluntary reporting in the 1989 Survey of Pollution Abatement Costs and Expenditures (PACE). Proceedings of the American Statistical Association, Section on Survey Research Methods (pp. 272–277). Washington, DC: American Statistical Association.
U.S. Bureau of Labor Statistics. (2011). Labor force statistics from the Current Population
Survey. Table 12: Employed persons by sex, occupation, class of worker, full- or part-time status, and race. Retrieved from http://www.bls.gov/cps/cpsaat12.pdf
U.S. Department of Labor, Employment & Training Administration. (2008). O*NET data
collection program: Office of Management and Budget clearance package supporting
statement and data collection instruments. Washington, DC: Author.
U.S. Office of Personnel Management. (2010). Federal Employee Viewpoint Survey.
Washington, DC: Author. Available from http://www.fedview.opm.gov/
Willenborg, L., & De Waal, T. (1996). Statistical disclosure control in practice series: Lecture
notes in statistics (Vol. 111). New York, NY: Springer.
Worden, G., & Hoy, E. (1992). Summary of nonresponse studies conducted by industry division, 1989–1991. Washington, DC: U.S. Bureau of the Census.
Workforce Investment Act, Pub. L. 105-220, 112 Stat. 936 (1998).