Appendix E
Technology and Engineering Literacy Reports
1. Student report
2. School Report
3. Final Adjudication Decisions
NAEP Item Development (ID)
Technology and Engineering Literacy (TEL)
2014 Probe Survey Questionnaire
Recommendations to NCES
Grade 8 Student Questionnaire
Deliverable in response to ID Task 3.1.2
Submitted: March 12, 2013
Table of Contents
2014 Technology and Engineering Literacy (TEL)
Grade 8 Student Questionnaire:
Post-Pilot Analysis and Recommendations
Background
Criteria for item review
Recommendations for the 2014 TEL Probe assessment – Overview
Item Review and Recommendations for the 2014 TEL Student Grade 8 Probe assessment
References
2014 Technology and Engineering Literacy (TEL)
Grade 8 Student Questionnaires:
Post-Pilot Analysis and Recommendations
This document provides a post-pilot review of the grade 8 student and school Technology and
Engineering Literacy (TEL) questionnaires using data collected in the 2013 pilot administration.
The goal of this report is to evaluate the performance of the items and propose a set of
questions that will be included in the 2014 TEL probe assessment. As such, this review serves
the research objective to develop items that provide reportable survey results based on
measures of contextual factors that might explain differences in student performance (e.g.,
more proficient students have access to more instructional or extracurricular content related
to engineering design).
Unlike previous post-pilot reviews, this recommendation required a much stricter item
evaluation and selection because of the spiraled design used in the 2013 pilot
administration (see more details below). The questionnaire material had to be reduced from
approximately 24 minutes of assessment time in the pilot (based on actual pilot timing data)
to 10 minutes of assessment time in the probe assessment. Recommendations are based on a
combination of criteria including an analysis of frequency data and theoretical considerations.
Background
The NAEP TEL assessment measures three core areas of interest: Technology and Society
(T&S), Design and Systems (D&S), and Information and Communication Technologies (ICT).
Technology and Society addresses the effects that technology has on society and on the
natural world. Design and Systems covers the nature of technology, the engineering design
process, and basic principles of dealing with everyday technologies. Information and
Communication Technology includes computers and software learning tools, networking
systems and protocols, handheld digital devices, and other technologies for accessing,
creating, and communicating information and for facilitating creative expression (WestEd,
2010). The TEL issues paper identified four broad issues that informed and guided the
development of the survey questionnaire administered in the pilot assessment. These are: availability of
school resources; organization of technology and engineering instruction; teacher
preparation; and student engagement. Each of these issues comprises several sub-issues. Two
of these four issues were covered in both the student and school questionnaires. Teacher
preparation was covered in the school questionnaire only and student engagement was
covered in the student questionnaire only.
The 2013 TEL pilot used a spiraled design where not every student received all questionnaire
items. In order to maximize the number of questions included, certain items were spiraled
across different, partially overlapping questionnaire forms. This design was chosen to make
better use of the limited questionnaire time, while covering a maximum number of topics and
constructs, particularly in the pilot assessment. This ensured that a sufficient number of items
pertaining to each topic and each issue could be piloted. The available questionnaire response
time was 15 minutes per student, of which approximately 4–5 minutes were devoted to a
student core section (e.g., demographic items) which is required to be administered as part of
every NAEP survey questionnaire. The remaining 10–11 minutes were devoted to TEL-specific
questions. At the end of the questionnaire, a few debriefing items were administered, as was
the case in other NAEP questionnaires.
In the spiraling design, questions were divided into separate blocks (or sections) that were
configured into 10 different booklets (see details below) that were administered to students.
Spiraling provided randomly equivalent samples of students receiving each of the blocks.
Using timing data collected from the tryout, an appropriate division of items and a spiraling
approach was determined that would adequately pilot all of the items without reducing data
quality or increasing respondent burden. Students were allotted four minutes to complete
each of the five blocks, and each booklet contained two blocks. The ten possible
combinations of the five blocks resulted in 10 different booklets. This procedure ensured that
1) every spiraled BQ item was paired with other spiraled BQ items; 2) position effects of the
blocklets within a combination were removed (i.e., each of the five blocklets appeared an
equal number of times in each of the two positions); and 3) each of the BQ combinations was
paired with each of the cognitive blocks. Table 1 shows how the items were distributed
across the ten booklets, with each category represented by a different color.
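The combinatorics of this pairing scheme can be sketched in a few lines of Python (a hypothetical reconstruction for illustration only; the actual booklet-assembly procedure and block contents are not specified in this report). Five blocks paired two at a time yield exactly ten booklets, with every pair of blocks occurring once and each block appearing equally often in the first and second position:

```python
def spiral_booklets(blocks):
    """Pair blocks two at a time so that every unordered pair occurs exactly
    once and each block appears equally often in each of the two positions.
    With five blocks this yields C(5,2) = 10 booklets."""
    n = len(blocks)
    booklets = []
    for i in range(n):
        # Pairing block i with the next two blocks (cyclically) covers every
        # pair once and balances first/second position for each block.
        booklets.append((blocks[i], blocks[(i + 1) % n]))
        booklets.append((blocks[i], blocks[(i + 2) % n]))
    return booklets

# Hypothetical block labels; the pilot's actual block contents differ.
booklets = spiral_booklets(["B1", "B2", "B3", "B4", "B5"])
```

Because each student sees only one booklet, spiraling produces randomly equivalent samples of students for every block.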
Criteria for Item Review
Analysis of Frequency Distributions
Following the same procedure as for other background questionnaire item reviews, a set of
frequency-based flagging criteria was applied in evaluating whether items were
applicable to the targeted population. It is important to keep in mind that flags are indications
that a particular item should be thoroughly evaluated. Flags are not absolute criteria for
making decisions regarding the use or quality of items. Instead, the flagging criteria should be
viewed collectively, along with other criteria and professional judgment, in recommending
keeping, revising, or dropping items from the 2014 TEL Probe survey questionnaires.
For this analysis, we examined item response frequencies for response options. We also
assessed item non-response patterns to determine whether problematic items or response
options warrant revising items, expanding or collapsing response categories, or dropping an
item. The flagging criteria on response patterns and item non-response (i.e., missing response)
rates for reviewing the data are:
- A high percentage of item non-response (relative to other adjacent items) may indicate
that the item content might have been problematic (e.g., ambiguous, burdensome,
overly complex, offensive) or that the format might have caused respondents to
overlook the item. Note that this criterion does not apply to multiple-selection
multiple-choice items, because the missing rate for “select all that apply” items
contains both missing responses and “not apply” responses.
- Low single-category response rates (e.g., <10%) may indicate that a category does not
apply to this population and possibly that different categories may be more
informative.
- High single-category response rates (e.g., >80%) may indicate that almost all
respondents in the population fall into one category and that only a limited range of
demographic or behavior indicators is collected.
In addition to these flagging criteria, we investigated differences in relative frequencies
for response categories across the different booklets in which an item had been
administered. High variation in relative frequencies across booklets might be an
indicator of instability of item performance.
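As an illustration, the flagging thresholds and the booklet-variation check described above could be expressed as follows (a sketch under the stated thresholds; the function names and the choice of a single baseline value for the "relative to adjacent items" comparison are our own, not taken from the report):

```python
def flag_item(category_pcts, nonresponse_pct, adjacent_nonresponse_pct,
              low=10.0, high=80.0):
    """Apply the frequency-based flagging criteria to a single item.

    category_pcts: relative frequency (in %) of each response category.
    nonresponse_pct: the item's missing-response rate (in %).
    adjacent_nonresponse_pct: typical missing rate of neighboring items,
        used here as the baseline for the relative non-response criterion.
    Flags signal that an item deserves closer review, not automatic removal.
    """
    flags = []
    if nonresponse_pct > adjacent_nonresponse_pct:
        flags.append("high item non-response")
    if any(p < low for p in category_pcts):
        flags.append("low single-category response rate")
    if any(p > high for p in category_pcts):
        flags.append("high single-category response rate")
    return flags


def max_booklet_difference(category_pcts_by_booklet):
    """Largest spread (in percentage points) of any single response
    category's relative frequency across the booklets."""
    per_category = zip(*category_pcts_by_booklet.values())
    return max(max(c) - min(c) for c in per_category)
```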
Implications of flags on any of these criteria for a given item will be discussed in more detail.
Note again that flagging an item on one of these criteria does not, by itself, warrant
revising or dropping the item. Whether an item needs to be modified or revised
also depends on whether response categories are unique to an item versus representing one
of the standard formats used across many questionnaire items, and whether certain response
categories are needed from a theoretical perspective. Maintaining a high level of consistency
across items is one important consideration for the validity of the questionnaires as well as
the trend information being collected.
Response Time/Burden
Item selection was guided by available response time data from the pilot assessment. All
timing estimates were based on the 90th percentile of the response time in the available
pilot data. The rationale for choosing this value is that a maximum non-completion rate of
10% is expected when basing timing decisions on the 90th percentile. Results from previous
analyses with NAEP questionnaires showed that a 10% missing rate is acceptable
from a sampling and analysis perspective, with unbiased estimates for each subgroup
investigated. Further, 10% missing rates for BQ items have been considered acceptable in
previous operational NAEP assessments. For example, in the 2011 Grade 8 science
assessment, some science-specific BQ items in the teacher's BQ questionnaire used in the
NAEP analysis had missing rates of 11% (2009: 8%) or 12% (2009: 13%). Our previous
operational experience shows that a response rate of about 90% still ensures a valid sample
that represents the whole population. Therefore, using the 90th percentile of the completion
time as the cutoff is effective from an analysis perspective. Note that the same rule for
timing estimates was used when the TEL pilot questionnaires were assembled. The data used
for the item review summarized in this document show that using the 90th percentile actually
resulted in rather conservative time estimates, with missing value rates clearly below 10%.
Even for the items at the very end of the questionnaire, missing value rates only reached
values of around 5% for the TEL pilot assessment. Overall, missing value rates were close to
zero for most items.
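The 90th-percentile rule can be illustrated with a short sketch (hypothetical; the report does not state which percentile convention was used, so the nearest-rank convention is assumed here):

```python
import math

def item_time_budget(response_times_sec, pct=90):
    """Per-item timing estimate: the pct-th percentile (nearest-rank
    convention) of observed response times. Budgeting at the 90th
    percentile means at most about 10% of students are expected to
    need more time than is allotted."""
    times = sorted(response_times_sec)
    rank = math.ceil(pct / 100 * len(times))  # nearest-rank percentile
    return times[rank - 1]
```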
The number of TEL-specific items that can be selected for the probe assessment was limited to
what will fit into 600 seconds of questionnaire time. The recommendations summarized in the
following are in line with the criterion of including no more than 600 seconds of material.
Wherever feasible, reducing the number of sub-items in matrix items was considered as one
means of reducing burden while maintaining the breadth of the questionnaire.
Because of the strict timing constraints, many of the items that are recommended to be
dropped showed satisfactory performance and might serve as an item pool for future
assessments.
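Selection under the 600-second cap can be sketched as a simple budget check (a hypothetical greedy sketch; the actual selection also weighed the theoretical criteria described in the next section):

```python
def select_within_budget(candidates, budget_sec=600):
    """Keep items, in priority order, while the cumulative 90th-percentile
    response time stays within the probe's questionnaire-time budget.

    candidates: (item_id, p90_time_sec) pairs, highest priority first.
    Returns the kept item ids and their total estimated time."""
    kept, total = [], 0.0
    for item_id, p90_time in candidates:
        if total + p90_time <= budget_sec:
            kept.append(item_id)
            total += p90_time
    return kept, total
```

Items excluded by such a budget check may still perform satisfactorily, which is why the report retains them as a pool for future assessments.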
Theoretical Coherence, Relevance, and Content Coverage
In addition to purely data-driven evaluation, a number of theory-based criteria were applied
when making recommendations. First, even a reduced questionnaire needs to include
questions about each of the three main topic areas as well as the three issues covered in the
student questionnaire. Items were selected to create a balance in the number of items or sub-items across the three areas and to cover the most important themes and sub-themes as
defined in the TEL issues paper.
Another important goal was to include items addressing both at-school and out-of-school
learning and activities given the importance of out-of-school learning experiences for TEL. Few
studies have systematically investigated effects of out-of-school learning experiences. The
available research suggests that under certain circumstances, technology and engineering
education can boost learning and achievement in science and mathematics.
Further, covering all three topic areas with student engagement questions was an
important criterion. This meant including questions on student interest in each of the
domains, given the importance of interest in particular content areas as a predictor of
subsequent career choices (e.g., Holland, 1997; Lubinski, 2000), as well as a selection of
self-efficacy or self-concept items for the three areas. Meta-analyses have shown that
self-efficacy is one of the strongest predictors of student achievement (e.g., Richardson,
Abraham, & Bond, 2012). Students' self-efficacy perceptions can be considered an important
achievement predictor as well as an informative outcome with policy relevance in its own right.
When possible, items with clearly quantifiable, behavior-related response categories were
preferred over items with more vague response categories.
Table 1 - Spiraling Design in the TEL Pilot

J2TXBQ01: VE631435, VE631437, VE011083, VE011103, VF541314, VE011108, VE011109, VE011111, VE011063, VE011064, VE011121, VF541324, VE117468, VE682225, VE682232, VE639166, VE639123, VF025108, VE682276, VE638999, VE639008, VE682315, VE682317, VF009358, VE401773, VE401776, VE401779

J2TXBQ02: VE631435, VE631437, VE011083, VE011103, VF541314, VE011108, VE011109, VE011111, VE011063, VE011064, VE011121, VF541324, VE117468, VE639123, VF025108, VE682276, VE639025, VF009050, VE682274, VE682217, VE682215, VE638999, VE639008, VE682315, VE682317, VF009358, VE401773, VE401776, VE401779

J2TXBQ03: VE631435, VE631437, VE011083, VE011103, VF541314, VE011108, VE011109, VE011111, VE011063, VE011064, VE011121, VF541324, VE117468, VE639025, VF009050, VE682274, VE682217, VE682215, VE639847, VE638956, VE638983, VF238958, VE638999, VE639008, VE682315, VE682317, VF009358, VE401773, VE401776, VE401779

J2TXBQ04: VE631435, VE631437, VE011083, VE011103, VF541314, VE011108, VE011109, VE011111, VE011063, VE011064, VE011121, VF541324, VE117468, VE639847, VE638956, VE638983, VF238958, VE639842, VE639172, VE681624, VE638999, VE639008, VE682315, VE682317, VF009358, VE401773, VE401776, VE401779

J2TXBQ05: VE631435, VE631437, VE011083, VE011103, VF541314, VE011108, VE011109, VE011111, VE011063, VE011064, VE011121, VF541324, VE117468, VE639842, VE639172, VE681624, VE682225, VE682232, VE639166, VE638999, VE639008, VE682315, VE682317, VF009358, VE401773, VE401776, VE401779

J2TXBQ06: VE631435, VE631437, VE011083, VE011103, VF541314, VE011108, VE011109, VE011111, VE011063, VE011064, VE011121, VF541324, VE117468, VE682225, VE682232, VE639166, VE639025, VF009050, VE682274, VE682217, VE682215, VE638999, VE639008, VE682315, VE682317, VF009358, VE401773, VE401776, VE401779

J2TXBQ07: VE631435, VE631437, VE011083, VE011103, VF541314, VE011108, VE011109, VE011111, VE011063, VE011064, VE011121, VF541324, VE117468, VE639123, VF025108, VE682276, VE639847, VE638956, VE638983, VF238958, VE638999, VE639008, VE682315, VE682317, VF009358, VE401773, VE401776, VE401779

J2TXBQ08: VE631435, VE631437, VE011083, VE011103, VF541314, VE011108, VE011109, VE011111, VE011063, VE011064, VE011121, VF541324, VE117468, VE639847, VE638956, VE638983, VF238958, VE682225, VE682232, VE639166, VE638999, VE639008, VE682315, VE682317, VF009358, VE401773, VE401776, VE401779

J2TXBQ09: VE631435, VE631437, VE011083, VE011103, VF541314, VE011108, VE011109, VE011111, VE011063, VE011064, VE011121, VF541324, VE117468, VE639025, VF009050, VE682274, VE682217, VE682215, VE639842, VE639172, VE681624, VE638999, VE639008, VE682315, VE682317, VF009358, VE401773, VE401776, VE401779

J2TXBQ10: VE631435, VE631437, VE011083, VE011103, VF541314, VE011108, VE011109, VE011111, VE011063, VE011064, VE011121, VF541324, VE117468, VE639842, VE639172, VE681624, VE639123, VF025108, VE682276, VE638999, VE639008, VE682315, VE682317, VF009358, VE401773, VE401776, VE401779

Note. Within each booklet, the item sequence follows the order from left to right.
Legend (each category was shown in a different color in the original table):
Common (items apply to all the three assessment areas, that is D&S, ICT, and T&S)
Design and Systems
Information and Communication Technology
Technology and Society
Common Debrief
Recommendations for the 2014 TEL Probe Assessment – Overview
In the following we will summarize our item review in detail for each item, along with the
recommendation for the 2014 TEL Probe assessment. Three different cases are distinguished:
1.) An item is recommended to be kept in the questionnaire;
2.) An item is recommended to be dropped from the questionnaire based on timing constraints
and student burden;
3.) An item is recommended to be dropped from the questionnaire based on poor item performance.
Note that in both cases 1 and 2, an item shows performance satisfactory for inclusion in the 2014 TEL
probe assessment; in case 2, it is nevertheless recommended to be dropped, based only on timing
constraints and student burden.
For some matrix items we may recommend keeping the item stem and certain sub-items but dropping
specific sub-items, either based on item performance or for burden reasons.
Table 2 presents an overview of all recommendations. For each item, the number in the booklet, the
Accession number, and the area and issue that are addressed are given in the table. In addition, the
two rightmost columns summarize our recommendation and the rationale for recommending dropping
complete items or sub-items.
Table 2 - Recommendations for 2014 TEL Probe - Overview

Item # | AccNum | Area | Issue | Recommendation | Rationale for Recommendation
14 | VE682225 | Design and Systems | Org. of Instruction | Drop | Time restrictions/student burden
15 | VE682232 | Design and Systems | Student Engagement | Drop | Time restrictions/student burden
16 | VE639166 | Common | Student Engagement | Keep | NA
17 | VE639123 | ICT | Org. of Instruction | Keep this item and sub-items a, c, d, e, and f; drop sub-item b (VE639127) | Time restrictions/student burden
18 | VF025108 | ICT | Student Engagement | Keep this item and sub-items a, c, d, e, and f; drop sub-item b (VF025110) | Time restrictions/student burden
19 | VE682276 | Design and Systems | Student Engagement | Keep this item and sub-items b, c, and d; drop sub-items a (VE682278) and e (VE682286) | Time restrictions/student burden
20 | VE639025 | Common | Student Engagement | Drop | Time restrictions/student burden and item performance
21 | VF009050 | Common | Student Engagement | Drop | Time restrictions/student burden and item performance
22 | VE682274 | Design and Systems | Availability and Use of Instructional Resources | Drop | Item performance
23 | VE682217 | ICT | Student Engagement | Keep this item and sub-items a, b, and c; drop sub-item d (VE682222) | Time restrictions/student burden
24 | VE682215 | ICT | Availability and Use of Instructional Resources | Drop | Item performance
25 | VE639847 | Common | Org. of Instruction | Drop | Time restrictions/student burden
26 | VE638956 | Design and Systems | Org. of Instruction | Keep this item and sub-items b, c, d, and e; drop sub-items a (VE638957) and f (VE682248) | Time restrictions/student burden and item performance
27 | VE638983 | Design and Systems | Student Engagement | Keep this item and sub-items c, d, e, and f; drop sub-items a (VE638986), b (VE009777), and g (VE682268) | Time restrictions/student burden and item performance for sub-item g
28 | VF238958 | ICT | Org. of Instruction | Drop | Time restrictions/student burden
29 | VE639842 | Common | Org. of Instruction | Keep | NA
30 | VE639172 | Common | Student Engagement | Drop | Item performance
31 | VE681624 | ICT | Org. of Instruction | Keep | NA
32 | VE638999 | Technology and Society | Org. of Instruction | Keep | NA
33 | VE639008 | Technology and Society | Student Engagement | Keep | NA
34 | VE682315 | Technology and Society | Availability and Use of Instructional Resources | Drop | Item performance
35 | VE682317 | Technology and Society | Student Engagement | Keep | NA
36 | VF009358 | Common -- Debrief | Student Engagement | Keep | NA
37 | VE401773 | Common -- Debrief | Student Engagement | Keep | NA
38 | VE401776 | Common -- Debrief | Student Engagement | Keep | NA
39 | VE401779 | Common -- Debrief | Student Engagement | Keep | NA

Note. NA = not applicable
Item Review and Recommendations for the 2014 TEL Student Grade 8 Probe
Assessment
Note: Core NAEP background items are not included in this review
VE682225
14. In school, how often have you learned about or discussed the following? Select one circle in each
row. (Response options: Never / Rarely / Sometimes / Often)
a. The use and purpose of tools, machines, or devices (VE682226)
b. The care or maintenance of tools, machines, or devices (VE682227)
c. Designing or creating something to solve a problem (VE682228)
d. Designing something when there is limited time, money, or materials (VE682229)
e. Figuring out how to fix something (VE682230)
f. Finding the right people to work with or get help from to fix something (VE682231)
Area: Design and Systems
Issue: Organization of Instruction
Booklets: 1, 5, 6, 8
Item Review:
This item provides contextual information for the cognitive assessment pertaining to Design and
Systems literacy. It focuses on TEL-related learning opportunities and experiences at school. Exactly the
same sub-items as in its outside-of-school-related counterpart (VE682232) are used. The frequency
distribution across response options is satisfactory for all sub-items with reasonable balance of responses
across all response categories. The missing rate is close to zero (0.23 – 0.29%) across all questionnaire
booklets. The maximum difference in response proportions across the four booklets in which the item
was administered is 13 percentage points, which indicates some variation in item performance
across booklets that should be investigated more closely.
Overall, this item and all sub-items show satisfactory item statistics to be included in the TEL student
questionnaire. There is no evidence that any response categories need to be collapsed, expanded, or
deleted. Item VE638956, however, covers similar content and should be given higher priority as the
response options represent more quantifiable, behavior-related categories than those in item
VE682225.
Recommendation:
Drop this item and all sub-items from the 2014 TEL Probe Administration based on time
restrictions/consideration of student burden.
VE682232
15. Outside of school, how often have you learned about or discussed the following? Select one circle
in each row. (Response options: Never / Rarely / Sometimes / Often)
a. The use and purpose of tools, machines, or devices (VE682233)
b. The care or maintenance of tools, machines, or devices (VE682234)
c. Designing or creating something to solve a problem (VE682238)
d. Designing something when there is limited time, money, or materials (VE682236)
e. Figuring out how to fix something (VE682237)
f. Finding the right people to work with or get help from to fix something (VE682235)
Area: Design and Systems
Issue: Student Engagement
Booklets: 1, 5, 6, 8
Item Review:
This item provides contextual information for the cognitive assessment pertaining to Design and
Systems literacy. It focuses on TEL-related learning opportunities and experiences outside of school.
Exactly the same sub-items as in the previous at-school item are used. The frequency distribution across
response options is satisfactory for five of the six sub-items with reasonable balance of responses across all
response categories. Sub-item e has a low response rate (less than 10%) for one category. The missing rate
is close to zero (0.31 – 0.39%) across all questionnaire booklets. The maximum difference in response
proportions across the four booklets in which the item was administered is 13 percentage points, which
might indicate some variation in item performance across booklets.
Overall, this item and all sub-items show satisfactory item statistics to be included in the TEL student
questionnaire. There is no evidence that any response categories need to be collapsed, expanded, or
deleted. Item VE638983, however, covers similar content and should be given higher priority as the
response options represent more quantifiable, behavior-related anchors than those in item VE682232.
Recommendation:
Drop this item and all sub-items from the 2014 TEL Probe Administration based on time
restrictions/consideration of student burden.
VE639166
16. How interested are you in learning about the following areas of technologies? Select one circle in
each row. (Response options: Not at all interested / Not too interested / Somewhat interested / Very interested)
a. Information and communication (for example, computers, Internet, social networking sites) (VE639168)
b. Transportation (for example, cars, planes, trains, traffic analysis) (VE639169)
c. Construction (for example, architecture, building a bridge) (VE639171)
d. Power and energy (for example, dams, power plants, batteries) (VE639173)
e. Environmental and green technologies (for example, recycling, renewable energy sources such as sunlight and wind) (VE639174)
f. Agriculture (for example, farming, food chemistry) (VE639175)
g. Medical technologies (for example, vaccines, drugs, surgical tools, heart monitors, x-ray machines) (VE639176)
h. Home and domestic (for example, air conditioning, cleaning, cooking, heating, plumbing, sewing) (VF009755)
i. Manufacturing (for example, what goes on in factories, developing or improving products) (VE639170)
Area: Common
Issue: Student Engagement
Booklets: 1, 5, 6, 8
Item Review:
This is a common item across the three TEL areas and provides information about students’ interest in
nine areas of technologies. The item collects important information regarding whether students would
like to learn more in each of the areas of technologies in the future; as such, the results may inform TEL
curriculum design and have important policy implications. The frequency distribution across response
options is satisfactory for all sub-items with reasonable balance of responses across all response categories.
The missing rate is less than 1% (0.45 – 0.56%) in the aggregated response frequency. There is no
evidence that any response options need to be collapsed, expanded, or deleted. The maximum difference
in response proportions across the four booklets in which the item was administered is 6 percentage points, indicating that performance is very stable across different booklets for this item.
This item and all sub-items show satisfactory item statistics to be included in the TEL student
questionnaire. In direct comparison with the two other matrix items focusing on student engagement
(VE639025, VF009050), higher priority should be given to this item because it provides more specific
information about multiple areas of technologies, and not only on technology and engineering in
general. Moreover, frequency distributions for this item are more balanced than those for items
VE639025 and VF009050 allowing better differentiation across students.
Recommendation:
Keep this item and all sub-items in the 2014 TEL Probe Administration.
VE639123
17. For school work, how often do you use a computer or other digital technology for the following
activities? Select one circle in each row. (Response options: Never or almost never / A few times a year / Once or twice a month / Once or twice a week / Every day or almost every day)
a. Send or receive messages (for example, chat, e-mail, instant messages, text messages) (VE639125)
b. View or download digital media (for example, art, books, games, mobile apps, music, pictures, software, videos) (VE639127)
c. Create, edit, or organize digital media (VE639130)
d. Send, share, present, or upload digital media (VE639131)
e. Create a presentation (VE639137)
f. Create a spreadsheet (a table or grid that displays data into columns and rows and may be used to create charts and graphs) (VE639136)
Area: Information and Communication Technology
Issue: Organization of Instruction
Booklets: 1, 2, 7, 10
Item Review:
This item provides contextual information for the cognitive assessment pertaining to Information and
Communication Technology literacy. It focuses on TEL-related learning opportunities and experiences
at school. Exactly the same sub-items as in its outside-of-school-related counterpart (VF025108) are
used. The frequency distribution across response options is satisfactory for four of the six sub-items with
reasonable balance of responses across all response categories. Some response categories for sub-items e
and f have low response rates of less than 10%. The missing rate is close to zero (0.43 – 0.62%) across all
questionnaire booklets. There is no evidence that any response options need to be collapsed, expanded, or
deleted. The maximum difference in response proportions across the four booklets in which the
item was administered is 9 percentage points, indicating relatively stable performance across different
booklets for this item.
This item and all sub-items show satisfactory item statistics to be included in the TEL student
questionnaire. The item collects important information regarding the frequencies at which students
use a computer or other digital technology while doing ICT-related activities.
A further content review of the TEL cognitive assessment items indicated that sub-item b of this matrix
item is less strongly linked to the content of any cognitive item compared to the other sub-items.
Despite its satisfactory performance, dropping sub-item b in order to reduce student burden would be
reasonable.
Recommendation:
Keep this matrix item and sub-items a, c, d, e, and f in the 2014 TEL Probe Administration.
Drop sub-item b from the 2014 TEL Probe administration based on time restrictions/consideration of
student burden.
VF025108
18. In this question, please think about activities you do that are not related to your school work. How
often do you use a computer or other digital technology for the following activities not for school
work? Select one circle in each row. (Response options: Never or almost never / A few times a year / Once or twice a month / Once or twice a week / Every day or almost every day)
a. Send or receive messages (for example, chat, e-mail, instant messages, text messages) (VF025109)
b. View or download digital media (for example, art, books, games, mobile apps, music, pictures, software, videos) (VF025110)
c. Create, edit, or organize digital media (VF025112)
d. Send, share, present, or upload digital media (VF025113)
e. Create a presentation (VF025117)
f. Create a spreadsheet (a table or grid that displays data into columns and rows and may be used to create charts and graphs) (VF025116)
Area: Information and Communication Technology
Issue: Student Engagement
Booklets: 1, 2, 7, 10
Item Review:
This item provides contextual information for the cognitive assessment pertaining to Information and
Communication Technology (ICT) literacy. It focuses on TEL-related learning opportunities and
experiences outside of school. Exactly the same sub-items as in its at-school-related counterpart
(VE639123) are used. Although some response options have relative frequencies of less than 10% for
sub-items a, b, e, and f, there is sufficient variation across response options to distinguish different levels of
student engagement. Moreover, keeping response options the same for all sub-items, and consistent with
the response options used for the at-school related version of this matrix item is important for comparisons
of and reporting on the results for these items. The missing rate is close to zero (0.43 – 0.51 %) across all
questionnaire booklets. The maximum difference in response proportions across the four booklets in
which the item was administered is 8 percentage points, indicating relatively stable performance across
different booklets for this item.
This item and all sub-items show satisfactory item statistics to be included in the TEL student
questionnaire. The item collects important information regarding the frequencies at which students
use computer or other digital technology engaging in ICT related activities that is not for school work.
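The screening statistics cited for each item throughout this report (relative response frequencies, missing rates, and the maximum booklet-to-booklet difference in response proportions) can be sketched as follows. This is an illustrative sketch only; the function, data layout, and example values are hypothetical and do not reproduce the actual pilot analysis.

```python
from collections import Counter

OPTIONS = ["A", "B", "C", "D", "E"]

def screening_stats(responses_by_booklet):
    """Relative frequencies, missing rate, and maximum booklet-to-booklet
    difference in response proportions for a single item.
    responses_by_booklet maps booklet number to a list of 'A'-'E' responses,
    with None marking a missing response."""
    all_resp = [r for resp in responses_by_booklet.values() for r in resp]
    missing_rate = sum(r is None for r in all_resp) / len(all_resp)
    answered = [r for r in all_resp if r is not None]
    counts = Counter(answered)
    rel_freq = {o: counts[o] / len(answered) for o in OPTIONS}
    # Largest difference across booklets in the proportion selecting any option
    max_diff = 0.0
    for o in OPTIONS:
        props = [
            sum(r == o for r in resp if r is not None)
            / sum(r is not None for r in resp)
            for resp in responses_by_booklet.values()
        ]
        max_diff = max(max_diff, max(props) - min(props))
    return rel_freq, missing_rate, max_diff

# Hypothetical responses (None = missing) from two booklets:
booklets = {
    1: ["A", "B", "C", "D", None],
    2: ["A", "A", "C", "D", "E"],
}
rel_freq, missing_rate, max_diff = screening_stats(booklets)
print(missing_rate)  # 0.1
print(max_diff)      # 0.25 (option B: 0.25 in booklet 1 vs 0.0 in booklet 2)
```

An item would then be flagged, as in this report, when several options fall below a 10% relative frequency or when the booklet-to-booklet difference is large.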
A further content review of the TEL cognitive assessment items indicated that sub-item b of this matrix item is less strongly linked to the content of any cognitive item than the other sub-items are. Despite its satisfactory performance, dropping sub-item b (VF025110) in order to reduce student burden would be reasonable.
Recommendation:
Keep this matrix item and sub-items a, c, d, e, and f in the 2014 TEL Probe Administration.
Drop sub-item b from the 2014 TEL Probe administration based on time restrictions/consideration of
student burden.
VE682276
19. Do you think that you would be able to do each of the following? Select one circle in each row.
a. Build a model using a kit [VE682278]
b. Build a model without using a kit [VE682280]
c. Use tools or materials to fix something [VE682281]
d. Take something apart in order to fix it or see how it works [VE682284]
e. Design a computer program [VE682286]

Response options (one per row):
(A) I definitely can’t
(B) I probably can’t
(C) Maybe
(D) I probably can
(E) I definitely can

Area: Design and Systems
Issue: Student Engagement
Booklets: 1, 2, 7, 10
Item Review:
This item provides contextual information for the cognitive assessment pertaining to Design and
Systems (D&S). The item collects important information regarding student self-efficacy in conducting
activities that are related to Design and Systems. Meta-analyses have shown that self-efficacy is one of
the strongest predictors of student achievement (e.g., Richardson et al., 2012). This item can be
considered an important achievement predictor as well as an informative outcome with policy
relevance by itself.
Even though relative frequencies for response categories A and B are lower for most sub-items, with frequency distributions skewed to the right, there is sufficient variation across response options to differentiate across the range of student self-efficacy perceptions. The missing rate is close to zero (0.37–0.45%) across all questionnaire booklets. The maximum difference in response proportions across the four booklets in which the item was administered is 8 percentage points, indicating reasonably stable performance across booklets.
This item and all sub-items show satisfactory item statistics to be included in the TEL student
questionnaire. In addition to an analysis as stand-alone items, the sub-items should be analyzed as part
of a potential broader student self-efficacy index based on aggregation of these items with items
VE682217 and VE682317, which were designed to measure student self-efficacy in the two other
content areas.
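As a sketch of how such an aggregated index might be computed, the following codes the (A)–(E) responses 1–5 and averages them across the sub-items of the three self-efficacy items. The sub-item IDs follow this report, but the 1–5 coding, the function, and the example responses are illustrative assumptions, not the NAEP scaling procedure.

```python
# Illustrative only: simple mean-score index over the self-efficacy sub-items
# of items VE682276 (D&S), VE682217 (ICT), and VE682317 (T&S).
# The 1-5 coding and the example responses are hypothetical assumptions.
CODES = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}

def self_efficacy_index(responses):
    """Mean of coded responses over all answered sub-items (None if none answered)."""
    coded = [CODES[r] for r in responses.values() if r in CODES]
    return sum(coded) / len(coded) if coded else None

# One student's hypothetical answers, keyed by sub-item ID:
student = {
    "VE682280": "D", "VE682281": "E", "VE682284": "D",  # Design and Systems
    "VE682218": "C", "VE682219": "E", "VE682221": "D",  # ICT
    "VE682321": "C", "VE682323": "D", "VE682324": "C",  # Technology and Society
}
index = self_efficacy_index(student)
print(round(index, 2))  # 3.89
```

A production analysis would also check the internal consistency of the nine sub-items (e.g., Cronbach's alpha) before reporting such an index.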
Item content was reviewed in more detail to evaluate whether the number of sub-items might be reduced considering student burden and timing constraints. This review indicated that sub-items a and e were less strongly linked to the content of the cognitive items than the other sub-items. Further, relative frequencies for sub-items a and e were less balanced than for the other sub-items. Dropping sub-items a and e in order to reduce student burden would seem reasonable. Retaining three sub-items for this matrix item would also increase consistency with its Technology and Society counterpart (VE682317), for which only three sub-items were administered in the pilot.
Recommendation:
Keep this item and sub-items b, c, and d in the 2014 TEL Probe Administration.
Drop sub-items a and e from the 2014 TEL Probe administration based on time
restrictions/consideration of student burden.
VE639025
20. Technology refers to all the things people make and do to their natural environment in order to get
the things they want and need. How much do you disagree or agree with the following statements
about technology? Select one circle in each row.
a. Technology is important to society. [VE639028]
b. Technology is important to my daily life. [VE639043]
c. Learning about technology will help me in the future. [VE639048]
d. Learning about technology will help me do (or get) the job I want. [VE639046]
e. I enjoy learning about technology. [VE639053]
f. I enjoy using technology. [VF009048]

Response options (one per row):
(A) Strongly disagree
(B) Disagree
(C) Neither disagree nor agree
(D) Agree
(E) Strongly agree

Area: Common
Issue: Student Engagement
Booklets: 2, 3, 6, 9
Item Review:
This is a common item across the three TEL areas. It was designed to capture student engagement,
specifically interest in technology. The frequency distribution across response options is skewed for all sub-items. All six sub-items have very low response rates for options A and B, while relative frequencies for categories D and E are very high. This might indicate that the sub-items represent highly socially desirable statements. Compared to item VE639166, which also addresses student interest, this item seems less suited to differentiating across a wide range of levels of interest. The response categories would need to be revised prior to operational use. The missing rate is close to zero (0.31–0.44%) across all questionnaire booklets in the aggregated response frequencies. The maximum difference in response proportions across the four booklets in which the item was administered is 8 percentage points, indicating reasonably stable performance across booklets.
Recommendation:
Drop this item from the 2014 TEL Probe Administration based on item performance as well as time
restrictions/consideration of student burden.
VF009050
21. Engineering refers to using skills or knowledge to solve problems that meet people’s wants and
needs. How much do you disagree or agree with the following statements about engineering?
Select one circle in each row.
a. Engineering is important to society. [VF009051]
b. Engineering is important to my daily life. [VF009052]
c. Learning about engineering will help me in the future. [VF009053]
d. Learning about engineering will help me do (or get) the job I want. [VF009054]
e. I enjoy learning about engineering. [VF009055]
f. I enjoy solving problems. [VF009056]
g. I enjoy fixing things. [VF009061]
h. I enjoy creating, building, or designing things. [VF009064]
i. I enjoy figuring out how things work. [VF009065]
j. I do things that I would describe as engineering. [VF009066]

Response options (one per row):
(A) Strongly disagree
(B) Disagree
(C) Neither disagree nor agree
(D) Agree
(E) Strongly agree

Area: Common
Issue: Student Engagement
Booklets: 2, 3, 6, 9
Item Review:
This is a common item across the three TEL areas. It was designed to capture student engagement,
specifically interest in engineering. The frequency distribution across response options is skewed for all sub-items. All ten sub-items have response rates below 10% for options A and B, while relative frequencies for categories D and E are very high. This might indicate that the sub-items represent highly socially desirable statements. Compared to item VE639166, which also addresses student interest, this item seems less suited to differentiating across a wide range of levels of interest. The response categories would need to be revised prior to operational use. The missing rate is less than 1% (0.51–0.66%) across all questionnaire booklets. The maximum difference in response proportions across the four booklets in which the item was administered is 6 percentage points, indicating reasonably stable performance across booklets.
Recommendation:
Drop this item from the 2014 TEL Probe Administration based on item performance as well as time
restrictions/consideration of student burden.
VE682274
22. Who taught you most of what you know about building things, fixing things, or how things work?
A. I taught myself.
B. Family members
C. Friends
D. Teachers
E. Someone else
Area: Design and Systems
Issue: Availability and Use of Instructional Resources
Booklets: 2, 3, 6, 9
Item Review:
This is an item that provides contextual information for cognitive assessment pertaining to Design and
Systems. The frequency distribution across response options is clearly not satisfactory for this item: options C, D, and E each have response rates below 10%. The missing rate is close to zero (0.23%) across all booklets. The maximum difference in response proportions across the four booklets in which the item was administered is 4 percentage points, indicating reasonably stable performance across booklets.
Overall, this item does not perform well enough to be included in the TEL student questionnaire
without making revisions.
Recommendation:
Drop this item from the 2014 TEL Probe Administration based on item performance.
VE682217
23. Do you think that you would be able to do each of the following? Select one circle in each row.
a. Publish or maintain a personal website or blog [VE682218]
b. Create presentations with sound, pictures, or video [VE682219]
c. Organize information into a chart, graph, or spreadsheet [VE682221]
d. Compare products using the Internet [VE682222]

Response options (one per row):
(A) I definitely can’t
(B) I probably can’t
(C) Maybe
(D) I probably can
(E) I definitely can
Area: Information and Communication Technology
Issue: Student Engagement
Booklets: 2, 3, 6, 9
Item Review:
This is an item that provides contextual information for the cognitive assessment pertaining to
Information and Communication Technology (ICT) literacy. The item collects important information
regarding student self-efficacy in conducting activities that are related to ICT. Meta-analyses have
shown that self-efficacy is one of the strongest predictors of student achievement (e.g., Richardson et al., 2012). This item can be considered an important achievement predictor as well as an informative
outcome with policy relevance by itself.
Even though relative frequencies for response categories A and B are lower for most sub-items, with frequency distributions skewed to the right, there is sufficient variation across response options to differentiate across the range of student self-efficacy perceptions. The missing rate is close to zero (0.37–0.44%) across all questionnaire booklets. The maximum difference in response proportions across the four booklets in which the item was administered is 8 percentage points, indicating reasonably stable performance across booklets.
This item and all sub-items show satisfactory item statistics to be included in the TEL student
questionnaire. In addition to an analysis as stand-alone items, the sub-items should be analyzed as part
of a potential broader student self-efficacy index based on aggregation of these items with items
VE682276 and VE682317, which were designed to measure student self-efficacy in the two other
content areas.
Item content was reviewed in more detail to evaluate whether the number of sub-items might be reduced considering student burden and timing constraints. This review indicated that sub-item d was less strongly linked to the content of the cognitive items than the other sub-items. Further, relative frequencies for sub-item d were less balanced than for the other sub-items. Dropping sub-item d in order to reduce student burden would seem reasonable. Retaining three sub-items for this matrix item would also increase consistency with its Technology and Society counterpart (VE682317), for which only three sub-items were administered in the pilot.
Recommendation:
Keep this item and sub-items a, b, and c in the 2014 TEL Probe Administration.
Drop sub-item d from the 2014 TEL Probe administration based on time restrictions/consideration of
student burden.
VE682215
24. Who taught you most of what you know about using computers or other digital technology for
collecting or sharing information?
A. I taught myself.
B. Family members
C. Friends
D. Teachers
E. Someone else
Area: Information and Communication Technology
Issue: Availability and Use of Instructional Resources
Booklets: 2, 3, 6, 9
Item Review:
This is an item that provides contextual information for cognitive assessment pertaining to Information
and Communication Technology (ICT) literacy. The frequency distribution across response options is not satisfactory for this item: options C and E have response rates below 10%. The missing rate is close to zero (0.38%) across all booklets. The maximum difference in response proportions across the four booklets in which the item was administered is 4 percentage points, indicating reasonably stable performance across booklets.
Overall, this item does not perform well enough to be included in the TEL student questionnaire
without making revisions.
Recommendation:
Drop this item from the 2014 TEL Probe Administration based on item performance.
VE639847
25. Have you ever studied technology or engineering topics in any of the following classes or subjects
in school? Select one or more squares.
A. Mathematics
B. Science
C. Social studies or history
D. I have not studied technology or engineering in any of the classes or subjects listed above.
Area: Common
Issue: Organization of Instruction
Booklets: 3, 4, 7, 8
Item Review:
This is a common item across the three TEL areas. The frequency distribution across response options is
satisfactory for the item. The missing rate is close to zero (0.14%) in the aggregated response frequencies. The maximum difference in response proportions across the four booklets in which the item was administered is 15 percentage points, which might indicate some variation in item performance across booklets.
This item shows satisfactory performance to be included in the TEL student questionnaire.
Item VE639842, however, covers similar content and should be given higher priority as it addresses
technology and engineering classes more explicitly than this item.
Recommendation:
Drop this item from the 2014 TEL Probe Administration based on time restrictions/consideration of
student burden.
VE638956
26. In school, how often have you ever done the following activities? Select one circle in each row.
a. Used tools or materials to fix or build something [VE638957]
b. Used different tools, materials, or machines to see which are best for a given purpose [VE638959]
c. Built or tested a model to see if it solves a problem [VE638963]
d. Figured out why something is not working in order to fix it [VE682247]
e. Taken something apart in order to fix it or see how it works [VE638965]
f. Designed a computer program [VE682248]

Response options (one per row):
(A) Never
(B) Once or twice
(C) Three to five times
(D) More than five times

Area: Design and Systems
Issue: Organization of Instruction
Booklets: 3, 4, 7, 8
Item Review:
This item provides contextual information for the cognitive assessment pertaining to Design and
Systems. It focuses on TEL-related learning opportunities and experiences at school. The same sub-items are administered in its outside-of-school counterpart (VE638983). The frequency distribution across response options is satisfactory for five of the six sub-items, with a reasonable balance of responses across all response categories. Sub-item f has low relative frequencies for categories C and D; the response categories do not seem to work well for this sub-item. The missing rate is close to zero (0.26–0.41%) across all booklets. The maximum difference in response proportions across the four booklets in which the item was administered is 8 percentage points, indicating reasonably stable performance across booklets.
Overall, this item and sub-items a-e show satisfactory item statistics to be included in the TEL student
questionnaire. There is no evidence that any response categories for these items need to be collapsed,
expanded, or deleted. This matrix item covers similar content as matrix item VE682225 and should be given priority over that item because its response options represent more quantifiable, behavior-related categories than those in item VE682225.
Item content was reviewed in more detail to evaluate whether the number of sub-items might be
reduced considering student burden and timing constraints. This review indicated that sub-items a and
f were less strongly linked to the content of the cognitive items compared to the other sub-items.
Dropping sub-items a and f in order to reduce student burden would seem reasonable.
Recommendation:
Keep this item and sub-items b, c, d, and e in the 2014 TEL Probe Administration.
Drop sub-items a and f from the 2014 TEL Probe administration based on item performance (sub-item f)
and time restrictions/consideration of student burden.
VE638983
27. Outside of school, how often have you ever done the following activities? Select one circle in each
row.
a. Used tools or materials to fix or build something [VE638986]
b. Used tools or materials to plan or design something (for example, cake recipe, party) [VE009777]
c. Used different tools, materials, or machines to see which are best for a given purpose [VE638998]
d. Built or tested a model to see if it solves a problem [VE639038]
e. Figured out why something is not working in order to fix it [VE682267]
f. Taken something apart in order to fix it or see how it works [VE639042]
g. Designed a computer program [VE682268]

Response options (one per row):
(A) Never
(B) Once or twice
(C) Three to five times
(D) More than five times

Area: Design and Systems
Issue: Student Engagement
Booklets: 3, 4, 7, 8
Item Review:
This is an item that provides contextual information for the cognitive assessment pertaining to Design
and Systems. It focuses on TEL-related learning opportunities and experiences outside of school. Sub-items a and c–g are also administered in its at-school counterpart (VE638956). The frequency distribution across response options is satisfactory for six of the seven sub-items, with a reasonable balance of responses across all response categories. Sub-item g has low relative frequencies for categories C and D. The missing rate is less than 1% (0.46–0.53%) across all booklets. The maximum difference in response proportions across the four booklets in which the item was administered is 8 percentage points, indicating reasonably stable performance across booklets.
Overall, this item and sub-items a–f show satisfactory item statistics to be included in the TEL student
questionnaire. There is no evidence that any response categories for these items need to be collapsed,
expanded, or deleted. This matrix item covers similar content as matrix item VE682232 and should be given priority over that item because its response options represent more quantifiable, behavior-related categories than those in item VE682232.
Item content was reviewed in more detail to evaluate whether the number of sub-items might be
reduced considering student burden and timing constraints. This review indicated that sub-items a, b,
and g were less strongly linked to the content of the cognitive items compared to the other sub-items.
Moreover, sub-item b was not included in the at-school version of this matrix item, making it less
valuable for direct comparisons of learning opportunities and behaviors at and outside of school.
Dropping sub-items a, b, and g to reduce student burden would seem reasonable.
Recommendation:
Keep this item and sub-items c, d, e, and f in the 2014 TEL Probe Administration.
Drop sub-items a, b, and g from the 2014 TEL Probe administration based on item performance (sub-item g) and time restrictions/consideration of student burden.
VF238958
28. For school work, how often do you use a computer or other digital technology for the following
activities? Select one circle in each row.
a. Participate in online discussion forums, social networking sites, or virtual communities [VE238965]
b. Work with others to solve a problem [VF238968]
c. Get information from experts (people with strong skills or knowledge in a subject) [VF238969]
d. Maintain a website or blog [VF238973]
e. Search for information (for example, browse the Internet or check out websites) [VF238974]
f. Play games or run simulations [VF238975]

Response options (one per row):
(A) Never or almost never
(B) A few times a year
(C) Once or twice a month
(D) Once or twice a week
(E) Every day or almost every day

Area: Information and Communication Technology
Issue: Organization of Instruction
Booklets: 3, 4, 7, 8
Item Review:
This is an item that provides contextual information for cognitive assessment on Information and
Communication Technology literacy. It focuses on TEL-related learning opportunities and experiences
at school. In contrast to item VE639123, this matrix item does not have an outside-of-school counterpart. The frequency distribution across response options is satisfactory for four of the six sub-items, with a reasonable balance of responses across all response categories. For two sub-items the frequency distributions indicate less variation among students: sub-items d and e have response rates below 10% for some response options. However, NCES might want to retain the response categories as is to maintain consistency across sub-items. The missing rate is less than 1% (0.56–0.72%) across all booklets. The maximum difference in response proportions across the four booklets in which the item was administered is 8 percentage points, indicating reasonably stable performance across booklets.
Overall, this item shows satisfactory performance to be included in the TEL student questionnaire.
There is no evidence that any response categories for these items need to be collapsed, expanded, or
deleted. This matrix item covers similar content as matrix item VE639123. Compared to VE639123, one issue with the current item is that no direct comparison between at-school and outside-of-school experiences can be made. Given the timing constraints and the need to reduce student burden, dropping this matrix item, including all sub-items, would seem reasonable.
Recommendation:
Drop this matrix item including all sub-items from the 2014 TEL Probe administration based on time
restrictions/consideration of student burden.
VE639842
29. Have you ever taken or are you currently taking any of the following classes or subjects in school?
Select one or more squares.
A. Industrial technology (for example, auto mechanics, carpentry)
B. Engineering (for example, robotics, bridge building, rocketry)
C. Any class that involves learning to use, program, or build computers
D. Any other technology-related class (for example, electronics, sewing, farming)
E. I have not taken any of the classes listed above.
Area: Common
Issue: Organization of Instruction
Booklets: 4, 5, 9, 10
Item Review:
This is a common item across the three TEL areas. It provides important contextual information on the
organization of instruction, specifically whether students have attended classes targeting technology and engineering. This is a multiple-selection item. Relative selection frequencies are reasonably balanced across all response options. The missing rate is close to zero (0.34%) across all booklets. The maximum difference in response proportions across the four booklets in which the item was administered is 5 percentage points, indicating reasonably stable performance across booklets.
This item shows satisfactory performance to be included in the TEL student questionnaire. Compared directly to item VE639847, which also addresses the organization of instruction, this item should be given higher priority because it addresses technology and engineering classes more explicitly.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
VE639172
30. In school or outside of school, how often do you do the following? Select one circle in each row.
a. Participate in clubs, camps, or competitions about technology or engineering (for example, digital art and editing, design, programming, robotics, science) [VE639177]
b. Go to museums or events to learn about technology or engineering [VE639178]
c. Edit digital photographs or other graphic images [VE639179]
d. Create, build, or design things (for example, robots, clothes, science projects, recipes) [VE639180]
e. Work in a shop or garage with industrial technologies (for example, auto mechanics, machining, metalworking, construction, carpentry) [VE639181]
f. Work with drafting or design tools (for example, computer aided design [CAD], systems analysis) [VE639182]
g. Take online classes to learn more about technology or engineering [VE639183]
h. Watch video or listen to audio to learn more about technology or engineering (video or audio includes online videos, movies, television shows, podcasts, radio programs) [VE677642]

Response options (one per row):
(A) Never or almost never
(B) A few times a year
(C) Once or twice a month
(D) Several times a month
(E) At least once a week

Area: Common
Issue: Student Engagement
Booklets: 4, 5, 9, 10
Item Review:
This is a common item across the three TEL areas. It was developed to provide additional contextual
information on student engagement in technology and engineering related activities. The frequency
distribution across response options is not satisfactory for the item. Five of the eight sub-items have response rates below 10% across several response categories. The missing rate is less than 1% (0.58–0.81%) across all booklets. The maximum difference in response proportions across the four booklets in which the item was administered is 12 percentage points, which might indicate that the item did not perform consistently across booklets.
Overall, this matrix item does not perform well enough to be included in the TEL student questionnaire
without making revisions. Given that student engagement is covered by several other items that show satisfactory performance, dropping this matrix item and all sub-items would be reasonable.
Recommendation:
Drop this item from the 2014 TEL Probe Administration based on item performance.
VE681624
31. In school, how often do you learn about or discuss the following? Select one circle in each row.
a. How to judge reliability of sources (for example, how a website might be biased or inaccurate) [VE681629]
b. How to credit others for their ideas (for example, citing sources, using endnotes and footnotes in reports) [VE681632]

Response options (one per row):
(A) Never
(B) Rarely
(C) Sometimes
(D) Often

Area: Information and Communication Technology
Issue: Organization of Instruction
Booklets: 4, 5, 9, 10
Item Review:
This is a common item across the three TEL areas. The item collects important information regarding
whether students have learned how to judge the reliability of sources and how to credit others for
their ideas. Both sub-items function well. The frequency distribution across response options is
satisfactory for both sub-items with a reasonable balance of responses across all response categories.
There is no evidence that any response options need to be collapsed, expanded, or deleted. The missing rate is close to zero (0.34–0.38%) across all booklets. The maximum difference in response proportions across the four booklets in which the item was administered is 9 percentage points, indicating reasonably stable performance across booklets.
This matrix item and both sub-items show satisfactory performance to be included in the TEL student
questionnaire.
Recommendation:
Keep this item and both sub-items in the 2014 TEL Probe Administration.
VE638999
32. In school, how often have you learned about or discussed the following? Select one circle in each
row.
a. Inventions that change the way people live [VE639002]
b. Choices people make that affect the environment [VE639004]
c. Conditions that influence the use or availability of machines or devices [VE639005]
d. The ways people work together to solve problems in their community or the world [VE682300]

Response options (one per row):
(A) Never
(B) Rarely
(C) Sometimes
(D) Often

Area: Technology and Society
Issue: Organization of Instruction
Booklets: 1–10
Item Review:
This item provides contextual information for the cognitive assessment pertaining to Technology and
Society. Responses to the four sub-items provide important information regarding students’ opportunities in school to learn about Technology and Society. Exactly the same four sub-items are used in its outside-of-school counterpart (VE639008). The frequency distribution across response options is very balanced for two of the four sub-items. Although sub-items b and d have lower relative frequencies (<10%) for selected categories, there is still reasonable variation across response options to make this a useful item in the student questionnaire. In order to maintain consistency in response options across sub-items, NCES may consider leaving the response categories unchanged. The missing rate is very low (1.55–1.75%) across all booklets.
Although item performance (based on frequencies) is slightly worse for this item than for its outside-of-school version, it seems important to retain both items given that no other item in the questionnaire captures students’ learning opportunities regarding Technology and Society at school. Overall, item performance is satisfactory for inclusion in the TEL student questionnaire, and there is no evidence that any response options need to be collapsed, expanded, or deleted. Moreover, the student burden associated with this item is low given that the matrix includes only four sub-items.
Recommendation:
Keep this item and all sub-items in the 2014 TEL Probe administration.
VE639008
33. Outside of school, how often have you learned about or discussed the following? Select one circle
in each row.
a. Inventions that change the way people live [VE639012]
b. Choices people make that affect the environment [VE639013]
c. Conditions that influence the use or availability of machines or devices [VE639014]
d. The ways people work together to solve problems in their community or the world [VE682314]

Response options (one per row):
(A) Never
(B) Rarely
(C) Sometimes
(D) Often

Area: Technology and Society
Issue: Student Engagement
Booklets: 1–10
Item Review:
This item provides contextual information for the cognitive assessment pertaining to Technology and
Society. Responses to the four sub-items provide important information regarding students’ opportunities outside of school to learn about Technology and Society. Exactly the same four sub-items are used in its at-school counterpart (VE638999). This matrix item functions very well. The frequency distributions across response options are satisfactory for all sub-items, with reasonable variation across response options. There is no evidence that any response options need to be collapsed, expanded, or deleted. The missing rate is very low (2.18–2.39%) across all booklets.
Overall, item performance is satisfactory to be included in the TEL student questionnaire. The burden
for the student associated with this item is low given that this matrix includes only four sub-items.
Recommendation:
Keep this item and all sub-items in the 2014 TEL Probe administration.
VE682315
34. Who taught you most of what you know about how technology, people, and the environment are
related to each other?
A. I taught myself.
B. Family members
C. Friends
D. Teachers
E. Someone else
Area: Technology and Society
Issue: Availability and Use of Instructional Resources
Booklets: 1–10
Item Review:
This item provides contextual information for the cognitive assessment pertaining to Technology and
Society. The frequency distribution across response options is not satisfactory for this item:
options C and E have response rates below 10%. The missing rate is very low (2.15%) across all
booklets. The maximum difference in the response proportion across the ten booklets in which the
item has been administered is 8 percentage points, indicating relatively stable performance across
different booklets for this item.
Overall, this item does not perform well enough to be included in the TEL student questionnaire
without substantial revisions.
Recommendation:
Drop this item from the 2014 TEL Probe Administration based on item performance.
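The cross-booklet stability statistic cited throughout these reviews (the maximum difference in response proportions across the booklets in which an item was administered) can be illustrated with a small sketch. The function and data below are hypothetical and for illustration only; they are not the actual NAEP analysis code.

```python
# Illustrative computation of the "maximum difference in response proportion
# across booklets" stability statistic. All data below are hypothetical.

def max_booklet_difference(proportions_by_booklet):
    """proportions_by_booklet: dict mapping booklet number -> {option: proportion}.
    Returns the largest spread, across booklets, of any single option's
    proportion, expressed in percentage points."""
    options = next(iter(proportions_by_booklet.values())).keys()
    max_diff = 0.0
    for option in options:
        rates = [p[option] for p in proportions_by_booklet.values()]
        max_diff = max(max_diff, max(rates) - min(rates))
    return round(100 * max_diff, 1)

# Hypothetical proportions for one item administered in three booklets;
# option A varies most (0.36 to 0.44), an 8-percentage-point spread.
booklets = {
    1: {"A": 0.40, "B": 0.35, "C": 0.25},
    2: {"A": 0.44, "B": 0.33, "C": 0.23},
    3: {"A": 0.36, "B": 0.38, "C": 0.26},
}
print(max_booklet_difference(booklets))  # prints 8.0
```

A small value indicates that the item behaves similarly regardless of which booklet (and thus which surrounding items) a student received.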
VE682317
35. Do you think that you would be able to do each of the following? Select one circle in each row.
a. Describe how inventions
change society
b. Compare how different
activities affect the
environment
c. Explain why people have
different tools, machines,
or devices in different
parts of the world
Area: Technology and Society
Issue: Student Engagement
Booklets: 1–10
Response options: I definitely can’t (A), I probably can’t (B), Maybe (C), I probably can (D), I definitely can (E)
Sub-item accession numbers: VE682321, VE682323, VE682324
Item Review:
This item provides contextual information for the cognitive assessment pertaining to Technology and
Society. It collects important information regarding student self-efficacy in conducting activities
related to Technology and Society. Meta-analyses have shown that self-efficacy is one of the
strongest predictors of student achievement (e.g., Richardson et al., 2012). This item can therefore
be considered both an important achievement predictor and an informative outcome with policy
relevance in its own right.
Even though relative frequencies for response categories A and B are lower for most sub-items, with
frequency distributions skewed to the right, there is sufficient variation across response options to
differentiate across the range of student self-efficacy perceptions. The missing rate is low
(2.74%–3.05%) across all questionnaire booklets. The maximum difference in the response proportion
across the ten booklets in which the item has been administered is 11 percentage points, pointing to
some variation in item performance across booklets.
Overall, this item and all sub-items show satisfactory item statistics to be included in the TEL
student questionnaire. In addition to analysis as stand-alone items, the sub-items might also be
analyzed as part of a potential broader student self-efficacy index based on aggregation with items
VE682276 and VE682217, which were designed to measure student self-efficacy in the two other
content areas.
No further reduction of the number of sub-items seemed feasible given that only three sub-items were
included in the pilot.
Recommendation:
Keep this item and all sub-items in the 2014 TEL Probe Administration.
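As a rough illustration of the aggregation idea mentioned in the review above, a student self-efficacy index could be formed by averaging coded responses across the self-efficacy sub-items. This is a simplified sketch with hypothetical data and coding; an operational NAEP index would more likely be derived through formal scaling rather than raw means.

```python
# Sketch of a simple self-efficacy index aggregating coded responses across
# self-efficacy sub-items. Data and the 1-5 coding are hypothetical; this is
# not the operational NAEP index construction.

def self_efficacy_index(responses):
    """responses: dict mapping sub-item accession number -> response coded
    1 ("I definitely can't") through 5 ("I definitely can").
    Returns the mean of the valid responses, ignoring omissions (None)."""
    valid = [v for v in responses.values() if v is not None]
    return sum(valid) / len(valid) if valid else None

# Hypothetical student record across the three content areas:
student = {
    "VE682321": 4, "VE682323": 5, "VE682324": 3,  # Technology and Society
    "VE682276": 4,                                # other content area (assumed coding)
    "VE682217": None,                             # omitted response
}
print(self_efficacy_index(student))  # mean of 4, 5, 3, 4 = 4.0
```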
VF009358
36. Before today, had you ever taken an interactive computer test similar to the one you just took?
Select one circle in each row.
a. I had taken an interactive computer test in school.
b. I had taken an interactive computer test outside of school.
Response options: Yes (A), No (B)
Sub-item accession numbers: VF009360, VF009361
Area: Common Debrief
Issue: Student Engagement
Booklets: 1–10
Item Review:
This is a common debrief item asking whether students had taken an interactive computer test prior to
taking the TEL assessment. The item collects important information regarding students’ prior
knowledge of and exposure to similar assessments.
The relative frequencies for sub-item a are reasonably balanced, indicating that there is considerable
variation in schools’ use of interactive computer tests. For sub-item b, more than 80% of all students
indicate that they have no experience with interactive computer tests outside of school. Although
variation on sub-item b is limited, including it seems important for the comparison of at-school and
out-of-school experiences as well as for future trend analyses. The missing rate is low (3.33%–3.79%)
across all booklets. The maximum difference in the response proportion across the ten booklets in
which the item has been administered is 12 percentage points, pointing to some variation in item
performance across booklets.
Overall, this item shows satisfactory performance to be included in the TEL student questionnaire.
Recommendation:
Keep this item and both sub-items in the 2014 TEL Probe Administration.
VE401773
37. How hard was this test compared to most other tests you have taken this year in school?
A. Easier than other tests
B. About as hard as other tests
C. Harder than other tests
D. Much harder than other tests
Area: Common Debrief
Issue: Student Engagement
Booklets: 1–10
Item Review:
This is a common debrief item that is typically asked at the end of the student questionnaire for
every subject. Although response options “C” and “D” have low response rates, the frequency
distribution across all response options can be considered reasonable and satisfactory for this type
of questionnaire item. The missing rate is low (3.71%) across all questionnaire booklets. The maximum
difference in the relative frequencies across the ten booklets in which the item has been administered
is 8 percentage points, indicating reasonably stable performance across different booklets for this item.
Overall, this item shows satisfactory performance to be included in the TEL student questionnaire.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
VE401776
38. How hard did you try on this test compared to how hard you tried on most other tests you have
taken this year in school?
A. Not as hard as on other tests
B. About as hard as on other tests
C. Harder than on other tests
D. Much harder than on other tests
Area: Common Debrief
Issue: Student Engagement
Booklets: 1–10
Item Review:
This is a common debrief item that is typically asked at the end of all NAEP student questionnaires.
Although response option “D” has a less than 10% response rate, the frequency distribution across all
response options can be considered reasonable and satisfactory for this type of questionnaire item.
The missing rate is low (4.33%) across all booklets. The maximum difference in the response
proportion across the ten booklets in which the item has been administered is 6 percentage points,
indicating reasonably stable performance across different booklets for this item.
Overall, this item shows satisfactory performance to be included in the TEL student questionnaire.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
VE401779
39. How important was it to you to do well on this test?
A. Not very important
B. Somewhat important
C. Important
D. Very important
Area: Common Debrief
Issue: Student Engagement
Booklets: 1–10
Item Review:
This is a common debrief item that is typically asked at the end of all NAEP student questionnaires.
Although response option “A” has a less than 10% response rate, the frequency distribution across all
response options can be considered reasonable and satisfactory for this type of questionnaire item. The
missing rate is low (4.78%) across all booklets. The maximum difference in the response proportion
across the ten booklets in which the item has been administered is 5 percentage points, indicating
reasonably stable performance across different booklets for this item.
Overall, this item shows satisfactory performance to be included in the TEL student questionnaire.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
References
Holland, J. L. (1997). Making vocational choices. Odessa, FL: Psychological Assessment
Resources.
Lubinski, D. (2000). Scientific and social significance of assessing individual differences: Sinking
shafts at a few critical points. Annual Review of Psychology, 51, 405–444.
Richardson, M., Abraham, C., & Bond, R. (2012). Psychological correlates of university students’
academic performance: A systematic review and meta-analysis. Psychological Bulletin, 138, 353–387.
WestEd (2010). Technology and Engineering Literacy Framework for the 2014 National
Assessment of Educational Progress. Washington, DC: WestEd.
Technology and Engineering Literacy Reports
2. School report
NAEP Item Development (ID)
Technology and Engineering Literacy (TEL)
2014 Probe Survey Questionnaire
Recommendations to NCES
Grade 8 School Questionnaire
Deliverable in response to ID Task 3.1.2
Submitted: March 14, 2013
Table of Contents
2014 Technology and Engineering Literacy (TEL)
Grade 8 School Questionnaire:
Post-Pilot Analysis and Recommendations
Background ............................................................................................................................................. 2
Criteria for Item Review .......................................................................................................................... 2
Recommendations for the 2014 TEL Probe Assessment – Overview ..................................................... 4
Item Review and Recommendations for the 2014 TEL School Grade 8 Probe Assessment................... 6
References ............................................................................................................................................ 28
2014 Technology and Engineering Literacy (TEL)
Grade 8 School Questionnaires:
Post-Pilot Analysis and Recommendations
This document provides a post-pilot review of the grade 8 school Technology and Engineering
Literacy (TEL) questionnaire using data collected in the 2013 pilot administration. The goal of
this report is to evaluate the performance of the items and determine whether revisions or
deletions are necessary for the 2014 TEL probe assessment.
Background
The NAEP TEL assessment measures three core areas of interest: Technology and Society
(T&S), Design and Systems (D&S), and Information and Communication Technologies (ICT).
Technology and Society addresses the effects that technology has on society and on the
natural world. Design and Systems covers the nature of technology, the engineering design
process, and basic principles of dealing with everyday technologies. Information and
Communication Technology includes computers and software learning tools, networking
systems and protocols, handheld digital devices, and other technologies for accessing,
creating, and communicating information and for facilitating creative expression (WestEd,
2010). The TEL Issues Paper identified four broad issues that informed and guided the survey
questionnaire development administered in the pilot assessment. These are: availability of
school resources; organization of technology and engineering instruction; teacher
preparation; and student engagement. Each of these issues comprises several sub-issues. Two
of these four issues were covered in both the student and school questionnaires. Teacher
preparation was covered in the school questionnaire only and student engagement was
covered in the student questionnaire only.
Criteria for Item Review
Analysis of Frequency Distributions
Following the same procedure as for other background questionnaire item reviews, a set of
frequency-based flagging criteria was considered in evaluating whether items were
applicable to the targeted population. It is important to keep in mind that flags are indications
that a particular item should be thoroughly evaluated. Flags are not absolute criteria for
making decisions regarding the use or quality of items. Instead, the flagging criteria should be
viewed collectively, along with other criteria and professional judgment, in recommending
keeping, revising, or dropping items from the 2014 TEL Probe survey questionnaires.
For this analysis, we examined item response frequencies for response options. We also
assessed item non-response patterns to determine whether problematic items or response
options warrant revising items, expanding or collapsing response categories, or dropping an
item. The flagging criteria on response patterns and item non-response (i.e., missing response)
rates for reviewing the data are:
- A high percentage of item non-response (relative to other adjacent items) may indicate
that the item content was problematic (e.g., ambiguous, burdensome, overly complex,
offensive) or that the format caused respondents to overlook the item. Note that this
criterion does not apply to multiple-selection multiple-choice items, because the missing
rate for “select all that apply” items contains both missing responses and “does not apply”
responses.
- Low single-category response rates (e.g., <10%) may indicate that a category does not
apply to this population and possibly that different categories would be more informative.
- High single-category response rates (e.g., >80%) may indicate that almost all
respondents in the population fall into one category and that only a limited range of
demographic or behavioral indicators is being collected.
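As a rough illustration, the frequency-based flagging rules above might be sketched as follows. The thresholds, the comparison against adjacent items, and the data are illustrative assumptions, not the actual NAEP processing code.

```python
# Illustrative sketch of the frequency-based flagging rules described above.
# Thresholds and data are hypothetical; this is not the actual NAEP pipeline.

def flag_item(option_rates, missing_rate, typical_missing_rate,
              low=0.10, high=0.80, missing_factor=2.0):
    """Return a list of review flags for one questionnaire item.

    option_rates: dict mapping response option -> proportion of respondents
    missing_rate: this item's non-response proportion
    typical_missing_rate: non-response rate of adjacent items, for comparison
    """
    flags = []
    # High non-response relative to adjacent items may signal a problematic item.
    if missing_rate > missing_factor * typical_missing_rate:
        flags.append("high non-response")
    for option, rate in option_rates.items():
        # A rarely chosen category may not apply to this population.
        if rate < low:
            flags.append(f"low response for option {option}")
        # A dominant category limits the information the item collects.
        if rate > high:
            flags.append(f"high response for option {option}")
    return flags

# Hypothetical item: option D is selected by fewer than 10% of respondents.
print(flag_item({"A": 0.35, "B": 0.30, "C": 0.28, "D": 0.07},
                missing_rate=0.02, typical_missing_rate=0.02))
# prints ['low response for option D']
```

As the report stresses, such flags are prompts for closer review, not automatic decisions to revise or drop an item.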
Implications of flags on any of these criteria for a given item will be discussed in more detail.
Note again that flagging an item on one of these criteria does not, by itself, warrant the
necessity for revising or dropping the item. Whether an item needs to be modified or revised
also depends on whether response categories are unique to an item versus representing one
of the standard formats used across many questionnaire items, and whether certain response
categories are needed from a theoretical perspective. Maintaining a high level of consistency
across items is one important consideration for the validity of the questionnaires as well as
the trend information being collected.
Response Time/Burden
The length of time it takes a respondent to complete a questionnaire is important. If a
questionnaire is too long or “burdensome,” the respondent will not exert their best effort,
especially toward the end of the questionnaire. It does not appear that school administrators were
overly burdened by the length of this questionnaire: the missing rate for the TEL School
questionnaire ranged from 1% to 2%.
Recommendations for the 2014 TEL Probe Assessment – Overview
In the following we will summarize our item review in detail for each item, along with the
recommendation for the 2014 TEL Probe assessment.
Table 1 on the following page presents an overview of all recommendations. For each item,
the accession number and the area and issue that are addressed are given in the table.
Table 1 - Recommendations for 2014 TEL School Probe - Overview

Item #  AccNum     Area                                       Issue                                            Recommendation
1       VE638378   Common                                     Organization of Instruction                      Keep
2       VE638432   Common                                     Organization of Instruction                      Keep
3       VE638446   Common                                     Organization of Instruction                      Keep
4       VE638450   Common                                     Organization of Instruction                      Keep
5       VE638334   Common                                     Organization of Instruction                      Keep
6       VE681573   Common                                     Organization of Instruction                      Keep
7       VE638483   Common                                     Organization of Instruction                      Keep
8       VE638475   Common                                     Availability and Use of Instructional Resources  Keep
9       VE675587   Common                                     Availability and Use of Instructional Resources  Keep
10      VE638517   Common                                     Availability and Use of Instructional Resources  Keep
11      VE638436   Common                                     Availability and Use of Instructional Resources  Keep
12      VE675659   Common                                     Availability and Use of Instructional Resources  Keep
13      VE638523   Common                                     Availability and Use of Instructional Resources  Keep
14      VE638496   Common                                     Teacher Preparation                              Keep
15      VE638333   Technology and Society                     Organization of Instruction                      Keep
16      VE638350   Technology and Society                     Organization of Instruction                      Keep
17      VE638372   Design and Systems                         Organization of Instruction                      Keep
18      VE638380   Design and Systems                         Organization of Instruction                      Keep
19      VE638391   Information and Communication Technology   Organization of Instruction                      Keep
20      VE638410   Information and Communication Technology   Organization of Instruction                      Keep
Item Review and Recommendations for the 2014 TEL School Grade 8 Probe
Assessment
Note: The “missing rate” for most items ranges from 1% to 2% for this questionnaire. ETS finds this
missing rate acceptable and will not address this criterion item-by-item below except where indicated.
Area: Common
Issue: Organization of Instruction
Item Review:
This is a common item across the three TEL target assessment areas and provides information about
students’ previous instruction in six areas of technology. Response options are skewed toward the
middle of the scale with options “C” and “D” exhibiting low response rates for sub-items “a–d” (7%–
9%).
Although relative frequencies for response options “C” and “D” are less than 10% for these four
sub-items, there is sufficient variation across response options to distinguish different levels of
instruction. Moreover, keeping response options the same for all sub-items is important for
comparisons of and reporting on the results for these items.
Overall, the sub-items show satisfactory item statistics to be included in the TEL school questionnaire.
Recommendation:
Keep the item and sub-items in the 2014 TEL Probe Administration.
Area: Common
Issue: Organization of Instruction
Item Review:
This is a common item across the three TEL target assessment areas and provides information about
how instruction is administered in the three TEL target assessment areas. Response option “C – After
school” had low response rates for all three sub-items (5%, 9%, and 7%, respectively); however, this
response option may provide interesting trend findings going forward. ETS does not recommend
removing this response option.
The frequency distribution across the other response options shows a reasonable and expected
balance of responses across response categories. Overall, the sub-items show satisfactory item
statistics to be included in the TEL school questionnaire.
Recommendation:
Keep this item and all sub-items in the 2014 TEL Probe Administration.
Area: Common
Issue: Organization of Instruction
Item Review:
This is a common item across the three TEL target assessment areas and provides information about
whether the school requires any technology or engineering instruction of its students.
The frequency distribution across both response options is satisfactory, with a reasonable balance of
responses.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
Area: Common
Issue: Organization of Instruction
Item Review:
This is a common item across the three TEL target assessment areas and provides information about
the school’s instruction in eight areas of technology and engineering. Response option “A” had low
response rates for sub-items “a,” “b,” “e,” and “f” (7%–9%). In addition, sub-item “i” had a missing
rate of 73%.
Because four other items use the identical set of response options, response option “A” should remain.
Overall, the sub-items show satisfactory item statistics to be included in the TEL school questionnaire.
For sub-item “i,” the write-in responses are shown below.
21 THINGS FOR STUDENTS
ADMINISTRATIVE DIRECTIVE
GRAPHIC DESIGN ILLUSTRATOR, GOOGLE DOCS, PREP, ETC. ADOBE PHOTO
HABIT
I think society is the driving force of technology use in school.
IB Middle Years Program standards
IB MVP TECHNOLOGY COURSE REQUIREMENTS
IB Program
International Baccalaureate Middle Years Programme Technology
Keyboarding Instruction
Parental technical input
Professional growth
Project Lead the Way
Project Lead the Way (Gateway to Technology Middle School Program)
School developed programming
Student shared information
Teacher Created
TEXT INFORMATIONAL
Workbook
Recommendation:
Keep this item and sub-items in the 2014 TEL Probe Administration.
Area: Common
Issue: Organization of Instruction
Item Review:
This is a common item across the three TEL target assessment areas and provides information about
whether the school provides any courses or afterschool programs that cover technology or engineering
concepts.
The frequency distribution across both response options is satisfactory with reasonable balance of
responses and should be included in the 2014 school questionnaire.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
Area: Common
Issue: Organization of Instruction
Item Review:
The purpose of this constructed-response item is to identify the most relevant courses that cover
technology and engineering concepts. The non-response rate is 28% for sub-item “a” and rises to 87%
for sub-item “e.” Among the 467 respondents who were directed to the item, 103 did not provide a
write-in answer for any of the five options; therefore, the missing rate for this item is 28%.
In the write-in responses, administrators provided a variety of courses that cover technology or
engineering concepts. The item collects important information regarding TEL-related instruction.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
Area: Common
Issue: Organization of Instruction
Item Review:
This is a common item across the three TEL target assessment areas and provides information about
how often grade 8 students are assessed on what they know about technology or engineering.
The frequency distribution across the response options is satisfactory, with a reasonable balance of
responses across all response categories, and there is no evidence that any response categories need
to be collapsed, expanded, or deleted.
Overall, the item and sub-items show satisfactory item statistics to be included in the TEL school
questionnaire.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
Area: Common
Issue: Availability and Use of Instructional Resources
Item Review:
This is a common item across the three TEL target assessment areas and provides information about
the number of grade 8 students in the school, the number of computers available for educational
purposes, the number of computers connected to the Internet, and the number of computers for
students to take home.
Although the item has low response rates for some response options on all sub-items, the options
provided clearly cover the full range of possible responses. The item collects information about
school resources regarding computer availability and could be used to measure growth in the future.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
Area: Common
Issue: Availability and Use of Instructional Resources
Item Review:
This is a common item across the three TEL target assessment areas and provides information about
schools providing computers for students to take home.
While only 3% of administrators endorsed sub-item “a,” this is an important sub-item for
understanding trend. The frequency distribution across the other response options is satisfactory,
with a reasonable balance of responses.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
Area: Common
Issue: Availability and Use of Instructional Resources
Item Review:
This is a common item across the three TEL target assessment areas and provides information about
students taking advantage of school-sponsored resources.
The frequency distribution across sub-items is skewed to the left, with low response rates for
sub-items “c,” “d,” and “e” (1%–9%). Most school administrators indicated that the school does not
provide these resources to students (“A”; 67%–76%) or provides them at very low levels (“B”;
11%–20%). Sub-items “a” and “b” show satisfactory frequency distributions.
While there is some evidence supporting collapsing response options “A” and “B,” these are important
measurements for the future study of trend. This item and all sub-items should be included in the TEL
school questionnaire.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
Area: Common
Issue: Availability and Use of Instructional Resources
Item Review:
This is a common item across the three TEL target assessment areas and provides information about
resources available to teachers.
The frequency distribution across response options is satisfactory for all sub-items, with a
reasonable balance of responses across all response options. Moreover, keeping response options the
same for all sub-items is important for comparisons of and reporting on the results for these items.
Overall, the sub-items show satisfactory item statistics to be included in the TEL school questionnaire.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
Area: Common
Issue: Availability and Use of Instructional Resources
Item Review:
This is a common item across the three TEL target assessment areas and provides information about
equipment available for instruction. Response option “A” had low response rates for sub-items “a–f”
(1%–6%); however, this response option may provide interesting findings going forward. ETS does not
recommend removing this response option.
The frequency distribution across the other response options is satisfactory for all sub-items with a
reasonable and expected balance of responses across response categories. Moreover, keeping
response options the same for all sub-items is important for comparisons of and reporting on the
results for these items.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
Area: Common
Issue: Availability and Use of Instructional Resources
Item Review:
This is a common item across the three TEL target assessment areas and provides information about
whether the school’s capability to provide instruction in technology or engineering concepts is
hindered.
Sub-item “d” had low response rates for options “C” (9%) and “D” (4%), and sub-item “f” for option
“D” (5%); however, there is sufficient variation across response options to distinguish different
levels of problems in the school. Moreover, keeping response options the same for all sub-items, and
consistent with the response options used for other versions of this matrix item, is important for
comparisons of and reporting on the results for these items.
Overall, the sub-items show satisfactory item statistics to be included in the TEL school questionnaire.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
Area: Common
Issue: Teacher Preparation
Item Review:
This is a common item across the three TEL target assessment areas and provides information about
teachers’ professional development.
All sub-items exhibited low relative frequencies for one or more response options (sub-item “a” for
options D–F, “b” for B and E, and “c” for B); however, there is sufficient variation across response
options to distinguish different levels of professional development. Moreover, keeping response
options the same for all sub-items is important for comparisons of and reporting on the results for
these items.
Overall, the sub-items show satisfactory item statistics to be included in the TEL school questionnaire.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
Area: Technology and Society
Issue: Organization of Instruction
Item Review:
This item provides contextual information for the cognitive assessment pertaining to Technology and
Society (T&S) and addresses the emphasis placed on teaching students various topics. Response option
“A” had a low response rate for all sub-items (1%–5%).
Although relative frequencies for response option “A” are less than 10% for the sub-items, there is
sufficient variation across response options to distinguish different levels of teaching emphasis.
Moreover, keeping response options the same for all sub-items, and consistent with the response
options used for other versions of this matrix item, is important for comparisons of and reporting on
the results for these items.
Overall, the sub-items show satisfactory item statistics to be included in the TEL school questionnaire.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
Area: Technology and Society
Issue: Organization of Instruction
Item Review:
This item provides contextual information for the cognitive assessment pertaining to Technology and
Society (T&S) and addresses various student activities prior to grade 8.
Response option “A” had a low response rate for all sub-items (2%–3%). Although relative frequencies
for response option “A” are less than 10% for the sub-items, there is sufficient variation across
response options to distinguish different levels of student activity. Moreover, keeping response
options the same for all sub-items, and consistent with the response options used for other versions
of this matrix item, is important for comparisons of and reporting on the results for these items.
Overall, the sub-items show satisfactory item statistics to be included in the TEL school questionnaire.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
Area: Design and Systems
Issue: Organization of Instruction
Item Review:
This item provides contextual information for the cognitive assessment pertaining to Design and
Systems (D&S) and addresses teaching emphasis for various topics.
Two response options had low response rates: “A” for sub-items “a” and “c” (4%–5%) and “D” for
sub-items “b,” “d,” and “e” (8%–9%).
Although relative frequencies for response options “A” and “D” are less than 10% for some sub-items,
there is sufficient variation across response options to distinguish different levels of teaching
emphasis. Moreover, keeping response options the same for all sub-items, and consistent with the
response options used for other versions of this matrix item, is important for comparisons of and
reporting on the results for these items.
Overall, the sub-items show satisfactory item statistics to be included in the TEL school questionnaire.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
Area: Design and Systems
Issue: Organization of Instruction
Item Review:
This item provides contextual information for the cognitive assessment pertaining to Design and
Systems (D&S) and addresses the extent to which students participate in various activities. Response
option “D” had a low response rate for six of seven sub-items (1%–8%).
Although relative frequencies for response option “D” are less than 10% for many sub-items, there is
sufficient variation across response options to distinguish different levels of student
participation. Moreover, keeping response options the same for all sub-items, and consistent with the
response options used for other versions of this matrix item, is important for comparisons of and
reporting on the results for these items. Overall, the sub-items show satisfactory item statistics to
be included in the TEL school questionnaire.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
Area: Information and Communication Technology
Issue: Organization of Instruction
Item Review:
This item provides contextual information for the cognitive assessment pertaining to Information and
Communication Technology (ICT) and addresses teaching emphasis for various topics.
Two response options had low response rates: “A” for sub-items “a–e” (1%–7%) and “B” for sub-items
“c” and “e” (7% and 9%).
Although relative frequencies for response options “A” and “B” are less than 10% for some sub-items,
there is sufficient variation across response options to distinguish different levels of teaching
emphasis. Moreover, keeping response options the same for all sub-items, and consistent with the
response options used for other versions of this matrix item, is important for comparisons of and
reporting on the results for these items.
Overall, the sub-items show satisfactory item statistics to be included in the TEL school questionnaire.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
Area: Information and Communication Technology
Issue: Organization of Instruction
Item Review:
This item provides contextual information for the cognitive assessment pertaining to Information and
Communication Technology (ICT) and asks about the extent to which students conduct various activities.
Response option “A” had a low response rate for sub-items “b” and “d” (5% each).
Although the relative frequencies for response option “A” are less than 10% for two sub-items, there is
sufficient variation across response options to distinguish different levels of student participation.
Moreover, keeping the response options the same for all sub-items, and consistent with those used in
other versions of this matrix item, is important for comparing and reporting the results for these items.
Overall, the sub-items show item statistics satisfactory for inclusion in the TEL school questionnaire.
Recommendation:
Keep this item in the 2014 TEL Probe Administration.
References
WestEd (2010). Technology and Engineering Literacy Framework for the 2014 National
Assessment of Educational Progress. Washington, DC: WestEd.
Technology and Engineering Literacy Reports
3. Final Adjudication Decisions
TEL 2014 Student and School Questionnaires: Final Decisions
The tables below summarize the final adjudication decisions for the TEL student questionnaire
(Table 1) and the TEL school questionnaire (Table 2) that were made after the 2013 pilot. After
the initial recommendations reports for the two questionnaires were developed, an additional
review and evaluation, conducted by NCES and the Governing Board, informed these final
decisions based on policy relevance and content coverage of the questionnaire items.
In the table headers, “Type” refers to whether the item described in the row is a discrete item, an
item stem of a matrix item, or a sub-item of a matrix item. The “Pilot AccNum” header refers to
the item accession number used in the 2013 pilot administration. The “2014 Sequence” header
refers to the item sequence that will be used in the 2014 administration. The “2014 AccNum”
header refers to the item accession number that will be used in the 2014 administration. The
“Area” header indicates which TEL area the item measures. In the “Area” column, the text
“Common” indicates that the item measures a topic that is common to all of the three specific
areas of the TEL assessment (i.e., Design and Systems, Technology and Society, and Information
and Communication Technology). The “Issue” header indicates which specific issue the item
addresses. In the “2014 Sequence” and “2014 AccNum” columns, the text “Dropped” indicates an item
that was used in the 2013 pilot administration but, per the adjudication decision, will not be
administered in 2014. The rows highlighted in dark grey are discrete items or item stems of matrix
items; the rows highlighted in light grey are sub-items of matrix items.
Table 1: Final adjudication decisions for the 2014 TEL student questionnaire
(Abbreviations: D&S = Design and Systems; T&S = Technology and Society; ICT = Information and Communication Technology)

Type | Pilot AccNum | 2014 Sequence | 2014 AccNum | Area | Issue
---- | ------------ | ------------- | ----------- | ---- | -----
Discrete | VE639842 | 1 | VE639842 | Common | Organization of Instruction
Discrete | VE639847 | 2 | VE639847 | Common | Organization of Instruction
Item Stem | VE681624 | 3 | VE681624 | ICT | Organization of Instruction
Sub-item | VE681629 | 3a | VE681629 | ICT | Organization of Instruction
Sub-item | VE681632 | 3b | VE681632 | ICT | Organization of Instruction
Item Stem | VE639123 | 4 | VE639123 | ICT | Organization of Instruction
Sub-item | VE639125 | Dropped | Dropped | ICT | Organization of Instruction
Sub-item | VE639127 | Dropped | Dropped | ICT | Organization of Instruction
Sub-item | VE639130 | 4a | VE639130 | ICT | Organization of Instruction
Sub-item | VE639131 | Dropped | Dropped | ICT | Organization of Instruction
Sub-item | VE639137 | 4b | VE639137 | ICT | Organization of Instruction
Sub-item | VE639136 | 4c | VE639136 | ICT | Organization of Instruction
Item Stem | VF025108 | 5 | VF025108 | ICT | Student Engagement
Sub-item | VF025109 | Dropped | Dropped | ICT | Student Engagement
Sub-item | VF025110 | Dropped | Dropped | ICT | Student Engagement
Sub-item | VF025112 | 5a | VF025112 | ICT | Student Engagement
Sub-item | VF025113 | Dropped | Dropped | ICT | Student Engagement
Sub-item | VF025117 | 5b | VF025117 | ICT | Student Engagement
Sub-item | VF025116 | 5c | VF025116 | ICT | Student Engagement
Item Stem | VE682225 | 6 | VE682225 | D&S | Organization of Instruction
Sub-item | VE682226 | Dropped | Dropped | D&S | Organization of Instruction
Sub-item | VE682227 | Dropped | Dropped | D&S | Organization of Instruction
Sub-item | VE682228 | 6a | VE682228 | D&S | Organization of Instruction
Sub-item | VE682229 | 6b | VE682229 | D&S | Organization of Instruction
Sub-item | VE682230 | Dropped | Dropped | D&S | Organization of Instruction
Sub-item | VE682231 | Dropped | Dropped | D&S | Organization of Instruction
Item Stem | VE638956 | 7 | VE638956 | D&S | Organization of Instruction
Sub-item | VE638957 | Dropped | Dropped | D&S | Organization of Instruction
Sub-item | VE638959 | 7a | VE638959 | D&S | Organization of Instruction
Sub-item | VE638963 | 7b | VE638963 | D&S | Organization of Instruction
Sub-item | VE682247 | 7c | VE682247 | D&S | Organization of Instruction
Sub-item | VE638965 | 7d | VE638965 | D&S | Organization of Instruction
Sub-item | VE682248 | Dropped | Dropped | D&S | Organization of Instruction
Item Stem | VE638983 | 8 | VE638983 | D&S | Student Engagement
Sub-item | VE638986 | Dropped | Dropped | D&S | Student Engagement
Sub-item | VF009777 | 8a | VF009777 | D&S | Student Engagement
Sub-item | VE638998 | 8b | VE638998 | D&S | Student Engagement
Sub-item | VE639038 | 8c | VE639038 | D&S | Student Engagement
Sub-item | VE682267 | 8d | VE682267 | D&S | Student Engagement
Sub-item | VE639042 | 8e | VE639042 | D&S | Student Engagement
Sub-item | VE682268 | Dropped | Dropped | D&S | Student Engagement
Item Stem | VE682217 | 9 | VH008232 | D&S, T&S, & ICT | Student Engagement
Sub-item | VE682218 | Dropped | Dropped | ICT | Student Engagement
Sub-item | VE682219 | 9a | VH008238 | ICT | Student Engagement
Sub-item | VE682221 | 9b | VH008240 | ICT | Student Engagement
Sub-item | VE682222 | 9c | VH008241 | ICT | Student Engagement
Item Stem | VE682276 | Dropped | Dropped | D&S | Student Engagement
Sub-item | VE682278 | Dropped | Dropped | D&S | Student Engagement
Sub-item | VE682280 | Dropped | Dropped | D&S | Student Engagement
Sub-item | VE682281 | 9d | VH008243 | D&S | Student Engagement
Sub-item | VE682284 | 9e | VH008244 | D&S | Student Engagement
Sub-item | VE682286 | Dropped | Dropped | D&S | Student Engagement
Item Stem | VE682317 | Dropped | Dropped | T&S | Student Engagement
Sub-item | VE682321 | 9f | VH008245 | T&S | Student Engagement
Sub-item | VE682323 | 9g | VH008247 | T&S | Student Engagement
Sub-item | VE682324 | 9h | VH008248 | T&S | Student Engagement
Item Stem | VE638999 | 10 | VE638999 | T&S | Organization of Instruction
Sub-item | VE639002 | 10a | VE639002 | T&S | Organization of Instruction
Sub-item | VE639004 | 10b | VE639004 | T&S | Organization of Instruction
Sub-item | VE639005 | 10c | VE639005 | T&S | Organization of Instruction
Sub-item | VE682300 | 10d | VE682300 | T&S | Organization of Instruction
Item Stem | VE639008 | 11 | VE639008 | T&S | Student Engagement
Sub-item | VE639012 | 11a | VE639012 | T&S | Student Engagement
Sub-item | VE639013 | 11b | VE639013 | T&S | Student Engagement
Sub-item | VE639014 | 11c | VE639014 | T&S | Student Engagement
Sub-item | VE682314 | 11d | VE682314 | T&S | Student Engagement
Discrete | VE682274 | 12 | VE682274 | D&S | Availability and Use of Instructional Resources
Discrete | VE682215 | 13 | VE682215 | ICT | Availability and Use of Instructional Resources
Discrete | VE682315 | 14 | VE682315 | T&S | Availability and Use of Instructional Resources
Item Stem | VF009358 | 15 | VF009358 | Common | Student Engagement
Sub-item | VF009360 | 15a | VF009360 | Common | Student Engagement
Sub-item | VF009361 | 15b | VF009361 | Common | Student Engagement
Discrete | VE401773 | 16 | VE401773 | Common | Student Engagement
Discrete | VE401776 | 17 | VE401776 | Common | Student Engagement
Discrete | VE401779 | 18 | VE401779 | Common | Student Engagement
Item Stem | VE639166 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639168 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639169 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639171 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639173 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639174 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639175 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639176 | Dropped | Dropped | Common | Student Engagement
Sub-item | VF009755 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639170 | Dropped | Dropped | Common | Student Engagement
Item Stem | VE639025 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639028 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639043 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639048 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639046 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639053 | Dropped | Dropped | Common | Student Engagement
Sub-item | VF009048 | Dropped | Dropped | Common | Student Engagement
Item Stem | VF009050 | Dropped | Dropped | Common | Student Engagement
Sub-item | VF009051 | Dropped | Dropped | Common | Student Engagement
Sub-item | VF009052 | Dropped | Dropped | Common | Student Engagement
Sub-item | VF009053 | Dropped | Dropped | Common | Student Engagement
Sub-item | VF009054 | Dropped | Dropped | Common | Student Engagement
Sub-item | VF009055 | Dropped | Dropped | Common | Student Engagement
Sub-item | VF009056 | Dropped | Dropped | Common | Student Engagement
Sub-item | VF009061 | Dropped | Dropped | Common | Student Engagement
Sub-item | VF009064 | Dropped | Dropped | Common | Student Engagement
Sub-item | VF009065 | Dropped | Dropped | Common | Student Engagement
Sub-item | VF009066 | Dropped | Dropped | Common | Student Engagement
Item Stem | VF238958 | Dropped | Dropped | ICT | Organization of Instruction
Sub-item | VE238965 | Dropped | Dropped | ICT | Organization of Instruction
Sub-item | VF238968 | Dropped | Dropped | ICT | Organization of Instruction
Sub-item | VF238969 | Dropped | Dropped | ICT | Organization of Instruction
Sub-item | VF238973 | Dropped | Dropped | ICT | Organization of Instruction
Sub-item | VF238974 | Dropped | Dropped | ICT | Organization of Instruction
Sub-item | VF238975 | Dropped | Dropped | ICT | Organization of Instruction
Item Stem | VE639172 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639177 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639178 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639179 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639180 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639181 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639182 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE639183 | Dropped | Dropped | Common | Student Engagement
Sub-item | VE677642 | Dropped | Dropped | Common | Student Engagement
Item Stem | VE682232 | Dropped | Dropped | D&S | Student Engagement
Sub-item | VE682233 | Dropped | Dropped | D&S | Student Engagement
Sub-item | VE682234 | Dropped | Dropped | D&S | Student Engagement
Sub-item | VE682238 | Dropped | Dropped | D&S | Student Engagement
Sub-item | VE682236 | Dropped | Dropped | D&S | Student Engagement
Sub-item | VE682237 | Dropped | Dropped | D&S | Student Engagement
Sub-item | VE682235 | Dropped | Dropped | D&S | Student Engagement
Item Stem | VF238997 | Dropped | Dropped | ICT | Organization of Instruction
Sub-item | VF239009 | Dropped | Dropped | ICT | Organization of Instruction
Sub-item | VF239000 | Dropped | Dropped | ICT | Organization of Instruction
Sub-item | VF239004 | Dropped | Dropped | ICT | Organization of Instruction
Sub-item | VF239007 | Dropped | Dropped | ICT | Organization of Instruction
Sub-item | VF239008 | Dropped | Dropped | ICT | Organization of Instruction
Sub-item | VF239010 | Dropped | Dropped | ICT | Organization of Instruction
Sub-item | VF239003 | Dropped | Dropped | ICT | Organization of Instruction
Table 2: Final adjudication decisions for the 2014 TEL school questionnaire
(Abbreviations: D&S = Design and Systems; T&S = Technology and Society; ICT = Information and Communication Technology)

Type | Pilot AccNum | 2014 Sequence | 2014 AccNum | Area | Issue
---- | ------------ | ------------- | ----------- | ---- | -----
Item Stem | VE638378 | 1 | VE638378 | Common | Organization of Instruction
Sub-item | VE638386 | 1a | VE638386 | Common | Organization of Instruction
Sub-item | VE638388 | 1b | VE638388 | Common | Organization of Instruction
Sub-item | VE638389 | 1c | VE638389 | Common | Organization of Instruction
Sub-item | VE638390 | 1d | VE638390 | Common | Organization of Instruction
Sub-item | VE638392 | 1e | VE638392 | Common | Organization of Instruction
Sub-item | VE638395 | 1f | VE638395 | Common | Organization of Instruction
Item Stem | VE638432 | 2 | VE638432 | Common | Organization of Instruction
Sub-item | VE638435 | 2a | VE638435 | Common | Organization of Instruction
Sub-item | VE638438 | 2b | VE638438 | Common | Organization of Instruction
Sub-item | VE638442 | 2c | VE638442 | Common | Organization of Instruction
Discrete | VE638446 | 3 | VE638446 | Common | Organization of Instruction
Item Stem | VE638450 | 4 | VE638450 | Common | Organization of Instruction
Sub-item | VE638453 | 4a | VE638453 | Common | Organization of Instruction
Sub-item | VE638456 | 4b | VE638456 | Common | Organization of Instruction
Sub-item | VE638457 | 4c | VE638457 | Common | Organization of Instruction
Sub-item | VE638462 | 4d | VE638462 | Common | Organization of Instruction
Sub-item | VE638459 | 4e | VE638459 | Common | Organization of Instruction
Sub-item | VE638464 | 4f | VE638464 | Common | Organization of Instruction
Sub-item | VE638470 | 4g | VE638470 | Common | Organization of Instruction
Sub-item | VE638472 | 4h | VF821977 | Common | Organization of Instruction
Sub-item | VE638467 | 4i | VE638467 | Common | Organization of Instruction
Discrete | VE638334 | 5 | VE638334 | Common | Organization of Instruction
Discrete | VE681573 | 6 | VE681573 | Common | Organization of Instruction
Item Stem | VE638483 | 7 | VE638483 | Common | Organization of Instruction
Sub-item | VE638486 | 7a | VE638486 | Common | Organization of Instruction
Sub-item | VE638487 | 7b | VE638487 | Common | Organization of Instruction
Sub-item | VE638490 | 7c | VE638490 | Common | Organization of Instruction
Item Stem | VE638475 | 8 | VE638475 | Common | Availability and Use of Instructional Resources
Sub-item | VE638480 | 8a | VE638480 | Common | Availability and Use of Instructional Resources
Sub-item | VE638484 | 8b | VE638484 | Common | Availability and Use of Instructional Resources
Sub-item | VE638485 | 8c | VE638485 | Common | Availability and Use of Instructional Resources
Sub-item | VE675583 | 8d | VE675583 | Common | Availability and Use of Instructional Resources
Discrete | VE675587 | 9 | VE675587 | Common | Availability and Use of Instructional Resources
Item Stem | VE638517 | 10 | VE638517 | Common | Availability and Use of Instructional Resources
Sub-item | VE638518 | 10a | VE638518 | Common | Availability and Use of Instructional Resources
Sub-item | VE638519 | 10b | VE638519 | Common | Availability and Use of Instructional Resources
Sub-item | VE638520 | 10c | VE638520 | Common | Availability and Use of Instructional Resources
Sub-item | VE638521 | 10d | VE638521 | Common | Availability and Use of Instructional Resources
Sub-item | VE638522 | 10e | VE638522 | Common | Availability and Use of Instructional Resources
Item Stem | VE638436 | 11 | VE638436 | Common | Availability and Use of Instructional Resources
Sub-item | VE638440 | 11a | VE638440 | Common | Availability and Use of Instructional Resources
Sub-item | VE638441 | 11b | VE638441 | Common | Availability and Use of Instructional Resources
Sub-item | VE638443 | 11c | VE638443 | Common | Availability and Use of Instructional Resources
Sub-item | VE638445 | 11d | VE638445 | Common | Availability and Use of Instructional Resources
Sub-item | VE638449 | 11e | VE638449 | Common | Availability and Use of Instructional Resources
Sub-item | VE638452 | 11f | VE638452 | Common | Availability and Use of Instructional Resources
Sub-item | VE638454 | 11g | VE638454 | Common | Availability and Use of Instructional Resources
Sub-item | VE675624 | 11h | VE675624 | Common | Availability and Use of Instructional Resources
Item Stem | VE675659 | 12 | VE675659 | Common | Availability and Use of Instructional Resources
Sub-item | VE677022 | Dropped | Dropped | Common | Availability and Use of Instructional Resources
Sub-item | VE677568 | 12a | VE677568 | Common | Availability and Use of Instructional Resources
Sub-item | VE677569 | 12b | VE677569 | Common | Availability and Use of Instructional Resources
Sub-item | VE677570 | 12c | VE677570 | Common | Availability and Use of Instructional Resources
Sub-item | VE677571 | 12d | VE677571 | Common | Availability and Use of Instructional Resources
Sub-item | VE677572 | 12e | VE677572 | Common | Availability and Use of Instructional Resources
Sub-item | VE677573 | 12f | VE677573 | Common | Availability and Use of Instructional Resources
Sub-item | VE677574 | 12g | VE677574 | Common | Availability and Use of Instructional Resources
Item Stem | VE638523 | 13 | VE638523 | Common | Availability and Use of Instructional Resources
Sub-item | VE638524 | 13a | VE638524 | Common | Availability and Use of Instructional Resources
Sub-item | VE638525 | 13b | VE638525 | Common | Availability and Use of Instructional Resources
Sub-item | VE638526 | 13c | VE638526 | Common | Availability and Use of Instructional Resources
Sub-item | VE638528 | 13d | VE638528 | Common | Availability and Use of Instructional Resources
Sub-item | VE638529 | 13e | VE638529 | Common | Availability and Use of Instructional Resources
Sub-item | VE638533 | 13f | VE638533 | Common | Availability and Use of Instructional Resources
Sub-item | VE638534 | 13g | VE638534 | Common | Availability and Use of Instructional Resources
Sub-item | VE638535 | 13h | VE638535 | Common | Availability and Use of Instructional Resources
Item Stem | VE638496 | 14 | VE638496 | Common | Teacher Preparation
Sub-item | VE638497 | 14a | VE638497 | Common | Teacher Preparation
Sub-item | VE638498 | 14b | VE638498 | Common | Teacher Preparation
Sub-item | VE638504 | 14c | VE638504 | Common | Teacher Preparation
Item Stem | VE638333 | 15 | VE638333 | T&S | Organization of Instruction
Sub-item | VE638336 | 15a | VE638336 | T&S | Organization of Instruction
Sub-item | VE638338 | 15b | VE638338 | T&S | Organization of Instruction
Sub-item | VE638340 | 15c | VE638340 | T&S | Organization of Instruction
Sub-item | VE677585 | 15d | VE677585 | T&S | Organization of Instruction
Item Stem | VE638350 | 16 | VE638350 | T&S | Organization of Instruction
Sub-item | VE638354 | 16a | VE638354 | T&S | Organization of Instruction
Sub-item | VE638355 | 16b | VE638355 | T&S | Organization of Instruction
Sub-item | VE638356 | 16c | VE638356 | T&S | Organization of Instruction
Item Stem | VE638372 | 17 | VE638372 | D&S | Organization of Instruction
Sub-item | VE638375 | 17a | VE638375 | D&S | Organization of Instruction
Sub-item | VE638376 | 17b | VE638376 | D&S | Organization of Instruction
Sub-item | VE638377 | 17c | VE638377 | D&S | Organization of Instruction
Sub-item | VE639184 | 17d | VE639184 | D&S | Organization of Instruction
Sub-item | VE677599 | 17e | VE677599 | D&S | Organization of Instruction
Sub-item | VE677600 | 17f | VE677600 | D&S | Organization of Instruction
Item Stem | VE638380 | 18 | VE638380 | D&S | Organization of Instruction
Sub-item | VE677603 | 18a | VE677603 | D&S | Organization of Instruction
Sub-item | VE638383 | 18b | VE638383 | D&S | Organization of Instruction
Sub-item | VE638384 | 18c | VE638384 | D&S | Organization of Instruction
Sub-item | VE677604 | 18d | VE677604 | D&S | Organization of Instruction
Sub-item | VE638385 | 18e | VE638385 | D&S | Organization of Instruction
Sub-item | VE677605 | 18f | VE677605 | D&S | Organization of Instruction
Sub-item | VE677606 | 18g | VE677606 | D&S | Organization of Instruction
Item Stem | VE638391 | 19 | VE638391 | ICT | Organization of Instruction
Sub-item | VE638396 | 19a | VE638396 | ICT | Organization of Instruction
Sub-item | VE638399 | 19b | VE638399 | ICT | Organization of Instruction
Sub-item | VE677607 | 19c | VE677607 | ICT | Organization of Instruction
Sub-item | VE677609 | 19d | VE677609 | ICT | Organization of Instruction
Sub-item | VF239167 | 19e | VF239167 | ICT | Organization of Instruction
Sub-item | VE677608 | 19f | VE677608 | ICT | Organization of Instruction
Item Stem | VE638410 | 20 | VE638410 | ICT | Organization of Instruction
Sub-item | VE638433 | 20a | VE638433 | ICT | Organization of Instruction
Sub-item | VE638434 | 20b | VE638434 | ICT | Organization of Instruction
Sub-item | VE638428 | 20c | VE638428 | ICT | Organization of Instruction
Sub-item | VE638420 | 20d | VE638420 | ICT | Organization of Instruction