Venkatesh et al./User Acceptance of IT
MIS Quarterly Vol. 27 No. 3, pp. 425-478/September 2003
425
RESEARCH ARTICLE
USER ACCEPTANCE OF INFORMATION
TECHNOLOGY: TOWARD A UNIFIED VIEW¹
By: Viswanath Venkatesh
Robert H. Smith School of Business
University of Maryland
Van Munching Hall
College Park, MD 20742
U.S.A.
vvenkate@rhsmith.umd.edu
Michael G. Morris
McIntire School of Commerce
University of Virginia
Monroe Hall
Charlottesville, VA 22903-2493
U.S.A.
mmorris@virginia.edu
Gordon B. Davis
Carlson School of Management
University of Minnesota
321 19th Avenue South
Minneapolis, MN 55455
U.S.A.
gdavis@csom.umn.edu
Fred D. Davis
Sam M. Walton College of Business
University of Arkansas
Fayetteville, AR 72701-1201
U.S.A.
fdavis@walton.uark.edu
¹Cynthia Beath was the accepting senior editor for this paper.
Abstract
Information technology (IT) acceptance research
has yielded many competing models, each with
different sets of acceptance determinants. In this
paper, we (1) review user acceptance literature
and discuss eight prominent models, (2) empirically compare the eight models and their extensions, (3) formulate a unified model that integrates
elements across the eight models, and (4) empirically validate the unified model. The eight models
reviewed are the theory of reasoned action, the
technology acceptance model, the motivational
model, the theory of planned behavior, a model
combining the technology acceptance model and
the theory of planned behavior, the model of PC
utilization, the innovation diffusion theory, and the
social cognitive theory. Using data from four
organizations over a six-month period with three
points of measurement, the eight models explained between 17 percent and 53 percent of the
variance in user intentions to use information
technology. Next, a unified model, called the
Unified Theory of Acceptance and Use of Technology (UTAUT), was formulated, with four core
determinants of intention and usage, and up to
four moderators of key relationships. UTAUT was
then tested using the original data and found to
outperform the eight individual models (adjusted R² of 69 percent). UTAUT was then confirmed with data from two new organizations with similar results (adjusted R² of 70 percent). UTAUT thus
provides a useful tool for managers needing to

assess the likelihood of success for new technology introductions and helps them understand the
drivers of acceptance in order to proactively design interventions (including training, marketing,
etc.) targeted at populations of users that may be
less inclined to adopt and use new systems. The
paper also makes several recommendations for
future research including developing a deeper
understanding of the dynamic influences studied
here, refining measurement of the core constructs
used in UTAUT, and understanding the organizational outcomes associated with new technology
use.
Keywords: Theory of planned behavior, innovation characteristics, technology acceptance
model, social cognitive theory, unified model,
integrated model
Introduction
The presence of computer and information technologies in today’s organizations has expanded
dramatically. Some estimates indicate that, since
the 1980s, about 50 percent of all new capital
investment in organizations has been in information technology (Westland and Clark 2000). Yet,
for technologies to improve productivity, they must
be accepted and used by employees in organizations. Explaining user acceptance of new technology is often described as one of the most
mature research areas in the contemporary information systems (IS) literature (e.g., Hu et al.
1999). Research in this area has resulted in
several theoretical models, with roots in information systems, psychology, and sociology, that
routinely explain over 40 percent of the variance in
individual intention to use technology (e.g., Davis
et al. 1989; Taylor and Todd 1995b; Venkatesh
and Davis 2000). Researchers are confronted
with a choice among a multitude of models and
find that they must “pick and choose” constructs
across the models, or choose a “favored model”
and largely ignore the contributions from
alternative models. Thus, there is a need for a
review and synthesis in order to progress toward
a unified view of user acceptance.
The current work has the following objectives:

(1) To review the extant user acceptance models: The primary purpose of this review is to assess the current state of knowledge with respect to understanding individual acceptance of new information technologies. This review identifies eight prominent models and discusses their similarities and differences. Some authors have previously observed some of the similarities across models.² However, our review is the first to assess similarities and differences across all eight models, a necessary first step toward the ultimate goal of the paper: the development of a unified theory of individual acceptance of technology. The review is presented in the following section.

(2) To empirically compare the eight models: We conduct a within-subjects, longitudinal validation and comparison of the eight models using data from four organizations. This provides a baseline assessment of the relative explanatory power of the individual models against which the unified model can be compared. The empirical model comparison is presented in the third section.

(3) To formulate the Unified Theory of Acceptance and Use of Technology (UTAUT): Based upon conceptual and empirical similarities across models, we formulate a unified model. The formulation of UTAUT is presented in the fourth section.

(4) To empirically validate UTAUT: An empirical test of UTAUT on the original data provides preliminary support for our contention that UTAUT outperforms each of the eight original models. UTAUT is then cross-validated using data from two new organizations. The empirical validation of UTAUT is presented in the fifth section.
²For example, Moore and Benbasat (1991) adapted the
perceived usefulness and ease of use items from Davis
et al.’s (1989) TAM to measure relative advantage and
complexity, respectively, in their innovation diffusion
model.


[Figure 1 depicts the flow: individual reactions to using information technology → intentions to use information technology → actual use of information technology.]
Figure 1. Basic Concept Underlying User Acceptance Models
Review of Extant User
Acceptance Models
Description of Models
and Constructs
IS research has long studied how and why individuals adopt new information technologies. Within
this broad area of inquiry, there have been several
streams of research. One stream of research
focuses on individual acceptance of technology by
using intention or usage as a dependent variable
(e.g., Compeau and Higgins 1995b; Davis et al.
1989). Other streams have focused on
implementation success at the organizational level
(Leonard-Barton and Deschamps 1988) and task-technology fit (Goodhue 1995; Goodhue and
Thompson 1995), among others. While each of
these streams makes important and unique
contributions to the literature on user acceptance
of information technology, the theoretical models
to be included in the present review, comparison,
and synthesis employ intention and/or usage as
the key dependent variable. The goal here is to
understand usage as the dependent variable. The
role of intention as a predictor of behavior (e.g.,
usage) is critical and has been well-established in
IS and the reference disciplines (see Ajzen 1991;
Sheppard et al. 1988; Taylor and Todd 1995b).
Figure 1 presents the basic conceptual framework
underlying the class of models explaining individual acceptance of information technology that
forms the basis of this research. Our review resulted in the identification of eight key competing
theoretical models. Table 1 describes the eight
models and defines their theorized determinants
of intention and/or usage. The models hypothesize between two and seven determinants of
acceptance, for a total of 32 constructs across
the eight models. Table 2 identifies four key
moderating variables (experience, voluntariness,
gender, and age) that have been found to be
significant in conjunction with these models.
Prior Model Tests and
Model Comparisons
There have been many tests of the eight models
but there have only been four studies reporting
empirically-based comparisons of two or more of
the eight models published in the major information systems journals. Table 3 provides a brief
overview of each of the model comparison
studies. Despite the apparent maturity of the research stream, a comprehensive comparison of
the key competing models has not been conducted in a single study. Below, we identify five
limitations of these prior model tests and comparisons, and how we address these limitations in our
work.
Technology studied: The technologies that
have been studied in many of the model
development and comparison studies have
been relatively simple, individual-oriented
information technologies as opposed to more
complex and sophisticated organizational
technologies that are the focus of managerial
concern and of this study.

Table 1. Models and Theories of Individual Acceptance

Theory of Reasoned Action (TRA)
Drawn from social psychology, TRA is one of the most fundamental and influential theories of human behavior. It has been used to predict a wide range of behaviors (see Sheppard et al. 1988 for a review). Davis et al. (1989) applied TRA to individual acceptance of technology and found that the variance explained was largely consistent with studies that had employed TRA in the context of other behaviors.
Core constructs:
Attitude Toward Behavior: “an individual’s positive or negative feelings (evaluative affect) about performing the target behavior” (Fishbein and Ajzen 1975, p. 216).
Subjective Norm: “the person’s perception that most people who are important to him think he should or should not perform the behavior in question” (Fishbein and Ajzen 1975, p. 302).
Technology Acceptance Model (TAM)
TAM is tailored to IS contexts, and was designed to predict information technology acceptance and usage on the job. Unlike TRA, the final conceptualization of TAM excludes the attitude construct in order to better explain intention parsimoniously. TAM2 extended TAM by including subjective norm as an additional predictor of intention in the case of mandatory settings (Venkatesh and Davis 2000). TAM has been widely applied to a diverse set of technologies and users.
Core constructs:
Perceived Usefulness: “the degree to which a person believes that using a particular system would enhance his or her job performance” (Davis 1989, p. 320).
Perceived Ease of Use: “the degree to which a person believes that using a particular system would be free of effort” (Davis 1989, p. 320).
Subjective Norm: Adapted from TRA/TPB. Included in TAM2 only.
Motivational Model (MM)
A significant body of research in psychology has supported general motivation theory as an explanation for behavior. Several studies have examined motivational theory and adapted it for specific contexts. Vallerand (1997) presents an excellent review of the fundamental tenets of this theoretical base. Within the information systems domain, Davis et al. (1992) applied motivational theory to understand new technology adoption and use (see also Venkatesh and Speier 1999).
Core constructs:
Extrinsic Motivation: The perception that users will want to perform an activity “because it is perceived to be instrumental in achieving valued outcomes that are distinct from the activity itself, such as improved job performance, pay, or promotions” (Davis et al. 1992, p. 1112).
Intrinsic Motivation: The perception that users will want to perform an activity “for no apparent reinforcement other than the process of performing the activity per se” (Davis et al. 1992, p. 1112).

 

Theory of Planned Behavior (TPB)
TPB extended TRA by adding the construct of perceived behavioral control. In TPB, perceived behavioral control is theorized to be an additional determinant of intention and behavior. Ajzen (1991) presented a review of several studies that successfully used TPB to predict intention and behavior in a wide variety of settings. TPB has been successfully applied to the understanding of individual acceptance and usage of many different technologies (Harrison et al. 1997; Mathieson 1991; Taylor and Todd 1995b). A related model is the Decomposed Theory of Planned Behavior (DTPB). In terms of predicting intention, DTPB is identical to TPB. In contrast to TPB but similar to TAM, DTPB “decomposes” attitude, subjective norm, and perceived behavioral control into their underlying belief structures within technology adoption contexts.
Core constructs:
Attitude Toward Behavior: Adapted from TRA.
Subjective Norm: Adapted from TRA.
Perceived Behavioral Control: “the perceived ease or difficulty of performing the behavior” (Ajzen 1991, p. 188). In the context of IS research, “perceptions of internal and external constraints on behavior” (Taylor and Todd 1995b, p. 149).
Combined TAM and TPB (C-TAM-TPB)
This model combines the predictors of TPB with perceived usefulness from TAM to provide a hybrid model (Taylor and Todd 1995a).
Core constructs:
Attitude Toward Behavior: Adapted from TRA/TPB.
Subjective Norm: Adapted from TRA/TPB.
Perceived Behavioral Control: Adapted from TRA/TPB.
Perceived Usefulness: Adapted from TAM.

 

Model of PC Utilization (MPCU)
Derived largely from Triandis’ (1977) theory of human behavior, this model presents a competing perspective to that proposed by TRA and TPB. Thompson et al. (1991) adapted and refined Triandis’ model for IS contexts and used the model to predict PC utilization. However, the nature of the model makes it particularly suited to predict individual acceptance and use of a range of information technologies. Thompson et al. (1991) sought to predict usage behavior rather than intention; however, in keeping with the theory’s roots, the current research will examine the effect of these determinants on intention. Also, such an examination is important to ensure a fair comparison of the different models.
Core constructs:
Job-fit: “the extent to which an individual believes that using [a technology] can enhance the performance of his or her job” (Thompson et al. 1991, p. 129).
Complexity: Based on Rogers and Shoemaker (1971), “the degree to which an innovation is perceived as relatively difficult to understand and use” (Thompson et al. 1991, p. 128).
Long-term Consequences: “Outcomes that have a pay-off in the future” (Thompson et al. 1991, p. 129).
Affect Towards Use: Based on Triandis, affect toward use is “feelings of joy, elation, or pleasure, or depression, disgust, displeasure, or hate associated by an individual with a particular act” (Thompson et al. 1991, p. 127).
Social Factors: Derived from Triandis, social factors are “the individual’s internalization of the reference group’s subjective culture, and specific interpersonal agreements that the individual has made with others, in specific social situations” (Thompson et al. 1991, p. 126).
Facilitating Conditions: Objective factors in the environment that observers agree make an act easy to accomplish. For example, returning items purchased online is facilitated when no fee is charged to return the item. In an IS context, “provision of support for users of PCs may be one type of facilitating condition that can influence system utilization” (Thompson et al. 1991, p. 129).

 

Innovation Diffusion Theory (IDT)
Grounded in sociology, IDT (Rogers 1995) has been used since the 1960s to study a variety of innovations, ranging from agricultural tools to organizational innovation (Tornatzky and Klein 1982). Within information systems, Moore and Benbasat (1991) adapted the characteristics of innovations presented in Rogers and refined a set of constructs that could be used to study individual technology acceptance. Moore and Benbasat (1996) found support for the predictive validity of these innovation characteristics (see also Agarwal and Prasad 1997, 1998; Karahanna et al. 1999; Plouffe et al. 2001).
Core constructs:
Relative Advantage: “the degree to which an innovation is perceived as being better than its precursor” (Moore and Benbasat 1991, p. 195).
Ease of Use: “the degree to which an innovation is perceived as being difficult to use” (Moore and Benbasat 1991, p. 195).
Image: “The degree to which use of an innovation is perceived to enhance one’s image or status in one’s social system” (Moore and Benbasat 1991, p. 195).
Visibility: The degree to which one can see others using the system in the organization (adapted from Moore and Benbasat 1991).
Compatibility: “the degree to which an innovation is perceived as being consistent with the existing values, needs, and past experiences of potential adopters” (Moore and Benbasat 1991, p. 195).
Results Demonstrability: “the tangibility of the results of using the innovation, including their observability and communicability” (Moore and Benbasat 1991, p. 203).
Voluntariness of Use: “the degree to which use of the innovation is perceived as being voluntary, or of free will” (Moore and Benbasat 1991, p. 195).

 

Social Cognitive Theory (SCT)
One of the most powerful theories of human behavior is social cognitive theory (see Bandura 1986). Compeau and Higgins (1995b) applied and extended SCT to the context of computer utilization (see also Compeau et al. 1999); while Compeau and Higgins (1995a) also employed SCT, it was to study performance and thus is outside the goal of the current research. Compeau and Higgins’ (1995b) model studied computer use, but the nature of the model and the underlying theory allow it to be extended to acceptance and use of information technology in general. The original model of Compeau and Higgins (1995b) used usage as a dependent variable, but in keeping with the spirit of predicting individual acceptance, we will examine the predictive validity of the model in the context of intention and usage to allow a fair comparison of the models.
Core constructs:
Outcome Expectations—Performance: The performance-related consequences of the behavior. Specifically, performance expectations deal with job-related outcomes (Compeau and Higgins 1995b).
Outcome Expectations—Personal: The personal consequences of the behavior. Specifically, personal expectations deal with the individual’s esteem and sense of accomplishment (Compeau and Higgins 1995b).
Self-efficacy: Judgment of one’s ability to use a technology (e.g., computer) to accomplish a particular job or task.
Affect: An individual’s liking for a particular behavior (e.g., computer use).
Anxiety: Evoking anxious or emotional reactions when it comes to performing a behavior (e.g., using a computer).

 

Table 2. Role of Moderators in Existing Models

Theory of Reasoned Action
Experience: Experience was not explicitly included in the original TRA. However, the role of experience was empirically examined using a cross-sectional analysis by Davis et al. (1989). No change in the salience of determinants was found. In contrast, Karahanna et al. (1999) found that attitude was more important with increasing experience, while subjective norm became less important with increasing experience.
Voluntariness: Voluntariness was not included in the original TRA. Although not tested, Hartwick and Barki (1994) suggested that subjective norm was more important when system use was perceived to be less voluntary.
Gender: N/A. Age: N/A.

Technology Acceptance Model (and TAM2)
Experience: Experience was not explicitly included in the original TAM. Davis et al. (1989) and Szajna (1996), among others, have provided empirical evidence showing that ease of use becomes nonsignificant with increased experience.
Voluntariness: Voluntariness was not explicitly included in the original TAM. Within TAM2, subjective norm was salient only in mandatory settings, and even then only in cases of limited experience with the system (i.e., a three-way interaction).
Gender: Gender was not included in the original TAM. Empirical evidence demonstrated that perceived usefulness was more salient for men while perceived ease of use was more salient for women (Venkatesh and Morris 2000). The effect of subjective norm was more salient for women in the early stages of experience (i.e., a three-way interaction).
Age: N/A.

Motivational Model
Experience: N/A. Voluntariness: N/A. Gender: N/A. Age: N/A.

 

Theory of Planned Behavior
Experience: Experience was not explicitly included in the original TPB or DTPB. It has been incorporated into TPB via follow-on studies (e.g., Morris and Venkatesh 2000). Empirical evidence has demonstrated that experience moderates the relationship between subjective norm and behavioral intention, such that subjective norm becomes less important with increasing levels of experience. This is similar to the suggestion of Karahanna et al. (1999) in the context of TRA.
Voluntariness: Voluntariness was not included in the original TPB or DTPB. As noted in the discussion regarding TRA, although not tested, subjective norm was suggested to be more important when system use was perceived to be less voluntary (Hartwick and Barki 1994).
Gender: Venkatesh et al. (2000) found that attitude was more salient for men. Both subjective norm and perceived behavioral control were more salient for women in early stages of experience (i.e., three-way interactions).
Age: Morris and Venkatesh (2000) found that attitude was more salient for younger workers while perceived behavioral control was more salient for older workers. Subjective norm was more salient to older women (i.e., a three-way interaction).

Combined TAM-TPB
Experience: Experience was incorporated into this model in a between-subjects design (experienced and inexperienced users). Perceived usefulness, attitude toward behavior, and perceived behavioral control were all more salient with increasing experience while subjective norm became less salient with increasing experience (Taylor and Todd 1995a).
Voluntariness: N/A. Gender: N/A. Age: N/A.

 

Model of PC Utilization
Experience: Thompson et al. (1994) found that complexity, affect toward use, social factors, and facilitating conditions were all more salient with less experience. On the other hand, concern about long-term consequences became increasingly important with increasing levels of experience.
Voluntariness: N/A. Gender: N/A. Age: N/A.

Innovation Diffusion Theory
Experience: Karahanna et al. (1999) conducted a between-subjects comparison to study the impact of innovation characteristics on adoption (no/low experience) and usage behavior (greater experience) and found differences in the predictors of adoption vs. usage behavior. The results showed that for adoption, the significant predictors were relative advantage, ease of use, trialability, results demonstrability, and visibility. In contrast, for usage, only relative advantage and image were significant.
Voluntariness: Voluntariness was not tested as a moderator, but was shown to have a direct effect on intention.
Gender: N/A. Age: N/A.

Social Cognitive Theory
Experience: N/A. Voluntariness: N/A. Gender: N/A. Age: N/A.

 

Table 3. Review of Prior Model Comparisons

Davis et al. (1989). Models compared: TRA, TAM. Context: within-subjects model comparison of intention and use of a word processor. Participants: 107 students, new to the technology. Measurement: two points, 14 weeks apart; cross-sectional analysis at the two points in time. Findings: the variance in intention and use explained by TRA was 32% and 26%, and by TAM was 47% and 51%, respectively.

Mathieson (1991). Models compared: TAM, TPB. Context: between-subjects model comparison of intention to use a spreadsheet and a calculator. Participants: 262 students, with some familiarity with the technology, as each participant had to choose a technology to perform a task. Measurement: one point; cross-sectional. Findings: the variance in intention explained by TAM was 70% and by TPB was 62%.

Taylor and Todd (1995b). Models compared: TAM, TPB/DTPB. Context: within-subjects model comparison of intention to use a computing resource center. Participants: 786 students, many already familiar with the center. Measurement: for a three-month period, all students visiting the center were surveyed (i.e., multiple measures per student); cross-sectional. Findings: the variance in intention explained by TAM was 52%, TPB was 57%, and DTPB was 60%.

Plouffe et al. (2001). Models compared: TAM, IDT. Context: within-subjects model comparison of behavioral intention to use, and use, in a market trial of an electronic payment system using smart cards. Participants: 176 merchants; survey administered after 10 months of use. Measurement: one point; cross-sectional. Findings: the variance in intention explained by TAM was 33% and by IDT was 45%.

Venkatesh et al./User Acceptance of IT
MIS Quarterly Vol. 27 No. 3/September 2003
437
Participants: While there have been some
tests of each model in organizational settings,
the participants in three of the four model
comparison studies have been students—
only Plouffe et al. (2001) conducted their
research in a nonacademic setting. This
research is conducted using data collected
from employees in organizations.
Timing of measurement: In general, most of
the tests of the eight models were conducted
well after the participants’ acceptance or
rejection decision rather than during the
active adoption decision-making process.
Because behavior has become routinized,
individual reactions reported in those studies
are retrospective (see Fiske and Taylor 1991;
Venkatesh et al. 2000). With the exception of
Davis et al. (1989), the model comparisons
examined technologies that were already
familiar to the individuals at the time of measurement. In this paper, we examine technologies from the time of their initial introduction
to stages of greater experience.
Nature of measurement: Even studies that
have examined experience have typically
employed cross-sectional and/or between-subjects comparisons (e.g., Davis et al. 1989;
Karahanna et al. 1999; Szajna 1996; Taylor
and Todd 1995a; Thompson et al. 1994).
This limitation applies to model comparison
studies also. Our work tracks participants
through various stages of experience with a
new technology and compares all models on
all participants.
Voluntary vs. mandatory contexts: Most of
the model tests and all four model comparisons were conducted in voluntary usage
contexts.³ Therefore, one must use caution
when generalizing those results to the
mandatory settings that are possibly of more
interest to practicing managers. This research examines both voluntary and mandatory implementation contexts.
Empirical Comparison of the
Eight Models
Settings and Participants
Longitudinal field studies were conducted at four
organizations among individuals being introduced
to a new technology in the workplace. To help
ensure our results would be robust across
contexts, we sampled for heterogeneity across
technologies, organizations, industries, business
functions, and nature of use (voluntary vs.
mandatory). In addition, we captured perceptions
as the users’ experience with the technology
increased. At each firm, we were able to time our
data collection in conjunction with a training
program associated with the new technology
introduction. This approach is consistent with
prior training and individual acceptance research
where individual reactions to a new technology
were studied (e.g., Davis et al. 1989; Olfman and
Mandviwalla 1994; Venkatesh and Davis 2000).
A pretested questionnaire containing items measuring constructs from all eight models was
administered at three different points in time:
post-training (T1), one month after implementation
(T2), and three months after implementation (T3).
Actual usage behavior was measured over the six-month post-training period. Table 4 summarizes
key characteristics of the organizational settings.
Figure 2 presents the longitudinal data collection
schedule.
Measurement
A questionnaire was created with items validated
in prior research adapted to the technologies and
organizations studied. TRA scales were adapted
from Davis et al. (1989); TAM scales were
adapted from Davis (1989), Davis et al. (1989),
³Notable exceptions are TRA (Hartwick and Barki 1994)
and TAM2 (Venkatesh and Davis 2000) as well as
studies that have incorporated voluntariness as a direct
effect (on intention) in order to account for perceived
nonvoluntary adoption (e.g., Agarwal and Prasad 1997;
Karahanna et al. 1999; Moore and Benbasat 1991).

[Figure 2 depicts the schedule: training, followed by measurement of user reactions at one week, one month, and three months, with system use logged throughout and usage measured through six months.]
Figure 2. Longitudinal Data Collection Schedule

 

Table 4. Description of Studies

Voluntary Use
Study 1a: Entertainment industry; Product Development; sample size 54. Online meeting manager that could be used to conduct Web-enabled video or audio conferences in lieu of face-to-face or traditional phone conferences.
Study 1b: Telecomm Services; Sales; sample size 65. Database application that could be used to access industry standards for particular products in lieu of other resources (e.g., technical manuals, Web sites).

Mandatory Use
Study 2a: Banking; Business Account Management; sample size 58. Portfolio analyzer that analysts were required to use in evaluating existing and potential accounts.
Study 2b: Public Administration; Accounting; sample size 38. Proprietary accounting systems on a PC platform that accountants were required to use for organizational bookkeeping.

and Venkatesh and Davis (2000); MM scales were
adapted from Davis et al. (1992); TPB/DTPB
scales were adapted from Taylor and Todd
(1995a, 1995b); MPCU scales were adapted from
Thompson et al. (1991); IDT scales were adapted
from Moore and Benbasat (1991); and SCT scales
were adapted from Compeau and Higgins (1995a,
1995b) and Compeau et al. (1999). Behavioral
intention to use the system was measured using
a three-item scale adapted from Davis et al.
(1989) and extensively used in much of the
previous individual acceptance research. Seven-point scales were used for all of the aforementioned constructs’ measurement, with 1 being the
negative end of the scale and 7 being the positive
end of the scale. In addition to these measures,
perceived voluntariness was measured as a
manipulation check per the scale of Moore and
Benbasat (1991), where 1 was nonvoluntary and
7 was completely voluntary. The tense of the
verbs in the various scales reflected the timing of
measurement: future tense was employed at T1,
present tense was employed at T2 and T3 (see
Karahanna et al. 1999). The scales used to measure the key constructs are discussed in a later
section where we perform a detailed comparison
(Tables 9 through 13). A focus group of five
business professionals evaluated the questionnaire, following which minor wording changes
were made. Actual usage behavior was measured as duration of use via system logs. Due to
the sensitivity of usage measures to network

availability, in all organizations studied, the system
automatically logged off inactive users after a
period of 5 to 15 minutes, eliminating most idle
time from the usage logs.
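To illustrate this kind of log-based measure, total duration of use per user can be computed by summing session lengths from login/logoff timestamps. This is only a sketch under an assumed log format (the paper does not describe its logging schema, and the data below are invented):

```python
from datetime import datetime

# Hypothetical session log: (user_id, login_time, logoff_time).
# Because inactive users were logged off automatically after 5 to 15
# minutes, summing session lengths approximates active use.
sessions = [
    ("u1", "2003-01-06 09:00", "2003-01-06 09:45"),
    ("u1", "2003-01-07 14:10", "2003-01-07 14:25"),
    ("u2", "2003-01-06 10:00", "2003-01-06 10:05"),
]

def usage_minutes(sessions):
    """Total duration of use per user, in minutes."""
    fmt = "%Y-%m-%d %H:%M"
    totals = {}
    for user, start, end in sessions:
        delta = datetime.strptime(end, fmt) - datetime.strptime(start, fmt)
        totals[user] = totals.get(user, 0.0) + delta.total_seconds() / 60
    return totals

print(usage_minutes(sessions))  # → {'u1': 60.0, 'u2': 5.0}
```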
Results
The perceptions of voluntariness were very high in
studies 1a and 1b (1a: M = 6.50, SD = 0.22; 1b:
M = 6.51, SD = 0.20) and very low in studies 2a
and 2b (2a: M = 1.50, SD = 0.19; 2b: M = 1.49,
SD = 0.18). Given this bi-modal distribution in the
data (voluntary vs. mandatory), we created two
data sets: (1) studies 1a and 1b, and (2) studies
2a and 2b. This is consistent with Venkatesh and
Davis (2000).
Partial least squares (PLS Graph, Version
2.91.03.04) was used to examine the reliability
and validity of the measures. Specifically, 48
separate validity tests (two studies, eight models,
three time periods each) were run to examine
convergent and discriminant validity. In testing
the various models, only the direct effects on
intention were modeled as the goal was to
examine the prediction of intention rather than
interrelationships among determinants of intention; further, the explained variance (R
2) is not
affected by indirect paths. The loading pattern
was found to be acceptable with most loadings
being .70 or higher. All internal consistency
reliabilities were greater than .70. The patterns of
results found in the current work are highly consistent with the results of previous research.
PLS was used to test all eight models at the three
points of measurement in each of the two data
sets. In all cases, we employed a bootstrapping
method (500 times) that used randomly selected
subsamples to test the PLS model.⁴ Tables 5 and 6 present the model validation results at each of the points of measurement. The tables report the
variance explained and the beta coefficients. Key
findings emerged from these analyses. First, all
eight models explained individual acceptance,
with variance in intention explained ranging from
17 percent to 42 percent. Also, a key difference
across studies stemmed from the voluntary vs.
mandatory settings—in mandatory settings (study
2), constructs related to social influence were
significant whereas in the voluntary settings (study
1), they were not significant. Finally, the determinants of intention varied over time, with some
determinants going from significant to nonsignificant with increasing experience.
Following the test of the baseline/original specifications of the eight models (Tables 5 and 6), we
examined the moderating influences suggested
(either explicitly or implicitly) in the literature—i.e.,
experience, voluntariness, gender, and age
(Table 2). In order to test these moderating influences, stay true to the model extensions
(Table 2), and conduct a complete test of the
existing models and their extensions, the data
were pooled across studies and time periods.
Voluntariness was a dummy variable used to
separate the situational contexts (study 1 vs.
study 2); this approach is consistent with previous
research (Venkatesh and Davis 2000). Gender
was coded as a 0/1 dummy variable consistent
with previous research (Venkatesh and Morris
2000) and age was coded as a continuous variable, consistent with prior research (Morris and
Venkatesh 2000). Experience was operationalized via a dummy variable that took ordinal values
of 0, 1, or 2 to capture increasing levels of user
experience with the system (T1, T2, and T3).
Using an ordinal dummy variable, rather than
categorical variables, is consistent with recent
research (e.g., Venkatesh and Davis 2000).
Pooling the data across the three points of measurement resulted in a sample of 645 (215 × 3).
The results of the pooled analysis are shown in
Table 7.
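The coding scheme just described can be sketched as follows. The record layout and the coding direction (e.g., voluntary = 1) are illustrative assumptions; only the structure (0/1 dummies for voluntariness and gender, an ordinal 0/1/2 dummy for experience, continuous age, three stacked waves per respondent) follows the text.

```python
def pool_observations(respondents):
    """Stack per-wave records so each respondent contributes one row
    per time period, as in the pooled N = 645 (215 x 3) analysis.

    Each input dict holds 'study' (1 = voluntary, 2 = mandatory),
    'gender' (0/1 dummy), 'age' (continuous), and 'bi', a list of
    behavioral intention scores at T1, T2, and T3.
    """
    rows = []
    for r in respondents:
        for t, bi in enumerate(r["bi"]):
            rows.append({
                "voluntariness": 1 if r["study"] == 1 else 0,  # direction assumed
                "gender": r["gender"],
                "age": r["age"],
                "experience": t,  # ordinal dummy: 0, 1, 2 for T1-T3
                "bi": bi,
            })
    return rows

sample = [{"study": 1, "gender": 0, "age": 34, "bi": [5, 6, 6]},
          {"study": 2, "gender": 1, "age": 51, "bi": [4, 4, 5]}]
rows = pool_observations(sample)
print(len(rows))  # 6: three waves per respondent
```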
Because pooling across time periods allows the
explicit modeling of the moderating role of experience, there is an increase in the variance explained in the case of TAM2 (Table 7) compared
to a main effects-only model reported earlier
(Tables 5 and 6). One of the limitations of pooling
is that there are repeated measures from the
⁴ The interested reader is referred to a more detailed
exposition of bootstrapping and how it compares to other
techniques of resampling such as jackknifing (see Chin
1998; Efron and Gong 1983).


Table 5. Study 1: Predicting Intention in Voluntary Settings
Time 1 (N = 119) Time 2 (N = 119) Time 3 (N = 119)
Models Independent variables R² Beta R² Beta R² Beta
TRA Attitude toward using tech. .30 .55*** .26 .51*** .19 .43***
Subjective norm .06 .07 .08
TAM/
TAM2
Perceived usefulness .38 .55*** .36 .60*** .37 .61***
Perceived ease of use .22** .03 .05
Subjective norm .02 .06 .06
MM Extrinsic motivation .37 .50*** .36 .47*** .37 .49***
Intrinsic motivation .22** .22** .24***
TPB/
DTPB
Attitude toward using tech. .37 .52*** .25 .50*** .21 .44***
Subjective norm .05 .04 .05
Perceived behavioral control .24*** .03 .02
C-TAM-TPB
Perceived usefulness .39 .56*** .36 .60*** .39 .63***
Attitude toward using tech. .04 .03 .05
Subjective norm .06 .04 .03
Perceived behavioral control .25*** .02 .03
MPCU Job-fit .37 .54*** .36 .60*** .38 .62***
Complexity (reversed) .23*** .04 .04
Long-term consequences .06 .04 .07
Affect toward use .05 .05 .04
Social factors .04 .07 .06
Facilitating conditions .05 .06 .04
IDT Relative advantage .38 .54*** .37 .61*** .39 .63***
Ease of use .26** .02 .07
Result demonstrability .03 .04 .06
Trialability .04 .09 .08
Visibility .06 .03 .06
Image .06 .05 .07
Compatibility .05 .02 .04
Voluntariness .03 .04 .03
SCT Outcome expectations .37 .47*** .36 .60*** .36 .60***
Self-efficacy .20*** .03 .01
Affect .05 .03 .04
Anxiety -.17* .04 .06

Notes: 1. *p < .05; **p < .01; ***p < .001.
2. When the data were analyzed separately for studies 1a and 1b, the pattern of results was
very similar.


Table 6. Study 2: Predicting Intention in Mandatory Settings
Time 1 (N = 96) Time 2 (N = 96) Time 3 (N = 96)
Models Independent variables R² Beta R² Beta R² Beta
TRA Attitude toward using tech. .26 .27*** .26 .28*** .17 .40***
Subjective norm .20** .21** .05
TAM/
TAM2
Perceived usefulness .39 .42*** .41 .50*** .36 .60***
Perceived ease of use .21* .23** .03
Subjective norm .20* .03 .04
MM Extrinsic motivation .38 .47*** .40 .49*** .35 .44***
Intrinsic motivation .21** .24** .19**
TPB/
DTPB
Attitude toward using tech. .34 .22* .28 .36*** .18 .43***
Subjective norm .25*** .26** .05
Perceived behavioral control .19* .03 .08
C-TAM-TPB
Perceived usefulness .36 .42*** .35 .51*** .35 .60***
Attitude toward using tech. .07 .08 .04
Subjective norm .20* .23** .03
Perceived behavioral control .19* .11 .09
MPCU Job-fit .37 .42*** .40 .50*** .37 .61***
Complexity (reversed) .20* .02 .04
Long-term consequences .07 .07 .07
Affect toward use .01 .05 .04
Social factors .18* .23** .02
Facilitating conditions .05 .07 .07
IDT Relative advantage .38 .47*** .42 .52*** .37 .61***
Ease of use .20* .04 .04
Result demonstrability .03 .07 .04
Trialability .05 .04 .04
Visibility .04 .04 .01
Image .18* .27** .05
Compatibility .06 .02 .04
Voluntariness .02 .06 .03
SCT Outcome expectations .38 .46*** .39 .44*** .36 .60***
Self-efficacy .19** .21*** .03
Affect .06 .04 .05
Anxiety -.18* -.16* .02

Notes: 1. *p < .05; **p < .01; ***p < .001.
2. When the data were analyzed separately for studies 2a and 2b, the pattern of results was
very similar.


Table 7. Predicting Intention—Model Comparison Including Moderators: Data
Pooled Across Studies (N = 645)
Model Version Independent Variables R² Beta Explanation
TRA 1 Attitude (A) .36 .41*** Direct effect
Subjective norm (SN) .11
Experience (EXP) .09
Voluntariness (VOL) .04
A × EXP .03
SN × EXP -.17* Effect decreases with increasing
experience
SN × VOL .17* Effect present only in mandatory
settings
TAM 2a
TAM2
Perceived usefulness (U) .53 .48*** Direct effect
Perceived ease of use (EOU) .11
Subjective norm (SN) .09
Experience (EXP) .06
Voluntariness (VOL) .10
EOU × EXP -.20** Effect decreases with increasing
experience
SN × EXP -.15
SN × VOL -.16* Cannot be interpreted due to
presence of higher-order term
EXP × VOL .07
SN × EXP × VOL -.18** Effect exists only in mandatory
settings but decreases with
increasing experience
2b
TAM
incl.
gender
Perceived usefulness (U) .52 .14* Cannot be interpreted due to
presence of interaction term
Percd. ease of use (EOU) .08
Subjective norm (SN) .02
Gender (GDR) .11
Experience (EXP) .07
U × GDR .31*** Effect is greater for men
EOU × GDR -.20** Effect is greater for women
SN × GDR .11
SN × EXP .02
EXP × GDR .09
SN × GDR × EXP .17** Effect is greater for women but
decreases with increasing
experience


Table 7. Predicting Intention—Model Comparison Including Moderators: Data
Pooled Across Studies (N = 645) (Continued)
Model Version Independent Variables R² Beta Explanation
MM 3 Extrinsic motivation .38 .50*** Direct effect
Intrinsic motivation .20*** Direct effect
TPB/
DTPB
4a
TPB
incl. vol
Attitude (A) .36 .40*** Direct effect
Subjective norm (SN) .09
Percd. behrl. control (PBC) .13
Experience (EXP) .10
Voluntariness (VOL) .05
SN × EXP -.17** Effect decreases with increasing
experience
SN × VOL .17** Effect present only in mandatory
settings
4b
TPB
incl.
gender
Attitude (A) .46 .17*** Cannot be interpreted due to
presence of interaction term
Subjective norm (SN) .02
Percd. behrl. control (PBC) .10
Gender (GDR) .01
Experience (EXP) .02
A × GDR .22*** Effect is greater for men
SN × EXP -.12
SN × GDR .10
PBC × GDR .07
PBC × EXP .04
GDR × EXP .15* Term included to test higher
order interactions below
SN × GDR × EXP -.18** Both SN and PBC effects are
higher for women, but the effects
decrease with increasing
experience
PBC × GDR × EXP -.16*
4c
TPB
incl.
age
Attitude (A) .47 .17*** Cannot be interpreted due to
presence of interaction term
Subjective norm (SN) .02
Percd. behrl. control (PBC) .10
Age (AGE) .01
Experience (EXP) .02
A × AGE -.26*** Effect is greater for younger
workers
SN × EXP -.03
SN × AGE .11
PBC × AGE .21** Effect is greater for older
workers


Table 7. Predicting Intention—Model Comparison Including Moderators: Data
Pooled Across Studies (N = 645) (Continued)
Model Version Independent Variables R² Beta Explanation
AGE × EXP .15* Term included to test higher
order interaction below
SN × AGE × EXP -.18** Effect is greater for older
workers, but the effect
decreases with increasing
experience
C-TAM-TPB 5 Perceived usefulness (U) .39 .40*** Direct effect
Attitude (A) .09
Subjective norm (SN) .08
Perceived behavioral control (PBC) .16* Cannot be interpreted due to
presence of interaction term
Experience (EXP) .11
U × EXP .01
A × EXP .08
SN × EXP -.17* Effect decreases with increasing
experience
PBC × EXP -.19** Effect decreases with increasing
experience
MPCU 6 Job-fit (JF) .47 .40*** Direct effect
Complexity (CO) (reversed) .07
Long-term consequences (LTC) .02
Affect toward use (ATU) .05
Social factors (SF) .10
Facilitating condns. (FC) .07
Experience (EXP) .08
CO × EXP (CO—reversed) -.17* Effect decreases with increasing
experience
LTC × EXP .02
ATU × EXP .01
SF × EXP -.20** Effect decreases with increasing
experience
FC × EXP .05
IDT 7 Relative advantage (RA) .40 .49*** Direct effect
Ease of use (EU) .05
Result demonstrability (RD) .02
Trialability (T) .04
Visibility (V) .03
Image (I) .01
Compatibility (COMPAT) .06
Voluntariness of use (VOL) .11


Table 7. Predicting Intention—Model Comparison Including Moderators: Data
Pooled Across Studies (N = 645) (Continued)
Model Version Independent Variables R² Beta Explanation
Experience (EXP) .03
EU × EXP -.16* Effect decreases with increasing
experience
RD × EXP .03
V × EXP -.14* Effect decreases with increasing
experience
I × EXP -.14* Effect decreases with increasing
experience
Voluntariness of use (VOL) .05
SCT 8 Outcome expectations .36 .44*** Direct effect
Self-efficacy .18* Direct effect
Affect .01
Anxiety -.15* Direct effect

Notes: 1. *p < .05; **p < .01; ***p < .001.
2. The significance of the interaction terms was also verified using Chow’s test.
same individuals, resulting in measurement errors
that are potentially correlated across time.
However, cross-sectional analysis using Chow’s
(1960) test of beta differences (p < .05) from each
time period (not shown here) confirmed the
pattern of results shown in Table 7. Those beta
differences with a significance of p < .05 or better
(when using Chow’s test) are discussed in the
“Explanation” column in Table 7. The interaction
terms were modeled as suggested by Chin et al.
(1996) by creating interaction terms that were at
the level of the indicators. For example, if latent
variable A is measured by four indicators (A1, A2,
A3, and A4) and latent variable B is measured by
three indicators (B1, B2, and B3), the interaction
term A × B is specified by 12 indicators, each one
a product term—i.e., A1 × B1, A1 × B2, A1 × B3,
A2 × B1, etc.
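The indicator-level interaction construction attributed to Chin et al. (1996) can be sketched directly. In practice the source indicators are typically standardized before multiplying; that step is omitted here for brevity, and the indicator names are illustrative.

```python
from itertools import product

def interaction_indicators(a_items, b_items):
    """Build product indicators for a latent interaction A x B.

    `a_items` and `b_items` map indicator names to score lists; every
    pairwise product A_i x B_j becomes one indicator of A x B, so four
    indicators of A and three of B yield 4 x 3 = 12 product terms.
    """
    out = {}
    for (na, va), (nb, vb) in product(a_items.items(), b_items.items()):
        out[f"{na}x{nb}"] = [x * y for x, y in zip(va, vb)]
    return out

A = {f"A{i}": [1.0, 2.0] for i in range(1, 5)}  # A1..A4
B = {f"B{j}": [3.0, 4.0] for j in range(1, 4)}  # B1..B3
print(len(interaction_indicators(A, B)))  # 12
```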
With the exception of MM and SCT, the predictive
validity of the models increased after including the
moderating variables. For instance, the variance
explained by TAM2 increased to 53 percent and
TAM including gender increased to 52 percent
when compared to approximately 35 percent in
cross-sectional tests of TAM (without moderators).
The explained variance of TRA, TPB/DTPB,
MPCU, and IDT also improved. For each model,
we have only included moderators previously
tested in the literature. For example, in the case
of TAM and its variations, the extensive prior
empirical work has suggested a larger number of
moderators when compared to moderators suggested for other models. This in turn may have
unintentionally biased the results and contributed
to the high variance explained in TAM-related
models when compared to the other models.
Regardless, it is clear that the extensions to the
various models identified in previous research
mostly enhance the predictive validity of the
various models beyond the original specifications.
In looking at technology use as the dependent
variable, in addition to intention as a key predictor,
TPB and DTPB employ perceived behavioral
control as an additional predictor. MPCU employs
facilitating conditions, a construct similar to perceived behavioral control, to predict behavior.
Thus, intention and perceived behavioral control
were used to predict behavior in the subsequent


Table 8. Predicting Usage Behavior
Use12 Use23 Use34
Independent Variables R² Beta R² Beta R² Beta
Studies
1a and 1b
(voluntary)
(N=119)
Behavioral intention to use (BI) .37 .61*** .36 .60*** .39 .58***
Perceived behavioral control (PBC) .04 .06 .17*
Studies
2a and 2b
(mandatory)
(N = 96)
Behavioral intention to use (BI) .35 .58*** .37 .61*** .39 .56***
Perceived behavioral control (PBC) .07 .07 .20*

Notes: 1. BI, PBC measured at T1 were used to predict usage between time periods 1 and 2 (denoted Use12); BI, PBC measured at T2 were used to predict usage between time periods 2 and 3 (Use23); BI, PBC measured at T3 were used to predict usage between time periods 3 and 4 (Use34).
2. *p < .05; **p < .01; ***p < .001.
time period: intention from T1 was used to predict
usage behavior measured between T1 and T2 and
so on (see Table 8). Since intention was used to
predict actual behavior, concerns associated with
the employment of subjective measures of usage
do not apply here (see Straub et al. 1995). In
addition to intention being a predictor of use, perceived behavioral control became a significant
direct determinant of use over and above intention
with increasing experience (at T3) indicating that
continued use could be directly hindered or
fostered by resources and opportunities. A nearly
identical pattern of results was found when the
data were analyzed using facilitating conditions
(from MPCU) in place of perceived behavioral
control (the specific results are not shown here).
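The lagged structure in Table 8 (intention and perceived behavioral control measured at the start of a period predicting usage logged during that period) amounts to a two-predictor regression per window. Below is a minimal OLS sketch on standardized scores with made-up data; the paper itself estimated these paths in PLS.

```python
import numpy as np

def lagged_usage_betas(bi_t, pbc_t, use_next):
    """Standardized betas from regressing next-period usage on BI and
    PBC measured at the period's start (cf. Table 8)."""
    X = np.column_stack([bi_t, pbc_t]).astype(float)
    X = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    y = np.asarray(use_next, dtype=float)
    y = (y - y.mean()) / y.std(ddof=1)
    design = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return coef[1], coef[2]  # (beta_BI, beta_PBC)

# Made-up scores where usage tracks intention and PBC adds nothing:
b_bi, b_pbc = lagged_usage_betas([1, 2, 3, 4], [1, -1, -1, 1], [1, 2, 3, 4])
print(round(b_bi, 2))  # 1.0
```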
Having reviewed and empirically compared the
eight competing models, we now formulate a
unified theory of acceptance and use of technology (UTAUT). Toward this end, we examine commonalities across models as a first step. Tables
5, 6, 7, and 8 presented cross-sectional tests of
the baseline models and their extensions. Several
consistent findings emerged. First, for every
model, there was at least one construct that was
significant in all time periods and that construct
also had the strongest influence—e.g., attitude in
TRA and TPB/DTPB, perceived usefulness in
TAM/TAM2 and C-TAM-TPB, extrinsic motivation
in MM, job-fit in MPCU, relative advantage in IDT,
and outcome expectations in SCT. Second,
several other constructs were initially significant,
but then became nonsignificant over time, including perceived behavioral control in TPB/DTPB
and C-TAM-TPB, perceived ease of use in TAM/
TAM2, complexity in MPCU, ease of use in IDT,
and self-efficacy and anxiety in SCT. Finally, the voluntary vs. mandatory context did have an influence on the significance of constructs related
to social influence: subjective norm (TPB/DTPB,
C-TAM-TPB and TAM2), social factors (MPCU),
and image (IDT) were only significant in
mandatory implementations.
Formulation of the Unified Theory of Acceptance and Use of Technology (UTAUT)
Seven constructs appeared to be significant direct
determinants of intention or usage in one or more
of the individual models (Tables 5 and 6). Of
these, we theorize that four constructs will play a


[Figure: performance expectancy, effort expectancy, social influence, and facilitating conditions are modeled as determinants of behavioral intention and use behavior; gender, age, experience, and voluntariness of use moderate these relationships.]
Figure 3. Research Model
significant role as direct determinants of user acceptance and usage behavior: performance expectancy, effort expectancy, social influence, and facilitating conditions. As will be explained below, attitude toward using technology, self-efficacy, and anxiety are theorized not to be direct
determinants of intention. The labels used for the
constructs describe the essence of the construct
and are meant to be independent of any particular
theoretical perspective. In the remainder of this
section, we define each of the determinants,
specify the role of key moderators (gender, age,
voluntariness, and experience), and provide the
theoretical justification for the hypotheses.
Figure 3 presents the research model.
Performance Expectancy
Performance expectancy is defined as the degree
to which an individual believes that using the system will help him or her to attain gains in job
performance. The five constructs from the different models that pertain to performance
expectancy are perceived usefulness (TAM/TAM2
and C-TAM-TPB), extrinsic motivation (MM), job-fit
(MPCU), relative advantage (IDT), and outcome
expectations (SCT). Even as these constructs
evolved in the literature, some authors acknowledged their similarities: usefulness and extrinsic
motivation (Davis et al. 1989, 1992), usefulness
and job-fit (Thompson et al. 1991), usefulness and
relative advantage (Davis et al. 1989; Moore and
Benbasat 1991; Plouffe et al. 2001), usefulness
and outcome expectations (Compeau and Higgins
1995b; Davis et al. 1989), and job-fit and outcome
expectations (Compeau and Higgins 1995b).
The performance expectancy construct within
each individual model (Table 9) is the strongest
predictor of intention and remains significant at all
points of measurement in both voluntary and mandatory settings (Tables 5, 6, and 7), consistent
with previous model tests (Agarwal and Prasad


Table 9. Performance Expectancy: Root Constructs, Definitions, and Scales
Construct Definition Items
Perceived
Usefulness
(Davis 1989; Davis et
al. 1989)
The degree to which a
person believes that using
a particular system would
enhance his or her job
performance.
1. Using the system in my job would
enable me to accomplish tasks more
quickly.
2. Using the system would improve my job
performance.
3. Using the system in my job would
increase my productivity.
4. Using the system would enhance my
effectiveness on the job.
5. Using the system would make it easier
to do my job.
6. I would find the system useful in my job.
Extrinsic Motivation
(Davis et al. 1992)
The perception that users will want to perform an activity because it is perceived to be instrumental in achieving valued outcomes that are distinct from the activity itself, such as improved job performance, pay, or promotions.
Extrinsic motivation is operationalized using
the same items as perceived usefulness
from TAM (items 1 through 6 above).
Job-fit
(Thompson et al.
1991)
How the capabilities of a system enhance an individual's job performance.
1. Use of the system will have no effect on
the performance of my job (reverse
scored).
2. Use of the system can decrease the
time needed for my important job
responsibilities.
3. Use of the system can significantly
increase the quality of output on my job.
4. Use of the system can increase the
effectiveness of performing job tasks.
5. Use can increase the quantity of output
for the same amount of effort.
6. Considering all tasks, the general extent
to which use of the system could assist
on the job. (different scale used for this
item).


Table 9. Performance Expectancy: Root Constructs, Definitions, and Scales
(Continued)
Construct Definition Items
Relative Advantage
(Moore and Benbasat
1991)
The degree to which using
an innovation is perceived
as being better than using
its precursor.
1. Using the system enables me to
accomplish tasks more quickly.
2. Using the system improves the quality of
the work I do.
3. Using the system makes it easier to do
my job.
4. Using the system enhances my
effectiveness on the job.
5. Using the system increases my
productivity.
Outcome
Expectations
(Compeau and
Higgins 1995b;
Compeau et al. 1999)
Outcome expectations
relate to the consequences
of the behavior. Based on
empirical evidence, they
were separated into performance expectations
(job-related) and personal
expectations (individual
goals). For pragmatic
reasons, four of the highest
loading items from the
performance expectations
and three of the highest
loading items from the
personal expectations
were chosen from Compeau and Higgins (1995b)
and Compeau et al. (1999)
for inclusion in the current
research. However, our
factor analysis showed the
two dimensions to load on
a single factor.
If I use the system…
1. I will increase my effectiveness on the
job.
2. I will spend less time on routine job
tasks.
3. I will increase the quality of output of my
job.
4. I will increase the quantity of output for
the same amount of effort.
5. My coworkers will perceive me as
competent.
6. I will increase my chances of obtaining a
promotion.
7. I will increase my chances of getting a
raise.

1998; Compeau and Higgins 1995b; Davis et al.
1992; Taylor and Todd 1995a; Thompson et al.
1991; Venkatesh and Davis 2000). However, from
a theoretical point of view, there is reason to
expect that the relationship between performance
expectancy and intention will be moderated by
gender and age. Research on gender differences
indicates that men tend to be highly task-oriented
(Minton and Schneider 1980) and, therefore, performance expectancies, which focus on task
accomplishment, are likely to be especially salient
to men. Gender schema theory suggests that
such differences stem from gender roles and
socialization processes reinforced from birth
rather than biological gender per se (Bem 1981;
Bem and Allen 1974; Kirchmeyer 1997; Lubinski
et al. 1983; Lynott and McCandless 2000; Motowidlo 1982). Recent empirical studies outside the
IT context (e.g., Kirchmeyer 2002; Twenge 1997)
have shown that gender roles have a strong
psychological basis and are relatively enduring,
yet open to change over time (see also Ashmore
1990; Eichinger et al. 1991; Feldman and Aschenbrenner 1983; Helson and Moane 1987).
Similar to gender, age is theorized to play a
moderating role. Research on job-related attitudes (e.g., Hall and Mansfield 1975; Porter 1963)
suggests that younger workers may place more
importance on extrinsic rewards. Gender and age
differences have been shown to exist in technology adoption contexts also (Morris and Venkatesh
2000; Venkatesh and Morris 2000). In looking at
gender and age effects, it is interesting to note
that Levy (1988) suggests that studies of gender
differences can be misleading without reference to
age. For example, given traditional societal gender roles, the importance of job-related factors
may change significantly (e.g., become supplanted by family-oriented responsibilities) for
working women between the time that they enter
the labor force and the time they reach childrearing years (e.g., Barnett and Marshall 1991).
Thus, we expect that the influence of performance expectancy will be moderated by both gender and age.

H1: The influence of performance expectancy on behavioral intention will be moderated by gender and age, such that the effect will be stronger for men and particularly for younger men.
Effort Expectancy
Effort expectancy is defined as the degree of ease
associated with the use of the system. Three constructs from the existing models capture the
concept of effort expectancy: perceived ease of
use (TAM/TAM2), complexity (MPCU), and ease
of use (IDT). As can be seen in Table 10, there is
substantial similarity among the construct definitions and measurement scales. The similarities
among these constructs have been noted in prior
research (Davis et al. 1989; Moore and Benbasat
1991; Plouffe et al. 2001; Thompson et al. 1991).
The effort expectancy construct within each model
(Table 10) is significant in both voluntary and
mandatory usage contexts; however, each one is
significant only during the first time period (post-training, T1), becoming nonsignificant over
periods of extended and sustained usage (see
Tables 5, 6, and 7), consistent with previous
research (e.g., Agarwal and Prasad 1997, 1998;
Davis et al. 1989; Thompson et al. 1991, 1994).
Effort-oriented constructs are expected to be
more salient in the early stages of a new behavior,
when process issues represent hurdles to be
overcome, and later become overshadowed by
instrumentality concerns (Davis et al. 1989;
Szajna 1996; Venkatesh 1999).
Venkatesh and Morris (2000), drawing upon other
research (e.g., Bem and Allen 1974; Bozionelos
1996), suggest that effort expectancy is more salient for women than for men. As noted earlier,
the gender differences predicted here could be
driven by cognitions related to gender roles (e.g.,
Lynott and McCandless 2000; Motowidlo 1982;
Wong et al. 1985). Increased age has been
shown to be associated with difficulty in processing complex stimuli and allocating attention to
information on the job (Plude and Hoyer 1985),
both of which may be necessary when using
software systems. Prior research supports the
notion that constructs related to effort expectancy
will be stronger determinants of individuals’ intention for women (Venkatesh and Morris 2000;
Venkatesh et al. 2000) and for older workers
(Morris and Venkatesh 2000). Drawing from the
arguments made in the context of performance
expectancy, we expect gender, age, and experience to work in concert (see Levy 1988). Thus,
we propose that effort expectancy will be most salient for women, particularly those who are older and with relatively little experience with the system.
H2: The influence of effort expectancy on behavioral intention will be moderated by gender, age, and experience, such that the effect will be stronger for women, particularly older women, and particularly at early stages of experience.


Table 10. Effort Expectancy: Root Constructs, Definitions, and Scales
Construct Definition Items
Perceived Ease of Use
(Davis 1989; Davis et
al. 1989)
The degree to which a
person believes that
using a system would be
free of effort.
1. Learning to operate the system would be
easy for me.
2. I would find it easy to get the system to
do what I want it to do.
3. My interaction with the system would be
clear and understandable.
4. I would find the system to be flexible to
interact with.
5. It would be easy for me to become
skillful at using the system.
6. I would find the system easy to use.
Complexity
(Thompson et al. 1991)
The degree to which a
system is perceived as
relatively difficult to
understand and use.
1. Using the system takes too much time
from my normal duties.
2. Working with the system is so
complicated, it is difficult to understand
what is going on.
3. Using the system involves too much time
doing mechanical operations (e.g., data
input).
4. It takes too long to learn how to use the
system to make it worth the effort.
Ease of Use
(Moore and Benbasat
1991)
The degree to which
using an innovation is
perceived as being
difficult to use.
1. My interaction with the system is clear
and understandable.
2. I believe that it is easy to get the system
to do what I want it to do.
3. Overall, I believe that the system is easy
to use.
4. Learning to operate the system is easy
for me.

Social Influence
Social influence is defined as the degree to which
an individual perceives that important others
believe he or she should use the new system.
Social influence as a direct determinant of behavioral intention is represented as subjective norm in
TRA, TAM2, TPB/DTPB and C-TAM-TPB, social
factors in MPCU, and image in IDT. Thompson et
al. (1991) used the term social norms in defining their construct, and acknowledged its similarity to
subjective norm within TRA. While they have
different labels, each of these constructs contains
the explicit or implicit notion that the individual’s
behavior is influenced by the way in which they
believe others will view them as a result of having
used the technology. Table 11 presents the three
constructs related to social influence: subjective
norm (TRA, TAM2, TPB/DTPB, and C-TAM-TPB),
social factors (MPCU), and image (IDT).
The current model comparison (Tables 5, 6, and
7) found that the social influence constructs listed
above behave similarly. None of the social influence constructs are significant in voluntary contexts; however, each becomes significant when


Table 11. Social Influence: Root Constructs, Definitions, and Scales
Construct Definition Items
Subjective Norm
(Ajzen 1991; Davis et al.
1989; Fishbein and Ajzen
1975; Mathieson 1991;
Taylor and Todd 1995a,
1995b)
The person’s perception
that most people who are
important to him think he
should or should not
perform the behavior in
question.
1. People who influence my
behavior think that I should use
the system.
2. People who are important to me
think that I should use the
system.
Social Factors
(Thompson et al. 1991)
The individual's internalization of the reference group's subjective culture, and specific interpersonal agreements that the individual has made with others, in specific social situations.
1. I use the system because of the
proportion of coworkers who use
the system.
2. The senior management of this
business has been helpful in the
use of the system.
3. My supervisor is very supportive
of the use of the system for my
job.
4. In general, the organization has
supported the use of the system.
Image
(Moore and Benbasat 1991)
The degree to which use of
an innovation is perceived
to enhance one’s image or
status in one’s social
system.
1. People in my organization who
use the system have more
prestige than those who do not.
2. People in my organization who
use the system have a high
profile.
3. Having the system is a status
symbol in my organization.

use is mandated. Venkatesh and Davis (2000)
suggested that such effects could be attributed
to compliance in mandatory contexts that
causes social influences to have a direct effect
on intention; in contrast, social influence in
voluntary contexts operates by influencing perceptions about the technology—the mechanisms at play here are internalization and
identification. In mandatory settings, social
influence appears to be important only in the
early stages of individual experience with the
technology, with its role eroding over time and
eventually becoming nonsignificant with sustained usage (T3), a pattern consistent with the
observations of Venkatesh and Davis (2000).
The role of social influence in technology
acceptance decisions is complex and subject to
a wide range of contingent influences. Social
influence has an impact on individual behavior
through three mechanisms: compliance, internalization, and identification (see Venkatesh and
Davis 2000; Warshaw 1980). While the latter
two relate to altering an individual’s belief structure and/or causing an individual to respond to
potential social status gains, the compliance
mechanism causes an individual to simply alter
his or her intention in response to the social
pressure—i.e., the individual intends to comply
with the social influence. Prior research suggests that individuals are more likely to comply

with others’ expectations when those referent
others have the ability to reward the desired
behavior or punish nonbehavior (e.g., French
and Raven 1959; Warshaw 1980). This view of
compliance is consistent with results in the
technology acceptance literature indicating that
reliance on others’ opinions is significant only in
mandatory settings (Hartwick and Barki 1994),
particularly in the early stages of experience,
when an individual’s opinions are relatively ill-informed (Agarwal and Prasad 1997; Hartwick
and Barki 1994; Karahanna et al. 1999; Taylor
and Todd 1995a; Thompson et al. 1994; Venkatesh and Davis 2000). This normative pressure
will attenuate over time as increasing
experience provides a more instrumental (rather than
social) basis for individual intention to use the
system.
Theory suggests that
women tend to be more
sensitive to others’ opinions and therefore find
social influence to be more salient when forming
an intention to use new technology (Miller 1976;
Venkatesh et al. 2000), with the effect declining
with experience (Venkatesh and Morris 2000).
As in the case of performance and effort expectancies, gender effects may be driven by psychological phenomena embodied within socially constructed gender roles (e.g., Lubinski et al.
1983). Rhodes’ (1983) meta-analytic review of
age effects concluded that affiliation needs increase with
age, suggesting that older workers
are more likely to place increased salience on
social influences, with the effect declining with
experience (Morris and Venkatesh 2000).
Therefore, we expect a complex interaction with
these moderating variables simultaneously influencing the social influence-intention relationship.
H3: The influence of social influence
on behavioral intention will be
moderated by
gender, age,
voluntariness, and experience,
such that the effect will be
stronger for women, particularly
older women, particularly in mandatory settings in the early stages
of experience.
Facilitating Conditions
Facilitating conditions are defined as the degree
to which an individual believes that an organizational and technical infrastructure exists to
support use of the system. This definition captures concepts embodied by three different constructs: perceived behavioral control (TPB/
DTPB, C-TAM-TPB), facilitating conditions
(MPCU), and compatibility (IDT). Each of these
constructs is operationalized to include aspects
of the technological and/or organizational environment that are designed to remove barriers to
use (see Table 12). Taylor and Todd (1995b)
acknowledged the theoretical overlap by
modeling facilitating conditions as a core component of perceived behavioral control in
TPB/DTPB. The compatibility construct from
IDT incorporates items that tap the fit between
the individual’s work style and the use of the
system in the organization.
The empirical evidence presented in Tables 5,
6, 7, and 8 suggests that the relationships
between each of the constructs (perceived
behavioral control, facilitating conditions, and
compatibility) and intention are similar. Specifically, one can see that perceived behavioral
control is significant in both voluntary and mandatory settings immediately following training
(T1), but that the construct’s influence on
intention disappears by T2. It has been
demonstrated that issues related to the support
infrastructure—a core concept within the facilitating conditions construct—are largely captured
within the effort expectancy construct which taps
the ease with which that tool can be applied
(e.g., Venkatesh 2000). Venkatesh (2000)
found support for full mediation of the effect of
facilitating conditions on intention by effort
expectancy. Obviously, if effort expectancy is
not present in the model (as is the case with
TPB/DTPB), then one would expect facilitating
conditions to become predictive of intention.
Our empirical results are consistent with these
arguments. For example, in TPB/DTPB, the
construct is significant in predicting intention;
however, in other cases (MPCU and IDT), it is
nonsignificant in predicting intention. In short,


Table 12. Facilitating Conditions: Root Constructs, Definitions, and Scales

Perceived Behavioral Control (Ajzen 1991; Taylor and Todd 1995a, 1995b)
Definition: Reflects perceptions of internal and external constraints on behavior and encompasses self-efficacy, resource facilitating conditions, and technology facilitating conditions.
Items:
1. I have control over using the system.
2. I have the resources necessary to use the system.
3. I have the knowledge necessary to use the system.
4. Given the resources, opportunities and knowledge it takes to use the system, it would be easy for me to use the system.
5. The system is not compatible with other systems I use.

Facilitating Conditions (Thompson et al. 1991)
Definition: Objective factors in the environment that observers agree make an act easy to do, including the provision of computer support.
Items:
1. Guidance was available to me in the selection of the system.
2. Specialized instruction concerning the system was available to me.
3. A specific person (or group) is available for assistance with system difficulties.

Compatibility (Moore and Benbasat 1991)
Definition: The degree to which an innovation is perceived as being consistent with existing values, needs, and experiences of potential adopters.
Items:
1. Using the system is compatible with all aspects of my work.
2. I think that using the system fits well with the way I like to work.
3. Using the system fits into my work style.

when both performance expectancy constructs
and effort expectancy constructs are present,
facilitating conditions becomes nonsignificant in
predicting intention.
H4a: Facilitating conditions will not
have a significant influence on
behavioral intention.5
The empirical results also indicate that facilitating
conditions do have a direct influence on usage
beyond that explained by behavioral intentions
alone (see Table 8). Consistent with TPB/DTPB,
facilitating conditions are also modeled as a direct
antecedent of usage (i.e., not fully mediated by
intention). In fact, the effect is expected to
increase with experience as users of technology
find multiple avenues for help and support
throughout the organization, thereby removing
impediments to sustained usage (Bergeron et al.
1990). Organizational psychologists have noted
that older workers attach more importance to
receiving help and assistance on the job (e.g., Hall
and Mansfield 1975). This is further underscored
in the context of complex IT use given the
increasing cognitive and physical limitations associated with age. These arguments are in line with
empirical evidence from Morris and Venkatesh
(2000). Thus, when moderated by
experience and
age, facilitating conditions will have a significant
influence on usage behavior.
H4b: The influence of facilitating conditions on usage will be moderated by age and experience, such that the effect will be stronger for older workers, particularly with increasing experience.

5. To test the nonsignificant relationship, we perform a power analysis in the results section.
Constructs Theorized Not to Be
Direct Determinants of Intention
Although self-efficacy and anxiety appeared to be
significant direct determinants of intention in SCT
(see Tables 5 and 6), UTAUT does
not include
them as direct determinants. Previous research
(Venkatesh 2000) has shown self-efficacy and
anxiety to be conceptually and empirically distinct
from effort expectancy (perceived ease of use).
Self-efficacy and anxiety have been modeled as
indirect determinants of intention fully mediated by
perceived ease of use (Venkatesh 2000). Consistent with this, we found that self-efficacy and
anxiety appear to be significant determinants of
intention in SCT—i.e., without controlling for the
effect of effort expectancy. We therefore expect
self-efficacy and anxiety to behave similarly, that
is, to be distinct from effort expectancy and to
have no direct effect on intention above and
beyond effort expectancy.

H5a: Computer self-efficacy will not have a significant influence on behavioral intention.6

H5b: Computer anxiety will not have a significant influence on behavioral intention.7
Attitude toward using technology is defined as an
individual’s overall affective reaction to using a
system. Four constructs from the existing models
align closely with this definition: attitude toward
behavior (TRA, TPB/DTPB, C-TAM-TPB), intrinsic
motivation8 (MM), affect toward use (MPCU), and
affect (SCT). Table 13 presents the definitions
and associated scale items for each construct.
Each construct has a component associated with
generalized feeling/affect associated with a given
behavior (in this case, using technology). In
examining these four constructs, it is evident that
they all tap into an individual’s liking, enjoyment,
joy, and pleasure associated with technology use.
Empirically, the attitude constructs present an
interesting case (see Tables 5, 6, and 7). In some
cases (e.g., TRA, TPB/DTPB, and MM), the
attitude construct is significant across all three
time periods and is also the strongest predictor of
behavioral intention. However, in other cases (C-TAM-TPB, MPCU, and SCT), the construct was
not significant. Upon closer examination, the
attitudinal constructs are significant only when
specific cognitions—in this case, constructs
related to performance
and effort expectancies—
are
not included in the model. There is empirical
evidence to suggest that affective reactions (e.g.,
intrinsic motivation) may operate through effort
expectancy (see Venkatesh 2000). Therefore, we
consider any observed relationship between
attitude and intention to be spurious and resulting
from the omission of the other key predictors
(specifically, performance and effort expectancies). This spurious relationship likely stems
from the effect of performance and effort expectancies on attitude (see Davis et al. 1989). The
non-significance of attitude in the presence of
such other constructs has been reported in previous model tests (e.g., Taylor and Todd 1995a;
Thompson et al. 1991), despite the fact that this
finding is counter to what is theorized in TRA and
TPB/DTPB. Given that we expect strong relationships in UTAUT between performance expectancy
and intention, and between effort expectancy and
intention, we believe that, consistent with the logic
developed here, attitude toward using technology
6. To test the nonsignificant relationship, we perform a power analysis in the results section.

7. To test the nonsignificant relationship, we perform a power analysis in the results section.

8. Some perspectives differ on the role of intrinsic motivation. For example, Venkatesh (2000) models it as a determinant of perceived ease of use (effort expectancy). However, in the motivational model, it is shown as a direct effect on intention and is shown as such here.


Table 13. Attitude Toward Using Technology: Root Constructs, Definitions, and Scales

Attitude Toward Behavior (Davis et al. 1989; Fishbein and Ajzen 1975; Taylor and Todd 1995a, 1995b)
Definition: An individual’s positive or negative feelings about performing the target behavior.
Items:
1. Using the system is a bad/good idea.
2. Using the system is a foolish/wise idea.
3. I dislike/like the idea of using the system.
4. Using the system is unpleasant/pleasant.

Intrinsic Motivation (Davis et al. 1992)
Definition: The perception that users will want to perform an activity for no apparent reinforcement other than the process of performing the activity per se.
Items:
1. I find using the system to be enjoyable.
2. The actual process of using the system is pleasant.
3. I have fun using the system.

Affect Toward Use (Thompson et al. 1991)
Definition: Feelings of joy, elation, or pleasure; or depression, disgust, displeasure, or hate associated by an individual with a particular act.
Items:
1. The system makes work more interesting.
2. Working with the system is fun.
3. The system is okay for some jobs, but not the kind of job I want. (R)

Affect (Compeau and Higgins 1995b; Compeau et al. 1999)
Definition: An individual’s liking of the behavior.
Items:
1. I like working with the system.
2. I look forward to those aspects of my job that require me to use the system.
3. Using the system is frustrating for me. (R)
4. Once I start working on the system, I find it hard to stop.
5. I get bored quickly when using the system. (R)

will not have a direct or interactive influence on
intention.
H5c: Attitude toward using technology will not have a significant influence on behavioral intention.9
Behavioral Intention
Consistent with the underlying theory for all of the
intention models10 discussed in this paper, we
expect that behavioral intention will have a
significant positive influence on technology usage.
H6: Behavioral intention will have a
significant positive influence on
usage.
9. To test the absence of a relationship, we perform a power analysis in the results section.

10. For example, see Sheppard et al. (1988) for an extended review of the intention-behavior relationship.

Empirical Validation of UTAUT
Preliminary Test of UTAUT
Using the post-training data (T1) pooled across
studies (N = 215), a measurement model of the
seven direct determinants of intention (using all
items that related to each of the constructs) was
estimated. All constructs, with the exception of
use, were modeled using reflective indicators. All
internal consistency reliabilities (ICRs) were
greater than .70. The square roots of the shared
variance between the constructs and their measures were higher than the correlations across
constructs, supporting convergent and discriminant validity—see Table 14(a). The reverse-coded affect items of Compeau and Higgins
(1995b) had loadings lower than .60 and were
dropped and the model was reestimated. With
the exception of eight loadings, all others were
greater than .70, the level that is generally considered acceptable (Fornell and Larcker 1981; see
also Compeau and Higgins 1995a, 1995b; Compeau et al. 1999)—see Table 15. Inter-item correlation matrices (details not shown here due to
space constraints) confirmed that intra-construct
item correlations were very high while inter-construct item correlations were low. Results of
similar analyses from subsequent time periods (T2
and T3) also indicated an identical pattern and are
shown in Tables 14(b) and 14(c).
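The two checks described above (ICRs above .70; the square root of the shared variance exceeding the inter-construct correlations) are mechanical and can be sketched numerically. The following Python sketch is illustrative only, not the authors' PLS software; the loadings, AVE values, and correlation matrix are hypothetical stand-ins.

```python
import numpy as np

def composite_reliability(loadings):
    """Internal consistency reliability (ICR) from standardized loadings:
    (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    lam = np.asarray(loadings, dtype=float)
    num = lam.sum() ** 2
    return num / (num + np.sum(1.0 - lam ** 2))

def fornell_larcker_ok(ave, corr):
    """Discriminant validity check: sqrt(AVE) for each construct must
    exceed its correlations with every other construct."""
    sqrt_ave = np.sqrt(np.asarray(ave, dtype=float))
    corr = np.asarray(corr, dtype=float)
    off = np.abs(corr - np.diag(np.diag(corr)))  # zero the diagonal
    return all(sqrt_ave[i] > off[i].max() for i in range(len(sqrt_ave)))

# Hypothetical three-item construct and a two-construct correlation matrix
icr = composite_reliability([0.88, 0.87, 0.86])          # about .90, above .70
ok = fornell_larcker_ok([0.88, 0.83], [[1.00, 0.31],
                                       [0.31, 1.00]])    # True
```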
Although the structural model was tested on all
the items, the sample size poses a limitation here
because of the number of latent variables and
associated items. Therefore, we reanalyzed the
data using only four of the highest loading items
from the measurement model for each of the
determinants; intention only employs three items
and use is measured via a single indicator.
UTAUT was estimated using data pooled across
studies at each time period (N = 215). As with the
model comparison tests, the bootstrapping
method was used here to test the PLS models.
An examination of the measurement model from
the analysis using the reduced set of items was
similar to that reported in Tables 14 and 15 in
terms of reliability, convergent validity, discriminant validity, means, standard deviations, and
correlations. The results from the measurement
model estimations (with reduced items) are not
shown here in the interest of space.
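The bootstrapping step used with the PLS models can be illustrated in miniature: resample cases with replacement, re-estimate a standardized path each time, and divide the estimate by the bootstrap standard error to obtain a t-like ratio. In this sketch (illustrative only; data are simulated) a simple correlation stands in for a PLS path.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_path(x, y, n_boot=500):
    """Standardized path estimate with a bootstrap SE and t-like ratio."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    def beta(xs, ys):
        xs = (xs - xs.mean()) / xs.std()
        ys = (ys - ys.mean()) / ys.std()
        return float(xs @ ys / n)
    est = beta(x, y)
    draws = [beta(x[idx], y[idx])
             for idx in (rng.integers(0, n, n) for _ in range(n_boot))]
    se = float(np.std(draws, ddof=1))
    return est, se, est / se

# Simulated data with a genuine positive path
x = rng.normal(size=215)
y = 0.5 * x + rng.normal(scale=0.8, size=215)
est, se, t = bootstrap_path(x, y)  # t well above conventional cutoffs
```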
An examination of these highest loading items
suggested that they adequately represented the
conceptual underpinnings of the constructs—this
preliminary content validity notwithstanding, we
will return to this issue later in our discussion of
the limitations of this work. Selection based on
item loadings or corrected item-total correlations
is often recommended in the psychometric
literature (e.g., Nunnally and Bernstein 1994).
This approach favors building a homogeneous
instrument with high internal consistency, but
could sacrifice content validity by narrowing
domain coverage.11 The items selected for further
analysis are indicated via an asterisk in Table 15,
and the actual items are shown in Table 16.
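The item-reduction step itself (retaining the highest-loading items per construct) reduces to a sort. A minimal sketch, using a few T1 performance expectancy loadings from Table 15:

```python
def select_items(loadings, k=4):
    """Keep the k highest-loading items for a construct (ties keep their
    original order, since Python's sort is stable)."""
    ranked = sorted(loadings.items(), key=lambda kv: kv[1], reverse=True)
    return [name for name, _ in ranked[:k]]

# T1 loadings for selected performance expectancy items (Table 15)
pe_t1 = {"U6": 0.88, "RA1": 0.87, "RA5": 0.86, "OE7": 0.86, "U2": 0.84}
selected = select_items(pe_t1)  # ['U6', 'RA1', 'RA5', 'OE7']
```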
Tables 17(a) and 17(b) show the detailed model
test results at each time period for intention and
usage, respectively, including all lower-level
interaction terms. Tables 17(a) and 17(b) also
show the model with direct effects only so the
reader can compare that to a model that includes
the moderating influences. The variance explained at various points in time by a direct
effects-only model and the full model including
interaction terms are shown in Tables 17(a) and
17(b) for intention and usage behavior, respectively.
12 We pooled the data across the different
11. Bagozzi and Edwards (1998) discuss promising new alternatives to this approach for coping with the inherent tension between sample size requirements and the number of items, such as full or partial aggregation of items.

12. Since PLS does not produce adjusted R2, we used an alternative procedure to estimate an adjusted R2. PLS estimates a measurement model that in turn is used to generate latent variable observations based on the loadings; these latent variable observations are used to estimate the structural model using OLS. Therefore, in order to estimate the adjusted R2, we used the latent variable observations generated from PLS and analyzed the data using hierarchical regressions in SPSS. We report the R2 and adjusted R2 from the hierarchical regressions. This allows for a direct comparison of variance explained from PLS with variance explained (both R2 and adjusted R2) from traditional OLS regressions and allows the reader to evaluate the variance explained-parsimony trade-off.
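The adjusted R2 procedure of note 12 (OLS on the latent variable scores produced by PLS) amounts to the following. This is an illustrative Python sketch with simulated scores, not the SPSS runs used in the paper.

```python
import numpy as np

def r2_and_adjusted(X, y):
    """Fit OLS on latent variable scores and report R^2 and adjusted R^2."""
    Z = np.column_stack([np.ones(len(y)), X])     # add intercept
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    n, k = Z.shape                                # k includes the intercept
    return r2, 1.0 - (1.0 - r2) * (n - 1) / (n - k)

rng = np.random.default_rng(1)
scores = rng.normal(size=(215, 4))                # stand-in latent scores
y = scores @ np.array([0.5, 0.2, 0.1, 0.0]) + rng.normal(scale=0.8, size=215)
r2, adj = r2_and_adjusted(scores, y)              # adj falls slightly below r2
```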


Table 14. Measurement Model Estimation for the Preliminary Test of UTAUT
(a) T1 Results (N = 215)
ICR Mean S Dev PE EE ATUT SI FC SE ANX BI
PE .92 5.12 1.13 .94
EE .91 4.56 1.40 .31*** .91
ATUT .84 4.82 1.16 .29*** .21** .86
SI .88 4.40 1.04 .30*** -.16* .21** .88
FC .87 4.17 1.02 .18* .31*** .17* .21** .89
SE .89 5.01 1.08 .14 .33*** .16* .18** .33*** .87
ANX .83 3.11 1.14 -.10 -.38*** -.40*** -.20** -.18** -.36*** .84
BI .92 4.07 1.44 .38*** .34*** .25*** .35*** .19** .16* -.23** .84
(b) T2 Results (N = 215)
ICR Mean S Dev PE EE ATUT SI FC SE ANX BI
PE .91 4.71 1.11 .92
EE .90 5.72 0.77 .30*** .90
ATUT .77 5.01 1.42 .25** .20** .86
SI .94 3.88 1.08 .27*** -.19* .21** .88
FC .83 3.79 1.17 .19* .31*** .18* .20** .86
SE .89 5.07 1.14 .24** .35*** .19** .21** .33*** .75
ANX .79 3.07 1.45 -.07 -.32*** -.35*** -.21** -.17* -.35*** .82
BI .90 4.19 0.98 .41*** .27*** .23** .21** .16* .16* -.17* .87
(c) T3 Results (N = 215)
ICR Mean S Dev PE EE ATUT SI FC SE ANX BI
PE .91 4.88 1.17 .94
EE .94 5.88 0.62 .34*** .91
ATUT .81 5.17 1.08 .21** .24** .79
SI .92 3.86 1.60 .27*** -.15 .20** .93
FC .85 3.50 1.12 .19* .28*** .18* .22** .84
SE .90 5.19 1.07 .14* .30*** .22** .20** .33*** .77
ANX .82 2.99 1.03 -.11 -.30*** -.30*** -.20** -.24** -.32*** .82
BI .90 4.24 1.07 .44*** .24** .20** .16* .16* .16* -.14 .89

Notes: 1. ICR: Internal consistency reliability.
2. Diagonal elements are the square root of the shared variance between the constructs and
their measures; off-diagonal elements are correlations between constructs.
3. PE: Performance expectancy; EE: Effort expectancy; ATUT: Attitude toward using
technology; SI: Social influence; FC: Facilitating conditions; SE: Self-efficacy; ANX: Anxiety;
BI: Behavioral intention to use the system.


Table 15. Item Loadings from PLS (N = 215 at Each Time Period)

Loadings are listed as Item, T1, T2, T3.

Performance Expectancy (PE)
U1 .82 .81 .80
U2 .84 .80 .81
U3 .81 .84 .84
U4 .80 .80 .84
U5 .81 .78 .84
*U6 .88 .88 .90
*RA1 .87 .90 .90
RA2 .73 .70 .79
RA3 .70 .69 .83
RA4 .71 .74 .74
*RA5 .86 .88 .94
JF1 .70 .71 .69
JF2 .67 .73 .64
JF3 .74 .70 .79
JF4 .73 .79 .71
JF5 .77 .71 .73
JF6 .81 .78 .81
OE1 .72 .80 .75
OE2 .71 .68 .77
OE3 .75 .70 .70
OE4 .70 .72 .67
OE5 .72 .72 .70
OE6 .69 .79 .74
*OE7 .86 .87 .90

Effort Expectancy (EE)
EOU1 .90 .89 .83
EOU2 .90 .89 .88
*EOU3 .94 .96 .91
EOU4 .81 .84 .88
*EOU5 .91 .90 .90
*EOU6 .92 .92 .93
EU1 .84 .80 .84
EU2 .83 .88 .85
EU3 .89 .84 .80
*EU4 .91 .91 .92
CO1 .83 .82 .81
CO2 .83 .78 .80
CO3 .81 .84 .80
CO4 .75 .73 .78

Attitude Toward Using Technology (ATUT)
*A1 .80 .83 .85
A2 .67 .64 .65
A3 .64 .64 .71
A4 .72 .71 .64
IM1 .70 .78 .72
IM2 .72 .72 .78
IM3 .73 .75 .81
*AF1 .79 .77 .84
*AF2 .84 .83 .84
AF3 .71 .70 .69
*Affect1 .82 .85 .82
Affect2 .67 .70 .70
Affect3 .62 .68 .64

Social Influence (SI)
*SN1 .82 .85 .90
*SN2 .83 .85 .84
SF1 .71 .69 .76
*SF2 .84 .80 .90
SF3 .72 .74 .77
*SF4 .80 .82 .84
I1 .69 .72 .72
I2 .65 .75 .70
I3 .71 .72 .69

Facilitating Conditions (FC)
PBC1 .72 .66 .62
*PBC2 .84 .81 .80
*PBC3 .81 .81 .82
PBC4 .71 .69 .70
*PBC5 .80 .82 .80
FC1 .74 .73 .69
FC2 .78 .77 .64
*FC3 .80 .80 .82
C1 .72 .72 .70
C2 .71 .74 .74
C3 .78 .77 .70

Self-Efficacy (SE)
*SE1 .80 .83 .84
SE2 .78 .80 .80
SE3 .72 .79 .74
*SE4 .80 .84 .84
SE5 .77 .74 .69
*SE6 .81 .82 .82
*SE7 .87 .85 .86
SE8 .70 .69 .72

Anxiety (ANX)
*ANX1 .78 .74 .69
*ANX2 .71 .70 .72
*ANX3 .72 .69 .73
*ANX4 .74 .72 .77

Behavioral Intention (BI)
*BI1 .88 .84 .88
*BI2 .82 .86 .88
*BI3 .84 .88 .87

Notes: 1. The loadings at T1, T2, and T3 respectively are from separate measurement model tests and relate to Tables 14(a), 14(b), and 14(c) respectively.
2. Extrinsic motivation (EM) was measured using the same scale as perceived usefulness (U).
3. Items denoted with an asterisk are those that were selected for inclusion in the test of UTAUT.

Table 16. Items Used in Estimating UTAUT

Performance expectancy
U6: I would find the system useful in my job.
RA1: Using the system enables me to accomplish tasks more quickly.
RA5: Using the system increases my productivity.
OE7: If I use the system, I will increase my chances of getting a raise.

Effort expectancy
EOU3: My interaction with the system would be clear and understandable.
EOU5: It would be easy for me to become skillful at using the system.
EOU6: I would find the system easy to use.
EU4: Learning to operate the system is easy for me.

Attitude toward using technology
A1: Using the system is a bad/good idea.
AF1: The system makes work more interesting.
AF2: Working with the system is fun.
Affect1: I like working with the system.

Social influence
SN1: People who influence my behavior think that I should use the system.
SN2: People who are important to me think that I should use the system.
SF2: The senior management of this business has been helpful in the use of the system.
SF4: In general, the organization has supported the use of the system.

Facilitating conditions
PBC2: I have the resources necessary to use the system.
PBC3: I have the knowledge necessary to use the system.
PBC5: The system is not compatible with other systems I use.
FC3: A specific person (or group) is available for assistance with system difficulties.

Self-efficacy (I could complete a job or task using the system…)
SE1: If there was no one around to tell me what to do as I go.
SE4: If I could call someone for help if I got stuck.
SE6: If I had a lot of time to complete the job for which the software was provided.
SE7: If I had just the built-in help facility for assistance.

Anxiety
ANX1: I feel apprehensive about using the system.
ANX2: It scares me to think that I could lose a lot of information using the system by hitting the wrong key.
ANX3: I hesitate to use the system for fear of making mistakes I cannot correct.
ANX4: The system is somewhat intimidating to me.

Behavioral intention to use the system
BI1: I intend to use the system in the next <n> months.
BI2: I predict I would use the system in the next <n> months.
BI3: I plan to use the system in the next <n> months.

points of measurement by converting time period
(experience) into a moderator. The column titled
“Pooled Analysis” reports the results of this
analysis. Caution is necessary when conducting
such analyses and the reader is referred to
Appendix A for a discussion of the potential constraints of pooling. Also reported in Appendix A
are the statistical tests that we conducted prior to
pooling the data for the analysis in Table 17.
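Converting time period into an experience moderator means estimating one pooled regression whose design matrix carries product terms such as PE x GDR x AGE. A minimal sketch with simulated variables (names and codings are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 645  # three waves of N = 215 stacked

# Continuous moderators are conventionally mean-centered before forming
# product terms in moderated regression; the simulated draws below are
# already centered by construction.
pe = rng.normal(size=n)                        # performance expectancy
gdr = rng.integers(0, 2, n).astype(float)      # gender (0/1)
age = rng.normal(size=n)                       # age (centered)
exp_ = rng.integers(0, 3, n).astype(float)     # experience: 0/1/2 = T1/T2/T3

design = {
    "PE": pe, "GDR": gdr, "AGE": age, "EXP": exp_,
    "PExGDR": pe * gdr,
    "PExAGE": pe * age,
    "GDRxAGE": gdr * age,
    "PExGDRxAGE": pe * gdr * age,   # the three-way term tested for H1
}
```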
As expected, the effect of performance expectancy was in the form of a three-way interaction—the effect was moderated by gender and
age such that it was more salient to younger
workers, particularly men, thus supporting H1.
Note that a direct effect for performance expectancy on intention was observed; however, these
main effects are not interpretable due to the
presence of interaction terms (e.g., Aiken and
West 1991). The effect of effort expectancy was
via a three-way interaction—the effect was moderated by gender and age (more salient to women
and more so to older women). Based on Chow’s
test of beta differences (p < .05), effort expectancy
was more significant with limited exposure to the
technology (effect decreasing with experience),
thus supporting H2. The effect of social influence
was via a four-way interaction—with its role being
more important in the context of mandatory use,
more so among women, and even more so among
older women. The Chow’s test of beta differences
(p < .05) indicated that social influence was even
more significant in the early stages of individual
experience with the technology, thus supporting
H3. Facilitating conditions was nonsignificant as
a determinant of intention, thus supporting H4a.13
As expected, self-efficacy, anxiety, and attitude
did not have any direct effect on intention, thus
supporting H5a, H5b, and H5c. Three of the four
nonsignificant determinants (i.e., self-efficacy,
anxiety, and attitude) were dropped from the
model and the model was reestimated; facilitating
conditions was not dropped from the model
because of its role in predicting use.14 The reestimated model results are shown in Table 17.
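The Chow test of beta differences used above compares the full set of regression coefficients across two subsamples (e.g., early versus later experience). A generic sketch with simulated two-group data; scipy's F distribution supplies the p-value.

```python
import numpy as np
from scipy import stats

def chow_test(X1, y1, X2, y2):
    """F test for equality of all regression coefficients across two groups."""
    def ssr(X, y):
        Z = np.column_stack([np.ones(len(y)), X])   # add intercept
        b, *_ = np.linalg.lstsq(Z, y, rcond=None)
        r = y - Z @ b
        return float(r @ r), Z.shape[1]
    s1, k = ssr(X1, y1)
    s2, _ = ssr(X2, y2)
    sp, _ = ssr(np.vstack([X1, X2]), np.concatenate([y1, y2]))
    n = len(y1) + len(y2)
    f = ((sp - s1 - s2) / k) / ((s1 + s2) / (n - 2 * k))
    return f, float(stats.f.sf(f, k, n - 2 * k))

# Two simulated groups whose slopes differ (0.8 vs. 0.2)
rng = np.random.default_rng(3)
x1 = rng.normal(size=120); y1 = 0.8 * x1 + rng.normal(scale=0.5, size=120)
x2 = rng.normal(size=120); y2 = 0.2 * x2 + rng.normal(scale=0.5, size=120)
f, p = chow_test(x1.reshape(-1, 1), y1, x2.reshape(-1, 1), y2)
```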
In predicting usage behavior (Table 17b), both
behavioral intention (H6) and facilitating conditions
were significant, with the latter’s effect being
moderated by age (the effect being more important to older workers); and, based on Chow’s test
of the beta differences (p < .05), the effect was
stronger with increasing experience with the
technology (H4b). Since H4a, H5a, H5b, and H5c
are hypothesized such that the predictor is not
expected to have an effect on the dependent
variable, a power analysis was conducted to
examine the potential for type II error. The likelihood of detecting medium effects was over 95
percent for an alpha level of .05 and the likelihood
of detecting small effects was under 50 percent.
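Power figures of this kind can be approximated for a bivariate effect with the Fisher z transformation. The assumptions in this sketch are Cohen's conventional benchmarks (r = .30 medium, r = .10 small) and N = 215; the numbers it yields are consistent with the over-95-percent and under-50-percent figures reported above.

```python
import numpy as np
from scipy import stats

def power_correlation(r, n, alpha=0.05):
    """Approximate power to detect a population correlation r at sample
    size n, via the Fisher z transformation of r (two-sided test)."""
    z_eff = np.arctanh(r) * np.sqrt(n - 3)
    z_crit = stats.norm.ppf(1 - alpha / 2)
    return float(stats.norm.cdf(z_eff - z_crit) + stats.norm.cdf(-z_eff - z_crit))

# Cohen's benchmarks: r = .30 (medium), r = .10 (small), at N = 215
medium = power_correlation(0.30, 215)  # > .95
small = power_correlation(0.10, 215)   # < .50
```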
Cross-Validation of UTAUT
Data were gathered from two additional organizations to further validate UTAUT and add
external validity to the preliminary test. The major
details regarding the two participating organizations are provided in Table 18. The data were
collected on the same timeline as studies 1 and 2
(see Figure 2). The data analysis procedures
were the same as the previous studies. The
results were consistent with studies 1 and 2. The
items used in the preliminary test of UTAUT (listed
in Table 16) were used to estimate the measurement and structural models in the new data.
This helped ensure the test of the same model in
both the preliminary test and this validation, thus
limiting any variation due to the changing of items.
UTAUT was then tested separately for each time

13. Since these two hypotheses were about nonsignificant relationships, the supportive results should be interpreted with caution, bearing in mind the power analysis reported in the text. In this case, the concern about the lack of a relationship is somewhat alleviated as some previous research in this area has found attitude and facilitating conditions to not be predictors of intention in the presence of performance expectancy and effort expectancy constructs (see Taylor and Todd 1995b; Venkatesh 2000).

14. Since these two hypotheses were about nonsignificant relationships, the supportive results should be interpreted with caution after consideration of the power analyses reported in the text. The results are presented here for model completeness and to allow the reader to link the current research with existing theoretical perspectives in this domain.


Table 17. Preliminary Test of UTAUT
(a) Dependent Variable: Intention
T1
(N = 215)
T2
(N = 215)
T3
(N = 215)
Pooled
(N = 645)
D ONLY D + I D ONLY D + I D ONLY D + I D ONLY D + I
R2 (PLS) .40 .51 .41 .52 .42 .50 .31 .76
R2 (hierarchical regrn.) .39 .51 .41 .51 .42 .50 .31 .77
Adjusted R2 (hierarchical regrn.) .35 .46 .38 .46 .36 .45 .27 .69
Performance expectancy (PE) .46*** .17* .57*** .15* .59*** .16* .53*** .18*
Effort expectancy (EE) .20* -.12 .08 .02 .09 .11 .10 .04
Social influence (SI) .13 .10 .10 .07 .07 .04 .11 .01
Facilitating conditions (FC) .03 .04 .02 .01 .01 .01 .09 .04
Gender (GDR) .04 .02 .04 .01 .02 .01 .03 .01
Age (AGE) .08 .02 .09 .08 .01 -.08 .06 .00
Voluntariness (VOL) .01 .04 .03 .02 .04 -.04 .02 .00
Experience (EXP) .04 .00
PE × GDR .07 .17* .06 .02
PE × AGE .13 .04 .10 .01
GDR × AGE .07 .02 .02 .06
PE × GDR × AGE .52*** .55*** .57*** .55***
EE × GDR .17* .08 .09 .02
EE × AGE .08 .04 .02 .04
EE × EXP .02
GDR × AGE (included earlier) Earlier Earlier Earlier Earlier
GDR × EXP .02
AGE × EXP .01
EE × GDR × AGE .22** .20*** .18* .01
EE × GDR × EXP -.10
EE × AGE × EXP -.02
GDR × AGE × EXP -.06
EE × GDR × AGE × EXP -.27***
SI × GDR .11 .00 .02 .02
SI × AGE .01 .06 .01 .02
SI × VOL .02 .01 .02 .06
SI × EXP .04
GDR × AGE (included earlier) Earlier Earlier Earlier Earlier
GDR × VOL .01 .04 .02 .01
GDR × EXP (included earlier) Earlier
AGE × VOL .00 .02 .06 .02
AGE × EXP (included earlier) Earlier
VOL × EXP .02
SI × GDR × AGE -.10 .02 .04 .04
SI × GDR × VOL -.01 .03 .02 .01
SI × GDR × EXP .01
SI × AGE × VOL -.17* .02 .06 .06
SI × AGE × EXP .01
SI × VOL × EXP .00


Table 17. Preliminary Test of UTAUT (Continued)
                                T1 (N = 215)   T2 (N = 215)   T3 (N = 215)   Pooled (N = 645)
                                D ONLY  D + I  D ONLY  D + I  D ONLY  D + I  D ONLY  D + I
GDR × AGE × VOL .02 .02 .01 .00
GDR × AGE × EXP (included earlier) Earlier
GDR × VOL × EXP .00
AGE × VOL × EXP .01
SI × GDR × AGE × VOL .25** .23** .20* .04
GDR × AGE × VOL × EXP .02
SI × GDR × AGE × VOL × EXP -.28***
(b) Dependent Variable: Usage Behavior
R2 (PLS) .37 .43 .36 .43 .39 .44 .38 .53
R2 (hierarchical regrn.) .37 .43 .36 .43 .39 .43 .38 .52
Adjusted R2 (hierarchical regrn.) .36 .41 .35 .40 .38 .41 .37 .47
Behavioral intention (BI) .61*** .57*** .60*** .58*** .58*** .59*** .59*** .52***
Facilitating conditions (FC) .05 .07 .06 .07 .18* .07 .10 .11
Age (AGE) .02 .02 .01 .02 .04 .13 .04 .08
Experience (EXP) .06
FC × AGE .22* .24* .27** .02
FC × EXP .00
AGE × EXP .01
FC × AGE × EXP .23**

Notes: 1. D ONLY: Direct effects only; D + I: Direct effects and interaction terms.
2. “Included earlier” indicates that the term has been listed earlier in the table, but is included again for
completeness as it relates to higher-order interaction terms being computed. Grayed out cells are not
applicable for the specific column.

Table 18. Description of Studies
Study  Industry            Functional Area   Sample Size  System Description
Voluntary Use
3      Financial Services  Research          80           This software application was one of the resources available to analysts to conduct research on financial investment opportunities and IPOs
Mandatory Use
4      Retail Electronics  Customer Service  53           Application that customer service representatives were required to use to document and manage service contracts

Table 19. Measurement Model Estimation for the Cross-Validation of UTAUT
(a) T1 Results (N = 133)
ICR Mean S Dev PE EE ATUT SI FC SE ANX BI
PE .90 4.87 1.20 .88
EE .90 3.17 1.09 .30*** .93
ATUT .80 2.67 0.87 .28*** .16* .80
SI .91 4.01 1.07 .34*** -.21* .20* .89
FC .85 3.12 1.11 .18* .30*** .15* .14 .84
SE .85 4.12 1.08 .19** .37*** .17* .22** .41*** .82
ANX .82 2.87 0.89 -.17* -.41*** -.41*** -.22** -.25** -.35*** .80
BI .89 4.02 1.19 .43*** .31*** .23*** .31*** .22* .24** -.21** .85
(b) T2 Results (N = 133)
ICR Mean S Dev PE EE ATUT SI FC SE ANX BI
PE .90 4.79 1.22 .91
EE .92 4.12 1.17 .34*** .89
ATUT .84 3.14 0.79 .32*** .18** .78
SI .92 4.11 1.08 .30*** -.20* .21** .91
FC .88 3.77 1.08 .19* .31*** .13 .11 .86
SE .87 4.13 1.01 .21** .38*** .25** .20** .35*** .80
ANX .83 3.00 0.82 -.17* -.42*** -.32*** -.24** -.26* -.36*** .81
BI .88 4.18 0.99 .40*** .24** .21** .24** .20* .21** -.22** .88
(c) T3 Results (N = 133)
ICR Mean S Dev PE EE ATUT SI FC SE ANX BI
PE .94 5.01 1.17 .92
EE .92 4.89 0.88 .34*** .90
ATUT .83 3.52 1.10 .30*** .19** .80
SI .92 4.02 1.01 .29*** -.21* .21** .84
FC .88 3.89 1.00 .18* .31*** .14 .15 .81
SE .87 4.17 1.06 .18** .32*** .21* .20* .35*** .84
ANX .85 3.02 0.87 -.15* -.31*** -.28*** -.22* -.25* -.38*** .77
BI .90 4.07 1.02 .41*** .20** .21* .19* .20* .20* -.21** .84

Notes: 1. ICR: Internal consistency reliability.
2. Diagonal elements are the square root of the shared variance between the constructs and
their measures; off-diagonal elements are correlations between constructs.
3. PE: Performance expectancy; EE: Effort expectancy; ATUT: Attitude toward using
technology; SI: Social influence; FC: Facilitating conditions; SE: Self-efficacy; ANX: Anxiety;
BI: Behavioral intention to use the system.
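The diagonal convention in Table 19 is the standard Fornell and Larcker (1981) discriminant-validity check: the square root of each construct's average variance extracted (AVE) should exceed its correlations with every other construct. A small sketch of that computation, with illustrative loadings and an illustrative correlation rather than the paper's estimates:

```python
# Fornell-Larcker check: sqrt(AVE) on the diagonal should exceed
# the off-diagonal inter-construct correlations.
import numpy as np

loadings = {
    "PE": np.array([0.91, 0.90, 0.94, 0.89]),  # hypothetical item loadings
    "EE": np.array([0.91, 0.92, 0.93, 0.87]),
}

def sqrt_ave(lam):
    """Square root of average variance extracted from standardized loadings."""
    return float(np.sqrt(np.mean(lam ** 2)))

r_pe_ee = 0.30  # illustrative PE-EE correlation
for name, lam in loadings.items():
    print(name, round(sqrt_ave(lam), 2), sqrt_ave(lam) > r_pe_ee)
```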

Table 20. Item Loadings from PLS (N = 133 at each time period)
                                          T1   T2   T3
Performance Expectancy (PE)
  U6                                      .91  .92  .91
  RA1                                     .90  .89  .88
  RA5                                     .94  .89  .90
  OE7                                     .89  .90  .91
Effort Expectancy (EE)
  EOU3                                    .91  .90  .94
  EOU5                                    .92  .91  .90
  EOU6                                    .93  .90  .89
  EU4                                     .87  .87  .90
Attitude Toward Using Technology (ATUT)
  A1                                      .84  .80  .86
  AF1                                     .82  .83  .77
  AF2                                     .80  .80  .76
  Affect1                                 .87  .84  .76
Social Influence (SI)
  SN1                                     .94  .90  .90
  SN2                                     .90  .93  .88
  SF2                                     .89  .92  .94
  SF4                                     .92  .81  .79
Facilitating Conditions (FC)
  PBC2                                    .84  .88  .85
  PBC3                                    .88  .89  .88
  PBC5                                    .86  .89  .84
  FC3                                     .87  .78  .81
Self-Efficacy (SE)
  SE1                                     .90  .84  .88
  SE4                                     .88  .82  .81
  SE6                                     .80  .85  .79
  SE7                                     .81  .77  .75
Anxiety (ANX)
  ANX1                                    .80  .84  .80
  ANX2                                    .84  .84  .82
  ANX3                                    .83  .80  .83
  ANX4                                    .84  .77  .83
Behavioral Intention (BI)
  BI1                                     .92  .90  .91
  BI2                                     .90  .90  .91
  BI3                                     .90  .92  .92

Note: The loadings at T1, T2, and T3 are from separate measurement model tests and relate to Tables 19(a), 19(b), and 19(c) respectively.

Table 21. Cross-Validation of UTAUT
(a) Dependent Variable: Intention
                                T1 (N = 133)   T2 (N = 133)   T3 (N = 133)   Pooled (N = 399)
                                D ONLY  D + I  D ONLY  D + I  D ONLY  D + I  D ONLY  D + I
R2 (PLS) .42 .52 .41 .52 .42 .51 .36 .77
R2 (hierarchical regrn.) .41 .52 .41 .52 .42 .51 .36 .77
Adjusted R2 (hierarchical regrn.) .37 .48 .36 .47 .36 .46 .30 .70
Performance expectancy (PE) .45*** .15 .59*** .16* .59*** .15* .53*** .14
Effort expectancy (EE) .22** .02 .06 .06 .04 .01 .10 .02
Social influence (SI) .02 .04 .04 .04 .02 .01 .02 .02
Facilitating conditions (FC) .07 .01 .00 .08 .01 .00 .07 .01
Gender (GDR) .02 .06 .01 .04 .07 .02 .03 .03
Age (AGE) .01 .00 .07 .02 .02 .01 .07 .01
Voluntariness (VOL) .00 .00 .01 .06 .02 .04 .00 .01
Experience (EXP) .08 .00
PE × GDR .14 .17* .18* .01
PE × AGE .06 .01 .02 -.04
GDR × AGE .02 .04 .04 -.02
EE × GDR -.06 .02 .04 -.09
EE × AGE -.02 .01 .02 -.04
EE × EXP .01

Table 21. Cross-Validation of UTAUT (Continued)
                                T1 (N = 133)   T2 (N = 133)   T3 (N = 133)   Pooled (N = 399)
                                D ONLY  D + I  D ONLY  D + I  D ONLY  D + I  D ONLY  D + I
GDR × AGE (included earlier) Earlier Earlier Earlier Earlier
GDR × EXP .02
AGE × EXP .06
EE × GDR × AGE .21** .18* .16* .04
EE × GDR × EXP .00
EE × AGE × EXP .00
GDR × AGE × EXP .01
EE × GDR × AGE × EXP -.25***
SI × GDR -.06 .00 .02 -.07
SI × AGE .02 .01 .04 .02
SI × VOL .01 .04 .00 .00
SI × EXP .02
GDR × AGE (included earlier) Earlier Earlier Earlier Earlier
GDR × VOL .04 .02 -.03 .02
GDR × EXP (included earlier) Earlier
AGE × VOL .00 .01 .00 .07
AGE × EXP (included earlier) Earlier
VOL × EXP .02
SI × GDR × AGE -.03 -.01 -.07 .00
SI × GDR × VOL .04 -.03 .04 .00
SI × GDR × EXP .00
SI × AGE × VOL .06 .02 .00 .01
SI × AGE × EXP .07
SI × VOL × EXP .02
GDR × AGE × VOL .01 .06 .01 .04
GDR × AGE × EXP (included earlier) Earlier
GDR × VOL × EXP .01
AGE × VOL × EXP .00
SI × GDR × AGE × VOL .27*** .21** .16* .01
GDR × AGE × VOL × EXP .00
SI × GDR × AGE × VOL × EXP -.29***
(b) Dependent Variable: Usage Behavior
R2 (PLS) .37 .44 .36 .41 .36 .44 .38 .52
R2 (hierarchical regrn.) .37 .43 .36 .41 .36 .44 .37 .52
Adjusted R2 (hierarchical regrn.) .36 .41 .35 .38 .35 .41 .36 .48
Behavioral intention (BI) .60*** .56*** .59*** .50*** .59*** .56*** .59*** .51***
Facilitating conditions (FC) .04 .11 .01 .01 .02 .06 .14* .08
Age (AGE) .06 .02 .06 -.03 -.03 -.01 .06 .02
Experience (EXP) .10
FC × AGE .17* .21** .24** .01
FC × EXP -.06
AGE × EXP -.07
FC × AGE × EXP .22**

Notes: 1. D ONLY: Direct effects only; D + I: Direct effects and interaction terms.
2. “Included earlier” indicates that the term has been listed earlier in the table, but is included again for
completeness as it relates to higher-order interaction terms being computed. Grayed out cells are not
applicable for the specific column.

period (N = 133 at each time period). The measurement models are shown in Tables 19 and 20.
The pattern of results in this validation (Tables
21(a) and 21(b)) mirrors what was found in the
preliminary test (Table 17). The last column of
Tables 21(a) and 21(b) reports observations from
the pooled analysis as before. Appendix B reports
the statistical tests we conducted prior to pooling
the data for the cross-validation test, consistent
with the approach taken in the preliminary test.
Insofar as the no-relationship hypotheses were
concerned, the power analysis revealed a high
likelihood (over 95 percent) of detecting medium
effects. The variance explained was quite comparable to that found in the preliminary test of
UTAUT.
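The power figure quoted above comes from a standard regression power analysis. A sketch of that calculation using Cohen's convention for a medium effect (f² = 0.15) and the noncentral F distribution; the sample size and predictor count below are illustrative choices, and the paper's own analysis depends on its exact design:

```python
# Power of the regression F-test for a medium effect, via the
# noncentral F distribution (lambda = f2 * (u + v + 1)).
from scipy import stats

n, k = 133, 7                      # sample size, number of predictors (illustrative)
f2 = 0.15                          # medium effect size for regression (Cohen)
u, v = k, n - k - 1                # numerator / denominator degrees of freedom
nc = f2 * (u + v + 1)              # noncentrality parameter
f_crit = stats.f.ppf(0.95, u, v)   # critical F at alpha = .05
power = 1 - stats.ncf.cdf(f_crit, u, v, nc)
print(round(power, 2))
```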
Discussion
The present research set out to integrate the
fragmented theory and research on individual
acceptance of information technology into a unified theoretical model that captures the essential
elements of eight previously established models.
First, we identified and discussed the eight
specific models of the determinants of intention
and usage of information technology. Second,
these models were empirically compared using
within-subjects, longitudinal data from four organizations. Third, conceptual and empirical similarities across the eight models were used to
formulate the Unified Theory of Acceptance and
Use of Technology (UTAUT). Fourth, the UTAUT
was empirically tested using the original data from
the four organizations and then cross-validated
using new data from an additional two organizations. These tests provided strong empirical
support for UTAUT, which posits three direct
determinants of intention to use (performance
expectancy, effort expectancy, and social influence) and two direct determinants of usage
behavior (intention and facilitating conditions).
Significant moderating influences of experience,
voluntariness, gender, and age were confirmed as
integral features of UTAUT. UTAUT was able to
account for 70 percent of the variance (adjusted R²) in usage intention—a substantial improvement
over any of the original eight models and their
extensions. Further, UTAUT was successful in
integrating key elements from among the initial set
of 32 main effects and four moderators as
determinants of intention and behavior collectively
posited by eight alternate models into a model that
incorporated four main effects and four
moderators.
Thus, UTAUT is a definitive model that synthesizes what is known and provides a foundation to
guide future research in this area. By encompassing the combined explanatory power of the
individual models and key moderating influences,
UTAUT advances cumulative theory while retaining a parsimonious structure. Figure 3 presents the model proposed and supported.
Table 22 presents a summary of the findings. It
should be noted that performance expectancy
appears to be a determinant of intention in most
situations: the strength of the relationship varies
with gender and age such that it is more significant for men and younger workers. The effect of
effort expectancy on intention is also moderated
by gender and age such that it is more significant
for women and older workers, and those effects
decrease with experience. The effect of social
influence on intention is contingent on all four
moderators included here such that we found it to
be nonsignificant when the data were analyzed
without the inclusion of moderators. Finally, the
effect of facilitating conditions on usage was only
significant when examined in conjunction with the
moderating effects of age and experience—i.e.,
they only matter for older workers in later stages
of experience.
Prior to discussing the implications of this work, it
is necessary to recognize some of its limitations.
One limitation concerns the scales used to measure the core constructs. For practical analytical
reasons, we operationalized each of the core
constructs in UTAUT by using the highest-loading
items from each of the respective scales. This
approach is consistent with recommendations in
the psychometric literature (e.g., Nunnally and
Bernstein 1994). Such pruning of the instrument
was the only way to have the degrees of freedom
necessary to model the various interaction terms

Table 22. Summary of Findings
Hypothesis  Dependent Variable    Independent Variable              Moderators                              Explanation
H1          Behavioral intention  Performance expectancy            Gender, Age                             Effect stronger for men and younger workers
H2          Behavioral intention  Effort expectancy                 Gender, Age, Experience                 Effect stronger for women, older workers, and those with limited experience
H3          Behavioral intention  Social influence                  Gender, Age, Voluntariness, Experience  Effect stronger for women, older workers, under conditions of mandatory use, and with limited experience
H4a         Behavioral intention  Facilitating conditions           None                                    Nonsignificant due to the effect being captured by effort expectancy
H4b         Usage                 Facilitating conditions           Age, Experience                         Effect stronger for older workers with increasing experience
H5a         Behavioral intention  Computer self-efficacy            None                                    Nonsignificant due to the effect being captured by effort expectancy
H5b         Behavioral intention  Computer anxiety                  None                                    Nonsignificant due to the effect being captured by effort expectancy
H5c         Behavioral intention  Attitude toward using technology  None                                    Nonsignificant due to the effect being captured by performance expectancy and effort expectancy
H6          Usage                 Behavioral intention              None                                    Direct effect

at the item level as recommended by Chin et al.
(1996). However, one danger of this approach is
that facets of each construct can be eliminated,
thus threatening content validity. Specifically, we
found that choosing the highest-loading items
resulted in items from some of the models not
being represented in some of the core constructs
(e.g., items from MPCU were not represented in
performance expectancy). Therefore, the measures for UTAUT should be viewed as preliminary
and future research should be targeted at more
fully developing and validating appropriate scales
for each of the constructs with an emphasis on
content validity, and then revalidating the model
specified herein (or extending it accordingly) with
the new measures. Our research employed standard measures of intention, but future research
should examine alternative measures of intention
and behavior in revalidating or extending the
research presented here to other contexts.
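The item-level interaction modeling recommended by Chin et al. (1996) follows the product-indicator approach: standardize each construct's items, then cross them with the (standardized) moderator so the products serve as indicators of the interaction construct. A minimal sketch with synthetic item data and hypothetical item counts:

```python
# Product-indicator construction for a latent PE x GDR interaction.
import numpy as np

rng = np.random.default_rng(1)
n = 215
pe_items = rng.normal(size=(n, 2))               # two pruned PE items (synthetic)
gdr = rng.integers(0, 2, (n, 1)).astype(float)   # gender dummy (synthetic)

def standardize(x):
    return (x - x.mean(axis=0)) / x.std(axis=0)

pe_z = standardize(pe_items)
gdr_z = standardize(gdr)

# Broadcasting (n, 2) * (n, 1): each PE item crossed with the moderator
# yields one indicator of the PE x GDR interaction construct.
pe_x_gdr = pe_z * gdr_z
```

This is why pruning the scales mattered: every retained item multiplies the number of product indicators, and the available degrees of freedom bound how many such terms can be modeled.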
From a theoretical perspective, UTAUT provides
a refined view of how the determinants of intention
and behavior evolve over time. It is important to emphasize that most of the key relationships in
the model are moderated. For example, age has
received very little attention in the technology
acceptance research literature, yet our results
indicate that it moderates all of the key relationships in the model. Gender, which has received
some recent attention, is also a key moderating
influence; however, consistent with findings in the
sociology and social psychology literature (e.g.,
Levy 1988), it appears to work in concert with age,
a heretofore unexamined interaction. For example, prior research has suggested that effort
expectancy is more salient for women (e.g.,
Venkatesh and Morris 2000). While this may be
true, our findings suggest this is particularly true
for the older generation of workers and those with
relatively little experience with a system. While
existing studies have contributed to our understanding of gender and age influences independently, the present research illuminates the interplay of these two key demographic variables and
adds richness to our current understanding of the
phenomenon. We interpret our findings to suggest that as the younger cohort of employees in
the workforce mature, gender differences in how
each perceives information technology may
disappear. This is a hopeful sign and suggests
that oft-mentioned gender differences in the use of
information technology may be transitory, at least
as they relate to a younger generation of workers
raised and educated in the Digital Age.
The complex nature of the interactions observed,
particularly for gender and age, raises several
interesting issues to investigate in future research,
especially given the interest in today’s societal
and workplace environments to create equitable
settings for women and men of all ages. Future
research should focus on identifying the potential
“magic number” for age where effects begin to
appear (say for effort expectancy) or disappear
(say for performance expectancy). While gender
and age are the variables that reveal an interesting pattern of results, future research should
identify the underlying influential mechanisms—
potential candidates here include computer
literacy and social or cultural background, among
others. Finally, although gender moderates three
key relationships, it is imperative to understand
the importance of gender roles and the possibility
that “psychological gender” is the root cause for
the effects observed. Empirical evidence has
demonstrated that gender roles can have a profound impact on individual attitudes and behaviors
both within and outside the workplace (e.g., Baril
et al. 1989; Feldman and Aschenbrenner 1983;
Jagacinski 1987; Keys 1985; Roberts 1997; Sachs
et al. 1992; Wong et al. 1985). Specifically, gender effects observed here could be a manifestation of effects caused by masculinity, femininity,
and androgyny rather than just “biological sex”
(e.g., Lubinski et al. 1983). Future work might be
directed at more closely examining the importance
of gender roles and exploring the socio-psychological basis for gender as a means for better
understanding its moderating role.
As is evident from the literature, the role of social
influence constructs has been controversial.
Some have argued for their inclusion in models of
adoption and use (e.g., Taylor and Todd 1995b;
Thompson et al. 1991), while others have not
included them (e.g., Davis et al. 1989). Previous
work has found social influence to be significant
only in mandatory settings (see Hartwick and
Barki 1994; Venkatesh and Davis 2000). Other
work has found social influence to be more
significant among women in early stages of
experience (e.g., Venkatesh and Morris 2000).
Still other research has found social influence to
be more significant among older workers (e.g.,
Morris and Venkatesh 2000). This research is
among the first to examine these moderating influences in concert. Our results suggest that social
influences do matter; however, they are more
likely to be salient to older workers, particularly
women, and even then during early stages of
experience/adoption. This pattern mirrors that for
effort expectancy with the added caveat that social
influences are more likely to be important in
mandatory usage settings. The contingencies
identified here provide some insights into the way
in which social influences change over time and
may help explain some of the equivocal results
reported in the literature. By helping to clarify the
contingent nature of social influences, this paper
sheds light on when social influence is likely to
play an important role in driving behavior and
when it is less likely to do so.

UTAUT underscores this point and highlights the
importance of contextual analysis in developing
strategies for technology implementation within
organizations. While each of the existing models
in the domain is quite successful in predicting
technology usage behavior, it is only when one
considers the complex range of potential moderating influences that a more complete picture of
the dynamic nature of individual perceptions about
technology begins to emerge. Despite the ability
of the existing models to predict intention and
usage, current theoretical perspectives on individual acceptance are notably weak in providing
prescriptive guidance to designers. For example,
applying any of the models presented here might
inform a designer that some set of individuals
might find a new system difficult to use. Future
research should focus on integrating UTAUT with
research that has identified causal antecedents of
the constructs used within the model (e.g.,
Karahanna and Straub 1999; Venkatesh 2000;
Venkatesh and Davis 2000) in order to provide a
greater understanding of how the cognitive phenomena that were the focus of this research are
formed. Examples of previously examined determinants of the core predictors include system
characteristics (Davis et al. 1989) and self-efficacy
(Venkatesh 2000). Additional determinants that
have not been explicitly tied into this stream but
merit consideration in future work include task-technology fit (Goodhue and Thompson 1995) and
individual ability constructs such as “g”—general
cognitive ability/intelligence (Colquitt et al. 2000).
While the variance explained by UTAUT is quite
high for behavioral research, further work should
attempt to identify and test additional boundary
conditions of the model in an attempt to provide
an even richer understanding of technology adoption and usage behavior. This might take the form
of additional theoretically motivated moderating
influences, different technologies (e.g., collaborative systems, e-commerce applications), different
user groups (e.g., individuals in different functional
areas), and other organizational contexts (e.g.,
public or government institutions). Results from
such studies will have the important benefit of
enhancing the overall generalizability of UTAUT
and/or extending the existing work to account for
additional variance in behavior. Specifically, given
the extensive moderating influences examined
here, a research study that examines the generalizability of these findings with significant representation in each cell (total number of cells: 24;
two levels of voluntariness, three levels of experience [no, limited, more], two levels of gender, and
at least two levels of age [young vs. old]) would be
valuable. Such a study would allow a pairwise,
inter-cell comparison using the rigorous Chow’s
test and provide a clear understanding of the
nature of effects for each construct in each cell.
Related to the predictive validity of this class of
models in general and UTAUT in particular is the
role of intention as a key criterion in user acceptance research—future research should investigate other potential constructs such as behavioral
expectation (Warshaw and Davis 1985) or habit
(Venkatesh et al. 2000) in the nomological network. Employing behavioral expectation will help
account for anticipated changes in intention
(Warshaw and Davis 1985) and thus shed light
even in the early stages of the behavior about the
actual likelihood of behavioral performance since
intention only captures internal motivations to
perform the behavior. Recent evidence suggests
that sustained usage behavior may not be the
result of deliberated cognitions and may simply be
routinized or automatic responses to stimuli (see
Venkatesh et al. 2000).
One of the most important directions for future
research is to tie this mature stream of research
into other established streams of work. For
example, little to no research has addressed the
link between user acceptance and individual or
organizational usage outcomes. Thus, while it is
often assumed that usage will result in positive
outcomes, this remains to be tested. The unified
model presented here might inform further inquiry
into the short- and long-term effects of information
technology implementation on job-related outcomes such as productivity, job satisfaction,
organizational commitment, and other performance-oriented constructs. Future research
should study the degree to which systems perceived as successful from an IT adoption perspective (i.e., those that are liked and highly used by users) are considered a success from an organizational perspective.

Conclusion
In the study of information technology implementations in organizations, there has been a proliferation of competing explanatory models of individual acceptance of information technology. The present work advances individual acceptance research by unifying the theoretical perspectives common in the literature and incorporating four moderators to account for dynamic influences including organizational context, user experience, and demographic characteristics.
Following from the Unified Theory of Acceptance and Use of Technology presented here, future research should focus on identifying constructs that can add to the prediction of intention and behavior over and above what is already known and understood. Given that UTAUT explains as much as 70 percent of the variance in intention, it is possible that we may be approaching the practical limits of our ability to explain individual acceptance and usage decisions in organizations.
Acknowledgements
The authors thank Cynthia Beath (the senior
editor), the associate editor, and the three anonymous reviewers. We also thank Heather Adams,
Wynne Chin, Deborah Compeau, Shawn Curley,
Ruth Kanfer, Phillip Ackerman, Tracy Ann Sykes,
Peter Todd, Shreevardhan Lele, and Paul Zantek.
Last, but not least, we thank Jan DeGross for her
tireless efforts in typesetting this rather long
paper.
References
Agarwal, R., and Prasad, J. “A Conceptual and Operational Definition of Personal Innovativeness in the Domain of Information Technology,” Information Systems Research (9:2), 1998, pp. 204-215.
Agarwal, R., and Prasad, J. “The Role of Innovation Characteristics and Perceived Voluntariness in the Acceptance of Information Technologies,” Decision Sciences (28:3), 1997, pp. 557-582.
Aiken, L. S., and West, S. G. Multiple Regression: Testing and Interpreting Interactions, Sage, London, 1991.
Ajzen, I. “The Theory of Planned Behavior,” Organizational Behavior and Human Decision Processes (50:2), 1991, pp. 179-211.
Ashmore, R. D. “Sex, Gender, and the Individual,” in Handbook of Personality, L. A. Pervin (ed.), Guilford Press, New York, 1980, pp. 486-526.
Bagozzi, R. P., and Edwards, J. R. “A General Approach to Construct Validation in Organizational Research: Application to the Measurement of Work Values,” Organizational Research Methods (1:1), 1998, pp. 45-87.
Bandura, A. Social Foundations of Thought and Action: A Social Cognitive Theory, Prentice Hall, Englewood Cliffs, NJ, 1986.
Baril, G. L., Elbert, N., Mahar-Potter, S., and Reavy, G. C. “Are Androgynous Managers Really More Effective?,” Group and Organization Studies (14:2), 1989, pp. 234-249.
Barnett, R. C., and Marshall, N. L. “The Relationship between Women’s Work and Family Roles and Their Subjective Well-Being and Psychological Distress,” in Women, Work and Health: Stress and Opportunities, M. Frankenhaeuser, V. Lundberg, and M. A. Chesney (eds.), Plenum, New York, 1981, pp. 111-136.
Bem, S. L. “The BSRI and Gender Schema Theory: A Reply to Spence and Helmreich,” Psychological Review (88:4), 1981, pp. 369-371.
Bem, D. J., and Allen, A. “On Predicting Some of the People Some of the Time: The Search for Cross-Situational Consistencies in Behavior,” Psychological Review (81:6), 1974, pp. 506-520.
Bergeron, F., Rivard, S., and De Serre, L. “Investigating the Support Role of the Information Center,” MIS Quarterly (14:3), 1990, pp. 247-259.
Bozionelos, N. “Psychology of Computer Use: Prevalence of Computer Anxiety in British Managers and Professionals,” Psychological Reports (78:3), 1996, pp. 995-1002.
Chin, W. W. “The Partial Least Squares Approach for Structural Equation Modeling,” in Modern Methods for Business Research, George A. Marcoulides (ed.), Lawrence Erlbaum Associates, New York, 1998, pp. 295-336.
Chin, W. W., Marcolin, B. L., and Newsted, P. R. “A Partial Least Squares Latent Variable Modeling Approach for Measuring Interaction Effects: Results from a Monte Carlo Simulation Study and Voice Mail Emotion/Adoption Study,” in Proceedings of the International Conference on Information Systems, J. I. DeGross, A. Srinivasan, and S. Jarvenpaa (eds.), Cleveland, OH, 1996, pp. 21-41.
Chow, G. C. “Tests of Equality Between Sets of Coefficients in Two Linear Regressions,” Econometrica (28:3), 1960, pp. 591-605.
Colquitt, J. A., LePine, J. A., and Noe, R. A. “Toward an Integrative Theory of Training Motivation: A Meta-Analytic Path Analysis of 20 Years of Training Research,” Journal of Applied Psychology (85:5), 2000, pp. 678-707.
Compeau, D. R., and Higgins, C. A. “Application of Social Cognitive Theory to Training for Computer Skills,” Information Systems Research (6:2), 1995a, pp. 118-143.
Compeau, D. R., and Higgins, C. A. “Computer Self-Efficacy: Development of a Measure and Initial Test,” MIS Quarterly (19:2), 1995b, pp. 189-211.
Compeau, D. R., Higgins, C. A., and Huff, S. “Social Cognitive Theory and Individual Reactions to Computing Technology: A Longitudinal Study,” MIS Quarterly (23:2), 1999, pp. 145-158.
Davis, F. D. “Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology,” MIS Quarterly (13:3), 1989, pp. 319-339.
Davis, F. D., Bagozzi, R. P., and Warshaw, P. R. “Extrinsic and Intrinsic Motivation to Use Computers in the Workplace,” Journal of Applied Social Psychology (22:14), 1992, pp. 1111-1132.
Davis, F. D., Bagozzi, R. P., and Warshaw, P. R. “User Acceptance of Computer Technology: A Comparison of Two Theoretical Models,” Management Science (35:8), 1989, pp. 982-1002.
Efron, B., and Gong, G. “A Leisurely Look at the Bootstrap, the Jackknife, and Cross-Validation,” The American Statistician (37:1), 1983, pp. 36-48.
Eichinger, J., Heifetz, L. J., and Ingraham, C. “Situational Shifts in Sex Role Orientation: Correlates of Work Satisfaction and Burnout Among Women in Special Education,” Sex Roles (25:7/8), 1991, pp. 425-430.
Feldman, S. S., and Aschenbrenner, B. “Impact of Parenthood on Various Aspects of Masculinity and Femininity: A Short-Term Longitudinal Study,” Developmental Psychology (19:2), 1983, pp. 278-289.
Fiske, S. T., and Taylor, S. E. Social Cognition, McGraw-Hill, New York, 1991.
Fishbein, M., and Ajzen, I. Belief, Attitude, Intention and Behavior: An Introduction to Theory and Research, Addison-Wesley, Reading, MA, 1975.
Fornell, C., and Larcker, D. F. “Evaluating Structural Equation Models with Unobservable Variables and Measurement Error: Algebra and Statistics,” Journal of Marketing Research (18:3), 1981, pp. 382-388.
French, J. R. P., and Raven, B. “The Bases of Social Power,” in Studies in Social Power, D. Cartwright (ed.), Institute for Social Research, Ann Arbor, MI, 1959, pp. 150-167.
Goodhue, D. L. “Understanding User Evaluations of Information Systems,” Management Science (41:12), 1995, pp. 1827-1844.
Goodhue, D. L., and Thompson, R. L. “Task-Technology Fit and Individual Performance,” MIS Quarterly (19:2), 1995, pp. 213-236.
Hall, D., and Mansfield, R. “Relationships of Age and Seniority with Career Variables of Engineers and Scientists,” Journal of Applied Psychology (60:2), 1975, pp. 201-210.
Harrison, D. A., Mykytyn, P. P., and Riemenschneider, C. K. “Executive Decisions About Adoption of Information Technology in Small Business: Theory and Empirical Tests,” Information Systems Research (8:2), 1997, pp. 171-195.
Hartwick, J., and Barki, H. “Explaining the Role of User Participation in Information System Use,” Management Science (40:4), 1994, pp. 440-465.
Helson, R., and Moane, G. “Personality Change in Women from College to Midlife,” Journal of Personality and Social Psychology (53:1), 1987, pp. 176-186.

Hu, P. J., Chau, P. Y. K., Sheng, O. R. L., and
Tam, K. Y. “Examining the Technology Acceptance Model Using Physician Acceptance of
Telemedicine Technology,”
Journal of Management Information Systems (16:2), 1999, pp. 91-
112.
Jagacinski, C. M. “Androgyny in a Male-Dominated Field: The Relationship of Sex-Typed
Traits to Performance and Satisfaction in Engineering,
Sex Roles (17:X), 1987, pp. 529-547.
Karahanna, E., and Straub, D. W. “The Psychological Origins of Perceived Usefulness and Ease of Use,” Information and Management (35:4), 1999, pp. 237-250.
Karahanna, E., Straub, D. W., and Chervany, N. L. “Information Technology Adoption Across Time: A Cross-Sectional Comparison of Pre-Adoption and Post-Adoption Beliefs,” MIS Quarterly (23:2), 1999, pp. 183-213.
Keys, D. E. “Gender, Sex Role, and Career Decision Making of Certified Management Accountants,” Sex Roles (13:1/2), 1985, pp. 33-46.
Kirchmeyer, C. “Change and Stability in Managers’ Gender Roles,” Journal of Applied Psychology (87:5), 2002, pp. 929-939.
Kirchmeyer, C. “Gender Roles in a Traditionally Female Occupation: A Study of Emergency, Operating, Intensive Care, and Psychiatric Nurses,” Journal of Vocational Behavior (50:1), 1997, pp. 78-95.
Leonard-Barton, D., and Deschamps, I. “Managerial Influence in the Implementation of New Technology,” Management Science (34:10), 1988, pp. 1252-1265.
Levy, J. A. “Intersections of Gender and Aging,” The Sociological Quarterly (29:4), 1988, pp. 479-486.
Lubinski, D., Tellegen, A., and Butcher, J. N. “Masculinity, Femininity, and Androgyny Viewed and Assessed as Distinct Concepts,” Journal of Personality and Social Psychology (44:2), 1983, pp. 428-439.
Lynott, P. P., and McCandless, N. J. “The Impact of Age vs. Life Experiences on the Gender Role Attitudes of Women in Different Cohorts,” Journal of Women and Aging (12:2), 2000, pp. 5-21.
Mathieson, K. “Predicting User Intentions: Comparing the Technology Acceptance Model with the Theory of Planned Behavior,” Information Systems Research (2:3), 1991, pp. 173-191.
Miller, J. B. Toward a New Psychology of Women, Beacon Press, Boston, 1976.
Minton, H. L., and Schneider, F. W. Differential Psychology, Waveland Press, Prospect Heights, IL, 1980.
Moore, G. C., and Benbasat, I. “Development of an Instrument to Measure the Perceptions of Adopting an Information Technology Innovation,” Information Systems Research (2:3), 1991, pp. 192-222.
Moore, G. C., and Benbasat, I. “Integrating Diffusion of Innovations and Theory of Reasoned Action Models to Predict Utilization of Information Technology by End-Users,” in Diffusion and Adoption of Information Technology, K. Kautz and J. Pries-Heje (eds.), Chapman and Hall, London, 1996, pp. 132-146.
Morris, M. G., and Venkatesh, V. “Age Differences in Technology Adoption Decisions: Implications for a Changing Workforce,” Personnel Psychology (53:2), 2000, pp. 375-403.
Motowidlo, S. J. “Sex Role Orientation and Behavior in a Work Setting,” Journal of Personality and Social Psychology (42:5), 1982, pp. 935-945.
Nunnally, J. C., and Bernstein, I. H. Psychometric Theory (3rd ed.), McGraw-Hill, New York, 1994.
Olfman, L., and Mandviwalla, M. “Conceptual Versus Procedural Software Training for Graphical User Interfaces: A Longitudinal Field Experiment,” MIS Quarterly (18:4), 1994, pp. 405-426.
Plouffe, C. R., Hulland, J. S., and Vandenbosch, M. “Research Report: Richness Versus Parsimony in Modeling Technology Adoption Decisions—Understanding Merchant Adoption of a Smart Card-Based Payment System,” Information Systems Research (12:2), 2001, pp. 208-222.
Plude, D., and Hoyer, W. “Attention and Performance: Identifying and Localizing Age Deficits,” in Aging and Human Performance, N. Charness (ed.), John Wiley & Sons, New York, 1985, pp. 47-99.
Porter, L. “Job Attitudes in Management: Perceived Importance of Needs as a Function of Job Level,” Journal of Applied Psychology (47:2), 1963, pp. 141-148.
Rhodes, S. R. “Age-Related Differences in Work Attitudes and Behavior: A Review and Conceptual Analysis,” Psychological Bulletin (93:2), 1983, pp. 328-367.
Roberts, R. W. “Plaster or Plasticity: Are Adult Work Experiences Associated with Personality Change in Women?,” Journal of Personality (65:2), 1997, pp. 205-232.
Rogers, E. Diffusion of Innovations, Free Press, New York, 1995.
Rogers, E. M., and Shoemaker, F. F. Communication of Innovations: A Cross-Cultural Approach, Free Press, New York, 1971.
Sachs, R., Chrisler, J. C., and Devlin, A. S. “Biographic and Personal Characteristics of Women in Management,” Journal of Vocational Behavior (41:1), 1992, pp. 89-100.
Sheppard, B. H., Hartwick, J., and Warshaw, P. R. “The Theory of Reasoned Action: A Meta-Analysis of Past Research with Recommendations for Modifications and Future Research,” Journal of Consumer Research (15:3), 1988, pp. 325-343.
Straub, D., Limayem, M., and Karahanna, E. “Measuring System Usage: Implications for IS Theory Testing,” Management Science (41:8), 1995, pp. 1328-1342.
Szajna, B. “Empirical Evaluation of the Revised Technology Acceptance Model,” Management Science (42:1), 1996, pp. 85-92.
Taylor, S., and Todd, P. A. “Assessing IT Usage: The Role of Prior Experience,” MIS Quarterly (19:4), 1995a, pp. 561-570.
Taylor, S., and Todd, P. A. “Understanding Information Technology Usage: A Test of Competing Models,” Information Systems Research (6:4), 1995b, pp. 144-176.
Thompson, R. L., Higgins, C. A., and Howell, J. M. “Influence of Experience on Personal Computer Utilization: Testing a Conceptual Model,” Journal of Management Information Systems (11:1), 1994, pp. 167-187.
Thompson, R. L., Higgins, C. A., and Howell, J. M. “Personal Computing: Toward a Conceptual Model of Utilization,” MIS Quarterly (15:1), 1991, pp. 124-143.
Tornatzky, L. G., and Klein, K. J. “Innovation Characteristics and Innovation Adoption-Implementation: A Meta-Analysis of Findings,” IEEE Transactions on Engineering Management (29:1), 1982, pp. 28-45.
Triandis, H. C. Interpersonal Behavior, Brooks/Cole, Monterey, CA, 1977.
Twenge, J. M. “Changes in Masculine and Feminine Traits Over Time: A Meta-Analysis,” Sex Roles (35:5/6), 1997, pp. 305-325.
Vallerand, R. J. “Toward a Hierarchical Model of Intrinsic and Extrinsic Motivation,” in Advances in Experimental Social Psychology (29), M. Zanna (ed.), Academic Press, New York, 1997, pp. 271-360.
Venkatesh, V. “Creating Favorable User Perceptions: Exploring the Role of Intrinsic Motivation,” MIS Quarterly (23:2), 1999, pp. 239-260.
Venkatesh, V. “Determinants of Perceived Ease of Use: Integrating Perceived Behavioral Control, Computer Anxiety and Enjoyment into the Technology Acceptance Model,” Information Systems Research (11:4), 2000, pp. 342-365.
Venkatesh, V., and Davis, F. D. “A Theoretical Extension of the Technology Acceptance Model: Four Longitudinal Field Studies,” Management Science (45:2), 2000, pp. 186-204.
Venkatesh, V., and Morris, M. G. “Why Don’t Men Ever Stop to Ask For Directions? Gender, Social Influence, and Their Role in Technology Acceptance and Usage Behavior,” MIS Quarterly (24:1), 2000, pp. 115-139.
Venkatesh, V., Morris, M. G., and Ackerman, P. L. “A Longitudinal Field Investigation of Gender Differences in Individual Technology Adoption Decision Making Processes,” Organizational Behavior and Human Decision Processes (83:1), 2000, pp. 33-60.
Venkatesh, V., and Speier, C. “Computer Technology Training in the Workplace: A Longitudinal Investigation of the Effect of Mood,” Organizational Behavior and Human Decision Processes (79:1), 1999, pp. 1-28.
Warshaw, P. R. “A New Model for Predicting Behavioral Intentions: An Alternative to Fishbein,” Journal of Marketing Research (17:2), 1980, pp. 153-172.
Warshaw, P. R., and Davis, F. D. “Disentangling Behavioral Intention and Behavioral Expectation,” Journal of Experimental Social Psychology (21:3), 1985, pp. 213-228.
West, S. G., and Hepworth, J. T. “Statistical Issues in the Study of Temporal Data: Daily Experiences,” Journal of Personality (59:2), 1991, pp. 609-662.
Westland, J. C., and Clark, T. H. K. Global Electronic Commerce: Theory and Case Studies, MIT Press, Cambridge, MA, 2000.
Wong, P. T. P., Kettlewell, G., and Sproule, C. F. “On the Importance of Being Masculine: Sex Role, Attribution, and Women’s Career Achievement,” Sex Roles (12:7/8), 1985, pp. 757-769.
About the Authors
Viswanath Venkatesh is Tyser Fellow, associate professor, and director of MBA Consulting at the Robert H. Smith School of Business at the University of Maryland, and has been on the faculty there since 1997. He earned his Ph.D. at the University of Minnesota’s Carlson School of Management. His research interests are in the implementation and use of technologies in organizations and homes. His research has been published in many leading journals, including MIS Quarterly, Information Systems Research, Management Science, Communications of the ACM, Decision Sciences, Organizational Behavior and Human Decision Processes, and Personnel Psychology. He is currently an associate editor at MIS Quarterly and Information Systems Research. He received MIS Quarterly’s “Reviewer of the Year” award in 1999.
Michael G. Morris is an assistant professor of Commerce within the Information Technology area at the McIntire School of Commerce, University of Virginia. He received his Ph.D. in Management Information Systems from Indiana University in 1996. His research interests can broadly be classified as socio-cognitive aspects of human response to information technology, including user acceptance of information technology, usability engineering, and decision making. His research has been published in MIS Quarterly, Organizational Behavior and Human Decision Processes, and Personnel Psychology, among others.
Gordon B. Davis, Honeywell Professor of Management Information Systems in the Carlson School of Management at the University of Minnesota, is one of the founders of the academic discipline of information systems. He has lectured in 25 countries and has written 20 books and over 200 articles, monographs, and book chapters. He participated in and helped form the major academic associations related to the field of management information systems. He has been honored as an ACM Fellow and an AIS Fellow, and is a recipient of the AIS LEO award for lifetime achievement in the field of information systems. His research interests include conceptual foundations of information systems, information system design and implementation, and management of the IS function. He has a Ph.D. from Stanford University and honorary doctorates from the University of Lyon, France, the University of Zurich, Switzerland, and the Stockholm School of Economics, Sweden.
Fred D. Davis is David D. Glass Endowed Chair Professor in Information Systems and Chair of the Information Systems Department at the Sam M. Walton College of Business at the University of Arkansas. Dr. Davis earned his Ph.D. at MIT’s Sloan School of Management, and has served on the business school faculties at the University of Michigan, the University of Minnesota, and the University of Maryland. He has taught a wide range of information technology (IT) courses at the undergraduate, MBA, Ph.D., and executive levels. Dr. Davis has served as associate editor for the scholarly journals Management Science, MIS Quarterly, and Information Systems Research. He has published extensively on the subject of user acceptance of IT and IT-assisted decision making. His research has appeared in such journals as MIS Quarterly, Information Systems Research, Management Science, Journal of Experimental Social Psychology, Decision Sciences, Organizational Behavior and Human Decision Processes, Communications of the AIS, and Journal of MIS. Current research interests include IT training and skill acquisition, management of emerging IT, and the dynamics of organizational change.

Appendix A
Cautions Related to Pooling Data and Associated Statistical Tests for Preliminary Test of UTAUT (Studies 1 and 2)
The most critical concern when pooling repeated measures from the same subjects is the possibility of
correlated errors. West and Hepworth describe this problem as follows:
A perusal of the empirical literature over the past 30 years reveals some of the statistical
approaches to repeated measures over time that have been used by personality
researchers. The first and perhaps most classic approach is to aggregate the
observations collected over time on each subject to increase the reliability of
measurement (1991, p. 610).
However, these observations will often violate the assumption of independence, requiring alternate analyses (see West and Hepworth for an extended discussion of such approaches).
When error terms can be shown to be independent, West and Hepworth note that “traditional statistical analyses such as ANOVA or MR [multiple regression] can be performed directly without any adjustment of the data” (pp. 612-613). The best way to determine whether it is appropriate to pool data is to conduct
a test for correlated errors. In order for it to be acceptable to pool the data, the error terms should be
uncorrelated. A second approach uses bootstrapping to select subsamples to conduct a between-subjects
examination of the within-subjects data. In order for it to be acceptable to pool the data, this second test
should yield results identical to those of the test of the model on the complete data set. Below we report the specific findings from the correlated errors test and the pattern of findings from the second test.
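The two checks described above can be sketched in code. The following is a minimal illustration of the correlated-errors test, not the authors' actual analysis: it regresses a simulated intention measure on simulated predictors at each time period, extracts the residual (error) terms, and tests whether the cross-wave residual correlations differ from zero. The variable names, the three-predictor regression, and the simulated data are all assumptions made for illustration.

```python
# Hedged sketch of a correlated-errors check before pooling repeated measures.
import numpy as np

rng = np.random.default_rng(0)
n = 100  # respondents measured at T1, T2, and T3

def residuals(X, y):
    """OLS residuals of y regressed on X (intercept added)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return y - X1 @ beta

# Simulated stand-ins for predictor scores and intention at each wave
err = {}
for t in ("T1", "T2", "T3"):
    X = rng.normal(size=(n, 3))  # e.g., three belief constructs (assumed)
    y = X @ np.array([0.5, 0.3, 0.2]) + rng.normal(scale=0.5, size=n)
    err[t] = residuals(X, y)

def corr_test(a, b):
    """Pearson r between two residual vectors and a t statistic for r = 0."""
    r = np.corrcoef(a, b)[0, 1]
    t = r * np.sqrt((len(a) - 2) / (1 - r**2))
    return r, t

for pair in (("T1", "T2"), ("T1", "T3"), ("T2", "T3")):
    r, t = corr_test(err[pair[0]], err[pair[1]])
    # Pooling is defensible only if |t| stays below the critical value
    print(pair, round(r, 3), round(t, 2))
```

With real data, a nonsignificant t statistic for every pair of waves, as reported in Tables A1 and B1, would support pooling the observations.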
Correlated Errors Test
We computed the error terms associated with the prediction of intention at T1, T2, and T3 in studies 1 (voluntary settings) and 2 (mandatory settings). We also calculated the error terms when pooled across both settings, i.e., for cross-sectional tests of the unified model at T1, T2, and T3, respectively. These computations were conducted both for the preliminary test and for the cross-validation (reported below).
The error term correlations across the intention predictions at the various points in time are shown in Table A1 below. Note that all error correlations are nonsignificant and, therefore, not significantly different from zero in all situations.

Table A1. Correlations Between Error Terms of the Intention Construct at Various Time Periods

Study 1 (Voluntary)        T1     T2     T3
T2                         .04
T3                         .11    .09

Study 2 (Mandatory)        T1     T2     T3
T2                         .07
T3                         .08    .13

Studies 1 and 2 (Pooled)   T1     T2     T3
T2                         .06
T3                         .09    .10

Between-Subjects Test of Within-Subjects Data

While the results above are compelling, as a second check we pooled the data across the different levels of experience (T1, T2, and T3) and used PLS to conduct a between-subjects test using the within-subjects data. We used a DOS version of PLS for these analyses because, unlike the PLS-Graph version used for the primary analyses in the paper, it allowed the specification of filtering rules for selecting observations in bootstraps. Specifically, for the between-subjects test, we applied a filtering rule that selected any given respondent's observation at only one of the three time periods. This approach ensured that the data used to examine the interaction terms with experience carried no potential for systematic correlated errors. Using 50 such random subsamples, we tested the model; the results supported the pattern observed when the entire data set was pooled across experience.
Taken together, the analyses reported above support the pooling of data (see Table 17) across levels of
experience and eliminate the potential statistical concerns noted by West and Hepworth in the analysis of
temporal data.
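The filtering rule underlying the between-subjects check can likewise be sketched. The snippet below is a hypothetical illustration of the subsampling idea only, assuming a flat record layout with an `id` and `time` field; it is not the DOS PLS filtering mechanism itself.

```python
# Sketch: build between-subjects samples by keeping each respondent's data
# at exactly one randomly chosen wave (T1, T2, or T3), repeated 50 times.
import random

def one_wave_per_respondent(records, rng):
    """records: list of dicts with 'id' and 'time'; keep one random wave per id."""
    by_id = {}
    for rec in records:
        by_id.setdefault(rec["id"], []).append(rec)
    return [rng.choice(waves) for waves in by_id.values()]

rng = random.Random(42)
records = [{"id": i, "time": t} for i in range(100) for t in ("T1", "T2", "T3")]

subsamples = [one_wave_per_respondent(records, rng) for _ in range(50)]
# Each subsample has one row per respondent, so error terms cannot be
# correlated within subjects; the model would then be re-estimated on each.
assert all(len(s) == 100 for s in subsamples)
```

Because every subsample contains at most one observation per respondent, agreement between these between-subjects estimates and the pooled within-subjects estimates is evidence that correlated errors are not driving the pooled results.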

Appendix B
Statistical Tests Prior to Pooling of Data for Cross-Validation of UTAUT (Studies 3 and 4)
As with the test of the preliminary model, prior to pooling the data for the cross-validation studies (studies
3 and 4), we conducted statistical tests to examine the independence of observations (as detailed in
Appendix A). The table below presents the error term correlation matrices for intention for studies 3
(voluntary) and 4 (mandatory) as well as pooled across both settings at T1, T2, and T3 respectively.
Table B1. Correlations Between Error Terms of the Intention Construct at Various Time Periods

Study 3 (Voluntary)        T1     T2     T3
T2                         .01
T3                         .07    .11

Study 4 (Mandatory)        T1     T2     T3
T2                         .04
T3                         .02    .08

Studies 3 and 4 (Pooled)   T1     T2     T3
T2                         .03
T3                         .05    .10

As in the preliminary test of UTAUT, these error correlation matrices suggest that there is no constraint to pooling the data in the cross-validation of the model. As before, a between-subjects test of the within-subjects data was conducted using PLS (as described in Appendix A), and the results of that test corroborated the independence of observations in this sample. In light of both sets of results, we proceeded with the pooled analysis reported in the body of the paper (see Table 21).

 
