Original Research

Dynamic MouselabWEB: Individualized
information display matrixes
Zeliha Yıldırım1 and Semra Erpolat Taşabat2
1 Middle East Technical University, Ankara, Turkey; 2 Mimar Sinan University, Istanbul, Turkey

Corresponding author: Zeliha Yıldırım, Middle East Technical University, Üniversiteler Mahallesi Dumlupınar Bulvarı No:1, Ankara, Turkey; e-mail: zlhyldrm@gmail.com

This paper introduces Dynamic MouselabWEB, a com-
puterized process tracing tool that was designed to cre-
ate flexible decision-making settings that are similar to
real life. While Dynamic MouselabWEB is an extension
of MouselabWEB, it differs in that it creates individual-
ized information display matrixes (IDMs) rather than pre-
senting predetermined IDMs, so participants decide on
the attributes and alternatives before the decision task.
This structure can improve the involvement of decision-
makers in the decision process and it gives researchers a
chance to observe decision-making behaviors and to ex-
plore new decision-making strategies when the decision
task has only the decision-makers’ important attributes
and appealing alternatives. In order to measure the effect
of this dynamic structure, two groups of students worked
on job selection tasks, one in the Dynamic MouselabWEB
program (n = 32) and one in the traditional Mouselab-
WEB program (n = 20). Results indicate a significant dif-
ference between the decision-making behaviors of these
two groups. The students who chose a job on Dynamic
MouselabWEB acquired more information and spent more time on the task than the other group; in other words, they were more involved in the decision process.

Keywords: decision-making, process tracing, information display
matrix, MouselabWEB

This article consists of three main sections: the first section provides a literature review to familiarize readers with the background information, the second covers the technical aspects of the software, and the last section explains the experiment.

Literature review

Decision models

There are two main approaches to studying decision-
making behavior: 1) the outcome-based approach and
2) the process tracing approach (Harte & Koele, 2001).
The outcome-based approach attempts to construct
mathematical models for the relationship between in-
put (information) and output (decisions) to reveal
the cognitive patterns underlying decision-making pro-
cesses. This approach has been applied for over
two centuries (Abelson & Levi, 1985; Brehmer, 1994;
Dawes, 1979; Einhorn, Kleinmuntz, & Kleinmuntz,
1979; Ford, Schmitt, Schechtman, Hults, & Doherty,

1989; Westenberg & Koele, 1994), but it has a large
number of limitations. For example, the approach im-
plies fitting mathematical models to data on decisions
(output) in various situations (input) in order to infer
the underlying decision-making processes without tak-
ing process data into account. According to Svenson
(1979), these models provide surface descriptions of
the processes rather than detailed information concerning stages in actual decision processes. As a result,
researchers have suggested that cognitive processes
cannot be sufficiently understood by solely studying
input-output relationships and other research methods
need to be included (Ford et al., 1989; Payne, Braunstein, & Carroll, 1978; Svenson, 1979). Researchers
can overcome these concerns by employing the process-
tracing approach, which focuses on patterns of infor-
mation acquisition rather than the output (i.e., the de-
cisions). In most process-tracing studies, information
is presented in an information display matrix that has at least two alternatives, each characterized
by at least two attributes (Ford et al., 1989). By ob-
serving the information acquisition process, such as
the amount of time spent making decisions, sequences
of information acquisition and the amount of infor-
mation accessed, researchers have gained insights into
cognitive processes and developed more accurate pre-
dictive models (Einhorn & Hogarth, 1981).

Process tracing methods

Several methods have been used to monitor informa-
tion acquisition processes, such as verbal protocols
(Jarvenpaa, 1989), eye movement recording (Russo
& Dosher, 1983), information display boards (Payne,
1976), and the Mouselab system (Payne, Bettman,
& Johnson, 1988). In earlier process-tracing studies,
such as those carried out toward the end of the 1970s,
participants were asked to make a decision by look-
ing at information on cards located in envelopes on an
information display board (Payne, 1976). In order to
access information about a particular attribute for a
particular alternative, the participant had to open the
associated envelope and read the card, which would
then be placed back in the envelope. In the meantime, the researcher would write down the sequence of ac-
quisitions and the number of times that the envelopes
were opened. Computerized process tracing, such as
the Mouselab system, employs computer graphics to
display information in an information display
matrix (Payne et al., 1988). With this system, the
values of attributes are concealed in boxes on a screen
instead of in envelopes. Each box is opened when the
mouse cursor is moved over it (mouseover), and it re-
mains open until the cursor is moved away (mouseout).
Only one box can be opened at a time (Payne et al.,
1988).
In addition to offering all the features of Mouselab, MouselabWEB makes it possible to carry out research on the Internet (Willemsen & Johnson, 2004, 2011). MouselabWEB (www.mouselabweb.org),
which uses Web technology (Javascript, HTML, PHP,
and MySQL), has open-source code, released under the GNU General Public License v3.0, so researchers can
add new features. Moreover, participants do not need
to download plugins or other software; all they need is
access to the Internet and a mouse to use a Mouselab-
WEB page. Along with MouselabWEB there are other
computerized process tracing tools, such as ISCube
(Tabatabai, 1998), MouseTrace (Jasper & Shapiro,
2002), phased narrowing (Jasper & Levin, 2001; Levin
& Jasper, 1995), active information search (Huber,
Wider, & Huber, 1997; Williamson, Ranyard, & Cuth-
bert, 2000), P1198 (Andersson, 2001) and Computer-
Shop (Huneke, Cole, & Levin, 2004).

Decision strategies

According to Payne, Bettman, and Johnson (1992),
a decision strategy is a “sequence of mental and ef-
fector (actions on the environment) operations that
transform some initial state of knowledge into a fi-
nal knowledge state so that the decision maker per-
ceives that the particular decision problem is solved”
(p. 109). Knowing about decision makers’ cognitive
processes enables us to infer the decision strategies
used and it also enables us to predict future decisional
behavior and decision outcomes (Payne, Braunstein,
& Carroll, 1978). Furthermore, if we are better in-
formed about people’s decision strategies we can de-
sign better decision support systems (Browne, Pitts, &
Wetherbe, 2007; Montgomery, Hosanagar, Krishnan,
& Clay, 2004; Bettman et al., 1993). Although it is sel-
dom possible to precisely identify a particular decision
strategy, researchers can identify types of strategies
by using process measures (Ford et al., 1989). In gen-
eral, decision strategies can be classified as being either
compensatory or non-compensatory. With compen-
satory strategies, decision makers make tradeoffs be-
tween different values of multiple attributes and they
engage in extensive information processing (Stevenson
et al., 1990). When a large amount of information is
gathered in the process, this usually indicates that par-
ticipants are employing compensatory decision-making
strategies (Ford et al., 1989). The weighted adding
strategy, for example, is a compensatory strategy that

requires decision makers to multiply each attribute’s
subjective value by its importance weight to obtain
an overall value for each alternative (Abelson & Levi,
1985). Non-compensatory strategies are selective in
the sense that decision makers restrict their attention
to only part of the available information (Beach &
Mitchell, 1978) and eliminate alternatives that do not
meet their requirements. One example of this is the
lexicographic strategy (LEX), which selects the option
with the best value for the most important attribute.
According to the cost/benefit framework, the choice of
a decision strategy is a function of both costs, i.e., the effort required to use a rule, and benefits, i.e., the abil-
ity of a strategy to select the best alternative (Beach &
Mitchell, 1978; Russo & Dosher, 1983). In this framework, the decision maker has a large repertoire
of decision strategies, and he/she chooses one of these
strategies contingent on the characteristics of the decision task, such as its complexity and time pressure, to adapt to decision settings (Bettman et al.,
1993). In addition to decision task-based variables, an-
other important variable that affects the use of various
decision-making strategies is the level of involvement
of the decision maker. Involvement is considered to be
an important mediating variable of consumer behavior
(Mitchell, 1979). For example, an extensive informa-
tion search might indicate a higher level of involvement
(Mittal, 1989).
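
To make the contrast concrete, the following sketch applies both strategy families to a small, invented attribute matrix (the jobs, importance weights, and 7-point values below are hypothetical and are not taken from the study materials); it is written in PHP, the server-side language of the MouselabWEB code base.

```php
<?php
// Hypothetical 7-point attribute values (rows: alternatives, columns: attributes)
// and importance weights, invented purely for illustration.
$values = [
    'Job A' => ['salary' => 6, 'security' => 3, 'location' => 5],
    'Job B' => ['salary' => 4, 'security' => 7, 'location' => 6],
];
$weights = ['salary' => 0.5, 'security' => 0.3, 'location' => 0.2];

// Compensatory: weighted adding. Every attribute value is multiplied by its
// importance weight and summed, so all available information is processed.
function weightedAdding(array $values, array $weights): string
{
    $best = null;
    $bestScore = -INF;
    foreach ($values as $option => $attrs) {
        $score = 0.0;
        foreach ($attrs as $attr => $v) {
            $score += $weights[$attr] * $v;
        }
        if ($score > $bestScore) {
            $bestScore = $score;
            $best = $option;
        }
    }
    return $best;
}

// Non-compensatory: lexicographic (LEX). Only the most important attribute
// is inspected; a weak value there cannot be compensated elsewhere.
function lexicographic(array $values, array $weights): string
{
    arsort($weights);                      // most important attribute first
    $topAttr = array_key_first($weights);
    $best = null;
    $bestValue = -INF;
    foreach ($values as $option => $attrs) {
        if ($attrs[$topAttr] > $bestValue) {
            $bestValue = $attrs[$topAttr];
            $best = $option;
        }
    }
    return $best;
}

echo weightedAdding($values, $weights);  // "Job B" (4.9 vs. 5.3 overall)
echo lexicographic($values, $weights);   // "Job A" (best salary alone)
```

On these invented numbers the two families pick different jobs, which is exactly why search patterns (how much information is opened, and in which order) help to discriminate between them.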

Process measures

When participants process information in a decision
task, computerized process tracing tools automatically record time-dependent mouse events, e.g., mouseover (opening of boxes), mouseout (closing of boxes), the order in which boxes are opened, the amount of time boxes remain open, selected options, and the total elapsed time since the display first appears on the screen.
By using this event-based data, several measures can
be computed to characterize decision-making behav-
ior, which are generally divided into depth, content,
and sequence of searches (Ford et al., 1989; Jacoby,
Jaccard, Kuss, Troutman, & Mazursky, 1987; Jasper
& Shapiro, 2002, p. 370). Depth measures focus on de-
termining how extensively the participants attempted
to acquire information. This includes decision-making
time (Hogarth, 1975; Pollay, 1970), proportions of in-
formation sought (Payne, 1976), reacquisition rates
(Jacoby, Chestnut, Weigl, & Fisher, 1976), number
of acquisitions (Bettman et al., 1993), and the av-
erage amount of time spent per item of information
acquired (Bettman et al., 1993). Content measures
are used to quantify the relative weights assigned to
the various types of information and they refer to ex-
actly what information was acquired and which op-
tions were chosen. Content measures include the total
amount of time spent on the information in the boxes
(Bettman et al., 1993), time spent on the most important attribute(s), and the proportion of time spent
on the most important attribute. It is common to
ask subjects to rate the importance they assign to attributes by using a Likert rating scale question after
the decision task. When this is done, the attribute
with the highest rating is considered to be the most
important one. Sequence measures are used to de-
scribe whether information selection behavior was pri-
marily attribute- or alternative-based. An alternative-
based search indicates that decision makers first con-
sider several attributes of the same alternative before
proceeding to the next one. In contrast, when they
compare alternatives across attributes first, this is re-
ferred to as an attribute-based approach. A common
metric for defining search patterns is the Search Index
(SI) (Payne, 1976), which measures the relative use of
alternative-based versus attribute-based searches. The
index ranges from -1 to +1, indicating a completely
attribute-based or a completely alternative-based in-
formation search, respectively. If both types of search
patterns are used equally, the index is 0. A positive
index value is often assumed to indicate the usage of
compensatory strategies (e.g., weighted adding strate-
gies), whereas a negative index value is interpreted as
being an indicator for more non-compensatory strate-
gies (e.g., elimination-by-aspects strategies). Böcken-
holt and Hynan (1994) proposed the Strategy Measure
(SM), another index for search patterns, as they found
in a simulation study that the SI is unreliable when
the number of alternatives and attributes is not iden-
tical. When the number of attributes is larger than
the number of alternatives, a positive SI index is more
likely and when the number of attributes is smaller
than the number of alternatives, a negative SI index
is more probable. Unlike the SI, the SM is a standardized statistic and is therefore not bounded between -1 and +1; positive values indicate a more alternative-based and negative values a more attribute-based search. Other variables for sequences of searches are
variability in the amount of information searched per
option (Payne, 1976), and variability in the amount of
information searched per attribute (Klayman, 1982).
Lower values for these two measures indicate that de-
cision processing is more evenly spread across all the
attributes or alternatives, respectively.
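
For illustration, the SI can be computed directly from the acquisition log: if r_alt denotes the number of transitions that stay within one alternative and r_att the number that stay within one attribute, then SI = (r_alt - r_att) / (r_alt + r_att). The sketch below (again PHP, as above) assumes each acquisition is logged as an (alternative, attribute) pair; the data shown are invented.

```php
<?php
// Minimal sketch of Payne's (1976) Search Index computed from a log of box
// openings, each recorded as an [alternative, attribute] pair (invented data).
function searchIndex(array $acquisitions): ?float
{
    $withinAlt = 0;  // transitions that stay within one alternative
    $withinAtt = 0;  // transitions that stay within one attribute
    for ($i = 1; $i < count($acquisitions); $i++) {
        [$alt, $att] = $acquisitions[$i];
        [$prevAlt, $prevAtt] = $acquisitions[$i - 1];
        if ($alt === $prevAlt && $att !== $prevAtt) {
            $withinAlt++;
        } elseif ($att === $prevAtt && $alt !== $prevAlt) {
            $withinAtt++;
        }
        // transitions that change both, or reopen the same box, count as neither
    }
    $total = $withinAlt + $withinAtt;
    return $total > 0 ? ($withinAlt - $withinAtt) / $total : null;
}

// A strictly alternative-wise inspection of a 2x2 matrix yields +1 ...
echo searchIndex([['A', 'salary'], ['A', 'security'],
                  ['B', 'salary'], ['B', 'security']]);   // 1
// ... whereas a strictly attribute-wise inspection yields -1.
echo searchIndex([['A', 'salary'],   ['B', 'salary'],
                  ['A', 'security'], ['B', 'security']]); // -1
```

The same event log also yields the depth measures directly: the number of acquisitions is simply the length of the log, and reacquisitions are repeated pairs.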

A new tool: Dynamic MouselabWEB

MouselabWEB

MouselabWEB focuses on the acquisition of infor-
mation to make inferences about the nature of the un-
derlying cognition and test existing decision models.
To do that, it presents information on an attributes-
alternatives matrix. In most computer-based displays,
the attribute values are hidden behind boxes. Partic-
ipants navigate through a task by moving the mouse
and can reveal as much information as they need
to make a decision. Most computers record time and
sequence of acquisitions with sufficient precision (1/60th of a second), resulting in a record of box open-
ings and closings. The interpretation of information-
acquisition data is based on two testable assumptions
(Costa-Gomes, Crawford, & Broseta, 2001) about the
relationship between search and cognition: The first,
occurrence, states that if information is used by a
decision-maker, it must have been seen by opening a

box. The second, adjacency, assumes that information is acquired when it is needed rather than memorized, because of limitations in short-term memory and the low cost of (re-)acquisition.

Motivation behind Dynamic MouselabWEB

Dynamic MouselabWEB was motivated by the follow-
ing aims:

1. Overcome limitations of predetermined informational
structures

One of the limitations of IDMs is that the researcher
has to determine the options, attributes and infor-
mation available (Jacoby et al., 1987). This can
be alleviated by using an individualized structure, which Aschemann-Witzel and Hamm (2011) suggest as one of the possibilities for further development of IDMs:

Another method suggested by Jacoby et al.
(1987, p. 155) is an individualized IDM;
this takes account of the criticism of IDM
based on overly stark predefined informa-
tional structures: Before the IDM-survey is
carried out, each individual participant is
asked which of the product attributes are im-
portant to him or her, so that only these cri-
teria are then offered in the matrix. (p. 4)

2. Create stimulus close to real-life decision settings

In the age of the Internet, consumers tend to narrow
down the options and criteria via online shopping web-
sites so they can choose the best option. In light of
this, researchers should prepare IDMs with this kind
of dynamic structure. Allowing decision makers to cre-
ate their own IDMs rather than working with prede-
fined ones resembles certain real-life information envi-
ronments more closely.

3. Encourage researchers to discover new decision
strategies

As Huber (1980) states, future research should not be
limited to decision strategies which are discussed in
the literature:

The list of simple strategies discovered so far
covers only a subset of all possible simple
strategies. ... At least some of the simple
strategies are incomplete. For example, the
lexicographic-ordering strategy assumes that
the decision maker first selects the most im-
portant dimension. It is not clear what hap-
pens if there are two or more most important
dimensions. (pp. 187–188)


4. Increase the involvement of participants in decision
processes

In decision research, involvement is directly related to
motivation to get information. We believe that if par-
ticipants can decide what they want to know about the
options that they are considering, they will be more
motivated to get more information in the decision pro-
cess.

5. Make it freely available for other researchers

Dynamic MouselabWEB is free software like Mouse-
labWEB, so researchers can redistribute it and/or
modify it under the terms of the GNU General Public
License v3.0.

Technical aspects

Dynamic MouselabWEB is an extended version of
MouselabWEB and retains all of its features. In this section, we list the basic features of
MouselabWEB and discuss the technical requirements
for MouselabWEB, and then we explain the structure
of Dynamic MouselabWEB.

Basic features

An experiment based on MouselabWEB generally
has more than one HTML page, such as an intro-
duction page, pages for warm-up tasks, a decision
task page and a survey page. Each page is linked
to the following page by defining “nextURL,” which
is placed in the source code. MouselabWEB pages
can be generated using the MouselabWEB Designer
(http://www.mouselabweb.org/designer/index.html),
an online editor program. The main purpose of the
Designer is to generate the MouselabWEB box structure, which can be configured with different features. Depending on the study aim, researchers can use the default features or create new ones. Mainly,
MouselabWEB has these basic features: 1) there are
different options for box openings and box closings, 2)
boxes can be fixed, counterbalanced, or random, 3)
you can include a header row/column for alternatives
and attributes, 4) the layouts of boxes and selection
buttons can be changed by altering CSS files, 5)
the background scripting automatically saves the
data, and the process data is sent to and stored
on a database, 6) a questionnaire can be included
before/after the decision task.

Dynamic features

MouselabWEB uses MySQL, which is a relational
database management system. To make MouselabWEB dynamic, we created three tables in addition to the table that stores the process data. The functions
of these three tables are as follows:

• Talternatives stores information about the alterna-
tives,

• Tattributes stores information about the attributes,

• Tvalues stores the values of the attributes.

The Electronic Supplementary Material (ESM) 1 ex-
plains the columns of these three tables and the in-
formation that they store. In order to use the Dynamic MouselabWEB code, you have to create these tables
in a database. Our study tables can be created by
running the SQL script in ESM 2.
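
For orientation, the snippet below shows what a minimal version of these tables could look like; the column names are invented for illustration only, and the authoritative layout is the one documented in ESM 1 and created by the ESM 2 script.

```php
<?php
// Hypothetical sketch only - the authoritative schema is the SQL script in
// ESM 2; every column name below is invented for illustration.
$pdo = new PDO('mysql:host=localhost;dbname=dmw', 'dbuser', 'dbpassword');

$pdo->exec('CREATE TABLE IF NOT EXISTS Talternatives (
    alt_id   INT AUTO_INCREMENT PRIMARY KEY,
    alt_name VARCHAR(255) NOT NULL
)');

$pdo->exec('CREATE TABLE IF NOT EXISTS Tattributes (
    att_id   INT AUTO_INCREMENT PRIMARY KEY,
    att_name VARCHAR(255) NOT NULL
)');

// One row per cell of the matrix: the value revealed inside a box.
$pdo->exec('CREATE TABLE IF NOT EXISTS Tvalues (
    alt_id INT NOT NULL,
    att_id INT NOT NULL,
    cell   VARCHAR(255) NOT NULL,
    PRIMARY KEY (alt_id, att_id),
    FOREIGN KEY (alt_id) REFERENCES Talternatives (alt_id),
    FOREIGN KEY (att_id) REFERENCES Tattributes (att_id)
)');
```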

In Dynamic MouselabWEB, two additional screens
are present before the decision task: 1) “Choose the
alternatives screen”, which uses information from the
Talternatives table in the database (Figure 1 shows a
capture of this screen; see ESM 3 for the page source code).
2) “Choose the attributes screen”, which uses informa-
tion from the Tattributes table in the database (Figure
2 shows a screen capture). This page’s source code is
in Electronic Supplementary Material 4.

Although we chose to display these two screens in that order, it is possible to change the order of the screens or omit one of them and use predefined attributes/alternatives instead.

After these two selections, the names of the columns and rows and the contents of the boxes can be obtained from the database by using the tables’ associations. To do this, we added steps to the source code of the decision task screen. The source code of the decision task is provided in ESM 5. For a screen capture of a decision
task, see Figure 3.
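
The following is a hypothetical sketch of this step, written against the invented schema above (the real queries are in the ESM 5 source code): the ids collected on the two selection screens are used to pull exactly the 4 × 5 matrix of cells that the participant asked for.

```php
<?php
// Hypothetical sketch: fetch the box contents for the participant's choices.
// $chosenAlts / $chosenAtts would come from the two selection screens; the
// ids here are invented, and $pdo is the connection from the schema sketch.
$chosenAlts = [3, 7, 9, 12];        // four selected jobs
$chosenAtts = [1, 4, 5, 8, 20];     // five selected attributes

$altIn = implode(',', array_fill(0, count($chosenAlts), '?'));
$attIn = implode(',', array_fill(0, count($chosenAtts), '?'));

$stmt = $pdo->prepare(
    "SELECT a.alt_name, t.att_name, v.cell
       FROM Tvalues v
       JOIN Talternatives a ON a.alt_id = v.alt_id
       JOIN Tattributes  t ON t.att_id = v.att_id
      WHERE v.alt_id IN ($altIn) AND v.att_id IN ($attIn)"
);
$stmt->execute(array_merge($chosenAlts, $chosenAtts));

// One row per box: row/column headers plus the concealed value.
$cells = $stmt->fetchAll(PDO::FETCH_ASSOC);
```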

To create a dynamic decision task, researchers
should determine the alternatives, the attributes, and
the attributes’ values in advance. Then they need to
create the three tables in the database to store this information and make the necessary changes in the code of the sets of alternatives/attributes. Figure 4
illustrates the process of creating a study in Dynamic
MouselabWEB.


Figure 1. A screen capture of a set of alternatives screen.

Figure 2. A screen capture of a set of attributes screen.

Technical requirements

Dynamic MouselabWEB has the same tech-
nical requirements as MouselabWEB. We
recommend visiting the download page
(http://www.mouselabweb.org/download.php),
where you can find the program files and a guide
that explains how to install the software and launch
an experiment. We encourage you to create a
MouselabWEB page using MouselabWEB Designer
to get familiar with the software. You can design
an experiment in an offline environment or with an
online version of MouselabWEB. Basically, you need
a web server and PHP/MySQL capabilities on your
system as well as a database to store the study data.
You can find all the necessary files that you have to
put in the study folder in ESM 6.

Experiment

In order to illustrate the dynamic structure of Dy-
namic MouselabWEB, we conducted an experiment
that compares the results of two decision tasks us-
ing the same amount of information presented either
in Dynamic MouselabWEB or in MouselabWEB (the
control).

We formulated three hypotheses based on the pro-
cess measures:

H1: The dynamic structure of a decision task has an
effect on decision-making behavior.
H2: Participants using a dynamic structure will access
more information and spend a longer amount of time
on a decision task, which indicates a difference in terms
of involvement.
H3: The Dynamic MouselabWEB group will pay less attention to the most important attribute than the
control group.

We mainly wanted to make a general statement
about the effect of the dynamic structure as it is stated
in H1. Even though H1 is related to H2, we want to
use different analysis techniques to test these two hy-
potheses. We used more than one dependent variable
to represent decision-making behavior to test H1. On
the other hand, H3 suggests that these two groups have
different approaches for the most important attribute.
Since the dynamic decision task has more than one im-
portant dimension, we assumed that the group in this task would pay less attention to the most important attribute
than the other group.

Method

Participants

The target group was Turkish undergraduates study-
ing in their final year in statistics departments. Thirty-two undergraduates (19 female, 9 male, 4 not answered, Mage = 22.9, SD = 0.71) from various universities selected a job on Dynamic MouselabWEB and 20 undergraduates (10 female, 10 male, Mage = 23.6, SD = 1.47) selected a job on MouselabWEB.

Figure 3. A screen capture of a decision task.

Figure 4. The flowchart of a study in Dynamic MouselabWEB. The dotted lines represent the background scripts that show screens and obtain information from the database, whereas the unbroken lines denote the researchers’ and participants’ activities.

Programs

We designed two job selection tasks with four alter-
natives and five attributes. The stimuli in Mouselab-
WEB have a traditional setup in which the alterna-
tives and the attributes are selected by the researcher,
whereas in the dynamic decision task, participants can
choose the alternatives and attributes from the sets
before the task.

Design

In the Dynamic MouselabWEB study, to define the set
of alternatives, we determined sectors in which people

who graduated in Statistics might work. These sectors
are IT, banking, insurance, industry, market research,
and education. Then we used job search websites to
determine the particular occupations. In the prepara-
tion phase, we selected 21 jobs and prepared a study
link that includes the decision task, and we sent this
link to 10 undergraduates and asked for their feed-
back. Based on their responses, we omitted 11 jobs
and added 2 new jobs to the set. The set of attributes
contains 20 attributes that were collected from various
studies. To determine the attribute values, we turned
to eight specialists who had worked for many years in
their sectors. We asked them to rate 20 attributes on
a 7-point scale (from worst to best) for each job and
each sector (overall rate). After the specialists had as-
sessed all the jobs, we consolidated these assessments
into a single table with 20 attributes and 12 jobs.

Procedure

In the Dynamic MouselabWEB task, participants were
asked to choose four jobs from the set of alternatives
and five attributes from the set of attributes. The op-
tion “I do not want to choose” was added to the sets,
and if this option was selected, the predetermined al-
ternatives/attributes were shown in the main decision
task. In the MouselabWEB task, participants were
asked to choose one job in the main decision task
in which the alternatives and attributes were prede-
fined. This decision task also had four jobs and five
attributes. Both studies included introduction pages
and two warm-up tasks before the stimuli. The survey
links, which included the decision tasks, were sent to
students between March and May 2018. They were informed about the project by email and invited to the study. They did not receive any compensation in return.

Table 1. Means and Standard Deviations of the Dependent Variables and t-test Results.

Process measures                                      Dynamic MouselabWEB     MouselabWEB            t-test for Equality of Means
                                                      Group (n = 32)          Group (n = 20)
                                                      Mean      SD            Mean      SD            t        df      p
Percent of unique boxes examined                      0.84      0.18          0.70      0.26          2.33     50      .02
Number of acquisitions                                42.16     19.25         26.60     15.66         3.04     50      <.01
Number of reacquisitions                              25.41     17.14         12.70     11.66         3.18     50      <.01
Total time in decision (s)                            42        24            31        16            2.10     50      .04
Time spent on the most important attribute (s)        8.32      5.90          8.86      7.20          -0.29    46      .78
Proportional time on the most important attribute     0.20      0.10          0.32      0.24          -2.19    24      .04
Strategy Measure                                      -0.41     3.93          -0.95     3.20          0.52     50      .61
VarAlt                                                0.03      0.03          0.05      0.06          -1.59    50      .12
VarAtt                                                0.02      0.02          0.03      0.05          -1.46    22      .16

Note. VarAlt: variance in the proportion of time spent processing each alternative; VarAtt: variance in the proportion of time spent processing each attribute. The significance level is .05.

Data preparation

The process data was used to create nine variables re-
flecting decision-making behavior. To measure the total amount of processing, we used four depth measures: 1) total time spent on the decision (in seconds), 2) number of acquisitions, 3) number of reacquisitions, and 4) percentage of unique boxes examined. To assess selectivity in processing, we used two content measures: 1) time spent on the most important attribute (in seconds) and 2) proportional time spent on the most important attribute. To define the pattern of searches, we used three sequence measures: 1) the Strategy Measure, 2) variance in the proportion of time spent processing each alternative (VARalt), and 3) variance in the proportion of time spent processing each attribute (VARatt).
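
As an illustration of the two variance measures, the sketch below computes VARalt from per-alternative viewing times (the times are invented; VARatt is computed identically with attributes in place of alternatives).

```php
<?php
// Minimal sketch: variance in the proportion of total viewing time spent on
// each alternative (VARalt). The times, in seconds, are invented.
function varianceOfProportions(array $times): float
{
    $total = array_sum($times);
    $props = array_map(fn ($t) => $t / $total, $times);
    $mean  = 1.0 / count($props);            // proportions always average to 1/n
    $sq    = array_map(fn ($p) => ($p - $mean) ** 2, $props);
    return array_sum($sq) / count($sq);      // population variance
}

// Evenly spread processing gives a value near 0 ...
echo varianceOfProportions([10.0, 11.0, 9.0, 10.0]);  // ~0.0003
// ... while highly selective processing gives a clearly larger value.
echo varianceOfProportions([30.0, 4.0, 3.0, 3.0]);    // ~0.083
```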

Results

Since we expected the dependent variables to be cor-
related (0.2 < r < 0.8), we initially performed a Mul-
tivariate Analysis of Variance (MANOVA) with de-
pendent variables that met MANOVA’s assumptions.
These dependent variables are: Total time in decision-
making, number of acquisitions, time spent on the
most important attribute, and variance in proportion
of time per alternative. The result of the MANOVA
shows that the decision-making behavior of partici-
pants who selected their attributes and alternatives
before the decision task is statistically different from
the control group, F (4,43) = 3.18, p = 0.02. As indi-
cated in H1, we can conclude that the dynamic struc-
ture of the decision task has an impact on decision-
making behavior. An independent samples t-test was
used to examine the effects in more detail. Table 1

presents the descriptive statistics (means and SDs) of
the dependent variables and t-test results.

Table 1 shows that the two groups are significantly
different in terms of all the depth measures: percent of
unique boxes examined, number of acquisitions, number of reacquisitions, and total time in decision. As predicted
in H2, the dynamic group processed more information
(i.e., higher number of acquisitions, higher number of
reacquisitions, and higher percentage of unique cells
examined) and spent a longer time in the decision process, so we conclude that this group was more involved in
decision-making. Lastly, the two groups differed in
terms of “proportion of time spent on the most impor-
tant attribute” (p = 0.04). As stated in H3, it can be
concluded that when participants define the attributes
and the alternatives of the decision task, they spend proportionally less time on the most important attribute. The data file is provided in Electronic Supplementary Material 7.

Discussion

The idea underlying Dynamic MouselabWEB emerged
from the observation that there are no open-source
tools available that can create individualized IDMs.
In this paper, we have described Dynamic Mouse-
labWEB, which was designed to create individualized
IDMs that make it possible for participants to choose
the attributes and the alternatives before the decision
task. By using Dynamic MouselabWEB, researchers
can avoid the problem of predetermined informational
structures and generate more realistic decision set-
tings. Additionally, we believe that if participants
can decide what they want to know about the options
that they will consider, they will be more motivated
to get more information in the decision process.
To define Dynamic MouselabWEB’s position in deci-
sion studies, we can compare it with the method of
Active Information Search (AIS). Huber, Wider, and
Huber (1997) introduced the method of AIS, in which the subject gets a basic description of the task, and
has to ask questions to receive additional information.
These questions are recorded and answers are provided
in printed form. In an AIS experiment, the researcher
has to prepare possible questions in advance to present
their answers during the experiment. Similar to this,
in Dynamic MouselabWEB the researcher has to de-
fine all the attributes and alternatives of the task to
display them on the screens. Additionally, both meth-
ods put the participant in an active position. The
main difference between these two approaches is that
in an AIS experiment, the participants have to ask
questions about what they need to know to solve the
problem, but in Dynamic MouselabWEB the partic-
ipants see all possible attributes and alternatives at
once and choose among them to create the task. In
order to test the program, a study was conducted to
compare the decision-making behavior of two groups
that selected a job from same-size decision tasks pre-
sented in Dynamic MouselabWEB versus Mouselab-
WEB. The results of the study show that 1) the dy-
namic structure of a decision task has an effect on
decision-making behavior; 2) the dynamic group pro-
cessed more information and spent more time on de-
cision processes; and 3) the dynamic group spent proportionally less time on the most important attribute. The job choice
task is just one example of any number of decision-
making contexts, and this new program can be used for
many decision-making tasks with larger sample sizes.
We encourage researchers to use Dynamic Mouselab-
WEB to observe decision-making behavior because in-
dividualized IDMs include only those criteria that are
important for decision makers and only those options
that have the potential to be selected. Thus, Dynamic
MouselabWEB creates a decision-making environment
with more than one important dimension, an issue that
Huber (1980) raised. Naturally, the outcomes of this
should be explored by future research. Since it is an
open-source program, other researchers can easily ac-
cess the codes and adapt them to their research.

Electronic Supplementary Material

Supplementary material available online at
https://doi.org/10.11588/jddm.2019.1.63149

– ESM1.DatabaseTables (doc).

– ESM2.CreateDMWTables(sql).

– ESM3.ChooseAlterScreen(php).

– ESM4.ChooseAttriScreen(php).

– ESM5.DecisionScreen(php).

– ESM6.MouselabWEB_Files(zip).

– ESM7.Data(sav).

Acknowledgements: The authors would like to thank the participants and the experts who contributed to this study without receiving any reward.

Declaration of conflicting interests: The authors de-
clare that the research was conducted in the absence of
any commercial or financial relationships that could be
construed as a potential conflict of interest.

Author contributions: Zeliha was the lead researcher who designed the experiment, collected the data, and analysed it for this work. Semra was the supervisor who
served as a constant guiding light for this work.

Handling editor: Andreas Fischer

Copyright: This work is licensed under a Creative Com-
mons Attribution-NonCommercial-NoDerivatives 4.0 In-
ternational License.

Citation: Yıldırım, Z. & Taşabat, S. E. (2019). Dy-
namic MouselabWEB: Individualized information display
matrixes. Journal of Dynamic Decision Making, 5, 4.
doi:10.11588/jddm.2019.1.63149.

Received: 23 June 2019
Accepted: 7 October 2019
Published: 28 December 2019

References

Abelson, R. & Levi, A. (1985). Decision making and decision the-
ory. In G. Lindzey & E. Aronson (Eds.), The handbook of social
psychology (Vol. 1, pp. 231–309). Munich: Random House.

Andersson, P. (2001). P1198: Software for tracing deci-
sion behavior in lending to small businesses. Behavior Re-
search Methods, Instruments, & Computers, 33(2), 234–242.
doi:10.3758/bf03195370

Aschemann-Witzel, J., & Hamm, U. (2011). Measuring con-
sumers’ information acquisition and decision behavior with
the computer-based information-display-matrix. Methodology.
doi:10.1027/1614-2241/a000018

Beach, L. R., & Mitchell, T. R. (1978). A contingency model for
the selection of decision strategies. Academy of management
review, 3(3), 439–449. doi:10.5465/amr.1978.4305717

Bettman, J. R., Johnson, E. J., Luce, M. F., & Payne, J. W. (1993). Correlation, conflict, and choice. Journal of Experimental Psychology: Learning, Memory, and Cognition, 19, 931–951. doi:10.1037//0278-7393.19.4.931

Bettman, J. R., Luce, M. F., & Payne, J. W. (1998). Constructive
consumer choice processes. Journal of Consumer Research, 25,
187–217. doi:10.1086/209535

Böckenholt, U., & Hynan, L. S. (1994). Caveats on a process-
tracing measure and a remedy. Journal of Behavioral Decision
Making, 7(2), 103–117. doi:10.1002/bdm.3960070203

Brehmer, B. (1994). The psychology of linear judgement mod-
els. Acta Psychologica, 87(2-3), 137–154. doi:10.1016/0001-
6918(94)90048-5

Browne, G. J., Pitts, M. G., & Wetherbe, J. C. (2007). Cognitive
stopping rules for terminating information search in online tasks.
MIS Quarterly, 89–104. doi:10.2307/25148782

Cohen, J. (1988). Statistical Power Analysis for the Behavioral
Sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates,
Publishers. doi:10.4324/9780203771587


Costa-Gomes, M., Crawford, V. P., & Broseta, B. (2001). Cog-
nition and Behavior in Normal-Form Games: An Experimental
Study. Econometrica, 69(5), 1193–1235. doi:10.1111/1468-
0262.00239

Dawes, R. M. (1979). The robust beauty of improper linear mod-
els in decision making. American Psychologist, 34(7), 571–582.
doi:10.1037//0003-066x.34.7.571

Einhorn, H. J., Kleinmuntz, D. N., & Kleinmuntz, B. (1979).
Linear regression and process-tracing models of judgment.
Psychological Review, 86(5), 465–485. doi:10.1037//0033-
295x.86.5.465

Einhorn, H. J., & Hogarth, R. M. (1981). Behavioral decision the-
ory: Processes of judgment and choice. Annual Review of Psy-
chology, 32, 53–88. doi:10.1146/annurev.ps.32.020181.000413

Ford, J. K., Schmitt, N., Schechtman, S. L., Hults, B. M., &
Doherty, M. L. (1989). Process tracing methods: Contribu-
tions, problems, and neglected research questions. Organiza-
tional Behavior and Human Decision Processes, 43(1), 75–117.
doi:10.1016/0749-5978(89)90059-9

Harte, J. M., & Koele, P. (2001). Modelling and describing human
judgement processes: The multiattribute evaluation case. Think-
ing & Reasoning, 7(1), 29–49. doi:10.1080/13546780042000028

Hogarth, R. M. (1975). Cognitive processes and the as-
sessment of subjective probability distributions. Journal
of the American Statistical Association, 70(350), 271–289.
doi:10.1080/01621459.1975.10479858

Huber, O. (1980). The influence of some task variables on cog-
nitive operations in an information-processing decision model.
Acta Psychologica, 45(1–3), 187–196. doi:10.1016/0001-
6918(80)90031-1

Huber, O., Wider, R., & Huber, O. W. (1997). Active infor-
mation search and complete information presentation in natu-
ralistic risky decision tasks. Acta Psychologica, 95(1), 15–29.
doi:10.1016/s0001-6918(96)00028-5

Huneke, M. E., Cole, C. & Levin, I. P. (2004). How Varying Lev-
els of Knowledge and Motivation Affect Search and Confidence
during Consideration and Choice. Marketing Letters, 15(2–3),
67–79. doi:10.1023/b:mark.0000047385.01483.19

Jacoby, J., Jaccard, J., Kuss, A., Troutman, T., & Mazursky, D.
(1987). New directions in behavioral process research: Implica-
tions for social psychology. Journal of Experimental Social Psy-
chology, 23(2), 146–175. doi:10.1016/0022-1031(87)90029-1

Jacoby, J., Chestnut, R. W., Weigl, K. C., & Fisher, W. (1976).
Pre-purchase information acquisition: Description of a process
methodology, research paradigm, and pilot investigation. ACR
North American Advances, 3, 306–314.

Jarvenpaa, S. L. (1989). The Effect of Task Demands
and Graphical Format on Information Processing Strate-
gies. Management Science, INFORMS, 35(3), 285–303.
doi:10.1287/mnsc.35.3.285

Jasper, J. D., & Shapiro, J. (2002). MouseTrace: A bet-
ter mousetrap for catching decision processes. Behavior Re-
search Methods, Instruments, & Computers, 34(3), 364–374.
doi:10.3758/bf03195464

Jasper, J. D., & Levin, I. P. (2001). Validating a new
process tracing method for decision making. Behavior Re-
search Methods, Instruments, & Computers, 33(4), 496–512.
doi:10.3758/bf03195408

Klayman, J. (1982). Simulations of six decision strategies: Com-
parisons of search patterns, processing characteristics, and re-
sponse to task complexity. Center for Decision Research, Gradu-
ate School of Business, University of Chicago.

Levin, I. P., & Jasper, J. D. (1995). Phased narrowing: A
new process tracing method for decision making. Organiza-

tional Behavior and Human Decision Processes, 64(1), 1–8.
doi:10.1006/obhd.1995.1084

Mittal, B. (1989). Measuring purchase-decision in-
volvement. Psychology & Marketing, 6(2), 147–162.
doi:10.1002/mar.4220060206

Mitchell, A. A. (1979). Involvement: a potentially important me-
diator of consumer behavior. ACR North American Advances.

Montgomery, A. L., Hosanagar, K., Krishnan, R., & Clay, K. B. (2004). Designing a better shopbot. Management Science, 50(2), 189–206. doi:10.1287/mnsc.1030.0151

Payne, J. W. (1976). Task Complexity and Contingent Processing
in Decision Making: An Information Search and Protocol Anal-
ysis. Organizational Behavior and Human Performance, 16(2),
366–387. doi:10.1016/0030-5073(76)90022-2

Payne, J. W., Bettman, J. R., & Johnson, E. J. (1988). Adaptive
strategy selection in decision making. Journal of Experimental
Psychology: Learning, Memory, and Cognition, 14(3), 534–552.
doi:10.1037//0278-7393.14.3.534

Payne, J. W., Braunstein, M. L., & Carroll, J. S. (1978). Ex-
ploring predecisional behavior: An alternative approach to deci-
sion research. Organizational Behavior and Human Performance,
22(1), 17–44. doi:10.1016/0030-5073(78)90003-x

Payne, J. W., Bettman, J. R., & Johnson, E. J. (1992).
Behavioral decision research: A constructive processing per-
spective. Annual review of psychology, 43(1), 87–131.
doi:10.1146/annurev.ps.43.020192.000511

Pollay, R. W. (1970). The structure of executive decisions and
decision times. Administrative Science Quarterly, 15(4), 459–471.
doi:10.2307/2391339

Russo, J. E., & Dosher, B. A. (1983). Strategies for multiattribute
binary choice. Journal of Experimental Psychology: Learn-
ing, Memory, and Cognition, 9(4), 676–696. doi:10.1037//0278-
7393.9.4.676

Stevenson, M. K., Busemeyer, J. R., & Naylor, J. C. (1990). Judg-
ment and decision-making theory. In M. D. Dunnette & L. M.
Hough (Eds.), Handbook of industrial and organizational psychol-
ogy (pp. 283–374). Palo Alto, CA, US: Consulting Psychologists
Press.

Svenson, O. (1979). Process descriptions of decision making.
Organizational Behavior and Human Performance, 23, 86-112.
doi:10.1016/0030-5073(79)90048-5

Tabatabai, M. (1998). Investigation of decision making process: A
hypermedia approach. Interacting with Computers, 9(4), 385–
396. doi:10.1016/s0953-5438(97)00009-x

Westenberg, M. R. M., & Koele, P. (1994). Multi-attribute evalua-
tion processes: Methodological and conceptual issues. Acta Psy-
chologica, 87(2-3), 65–84. doi:10.1016/0001-6918(94)90044-2

Willemsen, M. C., & Johnson, E. J. (2004). MouselabWEB: Performing sophisticated process tracing experiments in the participant's home. Paper presented at the Society for Computers in Psychology annual meeting, Minneapolis, MN.

Willemsen, M. C., & Johnson, E. J. (2011). Visiting the decision
factory: Observing cognition with MouselabWEB and other in-
formation acquisition methods. In M. Schulte-Mecklenbeck, A. Kuehberger, & J. G. Johnson (Eds.), A handbook of process tracing methods for decision research (pp. 21–42). Abingdon: Routledge.

Williamson, J., Ranyard, R., & Cuthbert, L. (2000). Risk man-
agement in everyday insurance decisions: Evidence from a pro-
cess tracing study. Risk, Decision and Policy, 5(1), 19–38.
doi:10.1017/s1357530900000090
