The Mediating Effect of Learning on the Influence of
Embedded Explanation Facilities in Decision Support Activities
Abstract
The educative role of decision support tools is becoming more difficult in contemporary business
environments characterized by large and complex tasks. The research questions investigated in this study
are concerned with the role of learning in task performance and the influence of explanation facilities in
assisting the learning process. This study emphasizes the importance of decision support design features
in enhancing the learning process and contributing to task performance.
1. Introduction
Knowledge-Based Decision Support Systems (KBDSSs) are generally built to aid semi-structured and unstructured decision problems (Ginzberg 1980; Gorry et al. 1989; Luconi et al. 1986; Sprague 1980). Unlike data processing systems, in which the procedures to solve a problem are well defined, KBDSSs deal with less-structured problems that admit various alternative ways to resolve a task. While KBDSSs can perform the well-structured activities of problem solving, humans typically use their goals, heuristics, and knowledge to formulate problems and to modify and control the problem-solving process (Luconi et al. 1986). To make the most effective decision, an individual must understand the structure of the problem at hand very well. Gorry and Morton (1989) likewise argue that a KBDSS should help managers develop their decision-making ability by increasing their understanding of the environment. Hence, an important role of a KBDSS is in educating decision makers (Gorry et al. 1989).
The educative role of KBDSS is often difficult to accomplish, as the tasks encountered in business environments are sizable and complex. The functions set in a KBDSS, and the procedures that exploit those functions for problem solving, are so complicated that even experts in the task domain do not fully comprehend the problem dynamics. For novices and would-be experts the situation is even worse: uncertainty dominates the novice's thinking, and information overload gives even the expert-in-training pause.
As noted by Silver (1991), novices need guidance more than those who have implemented comparable decisions many times in the past. Similarly, infrequent users of a KBDSS require more guidance than frequent users. In one study, users of a KBDSS would not even accept the advice or recommendations it suggested because they were not confident about the underlying mechanism running the KBDSS (Ye et al. 1995).
A KBDSS can “support” users at this stage. In the case of real-time dynamic decision making (Lerch et al. 2001), “support” means facilitating users in fully understanding the ultimate goals and sub-goals of the system, the alternative decisions, and the consequences of each sub-decision. For a KBDSS to be effective, users need to gain insight into the functions and procedures embedded in it. Functions represented in KBDSSs correspond to ‘data’, and procedures embedded in KBDSSs correspond to the combination of ‘goals’, ‘procedures’, and ‘strategies’ from Luconi and Newell’s definition of task characteristics. Knowledge of those components of a problem embodied in the KBDSS leads to learning of the current task and hence allows users to resolve the task with more confidence and precision. If the time and effort required of users to understand a complex task can be reduced through features successfully established inside the KBDSS, then performance should be improved (Luconi et al. 1986).
This research draws on Dhaliwal and Benbasat’s (1996) model of explanation facilities in KBDSS and on Balzer’s (1989) reasoning on cognitive feedback. Dhaliwal and Benbasat (1996) developed a conceptual model in which learning, as fostered by the use of Knowledge-Based System explanations, affects both the decisional performance of users and their perceptions of the Knowledge-Based System (KBS).
In summary, the research questions addressed in this paper are twofold:
• to investigate whether learning mediates the relationship between KBDSS and performance; and
• to investigate whether the technology-embedded functions in a KBDSS are more efficient for learning than explicit training on the specific information system and task.
This study emphasizes the design features embedded in KBDSS that enhance learning. Learning occurs when users utilize the explanation facilities to perform well. We intend to confirm the role of learning in mediating between explanations and performance, and to address the lack of empirical assessment within the abundant studies on the impact of explanations on performance.
2. The Role of Learning in the Task Domain
Learning issues in KBDSS research are not trivial. For a KBDSS to be perceived as trustworthy, easy to use, or useful, it must first be understood. Similarly, a KBDSS can only help decision makers make better judgments if it assists them in learning and understanding their task environment (Dhaliwal et al. 1996). Dhaliwal and Benbasat (1996) claimed that the learning promoted by the use of KBS explanations affects decisional performance, and research focused on user learning is therefore needed for a detailed analysis of the purpose and design of KBS explanations (Wensley 1989). There is a rich tradition of integrating learning concepts into information systems research agendas. For example, Task Technology Fit Theory (Goodhue et al. 1995) incorporates the learning effect of users into the task characteristics. The basic idea is that if the features of a technology fit the characteristics of the task, users automatically understand the task and the technology at the same time and resolve the task well. This may apply, however, to experts rather than to novices or would-be experts. How
[Figure: KBDSS explanation facilities (data, procedure, goals, strategies) → Learning (understanding of task at hand; knowledge transfer) → Performance (effectiveness of decision making, efficiency of decision making, satisfaction with the process, confidence in the outcome), with propositions P1 and P2 labeling the explanation→learning and learning→performance paths.]
<Figure 1> Relationships among explanation, learning and performance
can people recognize the fit without knowing the characteristics of the task? In essence, there is a need to differentiate between learning, or levels of expertise, and task characteristics.
Even though there are many studies on the types of explanations that increase users’ understanding, the fundamental question of whether such learning or understanding translates directly into improved decision making remains, as yet, unanswered (Dhaliwal et al. 1996). For example, the relationships in Dhaliwal and Benbasat’s research model (Figure 1) have not been fully examined.
Understanding the task domain through the use of a KBDSS is important. According to the cognitive feedback paradigm (Todd et al. 1965), decision makers need three types of information in order to make an effective decision: task information, cognitive information, and functional validity information (Balzer et al. 1989). A number of studies have concluded that task information is the most effective in fostering learning and making accurate judgments (Adelman 1981; Hoffman et al. 1981b; Newton 1965). In addition, KBS explanations should play a role in transferring knowledge (Hsu 1993). If we have in-depth knowledge of the task domain, we should be able to apply it to any similar case as well as to understand the specific case used in the learning context. As such, the learning construct is separated into two sub-constructs: “understanding of task at hand” and “knowledge transfer” (Figure 1).
3. The Significance of Technology-Embedded Explanation Facilities Affecting Learning
The first knowledge-based system incorporating explanations was MYCIN (Buchanan et al. 1984). The developers of MYCIN reasoned that, in addition to giving good advice, providing explanations is critical for a KBDSS to be acceptable to users, and that to be acceptable it has to be understood (learned) by users. That is, the design goal of MYCIN was actually aimed at educating naïve users (Clancey 1985): the goal was to encourage users to learn from the system.
Why is explanation supposed to influence users’ learning and performance? The provision of ‘why’ and ‘how’ explanations originated with MYCIN (Buchanan et al. 1984) and is the foundation of explanation facilities in studies of knowledge-based systems (Dhaliwal et al. 1996). Explanations play three roles (Hayes et al. 1983): clarification, teaching, and convincing. With respect to the teaching role, Hsu (1993) focused on the knowledge transfer role of knowledge-based system explanations. While users use information systems they also learn, and this in turn leads us to use the cognitive learning paradigm to construct KBDSS.
Outcome feedback (Hogarth 1981) has been argued to be insufficient for users to learn how to improve the accuracy of decision making. In decision-making tasks involving uncertainty, outcome feedback was found to be ineffective in fostering learning and improving judgmental performance (Brehmer 1980; Hammond et al. 1975; Hoffman et al. 1981a). Further, outcome feedback was unsuccessful in providing adequate task information to the decision maker so that she could develop an appropriate model of the environment (Brehmer 1987; Sterman 1989).
The ‘cognitive feedback paradigm’ (Todd et al. 1965), as implemented by the lens model (Brunswik 1956), provides a very useful conceptual model for developing the KBDSS infrastructure. The paradigm proposes two mechanisms, cognitive feedback and feedforward, as effective learning operators for designing a KBDSS. Cognitive feedback provides case-specific information to the user to explain the outcomes of the task, while feedforward provides non-case-specific information about the input cues of the task to increase task performance. Although feedforward is difficult to operationalize (Dhaliwal et al. 1996), it has been operationalized as the pre-task provision of problem-solving heuristics or a training session, or as actual task performance training (Malloy et al. 1987; Sengupta et al. 1993); so operationalized, it has a significant effect on learning and is more effective than outcome feedback (Catsbaril et al. 1987; Malloy et al. 1987; Robertson 1987). This implies that if feedforward can be established within a KBDSS, users can learn how to make decisions correctly while using the KBDSS.
Research on learning in the context of information systems use varies: learning from instruction, learning by doing, vicarious learning, and learning that occurs through daily life and work experience (Alavi et al. 2001). Even though training can improve expertise, it takes considerable time and effort to do so. The inefficiency of extra training can be explained by the “learning versus working conflict” (Carroll et al. 1987): users who want to use a KBDSS and its explanations to accomplish a task are unwilling to spend much time learning the KBDSS, its knowledge domain, and its reasoning logic.
The learning-versus-working conflict can, however, be reduced through the design of better explanation facilities that integrate learning with the actual use of the KBDSS for improving performance (Dhaliwal et al. 1996); when learning is integrated with actual use in this way, the motivational cost of learning falls.
Here we suggest the propositions to be examined in the study:
Proposition 1: The explanation facility of a KBDSS will enhance users’ understanding of the task domain.
Proposition 2: The learning of the task domain enhanced by the explanation facilities of a KBDSS will mediate the relationship between explanation facilities and performance.
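As an illustration of the mediation logic behind Proposition 2, the classic three-regression check (in the Baron-Kenny style) can be sketched on simulated data. Everything below is hypothetical: the variable names, effect sizes, and sample size are invented for the sketch and are not drawn from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical simulated data: explanation-facility use (X) improves
# learning (M), which in turn improves performance (Y).
explanation = rng.integers(0, 2, n).astype(float)     # X: facility available?
learning = 0.8 * explanation + rng.normal(0, 1, n)    # M: understanding score
performance = 0.7 * learning + rng.normal(0, 1, n)    # Y: decision accuracy

def ols(y, *regressors):
    """Least-squares coefficients (intercept first) of y on the regressors."""
    X = np.column_stack([np.ones(len(y)), *regressors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Baron-Kenny steps: (1) total effect of X on Y, (2) effect of X on M,
# (3) Y regressed on X and M together; mediation appears as a direct
# effect of X that shrinks once M is controlled for.
total_effect = ols(performance, explanation)[1]
x_to_m = ols(learning, explanation)[1]
direct_effect = ols(performance, explanation, learning)[1]

print(f"total effect of explanation on performance : {total_effect:.3f}")
print(f"effect of explanation on learning          : {x_to_m:.3f}")
print(f"direct effect, controlling for learning    : {direct_effect:.3f}")
```

Because the simulated performance depends on explanation only through learning, the direct coefficient shrinks toward zero relative to the total effect once learning is controlled for, which is the signature of mediation.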
4. The Types of Tasks
The results of this study can be generalized to diverse types of complex tasks. In turbulent business environments, critical missions are often fulfilled through optimal decision making among numerous (sometimes countless) alternatives. Even dynamic decision-making problems are composed of series of decisions. The effect evidenced in this study applies to learning the sequence of decisions in complex tasks.
Task Technology Fit Theory (Goodhue et al. 1995) claims that an information system should correspond to task characteristics, and a task can be characterized along mutually exclusive dimensions. Luconi et al. (1986) relabel and regroup Newell’s (1985) well-known categorization of problem solving into four categories: data, goals, procedures, and strategies. We incorporate this categorization of task characteristics into the study (Figure 1).
The task used in the study is Multiple Criteria Decision Making (MCDM). An MCDSS (Multiple Criteria Decision Support System) is simply a KBDSS that helps to implement MCDM. MCDM is one of the tasks KBDSSs implement; it is less structured and involves multiple alternatives. Theoretic and algorithmic approaches to MCDM (Dyer et al. 1992; Keeney et al. 1991; Olson et al. 1995; Saaty 1986) and surveys of MCDM (Buede 1992; Buede et al. 1995; Bui 1984; Minch et al. 1986) exist. Empirical examination of the KBDSS features that aid MCDM execution, however, is rare.
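To make the MCDM task concrete, one of the simplest procedures an MCDSS might embed is simple additive weighting (SAW): normalize each criterion, weight it, and rank alternatives by their weighted sums. The criteria, weights, and alternatives below are invented purely for illustration; the study itself does not prescribe this method.

```python
# Simple additive weighting (SAW), a basic MCDM procedure: min-max
# normalize each criterion, weight it, and rank alternatives by
# weighted sum. All criteria, weights, and scores are hypothetical.
criteria_weights = {"cost": 0.5, "quality": 0.3, "delivery": 0.2}
alternatives = {
    "supplier_a": {"cost": 120.0, "quality": 8.0, "delivery": 6.0},
    "supplier_b": {"cost": 100.0, "quality": 6.0, "delivery": 9.0},
    "supplier_c": {"cost": 140.0, "quality": 9.0, "delivery": 7.0},
}
cost_type = {"cost"}  # criteria where lower raw values are better

def saw_score(alt):
    """Weighted sum of normalized criterion scores for one alternative."""
    score = 0.0
    for crit, w in criteria_weights.items():
        values = [a[crit] for a in alternatives.values()]
        lo, hi = min(values), max(values)
        norm = (alt[crit] - lo) / (hi - lo)   # min-max normalize to [0, 1]
        if crit in cost_type:
            norm = 1.0 - norm                 # invert cost-type criteria
        score += w * norm
    return score

ranking = sorted(alternatives, key=lambda name: saw_score(alternatives[name]),
                 reverse=True)
for name in ranking:
    print(f"{name}: {saw_score(alternatives[name]):.3f}")
```

In this framing, a KBDSS explanation facility would expose the ‘data’ (the criterion scores) and the ‘procedure’ (normalization and weighting) so the user can learn why an alternative ranks where it does.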
Certain types of tasks will not need the mediating role of learning. Data processing systems that deal with fully structured problems, i.e., problems with well-stated goals, specific input data, standard procedures, and simple strategies (Luconi et al. 1986), allow users to implement the task mechanically without understanding the internal reasoning of the system. In contrast, KBDSSs deal with tasks in which data, procedures, goals, and constraints are only partially represented (Figure 6 in Luconi et al. 1986).
The performance measures chosen in the study are varied: effectiveness of decision making (accuracy), efficiency of decision making (speed), satisfaction with the process, and confidence in the outcome.
5. Discussion
In this study, the explanation facilities of the KBDSS include explanations of functions and procedures. The former corresponds to data and the latter to goals, procedures, and strategies, according to Luconi’s (1986) task categorization. The possibility that the two kinds of explanation facilities interact with each other will not be investigated in the present research. One of them may, however, contribute more to the mediation effect of enhanced learning on task performance: information about how to perform the task, i.e., procedure, might influence learning and performance much more than information about the functions inside the KBDSS. This relationship is left for future research to explore.
Another consideration is that there might be more factors and dynamics in the relation between explanation facilities and the mediation effect of learning on performance. What we are most interested in is learning aided by explanation facilities and its effect on performance. This study shows only that there must be some basic relationship between learning and performance, and thereby motivates further, more sophisticated study of that relationship and its variables.
Whether learning plays a mediating or a moderating role should be tested as well. The issue is tied to the level of user expertise and the type of task. We may ordinarily infer that whatever level of expertise the user has, or however structured the task is, performance does not arise or improve without an understanding of the task. The primary aim of the study is to establish the effect of learning in the relationship between KBDSS and performance and to encourage future research on the design of KBDSS by which people can dramatically increase their performance.
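The mediation-versus-moderation question maps onto two different model specifications: mediation routes the facility's effect through an intervening variable such as learning, whereas moderation means the facility's effect varies with a third variable such as expertise and shows up as an interaction term. A minimal sketch of the moderation test on simulated data follows; all variable names and effect sizes are hypothetical, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Hypothetical simulation of a MODERATION structure: the benefit of the
# explanation facility depends on user expertise (the moderator) instead
# of flowing through a mediating variable.
explanation = rng.integers(0, 2, n).astype(float)   # facility available?
expertise = rng.normal(0, 1, n)                     # standardized expertise
performance = (0.2 * explanation
               + 0.1 * expertise
               + 0.6 * explanation * expertise      # interaction = moderation
               + rng.normal(0, 1, n))

# Regress performance on both main effects plus the interaction term;
# a sizable interaction coefficient is the signature of moderation.
X = np.column_stack([np.ones(n), explanation, expertise,
                     explanation * expertise])
beta, *_ = np.linalg.lstsq(X, performance, rcond=None)
print(f"main effect of explanation : {beta[1]:.3f}")
print(f"main effect of expertise   : {beta[2]:.3f}")
print(f"interaction (moderation)   : {beta[3]:.3f}")
```

A large interaction coefficient alongside modest main effects would suggest that expertise moderates, rather than learning mediating, the facility's influence on performance.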
Explanation facilities may influence the learning of the task even when text-based instruction is provided as well, and instruction types can be manipulated to create synergy that increases the effectiveness of IS adoption. A KBDSS would interact with other factors already situated in the adopting organization. Further, there may be moderating factors that make explanation facilities more influential on learning and performance.
When we explore the effect of explanation facilities on performance, there might be an interesting finding that even experts, given explanation facilities, show performance improvement when there is no mediating effect of learning on performance as there is in novices. If so, this suggests two possibilities. First, experts have their own mechanisms for processing a task; if they are well informed about the task and the system, the presentation of explanation facilities should not change their performance, so if we do observe improvement in experts’ performance, some other factors are presumed to be at work. Second, explanation facilities may have some unknown influence on experts’ performance. Both potential answers motivate further research on the underpinning variables and relationships.
References
Adelman, L. "The influence of formal, substantive, and contextual task properties on the relative effectiveness of different forms of feedback in multiple-cue probability learning tasks," Organizational Behavior and Human Performance, 1981, pp 423-442.
Alavi, M., and Leidner, D.E. "Research commentary: Technology-mediated learning - A call for
greater depth and breadth of research," Information Systems Research (12:1), Mar 2001,
pp 1-10.
Balzer, W.K., Doherty, M.E., and Oconnor, R. "Effects of Cognitive Feedback on Performance,"
Psychological Bulletin (106:3), Nov 1989, pp 410-433.
Brehmer, B. "In One Word - Not from Experience," Acta Psychologica (45:1-3) 1980, pp 223-241.
Brehmer, B. "Systems Design and the Psychology of Complex Systems," in: Empirical Foundations of Information and Software Science III, F.a.P.Z. J (ed.), Plenum, NY, 1987.
Brunswik, E. Perception and the Representative Design of Psychological Experiments, University of California Press, Berkeley, CA, 1956.
Buchanan, B.G., and Shortliffe, E.H. Rule-Based Expert Systems: The MYCIN Experiments of the Stanford Heuristic Programming Project, Addison-Wesley, MA, 1984.
Buede, D. "Software review: Overview of the MCDA software market," Journal of Multi-Criteria Decision Analysis (1) 1992, pp 59-61.
Buede, D., and Maxwell, D. "Rank disagreement: A comparison of multi-criteria methodologies," Journal of Multi-Criteria Decision Analysis (4) 1995, pp 1-21.
Bui, X.T. Building Effective Multiple Criteria Decision Models: A Decision Support System Approach, ACM, NY, 1984.
Carroll, J.M., and McKendree, J. "Interface Design Issues for Advice-Giving Expert Systems," Communications of the ACM (30:1), Jan 1987, pp 14-31.
Catsbaril, W.L., and Huber, G.P. "Decision Support Systems for Ill-Structured Problems: An Empirical Study," Decision Sciences (18:3), Sum 1987, pp 350-372.
Clancey, W.J. "Heuristic Classification," Artificial Intelligence (27:3) 1985, pp 289-350.
Dhaliwal, J.S., and Benbasat, I. "The use and effects of knowledge-based system explanations:
Theoretical foundations and a framework for empirical evaluation," Information Systems
Research (7:3) 1996, pp 342-362.
Dyer, J.S., Fishburn, P.C., Steuer, R.E., Wallenius, J., and Zionts, S. "Multiple Criteria Decision Making, Multiattribute Utility Theory: The Next 10 Years," Management Science (38:5), May 1992, pp 645-654.
Ginzberg, M.J. "Managing Management Information Systems (Ein-Dor, P., and Segev, E.)," Journal of Accountancy (149:1) 1980, pp 88-89.
Goodhue, D.L., and Thompson, R.L. "Task-technology fit and individual performance," MIS Quarterly (19:2) 1995, pp 213-236.
Gorry, G.A., and Morton, M.S.S. "A Framework for Management Information Systems," Sloan Management Review (30:3), Spr 1989, pp 49-61.
Hammond, K.R., Stewart, T.R., Brehmer, B., and Steinmann, D.O. "Social judgment theory," in:
Human Judgment and Decision Processes, S. Schwartz (ed.), Academic Press, New
York, 1975.
Hayes, P.J., and Reddy, D.R. "Steps towards graceful interaction in spoken and written man-machine communication," International Journal of Man-Machine Studies (19) 1983, pp 231-284.
Hoffman, P.J., Earle, T.C., and Slovic, P. "Multidimensional Functional Learning (MFL) and Some New Conceptions of Feedback," Organizational Behavior and Human Performance (27:1) 1981a, pp 75-102.
Hoffman, P.J., Earle, T.C., and Slovic, P. "Multidimensional functional learning and some new conceptions of feedback," Organizational Behavior and Human Performance (27) 1981b, pp 75-102.
Hogarth, R.M. "Beyond discrete biases: functional and dysfunctional aspects of judgment
heuristics," Psychological Bulletin (90) 1981, pp 197-297.
Hsu, K.C. "The Effects of Cognitive Styles and Interface Designs on Expert Systems Usage: An Assessment of Knowledge Transfer," Unpublished Doctoral Dissertation, Memphis State University, Memphis, TN, 1993.
Keeney, R.L., and Raiffa, H. Decisions with Multiple Objectives, John Wiley, NY, 1991.
Lerch, F.J., and Harter, D.E. "Cognitive support for real-time dynamic decision making,"
Information Systems Research (12:1), Mar 2001, pp 63-82.
Luconi, F.L., Malone, T.W., and Morton, M.S.S. "Expert Systems - the Next Challenge for
Managers," Sloan Management Review (27:4), Sum 1986, pp 3-14.
Malloy, T.E., Mitchell, C., and Gordon, O.E. "Training Cognitive Strategies Underlying Intelligent Problem-Solving," Perceptual and Motor Skills (64:3), Jun 1987, pp 1039-1046.
Minch, R.P., and Sanders, G.L. "Computerized Information Systems Supporting Multicriteria Decision Making," Decision Sciences (17:3), Sum 1986, pp 395-413.
Newell, A., and Card, S.K. "The prospects for psychological science in human-computer interaction," Human-Computer Interaction (1) 1985, pp 209-242.
Newton, J.R. "Judgment and feedback in a quasi-clinical situation," Journal of Personality and Social Psychology (1) 1965, pp 336-342.
Olson, D.L., Moshkovich, H.M., Schellenberger, R., and Mechitov, A.I. "Consistency and
accuracy in decision aids: Experiments with four multiattribute systems," Decision
Sciences (26:6), Nov-Dec 1995, pp 723-748.
Robertson, S. "The Effects of Training Content on User Performance with a Flexible Decision
Support System," Case Western Reserve University, Cleveland, OH, 1987.
Saaty, T.L. "Axiomatic Foundation of the Analytic Hierarchy Process," Management Science
(32:7), Jul 1986, pp 841-855.
Sengupta, K., and Abdelhamid, T.K. "Alternative Conceptions of Feedback in Dynamic Decision
Environments - an Experimental Investigation," Management Science (39:4), Apr 1993,
pp 411-428.
Sprague, R.H. "A Framework for the Development of Decision Support Systems," MIS
Quarterly (4:4) 1980.
Sterman, J.D. "Modeling Managerial Behavior - Misperceptions of Feedback in a Dynamic
Decision-Making Experiment," Management Science (35:3), Mar 1989, pp 321-339.
Todd, F.J., and Hammond, K.R. "Differential effects in two multiple cue probability learning
tasks," Behavioral Science (10) 1965.
Wensley, A. "Research Directions in Expert Systems," in: Knowledge-Based Management Support Systems, L.D. et al. (eds.), Ellis Horwood/John Wiley, 1989, pp 248-275.
Ye, L.R., and Johnson, P.E. "The Impact of Explanation Facilities on User Acceptance of
Expert-Systems Advice," MIS Quarterly (19:2), Jun 1995, pp 157-172.