The Origins/Evolution of Information Processing Theory
Student
Institution
Information processing theory was founded by George A. Miller (1920–2012). Other contributors to the evolution of information processing theory include Atkinson and Shiffrin in 1968, Craik and Lockhart in 1972, and Rumelhart and McClelland in 1986. The theory evolved out of the American experimental tradition in psychology. Its authors were interested in comparing the human brain's capability to process information with that of a basic processor or a computer. Information processing theory describes the mechanism by which learning happens (Estes, 1978).
George A. Miller's information processing theory laid particular emphasis on the continuous pattern of human brain development. Miller provided two theoretical concepts that have been fundamental both in cognitive psychology and in information processing frameworks (Ramon, 2005). The first is the concept of "chunking" and the limited capacity of short-term memory. In building the theory, Miller argued that short-term memory can hold only 5-9 chunks of information, where a chunk is a unit of meaningful information (Maxine & Elizabeth, 2005). Miller associated a "chunk" with words, people's faces, digits, or chess positions. This notion of chunking and of the limited capacity of short-term memory has since been treated by later theorists, namely Atkinson and Shiffrin in 1968, Craik and Lockhart in 1972, and Rumelhart and McClelland in 1986, as a basic element of all subsequent memory theories.
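To make the chunking idea concrete, the short sketch below is a purely illustrative Python example, not drawn from Miller's own work: ten individual digits exceed the 5-9 item span when held one by one, while the same digits grouped phone-number style into three meaningful chunks fall well within it.

    # Illustrative sketch only (assumed example, not from Miller's work): a
    # ten-digit string exceeds the 5-9 item span when held digit by digit,
    # but fits once the digits are grouped into a few meaningful chunks.
    SPAN_LIMIT = 9  # upper bound of Miller's "seven, plus or minus two"

    def within_span(items):
        """Return True if the items fit within the short-term memory span."""
        return len(items) <= SPAN_LIMIT

    digits = list("4165550142")        # 10 separate digits: exceeds the span
    chunks = ["416", "555", "0142"]    # 3 meaningful chunks: well within the span

    print(within_span(digits))   # False
    print(within_span(chunks))   # True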
As noted above, George A. Miller also introduced a second concept, the "Test-Operate-Test-Exit" (TOTE) unit. The TOTE concept was developed together with Galanter and Pribram in their 1960 theoretical work. Miller and his colleague theorists postulated that the TOTE unit should take the place of the stimulus-response pairing and be considered the basic unit of behavior (Lisa, Diane, Ann, Andre, & Lynn, 2009). Miller et al. suggested that within a
TOTE unit, a goal is first tested to establish whether it has been achieved; if it has not, an operation is performed to accomplish the set objective. The test-operate cycle is then repeated until the objective is realized or the goal is abandoned (Estes, 1978). This TOTE concept provides the basis on which most subsequent theories of problem solving and production systems are built.
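The test-operate loop can be sketched directly in code. The following is a minimal illustrative sketch in Python, assuming a hypothetical goal of raising a counter to a target value; neither the goal nor the helper names come from Miller, Galanter and Pribram's own material.

    # Minimal sketch of a TOTE (Test-Operate-Test-Exit) unit, assuming a
    # hypothetical goal; not an implementation from Miller, Galanter and Pribram.
    def tote(test, operate, max_attempts=10):
        """Repeat the test-operate cycle until the goal is met or abandoned."""
        for _ in range(max_attempts):
            if test():        # Test: has the goal been achieved?
                return True   # Exit: the objective is realized
            operate()         # Operate: act to move toward the goal
        return False          # Exit: the goal is abandoned

    # Hypothetical goal: raise a counter to a target value of 3.
    state = {"count": 0}

    def increment():
        state["count"] += 1

    print(tote(test=lambda: state["count"] >= 3, operate=increment))  # True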
In particular, information processing theory centers on the encoding and retrieval aspects of memory for both the human brain and the computer. Its theorists conclude that the human mind processes information in much the same way that an information processor or a computer does. The theory postulates that people operate like information processors, rather than, as behaviorists held, simply thinking and responding to the environment as issues arise (Ramon, 2005). The theory thus equates the human mind's mechanism of thought to the mechanism used by a computer.
To conclude, the information processing model has generally come to be regarded as a theory of human cognition. In a nutshell, the chunking phenomenon has been verified in virtually every domain where cognitive processing is involved. Therefore, it is clear that the functioning of the human brain can be equated to that of a computer or an information processor.
References
Estes, W. K. (1978). Handbook of Learning and Cognitive Processes (Vol. 5). Hillsdale, NJ:
Lawrence Erlbaum Associates.
Lisa, C., Diane, M., Ann, E., Andre, K., & Lynn, N. (2009). Nurses’ Uncertainty in Decision-
Making: A Literature Review. Worldviews on Evidence-Based Nursing, 1-13.
Maxine, O., & Elizabeth, M. (2005). The use of ‘think aloud’ technique, information processing
theory and schema theory to explain decision-making processes of general practitioners
and nurse practitioners using patient scenarios. Primary Health Care Research and
Development, 6, 46-59.
Ramon, S. Z. (2005). Theories of clinical judgment and decision-making: A review of the
theoretical literature. Journal of Emergency Primary Health Care, 3(1-2), 1-13.