An audio slideshow version of this talk, updated and revised, as presented at the CALL IS 25th Anniversary Symposium at the TESOL Denver Convention, March 26, 2009, is available at http://callcolloq-tesol09.wikispaces.com/.
A BRIEF HISTORY OF CALL THEORY
This paper was originally created for The Human Face of CALL, an Electronic Village Online course, and was presented at the CALL IS Academic Session, Wednesday 8:30-11:15 am, Salt Palace Ballroom 1, at the 2002 TESOL Convention, Salt Lake City, Utah, USA. A paper version was published in the CATESOL Journal, 15(1), 2003, pp. 21-30. Please note that many of the links in this original version are no longer available.
Elizabeth Hanson-Smith
Computers for Education and Command Performance Language Institute
ehansonsmi @ yahoo.com
http://webpages.csus.edu/~hansonsm
Over the past
decade, the
personal computer has emerged as a significant tool for language
teaching and
learning. The widespread use of software, local area networks, and the
Internet
has created enormous opportunities for learners to enhance their
communicative
abilities, both by individualizing practice and by tapping into a
global
community of other learners.
Computer use has been proclaimed by some futurists to be as
important to
cultural change as the invention of movable type. The
silicon chip is changing the way we work, how we
communicate, whom we communicate with, and where and how we live.
Given the enormous and far-reaching consequences of computer chip innovation, it is no wonder that
CALL (computer-assisted language learning, to use its most common
denomination)
has struggled with what Carla Meskill calls the "hand-me-down"
syndrome (Meskill 1999); that is, every technological innovation used
in CALL
has arisen first in some other field and for some other purpose. The
enormous
ingenuity and creative genius of computer-using teachers have thus been
all the
more remarkable in applying technological innovation to language
education. As
an introduction to a consideration of
how and why people use computers, the subject of the speakers'
presentations in the upcoming CALL IS Academic Session at TESOL 2002
(see other
files in this folder for their papers), I would like first to consider
the
history of pedagogical theory in CALL, and eventually consider possible
future
trends and issues.
The Early Years
Much of the early history of computers in language learning, from the early 1980s to 1990s, was spent in
trying to keep up with technological change. Mainframe
computers were at first seen as the taskmaster:
The PLATO system (Bitzer 1960) at many universities supplied a number of content courses, particularly in English grammar and computer science. Students went to a lab, sat in rows, one to a computer (which some of us now think of as "solitary confinement"), and "mastered" each piece of a topic bit by bit, through presentation and "practice" in the form of tests. My
own experience with PLATO, besides attempting to author a
vocabulary/reading
program, included the discovery that my most difficult student had
ignored the
set of grammar exercises I had carefully selected for him but was
spending 2-3
hours a day working crossword puzzles on the machine.
In the mid-80s, the
field
suddenly changed when silicon chips and the desktop personal computer
burst on
the scene. Many of us remember the days of dragging out an Apple on an
OHP
stand from the broom closet in order to let a small group of students
use it in
rotation. Those were the days when the user had to run the operating
system
from a 5-1/4" diskette, and an enhanced Apple had 64K (not MB) of
memory.
The "killer app" was a spreadsheet program that crunched numbers in
slow motion. Word processing was almost an afterthought. However, the
miniaturization of electronics meant that each year, small personal
computers
increased in speed and power, until "multimedia" has become
virtually synonymous with "computer."
As a result of the constant
changes in computers, and their evolution from mainframe to laptop,
much
written about CALL in its early years was devoted to how to use the new
technology, rather than to its empirical effects on learning. One
recurrent
theme throughout these early days, nonetheless, was the crucial
pedagogical
debate, largely framed by John Higgins and Tim Johns, over whether the
computer
was "master" of or "slave" to the learning process
(Higgins & Johns 1984). Was the computer to be a replacement for
the
teacher, or merely an obedient servant to the student?
The early uses of computers, particularly during the era before the microchip, promoted the
behavioristic
tutorial-and-test approach (also called "drill-and-kill") of
audio-lingualism, an approach dominant in TESOL in the 1940s and 50s.
One reason
for this return in CALL to an earlier pedagogical model was no doubt
the
limitation of early technology; another reason was that computer
programmers
were not particularly knowledgeable about how language learning worked.
No
doubt inspired by a "cultural revolution" that swept much of the
native-English-speaking world, a flood of new pedagogical approaches in
the
1960s and 70s washed over audio-lingualism and was firmly entrenched
long
before the microchip came to play a role in CALL: among these
experimental
approaches were Silent Way, Suggestopedia, and Community Language
Learning.
(For excellent short summaries and primary documents on these
precursors to
communicative language teaching, see Blair 1982.) Computer-using
teachers
yearned to employ these more experimental models of communicative
teaching and
learning, some of which implied an unprecedented control by students
over their
own learning.
Stephen Krashen's significant body of work in the 1970s and 80s (see especially his widely read Principles and Practice, 1982) gave a clear
focus to the experimental approaches and led TESOL into an era of
"communicative language learning." The predicament for CALL was
whether students were to communicate with the computer (the patient and
friendly teacher) or with each other, with the computer merely a
stimulus to
the conversation. One hope was that something like a version of Eliza,
the
shareware therapist who reflected back student/patient input, might be
useful
to language learning. For example, when the computer didn't understand
an
expression, whether because of a typo or a faulty construction, it
would simply
ask, "What is ___?" The standard for artificial intelligence today is
still "When the computer answers, can you tell (or how long does it
take
to tell) if it is human?"
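By way of illustration, the sketch below (in Python, with an invented mini-vocabulary and invented reflection rules, not Weizenbaum's original program) shows how such an Eliza-style exchange can be reduced to a handful of pattern rules: a word the program does not recognize triggers the question "What is ___?", and anything else is simply reflected back to the learner as a question.

    # A minimal, hypothetical Eliza-style responder: it reflects statements back
    # as questions and asks "What is ___?" about words it does not recognize.
    # The vocabulary and reflection rules here are invented for illustration only.

    KNOWN_WORDS = {"i", "you", "am", "feel", "like", "my", "teacher", "class",
                   "homework", "is", "the", "a", "very", "today", "tired", "happy"}

    REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}


    def reflect(sentence: str) -> str:
        """Swap first- and second-person words so the input can be echoed back."""
        words = sentence.lower().strip(".!?").split()
        return " ".join(REFLECTIONS.get(w, w) for w in words)


    def respond(sentence: str) -> str:
        """Ask about the first unknown word; otherwise reflect the whole sentence."""
        for word in sentence.lower().strip(".!?").split():
            if word not in KNOWN_WORDS and word not in REFLECTIONS:
                return f"What is {word}?"   # a typo or unfamiliar construction
        return f"Why do you say {reflect(sentence)}?"


    if __name__ == "__main__":
        print(respond("I am very tired today"))  # Why do you say you are very tired today?
        print(respond("My homwork is hard"))     # What is homwork?

A real conversation program would of course need far richer patterns, but even this toy version shows why such a system can keep a learner producing output without ever understanding it.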
The use of pairs and triads around the computer in games, simulations, and grammar drills was an attempt to
bring some human interaction into the realm of technology in the late
1980s and
early 90s. A number of papers and research experiments at that time
dealt with
whether language was really being taught or learned simply by putting
students
in groups. I recall one researcher who concluded that the most
significant
language use--while students worked in groups to generate sentences at
the
computer--came when one student lit up a cigarette in the lab and had
to be
told to put it out. At the time, most computer "teaching" programs
were still so limited in their pedagogical approach that students were
mainly
attracted by using the new technology, not by what they could learn
with and
about language. At the same point in time, the word processor became
perhaps
the first computer application that truly supported an innovative
pedagogy: The
"process approach" to writing, which evolved in the late 1980s, would
have been only wishful thinking without the facility the word processor
provided in multiple drafting, revision, and editing.
With the increased speed and power of personal computers came another landmark, HyperCard for the
Apple environment.
This simple authoring program was a godsend to teachers who were trying
to
bring multimedia and interactivity into the world of technology--best
of all it
was distributed free. Programs for ESL/EFL students, as well as
programs
adapted for their use, created with HyperCard or its many imitators on
PCs,
flourished in the early 90s. The collection made by the CALL IS back in
1996
(TESOL/CELIA '96 CD-ROM; most of the programs are also still available
from
the CELIA archive at http://www.celia.edu)
is perhaps the best example
of
teacher ingenuity, originality and dedicated effort in a shareware
environment.
Other, more sophisticated authoring software quickly followed, though
again,
much of it was limited by a behavioristic notion of learning.
Moving into the Present
The emphasis on natural or "authentic" language expressed itself in TESOL practice in two
related but somewhat divergent communicative movements: content-based
learning
(in some contexts expressed as "Sheltered English" or SDAIE, "Specially Designed Academic Instruction in English"; see Cantoni-Harvey 1987) and
task-based learning (see Nunan 1989 and 1995). Fortunately, by the
early 1990s,
as these approaches came to have considerable (and continuing)
influence in the
schools, computer technology was catching up to its potential.
Content-based
learning is
greatly enhanced by the computer, since so much information can be
brought into
the classroom on content CDs and via the Internet. In a very tiny
space, one
may store and search a fully multimedia-enhanced version of the Encyclopedia
Britannica, the entire
History of Art
in pictures, or the complete publications of the National Geographic
Society.
One may download from the Internet the complete works of authors from
Project
Gutenberg, or directly access the US Government's satellite pictures of
Mars from the NASA site (Mars Exploration,
http://mars.jpl.nasa.gov/odyssey/index.html).
Task-based learning is also much enhanced by the use of the computer. CALL has taken two avenues to
this
aspect of pedagogy: one is the use
of simulations and adventure games, in which the learner plays a role
in order
to uncover information, while also learning how to use the typical
collocations
of the simulation or adventure. The power of the computer to crunch
numbers
saves hundreds of hours of painstaking labor, and gives students
instant
results when they attempt all the "What-ifs" that the exploration
of a simulated environment demands--all without producing fatal errors,
such as
blowing up the chemistry lab. Another aspect of task-based learning
enhanced by
CALL is the use of multimedia tools for students to create their own
presentations. Simple authoring programs allow students to record their
own
voices, draw pictures, and import graphics, photos, and videos they
have made
themselves or downloaded from the Internet. Creating Webpages is itself
a major
task-based learning component in many technology-enhanced classrooms.
Multimedia can help students discover their own best learning
strategies, while
preparing them for a world inundated by graphic images (see
Hanson-Smith 1997).
Both simulations and multimedia projects also provide the impetus to
use groups
to solve problems cooperatively, develop communication skills, and
practice
written and oral language appropriate to the context of their study.
Content-based and task-based approaches seemed to solve many of the problems of earlier
grammar-based and
aural-oral language approaches because of the rich input provided. Yet
such
input was far too often totally uncontrolled, particularly in the wild
and
woolly Internet environment. Natural language, even with all the
supporting
apparatus of sound and pictures, creates a vast sea of words, words,
words--hardly "i + 1." Thus technology-using teachers often spend
considerable time developing appropriate lessons to support students
who
perform research on CDs and the Internet, or who use simulations and
games
created, for the most part, for native speakers (with a few notable
exceptions
such as Escape from Planet Arizona).
Further, as communicative approaches--group work, content-based curricula, and tasks--gained
favor,
researchers like Swain were already pointing out that input wasn't
enough:
output, interaction, and the negotiation of meaning were also essential
to
language learning mastery (Swain 1985; Swain & Lapkin 1995). During
the mid
to late 1990s, as the Internet grew like a giant amoeba, language
teachers
found a remarkable tool for student-to-student communication: e-mail.
Communication over distance or even within a networked classroom
provided
fascinating "content" in the ordinary discourse of people learning
more about each other and each other's cultures (see Cummins and Sayers 1995), even as they shared information about academic topics. At the same time, networked writing offered a written record of interactions that could
be
studied and interpreted and used for language "scaffolding"--much as
Community Language Learning had attempted to do some 30 years before,
but
without the tedium of hand typing transcripts. (For more on scaffolding
see the
summary of a large body of work by Holliday, Pica, and many others in
Holliday
1999; see also Peyton 2000.) E-mail interactions also have the
advantage of
occurring primarily at the "i + 1" level in an environment where
students have time to reflect on input and to query their interlocutors
about
both content and form--in other words, an ideal language learning
environment.
Luckily, the intersection of multimedia technology with communicative methods occurred just as
teachers and
researchers renewed their interest in the cognitive side of learning.
There has
been much interest in the late 1990s and the early 00s in a pedagogical
theory
called "constructivism." Originally put forward by Sydney Papert,
creator of the computer language Logo, constructivsm describes learning
by
doing and creating meanings, particularly by using the tools of the
computer to
explore simulated--but also very real--worlds (see Buell 1996-97). This
theory
of learning, reaching back to John Dewey (1938) at its roots, dovetails
nicely
with the recent recognition in language pedagogy of the need to
encompass
higher cognitive processes in the learning task. Anna Uhl Chamot and
Michael
O'Malley (1996), creators of the Cognitive Academic Language Learning
Approach or CALLA, are probably the chief proponents of this view at
present.
The cognitive approach speaks to the need for students to be aware of
their own
learning processes, and to organize and structure their learning
themselves. The
plethora of information now available electronically makes just such
cognitive
demands on the language student, while technology can provide the means
to
easily structure and organize new information and incorporate it into
the
learning process.
The very conscious use students make of their own cognitive abilities while learning a
language--and
how computers might make this effort easier--came home to me many years
ago, as
I watched a video of a student at a computer. The student was
highlighting a
word or phrase in a sentence to hear and repeat it--over 20 times. It
is
difficult to imagine assigning a student to listen to and repeat a word
or
phrase 20 times, but the computer controls allow this kind of
intensive,
individualized, autonomous practice without the physical difficulties
entailed
in, say, audiotape. Nor would any teacher have the patience to repeat
something
20 times--leaving aside the problem of what the rest of the class might
do
during this operation. This student was in control of his own learning,
its
pace and the input he needed at that moment in time.
In constructivism, technology-assisted language learning has found a
viable
pedagogical theory. One might apply this general theory of knowledge to
language learning as follows: To learn a language is to construct a
series of
approximations of the correspondences between meanings and variations
in
phonemes, morphemes, and syntax. In part this process takes place
through
exposure to "experts" in the language (Krashen's Input Hypothesis),
and in part through trial and error, or hypothesis testing: Learners,
whether
of a first or a second language, try out various expressions and hope
to
receive more information based on the results of the transactions
(Swain's
"output hypothesis" and the negotiation of meaning). While some of
this process is perhaps subconscious, and for first language learners
apparently dependent on some innate acquisition mechanism, for second-
and
third-language learners, some of this process is also conscious and
accessible to
the planned use of memory, deliberate practice, and schema-building.
In summary, the field of TESOL has passed through pedagogical stages in 10-12 year cycles: audio-lingualism in the forties and fifties, the experimental era of Silent Way, Suggestopedia, and
Community
Language Learning in the sixties and seventies, the Natural Approach in
the
seventies to eighties, communicative language teaching from the
eighties to the
nineties, and the new cognitivism in the late nineties (notice how many
articles
now refer to "language development" rather than the more
problematic "language acquisition"). CALL has, interestingly,
replicated this 50-year development in a foreshortened or accelerated
manner,
retracing the entire pedagogical history of TESOL methods in only about
10
years. Although at present in CALL audio-lingualism is still with us
(especially since HTML suffers some of the same limitations as early
mainframes), and many manufacturers of language software--and classroom
teachers and administrators--still perceive the computer as a
replacement
for the teacher, we are primarily in a communicative/cognitive stage,
with most
good new software (and I include here the Internet) incorporating
elements of
group work, task-based learning, authentic language, content-based
learning,
conscious schema-building, and attention to a variety of learning
styles. The
chaotic information of the Internet will no doubt enhance the cognitive
side of
the paradigm, because students will need organizing schema and
strategies to
access and use this largely native-speaker-oriented content resource.
The Future?
It remains now to see where the future of CALL and TESOL pedagogy will take us. Or perhaps the
question is
really "Where will CALL take TESOL?"
Several factors contribute to what I perceive as a future dominance of CALL in the search for language pedagogy. Chief among these factors is the value of
technology-supported or
-enhanced research in second language acquisition (see Hulstijn 2000).
Where
every keystroke, voice message, Webcam file or Internet search may be
recorded
and tracked, we have an enormously useful tool for analyzing how
students
participate in and direct their own learning. Concordance programs
allow us to
compare any set of texts to each other, for example, the differences
between
written and spoken collocations, giving us (and our students) new
insights into
the nature of language. The technology-based tools for research are as
yet only
barely being applied, but they should enhance considerably our
understanding of
linguistics and SLA. (Holliday's analysis [see a summary in Holliday
1999] of a
huge corpus of student e-mail
messages gives us some idea of procedures that may be fruitful.)
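As a rough illustration of the concordancing idea mentioned above, the following sketch (in Python, using two invented two-sentence "corpora" rather than real data) lines up every occurrence of a keyword with its surrounding words, the keyword-in-context (KWIC) display on which comparisons of written and spoken collocations are typically based.

    # A minimal keyword-in-context (KWIC) sketch of what a concordancer does:
    # it aligns every occurrence of a target word with its left and right
    # context so collocations in two corpora can be compared side by side.
    # The two tiny "corpora" below are invented examples, not real data.

    from typing import List

    WRITTEN = [
        "The results make a significant contribution to the field.",
        "We make an exception only in documented cases.",
    ]
    SPOKEN = [
        "Let's make a decision and move on.",
        "Can you make it to class tomorrow?",
    ]


    def kwic(sentences: List[str], keyword: str, width: int = 3) -> List[str]:
        """Return each occurrence of keyword with up to `width` words of context."""
        lines = []
        for sentence in sentences:
            words = sentence.strip(".?!").split()
            for i, w in enumerate(words):
                if w.lower() == keyword:
                    left = " ".join(words[max(0, i - width):i])
                    right = " ".join(words[i + 1:i + 1 + width])
                    lines.append(f"{left:>30} | {w} | {right}")
        return lines


    if __name__ == "__main__":
        for label, corpus in (("WRITTEN", WRITTEN), ("SPOKEN", SPOKEN)):
            print(label)
            for line in kwic(corpus, "make"):
                print(line)

Even at this toy scale, the aligned display makes the differing company a word keeps in written and spoken registers immediately visible; real concordancers simply do the same thing over millions of words.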
Another factor is "convergence," the tendency of technologies to meld into and reinvent
each other. We are very close to an affordable cell
phone-PDA-computer-Internet
combination, probably in a "wearable" format, that will give maximum
mobility and convenience to the learner. Learn anywhere, anytime,
through any medium
is a clearly attainable proposition, and is especially attractive in
countries
where the lack of ground infrastructure can be leapfrogged by satellite
telecommunications. It is also a proposition that offers inclusion to
students
with physical disabilities that may currently prevent them from access
to
learning at their own pace and in their own mode. How will CALL
translate the
convergence of voice, video, and mobility into a new paradigm for
language
teaching and learning? (See a history of wearable computers and
examples of
wearables at MIThril,
http://www.media.mit.edu/wearables/lizzy.index.html;
wearables are already used in industry, for example to keep a
hands-free
diagram of airplane wiring before the technician's eye in a
head-mounted display
similar to eyeglasses.)
I have already mentioned the value of the Internet in the communicative/cognitive paradigm. But
further, we
must as culturally sensitive teachers keep in mind the significance of
"glocalization," that is, being both local and global at the same
time. The English language is still dominant on the Internet, followed
closely
by Spanish. This replicates a similar dominance in print media. No
doubt our
jobs as English teachers will be secure through much of this century,
but that
dominance may eventually change. How will CALL prepare the citizens of
a world
culture for multilingualism on a grand scale while preserving the
uniqueness
and worth of the many sometimes tiny cultures that drink from and
contribute to
that fast-flowing stream of information? As distance learning,
especially
Internet-based education, becomes the dominant mode, technology-using
teachers
have a responsibility to look ahead and plan for that eventuality. One
important aspect of such planning would be the creation of standards
for
technology-based distance learning. Virtually every university, public
and
private, charlatan and genuine, is putting courses and whole degrees on
the
Web. How will students know which to choose?
Finally, regardless
of where
an individual teacher or program stands on the communicative/cognitive
spectrum, technology has become an environment for learning language.
The
implications of a technology-enhanced environment are quickly realized
by
teachers and students: once they have technology, there is no going
back to
unadulterated chalkboards and lined theme paper. Teachers cannot afford
to be
the "sage on the stage" (or the drone on the throne), when any
student can seek information, communicate with peers and experts, and
control
learning individually. The old debate over tutor vs. tool or master vs.
slave
takes on different shades of meaning when the role of the teacher
itself has
changed from instructor/task master to guide/mentor. The computer is no
longer
master, but neither is it simply a tool, for it changes what learning
is, as
the printing press changed learning and culture in the late Middle Ages
of
Europe. The debate is no longer over whether to use CALL, but only how
best to
do so. (See also my article on the "quiet revolution," Hanson-Smith
1999.)
While I have raised, rather than answered, the question of where CALL will take TESOL, I trust you
will
have many more questions--and answers--to share with us during this
online
course, and at the CALL IS Academic Session. I look forward to
exploring these
issues further with you and our presenters.
References
Bitzer, D. (1960). PLATO [Computer software]. Urbana, IL: University of Illinois, Urbana-Champaign.
Blair, R. W. (Ed.). (1982). Innovative approaches to language teaching. Rowley, MA: Newbury House.
Buell, J. (1996-97). Constructing education: Computers and the transformation of learning. CAELL Journal, 7(3), 3-7.
Cantoni-Harvey, G. (1987). Content-area language instruction: Approaches and strategies. Reading, MA: Addison-Wesley.
Chamot, A. U., & O'Malley, M. (1996). The Cognitive Academic Language Learning Approach: A model for linguistically diverse classrooms. Elementary School Journal, 96(3), 259-273.
Cummins, J., & Sayers, D. (1995). Brave new schools: Challenging cultural illiteracy through global learning networks. New York: St. Martin's Press.
Dewey, J. (1938). Experience and education. New York: Macmillan.
Hanson-Smith, E. (1997). Multimedia projects for EFL/ESL students. CAELL Journal, 7(4), 3-12.
Hanson-Smith, E. (1999, March/April). CALL environments: The quiet revolution. ESL Magazine, pp. 8-12.
Higgins, J., & Johns, T. (1984). Computers in language learning. London: Collins ELT and Addison-Wesley.
Holliday, L. (1999). Theory and research: Input, interaction and CALL. In J. Egbert & E. Hanson-Smith (Eds.), CALL environments: Research, practice, and critical issues (pp. 181-188). Alexandria, VA: TESOL.
Hulstijn, J. H. (2000, January). The use of computer technology in experimental studies of second language acquisition: A survey of some techniques and some ongoing studies. Language Learning & Technology, 3(2), 32-43. Available online: http://llt.msu.edu/vol3num2/hulstijn/index.html.
Krashen, S. (1982). Principles and practice in second language acquisition. Oxford: Pergamon Press.
MIThril: The MIT Wearable Computing Web Page. (2000, December 13). MIT Media Lab. Retrieved December 3, 2001, from the World Wide Web at http://wearables.www.media.mit.edu/lizzy/index.html.
Meskill, C. (1999). Conclusion: 20 minutes into the future. In J. Egbert & E. Hanson-Smith (Eds.), CALL environments: Research, practice, and critical issues (pp. 459-469). Alexandria, VA: TESOL.
Mars Exploration. (2001, November 13). NASA. Retrieved December 3, 2001, from the World Wide Web at http://mars.jpl.nasa.gov/odyssey/index.html.
Nunan, D. (1989). Designing tasks for the communicative classroom. Cambridge: Cambridge University Press.
Nunan, D. (1995). Closing the gap between learning and instruction. TESOL Quarterly, 29, 33-58.
Peyton, J. K. (2000). Immersed in writing: Networked composition at Kendall Demonstration Elementary School. In E. Hanson-Smith (Ed.), Technology-enhanced learning environments (pp. 99-110).
Escape from Planet Arizona: An EF Multimedia Language Game [Software]. (1995). Stockholm, Sweden: EF Education.
Project Gutenberg: Fine Literature Digitally Republished. (2001, May 28). Retrieved December 3, 2001, from the World Wide Web at http://promo.net/pg/.
Swain, M. (1985). Communicative competence: Some roles of comprehensible input and comprehensible output in its development. In S. M. Gass & C. G. Madden (Eds.), Input in second language acquisition (pp. 235-253). Rowley, MA: Newbury House.
Swain, M., & Lapkin, S. (1995). Problems in output and the cognitive processes they generate: A step towards second language learning. Applied Linguistics, 16(3), 371-391.
TESOL/CELIA '96 [Software]. (1996). TESOL and LaTrobe University, Melbourne, Australia. Alexandria, VA: TESOL. Available on the World Wide Web at http://www.celia.edu.