Educational technology is commonly associated with computing. Yet many other educational technologies play an important role in education, some dating as far back as antiquity. This entry explores that history as the context for a larger discussion of the revolution in education created by the computer and the Internet.
The term tabula rasa, now used as a metaphor for the idea that humans are born with no built-in knowledge of the world, is Latin for “scraped tablet.” Although the term is commonly rendered as “blank slate,” it actually refers to the smooth wax tablet on which Roman schoolchildren inscribed their lessons. It was an educational technology.
Historically, the most important educational technology has been the book. Books became practical when paper was introduced into Europe from China during the medieval period. When books were written by hand, they were expensive, cumbersome, and often inaccurate. The introduction of movable type in the mid-fifteenth century changed the nature of the book as a cultural and technological phenomenon.
What is commonly described as the Gutenberg Revolution (c. 1450) made possible the distribution of relatively inexpensive and highly accurate texts. Its creator, Johannes Gutenberg (c. 1398–1468), was a goldsmith from Mainz, Germany. His invention of the printed book represented the merging of several independent ideas, including movable type cast from metal, oil-based ink, and a wooden press for creating an impression.
The Impact Of Printing
The effect of the introduction of printing on education in Europe was profound. Essentially, it redefined educational discourse at all levels. Knowledge was no longer limited to what could be held in memory or recorded in cumbersome handwritten documents. It was now possible to distribute relatively inexpensive and highly accurate texts on almost any subject. These texts could include elaborate illustrations, diagrams, charts, and tables that would have been much more difficult, if not impossible, to reproduce by hand. In addition, as a technology, typographic sources lent themselves to careful editing and proofing. Unlike a hand-copied text, every copy of which had to be proofed separately, a single master text could be proofed once and then reproduced (printed) on an unlimited basis.
Within a generation, European universities abandoned much of their emphasis on oral approaches to instruction that emphasized memorization and recitation. Their curricula increasingly depended on textually based models. Books became means for extending memory, as well as precision in thought. During the early Renaissance, humanist educators sought to establish definitive editions of important classical texts, such as those from Plato and Aristotle, as well as those from a host of lesser luminaries.
Many believe that the introduction of inexpensive printing made possible the Protestant Reformation, as well as many of the advances that came in fields such as medicine, cartography, architecture, and physics during the Renaissance and early modern era. How effective would Luther have been if he had not been able to circulate his ideas through broadsheets and inexpensive books? Would the scientific advances in medicine and cartography during the Renaissance have been possible without the technology of the book? Would Nicolaus Copernicus’s or Isaac Newton’s works have been possible without the resources provided by printing?
Mass-produced illustrations probably first appeared in illustrated and hieroglyphic Bibles. Luther’s early Bible, for example, was illustrated. In addition, Luther and others were responsible for the development of the first catechisms. These religious primers became the basis for the development of secular primers during the sixteenth century.
By the middle of the seventeenth century, new ways of using illustrations began to redefine educational texts. Perhaps the most important example was the work of the Czech educator, John Amos Comenius (1592–1670). In 1658, he created the Orbis Sensualium Pictus, which many people consider to be the first illustrated textbook for children.
In the Orbis Pictus, Comenius included detailed illustrations that provided a visual context for printed material included in his text. This material, which was printed in Latin as well as German, represented an astonishing pedagogical advance, one that was based almost entirely on the technology of the book.
Comenius’s influence can be seen in primers and textbooks from the late seventeenth and early eighteenth centuries. The New England Primer, first published around 1690, is clearly modeled on the Orbis Pictus in its use of illustrations. Textbooks became increasingly widespread by the middle of the eighteenth century. They created a pedagogical discourse that emphasized the written over the spoken word. By the time of the American Revolution, textbooks had become the single most important formal source of curriculum in the schools. As a technology, textbooks created a uniform curriculum across schools. It is no accident that works such as Noah Webster’s three-volume A Grammatical Institute of the English Language (1783–1785), the first volume of which was the famous “Blue-Back Speller,” emphasized not only basic literacy but ideological and political knowledge as well. Webster was interested in creating educated Americans, members of a new and revolutionary culture, not loyal British subjects. His texts emphasized English as it was spoken in North America, not in Great Britain.
Blocks And Blackboards
Although textbooks remained the single most important technology used in schools during the eighteenth and nineteenth centuries, other technologies gradually made their way into the classroom. As early as 1693, John Locke (1632–1704) wrote in “Some Thoughts Concerning Education” about the use of alphabet blocks in the education of children.
By the middle of the nineteenth century, educators such as Friedrich Froebel (1782–1852) were developing school curricula around the use of pedagogical devices ranging from construction blocks to weaving activities and even an early type of Tinkertoy® construction system. Similar types of learning devices were introduced by the Italian physician and educator Maria Montessori (1870–1952).
Perhaps the most enduring educational technology of the nineteenth century was the blackboard. Larry Cuban has noted that its persistence in the classroom is indicative of its dependability and functionality. Early blackboards were made from pine covered with a carbon and egg-white writing surface. Slate, introduced later, was replaced in the twentieth century by easier-to-read green boards and eventually by white boards, which are plastic surfaces on which erasable, felt-tipped pens are used.
By the end of the nineteenth century, educators were experimenting with other new technologies, such as the phonograph. At about the same time, lantern slides began to be widely used. Pioneers in audiovisual education, such as the St. Louis public schools, made slide sets available for classroom instruction as early as 1905. In 1911, Thomas Alva Edison (1847–1931) released a series of silent historical films that he hoped would be used in classrooms. By the 1930s, educational radio broadcasts were also being used in classrooms. It was at this time that the filmstrip was introduced, appearing for the first time at the Century of Progress Exposition in Chicago in 1933.
In general, despite wide experimentation with technology, most innovations have had far less impact in the classroom than was originally anticipated. During the 1950s and 1960s, educational television and videotape were thought to have the potential to revolutionize education. But as Larry Cuban has pointed out, educational technologies such as radio, film, and television actually have had only a marginal impact on classroom instruction.
Why is this the case? One possible explanation is that teaching is far more a social process than a technical one. What may count the most in the classroom is the interaction that takes place between teachers and students. Rather than teachers simply being conduits for knowledge, and students receivers of information, a much more complex process of social interaction is at work. Technologies, whether in the form of radio, film, or television, are of relatively little importance in the curriculum as compared to what teachers do in the classroom. Essentially, technologies can be useful to skilled educators, but they have not fundamentally changed the nature of most teaching and learning.
The Computer Revolution
This is not necessarily the case, however, with the most recent educational technology—the computer. Computing devices are as ancient as the abacus, but modern computing as we know it has existed only since the end of World War II. At that time, machines such as the ENIAC (Electronic Numerical Integrator and Computer) were developed at the Moore School of Electrical Engineering at the University of Pennsylvania. An extremely limited machine compared to modern computers, the ENIAC was designed to calculate artillery firing tables for the military.
As computers became increasingly widespread in government, business, and higher education during the 1950s and 1960s, their use was largely limited to mathematical calculations and business functions. Because of their prohibitively high cost, their use in education was almost entirely limited to specialized areas such as statistics and the sciences. This began to change in the late 1970s with the introduction of relatively inexpensive microcomputers, or, as they eventually came to be known (as they increased in power), desktop computers.
The first really practical desktop computers were introduced in the 1970s by Apple Computer, founded in 1976. Its first machine, the Apple I, was a personal computer kit that had to be assembled by the owner. The Apple II, introduced in April 1977, was the first really practical desktop machine. Because of its size and relatively low cost, it made computing a possibility for private individuals, as well as various types of educational institutions. By the early 1980s, rivals such as IBM, whose personal computers ran Microsoft software, began to compete with Apple for domination of the personal computer market.
Desktop computing, as with earlier educational technologies, was considered by many educators to have the potential to revolutionize the classroom. Eugene F. Provenzo, Jr., in his book Beyond the Gutenberg Galaxy: Microcomputers and the Emergence of Post-Typographic Culture (1986), postulated that the microcomputer revolution of the 1980s functioned in many respects like the Gutenberg Revolution nearly 500 years earlier. As was the case with the spread of the printed book in the fifteenth and sixteenth centuries, the widespread availability of the computer created a new set of users who suddenly had access to new types of information and tools and, as a result, were correspondingly empowered. The new technology likewise created new ways of learning, as well as radical innovations in representing and manipulating scientific and humanistic knowledge.
Much of this, as was the case with the revolution in printing, is now taken for granted. Research and the acquisition of knowledge have changed dramatically in the past twenty-five to thirty years as a result of the widespread introduction and use of computing. The library card catalog is essentially obsolete, no longer even available in most libraries. Library catalogs are now largely electronic, as are many of the collections to which they are linked. Students and teachers at all levels now have access to information and data through computers that a generation ago could be reached only by specialized scholars working in archives and university collections. Anyone in the world can access the great libraries of the world and, increasingly, large parts of their collections. The Internet search engine Google is rapidly evolving into a combination of world library, encyclopedia, and database. As most of the world’s essential knowledge becomes available online, more traditional knowledge systems are becoming obsolete. For example, what is the relevance of a traditional text-based encyclopedia such as Encyclopedia Britannica when much of the same information is available online in forms such as Wikipedia? What happens to traditional intellectual authorities such as Britannica, which seeks out the best available experts on a subject, when they are challenged by an “open source” reference work in which volunteers constantly edit, revise, and update articles, as is the case with Wikipedia?
The Power Of The Internet
Why is the computer such a potentially powerful educational technology? The question is more complicated than it may at first seem. At least three revolutions in computing have taken place since the end of World War II, and all of them have affected education, some more than others. Mainframe computing, which lasted from the mid-1940s to the late 1970s, has already been discussed. To a large degree, its impact was confined to science and government, with a limited impact on universities and colleges. The introduction of desktop computing during the late 1970s made computing available on a practical basis for the first time in K–12 schools.
With the introduction of the Internet and World Wide Web in the early 1990s, educational computing went through another major evolution, one that made unprecedented access to information and communication resources possible for the general public. The Internet originated in 1969 when the U.S. Department of Defense’s Advanced Research Projects Agency established an experimental online communication system known as the Advanced Research Projects Agency Network (ARPANET). A primary concept behind the ARPANET was to develop a distributed network of computers that could survive a nuclear attack. Evolving rapidly, the system was opened to university researchers through the National Science Foundation during the 1980s and subsequently became accessible to the general public.
Initial access to the Internet was limited by the complexity of its design and protocols. In December 1990, however, a major innovation occurred when Tim Berners-Lee, working at CERN, the European particle physics laboratory near Geneva, Switzerland, developed a hypertext-based system that made it far easier to publish and find material on the Internet. Known as the World Wide Web, the system uses hypertext markup language (HTML), which allows hyperlinks to connect text, visual, and sound files. Today, approximately 1 billion of the 6 billion people in the world make regular use of the Internet. Use varies widely, however, based on geographic location and level of economic development. In North America, for example, Internet use is nearly 70 percent, whereas in Africa it is only 3 percent.
In industrialized countries such as the United States, computer access is becoming sufficiently widespread that its use in education is taken for granted. Even though the training teachers receive in using the technology varies widely, and resources are by no means equal, the availability of computers as an educational technology in the schools is becoming nearly universal. Accessibility, however, does not guarantee equal use. Serious issues underlie how computers are used in different settings. At the elementary level, boys are often allowed greater access to computers than girls. “Skill and drill” programs are often emphasized with minority students and students from lower socioeconomic groups, whereas more creative uses are emphasized with other students.
A New Educational Environment
How do the desktop computing revolution that began in the late 1970s and the Internet revolution of the early 1990s combine to create a new environment for teaching and learning? To begin with, teachers and students are no longer limited by their geographical location. Access to information and communication is potentially available on a worldwide basis. Individuals can communicate via e-mail, exchange data, and access both formal and informal informational networks. As television in an earlier generation provided viewers with a “window on the world,” computers connected to the Internet and World Wide Web provide users with access to knowledge and communication on a global basis.
In addition, educational computing has the potential to augment the intelligence of users. The concept of the computer augmenting intelligence comes from the work of Douglas C. Engelbart. In the early 1960s, when Engelbart was an engineer at the Stanford Research Institute (SRI) in Menlo Park, California, he published a series of seminal essays on computing. In these works, he outlined for the first time the basic principles of word processing and the use of screen icons, as well as concepts such as the computer mouse and digital scanning.
In a 1963 essay, Engelbart also theorized that human beings could augment, or enhance, their intelligence by using a computer. This concept of augmentation was similar to the idea of using a mechanical device such as eyeglasses to improve one’s vision, or a tool such as a pair of pliers to increase one’s grip. The idea of a tool for augmenting one’s intellect was not necessarily new, although Engelbart was the first to clearly articulate the concept. The printed book, for example, is an intellectual augmentation device because it expands the user’s knowledge base as well as the user’s ability to precisely recall information and data.
Computer augmentation of our intelligence occurs all the time. Handheld calculators are, in fact, computers that allow us to add, subtract, multiply, and extract square roots with a speed and accuracy that would be impossible in our heads and difficult with pencil and paper. Grammar and spell-check systems, now common in word processing programs, are intellectual augmentation devices. Using one makes it possible for a relatively weak grammarian or speller to produce a written document that is significantly better than he or she could produce simply by writing with pen and paper or using a typewriter.
Computers can also augment our physical selves. In the case of children with special needs, adaptive computer technologies make it possible for them to have access to information and ideas that would not otherwise be available. A vision-impaired child can use a computer that magnifies text on a screen, or that reads text aloud. A paralyzed child can use eye movement that is tracked by the computer to manipulate a keyboard, making it possible to type or search the Internet.
Robert Taylor, in an early work on educational computing, identified multiple ways in which computers are used in classroom instruction—the idea of the computer as Tool, Tutor, and Tutee. In the Tool function, a computer is used to do something, such as draw a picture, write a sentence, or multiply a number. The Tutor function involves the learner being taught something by the computer, such as a math program that teaches and reinforces multiplication and division skills. The third function, Tutee, involves the user programming the computer to do something. This last function is relatively rare compared to the other two functions.
Taylor’s model is useful in that it emphasizes the idea that not all educational computing functions are the same, and that they imply different levels of involvement and control on the part of the student. The Tool function, for example, places the student largely in control, whereas the Tutee function emphasizes the learner being acted upon by the computer program.
Other important distinctions in educational computing have been raised by Cleborne Maddux. In a widely cited article from the mid-1980s, Maddux distinguishes between Type I and Type II uses of computers. A Type I use employs the computer to do something that has always been done in education, only more efficiently, such as word processing in place of typing. A Type II use represents a totally new use of the computer, such as a simulation program that allows a student to practice emergency procedures for flying a disabled airliner. Prior to computers, such an activity would have been impossible.
More Than A Tool
Using a computer to learn raises a number of interesting questions. If a student uses a computer to draw, is he or she actually learning to draw, or simply learning how to draw with a computer? In this context, it can be argued that the computer is simply a tool, just as a pencil is a tool that allows an artist to create. Yet as with any medium (chalk, ink, oils, acrylics, or charcoal), the medium used changes the nature of what is created. This is certainly the case with the computer. No medium is neutral; each represents a particular way of creating and constructing the world.
According to the German philosopher Martin Heidegger (1889–1976), any technology amplifies or reduces the experience that it mediates. Thus, a telephone amplifies the spoken word while virtually eliminating the visual in its use. Television and film emphasize the visual. Educators must consider what it is that the computer mediates during the learning process. What does it amplify? What does it reduce? As mentioned earlier, calculators are used because they accurately add, subtract, multiply, and divide. Is being able to accurately add, subtract, multiply, and divide so important to learn if students have access to calculators? How well do students need to be able to spell if they have access to a spell-checker? How much time needs to be spent on grammar in a writing course if students can use a grammar checker? Educators have to decide what role the technology represented by the computer is to play in the teaching and learning that goes on in a classroom.
How does the fundamental character of the classroom change as a result of the computer? This is a basic question raised by a number of educational theorists. C. A. Bowers, in his book Let Them Eat Data (2000), for example, argues that we need to understand computers in terms of how they mediate and change the educational environment of classrooms and schools. Like Heidegger, Bowers believes that the computer is not a neutral technology, but one that profoundly shapes our way of knowing and understanding the world. Bowers argues that computer use reinforces certain “root” metaphors that privilege specific ways of looking at culture and nature. Knowledge, for example, becomes data driven rather than based on intuition or faith.
Theorists such as the French philosopher of hypertext Pierre Lévy argue that as a result of the development of inexpensive computers and the Internet, we have entered a new knowledge space that he refers to as the cosmopedia. According to Lévy, the computer makes possible the development of a shared or “collective intelligence.” The cosmopedia links people together in vast networks that go beyond traditional text and static images to include video, sound, interactive simulation, interactive maps, expert systems, dynamic ideographs, virtual reality, artificial life, and so on. Lévy believes that the cosmopedia can break down the artificial boundaries between the disciplines, making it possible to fold almost any field into another. Traditional ways of knowing and learning are thus profoundly challenged, and new types of intellectual authority may emerge: an individual can establish a blog or other type of Web site that allows his or her ideas to be heard in ways that were previously not possible. According to Lévy, we are standing on new cultural ground as our intellectual traditions mutate into something very different from those experienced by previous generations.
Issues In Social Foundations
Computers in the classroom raise fundamental questions that are traditionally related to the social foundations of education: (a) Who has access to computers? Are computers used in the same way across different socioeconomic, racial, cultural, and geographical groups? What type of training is given with that access—drill versus skill? (b) What assumptions underlie certain models of programming? What types of interfaces are used? Do certain interfaces privilege or favor one group over another? (c) How is the process of writing changed when one learns to do it using a computer rather than a pen and paper? (d) How is the process of researching a topic changed when students have access to the Internet and the World Wide Web? (e) What is the appropriate use of computers?
The use of the computer can shape, in critical ways, how students think and approach the tasks of learning. Elementary and secondary school children now have access to information resources that were previously available only to college and university students a generation ago. How does this affect something as traditional as writing a term paper? Although students have greater access to sources of useful information, they also have a greater potential to plagiarize written material. Teachers now find it increasingly necessary to do global online searches of sections of their students’ work to see if it has been copied.
Computers also have an effect on social interactions between teachers and students. At the elementary and secondary level, teachers can post homework assignments to a Web site, which means that parents can monitor the work that their children are expected to complete. At the university level, office hours have become less important as a means by which students communicate with professors. Contact with students increasingly takes place online. Although convenient, this process changes the fundamental nature of interaction between the student and the professor. One is not likely to ask students via e-mail about their family background, their study habits, or their personal lives as one might during an informal chat in office hours. Much may be lost as a result.
The process of teaching and learning in schools is undergoing a profound redefinition. For some, this process, much like the Gutenberg Revolution, represents a “singularity.” The concept of a singularity is drawn from the work of science writer Vernor Vinge and describes an event that is so profoundly important that the world is redefined as a result. The invention of movable type was a singularity, as was the creation and explosion of the first atomic bomb. The computer revolution that began with the development of mainframe machines and continued with inexpensive desktop computing and the widespread implementation and use of the Internet and World Wide Web is almost certainly another example of a singularity, or possibly multiple singularities.
The emergence of a singularity is a relatively rare event, although, as technology continues to evolve rapidly, it may create a situation in which singularities are more common. The Human Genome Project, for example, which is already revolutionizing fields such as biology and medicine, is almost certainly a singularity, as is the related technology of genetic cloning.
To a significant degree, computers and their increasingly widespread use in our culture and educational system change the nature of educational discourse. These changes are not neutral, but represent very specific ways of knowing and understanding the world. Being aware of this process is fundamental to understanding what can and cannot be accomplished in the teaching and learning process in our schools. As such, the question of technology and its impact on education becomes one of the most critical issues that all educators must address.
- Bowers, C. A. (1988). The cultural dimensions of educational computing: Understanding the non-neutrality of technology. New York: Teachers College Press.
- Bowers, C. A. (2000). Let them eat data: How computers affect education, cultural diversity, and the prospects of ecological sustainability. Athens: University of Georgia Press.
- Cuban, L. (1986). Teachers and machines: The classroom use of technology since 1920. New York: Teachers College Press.
- Cuban, L. (2001). Oversold and underused: The cost of educational computing. Cambridge, MA: Harvard University Press.
- Eisenstein, E. (1983). The printing revolution in early modern Europe. New York: Cambridge University Press.
- Engelbart, D. (1963). A conceptual framework for the augmentation of man’s intellect. In P. W. Howerton & D. C. Weeks (Eds.), Vistas in information handling: Vol. 1: The augmentation of man’s intellect by machine (pp. 1–29). Washington, DC: Spartan.
- Engelbart, D. C. (1986, January). Workstation history and the augmented knowledge workshop. Paper presented at the ACM Conference on the History of Personal Workstations, Palo Alto, CA.
- Engelbart, D. C., & English, W. K. (1968, December). A research center for augmenting human intellect. In AFIPS Conference Proceedings of the 1968 Fall Joint Computer Conference, San Francisco.
- Febvre, L., & Martin, H. J. (1976). The coming of the book: The impact of printing, 1450–1800 (D. Gerard, Trans.). London: NLB.
- Maddux, C. D. (1986). Issues and concerns in special education microcomputing. Computers in the Schools, 3(3–4), 3.
- McLuhan, M. (1962). The Gutenberg galaxy: The making of typographic man. Toronto: University of Toronto Press.
- Papert, S. (1993). The children’s machine: Rethinking school in the age of the computer. New York: Free Press.
- Provenzo, E. F., Jr. (1986). Beyond the Gutenberg galaxy: Microcomputers and post-typographic culture. New York: Teachers College Press.
- Taylor, R. (Ed.). (1980). The computer in the school: Tutor, tool, tutee. New York: Teachers College Press.
- Vinge, V. (1987). True names . . . and other dangers. New York: Baen Books.
- Vinge, V. (1989). Hurtling towards the singularity [interview with Michael Synergy]. Mondo 2000, p. 116.