Computer Chronicles Revisited 83 — The Language Experience Program and PALS


Dr. Edward Fry, then a professor at Rutgers University in New Jersey, said in a 1987 report that young school children should be taught keyboard typing as opposed to cursive writing. This was a fairly radical notion for 1987 as computers were not yet commonplace in the home. And there was a feeling among many educators at the time that learning to write cursive was an essential step in promoting literacy itself.

This next Computer Chronicles episode, also from 1987, dealt with the subject of adult illiteracy and how computers were being used to address it. Stewart Cheifet opened the episode by showing this week’s co-host, Jan Lewis, a pair of job application forms, for a waiter and a gas station service attendant respectively. Cheifet noted that he and Lewis were both “fairly well educated people,” yet they were having trouble figuring out how to fill out all of the blanks on these forms. So imagine what it must be like if you were functionally illiterate.

Cheifet said computers seemed to be doing well in teaching adults to read even though traditional computer-aided instruction in schools was not working too well. Lewis said the difference was in the implementation at the adult level versus at the elementary school level. The theory–that somebody could learn better if they could learn at their own pace and with immediate feedback–still held at both levels.

California Libraries Used Apple II Software to Teach Adults Reading

Wendy Woods presented her first remote segment, which focused on an adult literacy program used by the Richmond Public Library in Richmond, California. Woods said that California was at the forefront of the literacy movement, with $4 million given to libraries annually to teach adults how to read. And of all the participating libraries, Richmond was considered a leader in the use of computers to teach literacy.

Woods said computers were, in fact, a key element of the Richmond program, known as L.E.A.P. Sharon Pastori, the director of the program, told Woods the computer was non-judgmental and private. It didn’t tell a user it was tired of working on a problem. Repetition was often important in learning how to read, and even with an understanding tutor, the student could feel embarrassed about having to constantly go over the same thing. But that wasn’t an issue with the computer.

Students in the L.E.A.P. program ranged from those who couldn’t read at all to those who simply needed to expand their spelling skills or reading comprehension, Woods explained. That wide range of needs initially prompted months of research, during which Richmond staffers evaluated 20 different software programs to find the ones they would use regularly. But deciding which programs to use was no easy task, since most literacy software was designed for children learning how to read and write. And the Richmond staff needed to find software that wouldn’t “talk down” to adults.

Susan McAllister, a L.E.A.P. staff member, said that at first it was very hard. They picked a lot of different things and figured they would experiment and find out what worked. There was a lot of software now available that targeted remedial reading at the high school and young adult level, which worked okay. McAllister added there was a growing market for adult schools and community colleges to use computers in adult literacy programs, especially with the speech capabilities of many computers, which could help people learn to read at a basic level. But she said the market was still wide open and there was room for other products to enter this area.

Still, Woods said, most students reported that they were pleased with the software and that the computers had helped them. One student, Mona Lisa Bean, told Woods that the computer made things easier because there were games that made it a more interesting challenge as opposed to just pure memorization.

Woods said the next addition to the L.E.A.P. program, which currently used Apple II computers, would be a Macintosh. Staffers said they wanted to see how the Macintosh’s mouse-based interface could aid in the teaching of reading.

Literacy for the People

John Fleischman and Jane Nissen Laidley joined Cheifet and Lewis in the San Mateo studio for the next segment. Fleischman was the administrator of media services with the Los Angeles County jail’s correctional education program. Laidley was the executive director of the non-profit People’s Computer Company (PCC).

Lewis opened by asking Fleischman to define “functional illiteracy.” Fleischman said it was basically a person’s inability to read and write well enough to function effectively in everyday life. There were very few true “illiterates” anymore. Most people could read and write their name. So the new tag that was used was “functional illiteracy.”

Lewis asked how the computer helped with functional illiteracy. Fleischman said the computer was an infinitely patient tool. Many adults did not like to admit to another person that they couldn’t read; the computer didn’t know the difference.

Cheifet turned to Laidley and asked what PCC did in the area of illiteracy. Laidley said PCC was a consultant for the Richmond Public Library’s program that Wendy Woods featured in the prior segment. She said the goal was to use the computer to teach reading and writing skills. The computer worked in tandem with both the student and a tutor.

Following up, Cheifet asked if there had been any resistance to using computers. Laidley said that based on the feedback from Richmond, there was more resistance from the tutors than from the students. But she felt that was primarily because the tutors had lived and taught for 50 to 60 years without a computer. They had not expected to have to learn how to use a computer when they volunteered to teach adults how to read.

Lewis asked about the balance between the use of the computer and the human tutor to meet a particular student’s needs. Laidley said PCC recommended to Richmond that the tutor and the student stay together for a number of hours in the beginning. What surprised Laidley, however, was that the students moved toward using the computer faster than PCC’s course materials had anticipated, so PCC had to go back and rewrite those materials.

Cheifet said this surprised him given that the stereotype was that an adult was less likely to feel comfortable in front of a computer keyboard than a child. Laidley said the “learner” was more comfortable. The student was generally more eager than the instructor. Many adults who dealt with illiteracy were especially excited because they had believed they would never have access to technology like a computer.

Cheifet turned back to Fleischman, who dealt with adult jail inmates. How did computers work in that environment? Fleischman said they were “lined up two-deep to use the computers.” The “time on task” was incredible, as his students were completely focused when using the machines.

Pushing back a bit, Cheifet asked if there was any hard evidence to suggest computers actually worked in teaching adult literacy. Laidley said PCC did a year-long study with the Richmond program, which she believed was the first in California and one of only a few done in the United States. Obviously, she would have liked to see more studies done, but such research rarely got funded. She hoped that people would look at PCC’s study and use it as a jumping-off point for their own research.

Penn State Program Used Speech Synthesis to Help Teach Reading

Stewart Cheifet presented the next remote segment, which focused on a “talking computer” program developed at Penn State University. Cheifet narrated some B-roll footage taken at Northwest Elementary School in Lebanon, Pennsylvania. While most of the students here were kids, lately adults had been taking instruction in how to read, Cheifet said.

For example, Deborah Swoyer was a mother of two school-age children. She was a high school graduate but could not read as well as her daughter, who was in the fourth grade. Swoyer said that made it difficult to help her child with her homework. Cheifet said Swoyer was now working with an Apple IIe-based literacy training course that used an ECHO GP voice synthesizer so that the computer could speak to the student. The software was written in SuperPILOT, an extended version of Apple PILOT.

One problem with computer-based literacy courses, Cheifet noted, was that they required expensive computers like the Apple II. But Dr. Eunice Askov of Penn State, who was in charge of the program used at the Lebanon school, said she hoped that more businesses and industries would open up the use of their computer facilities after-hours to adult literacy programs.

Cheifet said the Penn State software was unique in that it taught reading by showing students word configurations, based on research indicating that adults recognized the “shape” of words more than their phonetic sounds. But the computer did more than teach reading. Dr. Elizabeth Wile, who worked with the adult literacy program at the Lebanon school, told Cheifet it also helped restore an adult’s sense of self-esteem.
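
To picture what a word “configuration” might look like, here is a minimal Python sketch of the general idea, classifying each letter of a word by whether it rises above the x-height, dips below the baseline, or sits in between. To be clear, this is only an illustration of the word-shape concept; the actual Penn State courseware ran on an Apple IIe in SuperPILOT, and its real method is not documented in the episode.

    # Illustrative sketch only: encode a word's visual "shape" by letter class.
    # This is NOT the Penn State program's actual algorithm.
    ASCENDERS = set("bdfhklt")   # letters that rise above the x-height
    DESCENDERS = set("gjpqy")    # letters that dip below the baseline

    def word_shape(word: str) -> str:
        """Map each letter to A (ascender), D (descender), or x (x-height)."""
        classes = []
        for ch in word.lower():
            if ch in ASCENDERS:
                classes.append("A")
            elif ch in DESCENDERS:
                classes.append("D")
            else:
                classes.append("x")
        return "".join(classes)

    if __name__ == "__main__":
        for w in ("reading", "apply", "shape"):
            print(f"{w:10s} -> {word_shape(w)}")   # e.g. "reading" -> "xxxAxxD"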

Cheifet added that when Deborah Swoyer and her tutor, Dolores Meck, first started using the computer courseware, both were scared and afraid they couldn’t do it. Now, they both loved the computer. Meck told Cheifet she was a “computer convert.”

Taking a LEAP Into Adult Literacy

Back in the studio with John Fleischman and Jane Nissen Laidley, Lewis asked Fleischman to demonstrate the software he used to teach literacy to adult inmates at the Los Angeles County jail. Fleischman said the software was called the Language Experience Program I (LEAP I), which was recently published by Educational Activities, Inc. (It’s not clear to me if this was the same L.E.A.P. discussed in the earlier Richmond library segment.) LEAP I used a person’s existing skills, such as the knowledge of their own family and friends, and channeled that into a reading lesson. It was like a next-generation word processor used as a reading instruction program.

Fleischman ran a demonstration of LEAP I on an Apple IIe. The main screen asked the user to select from a list of topics they could write about. Fleischman asked Lewis to select one. She selected “JOB.” The software then prompted her with questions about her job. Fleischman said that in an actual classroom setting, he would sit down with the student to explain the process. Cheifet noted that there was a level of sophistication necessary to read the on-screen instructions. Was that a problem? Fleischman said it was a big problem for someone reading on a 1.5-grade level. So those students would need to do more work initially, or work with another student who was reading at a higher level.

Continuing the demo, LEAP I prompted Lewis to write three sentences about her job. Lewis entered some text. The software returned a message explaining that she did not write three sentences as instructed and took her back to the prior screen to add additional sentences. Fleischman said that when a student completed this assignment, they would take their printed output up to the teacher or tutor and review the materials together. Fleischman noted that in Lewis’ sample text she had misspelled a word. In a classroom setting, she would sit down and re-type that word and correct any grammatical errors. All of the student’s work was then recorded on a personal diskette. The program contained about 25 hours of total instruction.
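
For readers curious how such a prompt-and-check loop might be structured, here is a short sketch in modern Python that mimics the flow Fleischman demonstrated: pick a topic, write three sentences, and get sent back if the count comes up short. The topic list, prompts, and sentence-counting rule are all invented for illustration; the real LEAP I ran on an Apple IIe, and its internals are not shown in the episode.

    # Rough, hypothetical sketch of the LEAP I writing exercise flow.
    import re

    TOPICS = ["FAMILY", "JOB", "FRIENDS", "HOME"]   # hypothetical topic list

    def count_sentences(text: str) -> int:
        """Crude sentence count: split on runs of ., !, or ? and ignore empty pieces."""
        return len([s for s in re.split(r"[.!?]+", text) if s.strip()])

    def writing_exercise() -> str:
        """Ask for a topic, then insist on at least three sentences about it."""
        print("Choose a topic to write about:", ", ".join(TOPICS))
        topic = input("> ").strip().upper() or "JOB"
        while True:
            text = input(f"Write three sentences about your {topic}:\n> ")
            if count_sentences(text) >= 3:
                return text
            print("You did not write three sentences as instructed. Please add more.")

    if __name__ == "__main__":
        essay = writing_exercise()
        print("Print this out and review it with your tutor:")
        print(essay)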

Cheifet asked Laidley how using something like LEAP I was any different than just having a student write three sentences on a piece of paper and hand it to the instructor. Laidley said the difference was that an adult who had difficulty reading usually also had difficulty writing. Those were different skills. Handwriting required a specific set of motor skills, while using a computer meant the student only needed to know how to push a button. More to the point, the computer produced a printout that could be easily read by the instructor when giving feedback, so the student’s handwriting would not get in the way.

Ever the consultant, Lewis asked about the cost-benefit trade-off of using computers in this context. Laidley said PCC’s study didn’t focus on that issue. She suspected that in a volunteer program like the one at the Richmond library, computers were likely extremely cost effective because of the amount of use by the students on their own.

Would PALS Revolutionize the School System?

Since most of the focus to this point had been on Apple II-based software, it was only fair to give IBM some time as well. In this final segment, Wendy Woods reported on an adult literacy program in Flint, Michigan, being used to retrain workers in the automobile industry. Woods noted that Flint reflected many of the difficulties faced by auto manufacturers. And many laid-off workers needed help learning how to read.

Dr. David Spathelf, the superintendent of the Genesee School District, which included Flint, noted the city was in the midst of a transition from an industry that required heavy manual labor and fewer academic skills to a high-tech industrial community. Historically, people graduating from the public schools had been able to obtain good, high-paying jobs with minimal skills. But that was no longer the case.

Woods said Flint was the test center for a revolutionary approach to teaching literacy with computers, both for beginning readers and for those who had never learned to read. The program was called PALS, or Principle of the Alphabet Literacy System. PALS grew out of a program aimed at teaching reading and writing skills to kindergarten-age children. It was developed by Dr. John Henry Martin and licensed by IBM. Martin told Woods that learning to read was the single most difficult thing to learn in the English-speaking world, noting that the United States had a higher rate of failure at that level of learning than at anything else that occurred between the ages of 5 and 25.

Woods explained that the PALS adult classroom had laserdisc players, touch-sensitive computer screens, typewriters, and PCs. From the moment students arrived, they were immersed in computer technology. The textbook was on a laserdisc in the form of a story related in comic-book style. Students learned touch typing simultaneously and could proceed at their own pace. Martin said that coming to the computer without the intervention of an adult meant that students were in effect teaching themselves. The seeming “dehumanization” that some people decried as a technological intrusion proved to be quite therapeutic, he noted, adding that the real damage of illiteracy had already been done to these students by other people.

After about 20 weeks in the PALS program, Woods said, about 80 percent of students had improved their reading skills by an average of 2.5 years of schooling. The self-paced rate of instruction seemed to be an important part of that success. One student, Genetha Campbell, told Woods the software helped by breaking the words down into individual sounds. During the time Campbell was in school, she said a teacher would just “flop” a word on you, and either you learned it or you didn’t. In the PALS program, however, she could work at her own pace. Another student, Chris Sewell, said the PALS course taught him to read things that he had never read before. And as of today, he could now fill out a job application.

From the teacher’s point of view, Woods said, the central role of the PC might appear to be a threat. But the instructor still played a pivotal role. One PALS instructor, William Deacon, said that the use of computers gave him more time to circulate and talk to individual students about individual problems. The computer essentially did a lot of the “tutorial” work, freeing him up to concentrate on special cases and individualized problems.

Woods reiterated that PALS was adapted from Martin’s reading program for children called Writing to Read, which shared the same underlying concepts. Martin said that students had been invited to learn in “abysmally poor” ways. With Writing to Read and PALS, however, he brought to the forefront the intellectual structure of what a word was and made it apparent to the student that the physical representation of a word relied on a simple system called the alphabet.

Martin told Woods that it was still too early to tell if computer instruction indicated the need for a radical restructuring of teaching methods, but he did anticipate some major changes. Having lived his life in the world of the schools, he said the concept of a school as a series of boxes into which you put 25 to 30 children with one teacher all day, or among which you shuffled them between specialized teachers, was primed for radical change. Ultimately, he believed that technology would “blow those walls apart.”

Atari Filed Antitrust Lawsuit Against Nintendo Over “Lockout” Chip

Although this episode was originally broadcast in March 1987, the episode available from the Internet Archive was a rerun from December 1988. Cynthia Steele presented the “Random Access” segment for the rerun.

John Henry Martin (1916 - 1994)

Dr. John Henry Martin died in July 1994 at the age of 78. Born and raised in Paterson, New Jersey, Martin spent the early part of his career working in public school administration. In the 1950s he served as principal of a junior high school on Long Island and as superintendent of the Wayne, New Jersey, school district. He returned to Long Island in 1961, taking over as the superintendent of the Freeport schools.

In May 1964, Martin left Freeport for the Mount Vernon school district in New York’s Westchester County. His tenure there proved to be brief due to the ongoing fight against the racial desegregation of the Mount Vernon schools. The white, Roman Catholic-controlled school board opposed allowing Black children to attend schools with white children. Martin attempted to broker a “compromise” with the racists, which proved futile and led to death threats against him and his family. On September 13, 1966, Martin announced his resignation four days into the new school year. (The racists continued to fight desegregation in Mount Vernon for another 30 years.)

In 1973, Martin suffered a severe heart attack at his Long Island home due to complications from a viral infection. According to a 1988 Los Angeles Times profile, Martin was considered “dead on arrival” at the hospital, and even after doctors revived him, he was later “sent home to die.” But two years after this supposed death sentence, Martin recovered and decided to reenter the education field as a literacy consultant.

Martin officially teamed up with IBM in 1982 to distribute his Writing to Read program. A two-year test study of 10,000 children found that pupils following Martin’s program “outperformed comparison groups in many measures used, and equaled performance in other measures,” according to the Times. This led IBM to start marketing Writing to Read not just in the United States, but also in Canada, Bermuda, South Africa, and several Asian countries. IBM and Martin’s consulting firm, JHM, Inc., later marketed PALS as a follow-up adult version of Writing to Read in 1986. As of 1988, there were 350 PALS laboratories throughout the country, including one at the Los Angeles Times itself.

Martin, who moved to Florida after his 1973 heart attack, continued working up until his death in 1994. He passed away at his summer home in New Hampshire.

Jane Nissen Laidley (1942 - 2003)

Jane Nissen Laidley died in December 2003 at the age of 61. Born and raised in the area that later became known as California’s Silicon Valley, Laidley took over the People’s Computer Company (PCC) as its executive director in 1982. PCC grew out of an earlier project called the Portola Institute, which was started by Bob Albrecht, a former computer engineer with Honeywell and Control Data Corporation; and Dick Raymond, a former consultant with the Stanford Research Institute (an early Computer Chronicles sponsor).

Among the non-profit Portola Institute’s early projects was publishing Stewart Brand’s Whole Earth Catalog. (Brand later co-founded The WELL, an online service that was profiled in an earlier Computer Chronicles episode.) While at Portola, Albrecht met a high school teacher, LeRoy Finkel, with whom he co-founded a for-profit computer book publishing arm called Dymax. Dymax later moved to Menlo Park, California, where it established the People’s Computer Center, a publicly available time-share service that allowed anyone to access a DEC minicomputer.

In October 1972, Albrecht launched the People’s Computer Company as a newsletter that primarily featured listings of BASIC programs that anyone could use. PCC then became its own standalone nonprofit corporation. It launched other publications, notably Dr. Dobb’s Journal in 1976, whose first editor was original Chronicles host Jim Warren. By 1979, PCC employed 30 people–most of them part-timers–and had started working with local libraries in California to offer computer services.

When Laidley took over PCC in 1982, the organization began receiving substantial funding from the National Science Foundation to expand its offerings. In addition to the projects discussed in this episode, PCC created a number of model computer-literacy and training projects. PCC also designed Apple’s community affairs program and published a newsletter focused on adult literacy and technology.

While still working at PCC, Laidley earned a doctorate from the University of California, Berkeley, and founded a second educational nonprofit, JN4D, which focused on developing technology curricula for schools. (The name JN4D referred to the model of a 1918 plane that Laidley’s family owned and restored.) Unfortunately, it appears that both PCC and JN4D ceased active operations after Laidley unexpectedly died at her Berkeley home in 2003.

About That Atari Lawsuit Against Nintendo…

I normally would not go in-depth on a “Random Access” item in the main post, but the way that Cynthia Steele reported the Atari-Nintendo antitrust lawsuit merits some attention. Here is the exact copy that Steele read:

The bad news is that the Japanese company Nintendo has taken over the U.S. video game market. And to make the situation even worse, Nintendo game consoles ‘lock out’ game cartridges made by other manufacturers. So Atari is suing Nintendo, charging them with monopoly and restraint of competition.

This is a surprising degree of editorializing from Chronicles, especially in what was ostensibly a news segment. It basically reads like Atari propaganda, accepting at face value the implication that Nintendo must be doing something wrong. Which is ironic given how this litigation ultimately turned out.

First, it’s important to note that when Nintendo “took over” the U.S. home video game market, it was following a total collapse of that market. And the primary cause of that collapse was supply-chain mismanagement by the original Atari, Inc., which had been the market leader prior to the crash. As I discussed in prior posts, the crash led Warner Communications, which owned Atari as a subsidiary, to split the company up. The home video game console and home computer assets were sold to Jack Tramiel, who reorganized those businesses as Atari Corporation.

Now, this was not the Atari that filed the lawsuit that Steele referenced. Rather, the plaintiff in this case was Atari Games, Inc. This was the part of the original Atari, Inc., that Warner initially retained, which included the coin-operated arcade games portion of the business. In 1985, Warner formed a joint venture with the Japanese video game company Namco to run Atari Games. The president of Namco’s U.S. subsidiary, Hideyuki Nakajima, took over Atari Games and later led an employee buyout of part of Namco’s share. This effectively left Atari Games as an independent company, with Warner and Namco each owning 40 percent and the remaining 20 percent controlled by Nakajima.

So just to be clear, when Steele said Atari sued a Japanese company that “took over” the U.S. video game market, Atari Games itself was a company controlled by a Japanese executive and co-owned by a Japanese company. And since Steele never clarified this was Atari Games, I suspect many Chronicles viewers thought she was referencing the Tramiel-owned Atari Corporation, which was more commonly discussed on the program.

Moving on, as Steele said, there was a “lockout” chip in the Nintendo Entertainment System (NES). Basically, each Nintendo-licensed game cartridge contained a chip that matched a corresponding chip in the base system. If both chips were not present, the game would not load. Nintendo developed and patented this lockout system in part to prevent the proliferation of unlicensed third-party games, which not coincidentally had contributed to the downfall of the original Atari.
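
To make the mechanism concrete, here is a toy Python sketch of the lock-and-key idea: the console and the cartridge each derive the same check stream from a shared secret, and the console refuses to run the game if the streams disagree. This is only a conceptual illustration; the real 10NES system used a pair of proprietary microcontrollers running a patented handshake, and none of the names or numbers below come from Nintendo.

    # Conceptual lock-and-key sketch; not the actual 10NES algorithm.
    import random

    def response_stream(seed: int, length: int) -> list[int]:
        """Deterministic byte stream both chips can derive from a shared secret seed."""
        rng = random.Random(seed)
        return [rng.randrange(256) for _ in range(length)]

    def console_allows_boot(cartridge_seed: int, console_seed: int, rounds: int = 16) -> bool:
        """The console compares its own stream against the cartridge's; any mismatch blocks the game."""
        return response_stream(console_seed, rounds) == response_stream(cartridge_seed, rounds)

    if __name__ == "__main__":
        SHARED_SECRET = 0x10CE   # hypothetical licensed value
        print(console_allows_boot(SHARED_SECRET, SHARED_SECRET))   # True: licensed cartridge
        print(console_allows_boot(0xBEEF, SHARED_SECRET))          # False: unlicensed cartridge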

Atari Games formed a subsidiary called Tengen to develop games for the NES. (The different name was necessary to avoid a conflict with Tramiel’s company, which retained the “Atari” brand for use in home video games.) Initially, Tengen tried to reverse engineer the lockout chip, similar to what clone makers had done with the BIOS on the original IBM PC. But when those efforts failed, Tengen paid Nintendo for a license.

Atari Games/Tengen–and Nakajima in particular–chafed at Nintendo’s licensing terms, such as releasing no more than five games per year. Hence the antitrust lawsuit. But several months before Atari Games actually filed its complaint, its attorneys engaged in what can only be described as an egregious bit of fraud. I’ll let former U.S. Circuit Judge Randall Ray Rader explain what happened:

In early 1988, Atari’s attorney applied to the Copyright Office for a reproduction of the 10NES program [the lockout chip]. The application stated that Atari was a defendant in an infringement action and needed a copy of the program for that litigation. Atari falsely alleged that it was a present defendant in a case in the Northern District of California. Atari assured the “Library of Congress that the requested copy [would] be used only in connection with the specified litigation.” In fact, no suit existed between the parties until December 1988, when Atari sued Nintendo for antitrust violations and unfair competition. Nintendo filed no infringement action against Atari until November 1989.

After obtaining the 10NES source code from the Copyright Office, Atari again tried to read the object code from peeled chips. Through microscopic examination, Atari’s analysts transcribed the 10NES object code into a handwritten representation of zeros and ones. Atari used the information from the Copyright Office to correct errors in this transcription. The Copyright Office copy facilitated Atari’s replication of the 10NES object code.

After deciphering the 10NES program, Atari developed its own program — the Rabbit program — to unlock the NES.

So because Atari/Tengen failed to reverse engineer the lockout chip legally, it resorted to an illegal tactic, which in turn prompted Nintendo to counter-sue Atari Games for copyright infringement. A federal judge subsequently ruled in Nintendo’s favor, dismissing Atari Games’ antitrust lawsuit and issuing an injunction to prevent Atari from continuing to use its Rabbit program to circumvent the NES lockout chip. Judge Rader subsequently affirmed that decision on behalf of a three-judge panel of the U.S. Court of Appeals for the Federal Circuit.

To be fair, neither Steele nor the Chronicles producers knew about Atari/Tengen’s Copyright Office shenanigans when they reported on the initial filing of the lawsuit. But it goes to show how, in the 1980s, anti-Japanese animus colored a lot of reporting about the tech industry.

Incidentally, Jack Tramiel’s Atari Corporation would end up filing its own separate antitrust lawsuit against Nintendo over the latter’s dominance of the U.S. video game market. That case ended in a 1992 jury verdict for Nintendo.

Notes from the Random Access File