Some Question Value of Educational Software Reviews

June 30, 1985

Undated (AP) - "Spelling Sorcery" is a computer program meant to help third-graders spell. Instead, it left Gloria Catanzaro, very much an adult, muttering at her computer terminal.

"Sure doesn't come with much documentation," she said. "It's not telling me what to do. I'm not learning anything. I'm going to hit the 'escape' button because I don't know what I'm doing.

"This," added Ms. Catanzaro, who evaluates educational software for New York City schools, "is why we have reviews."

Five years into the classroom computer revolution, the market is crowded with some 7,000 educational software programs, the computerized lessons that pop up on terminal screens. The nation’s schools spent an estimated $160 million on software in 1984.

Some of that software is good; much of it is wretched.

With such wide variation in software quality, teachers welcome software reviews and advice on whether a program is entertaining, easy to use, appropriate for a particular grade level, and free of racial or ethnic stereotypes and of factual or grammatical errors.

Educational researchers say school software is slowly improving, thanks partly to the proliferation of reviews.

Unfortunately, say some critics, the reviews - like the software they evaluate - are not all they could be.

Among the problems cited: unfavorable reviews rarely get printed; published reviews too often fail to compare similar programs; at least two software evaluators - the New York City Board of Education and the National Education Association - charge publishers fees to review their programs; and few reviewers consistently field-test programs in classroom settings.

Among the sources teachers can now tap for software reviews are such magazines as Family Computing, Electronic Learning and Classroom Computer Learning; state and local school boards, including those of Minnesota, California, North Carolina, Florida, New York City, Baltimore County, and Iowa City, Iowa; the National Education Association; and the EPIE Institute, a nonprofit group affiliated with Consumers Union.

Not all reviewers have all the problems cited. EPIE, for instance, publishes unfavorable reviews, and the NEA, New York City and Minnesota field-test at least some software in classroom settings.

But in general, said Roy Pea, an educational computer specialist at the Bank Street College of Education in New York, "Reviewers don't involve kids very much at all."

"Unfortunately, that's not a predominant feature of our review process," said Vergie Cox, director of media evaluations of the North Carolina department of education.

Ms. Cox explained that field-testing took more time than many teacher-reviewers had and that some software publishers had expressed concern that such testing increased the potential for copyright violation.

While acknowledging that poor software is still common, reviewers are reluctant to pan bad programs, preferring instead to boost the good.

"Newsroom," a program published by Springboard Software Inc., retailing for about $50, has gotten raves from Ms. Catanzaro and other reviewers. It teaches youngsters of nearly all ages to put together a newsletter, complete with lively graphics.

Another well-received program, "Number Garden" by Software Guild, teaches arithmetic to students with learning problems. Give a right answer and a flower appears in a garden on the screen, accompanied by a musical jingle. Answer incorrectly and the garden sprouts a weed.

"Spelling Sorcery," selling for about $20, is a loser as far as Ms. Catanzaro is concerned. The instructions were so confusing that it took her 13 tries to complete the first lesson: spelling the word "am."

"We do not print bad reviews," said Lawrence Fedewa, head of the Educational Computer Service of the NEA, the largest U.S. teachers union. "We inform publishers of our findings. If it's something that's reasonably easy to fix, we'll work with them if they want. But it's not our intention to sabotage the industry. We felt it would be more constructive to encourage those who are doing well."

"Almost nobody publishes negative reviews, but I happen to think it would be good to publish reviews that say 'yea' or 'nay.' The bad software would dwindle more quickly," said Walter Koetke, director of technology at Scholastic Inc., which produces educational software.

Critics, including some producers themselves, have said typical reviews suffer because they almost always focus on single programs.

Karen Lansing, a spokeswoman for Springboard Software Inc. of Minneapolis, which produces "Newsroom" and other highly regarded educational software, said comparative reviewing would make evaluations fairer and more meaningful.

For example, it's almost impossible to find a review that compares, say, half a dozen programs aimed at teaching elementary arithmetic.

"I've read glowing reviews of software I think is real garbage. We would welcome reviews that compared our products to others," she said.

Few reviewing practices have drawn more fire than the NEA policy of charging fees ranging from $200 to review a single uncomplicated program to as much as $20,000 for entire product lines.

The Software Publishers Association in Washington, in a recent report to its more than 130 members, argued that hardly anyone charges fees for reviews, and said that the fees were particularly unfair to smaller publishers. It urged its members to "utilize other evaluation services."

New York City’s fees are more modest, ranging from $10 to $40 per program.

Don Ross, who with his wife, Barbara, started the 3-year-old Microcomputer Workshops in Port Chester, N.Y., a small but highly regarded educational software publisher, said he wanted nothing to do with the NEA.

"Their prices are outrageous. They would have charged us $400 a program. We have 70 programs. We don't use NEA," he said.

Henry Jay Becker, project director of Johns Hopkins University's Center for Social Organization of Schools, which has conducted extensive research in classroom computer use, said software reviews can guide teachers to programs they didn't know about and tell them whether those programs are technically sound.

But Becker said typical reviews ignore the central issue.

"No one is taking the effort to find out in a controlled research setting whether students come away with something they wouldn't get with some simpler pencil-and-paper method," he said. "I don't think there's enough of this kind of thinking."
