of key words and a preprogrammed set of instructions on how to react. For instance, if the user typed the word "sister" in a sentence, the computer would ask the user to tell it more about his or her family. If no key words were found, the program would default to printing one of a selection of stock, noncommittal replies.

ELIZA was really just an experiment in how to make people think that a computer could think and understand what they were talking about. Many times the program would produce nonsense, especially when it stored a sentence incorrectly. When conversing with the computer, people would use incomplete sentences: garbage in, garbage out. The program was interesting, nevertheless.

A program like ELIZA was apt to get out of hand. One of the unfortunate effects of ELIZA was that people took the program seriously. The original program was never intended as anything other than an experiment; it was never intended to replace a psychotherapist. People tended to regard the computer as a personality and shared their personal feelings with it. They were caught by the illusion. Beware of this attitude as you examine any artificial intelligence programs.
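The keyword-and-default mechanism described above can be sketched in a few lines; this is a minimal illustration, not the original ELIZA script, and the particular keywords and replies here are assumptions chosen only to show the technique:

```python
import random

# Hypothetical keyword table: if a key word appears in the user's
# sentence, the program responds with its canned reply.
RULES = {
    "sister": "Tell me more about your family.",
    "mother": "Tell me more about your family.",
    "always": "Can you think of a specific example?",
}

# Stock, noncommittal replies used when no key word is found.
DEFAULTS = [
    "I see.",
    "Please go on.",
    "Very interesting.",
]

def respond(sentence):
    """Return a reply using simple keyword matching, ELIZA-style."""
    words = sentence.lower().split()
    for keyword, reply in RULES.items():
        if keyword in words:
            return reply
    # No key word matched: default to a noncommittal reply.
    return random.choice(DEFAULTS)
```

For example, `respond("My sister is angry")` would trigger the family reply, while an unmatched sentence such as `respond("hello there")` falls through to one of the stock defaults. Even a sketch this small shows why the program broke down on incomplete or garbled input: there is no understanding behind the match, only surface pattern lookup.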