DUALISM:
Are the mental / spiritual kind of thing and the physical / material kind of thing actually different substances, or just different kinds of properties present in physical substances?
Different Substances: Substance Dualism
Different Properties: Property Dualism
MONISM:
Is the one kind of thing in the world a mental / spiritual kind of thing or a physical / material kind of thing?
Mental / Spiritual: Idealism (George Berkeley, 1709)
Physical / Material: Materialism
Terms used differently by Hergenhahn and Churchland
Terms appearing only in Hergenhahn's discussion of Dualism
Mental states, as unobservables, awaited some clearer way of talking about them as observables. Since behavior was their observable consequence, mental states were taken to be just a shorthand way of referring to the behaviors (actual or potential) connected to them. Potential behaviors can be thought of as dispositional properties: how the organism is disposed to behave in a particular situation. As dispositional properties, then, mental states could be given strict operational definitions.
A mental state like "pain" is not observable; its consequences are, though. "Pain" in the head is a concept that means nothing more than that the person experiencing "pain" is likely to: say "I have a headache"; take aspirin; grimace and moan; lie down, etc. The observables determine the meaning of the term.
The trouble is that pain is a qualitative experience. If the behavioral description above were the whole story, headaches wouldn't be so bad; but pain hurts. Moreover, someone faking a headache would engage in exactly the same behaviors, so either a fake headache and a real one are the same thing, or, since the two terms cannot be distinguished by observation, both are meaningless. Neither option is satisfactory.
Aside from neglecting qualia, the definition fails logically. There are conditions necessary for any disposition to be present. So, taking aspirin depends on other mental states -- the belief that aspirin helps headaches, a desire to be rid of the headache, and so on. These mental states also require definition, and the list of qualifications goes on forever. Mentalist terms keep creeping into the observations.
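The regress can be made concrete. In this sketch (every predicate here is a hypothetical stand-in, not a real definition), cashing out "has a headache" as a behavioral disposition immediately pulls in further mental-state predicates, which would each need their own dispositional definition:

```python
# Sketch of the regress (all predicates are hypothetical stand-ins).

def says(person, utterance):
    # Observable: stub that reports whether the utterance was made.
    return False

def grimaces_and_moans(person):
    # Observable: stub for the facial/vocal behavior.
    return False

def believes(person, proposition):
    # Unobservable mental state -- would itself need a dispositional
    # definition, which would pull in still more beliefs and desires.
    raise NotImplementedError("needs its own operational definition")

def desires(person, outcome):
    raise NotImplementedError("likewise undefined without more mental terms")

def disposed_to_take_aspirin(person):
    # The disposition holds only given further MENTAL states -- exactly
    # the kind of term the operational definition was meant to eliminate.
    return believes(person, "aspirin relieves headaches") and \
           desires(person, "relief")

def has_headache(person):
    # "Headache" cashed out purely as observable/dispositional consequences.
    return (says(person, "I have a headache")
            or grimaces_and_moans(person)
            or disposed_to_take_aspirin(person))
```

Any attempt to evaluate `has_headache` bottoms out in undefined mental terms, mirroring how mentalist vocabulary keeps creeping back into the "observations."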
Reductionism rests on the concept of "intertheoretic reduction": every statement framed in the basic terms of one theory has a corresponding statement framed in the basic terms of a related theory. These basic terms are usually called the "natural kinds" of a theory, and the true-or-false statements we can make involving them are called propositions.
We can make a statement about the natural kinds in our everyday theory of sound: a noise was loud. In a mathematical description of the same thing we would say the signal had a large amplitude; and in physics we would talk about the density of the air molecules.
For a reduction to be valid, the propositions on one level must be translatable to the propositions on the other level (so that when we say "loud" in conversation we also have something to talk about on the physical level); and the principles relating them on one level must be expressible on the other (yelling causes louder sounds because physically, the increased airflow leads to greater density of air molecules in the pressure waves).
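The translation requirement can be sketched as a toy mapping between levels of description (the threshold and the yelling factor are invented for illustration; real psychoacoustics is far messier):

```python
# Toy translation between levels of description (values invented).
# Everyday level: the natural kind "noise", with the predicate "loud".
# Physical level: a pressure wave, with the property "amplitude".

LOUD_THRESHOLD = 0.2  # hypothetical bridge value

def physical_to_everyday(amplitude):
    """Translate a physical-level description into the everyday term."""
    return "loud" if amplitude > LOUD_THRESHOLD else "quiet"

def yell(amplitude):
    """Bridge principle: yelling increases airflow, hence amplitude."""
    return amplitude * 10

# The reduction is valid only if propositions line up across levels:
# whenever "the signal had a large amplitude" is true physically,
# "the noise was loud" must be true in the everyday theory, and vice versa.
```

So "yelling causes louder sounds" on the everyday level corresponds to "yelling increases amplitude" on the physical level: `physical_to_everyday(yell(0.05))` comes out `"loud"`, while the un-yelled `physical_to_everyday(0.05)` is `"quiet"`.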
arguments about the identity theory: PRO...
All that is required for identifying a mental state is that it should play a certain role in the various inferences and transitions that can be made from one mental state to another (as well as that it be caused by the same things and result in the same behaviors). As an analogy, the roles of king, queen, bishop, knight, rook, and pawn in a chess game could be played just as well by six different plastic chips, or pieces of fruit, or toy automobiles -- but if we make the substitute pieces move the same ways and follow the same rules as the real chess pieces, we could use them to play a game of chess. We say different mental states have different functional roles, organized into a system of inferences and exchanges, a "cognitive economy" as described above.
Just as a "chess economy" can be duplicated using pieces other than traditional chess pieces, a "cognitive economy" could be duplicated in some kind of system other than human brains with their circuits of neural electrical signals. Reduction to physiology, in the technical sense described above, is very unlikely, since the "some kind of system other than human brains" might be, for instance, a Martian brain with a totally unfamiliar biology, or a computer whose mental states are instantiated by silicon electrical chips instead of neurons. In fact, one could imagine a complex inference system built out of tinker toys and empty soda cans connected by strings and pulleys -- since calculating devices have actually been constructed out of such crude materials. (There's no special reason that computers must have the kind of electronic hardware they have, aside from practical considerations of size and speed!)
This is the move from "type identity" to "token identity" in specifying the relation between mind and body. The reductionist identity theory described above holds that there is a type identity: every type of mental state (a pain, a belief) corresponds to a particular type of physical state (activity in the somatosensory cortex, or in the temporal lobe of the brain). But functionalism allows a broader kind of relation: every individual (token) mental state must be identical with some physical state of some physical system, but that system could be made of neurons, or of computer circuits, or of strings and soda cans -- i.e., not one particular type of physical system, but whatever system happens to realize the functional role.
Rejecting the strong mind-body "type identity" for the softer "token identity" is what allows psychology to be an autonomous, separate discipline from, say, biology or neuroscience. Psychology need only concern itself with the functional relations among mental states and behaviors; the details of nerve firings in brains or electrical circuitry in computers are the "engineering details" that can be left to the people who specifically study that machinery. This is also the rationale for the field of Artificial Intelligence, which seeks to simulate human mental processes in a computer, without concern for the detailed workings of the entirely different biological machine -- the brain -- that normally carries out mental processes.
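The "engineering details" point maps neatly onto the programming idea of an interface with interchangeable implementations. This sketch (all class names hypothetical) defines a functional role by its inputs and outputs and realizes it in two different "substrates" that share no physical type:

```python
# Illustrative sketch of multiple realizability (all classes hypothetical).
from abc import ABC, abstractmethod

class PainRole(ABC):
    """The functional role of pain: defined by what triggers it and
    what it produces, not by the stuff that realizes it."""
    @abstractmethod
    def triggered_by(self, stimulus): ...
    @abstractmethod
    def produces(self): ...

class NeuralPain(PainRole):
    """The role realized in neurons (a human brain)."""
    def triggered_by(self, stimulus):
        return stimulus == "tissue damage"
    def produces(self):
        return ["grimace", "say 'ouch'", "withdraw"]

class SiliconPain(PainRole):
    """The same role realized in silicon chips (or strings and soda cans)."""
    def triggered_by(self, stimulus):
        return stimulus == "tissue damage"
    def produces(self):
        return ["grimace", "say 'ouch'", "withdraw"]

def same_functional_role(a, b, stimulus="tissue damage"):
    # Token identity: each state is identical with SOME physical state,
    # but the two realizations share no single physical type.
    return (a.triggered_by(stimulus) == b.triggered_by(stimulus)
            and a.produces() == b.produces())
```

Psychology, on this picture, studies `PainRole`; neuroscience studies `NeuralPain`; AI builds something like `SiliconPain`.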
arguments about functionalism: CON...