CHAPTER 2 What Is Critical Thinking?
When Arthur was in the first grade, the teacher directed the class to “think.” “Now, class,” she said, “I know this problem is a little harder than the ones we’ve been doing, but I’m going to give you a few extra minutes to think about it. Now start thinking.”
It was not the first time Arthur had heard the word used. He’d heard it many times at home, but never quite this way. The teacher seemed to be asking for some special activity, something he should know how to start and stop—like his father’s car. “Vroom-m-m,” he muttered half aloud. Because of his confusion, he was unaware he was making the noise.
“Arthur, please stop making noises and start thinking.”
Embarrassed and not knowing quite what to do, he looked down at his desk. Then, out of the corner of his eye, he noticed that the little girl next to him was staring at the ceiling. “Maybe that’s the way you start thinking,” he guessed. He decided the others had probably learned how to do it last year, that time he was home with the measles. So he stared at the ceiling. As he progressed through grade school and high school, he heard that same direction hundreds of times. “No, that’s not the answer, you’re not thinking—now think!” And occasionally he would hear from particularly self-pitying teachers given to muttering to themselves aloud: “What did I do to deserve this? Don’t they teach them anything in the grades anymore? Don’t you people care about ideas? Think, dammit, THINK.”
So Arthur learned to feel somewhat guilty about the whole matter. Obviously, this thinking was an important activity that he’d failed to learn. Maybe he lacked the brain power. But he was resourceful enough. He watched the other students and did what they did. Whenever a teacher started in about thinking, he screwed up his face, furrowed his brow, scratched his head, stroked his chin, stared off into space or up at the ceiling, and repeated silently to himself, “Let’s see now, I’ve got to think about that, think, think—I hope he doesn’t call on me—think.”
Though Arthur didn’t know it, that’s just what the other students were saying to themselves.
Your experience may have been similar to Arthur’s. In other words, many people may have simply told you to think without ever explaining what thinking is and what qualities a good thinker has that a poor thinker lacks. If that is the case, you have a lot of company. Extensive, effective training in thinking is the exception rather than the rule. This fact and its unfortunate consequences are suggested by the following comments from accomplished observers of the human condition:
The most interesting and astounding contradiction in life is to me the constant insistence by nearly all people upon “logic,” “logical reasoning,” “sound reasoning,” on the one hand, and on the other their inability to display it, and their unwillingness to accept it when displayed by others.
Most of our so-called reasoning consists in finding arguments for going on believing as we already do.
Clear thinking is a very rare thing, but even just plain thinking is almost as rare. Most of us most of the time do not think at all. We believe and we feel, but we do not think.
Mental indolence is one of the commonest of human traits.
What is this activity that everyone claims is important but few people have mastered? Thinking is a general term used to cover numerous activities, from daydreaming to reflection and analysis. Here are just some of the synonyms listed in Roget’s Thesaurus for think:
appreciate, believe, cerebrate, cogitate, conceive, consider, consult, contemplate, deliberate, digest, discuss, dream, fancy, imagine, meditate, muse, ponder, realize, reason, reflect, ruminate, speculate, suppose, weigh
All of those are just the names that thinking goes under. They really don’t explain it. The fact is, after thousands of years of humans’ experiencing thought and talking and writing about thinking, it remains in many respects one of the great mysteries of our existence. Still, though much is yet to be learned, a great deal is already known.
Mind, Brain, or Both?
Most modern researchers use the word mind synonymously with brain, as if the physical organ that resides in the human skull were solely responsible for thinking. This practice conveniently presupposes that a problem that has challenged the greatest thinkers for millennia—the relationship between mind and physical matter—was somehow solved when no one was looking. The problem itself and the individuals who spent their lives wrestling with it deserve better.
Neuroscience has provided a number of valuable insights into the cognitive or thinking activities of the brain. It has documented that the left hemisphere of the brain deals mainly with detailed language processing and is associated with analysis and logical thinking, that the right hemisphere deals mainly with sensory images and is associated with intuition and creative thinking, and that the small bundle of nerves that lies between the hemispheres—the corpus callosum—integrates the various functions.
The research that produced these insights showed that the brain is necessary for thought, but it has not shown that the brain is sufficient for thought. In fact, many philosophers claim it can never show that. They argue that the mind and the brain are demonstrably different. Whereas the brain is a physical entity composed of matter and therefore subject to decay, the mind is a metaphysical entity. Examine brain cells under the most powerful microscope and you will never see an idea or concept—for example, beauty, government, equality, or love—because ideas and concepts are not material entities and so have no physical dimension. Where, then, do these nonmaterial things reside? In the nonmaterial mind.
The late American philosopher William Barrett observed that “history is, fundamentally, the adventure of human consciousness” and “the fundamental history of humankind is the history of mind.” In his view, “one of the supreme ironies of modern history” is the fact that science, which owes its very existence to the human mind, has had the audacity to deny the reality of the mind. As he put it, “the offspring denies the parent.”
The argument over whether the mind is a reality is not the only issue about the mind that has been hotly debated over the centuries. One especially important issue is whether the mind is passive, a blank slate on which experience writes, as John Locke held, or active, a vehicle by which we take the initiative and exercise our free will, as G. W. Leibnitz argued. This book is based on the latter view.
Critical Thinking Defined
Let’s begin by making the important distinction between thinking and feeling. I feel and I think are sometimes used interchangeably, but that practice causes confusion. Feeling is a subjective response that reflects emotion, sentiment, or desire; it generally occurs spontaneously rather than through a conscious mental act. We don’t have to employ our minds to feel angry when we are insulted, afraid when we are threatened, or compassionate when we see a picture of a starving child. The feelings arise automatically.
Feeling is useful in directing our attention to matters we should think about; it also can provide the enthusiasm and commitment necessary to complete arduous mental tasks. However, feeling is never a good substitute for thinking because it is notoriously unreliable. Some feelings are beneficial, honorable, even noble; others are not, as everyday experience demonstrates. We often feel like doing things that will harm us—for example, smoking, sunbathing without sunscreen, telling off our professor or employer, or spending the rent money on lottery tickets.
Zinedine Zidane was one of the greatest soccer players of his generation, and many experts believed that in his final season (2006) he would lead France to the pinnacle of soccer success—winning the coveted World Cup. But then, toward the end of the championship game against Italy, he viciously head-butted an Italian player in full view of hundreds of millions of people. The referee banished him from the field, France lost the match, and a single surrender to feeling forever stained the brilliant career Zidane had dedicated his life to building.
In contrast to feeling, thinking is a conscious mental process performed to solve a problem, make a decision, or gain understanding. Whereas feeling has no purpose beyond expressing itself, thinking aims beyond itself to knowledge or action. This is not to say that thinking is infallible; in fact, a good part of this book is devoted to exposing errors in thinking and showing you how to avoid them. Yet for all its shortcomings, thinking is the most reliable guide to action we humans possess. To sum up the relationship between feeling and thinking, feelings need to be tested before being trusted, and thinking is the most reasonable and reliable way to test them.
There are three broad categories of thinking: reflective, creative, and critical. The focus of this book is on critical thinking. The essence of critical thinking is evaluation. Critical thinking, therefore, may be defined as the process by which we test claims and arguments and determine which have merit and which do not. In other words, critical thinking is a search for answers, a quest. Not surprisingly, one of the most important techniques used in critical thinking is asking probing questions. Where the uncritical accept their first thoughts and other people’s statements at face value, critical thinkers challenge all ideas in this manner:
| Thought | Question |
| --- | --- |
| Professor Vile cheated me in my composition grade. He weighted some themes more heavily than others. | Did he grade everyone on the same standard? Were the different weightings justified? |
| Before women entered the work force, there were fewer divorces. That shows that a woman’s place is in the home. | How do you know that this factor, and not some other one(s), is responsible for the increase in divorces? |
| A college education isn’t worth what you pay for it. Some people never reach a salary level appreciably higher than the level they would have reached without the degree. | Is money the only measure of the worth of an education? What about increased understanding of self and life and increased ability to cope with challenges? |
Critical thinking also employs questions to analyze issues. Consider, for example, the subject of values. When it is being discussed, some people say, “Our country has lost its traditional values” and “There would be less crime, especially violent crime, if parents and teachers emphasized moral values.” Critical thinking would prompt us to ask,
- What is the relationship between values and beliefs? Between values and convictions?
- Are all values valuable?
- How aware is the average person of his or her values? Is it possible that many people deceive themselves about their real values?
- Where do one’s values originate? Within the individual or outside? In thought or in feeling?
- Does education change a person’s values? If so, is this change always for the better?
- Should parents and teachers attempt to shape children’s values?
Characteristics of Critical Thinkers
A number of misconceptions exist about critical thinking. One is that being able to support beliefs with reasons makes one a critical thinker. Virtually everyone has reasons, however weak they may be. The test of critical thinking is whether the reasons are good and sufficient.
Another misconception is that critical thinkers never imitate others in thought or action. If that were the case, then every eccentric would be a critical thinker. Critical thinking means making sound decisions, regardless of how common or uncommon those decisions are.
It is also a misconception that critical thinking is synonymous with having a lot of right answers in one’s head. There’s nothing wrong with having right answers, of course. But critical thinking involves the process of finding answers when they are not so readily available.
And yet another misconception is that critical thinking cannot be learned, that one either has it or does not. On the contrary, critical thinking is a matter of habit. The most careless, sloppy thinker can become a critical thinker by developing the characteristics of a critical thinker. This is not to say that all people have equal thinking potential but rather that everyone can achieve dramatic improvement.
We have already noted one characteristic of critical thinkers—skill in asking appropriate questions. Another is control of one’s mental activities. John Dewey once observed that more of our time than most of us care to admit is spent “trifling with mental pictures, random recollections, pleasant but unfounded hopes, flitting, half-developed impressions.” Good thinkers are no exception. However, they have learned better than poor thinkers how to stop that casual, semiconscious drift of images when they wish and how to fix their minds on one specific matter, examine it carefully, and form a judgment about it. They have learned, in other words, how to take charge of their thoughts, to use their minds actively as well as passively.
Here are some additional characteristics of critical thinkers, as contrasted with those of uncritical thinkers:
| Critical Thinkers . . . | Uncritical Thinkers . . . |
| --- | --- |
| Are honest with themselves, acknowledging what they don’t know, recognizing their limitations, and being watchful of their own errors. | Pretend they know more than they do, ignore their limitations, and assume their views are error-free. |
| Regard problems and controversial issues as exciting challenges. | Regard problems and controversial issues as nuisances or threats to their ego. |
| Strive for understanding, keep curiosity alive, remain patient with complexity, and are ready to invest time to overcome confusion. | Are impatient with complexity and thus would rather remain confused than make the effort to understand. |
| Base judgments on evidence rather than personal preferences, deferring judgment whenever evidence is insufficient. They revise judgments when new evidence reveals error. | Base judgments on first impressions and gut reactions. They are unconcerned about the amount or quality of evidence and cling to their views steadfastly. |
| Are interested in other people’s ideas and so are willing to read and listen attentively, even when they tend to disagree with the other person. | Are preoccupied with themselves and their own opinions and so are unwilling to pay attention to others’ views. At the first sign of disagreement, they tend to think, “How can I refute this?” |
| Recognize that extreme views (whether conservative or liberal) are seldom correct, so they avoid them, practice fairmindedness, and seek a balanced view. | Ignore the need for balance and give preference to views that support their established views. |
| Practice restraint, controlling their feelings rather than being controlled by them, and thinking before acting. | Tend to follow their feelings and act impulsively. |
As the desirable qualities suggest, critical thinking depends on mental discipline. Effective thinkers exert control over their mental life, direct their thoughts rather than being directed by them, and withhold their endorsement of any idea—even their own—until they have tested and confirmed it. John Dewey equated this mental discipline with freedom. That is, he argued that people who do not have it are not free persons but slaves to whim or circumstance:
If a man’s actions are not guided by thoughtful conclusions, then they are guided by inconsiderate impulse, unbalanced appetite, caprice, or the circumstances of the moment. To cultivate unhindered, unreflective external activity is to foster enslavement, for it leaves the person at the mercy of appetite, sense, and circumstance.

The Role of Intuition
Intuition is commonly defined as immediate perception or comprehension of something—that is, sensing or understanding something without the use of reasoning. Some everyday experiences seem to support this definition. You may have met a stranger and instantly “known” that you would be partners for life. When a car salesman told you that the price he was quoting you was his final, rock-bottom price, your intuition may have told you he was lying. On the first day of a particular course, you may have had a strong sense that you would not do well in it.
Some important discoveries seem to have occurred instantaneously. For example, the German chemist Kekule found the solution to a difficult chemical problem intuitively. He was very tired when he slipped into a daydream. The image of a snake swallowing its tail came to him—and that provided the clue to the structure of the benzene molecule, which is a ring, rather than a chain, of atoms. The German writer Goethe had been experiencing great difficulty organizing a large mass of material for one of his works when he learned of the tragic suicide of a close friend. At that very instant, the plan for organizing his material occurred to him in detail. The English writer Samuel Taylor Coleridge (you may have read his Rime of the Ancient Mariner in high school) awoke from a dream with 200–300 lines of a new and complex poem clearly in mind.
Such examples seem to suggest that intuition is very different from reasoning and is not influenced by it. But before accepting that conclusion, consider these facts:
Breakthrough ideas favor trained, active minds. It is unusual for someone totally untrained in a subject to make a significant new discovery about it. Thus, if Kekule had been a plumber, Goethe a bookkeeper, and Coleridge a hairdresser, they would almost certainly not have received the intuitions for which they are famous.
Some intuitions eventually prove to be mistaken. That attractive stranger may turn out to be not your lifelong partner but a person for whom you develop a strong dislike. The car salesman’s final price may have proved to be exactly that. And instead of doing poorly in that course, you may have done well.
It is difficult to make an overall assessment of the quality of our intuitions because we tend to forget the ones that prove mistaken in much the same way a gambler forgets his losses.
These facts have led some scholars to conclude that intuition is simply a consequence of thinking. They would say that something about the stranger appealed to you, something the salesman said or did suggested insincerity, something about the professor frightened you. In each case, they would explain, you made a quick decision—so quick, in fact, that you were unaware that you’d been thinking. In the case of the breakthrough ideas, the scholars would say that when people become engrossed in problems or issues, their unconscious minds often continue working on them long after they have turned their attention elsewhere. Thus, when an insight seems to come “out of nowhere,” it is actually a delayed result of thinking.
Which view of intuitions is the correct one? Are intuitions different from and independent of thinking or not? Perhaps, for now, the most prudent answer is that sometimes they are independent and sometimes they are not; we can’t be sure when they are, and therefore it is imprudent to rely on them.

Basic Activities in Critical Thinking
The basic activities in critical thinking are investigation, interpretation, and judgment, in that order. The following chart summarizes each activity in relation to the other two.
| Activity | Definition | Requirements |
| --- | --- | --- |
| Investigation | Finding evidence—that is, data that will answer key questions about the issue | The evidence must be both relevant and sufficient. |
| Interpretation | Deciding what the evidence means | The interpretation must be more reasonable than competing interpretations. |
| Judgment | Reaching a conclusion about the issue | The conclusion must meet the test of logic. |
As we noted previously, irresponsible thinkers first choose their conclusions and then seek out evidence to justify their choices. They fail to realize that the only conclusion worth drawing is one based on a thorough understanding of the problem or issue and its possible solutions or resolutions. Is it acceptable to speculate, guess, and form hunches and hypotheses? Absolutely. Such activities provide a helpful starting point for the thinking process. (Besides, we couldn’t avoid doing so even if we tried.) The crucial thing is not to let hunches and hypotheses manipulate our thinking and dictate our conclusion in advance.

Critical Thinking and Writing
Writing may be used for either of two broad purposes: to discover ideas or to communicate them. Most of the writing you have done in school is undoubtedly the latter kind. But the former can be very helpful, not only in sorting out ideas you’ve already produced, but also in stimulating the flow of new ideas. For some reason, the very act of writing down one idea seems to generate additional ideas.
Whenever you write to discover ideas, focus on the issue you are examining and record all your thoughts, questions, and assertions. Don’t worry about organization or correctness. If ideas come slowly, be patient. If they come suddenly, in a rush, don’t try to slow down the process and develop any one of them; simply jot them all down. (There will be time for elaboration and correction later.) Direct your mind’s effort, but be sensitive to ideas on the fringe of consciousness. Often they, too, will prove valuable. If you have done your discovery writing well and have thought critically about the ideas you have produced, the task of writing to communicate will be easier and more enjoyable. You will have many more ideas—carefully evaluated ones—to develop and organize.

Critical Thinking and Discussion
At its best, discussion deepens understanding and promotes problem solving and decision making. At its worst, it frays nerves, creates animosity, and leaves important issues unresolved. Unfortunately, the most prominent models for discussion in contemporary culture—radio and TV talk shows—often produce the latter effects.
Many hosts demand that their guests answer complex questions with simple “yes” or “no” answers. If the guests respond that way, they are attacked for oversimplifying. If, instead, they try to offer a balanced answer, the host shouts, “You’re not answering the question,” and proceeds to answer it himself. Guests who agree with the host are treated warmly; others are dismissed as ignorant or dishonest. Often as not, when two guests are debating, each takes a turn interrupting while the other shouts, “Let me finish.” Neither shows any desire to learn from the other. Typically, as the show draws to a close, the host thanks the participants for a “vigorous debate” and promises the audience more of the same next time.
Here are some simple guidelines for ensuring that the discussions you engage in—in the classroom, on the job, or at home—are more civil, meaningful, and productive than what you see on TV. By following these guidelines, you will set a good example for the people around you.
Whenever possible, prepare in advance. Not every discussion can be prepared for in advance, but many can. An agenda is usually circulated several days before a business or committee meeting. In college courses, the assignment schedule provides a reliable indication of what will be discussed in class on a given day. Use this information to prepare: Begin by reflecting on what you already know about the topic. Then decide how you can expand your knowledge and devote some time to doing so. (Fifteen or twenty minutes of focused searching in the library or on the Internet can produce a significant amount of information on almost any subject.) Try to anticipate the different points of view that might be expressed in the discussion and consider the relative merits of each. Keep your conclusions tentative at this point, so that you will be open to the facts and interpretations others will present.
Set reasonable expectations. Have you ever left a discussion disappointed that others hadn’t abandoned their views and embraced yours? Have you ever felt offended when someone disagreed with you or asked you what evidence you had to support your opinion? If the answer to either question is yes, you probably expect too much of others. People seldom change their minds easily or quickly, particularly in the case of long-held convictions.
And when they encounter ideas that differ from their own, they naturally want to know what evidence supports those ideas. Expect to have your ideas questioned, and be cheerful and gracious in responding.
Leave egotism and personal agendas at the door. To be productive, discussion requires an atmosphere of mutual respect and civility. Egotism produces disrespectful attitudes toward others—notably, “I’m more important than other people,” “My ideas are better than anyone else’s,” and “Rules don’t apply to me.” Personal agendas, such as dislike for another participant or excessive zeal for a point of view, can lead to personal attacks and unwillingness to listen to others’ views.
Contribute but don’t dominate. If you are the kind of person who loves to talk and has a lot to say, you probably contribute more to discussions than other participants. On the other hand, if you are more reserved, you may seldom say anything. There is nothing wrong with being either kind of person. However, discussions tend to be most productive when everyone contributes ideas. For this to happen, loquacious people need to exercise a little restraint, and more reserved people need to accept responsibility for sharing their thoughts.
Avoid distracting speech mannerisms. Such mannerisms include starting one sentence and then abruptly switching to another; mumbling or slurring your words; and punctuating every phrase or clause with audible pauses (“um,” “ah”) or meaningless expressions (“like,” “you know,” “man”). These annoying mannerisms distract people from your message. To overcome them, listen to yourself when you speak. Even better, tape your conversations with friends and family (with their permission), then play the tape back and listen to yourself. Whenever you are engaged in a discussion, aim for clarity, directness, and economy of expression.
Listen actively. When the participants don’t listen to one another, discussion becomes little more than serial monologue—each person taking a turn at speaking while the rest ignore what is being said. This can happen quite unintentionally because the mind can process ideas faster than the fastest speaker can deliver them. Your mind may get tired of waiting and wander about aimlessly like a dog off its leash. In such cases, instead of listening to the speaker’s words, you may think about her clothing or hairstyle or look outside the window and observe what is happening there. Even when you make a serious effort to listen, it is easy to lose focus. If the speaker’s words trigger an unrelated memory, you may slip away to that earlier time and place. If the speaker says something you disagree with, you may begin framing a reply. The best way to maintain your attention is to be alert for such distractions and to resist them. Strive to enter the speaker’s frame of mind, understand what is said, and connect it with what was said previously. Whenever you realize your mind is wandering, drag it back to the task.
Judge ideas responsibly. Ideas range in quality from profound to ridiculous, helpful to harmful, ennobling to degrading. It is therefore appropriate to pass judgment on them. However, fairness demands that you base your judgment on thoughtful consideration of the overall strengths and weaknesses of the ideas, not on initial impressions or feelings. Be especially careful with ideas that are unfamiliar or different from your own because those are the ones you will be most inclined to deny a fair hearing.
Resist the urge to shout or interrupt. No doubt you understand that shouting and interrupting are rude and disrespectful behaviors, but do you realize that in many cases they are also a sign of intellectual insecurity? It’s true. If you really believe your ideas are sound, you will have no need to raise your voice or to silence the other person. Even if the other person resorts to such behavior, the best way to demonstrate confidence and character is by refusing to reciprocate. Make it your rule to disagree without being disagreeable.

Avoiding Plagiarism
Once ideas are put into words and published, they become intellectual property, and the author has the same rights over them as he or she has over a material possession such as a house or a car. The only real difference is that intellectual property is purchased with mental effort rather than money. Anyone who has ever racked his or her brain trying to solve a problem or trying to put an idea into clear and meaningful words can appreciate how difficult mental effort can be.
Plagiarism is passing off other people’s ideas or words as one’s own. It is doubly offensive in that it both steals and deceives. In the academic world, plagiarism is considered an ethical violation and is punished by a failing grade for a paper or a course or even by dismissal from the institution. Outside the academy, it is a crime that can be prosecuted if the person to whom the ideas and words belong wishes to bring charges. Either way, the offender suffers dishonor and disgrace, as the following examples illustrate:
• When a university in South Africa learned that professor Marks Chabel had plagiarized most of his doctoral dissertation from Kimberly Lanegran of the University of Florida, the university fired Chabel. Moreover, the university that had awarded him his Ph.D. revoked it.
• When U.S. Senator Joseph Biden was seeking the 1988 Democratic presidential nomination, it was revealed that he had plagiarized passages from speeches by British politician Neil Kinnock and by Robert Kennedy. It was also learned that, while in law school, he had plagiarized a number of pages from a legal article. The ensuing scandal led Biden to withdraw his candidacy and has continued to stain his reputation.
• The reputation of historian Stephen Ambrose was tarnished by allegations that over the years he plagiarized the work of several authors. Doris Kearns Goodwin, historian and advisor to President Lyndon Johnson, suffered a similar embarrassment when she was discovered to have plagiarized from more than one source in one of her books.
• When James A. Mackay, a Scottish historian, published a biography of Alexander Graham Bell in 1998, Robert Bruce presented evidence that the book was largely plagiarized from his 1973 biography, which had won a Pulitzer Prize. Mackay was forced to withdraw his book from the market. (Incredibly, he did not learn from the experience because he then published a biography of John Paul Jones, which was plagiarized from a 1942 book by Samuel Eliot Morison.)
• When New York Times reporter Jayson Blair was discovered to have plagiarized stories from other reporters and fabricated quotations and details in his stories, he resigned his position in disgrace. Soon afterward, the two senior editors who had been his closest mentors also resigned, reportedly because of their irresponsible handling of Blair’s reportage and the subsequent scandal.
Some cases of plagiarism are attributable to intentional dishonesty, others to carelessness. But many, perhaps most, are due to misunderstanding. The instructions “Base your paper on research rather than on your own unfounded opinions” and “Don’t present other people’s ideas as your own” seem contradictory and may confuse students, especially if no clarification is offered. Fortunately, there is a way to honor both instructions and, in the process, to avoid plagiarism.
Step 1: When you are researching a topic, keep your sources’ ideas separate from your own. Begin by keeping a record of each source of information you consult. For an Internet source, record the Web site address, the author and title of the item, and the date you visited the site. For a book, record the author, title, place of publication, publisher, and date of publication. For a magazine or journal article, record the author, title, the name of the publication, and its date of issue. For a TV or radio broadcast, record the program title, station, and date of transmission.
Step 2: As you read each source, note the ideas you want to refer to in your writing. If the author’s words are unusually clear and concise, copy them exactly and put quotation marks around them. Otherwise, paraphrase— that is, restate the author’s ideas in your own words. Write down the number(s) of the page(s) on which the author’s passage appears.
If the author’s idea triggers a response in your mind—such as a question, a connection between this idea and something else you’ve read, or an experience of your own that supports or challenges what the author says—write it down and put brackets (not parentheses) around it so that you will be able to identify it as your own when you review your notes. Here is a sample research record illustrating these two steps:
Adler, Mortimer J. The Great Ideas: A Lexicon of Western Thought (New York: Macmillan, 1992)

Says that throughout the ages, from ancient Greece, philosophers have argued about whether various ideas are true. Says it’s remarkable that most renowned thinkers have agreed about what truth is—”a correspondence between thought and reality.” 867

Also says that Freud saw this as the scientific view of truth. Quotes Freud: “This correspondence with the real external world we call truth. It is the aim of scientific work, even when the practical value of that work does not interest us.”

[I say true statements fit the facts; false statements do not.]
Whenever you look back on this record, even a year from now, you will be able to tell at a glance which ideas and words are the author’s and which are yours. The first three sentences are, with the exception of the directly quoted part, paraphrases of the author’s ideas. Next is a direct quotation. The final sentence, in brackets, is your own idea.
Step 3: When you compose your paper, work borrowed ideas and words into your own writing by judicious use of quoting and paraphrasing. In addition, give credit to the various authors. Your goal here is to eliminate all doubt about which ideas and words belong to whom. In formal presentations, this crediting is done in footnotes; in informal ones, it is done simply by mentioning the author’s name.
Here is an example of how the material from Mortimer Adler might be worked into a composition. (Note the form that is used for the footnote.) The second paragraph illustrates how your own idea might be expanded:
Mortimer J. Adler explains that throughout the ages, from the time of the ancient Greeks, philosophers have argued about whether various ideas are true. But to Adler the remarkable thing is that, even as they argued, most renowned thinkers have agreed about what truth is. They saw it as “a correspondence between thought and reality.” Adler points out that Sigmund Freud believed this was also the scientific view of truth. He quotes Freud as follows: “This correspondence with the real external world we call truth. It is the aim of scientific work, even when the practical value of that work does not interest us.”
This correspondence view of truth is consistent with the commonsense rule that a statement is true if it fits the facts and false if it does not. For example, the statement “The twin towers of New York’s World Trade Center were destroyed on September 11, 2002,” is false because they were destroyed the previous year. I may sincerely believe that it is true, but my believing in no way affects the truth of the matter. In much the same way, if an innocent man is convicted of a crime, neither the court’s decision nor the world’s acceptance of it will make him any less innocent. We may be free to think what we wish, but our thinking can’t alter reality.