© Jeffrey Maynes, Informal Logic, Vol. 37, No. 2 (2017), pp. 114–128.

Steering into the Skid: On the Norms of Critical Thinking

JEFFREY MAYNES
Department of Philosophy
St. Lawrence University
23 Romoda Drive
Canton, NY 13617, USA
jmaynes@stlawu.edu

Abstract: While cognitive bias is often portrayed as a problem in need of a solution, some have argued that these biases arise from adaptive reasoning heuristics which can be rational modes of reasoning. This presents a challenge: if these heuristics are rational under the right conditions, does teaching critical thinking undermine students' ability to reason effectively in real life reasoning scenarios? I argue that to solve this challenge, we should focus on how rational ideals are best approximated in human reasoners. Educators should focus on developing the metacognitive skill to recognize when different cognitive strategies (including the heuristics) should be used.

Résumé: While cognitive bias is often represented as a problem requiring a solution, some have argued that these biases arise from adaptive reasoning heuristics that can be a rational mode of reasoning. This presents a challenge: if these heuristics are rational under the right conditions, does teaching critical thinking undermine students' ability to reason effectively under those conditions? I argue that we can meet this challenge by focusing on how rational ideals are best approximated in our reasoning. Educators should focus on developing metacognitive skill so that students can recognize when different cognitive strategies (including the heuristics) should be used.

Keywords: critical thinking, pedagogy, metacognition, cognitive bias

1. Introduction

Critical thinking is a normative ideal. It is a set of epistemic attitudes and/or practices that individuals ought to aspire to in their own thinking. It is also a set of attitudes and practices that we aim to inculcate in our students. They should be able to reason about arguments independently of their own desires and biases. They should be skeptical of appeals to authority, and discerning between epistemic and non-epistemic authorities. They should be open to evidence, and not inclined to seek out only evidence that confirms their preconceptions. They should also be responsive to evidence, changing their beliefs on the basis of the best available arguments. The critical thinking literature is rife with work on how best to teach this ideal.

One challenge that faces any education in critical thinking is the pervasive effect of cognitive bias. Given this psychological reality, Paul Thagard argues that critical thinking pedagogy has to be responsive to insights from the psychology of reasoning (Thagard 2011). Teaching only the tools of formal and informal logic will not suffice because, even if these tools describe the normative ideal, they do not inculcate it. To put Thagard's argument simply: the way we actually reason is more complex and multifaceted than the serial inference model presumed by these logical tools.
In other work, I have argued (Maynes 2013, 2015) that critical thinking pedagogy can and should be responsive to this work in psychology, and I have attempted to show how instructors in critical thinking can better address cognitive bias in the classroom (though see Kenyon and Beaulac 2014 on the challenges of doing so).

Yet, one might argue, this is a fool's errand. On approaches defended by Gigerenzer (2008), and Mercier and Sperber (2011), the emphasis on cognitive biases obscures the ways in which our reasoning systems have evolved to be highly effective in helping us navigate the world. Instead, these biases are the result of an evolved set of shortcuts and heuristics that we use in order to make sense of a complex environment. While these heuristics are not ideally rational, they are ecologically rational. That is, given the situations that humans face, and have faced in our evolutionary heritage, these heuristics allow us to operate successfully at a low cognitive cost (in comparison with deliberative, more ideally rational processes).

Bernard Williams famously suggested that "reflection can destroy knowledge" (Williams 1986, p. 148). That is, we already possess rich moral knowledge that is unseated when we begin reflecting on whether that moral knowledge is actually true, and whether our values are really good. Similarly, perhaps critical thinking education destroys critical thinking. If we have successful reasoning tools in place now, and we aim to break students out of these heuristics, then we run the risk of undermining the very tools that students use to reason well in the world. A dilemma faces us: if the heuristics cannot be displaced, then the education in critical thinking is ineffective. If the heuristics and biases can be displaced, then the education may be harmful. Either way, it would seem, we ought not aim at the critical thinking ideal in our classrooms.

In this essay, my aim is to defend the ideal of critical thinking from this objection. I argue that the ideal retains a place in our attempts to teach critical thinking skills, and that we should aim to produce reasoners who can use the tools of informal and formal logic in parallel with the heuristics that guide them in everyday life. This position concedes ground to the objection. We should not expect reasoners to be ideally rational and to fully insulate and protect themselves from cognitive bias. Nor should we want to. I will argue that Gigerenzer, and Mercier and Sperber, are right that the heuristics play an important role in our cognitive lives that should not be given up easily. However, the ideals of critical thinking provide reasoners with a means to handle the complex reasoning situations presented to us in a sophisticated and modern world.

2. Against the ideal

The critical thinker can be defined in terms of a set of abilities and a set of dispositions to make use of those abilities. For example, Siegel (1988) defines the critical thinker as someone who is "appropriately moved by reasons." Being the sort of person who is appropriately moved by reasons will require a number of subsidiary skills and traits, including an openness to new evidence and the ability to discern when and to what extent a reason ought to move you. Among the dispositions and abilities that constitute Ennis' (1991) definition of a critical thinker are:

Dispositions of the ideal critical thinker:
1. to be clear about the intended meaning of what is said, written, or otherwise communicated.
2. to determine and maintain focus on the conclusion or question.
3. to seek and offer reasons.
4. to try to be well informed.
5. to look for alternatives.

Abilities of the ideal critical thinker:
1. to analyze arguments.
2. to define terms, judge definitions, and deal with equivocation.
3. to judge the credibility of a source.
4. to deduce, and judge deductions (Ennis 1991, pp. 8–9).

Broadly speaking, the abilities involve skill with the tools of informal logic, while the dispositions characterize epistemological virtue, or the characteristics of a good reasoner. That our actual reasoning systems do not match the ideal is clear from the empirical literature on human reasoning. In some cases, our reasoning systems are prone to error, and even in cases where they are more reliably accurate, they often rely on shortcuts which are themselves epistemically suspect.

Consider, for example, differences between the results of probability theory and our intuitions about probability. The gambler's fallacy occurs when someone assumes that probabilistically independent events are, in fact, dependent. If I have had a bad run at the roulette table, I might suppose that my bad results make it increasingly likely I will win in the future (I am "due"). This, however, is a mistake, as the past results have no effect on my future results. This fallacy is common; indeed, Stich reports an example from a nineteenth century logic text that explicitly endorses this kind of fallacious reasoning (Stich 1990, p. 164)!

Tversky and Kahneman contend that our tendency to commit this fallacy is caused by the representativeness heuristic, where one judges the probability or properties of some event or object A based on the degree to which it is representative of, or resembles, our beliefs about the more general class to which A belongs. For example, in the roulette case, we expect the series of results to conform to our prior expectation of the probability distribution of wins and losses. Since I expect the entire set of results to conform to that distribution, if I have a bad run, I will expect more wins in the upcoming spins of the wheel, so that the entire set looks more like that distribution (Tversky and Kahneman 1974, p. 1125). They credit this heuristic with causing a number of fallacious reasoning tendencies with regard to probability, including base rate neglect and the conjunction fallacy.
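The independence point can be made concrete with a minimal simulation, a sketch assuming a simple bet on red in European roulette (the code and its parameters are illustrative, not drawn from the empirical literature cited above):

```python
import random

# Illustrative sketch: a simple bet on red in European roulette wins
# with probability 18/37, independently on every spin.
P_WIN = 18 / 37

def spin():
    """One independent spin; True means the bet wins."""
    return random.random() < P_WIN

def win_rate_after_losses(streak_len, trials=1_000_000):
    """Estimate P(win) on a spin that immediately follows a run of at
    least `streak_len` losses."""
    wins = follows = streak = 0
    for _ in range(trials):
        result = spin()
        if streak >= streak_len:  # this spin follows a losing run
            follows += 1
            wins += result
        streak = 0 if result else streak + 1
    return wins / follows

# Both print roughly 0.4865: a losing streak does not make the next
# spin any more likely to win. Being "due" is an illusion.
print(win_rate_after_losses(5))
print(P_WIN)
```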
We are also subject to a range of related biases concerning our evaluation of evidence. For example, belief bias is the tendency to evaluate evidence based upon the believability of the conclusion, rather than on its own merits (Evans, Barston, and Pollard 1983). That is, if evidence supports something we already believe is true, we are more likely to find it to be stronger evidence than if that same evidence supported something we did not believe. Such a bias operates on our evaluation of evidence, and not in place of it. That is, the bias is distinct from a simple refusal to countenance evidence against one's current position; the bias inclines us to weigh evidence differently based on its consistency with our prior beliefs.

These observations about human reasoning are often cast as limitations. In recent work, Gigerenzer (Gigerenzer 2008; Kruglanski and Gigerenzer 2011) and Mercier and Sperber (2011) have argued that many of these biases are better understood as ecologically rational heuristics. Ecological rationality is rationality in context. That is, an ecologically rational strategy is one that is rational when used in a particular environment or situation. Our cognitive strategies have evolved in particular environments, and can take shortcuts given facts about those environments. Taking these shortcuts reduces cognitive demand, while providing results that are just as good, if not better, than using more sophisticated strategies. The conditions under which a strategy is ecologically rational specify when that strategy is likely to produce these equivalent (or better) outcomes.

One of the heuristics in what Gigerenzer calls our "adaptive toolbox" (Gigerenzer 2008) is the recognition heuristic (Goldstein and Gigerenzer 2002). A person using this heuristic will, when presented with a pair of alternatives where one is recognized, infer that the recognized item better fits some relevant criterion. This heuristic is ecologically rational when recognition validity is greater than 0.5, or, put simply, when recognizability is sufficiently correlated with the criterion. For example, when asked to determine the relative size of American cities, German participants were more accurate than their American counterparts (Goldstein and Gigerenzer 1999, 2002). The American respondents surely had more information about the cities than did the German respondents. Indeed, the Germans' reliability is likely based on their paucity of information; they judged that the city they recognized was the bigger one. While recognizability is not an infallible guide to size, more populated cities tend to be more recognizable. Given that the respondents are living in a world where recognizability and population size are correlated strongly enough, the strategy turns out to be ecologically rational.

Such a strategy is adaptive. Not only can it be used when one has highly limited information, but it works in cases where "less-is-more" (Goldstein and Gigerenzer 2002, p. 88), that is, where adding more information makes one liable to make a worse decision. For example, using this heuristic may help individuals avoid consumption of toxic food (Goldstein and Gigerenzer 2002). If a member of my group eats something new, and later dies, I might avoid foods that I recognize as being similar to the one my compatriot ate. This is a case where less-is-more, because if I attempt to make use of my limited biochemical and nutritional knowledge, I am liable to make a worse decision than I would have if I relied on the recognition heuristic.
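A small sketch can illustrate what the recognition-validity condition amounts to. The population figures below are rough, and the recognition set of a hypothetical German respondent is invented for illustration:

```python
from itertools import combinations

# Hypothetical sketch of the recognition heuristic (after Goldstein and
# Gigerenzer 2002). Populations are rough figures; the recognition set
# is invented for illustration.
populations = {
    "New York": 8_400_000,
    "Chicago": 2_700_000,
    "El Paso": 680_000,
    "Chula Vista": 270_000,
}
recognized = {"New York", "Chicago"}  # what our hypothetical respondent knows

def recognition_validity(pairs):
    """Share of pairs with exactly one recognized city in which the
    recognized city really is the larger one."""
    applicable = correct = 0
    for a, b in pairs:
        if (a in recognized) != (b in recognized):
            applicable += 1
            known, unknown = (a, b) if a in recognized else (b, a)
            correct += populations[known] > populations[unknown]
    return correct / applicable

pairs = list(combinations(populations, 2))
# The heuristic is ecologically rational when this exceeds 0.5; here
# recognition tracks population perfectly, so it prints 1.0.
print(recognition_validity(pairs))
```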
While exploring the conditions under which these strategies are rational may help explain how they evolved, these conditions are often met in our everyday reasoning contexts. Take, for example, the 1/N heuristic, which states that, when presented with a resource to allocate over N alternatives, you should split the resource equally across all of the alternatives. Applied to financial portfolios, DeMiguel, Garlappi and Uppal (2009) compared the heuristic's performance against twelve strategies for optimal resource allocation. None of the twelve outperformed the heuristic. This is not to say, however, that it is always rational to use the 1/N heuristic. If I am given one hundred dollars, and offered the choice of investing it in a stable and predictable fund and/or investing it in lottery tickets, I should not simply split the money equally between the investment and the lottery. The strategy is ecologically rational in conditions of high predictive uncertainty, with a large number N of alternatives, and a small learning sample with which to refine alternative strategies.
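A toy comparison can illustrate why 1/N is hard to beat under exactly these conditions. This is not DeMiguel, Garlappi and Uppal's actual test; the return distribution and the rival "chase the winner" strategy below are assumptions for illustration only:

```python
import random
import statistics

# Illustrative toy environment: high predictive uncertainty (i.i.d.
# noisy returns), many alternatives, tiny learning sample -- the
# conditions under which 1/N is said to be ecologically rational.
N_ASSETS, PERIODS = 20, 500
random.seed(0)
history = [[random.gauss(0.01, 0.10) for _ in range(N_ASSETS)]
           for _ in range(PERIODS)]

def one_over_n(past):
    """The 1/N heuristic: split the resource evenly across alternatives."""
    return [1 / N_ASSETS] * N_ASSETS

def chase_winner(past):
    """'Optimizing' on a one-period learning sample: all-in on last winner."""
    best = max(range(N_ASSETS), key=lambda i: past[-1][i])
    return [float(i == best) for i in range(N_ASSETS)]

def performance(strategy):
    rets = [sum(w * r for w, r in zip(strategy(history[:t]), history[t]))
            for t in range(1, PERIODS)]
    return statistics.mean(rets), statistics.pstdev(rets)

# Both strategies have the same expected return in this environment,
# but 1/N carries roughly 1/sqrt(N) of the volatility.
print("1/N:          mean %.4f  sd %.4f" % performance(one_over_n))
print("chase winner: mean %.4f  sd %.4f" % performance(chase_winner))
```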
Mercier and Sperber (2011) have a similarly optimistic reading of the heuristics and biases literature. On their approach, the function of our reasoning systems is argumentative; we use them to convince others of our positions. For example, we are prone to confirmation bias because our aim is persuasion, and so we will cobble together the evidence that will best convince our audience. This, in turn, improves the epistemic position of the group. Since many members of the group are forcefully defending their own viewpoints, the entire group has to confront a range of arguments. The truth need not emerge from the efforts of a single interlocutor; rather, it emerges from this group conversation. Further, Mercier and Sperber argue that these biases primarily operate on us as producers of arguments, rather than as evaluators of arguments (at least when we are seeking the truth). As with Gigerenzer's heuristics, reasoning behaviors that appear to be irrational turn out to be valuable reasoning tools in the right context. In Mercier and Sperber's argumentative theory, that context is a social one.

Heuristics are effective, and in fact, we would likely be worse off if we did not make use of them. Consider a game of baseball, where an outfielder has to make a catch on a fly ball. One way for the player to project where the ball will land is to do the math and calculate the ball's trajectory. Another is for the player to adjust her position to keep the angle of elevation of gaze on the ball increasing at a decreasing rate, while also controlling the rate of horizontal rotation needed to keep her gaze fixed on the ball (McLeod, Reed, and Dienes 2006). In actual fly ball catching scenarios, the second is a far more effective strategy. It has the decisive advantage in speed and in its informational demands. In most fly ball catching contexts, the player has to make the decision quickly, and with only limited information about the speed and trajectory of the ball. The result is that outfielders using the latter strategy will catch more fly balls than those using the former strategy.

The question, then, is whether teaching critical thinking is like teaching the outfielder to catch fly balls through mathematical analysis of trajectory. It may be useless (the outfielder will just revert to well-used methods for estimating trajectory) or harmful (she may no longer be able to catch fly balls). Teaching critical thinking may be useless for a pair of reasons. First, the tendency to use heuristics and biases may simply be too strong; the cognitive effort required to override them may be too high for people to regularly do so. Second, our heuristic strategies are positively reinforced; in many cases, they work quite well. Reasoners may lack the motivation to try to override them.

Teaching critical thinking skills may be harmful if reasoners end up using a strategy that is too difficult to use, particularly under the time constraints of many real life reasoning scenarios. Mathematical calculation of trajectory is difficult to do, and impossible to do in the timeframe required to catch a fly ball. Similarly, probabilistic reasoning is challenging and time-consuming, and may lead to an increase in time spent (and so missed opportunities) and in errors committed. If, then, critical thinking education dislodges the habit of using a strategy which is ecologically rational, and instead encourages the use of one which is slower and more error-prone, then that education may actually be harmful. I turn to answering this challenge in the next section.

3. Ideal reasoners and the aims of critical thinking education

What are the possible outcomes of a critical thinking education that embraces the normative ideal and aims to help students debias? One is that the debiasing fails, and courses in critical thinking are insufficient to bring students closer to the normative ideal. Another is that the debiasing succeeds, but this success leaves students adrift and unable to use their effective and ecologically rational cognitive strategies in the right scenarios. A third is that the debiasing succeeds so thoroughly that we produce ideal critical thinkers. A fourth is that the debiasing succeeds in producing metacognitively aware reasoners who are disposed to use normatively ideal strategies in the appropriate situations.

Both experience and theory tell against the third of these possibilities, and I will set it aside here. If the first is true, then critical thinking courses are a waste of time and resources. On this view, the psychological barriers to inculcating good reasoning dispositions are so strong that the methods we use are insufficient to overcome them. If the second is true, then critical thinking courses are not only wasteful, but actively harmful. In the remainder of this essay I will argue for the fourth and against the first and second. That is, I will argue for an understanding of the goals of critical thinking that is responsive to the ecological rationality critique, but nevertheless makes room for successful debiasing and approximation of the normative ideal.

To begin, it is worth revisiting the two aspects of critical thinking discussed earlier. The first concerns the dispositions of the critical thinker, such as being more responsive to evidence, more charitable, etc., while the second concerns the skills, or the logical strategies and tools, used by the critical thinker. Crucially, while possession of the skills and strategies may be necessary for being a critical thinker, the dispositions are not defined such that constant employment of those skills and strategies is necessary. Achieving the dispositional goals might involve a mixture of a variety of reasoning strategies, including not only the tools of informal and formal logic, but also the use of heuristics. If the tools we teach leave one worse off with regard to these dispositions (the second possibility noted above), then they will make students less critical. In this way, we should be careful not to confuse the tools of critical thinking education with its dispositional goals.

Perhaps, one might argue, the heuristics and biases are by their very nature antithetical to the dispositional goals of a critical thinker.
These heuristics incline us to take shortcuts through evidence, and to marshal evidence for our pre-existing beliefs, and doing so may be inconsistent with the dispositional goals. If our goal were to produce ideal reasoners, then this would be true. If, however, our goal is to approximate the ideal reasoner, then the best way to do so is to integrate the psychological tools that actual reasoners bring to bear on the world.

The ideal reasoner serves a similar function to the virtuous person in ethical theory. Suppose that impartiality is a moral requirement; we should treat all individuals as having equal moral standing, and we should not privilege the goods of those close to us. Actually inculcating the tendency to respond to the world in this way is difficult, if not psychologically impossible. For example, one might not be able to think of the needs and welfare of other people's children in the same way as one thinks of the needs and welfare of one's own child. So, to approximate the impartial moral ideal, I might hope to use individuals' particular affection for their own children. For example, I might ask you to think about how other parents see their own children, or ask you to imagine someone in need as if they were your child. In this case, I am using the psychological mechanism that undermines impartiality in order to approximate it. So too might we be able to use the heuristics to better approximate the ideal reasoner.

What is needed, then, is a pair of goals for critical thinking. One is the ideal reasoner, conceived independently of human psychology. This reasoner shows us what is rational, as the truly virtuous person shows us what is good. The second is the ideal human reasoner, the best approximation of the ideal reasoner consistent with facts about human psychology. The crucial point is that increased attempts to mimic the ideal reasoner may produce a less ideal human reasoner. The goal for critical thinking pedagogy is the ideal human reasoner, the one whose epistemic success most closely resembles the ideal reasoner's, even if not all of her methods do.

Offering a precise definition of the ideal human reasoner depends on empirical facts that are not yet settled. It is possible, however, to offer a general sense of what this reasoner is like and how she differs from the ideal reasoner. The dispositional component of critical thinking gives us guidance. One of the dispositions in Ennis' definition is "to try to be well informed." Such a disposition seems to be undermined by confirmation bias, which is our tendency to seek out information which confirms our prior beliefs. One will hardly be well informed if one's information is biased in favor of one's prior beliefs.

This disposition may suggest an ideally rational strategy of collecting all of the information one can, from the broadest array of sources one can. Yet perhaps a different strategy would better approximate the ideal. For example, one might follow the ideal strategy in situations which are high-stakes, and where one has time to sift through the information. In other cases, perhaps one is better served by cultivating connections with people who disagree with you, providing a ready supply of contrary-to-expectation information which might otherwise be overlooked due to confirmation bias.
The same could be said for the heuristics. An ecologically rational heuristic is one that is rational in the right kinds of situations. Consider some of the conditions under which these heuristics are ecologically rational:

Table 1: Heuristics (Gigerenzer 2008, p. 24)

Recognition Heuristic
  Definition: If one of two alternatives is recognized, infer that it has the higher value on the criterion.
  Ecologically rational if: recognition validity > .5.

Tallying
  Definition: To estimate a criterion, do not estimate weights, but simply count the number of favoring cues.
  Ecologically rational if: cue validities vary little, low redundancy.

Default Heuristic
  Definition: If there is a default, do nothing about it.
  Ecologically rational if: the values of those who set the defaults match those of the decision maker, and the consequences of the choice are hard to predict.

Imitate the Majority
  Definition: Look at a majority of people in your peer group, and imitate their behavior.
  Ecologically rational if: the environment is not changing, or only slowly changing, and information search is costly or time-consuming.

Consider the tallying heuristic first. Suppose that I am planning to purchase a car, and am considering two possible options. I take an inventory of all of the advantages and disadvantages of each, and attempt to use that information to make an informed decision about which car to buy. If I follow the tallying strategy, I'll total up the considerations for and against, and choose the one with the greatest quantity of supporting information. This strategy, however, is rational when cue validities vary little (that is, each piece of information is roughly equivalent in significance) and the information is low in redundancy. In the car case, however, the cue validities do matter. Evidence about the quality of make, reliability, and performance matters a great deal more than comfort features like heated seats. What's more, purchasing a car is a decision with high stakes (cars are expensive), and one for which I have adequate time to reflect and consider the evidence. This is a prime case for recognizing that tallying is not an effective strategy, and that the relative weights of pieces of evidence should be considered.
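A minimal sketch, with invented cues and weights, shows how tallying and weighting can recommend different cars when cue validities vary widely:

```python
# Hypothetical sketch of the car example; all cues, weights, and car
# profiles below are invented for illustration.
cues = ["reliability", "safety", "fuel economy",
        "heated seats", "sunroof", "color options"]
weights = {"reliability": 5, "safety": 5, "fuel economy": 3,
           "heated seats": 1, "sunroof": 1, "color options": 1}

# 1 = the cue favors the car, 0 = it does not.
car_a = {"reliability": 1, "safety": 1, "fuel economy": 1,
         "heated seats": 0, "sunroof": 0, "color options": 0}
car_b = {"reliability": 0, "safety": 0, "fuel economy": 1,
         "heated seats": 1, "sunroof": 1, "color options": 1}

def tally(car):
    """Tallying heuristic: ignore weights, count the favoring cues."""
    return sum(car[c] for c in cues)

def weighted(car):
    """Deliberative strategy: weight each cue by its importance."""
    return sum(weights[c] * car[c] for c in cues)

# Tallying prefers car B (4 favoring cues vs. 3); weighting prefers
# car A (13 vs. 6). Tallying only tracks the weighted verdict when
# cue validities vary little.
for name, car in [("A", car_a), ("B", car_b)]:
    print(name, "tally:", tally(car), "weighted:", weighted(car))
```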
Similarly, suppose that I am deciding how to vote on a contentious, but important, issue at a faculty meeting. I notice that the majority of people in my peer group are siding with the proposal. If I follow the imitate-the-majority strategy, I will support the proposal as well. In such a case, however, the search for information is neither costly nor time-consuming, relative to the importance of the issue. Instead, in this situation, I ought to conduct a more thorough review of the evidence.

One of the crucial skills, then, of the ideal human reasoner is metacognitive awareness of the conditions under which she ought to use ideal strategies, and the conditions under which the ideal strategies are not useful, or worse, harmful. While the ideal reasoner is disposed to employ critical thinking strategies across all contexts, the ideal human reasoner is disposed to employ them in contexts where they would be most effective. Such a disposition requires one to monitor the context, and one's own thinking, to recognize when the conditions have been met for employing the ideal strategy, as well as the cognitive skill to use the strategy correctly.

I have sketched out a conception of the ideal human reasoner on which that reasoner continues to rely on heuristics, but recognizes a range of cases where those heuristics are not rational, and has the tools to reason more effectively in those situations. Since this reasoner uses more sophisticated strategies in precisely the scenarios where heuristics are unreliable, it is a rationally preferable ideal. Is it, however, psychologically plausible? Do we have reason to think that the pursuit of this ideal is more likely to succeed, and not unsettle students' effective use of heuristics? There are three lines of evidence that support the plausibility of this proposal. First, the required skills are teachable; second, there are successful debiasing strategies available; and third, the heuristics are very hard to dislodge.

The teachable skills include metacognitive skills (regarding both awareness of cognition and the ability to regulate one's cognition) and the skills of informal logic (including argument diagramming and evaluation). The skills of informal logic are straightforward cognitive strategies, and are just as teachable as any other similar cognitive strategies (such as learning mathematics). The importance and development of metacognitive skill has been a focus of research in educational theory and psychology for some time, and there is evidence that it, too, is teachable (Dignath, Buettner and Langfeldt 2008; Donker et al. 2014; Haller, Child, and Walberg 1988; Hartman 2001; Schraw 1998).

Not only are the requisite skills teachable; there are also strategies available which have been shown to be successful at mitigating the effects of cognitive bias. I have argued elsewhere (Maynes 2015) that debiasing strategies can successfully be employed in the classroom (see also Larrick (2004) for an overview of debiasing strategies). For example, the "consider-the-opposite" strategy, in which one articulates the arguments for an alternative view, has been found to be effective in mitigating cognitive biases such as the hindsight and anchoring biases. Such a technique is not only effective in mitigating bias, it is teachable; by inculcating the metacognitive skill involved in knowing when to employ this strategy, students can be taught to debias themselves.

Third, and finally, the heuristics and biases are notoriously difficult to dislodge. Higher general cognitive ability does not reduce or eliminate the presence of these biases. Stanovich and West (2008) found that cognitive bias is not correlated with cognitive ability, and indeed, higher cognitive ability may be correlated with even greater blindness to our own biases (West, Meserve, and Stanovich 2012). Similarly, while debiasing strategies have been effective, that effectiveness is limited to mitigation. Even after providing evidence that the consider-the-opposite strategy has debiasing effects, Lilienfeld et al. conclude that "it is possible that given the formidable barriers against debiasing we have outlined, even the most efficacious of intervention efforts may meet with only partial success" (Lilienfeld et al. 2009, p. 395). That is, while the pursuit of the ideal human reasoner can meet with some success, the heuristics are likely to prove too strong to dislodge completely.
I began by considering four possible outcomes of a critical thinking education, and set aside the production of fully ideal critical thinkers. Of the remaining three, one is the metacognitively skilled human reasoner that I have developed in this section. The second is that critical thinking education is ineffective. This worry is dispelled by the efficacy of debiasing strategies, and the teachability of the component skills of the metacognitively skilled human reasoner. The third is that critical thinking education is destructive, in that students' use of ecologically rational heuristics is disrupted such that they reason less effectively. While this possibility cannot be ruled out definitively, the strength of our heuristics argues against it. The metacognitively skilled human reasoner, who uses heuristics when they are rational, and superior cognitive strategies in the right contexts, is an achievable, and preferable, goal to set for critical thinking education.

This suggests three directions for future research. First, specifying the conditions under which strategies are ecologically rational remains an ongoing project. Such work maps the advantages and limits of ecologically rational heuristics. Without a clear sense of the heuristics that we do follow, and when they operate successfully, we cannot identify the ideal human reasoner as opposed to the ideal reasoner.

The second is identifying metacognitive strategies that approximate the conditions under which strategies are ecologically rational. These strategies have to be simple enough to guide an actual reasoner in real reasoning scenarios, while at the same time best capturing the conditions under which the ideal human reasoner would follow an ideal strategy in place of a heuristic. For example, it will hardly be helpful, in most everyday situations, to calculate whether the recognition validity is greater than 0.5. However, a rule that, in high-stakes contexts where recognizability is unrelated, or only tenuously related, to the criterion, one should take stock of one's evidence and consider the evidence for the opposing proposal may be more effective.
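A minimal sketch of such a rule, with hypothetical context features, strategy names, and decision logic, might look as follows:

```python
from dataclasses import dataclass

# Hypothetical sketch of a metacognitive "gate" of the kind proposed
# above. The context features and the decision logic are illustrative
# assumptions, not an empirically validated rule.

@dataclass
class Context:
    high_stakes: bool        # is the decision costly to get wrong?
    recognition_tracks_criterion: bool  # does familiarity plausibly track the criterion?
    time_pressure: bool      # must the decision be made immediately?

def choose_strategy(ctx: Context) -> str:
    """Default to the cheap heuristic; switch to deliberate evaluation
    only when the heuristic's ecological conditions plausibly fail and
    deliberation is feasible."""
    if ctx.time_pressure:
        return "recognition heuristic"  # no time for anything else
    if ctx.high_stakes and not ctx.recognition_tracks_criterion:
        return "deliberate: take stock of evidence, consider the opposite"
    return "recognition heuristic"

# Everyday, low-stakes choice under time pressure: keep the heuristic.
print(choose_strategy(Context(False, True, True)))
# High-stakes choice where familiarity is a poor guide: deliberate.
print(choose_strategy(Context(True, False, False)))
```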
Third, identifying pedagogical interventions which will help develop the metacognitive awareness and skill to deploy these strategies will better enable educators to apply this framework in the classroom. While definitions of metacognitive ability vary, they typically have two central components: knowledge of cognition and the ability to regulate cognition (McCormick 2003). Gigerenzer's adaptive toolbox provides a model for the information needed to possess knowledge of cognition, as it spells out the conditions under which a heuristic is ecologically rational. Exercises to develop skill at regulating cognition will then help students to actually employ this knowledge, and the ideal strategies, when the conditions are appropriate.

Taking up these questions, each of which depends upon advances in empirical work on human reasoning and pedagogy, answers the ecological rationality challenge. Aiming at the ideal human reasoner suggests changes in the way we approach teaching critical thinking skill. Rather than focusing solely on the tools of informal and formal logic, which specify the ideal reasoner, we should instead help students develop the tendency to use these strategies in the right conditions. The result will be students who, while not ideal reasoners, will better approximate this goal as they reason their way through the world.

References

DeMiguel, Victor, Lorenzo Garlappi and Raman Uppal. 2009. Optimal versus naive diversification: How inefficient is the 1/N portfolio strategy? Review of Financial Studies 22(5): 1915–1953.

Dignath, Charlotte, Gerhard Buettner and Hans-Peter Langfeldt. 2008. How can primary school students learn self-regulated learning strategies most effectively? A meta-analysis on self-regulation training programmes. Educational Research Review 3(2): 101–129.

Donker, A. S., H. de Boer, D. Kostons, C. Dignath van Ewijk, and M. P. C. van der Werf. 2014. Effectiveness of learning strategy instruction on academic performance: A meta-analysis. Educational Research Review 11: 1–26.

Ennis, Robert. 1991. Critical thinking: A streamlined conception. Teaching Philosophy 14(1): 5–24.

Evans, Jonathan St. B. T., Julie L. Barston and Paul Pollard. 1983. On the conflict between logic and belief in syllogistic reasoning. Memory & Cognition 11(3): 295–306.

Gigerenzer, Gerd. 2008. Why heuristics work. Perspectives on Psychological Science 3(1): 20–29.

Goldstein, Daniel G., and Gerd Gigerenzer. 1999. Betting on one good reason: The take the best heuristic. In Simple heuristics that make us smart, eds. Gerd Gigerenzer, Peter M. Todd, and ABC Research Group, 37–58. Oxford: Oxford University Press.

Goldstein, Daniel G., and Gerd Gigerenzer. 2002. Models of ecological rationality: The recognition heuristic. Psychological Review 109(1): 75–90.

Haller, Eileen P., David A. Child and Herbert J. Walberg. 1988. Can comprehension be taught?: A quantitative synthesis of "metacognitive" studies. Educational Researcher 17(9): 5–8.

Hartman, Hope J., ed. 2001. Metacognition in learning and instruction: Theory, research and practice. Dordrecht: Springer Netherlands.

Kenyon, Tim, and Guillaume Beaulac. 2014. Critical thinking education and debiasing. Informal Logic 34(4): 341–363.

Kruglanski, Arie W. and Gerd Gigerenzer. 2011. Intuitive and deliberate judgments are based on common principles. Psychological Review 118(1): 97–109.

Larrick, Richard P. 2004. Debiasing. In Blackwell handbook of judgment and decision making, eds. Derek J. Koehler and Nigel Harvey, 316–337. Malden: Blackwell Publishing.

Lilienfeld, Scott O., Rachel Ammirati, Kristin Landfield, Richard Nisbett, Lee Ross, and Thomas Gilovich. 2009. Giving debiasing away: Can psychological research on correcting cognitive errors promote human welfare? Perspectives on Psychological Science 4(4): 390–398.

Maynes, Jeffrey. 2013. Thinking about critical thinking. Teaching Philosophy 36(4): 337–351.

Maynes, Jeffrey. 2015. Critical thinking and cognitive bias. Informal Logic 35(2): 183–203.

McCormick, Christine. 2003. Metacognition and learning. In Handbook of psychology: Educational psychology, eds. W. Reynolds and G. Miller, 79–102. Hoboken, NJ: John Wiley and Sons.

McLeod, Peter, Nick Reed and Zoltan Dienes. 2006. The generalized optic acceleration cancellation theory of catching. Journal of Experimental Psychology: Human Perception and Performance 32(1): 139–148.

Mercier, Hugo and Dan Sperber. 2011. Why do humans reason? Arguments for an argumentative theory. Behavioral and Brain Sciences 34(2): 57–74.

Schraw, Gregory. 1998. Promoting general metacognitive awareness. Instructional Science 26: 113–125.

Siegel, Harvey. 1988. Educating reason: Rationality, critical thinking and education. New York: Routledge.
Stanovich, Keith E., and Richard F. West. 2008. On the relative independence of thinking biases and cognitive ability. Journal of Personality and Social Psychology 94(4): 672–695.

Stich, Stephen. 1990. The fragmentation of reason. Cambridge, MA: The MIT Press.

Thagard, Paul. 2011. Critical thinking and informal logic: Neuropsychological perspectives. Informal Logic 31(3): 152–170.

Tversky, Amos and Daniel Kahneman. 1974. Judgment under uncertainty: Heuristics and biases. Science 185(4157): 1124–1131.

West, Richard F., Russell J. Meserve and Keith E. Stanovich. 2012. Cognitive sophistication does not attenuate the bias blind spot. Journal of Personality and Social Psychology 103(3): 506–519.

Williams, Bernard. 1986. Ethics and the limits of philosophy. Cambridge, MA: Harvard University Press.