
Cognitive Theory of Scientific Rationality or of Scientific Reasoning?


Abstract

The project of the naturalisation of epistemology, launched by Quine in 1969, opened a Pandora's box with respect to the naturalistic fallacy. The most disparate attempts to merge the normative with the descriptive have emerged over the past few years. In the theory of knowledge, in metaphysics, in ethics and in the philosophy of science, attempts have been made to transfer empirical knowledge, above all psychological and neuropsychological models of perception, memory, reasoning and decision, to the construction of normative theories. In the philosophy of science this attempt goes by the name of “cognitive philosophy of science”, which in Italy has been translated, not without some ambiguity, as “cognitive theory of science”. Traditionally, like any theory in the philosophy of science, it should be able to answer a number of fundamental questions regarding the status of scientific knowledge. In particular, it should be able to justify, or otherwise, why science is different from other human activities and why it has increased man’s knowledge of nature. In short, it should be able to propound a theory of scientific rationality that highlights the methodological specificities underlying the conceptual change of science and that set science apart from man’s other cognitive activities.

The present chapter is a modified version of Viale, R. (1997). Teoria della Razionalità o del Ragionamento Scientifico?, Sistemi Intelligenti, Anno IX, n. 2.


Notes

  1. In the philosophy of science there appears to be general agreement that a theory of scientific rationality is vindicated if it can explain scientific facts in line with the following summarizing model (Newton Smith 1981, p. 4):

    1. The goal pursued by the scientific community was the one specified by the theory of rationality.

    2. Based on the available evidence, the new theory T2 was superior to the old theory T1 (in terms of the principle of choice specified by the theory of rationality).

    3. The scientific community realised that T2 was superior to T1.

    4. This awareness of its superiority motivated the members of the community to abandon T1 in favour of T2.

    In general, theories of scientific rationality have aimed to select either theories that are true or likely (realism) or theories capable of generating predictions (instrumentalism). The principles of methodological choice, defined in relation to the goal to be attained, can generally be summed up, formal characteristics being equal, as good support from the empirical basis (in the form of passed experimental tests or the absence of falsifying results).

  2. The incommensurability thesis has prompted further analyses that weakened the irrationalistic claims. Field (1973), with his hypothesis of the partial denotation of a theoretical term owing to its semantic indefiniteness, proposed that the same term could be used in two different theories with enough common meaning to avoid the paradox of incommensurability. Moreover, as Fine (1975) asserted, given that later theories evolve from earlier ones, it can be supposed that there is an overlap of meaning between the two.

  3. The analysis of scientific knowledge in the nineteenth and twentieth centuries was predominantly the preserve of philosophy and the social sciences, as well as of history. Starting from different methodological premises and ends, philosophers and sociologists often found themselves sharing converging judgements and valuations of the cognitive status of science.

    In the nineteenth century, Comte and positivism regarded science as the only true human knowledge. Philosophy was left only the task of promoting the spread of the scientific approach to fields as yet untouched by its progress. Like Comte, Durkheim awarded science a role superior to other forms of knowledge. And even Marx, in several passages, appears to acknowledge science’s status as distinct from ideology.

    The first half of the twentieth century was still dominated by the affirmation of the distinctiveness of scientific knowledge. On the one hand, neopositivism and Popper justified this distinctiveness on the basis of formal criteria that demarcate what is scientific from what is not. On the other, Merton and even Mannheim uncritically assumed its superiority, though solely with regard to the natural and formal sciences, and Merton tried to ground this in the social norms present within scientific communities.

    During the second half of the twentieth century, the various a priori justifications of the special status of scientific knowledge found less and less correspondence in the results of empirical historical and sociological studies. Science does not appear to be governed by rules that differ from those of any other human activity. Scientific knowledge itself does not present characteristics that are qualitatively distinct from more profane and ordinary forms of knowledge. Kuhn and Feyerabend, on the one hand, and Collins, Pickering, Bloor and Latour, on the other, appear to converge on a sceptical vision in which we must yield to the observation that even “the natural world does not limit what we believe it contains” (Collins 1981, p. 54). Whereas scientific activity had earlier appeared to be governed by rational methodological rules (Popper’s principle of falsification) or deontological rules (the Mertonian ethos), the new philosophy and sociology of science represent science as an anarchic activity (Feyerabend), a negotiation (Mulkay and Collins) or a transaction (Latour and Hagstrom). Nothing remains of the past certainty that had made Adam Smith exclaim that “Science is the great antidote to the poison of enthusiasm and superstition”.

  4. In contemporary sociology of science there is no clear analysis of action based on the scientist’s individual decision-making factors (Viale 1991; see Chap. 7 of this volume). Social studies of science are guided, as in the case of the post-Mertonians, Collins and the Edinburgh School, by a functionalist approach (the scientist is a prisoner of his teleological role of satisfying collective variables like power, prestige, wealth, etc.) or, as in the case of Latour and Woolgar, by a qualitative ethnomethodological approach (the different forms of the scientist’s behaviour are dissected and pared down until their meaning dissolves into the banal meaning of everyday life). To sum up, the scientist is either a puppet whose strings are pulled by the puppeteer, Community or Collectivity, or a container lacking any specific meaning that interacts with others like himself.

  5. The term cognitive theory of science is used here to mean the epistemological attempt to justify the unique nature of scientific knowledge on a cognitive basis. This category therefore excludes theories and empirical hypotheses about reasoning and scientific thought, as well as those of cognitive psychology, neuropsychology and AI programmes applied to science.

  6. There are various formulations of Bayes’s theorem. The standard version, applied to scientific reasoning, includes background knowledge as well as the theory:

    $$ P(T/E.B)=\frac{P(E/T.B)\,P(T/B)}{P(E/T.B)\,P(T/B)+P(E/\mathrm{non}T.B)\,P(\mathrm{non}T/B)} $$

    where T is the theory, E is the empirical evidence, B is the background knowledge (auxiliary hypotheses, starting conditions and the ceteris paribus clause), non-T is the falsity of the theory, P(T/E.B) is the a posteriori probability of the theory given the evidence and the background knowledge, P(T/B) is the prior probability of the theory given the background knowledge, P(nonT/B) is the prior probability that the theory is false given the background knowledge, P(E/T.B) is the likelihood of the evidence if both the theory and the background knowledge are true, and P(E/nonT.B) is the likelihood of the evidence if the theory is false and the background knowledge is true. Other formulations use H in place of T or D in place of E and do not include background knowledge.

    The aim of Bayes’s theorem is to establish the probability of a hypothesis in the light of new empirical evidence. The probability of the hypothesis (namely the prior probability P(T/B)) is revised on the basis of the empirical evidence (E) to give a new probability (namely the a posteriori probability P(T/E.B)). The evidence may increase the probability of a hypothesis (e.g. the probability that a patient is affected by a given disease increases after laboratory tests reveal signs of the disease itself). In the same way the evidence may diminish the probability of a hypothesis (e.g. if the tests are negative). In some cases subjects tend to disregard the prior probability and to give weight to the new empirical evidence alone (as when a positive test makes me forget the low probability that I actually have the disease). Or, on the contrary, they tend to underestimate the new empirical evidence and to give more weight to the prior probability (as when a person focuses on the probability of having a disease without taking into account the negative results of clinical tests).
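    The updating, and the base-rate neglect just described, can be made concrete with a minimal numerical sketch of the formula above. The disease figures (prevalence, test sensitivity, false-positive rate) are hypothetical and chosen only for illustration.

```python
# Minimal sketch of Bayes's theorem as given above:
# P(T/E.B) = P(E/T.B) P(T/B) / [P(E/T.B) P(T/B) + P(E/nonT.B) P(nonT/B)]
# Here T = "the patient has the disease", E = "the test is positive";
# background knowledge B is left implicit. All numbers are hypothetical.

def posterior(prior_T: float, likelihood_E_given_T: float,
              likelihood_E_given_nonT: float) -> float:
    """A posteriori probability P(T/E.B) from the prior and the two likelihoods."""
    numerator = likelihood_E_given_T * prior_T
    denominator = numerator + likelihood_E_given_nonT * (1.0 - prior_T)
    return numerator / denominator

prior = 0.01      # P(T/B): low prior probability of the disease (prevalence)
sens = 0.95       # P(E/T.B): probability of a positive test if the disease is present
false_pos = 0.05  # P(E/nonT.B): probability of a positive test if it is absent

print(posterior(prior, sens, false_pos))  # ~0.16, not 0.95: trusting the positive
# test alone and forgetting the low prior is the base-rate neglect described above.
```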

  7. This objection is backed up by Macchi (1995), who appears to show that the anomalies in Bayesian reasoning are eliminated when the information relating to the finding is presented with appropriate adjustments to the test texts, so as to avoid confusion between the a posteriori probability, P(T/E), and the likelihood (the probability of obtaining the datum when the hypothesis is true), P(E/T).

  8. It might be interesting to examine the case in which, in the presence of a hypothesis with a given prior probability (which always includes background knowledge, namely the starting conditions, the auxiliary hypotheses and the ceteris paribus clause), the empirical evidence that deductively derives from T.B proves falsified, but no change can be made to the auxiliary hypotheses. According to some authors, in situations like this there is always underdetermination of the theory with respect to the falsifying evidence, because it is always possible to hypothesise that the falsification targets one of the components of the background knowledge and not the theory. In fact, by applying modus tollens we get:

    T&B → E
    non-E
    -----------
    non-(T&B)

    but the negation of the conjunction T&B may be satisfied by the negation of either component alone, as the short sketch below illustrates. Even if B is negated, it is worth pointing out that it is not clear which part of the background knowledge is falsified: one of its auxiliary hypotheses, one of the starting conditions or the ceteris paribus clause. In the case of T itself it could be argued that the negation targets one of the components of the theory and not the theory in toto. In fact, unlike an empirical generalisation, which can be described as a single assertion, a theory can be represented as a complex conjunction of assertions.
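    The point can be checked mechanically: the sketch below simply enumerates the truth assignments compatible with the two premises. The propositional encoding (material conditional, exhaustive truth assignments) is an illustrative assumption, not part of the original argument.

```python
from itertools import product

# Premises: T&B -> E (material conditional) and non-E (the falsifying observation).
# Enumerating truth assignments shows that only T=True, B=True is excluded:
# the falsification may fall on T, on B, or on both.
E = False                                     # non-E
for T, B in product([True, False], repeat=2):
    conditional_holds = (not (T and B)) or E  # T&B -> E
    if conditional_holds:
        print(f"T={T}, B={B} is compatible with the falsifying evidence")
```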

    Another loophole used to neutralise the anomaly is to appeal to the non-corroboration of the ceteris paribus clause, namely to the assumption that unforeseen effects are present which disturb the phenomenon.

    In the case of Bayesian reasoning the problem is posed differently. If negative evidence is found, with a corroborated ceteris paribus clause and sufficiently corroborated starting conditions, the likelihood is:

    $$ P(E/T.B)=0 $$

    Therefore the a posteriori probability of the theory also becomes zero, as can be seen in (2). Any attempt to save the theory becomes impossible, according to the Bayesian model, even though here again the a posteriori probability relates not only to the theory but also to the background knowledge. Given that it is not possible to appeal to disturbing effects, since the ceteris paribus clause is corroborated, the only way to save the theory would be, as in the previous example concerning the wave theory, an ad hoc move, namely to change one of the auxiliary hypotheses. But in that case we are no longer dealing with the original prior probability, but with the prior probability of something different (and, sometimes, this is not permitted in scientific activity). Nor is it possible to appeal to the underdetermination thesis based on the indeterminacy of which component of the conjunction T&B has been falsified. This is an important consideration because it implies that the Bayesian model does not appear to allow for the underdetermination thesis, in the form described above, which is one of the epistemological instruments used to justify maintaining the theory (as well as the auxiliary hypotheses) in the face of a falsifying finding, something that frequently occurs in the everyday reality of scientific laboratories. In some cases, like the one just described, Bayes’s theorem does not allow this defence. Therefore the behaviour of scientists who tend to resort to these rescue operations is, in principle, incorrect in Bayesian terms for a single piece of falsifying experimental evidence (conservative behaviour based on over-weighting the prior probability or under-weighting the evidence). It should be understood that in scientific practice the term evidence covers the whole set of findings that confirm or disconfirm a theory. Depending on the circumstances, these findings may have likelihoods with values between 0 and 1. Therefore, the a posteriori probability of the theory will depend on the aggregate value of the likelihood estimates for the overall evidential data.
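    A one-line numerical check of this point: with P(E/T.B) = 0 the a posteriori probability collapses to zero whatever the prior. This is a minimal sketch; the value chosen for P(E/nonT.B) is an arbitrary illustration.

```python
# With a corroborated ceteris paribus clause and starting conditions, a falsifying
# observation means P(E/T.B) = 0, so the a posteriori probability of T.B is zero
# whatever the prior. The value 0.3 for P(E/nonT.B) is an arbitrary illustration.
def posterior(prior, lik_E_given_T, lik_E_given_nonT):
    num = lik_E_given_T * prior
    return num / (num + lik_E_given_nonT * (1.0 - prior))

for prior in (0.1, 0.5, 0.9, 0.99):           # even a very high prior does not help
    print(prior, posterior(prior, 0.0, 0.3))  # every posterior is 0.0: no conservative
                                              # rescue of T.B is Bayes-coherent here
```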

  9. According to Bayes’s theorem, the preference for one theory over another should follow the formula below, also known as the Bayesian algorithm for theory preference:

    $$ P(T2/E.B)>P(T1/E.B)\ \text{if and only if}\ \frac{P(E/T2.B)}{P(E/T1.B)}>\frac{P(T1/B)}{P(T2/B)} $$

    Namely, T2 is to be preferred to T1 if and only if the likelihood ratio is greater than the inverse ratio of the respective prior probabilities. This formula highlights how the choice of a theory is always comparative and how the comparison rests on the accumulation of empirical evidence that gradually tips the balance towards the new theory. This happens because, even if the ratio between the prior probabilities may favour T1, the accumulation of empirical evidence that is negative for T1 and positive for T2 makes the ratio between the likelihoods greater than the inverse ratio between the prior probabilities of the two theories. Furthermore, as negative evidence gradually accumulates for T1, its prior probability also tends to diminish, whereas that of T2 increases, thereby reducing the value of the ratio between the prior probabilities. In principle this algorithm might also represent a Kuhnian gestalt-shift situation. However, its focus on the rational comparison between theories, which Kuhn denied, appears better suited to a situation of gradual transition from the acceptance of the preceding theory to the choice of the next one.
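    The comparative character of the rule, and the way accumulating evidence tips the balance, can be illustrated with a small numerical sketch. The priors, the per-datum likelihoods and the assumption that the pieces of evidence are conditionally independent are all hypothetical choices made for illustration, not part of the note.

```python
# Bayesian preference rule: prefer T2 to T1 iff
#   P(E/T2.B) / P(E/T1.B)  >  P(T1/B) / P(T2/B)
# With conditionally independent pieces of evidence the likelihood ratio is a
# product, so repeated findings that favour T2 eventually overturn a prior
# ratio that initially favours T1. All numbers are hypothetical.

prior_T1, prior_T2 = 0.8, 0.2      # prior probabilities P(T1/B), P(T2/B)
lik_T1, lik_T2 = 0.3, 0.7          # per-datum likelihoods P(E_i/T1.B), P(E_i/T2.B)

likelihood_ratio = 1.0             # cumulative P(E/T2.B) / P(E/T1.B)
prior_ratio = prior_T1 / prior_T2  # the threshold to beat: P(T1/B) / P(T2/B)

for n in range(1, 8):              # accumulate seven pieces of evidence
    likelihood_ratio *= lik_T2 / lik_T1
    preferred = "T2" if likelihood_ratio > prior_ratio else "T1"
    print(f"after {n} findings: likelihood ratio = {likelihood_ratio:.2f}, prefer {preferred}")
# The switch to T2 happens once the cumulative ratio exceeds 4.0 (here after two
# findings), mirroring the gradual transition described in the note.
```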

  10. Lavoisier’s chemistry is made up of various elements which, placed in increasing order of the difficulty with which they were accepted by the community, can be outlined as follows: (1) the absorption of air in combustion and calcination; (2) the analysis of atmospheric air into two distinct gases; (3) the theory of oxygen in acids; (4) the caloric theory of heat and of the vapour state; (5) the composition of water; (6) the rejection of phlogiston; (7) the new nomenclature (Perrin 1988, p. 115).

    The acceptance of (6) and (7), above all, represents a clear adhesion to the new chemistry, whereas the acceptance of (1), (2), (3) and (4) could still be accommodated within the framework of the phlogiston theory.

  11. Causal thought based on similarity, also known as “resemblance thinking”, is present in some tribal cultures. This was demonstrated, above all, by Fraser (1964) in a series of examples like the “law of homeopathic magic”, according to which an enemy is destroyed or wounded by destroying his image or fetish. However, it also appears to be present in many everyday activities of our society. We need only think of the beliefs of astrology, whose reasoning is based on this type of thinking: the red colour of Mars is associated with blood, war and aggression, and so on. But even graphology, physiognomy and psychoanalysis make extensive use of “resemblance thinking”. On the other hand, even Tversky and Kahneman’s (1974) representativeness heuristic, which seems to underlie many probability judgements, is based on the degree of similarity between events.

  12. A more detailed cognitive examination of magical thought would require a comparative empirical analysis of contexts where it is manifest and contexts where it is absent, with respect to the dynamics of counterfactual reasoning. The main questions that a study of this kind would need to answer are: (1) is there less counterfactual reasoning in magical thought than in causal thought? (2) If the answer to the first question is negative, is the causal field limited solely to variables from the ontological repertoire prescribed by magico-religious beliefs, or are external events also taken into consideration? (3) If the answer to the second question is positive, why do these external events not join the ranks of causal events? Such a study could provide interesting answers, for instance that magical thought should be regarded not as a deviant form of causal thought, but as normal causal thought applied to a narrow ontological repertoire.

  13. Not to be confused with the methodological repertoire of the factors selected in a scientific enquiry as potential causes. By ontological repertoire we mean the set of objects that together characterise the real world, the ecological context in which the effect is thought to be manifest and within which it is studied.

  14. In the case of the rational justification of scientific inference, Cohen’s (1981) rather debatable proposal, applied to the rationality of the man in the street, of considering a sort of average individual in order to avoid the plurality of responses to the tests does not seem sustainable. In the case of science, the variety of research traditions, specialisations and disciplines makes any attempt to extrapolate a model of the average scientist impossible.

  15. Another proposal that might outline an attempt to construct a cognitive theory of scientific rationality is that of extrapolating the cognitive procedures underlying judgements about the justification of inferences. A normative theory based on these procedures would become a test for recognising whether or not specific decisions were rational (Viale 1991, pp. 288–298). Granted that such an approach ought to meet the same criteria (homogeneity, specificity and non-evident counter-normative status) that hold for the reflective equilibrium test, there are some clues that make a positive outcome of this attempt even less likely. The common notion of justification varies considerably from subject to subject, and therefore it would be difficult to identify a set of conditions that are necessary and sufficient to characterise it. Moreover, within the same subject, justification appears to change in relation to the context of choice (Viale 1999a). The scientist’s case would be no different, since on top of these factors would come the effect of the various disciplinary traditions concerning methodological justification (e.g. inductive, deductive and abductive styles of justification).

  16. This type of conclusion can be applied to any specific theory of rationality that aims to stand apart from the purely instrumental. In ethics, as in economics or in the social sciences, the theory of rationality should contain a clear prescriptive choice regarding the aims of the action. As a result, theories of rationality cannot be reduced to, or based on, the simple empirical generalisation of cognitive procedures of reasoning and decision that are valid in specific contexts.

  17. There are three options concerning the problem of rationality: (1) according to the descriptivist option, it is not possible to establish any prescriptive rationality; instead we should be satisfied with describing reasoning and decision processes, often underdetermined with respect to the aims. There is no theory of rationality, but only a theory of reasoning. (2) According to the instrumentalist option, it is possible to identify a prescriptive rationality linked to the best procedures for attaining any type of purpose. As we have seen, this approach does not differentiate any field of human activity and is therefore a “rationality for all seasons”. (3) According to the axiological option, it is possible to construct a theory of rationality for every field of human activity in which a normative hierarchy of the aims to be pursued can be identified.

    Turning to the third option, this essay does not tackle the problem of how to define an axiology of science, but it is worth highlighting a possible result of such research. Whether the axiology is arrived at in a descriptive-naturalistic way or in an a priori way, we might find ourselves faced with a hierarchy of aims that does not differentiate scientific rationality from that used by ordinary people. For example, the aim of truth, as a correct representation of reality, might also be the goal of the inferential activity undertaken by the man in the street. If this were true, we would be obliged to affirm that the rationality of science is broadly similar to that used by ordinary people. That is, we would be prompted to accept part of the theses put forward by the new philosophy, sociology and psychology of science without being obliged to concede that all scientific activity can be reduced to instrumental rationality aimed at attaining any type of aim (for instance, those in situation a).

References

  • Cheng, P. W., & Novick, L. R. (1991). Causes versus enabling conditions. Cognition, 40, 83–120.
  • Cohen, L. J. (1981). Can human irrationality be experimentally demonstrated? Behavioral and Brain Sciences, 4, 317–370.
  • Collins, H. M. (1981). Knowledge and controversy: Studies of modern natural science. Social Studies of Science, 11, 3–158.
  • Daniels, N. (1979). Wide reflective equilibrium and theory acceptance in ethics. Journal of Philosophy, 76, 256–282.
  • Dunbar, K. (1995). How scientists really reason: Scientific reasoning in real-world laboratories. In R. J. Sternberg & J. Davidson (Eds.), The nature of insight. Cambridge, MA: MIT Press.
  • Einhorn, H. J., & Hogarth, R. M. (1986). Judging probable cause. Psychological Bulletin, 99, 1–19.
  • Feyerabend, P. K. (1975). Against method. London: New Left Books.
  • Field, H. (1973). Theory change and the indeterminacy of reference. The Journal of Philosophy, 70, 462–481.
  • Fine, A. (1975). How to compare theories: Reference and change. Noûs, 9, 17–32.
  • Fraser, J. (1964). The new golden bough. New York, NY: Mentor.
  • Giere, R. N. (1988). Explaining science. Chicago, IL: University of Chicago Press.
  • Giere, R. N. (1994). The cognitive structure of scientific theories. Philosophy of Science, 61, 276–296.
  • Gigerenzer, G., Hell, W., & Blank, H. (1988). Presentation and content: The use of base rates as a continuous variable. Journal of Experimental Psychology: Human Perception and Performance, 14, 513–525.
  • Goldman, A. I. (1986). Epistemology and cognition. Cambridge, MA: Harvard University Press.
  • Goldman, A. I. (1993). Philosophical applications of cognitive science. Boulder, CO: Westview.
  • Goodman, N. (1965). Fact, fiction, and forecast. Indianapolis, IN: Bobbs-Merrill.
  • Grice, H. P. (1975). Logic and conversation. In D. Davidson & G. Harman (Eds.), The logic of grammar. Encino, CA: Dickenson.
  • Hilton, D. J. (1990). Conversational processes and causal explanation. Psychological Bulletin, 107, 65–81.
  • Hilton, D. J., & Slugoski, B. R. (1986). Knowledge-based causal attribution: The abnormal conditions focus model. Psychological Review, 93, 75–88.
  • Jurdant, B., & Olff-Nathan, J. (1982). Socio-epistemologie des hautes energies (DGRST Report No. FRT-78-7-0732). Strasbourg: Gersulp. Retrieved from http://hdl.handle.net/10068/14569
  • Kahneman, D., Slovic, P., & Tversky, A. (1982). Judgment under uncertainty: Heuristics and biases. Cambridge: Cambridge University Press.
  • Kahneman, D., & Miller, D. T. (1986). Norm theory: Comparing reality to its alternatives. Psychological Review, 93, 136–153.
  • Kuhn, T. S. (1977). The essential tension. Chicago, IL: The University of Chicago Press.
  • Laudan, L. (1996). Beyond positivism and relativism. Boulder, CO: Westview.
  • Leslie, A. M. (1988). The necessity of illusion: Perception and thought in infancy. In L. Weiskrantz (Ed.), Thought without language. Oxford: Clarendon Press.
  • Macchi, L. (1995). Pragmatic aspects of the base-rate fallacy. Quarterly Journal of Experimental Psychology, 48A(1), 188–207.
  • Mackie, J. L. (1974). The cement of the universe. Oxford: Clarendon Press.
  • Massey, C., & Gelman, R. (1988). Preschoolers’ ability to decide whether pictured unfamiliar objects can move themselves. Developmental Psychology, 24, 307–317.
  • Mitroff, I. I. (1974). The subjective side of science. Amsterdam: Elsevier.
  • Morris, M. W., et al. (1995). Causal attribution across domains and cultures. In D. Sperber et al. (Eds.), Causal cognition. Oxford: Oxford University Press.
  • Nersessian, N. (1992). How do scientists think? In R. Giere (Ed.), Cognitive models of science. Minneapolis, MN: University of Minnesota Press.
  • Newton Smith, W. H. (1981). The rationality of science. London: Routledge & Kegan Paul.
  • Perrin, C. E. (1988). The chemical revolution: Shifts in guiding assumptions. In A. Donovan et al. (Eds.), Scrutinizing science. Dordrecht: Kluwer.
  • Quine, W. V. O. (1969). Epistemology naturalized. In Ontological relativity and other essays. New York, NY: Columbia University Press.
  • Rawls, J. (1972). A theory of justice. Oxford: Oxford University Press.
  • Salmon, W. C. (1996). Rationality and objectivity in science or Tom Kuhn meets Tom Bayes. In D. Papineau (Ed.), The philosophy of science. Oxford: Oxford University Press.
  • Shinn, T., & Cloitre, M. (1987). Matrici analitiche dell’organizzazione della scienza. Sociologia e Ricerca Sociale, 24, 83–111.
  • Spelke, E. S. (1990). Principles of object perception. Cognitive Science, 14, 29–56.
  • Sperber, D., Premack, D., & Premack, A. J. (Eds.). (1995). Causal cognition. Oxford: Oxford University Press.
  • Sperber, D., & Wilson, D. (1986). Relevance: Communication and cognition. Oxford: Blackwell.
  • Stich, S. (1990a). The fragmentation of reason. Cambridge, MA: MIT Press.
  • Stich, S. (1990b). Rationality. In D. Osherson & E. Smith (Eds.), Thinking. Cambridge, MA: MIT Press.
  • Stich, S., & Nisbett, R. (1980). Justification and the psychology of human reasoning. Philosophy of Science, 47, 188–202.
  • Suppes, P. (1966). A Bayesian approach to the paradoxes of confirmation. In J. Hintikka & P. Suppes (Eds.), Aspects of inductive logic. Amsterdam: North Holland.
  • Swatez, G. M. (1970). The social organization of a university laboratory. Minerva, 8, 36–58.
  • Thagard, P. (1988). Computational philosophy of science. Cambridge, MA: MIT Press.
  • Thagard, P. (1992). Conceptual revolutions. Princeton, NJ: Princeton University Press (It. translation: Rivoluzioni concettuali. Milano: Guerini e Associati, 1994).
  • Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124–1131.
  • Tweney, R. D. (1991). Faraday’s notebooks: The active organization of creative science. Physics Education, 26, 301–306.
  • Viale, R. (1991). Metodo e società nella scienza. Milano: Franco Angeli.
  • Viale, R. (1997a). Causality: Epistemological questions and cognitive answers. In G. Costa, G. Calucci, & M. Giorgi (Eds.), Conceptual tools for understanding nature. Singapore: World Scientific.
  • Viale, R. (1999a). Causal cognition and causal realism. International Studies in the Philosophy of Science, 13(2), 151–167.
  • Viale, R., Rumiati, R., Legrenzi, P., & Bonini, N. (1993). Il ruolo dell’expertise scientifica nelle strategie di falsificazione e verifica. Quaderni di Sociologia, XXXVII(5), 137–147.
  • Whitley, R. (1984). The social and intellectual organization of the sciences. Oxford: Oxford University Press.
  • Worrall, J. (1976). Thomas Young and the “refutation” of Newtonian optics: A case study in the interaction of philosophy of science and history of science. In C. Howson (Ed.), Method and appraisal in the physical sciences. Cambridge: Cambridge University Press.


Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Viale, R. (2013). Cognitive Theory of Scientific Rationality or of Scientific Reasoning? In: Methodological Cognitivism. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-40216-6_5
