ALTERNATE ROUTES
A Critical Review

Editorial Board: Jim Conley, Joan St. Laurent, Eileen Saunders

Alternate Routes is an annual publication of graduate students in Sociology and Anthropology. Manuscripts, subscriptions and communications should be addressed to: Alternate Routes, Department of Sociology and Anthropology, Carleton University, Ottawa, Ontario, Canada K1S 5B6. Phone: 231-6634.

The editorial board wishes to establish a genuine forum for the exchange of ideas. We welcome critical responses, commentaries or rejoinders, which we will endeavour to publish in subsequent issues.

The editors gratefully acknowledge the financial assistance of the Department of Sociology and Anthropology, Carleton University, and its Chairperson, Dennis Forcese. We also wish to thank Bev Cruikshank for her administrative assistance.

Subscriptions: Individuals and institutions: $2.75. Make cheques payable to Alternate Routes.

Copyright 1979, Alternate Routes, unless otherwise noted.

Alternate Routes: A Critical Review, Vol. 3, 1979

Contents

Phil Heiple / University of California, Santa Barbara
The Politics of Probability

Gordon Haas / Carleton University
Claus Offe and the Capitalist State: A Critique

Bonnie Ward / Queen's University
The Myth of Autonomy in Family Farm Production

Sut Jhally / University of Victoria
Marxism and Underdevelopment: The Modes of Production Debate

Janice Belkaoui / Rosary College
A Critical Assessment of Media Studies

Notes to Contributors

EDITORIAL STATEMENT

Alternate Routes is a critical review of sociology and related disciplines. We strive to publish the critical work of graduate students which will inform, and be of interest to, students and teachers of social science. In our view, sociologists must be critical, both of their own society and of the work of other social analysts. We seek, therefore, to publish work which challenges existing sociological and societal orthodoxies. To achieve this goal we require manuscripts. We encourage graduate students among our readers to submit essays, reviews, commentaries and rejoinders on a wide variety of subjects. We particularly welcome work which treats some aspect of Canadian society within the wider concerns of critical social science.
ISBN 0-7709-0061-5

JOHN PORTER 1921 - 1979

The editors of Alternate Routes wish to acknowledge the contributions of the late John Porter to Carleton University, to Canadian sociology, and to the academic community at large. His seminal work, The Vertical Mosaic (1965), stands as a benchmark in the development of sociological analyses of Canadian society, serving to encourage the continuing investigation of the nature of social inequality. While best known for this book, Dr. Porter was also the author of numerous other works, including Canadian Social Structure: A Statistical Profile, and Does Money Matter? (co-authored with Marion Porter and Bernard Blishen). John Porter will be keenly missed, as a scholar, as a member of the university community, and above all, for graduate students at Carleton, as a teacher committed to his students and to their training as critical, skilled sociologists.

THE POLITICS OF PROBABILITY

Phil Heiple

"Statistics" was first used in the middle of the sixteenth century and related to matters concerning the state. Statistics continued in this sense for the next two hundred years. These matters of state concern became more and more numerical, and summary methods were used. These provided a framework for social policy and also played a rhetorical function of legitimation due to popular attitudes toward exact figures (Kendall, 1972:196; Clark, 1937:122-124).

By the mid-seventeenth century, mathematicians, physicists, astronomers and other scientists adopted and advanced the summary methods of statistics. These advances were aided by the developing systems of rational accounting in use in business and by the philosophies of natural science developed by Galileo, Bacon, Descartes, and Newton (Clark, 1937:79, 133-137; Kendall, 1972:197). Quantification was first being attempted as a form of social thought at this time. The general background was the rational spirit of rising capitalism and the increasing size of different countries, which necessitated a more impersonal and abstract basis for public administration. Specific attempts were related to concrete problems of refining the numerical bases of the new insurance systems and to the mercantilists' belief that population size was a crucial factor in the wealth and power of the state (Douglas, 1971a:50; Lazarsfeld, 1961:279).

New economic conditions forced the new importance of record-keeping for political assessment. Changing modes of production led to prolonged depression and massive unemployment throughout Europe. Vagrancy laws and institutionalized houses of confinement were among the bureaucratic responses to these conditions (cf. Chambliss, 1964; Foucault, 1965). These required systematic records as well as agencies charged with the execution of these tasks. This was the beginning of the bureaucracies of official morality. The growing size and complexity of Western societies created a need for some form of accountable information which could be legally and morally sanctioned as the basis for policy judgments (Douglas, 1971b:51-52).

The symbiosis of quantitative social thought with the analytical and calculating form of thought of the bourgeoisie had advanced by the eighteenth and nineteenth centuries to the point where social statisticians and official statistics helped promote a formal standardization of morality as justice and helped civil service and state interventionist power to grow and to become increasingly remote from the qualitative relations of a socially produced and understood world.
This is compatible with what Weber depicted as the rationalizing character of bureaucratic thought, which Lukacs later depicted as fundamental to the process of capitalist reification. Parallel with the late-nineteenth century transition from entrepreneurial capitalism to corporate capitalism and the administrative welfare state of the New Deal, sociology in the United States abandoned early descriptive participant-observer methods (e.g. the Chicago School) in favor of survey methods and the analysis of official statistics. By the 1930's, sociological methods, as represented by the increasing number of methods texts, came to be synonymous with quantitative analysis (Douglas, 1971a:55). Articles in social science journals from 1895 to the present reveal an increasing use of quantification and statistical manipulation. Snizek (1975) found that these methods tended to produce "a realist view of social reality, often associated with the fallacy of reification, ... one that focuses on group properties in hopes of discovering the structural laws that govern behavior" (1975:416).

Statistical reasoning's claim to validity in social analysis is that the calculus of probability adequately describes the relative likelihood of events occurring in the social world. Or, as Blalock (1960:509) negatively stated it: If probabilities are unknown, it will be impossible to make legitimate use of statistical inference. I am interpreting this to mean that probability theory is the philosophical (i.e., metaphysical) link between statistical measurement and the world of observables. This dichotomy is interesting. Blalock (1960:19) himself evokes it in an anticipatory aside where he says, "This is a question of fact which is irrelevant to the question of whether or not there is a legitimate unit of measurement." To me, the irrelevant is relevant.

My basic thesis is that methods and politics are inseparable because methods for social analysis always contain certain presuppositions about the nature of the social world. Insofar as these presuppositions express or imply a concept or evaluation of social order, or an excuse or means to evaluate social order, they are metaphysical. These metaphysics are a metaphysics of normality. They delimit the scope of normal social relations and are therefore political in their implications for social life.

Antonio Gramsci's criticism of the law of large numbers illustrates this approach. In brief, the law of large numbers states that the larger the number of observations sampled, the greater the likelihood (probability) that their average will approximate the average of the population from which they were drawn. Several times in the Prison Notebooks Gramsci (1971:401, 412) mentions the usefulness of this concept for analysing the quantitative expressions of social phenomena. This is acceptable because it does not pretend to avoid selecting for specific characteristics of the sample. When dealing with human subjects, however, Gramsci (1971:428-429) finds the law of large numbers (and the concept of statistical law generally) to be deeply flawed:

But the fact has not been properly emphasized that statistical laws can be employed in the science and art of politics only so long as the great masses of the population remain (or at least are reputed to remain) essentially passive, in relation to the questions which interest historians and politicians.
Furthermore, the extension of statistics to the science and art of politics can have very serious consequences to the extent that it is adopted for working out future perspectives and programmes of action ... Indeed in politics the assumption of the law of statistics as an essential law operating of necessity is not only a scientific error, but becomes a practical error in action ... It should be observed that political action tends precisely to rouse the masses from passivity, in other words to destroy the law of large numbers.

Gramsci is arguing that the law of large numbers contains an important presupposition about its unit of analysis. The presupposition is that the unit of analysis is a passive object. Gramsci points out that the unit of analysis cannot be an active subject. It cannot be in a state of becoming, i.e., it cannot be in a state that is undergoing any kind of qualitative change. This is the metaphysic of normality for the law of large numbers. It delimits its units to static relations. Even if stochastic measures are used, only quantitative changes are possible. In short, it systematically ignores the possibility of a revolutionary subject.

Gramsci did not attempt a study of probability theory to see if metaphysics of normality were only accidentally and occasionally present, or generally so. In the next several pages I will survey probability theory with a search for metaphysics in mind. First, let me delineate which issues in probability theory I will be addressing. Three perspectives dominate modern probability theory: the "objective" or "frequentist" position, the "subjective" or "personalist" approach, and the "logical" theory of probability. Putting aside the logical theory for a moment, the distinction between the objective and subjective theories can be posed by their different accounts of the relationship between the concept of probability and the nature of probable knowledge.

According to the objectivist position, probability is an objective characteristic of a multi-leveled physical reality. Probable knowledge is knowledge of one of those levels and is incomplete because of the incompleteness of our information. Strict determinacy is assumed in the physical reality. Statistical methods are used to bridge the gap between insufficient information and the strict determinacy of objective reality. The objectivist position is the predominant form of probability theory used in the social sciences.

The less widely held subjectivist position holds that our knowledge of physical reality is inevitably limited in principle and that our knowledge is therefore only probably true. This is the interpretation of probability predominant in orthodox quantum physics: the quantum level is the level of inseparability between the knower and the object known, where the knower is nonetheless compelled to speak of the object as if it were not affected by the knower, which results in statements of only probable validity. Statistical methods are employed to express the degree of certitude the knower attaches to these statements (Suppes, 1969:238-242). Quantum theory is, in part, a theory of the indeterminateness of knowledge. No such theory exists in the social sciences, although, to me, its adoption there is long overdue.

The logical theory of probability is not in use in the social sciences. Unsuccessful attempts have been made to formulate social probability along the lines of the logical theory.
Basically, the logical theory of probability directly addresses the problem of metaphysical content. It tries to avoid metaphysics with a strictly inductive, non-demonstrative mathematical logic of self-evident maxims. In this it is hoped that all subjective judgments will be precluded.

My basic thesis on probability is that it is in essence metaphysical and that all attempts to remove the metaphysical content are doomed to fail. Further, I will argue that it is precisely because of the metaphysical content that probability theory has been useful to quantitative social scientists. This usefulness is in large part a function of the degree to which the metaphysics of probability are compatible with the hegemonic ideologies of social science. I will try to illustrate this thesis with a few brief looks at the origins and development of the modern theory of probability.

Byrne (1968:292-293) provides the following thumbnail sketch of the development of the calculus of probability:

In the course of time, Cardano and then Pascal and Fermat came to recognize that gambler's rules already in existence might provide a more effective instrument with which to deal with the contingent. These gambler's rules they and then others developed and systematized. That this more or less systematic instrument of the non-systematic came to be known as a calculus is due not only to its character as a mathematical instrument but to imitation and adulation of the great new instrument of the systematic, the calculus of Leibnitz and Newton ...

... the new instrument thus inaugurated was eventually systematized by Laplace according to standards of his day. But it is important to bear in mind that what is now a demonstrative system in its own right began as an instrument to deal with the non-systematic on the basis of a new theory about how to express the non-systematic: not disjunctively but in terms of a continuum of values between what happens always and what never happens.

The notion of the "non-systematic" needs clarification. Byrne (1968:285) writes:

Notion is here taken in a general sense broader than that of concept and is meant to imply, without further precision, awareness of or consciousness of. Non-systematic is also taken in a broad sense and is meant to imply non-necessity, or non-certain, or non-demonstrated, or even non-scientific in the Thomist sense, which is not unrelated to the modern 'indeterminate'. Being negative, non-systematic is meant to imply also 'with respect to a given system.'

Thereby what one calls non-systematic will depend on one's conception of a system. For example, if Newtonian mechanics is taken to be the system, then the non-systematic will be all the relevant phenomena not explained by Newtonian mechanics. This was, in fact, the original relation of the non-systematic: the first probabilists were trying to develop an instrument to handle specifically what was non-systematic with respect to the Newtonian system of celestial mechanics. Whereas Newtonian mechanics replaced medieval cosmology, the founders of probability theory sought to replace medieval disputational means of discerning the probability of judgments.
The medieval concepts of probability as (1) probabilis: an argumentatively supported proposition, and (2) contingens: events that occur either ut in pluribus or ut in paucioribus, were carried over into the new probability theory as (1) the logical interpretation: probability as degree of confirmation of a proposition, and (2) the mathematical interpretation: probability as relative frequency of a class of variables (Byrne, 1968:302). Now let us take a look at the leading models and see what manifestations these concepts have once fully articulated.

The logical theory of probability, as I mentioned earlier, is not in use in the social sciences, but it does indicate the culmination of a particular line of probabilistic thinking and is noteworthy in this respect. Following a theme initiated by Leibniz, attempts have been made to develop a general logic combining formal logic and the calculus of probability. John Venn, C.S. Peirce, Richard von Mises, Hans Reichenbach, and J.M. Keynes all made contributions toward this project, and the work of Rudolf Carnap represents its most successful formulation (Byrne, 1968:20-21; Nagel, 1939:42-43). Carnap distinguished between probability as (1) degree of confirmation, and (2) relative frequency over time. Taking the former as his problematic, he says, "A definition of an explicandum for probability must not refer to any person and his beliefs but only to the two sentences and their logical properties within a given language system" (Carnap, 1950:43; quoted in Byrne, 1968:21). Hence, a deliberate attempt is made to control opinionative (metaphysical) content.

Carnap's approach to probability is part of his project to develop a wholly non-demonstrative inductive logic. The basic problem lies in consistently assigning numerical values to the degrees of confirmation for opinionative judgments. Carnap believes that quantification is the only guarantor of this needed consistency (1950:220-226). As a general logic, all this could be constructed on the basis of elementary set theory, which Carnap proceeds to do. However, the effectiveness of the construction is problematical. It is already disputable whether opinionative judgments about contingent events are in any sense quantitative and hence mathematically formalizable. Carnap suggests that this issue is merely technical: if the logic can be constructed, then it will be effective (1950:242). Yet, as Byrne (1968:22) points out, that a formal logic can ever be capable of such a task is seriously challenged by two important theorems of meta-logic: Craig's and Gödel's.

Craig's theorem is for replacing a formal linguistic system of theoretical terms with another with the same empirical content but no theoretical terms. Nagel (1961:135-137) shows that this method of replacement becomes unworkable when the subject axioms (in this case opinionative judgments) are very numerous, and that in order to specify the replacement axioms, the set of true statements about the subject axioms must be closed (i.e., known in advance). Byrne (1968:22-23) adds that opinionative judgments are at least numerous and in principle unlimited, and that the very notion of opinionative judgments entails openness for the set of true statements about them. Gödel's theorem shows that formal axiomatic systems such as Carnap's are necessarily incomplete in that proof of internal consistency (in this case, effectiveness) cannot be made within that system.
Instead, such proofs must come from without — a requirement that puts Carnap's entire project of a non-demonstrative inductive logic into serious question (Nagel and Newman, 1958:96-97). In sum, although the issue is not yet closed, there appear to be insurmountable obstacles confronting a non-metaphysical logic of probability at levels both material (Craig's theorem) and formal (Gödel's theorem).

The mathematical theory of probability is like the logical theory of probability in that it can be constructed out of elementary set theory, but it would be subject to the critique of Gödel's theorem if metaphysical assumptions were not acknowledged. As with Carnap's logical theory, the question being posed of the mathematical theory is not the adequacy of the formal system, but rather the interpretation of that system. While there are numerous ways that a mathematical theory of probability could be built upon the arithmetic of proper fractions (Nagel, 1939:40-41), the linkages between the formal system and physical events would have to be drawn extra-mathematically. As Venn (1962:87; quoted in Byrne, 1968:31) observes:

When Probability is ... divorced from direct reference to objects, as it substantially is by not being founded upon experience, it simply resolves itself into the common algebraical or arithmetical doctrine of Permutations and Combinations.

Emil Borel and George Polya suggest that the link is to be established by the practical certitude of the user. Borel is distrustful of the opinion of a single individual, because it is too likely to be subjective. Objective certitude arises through the common agreement of many prudent individuals. Borel (quoted in Byrne, 1968:18) states:

The only reason why we regard as certain some well demonstrated mathematical facts ... is that the demonstrations have been reconsidered and verified by a large number of persons.

It is fair to infer that these persons are assumed to be mathematicians, thus creating an elite consensus theory of truth (to which I will return in my examination of Polanyi's views). It is also fair to make a strict logical rejection of this view on the grounds that it is tautological: Borel seeks to justify practical applications of the mathematical theory of probability, which would include the law of large numbers, through reference to a practical situation in which the law of large numbers is apparently assumed to be valid a priori.

Polya avoids the errors of Carnap and Borel by admitting that a formal system is neither self-justifying nor connected to an observable reality in a non-mediated way. According to Polya, we can, nonetheless, approach a formal system as if it were integrally connected to observable reality, insofar as it is plausible to do so. Polya (1954a:198-199) outlines the grounds of plausibility:

... the credence that we place in a conjecture is bound to depend on our whole background, on the scientific atmosphere of our time ... In dealing with the observable reality, we can never arrive at any demonstrative truth, we have always to rely on some plausible ground.

I think Polya points in the right direction. Why does probability theory have credibility as a deductive system? It has credibility only because it conforms to our background expectations about what a deductive system is like. Why is there such confidence in the applicability of probability theory to social phenomena?
Because our background assumptions about the nature of social phenomena make such application entirely plausible (though non-demonstrable). In this, Polya is making tacit reference to what I have been calling a "metaphysics of normality." The only thing he has not done is put this into its historical and political context. I hope that the first part of this essay succeeded in establishing some of this context: (1) probability theory was adopted as a technique for use within the general method of quantitative social thought, wherein the conception of social phenomena employed necessitated the assumptions that social reality had a pre-categorical facticity, that human behavior was quantifiable, and that these quantities had corresponding elements in number theory; and (2) this development was instrumental (a) to the rising bourgeoisie's interest in the rational planning of society through bureaucracies of official morality, and (b) to the ability to legitimate bourgeois rule through rhetorical reference to hard, numerical, quantitative science. In sum, probability theory is plausibly applied to social phenomena because we have historically arrived at reified conceptions of what social phenomena are like and how to study them.

Of all the probabilists surveyed thus far, some have acknowledged the presence of metaphysics in their conceptions of science and others have asserted its absence. Of those who recognize the incompleteness or impossibility of logical and/or mathematical formulations of social probability without metaphysical presuppositions, none have tried to specify the use of metaphysics in this enterprise. Michael Polanyi is an exception.

Polanyi provides a strong but ultimately circular critique of the myth of scientific objectivity. Against the view of the relationship between science and its object as being impersonal and dispassionate, Polanyi (1964) presents a wealth of evidence from the history of science to show that in all forms of knowing there is a tacit dimension which is ultimately more important than any established scientific method, evidence, or explanation. This dimension is the "personal participation of the knower in all acts of understanding" (1964:xiii). This participation is the effect of the opinions, prejudices, and preconceptions (i.e., metaphysics) of the knower upon the act of knowing. For the scientist, these would include "personal obligations to universal standards" (1964:17). With regard to probability, a probability statement is an incomplete and personal commitment according to one's framework of personal judgment (1964:29). Probability statements are thereby partially formalized within certain maxims understood to be no more than "rules of the art" (thus utilizing the strengths of an internally-consistent system without the weaknesses of extra-systemic truth claims) (1964:30-31).

Polanyi recognizes that the scientist acts in dialogue with other scientists and with the traditions and norms of scientific practice, i.e., the social nature of science and, hence, the social character of personal commitment. This means that a scientist's personal commitment to "universal standards" is influenced by factors outside the scientist. Polanyi (1964:375) considers the influence on scientists to be "superior knowledge," which he defines as "beside the systems of science and other factual truths, all that is coherently believed to be right and excellent by men within their culture."
This "network of confidence" enables science to survive as a "coherent system of superior knowledge, upheld by people mutually recognizing each other as scientists, and acknowledged by modern society as its guide." Full of post-Enlightenment optimism, Polanyi becomes a propagandist for a hegemonic elite technocracy. Well aware of the horrors of Stalinist and Nazi scientism (1964:224-245), Polanyi nonetheless opts for an elite consesus theory of truth. Despite his sympathy for Galileo, Mesmer, D.C.Miller, and other victim s of elite hegemony, nowhere does Polanyi question the ability of the present scientific community to make the right decisions at the right times. He improves probability theory by including metaphysics in the creative act of probability assertion, but capitulates to bourgeois hegemony by not institutionalizing any means to critically reflect upon the metaphysics received. Polya (1948: 208-209) recognizes this as a procedural necessity: "No idea is really bad, unless we are uncritical ... Don't let your suspicion, or guess, or conjecture grow without examination till it becomes ineradicable. At any rate, in theoretical matters, the best of ideas is hurt by uncritical acceptance and thrives on critical examination." In contradistinction, Polanyi is advocating what I warn against : technocratic planning by scientists uncritical of their own metaphysical presuppositions. Some of the general problems of probability theory have been demonstrated. More difficulties arise when it is used in social science. This can be demonstrated by examining the ways it is used there. One way probability theory enters the methods of social research is 12 as an aid to sampling statistics. There it is used to ensure the selection of a representative sample. The problem it solves is the problem created by purposive sampling — maybe we can control for all extraneous variables we can anticipate, but what about extraneous variables ve have not anticipated? Ford (1975:271) describes Fisher's solution through probability sampling: His idea was to go about the whole business of casting for samples the other way around. If you can select imaginary variables to render them systematic, then, he reasoned, perhaps you can unselect the unimaginary ones to render them unsystematic. If you can be reasonably sure that their variation is indeed unsystematic from the point of view of your theory, then you can assume that any biases in your sample are also unsystematic, so these biases can be regarded as irrelevant from all conceivable points of view and thus can be ignored. The key to this is randomization. Through randomization it is hoped that there will be no relevant differences between the sample and the universe from which it is drawn, and therefore any differences present are irrelevant. Once you have a sample whose members were as equally likely to have been drawn as all those in the universe from which they came,, the calculus of probability may be employed to calculate the probability that your sample is biased in any relevant respect (Ford, 1968:273). This is all well and good as long as one is dealing with experimental data hut, as Selvin (1957) has pointed out, when non-experimental data are used certain problems arise. Since the data are non-experimental, the researcher has no grounds for believing that there are no correlated biases greater than the random errors in the sample. These biases may have even an undetectable supressor effect. 
Such correlated biases rule out the calculus of probability (in this case, tests of significance). According to Selvin (1957:522), "... only when all important correlated biases have been controlled is it legitimate to measure the possible influence of random errors by statistical tests of significance." Of course, in order to control "all important correlated biases" it is necessary to know what they are. This sounds to me like purposive sampling. If Selvin is right, then probability sampling is randomly-collected purposive sampling with the assumption that there are no correlated biases greater than the random errors (a metaphysic of normality).

While finding value in Selvin, Ford takes a different route to the same conclusion. She looks at the formal mathematics involved, specifically the requirement that "Whenever the separate probability values of all mutually exclusive units are added together at the same time, the result must add up to 1" (1975:290). That this is impossible to link with events has been recognized in some statistical circles. In 1948 G.A. Barnard criticized H. Jeffreys' social statistics on this point (it is equally applicable to the school of R.A. Fisher):

The snag in Professor Jeffreys' theory is that to work it one has to specify a probability distribution for a class of alternative hypotheses and the whole of the probability has to be distributed. One must when interpreting one's experiments be able to think of all possible explanations of the data, and that, I think, none of us believe that we can do. It is always possible for someone to produce later an entirely new explanation we had never thought of, and which would not be represented in the hypothesis nor in the alternatives we had tested. (quoted in Hogben, 1957:25).

Ford (1975:291) sees this as a severe restriction of the applicability of probability statistics in general:

For, unless the cases under consideration at any particular level of analysis may be properly regarded as derivable from a precisely defined and logically exhaustive set of non-overlapping (i.e., mutually exclusive) units, then none of the impressive methods and techniques of probability theory will be applicable.

The case against probability sampling is especially strong, for instead of solving the problems of purposive sampling, the formal requirements of the calculus of probability mean that they can only be employed "within a clearly defined set of units, that is, within a PURPOSIVE SAMPLE" (Ford, 1975:291, her emphasis).

An illustration of latent political bias is provided by an extension of Neuberg's (1977:4-5, 20) critique of objectivist probability on the grounds of the unique character of a social event. He argues that social-level events (such as France becoming a monarchy) occur in a manner which cannot be recreated by replicating the circumstances. This is equally true for individual elements of a random sample. Neuberg (1977:5) argues:

The usual situation in a social-statistical study is a population model ... The a priori assumption is that each element of the sample has been drawn at random, under similar circumstances, from the same population. If this assumption is not fulfilled the meaningfulness of the resulting explanation is open to question. One area, for example, where the assumption appears systematically doubtable is in econometric time-series analysis. Here the sample points are, originally, possibly distant from each other in social-level time.
It is difficult to conceive a sense in which the elements of such a sample have been "drawn at random, under similar circumstances, from the same population". Hence the uniqueness of social-level events is the basis of an explanation of the explanatory unreliability of econometric time-series models over longer periods of time. Neuberg (1977:19-20) briefly notes that this may be because of the "special quality of social-level (= historical) time" and cites Lukacs' (1971:89-90) argument that reified thought "degrades time to the dimension of space." In this way "time sheds its qualitative, variable, flowing nature; it freezes into an exactly delimited, quantifiable continuum filled with quantifiable 'things' ... in short, it becomes space."

I would extend this argument somewhat to underscore the a priori assumption that social time is quantitative (Newtonian). Gurvitch (1964:27-38) argues that social time is a conceptual derivative of a world-view and varies considerably within world-views according to need. Husserl (1964:29, 77) argues that social time is wholly constituted by the experiential subject and that phenomenal time (e.g. standard time) has nothing more than an ascriptive status. Both of these positions seriously challenge the assumption that time frames between unique events are comparable in any quantitative sense. This assumption has three additional features — the metaphysics of normality — which make the theorem politically conservative:

(1) The social subject cannot be an active participant in social time, either physically (e.g., intervening in such a way as to cause changes within the time frame) or cognitively (reflecting upon, interpreting, or reconstituting the experience of social time);

(2) No part of the time frame can be anything other than a quantitative development of earlier parts (e.g., quantum leaps in time are precluded, as well as any concept of determinism other than a linear causal model; the possibilities/probabilities for the future are wholly limited to quantitative developments of the present);

(3) If social time is viewed as Newtonian space, then motive forces to change states would have to be extrinsic to the subjects at hand. This makes it possible for the subjects to be wholly quantifiable, i.e., for all behavior to be subsumed by and understood through number theory.

To assume that social behavior can be subsumed by number theory is to believe that the limitations of numerical formulations do not apply to social analysis or social behavior. This metaphysics of normality is also the logical error of affirming the consequent, because the second premise affirms the consequent of the hypothetical first premise. Probability theory is only for the testing of judgments. It is when probability is seen as referring to material occurrence, i.e., events, that the confusion of causality and probability comes about. While I have no sympathy for any causal models in social research, probability in no way fits into a causal model. The concept of a probabilistic law is only possible when the unknown, a feature of judgments, is viewed as a material force in social events. This is one of the most common errors of bourgeois thought: to equate one's way of thinking about the world with the way things happen in the world (mistaking epistemology for ontology).
In this way probability theory could be viewed as a source for social theory inductively drawn from frequency distributions (recall Snizek's (1975) finding that mode of analysis determines theoretical perspective). Logical grounds for rejecting inductively obtained explanations are given by Hempel, who finds probabilistic laws to have nothing other than analogic credibility (1966:67). I think the most explicit dismissal of any connection between causal laws and probabilistic reasoning is provided by Polya (1954b:100), who imagines a doctor trying to comfort a patient with the remark, "You have a very serious disease. Of ten persons who get this disease, only one survives. But do not worry. It is lucky you came to me, for I have recently had nine patients with this disease and they all died of it."

My final remark on probability theory as a method in social science deals with several small points made previously about the conflation of events with judgments and about the closed and unambiguous set of transformations available to probability. It has been suggested that graduate study in the social sciences is chiefly a process of indoctrination which occurs surreptitiously through the acquisition of a new language (Pozzuto, 1975:20-21, 166-171). The notion that social probability is a language unto itself is plausible. In Wittgenstein's view, language is a socially shared and practiced activity through which we learn and know what things are: "We predicate of the thing what lies in the method of representing it" (Wittgenstein, 1953:46). That the truth claims of a language cannot be made within that language, but rather must be established through a richer language, is demonstrated by Tarski (1944:341-376). What this means for the problem at hand is that if it is true that social probability is a language or like a language, then it constitutes a picture of the world whose validity cannot be established within its own logic. Instead, validation must come from without, and I think I know where.

Consider the plausibility of the following experiment. Go back in mnemonic time to your first indoctrination session into the secret meta-language of statistical probabilistic social analysis. Remember all the examples about coins, dice, cards, and roulette wheels which made the case for the applicability of probability to social analysis so strong. Those examples mystified us. The problem is that social-level events simply do not occur within the same conditions necessary to say anything meaningful about the coins, dice, cards, and roulette wheels. In the case of the latter, it is not by chance that one knows the total distribution of probabilities, i.e., that a coin has two faces, that dice have six sides, and so on. In fact, the key concept of the latter is that everyone has full and complete knowledge of the total set of possibilities. It is in terms of this knowledge that meaningful frequencies can be calculated, because everything relevant is known about them.

The basic law of social statistics is that you must have a representative sample. Powerful and effective randomizing techniques are available to ensure this. But in ensuring representation, they also ensure that the parameters of the sample are unknown. Nothing is known about the sample except that it is unknown and that anything discovered about it will be new data, i.e., everything relevant is unknown about it.
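The contrast can be put in miniature. The sketch below is an invented illustration and not part of the original argument: the die stands for the case in which the complete set of possibilities is given in advance, while the "hidden" population stands in for a social universe whose composition the analyst can only estimate from the sample itself.

    # Invented contrast between a fully known outcome space and a sampled unknown.
    import random
    from fractions import Fraction

    # Known system: a fair die. Every possible outcome is given beforehand,
    # so a probability can be computed by enumeration, without any observation.
    die_faces = [1, 2, 3, 4, 5, 6]
    p_greater_than_four = Fraction(sum(1 for f in die_faces if f > 4), len(die_faces))
    print("P(roll > 4), known in advance:", p_greater_than_four)   # 1/3

    # Unknown system: a population whose composition is hidden from the analyst.
    random.seed(3)
    hidden_population = [random.random() < 0.42 for _ in range(10_000)]

    sample = random.sample(hidden_population, 200)
    estimate = sum(sample) / len(sample)
    print("proportion estimated from the sample alone:", estimate)
    # Nothing licenses this figure except the sample itself; the total set of
    # possibilities was never available to enumerate beforehand.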
This difference between dice, etc., and random samples is irreducible and prevents transferability from one system to the next. Nonetheless, inductive leaps are frequently made between the two levels because of commonsensical habits of mind. Gramsci (1971:419) said that common sense is "the conception of the world which is uncritically absorbed by the various social and cultural environments in which the moral individuality of the average (person) is developed." The inductive leap is facilely made because of the pre-reflective habit of mind to think quantitatively and probabilistically about social events as a way of using human foresight in planning one's day. Life under capitalism predisposes people to pervasive habits of this sort in that every part of one's day can be fractured into discrete units, the manipulation of which is rarely reflectively regarded. The dominant ideas in social reasoning (commonsensical or quantitative-probabilistic) conform to the ideology of the ruling class.

In sum, probability theory is a logic, a language, a set theory, or a way of commonsensical reasoning, the connection of which to social events can only be inductively (metaphysically) established. Rather than making the link to the social world for statistical methods, probability theory is a stalking horse with which we can be ideological and confirm our commonsensical presuppositions about the social world while appearing to be scientific and anti-metaphysical. (Perhaps the discipline selects for people who are predisposed to this sort of reasoning; just the fact that all these people at some time in their lives decided to go to graduate school indicates they are all gamblers to a certain extent.)

Finally, I wish to conclude by returning to a point raised previously: the connection between quantitative social thought and the "analytical and calculating form of thought of the bourgeoisie." Douglas (1971b:58-59) believes that there are two major points of confluence:

First, (bureaucratic rationality) contributed the view of men and their actions as absolute categories (or absolute typifications). That is, rather than see men and their actions as the continuous, situation-bound, concrete persons we normally assume for our purposes of everyday interaction, it saw them as discrete, discontinuous phenomena that are independent of time and situations. It is this set of properties which is necessary before one can validly apply real numbers and mathematical analyses to human beings and their actions; it is this assumption that generates the pigeon-hole perspective on man known to all students of introductory methods in the social sciences ...

Second, this calculative attitude was fundamental to the development of the rational policy orientation of officials and rulers that made official information the means of 'testing' and 'proving' the effectiveness of official policies ... the calculation of the effects of official action relative to the policy-determined practices (i.e., effectiveness) was fundamental to the development of all official information.

Quantitative methods have a legitimating function. Despite the undermining of the rhetorical claim to absolute rationality by twentieth century physical scientists themselves, mathematical formulations still evoke an ideology of absolute rationality in social science and the public sphere (cf. Mumford, 1967).
In this way, the official bureaucracies of social planning are invulnerable to any criticism that does not transcend quantitative reasoning, i.e., that does not reject the rhetorical appeal of mathematical formulations as absolute rationality.

Weber (1947:184-185) describes formal rationality as resting upon quantification and calculability. He views it as the mode of reasoning for modern science and industrial capitalism, especially in planning. Without it, a capitalist economy could not be rationally (efficiently) administered (1958:26-27). In addition to being indispensable, it is absolutely unavoidable as the fate of the West and will pervade every aspect of social life, from the administration of state bureaucracies to everyday life (1946:228-229). His scenario closes with a whole society completely rationalized by capitalism and experiencing "the absolute and complete dependence of its existence, of the political, technical and economic conditions of its life on a specially trained organization of officials" (1958:16).

These ideas are very useful, but incomplete. Lukacs (1971:99) expands upon Weber (1946:228) to show that the members of the bureaucracy are themselves subject to rationalization and become dehumanized and mechanized as their service becomes a commodity form. Schroyer (1973:184) adds that what Weber calls formal rationalization is also the rising organic composition of capital, or in other words, increasing value production or growth of the capacity to create value. This is at the same time the growth of the capacity to extract surplus, i.e. material exploitation. Developing themes inaugurated by Kant, Marx, Weber and Lukacs are the recent observations of Horkheimer (1947:8-9), Mills (1959:165-176), and Kosik (1976:56-60) on the spread of irrationality simultaneous with and caused by advancing rationalization. Rather than making individual everyday life more understandable and easier to control, rationalization makes social reality more opaque and less subject to control — irrational in these senses.

Finally, buried within Schutz's social phenomenology is a theory of the invisibility of the expert. As experts become both more specialized and more important in the successful concentration of power in society (monopolization), their role as constructors of world views and as corroborators of taken-for-granted social knowledge becomes "nearly completely invisible" (Schutz and Luckmann, 1973:315) and "entirely hidden in its anonymity" (Schutz, 1964:133). I think this is an important addition to Marx's concept of reification as the domination of living human potentiality by dead, objectified labor (Marx, 1969:17-18).

What these additions do to Weber's formulation is to turn it into a crisis theory. If this is a meaningful description of the tendencies associated with my subject, then the fundamental question to ask is what are the implications for our practical activity — as people trying to understand social reality (perhaps as critical social scientists), and as people who have everyday lives in that social reality? There are partial answers to the first question.
Investigations need to be conducted into the available methodologies of social research to see (1) if they are in fact able to do what they are conceived to do, (2) if there are unacknowledged limitations on the picture of social reality they create, and (3) if these limitations have political consequences, particularly if bureaucracies of official planning are empowered to make social policy on the basis of their picture of social reality. For example, if the view of social reality is a reified one, then their policy will presuppose reification, and the "guns and butter" they impose will correspond best to the reified needs of society and, hence, be materially constitutive of further societal reification (which is then studied and — lo and behold — confirms the presupposed reified picture of social reality).

My criticisms of quantitative reasoning, statistics, and social probability can be summed up as a critique of reification. The details of the criticism can be included in Lukacs' (1971:104, passim) more general critique of reification in science for (1) losing contact with the totality by becoming a "formally closed system of partial laws", (2) ignoring the world-manufacturing effect of its work ("ontological problems of its own sphere of influence"), and (3) losing history through a freezing of the given to produce apodictically certain facts.

Although it is necessary to develop alternative methodologies, the answer is not a categorical rejection of quantitative methods. Some of the techniques, like ordinal variables, probability sampling, probabilistic laws, and certain features of analysis, inference, and significance, must be cast on the junkheap forthwith. Other techniques are not inappropriate in that they can be useful as long as one does not pretend that they are capable of dealing with anything other than appearances. What this means is that reified substitutes for real knowledge (quantified observables) are important and necessary means of apprehending and describing indicators of phenomena which as yet cannot be accounted for in any superior manner, as long as the reified status of the observables is acknowledged. This may seem like an overly simplistic solution, but in practice it means an active struggle with the makers of social policy over the legitimacy of their research methods. This implies a struggle for social policy. This struggle (if the efforts of working peoples, the Third World, and all others disenfranchised from the decision-making that affects them are unsuccessful) is a struggle for the future. If unsuccessful, there may be no basis for social policy other than the truncated visions of the technocrat.

Department of Sociology, University of California, Santa Barbara.

BIBLIOGRAPHY

Blalock, Herbert M. 1960. Social Statistics. New York: McGraw-Hill.

Byrne, Edmund F. 1968. Probability and Opinion: A Study in the Medieval Presuppositions of Post-Medieval Theories of Probability. The Hague: Martinus Nijhoff.

Carnap, Rudolf. 1950. Logical Foundations of Probability. Chicago: University of Chicago Press.

Chambliss, William. 1964. "A Sociological Analysis of the Law of Vagrancy," Social Problems, 12, 1.

Clark, Sir George. 1937. Science and Social Welfare in the Age of Newton. Oxford: Clarendon.

Douglas, Jack D. 1971a. "The Rhetoric of Science and the Origins of Statistical Social Thought: The Case of Durkheim's Suicide," in Edward A. Tiryakian (ed.), The Phenomenon of Sociology: A Reader in the Sociology of Sociology. New York: Appleton-Century-Crofts.
Douglas, Jack D. 1971b. American Social Order: Social Rules in a Pluralistic Society. New York: Macmillan.

Ford, Julienne. 1975. Paradigms and Fairy Tales: An Introduction to the Science of Meanings. London: Routledge & Kegan Paul.

Foucault, Michel. 1965. Madness and Civilization. New York: Pantheon.

Gramsci, Antonio. 1971. Selections from the Prison Notebooks. New York: International.

Gurvitch, Georges. 1964. The Spectrum of Social Time. Dordrecht, Holland: D. Reidel.

Hempel, Carl G. 1966. Philosophy of Natural Science. Englewood Cliffs, New Jersey: Prentice-Hall.

Hogben, Lancelot. 1957. Statistical Theory: The Relationship of Probability, Credibility, and Error. New York: W.W. Norton.

Horkheimer, Max. 1947. Eclipse of Reason. New York: Seabury.

Husserl, Edmund. 1964. The Phenomenology of Internal Time Consciousness. Bloomington, Ind.: Indiana University Press.

Kendall, M.G. 1972. "The History and Future of Statistics," in T.A. Bancroft (ed.), Statistical Papers in Honor of George W. Snedecor. Ames, Iowa: Iowa State University Press.

Lazarsfeld, Paul F. 1961. "Notes on the History of Quantification in Sociology - Trends, Sources, and Problems," Isis, 52:2.

Lukacs, Georg. 1971. History and Class Consciousness: Studies in Marxist Dialectics. Cambridge, Mass.: MIT Press.

Marx, Karl. 1969 (1863). Resultate des unmittelbaren Produktionsprozesses. Frankfurt: Verlag Neue Kritik.

Mills, C. Wright. 1959. The Sociological Imagination. New York: Oxford University Press.

Mumford, Lewis. 1964. The Pentagon of Power. New York: Harcourt-Brace-Jovanovich.

Mumford, Lewis. 1967. "Quality in the Control of Quantity," in S.V. Ciriacy-Wantrup and James J. Parsons (eds.), Natural Resources: Quality and Quantity. Berkeley, California: University of California Press.

Nagel, Ernest. 1939. Principles of the Theory of Probability. Chicago: University of Chicago Press.

Nagel, Ernest. 1961. The Structure of Science: Problems in the Logic of Scientific Explanation. New York: Harcourt, Brace & World.

Nagel, Ernest and James R. Newman. 1958. Gödel's Proof. New York: New York University Press.

Neuberg, Leland Gerson. 1977. "The Limits of Statistics in Planning Analysis," Quality and Quantity, 11.

Polanyi, Michael. 1964. Personal Knowledge: Towards a Post-Critical Philosophy. New York: Harper & Row.

Polya, George. 1948. How to Solve It. Princeton, N.J.: Princeton University Press.

Polya, George. 1954a. Induction and Analogy in Mathematics: Volume One of Mathematics and Plausible Reasoning. Princeton, N.J.: Princeton University Press.

Polya, George. 1954b. Patterns of Plausible Inference: Volume Two of Mathematics and Plausible Reasoning. Princeton, N.J.: Princeton University Press.

Pozzuto, Richard. 1975. Scientific Sociology, Alienation and Community. PhD dissertation, Department of Sociology, University of Oregon, Eugene, Oregon.

Schroyer, Trent. 1973. The Critique of Domination: The Origins and Development of Critical Theory. New York: George Braziller.

Schutz, Alfred. 1964. "The Well-informed Citizen: An Essay on the Social Distribution of Knowledge," in Collected Papers, Volume II. The Hague: Martinus Nijhoff.

Schutz, Alfred and Thomas Luckmann. 1973. Structures of the Life-World. Evanston, Illinois: Northwestern University Press.

Selvin, Hanan C. 1957. "A Critique of Tests of Significance in Survey Research," American Sociological Review, 22:4.

Snizek, William E. 1975. "The Relationship between Theory and Research: A Study in the Sociology of Sociology," Sociological Quarterly, 16:3.
"The Role of Probability in Quantum Mechanics," 1969 in Studies in the Methodology and Foundations of Science , Dordrecht, Holland: D. Reidel. Tarski, Alfred. "The Semantic Conception of Truth and the Foundations 1944 of Semantics,: Philosophy and Phenoroenological Research , 4. Venn, John. 1962 Logic of Chance, New York: Chelsea. Weber, Max. From Max Weber: Essays in Sociology , H.H. Gerth and 1946' C. Wright Mills Ceds.), New York: Oxford University Press 1947 The Theor • of Social and Economic Organization . New York: Macmillan. 1958 The Protestant Ethic and the Spirit of Capitalism, New York: Scribners. Wittgenstein, Ludwig. Philosophical Investigations , New York:Macmillan. 1953 25