Final Report of the Enquete Commission on "So-called Sects and Psychogroups": New Religious and Ideological Communities and Psychogroups in the Federal Republic of Germany, p. 150
“Thus, the milieu of control identified by Hassan (136), consisting of behavioral control, mental control, emotional control, and information control, cannot, in every case and as a matter of principle, be characterized as “manipulative.” Control of these areas of action is an inevitable component of social interactions in a group or community. The social control that is always associated with intense commitment to a group must therefore be clearly distinguished from the exertion of intentional, methodical influence for the express purpose of manipulation.”
Ftnt 136. The repertoire of hard manipulative control measures includes the generation of physical and mental stress through harassment, overstimulation, or the complete withdrawal of stimuli (“sensory deprivation”).
Extracts From: Melton, Gordon J., Brainwashing and the Cults: The Rise and Fall of a Theory, 1999
“While many objected to their son or daughter joining any religion different from that in which they had been raised, parents were particularly upset by those new groups who sought the full-time commitment of recruits, accepting them not just into membership but into a career either as an administrator, teacher, or missionary for the group, or a resident of a commune or monastic-like community. The brainwashing idea came as a godsend to parents who had been objecting to their offspring's joining one of the new movements, as it offered what appeared to be a scientific rationale for their son or daughter's actions.”
“In reaching out for some reason why a young adult would radically reject the way which parents had prepared for them to find a successful (and by their standards, normal) life, parents tended to place the blame upon the group that s/he had joined, and increasingly upon the leader of that group. The several organizations founded in the early 1970s drew upon the literature developed primarily by American Evangelical Christian writers that referred to the new religions as "cults" … a "cult" being defined, as it was in Evangelical literature, merely by its espousal of a radically different set of beliefs.”
“The idea of brainwashing came out of the misunderstanding of the Chinese indoctrination program directed at American Armed Forces prisoners during the Korean War.”
“Lifton, Schein, and their colleagues concluded that in fact coercive persuasion, in which a mixture of social, psychological and physical pressures are applied to produce changes in an individual's self-perception, beliefs and attitudes, does occur. However, they also concluded that a necessary condition of its occurring was the physical element: confinement or its equivalent. As Schein put it, "... the coercive element in coercive persuasion is paramount (forcing the individual into a situation in which he must, in order to survive physically and psychologically, expose himself to persuasive attempts)." (12) They also concluded that it was successful only on a minority of those subjected to it and its end result was very unstable, the individuals so coerced tending to revert to their previous condition soon after the coercive force was removed.”
“Further, she added that the use of these techniques led members to become incapable of complex, rational thought, to give stereotyped responses to questions, and to find decision-making difficult. Much that was asserted in articles such as these resonated with the findings of new religions scholars in general who studied what were seen as "high demand" religions, which employed a variety of, to borrow a phrase from Rosabeth Kanter, "commitment mechanisms" to encourage and hold group members.(18) However, critics noticed that Singer consistently employed the language of brainwashing and Pavlovian conditioning. While quoting her mentor Edgar Schein, she largely avoided discussions of two key issues: the necessary element of coercion involved in the process of coercive persuasion and the issue of the overriding of the free will of people upon whom the persuasive techniques are used.”
“Among the first, and certainly the most important, responses to the early writings of Singer and her associates was the article by Thomas Robbins and Dick Anthony, "The Limits of 'Coercive Persuasion' as an Explanation for Conversion to Authoritarian Sects," the first article to appear below,(24) originally published in the Summer 1980 issue of Political Psychology. While admitting the possible limited use of a "coercive persuasion" model in the study of new religions, Robbins and Anthony argued that the use of such arguments as a justification of deprogramming and legal action was illegitimate. Such extended uses, they argued, ignored the significant differences between different religious movements, wrongly equated the voluntary affiliation operative in religious groups with the physical constraint operative in government-run totalistic institutions (such as prison camps), lacked any evidential support that persons subjected to "coercive persuasion" failed to exercise free will, and relied too heavily on the testimonies of ex-members whose accounts of life in the group had previously been affected by the work of deprogrammers and/or sessions with a therapist. (Singer had noted in her Psychology Today article that her view of the new religions had been almost totally formed by her sessions with ex-members, the great majority of whom had come to her only after being deprogrammed.)”
“These people [deprogrammed] constituted but a tiny percentage of former members (10 to 15%), and were drawn from the same relatively few groups upon which the anti-cult movement was focused.
Attempts to survey and study ex-members were pioneered by J. T. Ungerleider, D. K. Wellisch, Trudy Solomon and Stuart Wright, whose works helped to break many of the stereotypes of former members. Ungerleider and Wellisch (32) were among the first to point out significant differences between ex-members who left voluntarily and those who were deprogrammed, the latter group usually going on to become involved with the anti-cult movement and in the practice of deprogramming others. Solomon and Wright extended the consideration, pointing out that those former members involved with the anti-cult movement represented only a very small percentage of former members. Solomon found, in her study of former members of the Unification Church, that attitudes toward the Church were directly related to the method of severing membership (voluntary or forced) and the subsequent level of contact with the anti-cult movement (low to high), with the latter option correlating with a negative assessment of the Church. (33) In like measure, Wright found that those who voluntarily left the various controversial new religions rarely adopted brainwashing language to discuss their experience. (34)
Then, spurred by Conway and Siegelman's rather blatant assertions, James R. Lewis and David G. Bromley took the research one step further and tested the claim of harm done to members by cults in their study of ex-members, "The Cult Withdrawal Syndrome: A Case of Misattribution of Cause" (1987),(35) reprinted below. This study largely laid to rest the continuing issue of pathology among former members of new religions. Using a more representative sample of former members, Lewis and Bromley measured the presence of the various pathological symptoms that Conway and Siegelman had discovered in their sample of former members (an extension of the symptoms discussed elsewhere by Singer). While disconfirming many of Conway and Siegelman's assertions, such as that people who had been in groups longer would show more symptoms, Lewis and Bromley were able to pinpoint the major source of dysfunctional symptoms among ex-members: the process of leaving the group.
Lewis and Bromley considered the presence of symptoms relative to the type of exit from the group. They divided the sample into those who left voluntarily and received no counseling by individuals associated with the anti-cult movement, those who left and then received some form of voluntary deprogramming (usually termed exit counseling), and those who were involuntarily deprogrammed. While the entire sample showed significantly lower levels of dysfunctional symptoms than the one reported upon by Conway and Siegelman, it did show a dramatic relationship between the method of leave-taking and the presence of symptoms. Those associated with the anti-cult movement had measurably higher levels of symptoms, but those who had been deprogrammed had a radically higher number of symptoms than the general sample.
The Lewis and Bromley study became a landmark study in shifting the onus of pathology experienced by former members of new religions from the religions to the coercive activity of the anti-cult movement. In the wake of this study (and other works that confirmed its findings), treating former members as people in need of psychological help has largely ceased. The lack of any widespread expressed need for psychological help by the tens of thousands of former members of new religions in the succeeding decade has itself become the strongest evidence refuting the early sweeping condemnation of new religions as causes of psychological trauma.”
“After considering all of the arguments put forth by the exponents of the Singer Hypothesis, and listening to the counterarguments, one point of overwhelming consensus had emerged: that brainwashing was an inadequate model for understanding the dynamics operative in new religious movements.”
“However, it was psychologist Dick Anthony who in the end produced the most thorough study of Singer's views and offered what has remained the most important response to them: his lengthy paper, "Religious Movements and 'Brainwashing' Litigation: Evaluating Key Testimony" that appeared in the second edition of the textbook, In Gods We Trust: New Patterns of Religious Pluralism in America (1989). (39)”
“It is this very idea, popularly called brainwashing, which had been discredited by the work of Lifton and Schein, and had never gained any scientific credibility.”
“According to Anthony, the brainwashing paradigm was and is actually pseudoscience. It began as a propaganda ploy which was developed by the American CIA to counter Communist propaganda that claimed that Western POWs in Korea and civilian prisoners on the Communist mainland were converting to Communism. The "brainwashing hoax", as it was referred to by one researcher, claimed that the Communists had invented scientific techniques of coercive persuasion capable of forcing people to convert to Communism against their wills. The essence of the brainwashing notion is that people are put into a hyper-suggestible altered state of consciousness through hypnosis, drugs, debilitation or other means, and then their worldviews are transformed against their wills through conditioning techniques.”
“London drives home the failure to provide supporting evidence of such a unique theory as that offered by Singer and Ofshe, one that has been almost uniformly rejected in the scientific literature. In this regard, he conducted an independent search of the previous fifteen years of psychological literature covering 1400 journals in 29 languages. His search yielded "no empirical studies" supportive of her position and only a modest number of speculative/theoretical articles. London's work drove another nail in the coffin into which the Singer hypothesis had been placed by the APA and then by Anthony's work. Echoing Anthony, he concludes most forcefully, "... that what I have called the Robot Theory, meaning any theory of social influence processes and/or irreversible social influence processes and/or subversion of the will as a result of these social influence processes, does not present an argument which is generally accepted in contemporary scientific psychology. That is the main reason I believe that this topic has not been the object of scientific study and research in general and is not widely discussed in the literature of the social, behavioral, or medical sciences."”
“Since the late 1980s, though a significant public belief in cult-brainwashing remains, the academic community (including scholars from psychology, sociology, and religious studies) has shared an almost unanimous consensus that the coercive persuasion/brainwashing thesis proposed by Margaret Singer and her colleagues in the 1980s is without scientific merit. To date, no one has come forward to refute the arguments, especially those advanced by Dick Anthony a decade ago, nor has the situation that Perry London found concerning articles providing an empirical base for the theory been reversed. Through the 1990s, it has been difficult to locate any scholar in the English-speaking world who has been willing to attempt a defense of it, and even Singer herself has appeared to back away from her earlier position.”
“Malony argues, with the mass of social and psychological literature to back him up, that "social influence" occurs; that while he personally (along with the entire field of clinical psychology) aims at producing individuals with strong egos, capable of individual self-determination, he cannot escape the fact that none of us can escape the effects of personal interactions with others and of the social organizations (from nations to families) to which we belong. There is no act that is totally autonomous. While people have the privilege and responsibility to determine their lives for themselves, complete self-determination is at best a heuristic goal. At times, individuals make decisions which most (family members, neighbors, friends, fellow employees) consider unwise, including the choice to join an unpopular religious organization.”
Extracts From: “Brief Amicus Curiae of the American Psychological Association”
In The Supreme Court of the State of California
February 10, 1987
“The viability of plaintiffs' "coercive persuasion" theory hinges on the admissibility of this purported expert testimony; without it, plaintiffs lack any factual basis for their claim of coercion…the lower courts were correct to exclude the testimony of Drs. Singer and Benson. Their proffered testimony failed to meet basic scientific standards of reliability and validity incorporated into the test for admissibility set forth in California Evidence Code § 801. Specifically, the conclusions Drs. Singer and Benson assert cannot be said to be scientific in any meaningful sense (Point I.B.), and the methodologies generating those conclusions depart so far from methods generally accepted in the relevant professional communities that they are incapable of producing reliable or valid results (Point I.C.). Stripped of the legitimating lustre of a scientific pedigree, plaintiffs' purported scientific claim of coercive persuasion is little more than a negative value judgment rendered by laypersons about the religious beliefs and practices of the Unification Church. (Point I.D.).” (8)
“Amici will demonstrate that a strong consensus of relevant professional opinion supports the trial court's decision to exclude the proffered testimony of Drs. Singer and Benson because that testimony did not meet these standards of general acceptance in the scientific community. The theories of Drs. Singer and Benson are not new to the scientific community. After searching scrutiny, the scientific community has repudiated the assumptions, methodologies, and conclusions of Drs. Singer and Benson. The validity of the claim that, absent physical force or threats, "systematic manipulation of the social influences" can coercively deprive individuals of free will lacks any empirical foundation and has never been confirmed by other research. (Point I.B.) The specific methods by which Drs. Singer and Benson have arrived at their conclusions have also been rejected by all serious scholars in the field. (Point I.C.)” (15)
“From a scientific point of view, it is exceedingly difficult--most would say wholly illegitimate--to evaluate allegedly coercive acts by measuring their effect on some ineffable human quality called free will. To do so, a scientist would have to define what free will is, describe how the environment affects free will, and decide the point at which the effects become so great that free will can be said to be overborne. In such inquiries swirl the deepest philosophical mysteries of human existence; no responsible scientist lays claim to the power to define or discuss free will in this sense. See Balch, What's Wrong With the Study of New Religions And What We Can Do About It, in Scientific Research and New Religions 25 (B. Kilbourne, ed. 1985) (as a descriptive label, "brainwashing ... is essentially useless because it depends on untestable assumptions about the slippery issues of freedom and control") (hereafter The Study of New Religions).” (16)
“When plaintiffs' theory of coercive persuasion is evaluated in this scientific way, its plausibility evaporates. A significant and uncontradicted body of empirical social science evidence demonstrates that the overwhelming majority of persons who undergo the process plaintiffs describe as "coercive persuasion," even for a period of weeks, choose not to affiliate with the Unification Church. Several studies of Unification Church” (17)
“Proof that individuals with certain traits are more likely to join the Church is not proof that such individuals were deprived of their free will. The traits common to such a group might well be those--such as a questing nature or a desire for community-- that predispose group members toward religious experience.” (21)
“Extant evidence demonstrates that the qualities that dispose individuals toward joining the Church are not qualities of "vulnerability." Barker, Making of a Moonie, supra, at 235 ("it is precisely those whom one might have expected to be the most vulnerable to persuasion who turn out to be the nonjoiners"). Accord Richardson, The Active vs. Passive Convert: Paradigm Conflict in Conversion/Recruitment Research, 24 J. for the Scientific Study of Religion 163 (1985); Richardson, Psychological and Psychiatric Studies of New Religions, in II Advances in the Psychology of Religion 209, 217, 220 (L. Brown, ed. 1985) (hereafter Psychological and Psychiatric Studies). Other studies refute the suggestion that Church members are in any way impaired in their capacity for rational thought and choice. E.g., Ungerleider & Wellisch, Coercive Persuasion (Brainwashing), Religious Cults, and Deprogramming, 136 Am. J. Psychiatry 279, 281 (1979) ("No data emerged from intellectual, personality, or mental status testing [of more than 50 "cult members"] to suggest that any of these subjects are unable or even limited in their ability to make sound judgments and legal decisions as related to their persons and property"). If there be a trait common to those who decide to join the Church it is "strong ideological hunger." Id. at 282.” (21-22)
“Precisely because free will is ineffable and not susceptible to direct observation or measurement, drawing any conclusions about deprivation of free will is an exceedingly uncertain enterprise. When scientists purport to conclude that an individual has been deprived of free will they have stepped beyond the sphere of their expertise. Such a claim does not partake of science because it cannot be measured or tested; it has no empirical foundation. See Point II.A.2 supra. Accordingly, when Drs. Singer and Benson proffered testimony that Church conversion practices overcame plaintiffs' free will, they were not speaking as scientists. Their claim must thus be considered unreliable in the most fundamental sense. It is philosophical speculation, not science. Scientists can evaluate the degree to which the conversion practices of the Church result in a decision to join the Church by those subjected to the practices, but all available scientific evidence of this nature refutes the claim of coercion plaintiffs advance.” (22)
“Drs. Singer and Benson have exaggerated the findings of the original POW studies respecting the efficacy of mind control techniques; such techniques "are neither mysterious nor new, nor have they nearly the effectiveness attributed to them by popular writers." D. Bromley & A. Shupe, Strange Gods 100 (1981) (hereafter Strange Gods). See also Richardson & Kilbourne, Classical and Contemporary Applications of Brainwashing Models, supra, at 31-32; Robbins, Goodbye to Little Red Riding Hood, 10 Update: A Quarterly Journal of New Religious Movements 5, 6-7 (1986)” (23)
“Drs. Singer and Benson have wholly failed to account for a crucial factor distinguishing Unification Church conversion practices from Korean War POW camps: the complete absence in the Church context of physical confinement, torture, death threats and severe physical deprivations. Physical confinement and abuse were--as Dr. Singer acknowledged in another context--central to the debilitating character of POW camps, see Strasser & Thaler [Singer], A Prisoner of War Syndrome: Apathy as a Reaction to Severe Stress, 122 Am. J. Psychiatry 998 (1956). Accord Lunde & Wilson, "Brainwashing" as a Defense to Criminal Liability: Patty Hearst Revisited, 13 Crim. L. Bull. 341, 351 (1977) ("Coercive persuasion occurs when a person is subjected to intense and prolonged coercive tactics and persuasion in a situation from which that person cannot escape").” (23)
“Even though religious groups throughout history have required a symbolic humbling of the individual to gain admission, the degree of debasement required by the new religious groups does not seem much greater than hazing by college fraternities and less than that experienced by people entering the armed forces. There is also nothing analogous to the interrogation political prisoners underwent. Converts to some new religious groups may undergo public self-analysis, but this is apparently beneficial for the individuals involved since there is some evidence that their mental health improves after they join. ... The kind of confession that is a part of brainwashing quite obviously plays no role in the new religions. Nor is there anything similar to the manipulation of rewards and punishments that characterizes brainwashing. There is indoctrination in the sense that there is systematic presentation of a belief system without competing belief systems being discussed, but this is by no means a unique feature of new religious groups; it seems to be a universal characteristic of all religious and ideological groups. Although group influence plays a major role in reinforcing beliefs in the new religious groups, this also seems a universal feature of religious and ideological belief formation. Since new religious groups depend on the conversion of adults to gain members, rather than indoctrination of children which characterizes more established religious groups, the practice of the more established groups is, in this respect, closer to brainwashing than that of the new religious groups.—James, Brainwashing, supra, at 254-255 (internal citation omitted).” (25-26)
“Physical confinement and abuse was a necessary condition for coercive persuasion in the POW context.” (26)
“the entire conceptual framework for the conclusions of Drs. Singer and Benson has been rejected by the scientific community.” (26)
“Common sense suggests--and scientific analysis confirms--that some individuals who have left a movement, particularly a movement as demanding of adherents as is the Unification Church, are likely to have become disillusioned. Such individuals may regret the experience or resent the movement for the material sacrifices demanded or for the estrangement from family and friends that may have resulted. Under such circumstances individuals might be expected to provide hostile accounts of their experience. They might be expected to seek self-serving rationalizations to explain to families and to themselves their original decision to affiliate. By explaining affiliation as brainwashing, former members can place responsibility for their past actions and resulting harms on the Church rather than on themselves. Richardson, van der Lans & Derks, Leaving and Labelling: Voluntary and Coerced Disaffiliation From Religious Social Movements, 9 Research in Social Movements, Conflicts and Change 97 (1986) (hereafter Leaving and Labelling). Accord, Bromley & Shupe, Strange Gods, supra, at 203-04. Kelley, Deprogramming and Religious Liberty, 4 Civil Liberties Rev. 27, 31 (1977). See generally J. Biermans, The Odyssey of New Religious Movements 81-94 (1986).” (28-29)
“information received from family or friends of a former Church member is likely to reflect hostility to the Church. Persons close to the former Church member will, like the former member, seek explanations for the former member's decision to depart radically from their previously shared belief system. Such persons will also in many cases need to justify their decision forcibly to impose a regimen of deprogramming. For these reasons, "[r]elatives and friends, no matter how well-intentioned, are known for their anti-cult campaigns and are not impartial observers." Saliba, Psychiatry and the New Cults, supra, at 46. Accord, Melton & Moore, The Cult Experience, supra, at 43.” (29)
“Several recent studies show that individuals who have been "deprogrammed" manifest far greater hostility toward their former organization and claim "brainwashing" or coercive persuasion far more often than do members of the much larger group who leave such organizations of their own volition.” (30)
“Whereas most "deprogrammed" individuals claim they joined the Church as a result of coercive persuasion, e.g., Lewis, Reconstructing the Cult Experience, supra, almost no one among the far larger numbers who depart voluntarily makes such a claim. Wright, Post-Involvement Attitudes, supra. These studies have observed that "presentation of the brainwashing ideology appears to be one of the most essential components of the deprogramming process." Lewis, Reconstructing the Cult Experience, supra, at 157; accord Solomon, Survey of Ex-Members, supra, at 289. See also Barker, Making of a Moonie, supra, at 129. Dr. Singer herself noted in testimony in a case in the United Kingdom that "[t]he deprogrammers . . . tell the current members . . . about how the process of mind-control, brainwashing, the imposed identity change, was brought about." See id. at 129 (quoting testimony of Dr. Singer) (emphasis added). Given the importance of the brainwashing explanation in the deprogramming process and the extreme frequency with which deprogrammed former members--but not former members who departed voluntarily--claim coercive persuasion, it may well be that Drs. Singer and Benson have been observing the effects of deprogramming by persons seeking to sever members' affiliation with the Church,[13] and not the effects of the Church's conversion practices. Lewis, Apostates and the Legitimation of Repression 21 (Institute for the Study of Religion, G. Melton ed. 1986) ("ex-members who have been 'counseled' by anti-cultists should be especially suspect as being less than neutral witnesses") (hereafter Apostates). Coleman, New Religions and "Deprogramming": Who's Brainwashing Whom?, in Cults, Culture, and the Law 71 (1985) (suggesting that deprogramming process more closely resembles the mind-control techniques employed in Korean War POW camps than does the Unification Church conversion method).” (30-32)
“Just as interviews of divorced persons would not be expected to yield neutral evidence about the institution of marriage or about the moral character of an ex-spouse, so too interviews of deprogrammed Church members cannot be expected to yield neutral evidence about Church conversion practices.” (33)
“the claim that Church membership caused the observed maladies has no validity because it wholly fails to account for other explanations for the observed harms; it equates correlation with causation. For example, the methodology of Drs. Singer and Benson fails to account for four obvious alternative explanations: (i) the observed condition might have preceded membership in the Church; (ii) both the observed condition and Church membership might be explained by a third factor; (iii) the observed condition might have been caused by deprogramming, see note 13 supra; (iv) the observed condition might be the product of readjustment to life after membership in the Church.
This failure to exclude rival explanations for an observed correlation violates another first principle of scientific research. "[W]hen a social scientist observes a correlation between two variables it is often tempting to simply assume that the relationship is causal in nature . . . . This assumption is unsound whenever the observed relationship can reasonably be explained in a different way. Thus, [c]ausal inferences require research designs that can control for plausible rival hypotheses." Neale & Liebert, Science and Behavior, supra, quoted in Monahan & Walker, Social Science in Law, supra, at 54-55.
One equally plausible interpretation of the claimed correlation would be that the observed psychological maladies preceded Church membership and led to the decision to affiliate. Drs. Singer and Benson have failed to exclude this rival hypothesis. Scientists call this uncertainty about cause and effect a "directionality" problem. Id. Nor are Drs. Singer and Benson in a position to exclude the possibility that some other factor (such as poor family relations) independently caused both Church membership and the observed psychological distress. Scientists call this a "third variable" problem. Id. Nor are Drs. Singer and Benson in a position to exclude the possibility that the observed psychological distress was caused independently by another correlative variable such as deprogramming or transition from a religious environment to a secular environment. See Melton & Moore, The Cult Experience, supra, at 57; Saliba, Psychiatry and the New Cults, supra, at 46 ("[R]e-entry problems, like indecision, often follow whenever a person makes a major shift in life and commitment. . . . They are life problems and not cultic ones").” (34-35)
“A significant body of evidence suggests that membership in new religious organizations such as the Unification Church tends to relieve psychological distress. E.g., Deutsch and Miller, A Clinical Study of Four Unification Church Members, 140 Am. J. Psychiatry 767, 769 (1983); Galanter, Charismatic Religious Sects and Psychiatry: an Overview, 139 Am. J. Psychiatry 1539 (1982). See also Melton & Moore, The Cult Experience, supra, at 42; J. T. Ungerleider, The New Religions 15-16 (1979). Cf. Griffith, Young & Smith, An Analysis of the Therapeutic Elements in a Black Church Service, 35 Hosp. and Com. Psychiatry 464 (1984) (similar findings).[15] Other studies show that those who have been deprogrammed manifest emotional distress symptoms far more often than do those who depart new religions voluntarily. Lewis & Bromley, The Cult Withdrawal Syndrome, supra.” (36)
“The decision to affiliate with a new religion like the Unification Church can involve--or at least appear to involve--drastic personal changes. Total devotion to what Church members believe to be the will of God may require most of a Church member's time, energy and money. Career goals and recreational preferences may be altered, as may family relations. The Church member may adopt a set of beliefs and rituals alien and incomprehensible to those in the mainstream. Such encompassing changes are not easily understood by those who have not undergone them. It may be difficult to accept the idea that Church members could have chosen freely to adopt a way of life that demands so much and a belief system that sets them apart.
The proffered testimony of plaintiffs' experts, Drs. Singer and Benson, purports to supply a scientific explanation that makes these dramatic changes comprehensible: those who join the Church, such as plaintiffs, did not choose freely to adopt this alien way of life but were psychologically coerced into affiliating. Amici believe, for the reasons set forth in this brief, that this assertion has no scientific validity whatsoever and that expert testimony purporting to establish that claim was properly excluded by the trial court.[17]
Denuded of its scientific legitimacy, the coercive persuasion theory, as applied in the Unification Church context, amounts to little more than a refusal to accept that persons could choose to adopt the belief system and way of life of that Church. See Robbins, "Uncivil" Religions and Religious Deprogramming, 61 Thought 277, 280 (1986) (mind-control hypotheses "frequently entail sinister clinical interpretations of behaviors and processes which might otherwise be seen merely as indicative of intense religious commitment"). It is, in other words, simply a layperson's negative value judgment about the beliefs and practices of the Unification Church. See Anthony and Robbins, New Religions, Families, and "Brainwashing", in In Gods We Trust 263, 266-267 (1981); Richardson, The "Deformation" of New Religions: Impacts of Societal and Organizational Factors, in Cults, Culture and the Law 163, 164 (1985); Balch, The Study of New Religions, supra, at 25. As the trial court in this case found, "Both doctors . . . seem to have reasoned backwards from their disapproval of those methods to the conclusion that Plaintiffs were not thinking freely because they were persuaded by them." Quoted in 179 Cal. App. 3d at 466 n.9.” (37-39)
“Guaranteeing "the free exercise" of religion, the words of the Constitution's text also shield conduct undertaken for reasons of faith. "[T]he right to the free exercise of religion unquestionably encompasses the right to preach, proselyte, and perform other similar religious functions." McDaniel v. Paty, 435 U.S. 618, 626 (1978) (Burger, C.J.) (plurality opinion).” (40-41)
“If applied only to new religions such as the Unification Church, plaintiffs' coercive persuasion theory would transgress the First Amendment requirement that government maintain a posture of strict neutrality respecting religion. See Goldman v. Weinberger, ___ U.S. ___, 106 S. Ct. 1310, 1314 (1986) (Stevens, J., concurring). Yet it is inconceivable that this principle could be applied universally. The State of California does not regulate the conduct plaintiffs label coercive persuasion in any other non-religious context, or even with respect to any other religion. To impose the restrictions plaintiffs desire on the Unification Church, this State would have to regulate rites and traditions common to mainstream religions, such as indoctrination of the young or manipulation of guilt feelings to induce upright behavior. See James, Brainwashing, supra, at 255. The use of intensive persuasion in nonreligious settings--such as politics or "hard sell" advertising--would also have to give rise to tort liability. At the very least, universal application would drag courts into a swamp of intractable factual and legal judgments. At worst, universal application would require courts to impose liability for a wide range of conduct society accepts as wholly legitimate.” (45-46)
“Had the legal standards plaintiffs advocate been applied to an emergent Christianity, the creeds by which most Americans abide might never have grown to fruition. The Christian tradition abounds with examples of sudden conversion to an intense and previously alien way of life: Saul on the road to Damascus; Francis of Assisi renouncing the commercial pursuits of his father; the disciples dropping their nets and joining Jesus. The annals of Christianity also reveal the disapprobation with which conversion to a life of intense devotion was greeted, even by those closest to the converts. St. Thomas Aquinas, for example, was kidnapped from his order by his father and brothers because the mendicant friars were thought disreputable by mainstream society. Flinn, Criminalizing Conversion: The Legislative Assault on New Religions, in Crimes, Values and Religions 35 (Day & Laufer eds. 1986). The history of religious movements should thus enlighten us to the risk that society may seek to deter the proselytizing of new religions such as the Unification Church because society cannot comprehend and appreciate their alien view of spirituality, not because new religions pose a genuine threat to public health or safety.” (46)
“"A way of life that is odd or erratic but interferes with no rights or interests of others is not to be condemned because it is different." Wisconsin v. Yoder, at 224. The choice to affiliate with a new religion such as the Unification Church and to adopt its demanding ways might strike those in the mainstream as erratic or preposterous, but by its nature faith cannot be subject to tests of rationality. See United States v. Ballard, supra, 322 U.S. at 86.” (47)
“Of the 3,500 POWs held in Chinese camps, only 50 ever made procommunist statements and only about 25 ultimately refused repatriation. A. Scheflin & E. Opton, The Mind Manipulators 89 (1978). Edgar Schein, whose POW studies provided the theoretical basis for coercive persuasion theorists, was careful to distinguish between acts of trivial collaboration to avoid punishment or gain amenities and genuine ideological conversion. Although collaboration was prevalent, Schein found, genuine conversion was rare. "[C]onsidering the effort devoted to it," Schein concluded, "the Chinese program was a failure." Schein, The Chinese Indoctrination Program for Prisoners of War: A Study of Attempted 'Brainwashing', in Readings in Social Psychology 332 (Maccoby, Newcomb, and Hartley eds. 1958).” (ftnt. 10)
“In a 1979 article, Dr. Singer stated that at least 75% of those she had interviewed did not leave their "cults" voluntarily and that "most group members had seen deprogrammers." Singer, Coming Out of the Cults, Psychology Today 71, 72 (January 1979).” (ftnt. 12)
“The term "deprogramming" refers to the process--typically undertaken at the instigation of parents of members of new religious groups--to force members of such groups to terminate affiliation. Deprogramming typically involves involuntary abduction of the group member, confinement of the group member for several days, repetitive indoctrination of the group member about the evils of his or her group, and intensive application of psychological pressure upon the group member to renounce his or her affiliation. See Coleman, New Religions and Deprogramming: Who's Brainwashing Whom?, in Cults, Culture and the Law 71 (1985).” (ftnt. 13)
“amici believe there is a much more scientifically responsible explanation for the events at issue. Plaintiffs, like many people their age during these times (as well as other times) were searching for meaning in their lives. They did not reject the values of the past, but--as does the idealistic youth of most ages--they were questioning them. They found a group who seemed to have been through similar experiences and to have found some answers. These people, convinced of the power of their own religious insights, wanted to share their faith. Plaintiffs tried this for a time and during this span thought they had found a belief system that satisfied their search for meaning. Later, for whatever reasons, they began to see that this system of belief did not work for them any longer. They therefore returned to their search, older and wiser. This is a pattern common to many people searching for a belief system. See Richardson, The Active vs. Passive Convert: Paradigm Conflict in Conversion/Recruitment Research, 24 J. for the Scientific Study of Religion 163 (1985) (describing "seekership" pattern). As do many people in transition, Molko and Leal felt compelled to explain these changes of direction, and found it convenient to blame the Church for their failure to find meaning. See Richardson, van der Lans & Derks, Leaving and Labelling: Voluntary and Coerced Disaffiliation From Religious Social Movements, 9 Research in Social Movements, Conflicts and Change 97 (1985).” (ftnt. 17)
Extracts From: “Mind control (brainwashing)” in The Skeptic's Dictionary.
“There are many misconceptions about mind control. Some people consider mind control to include the efforts of parents to raise their children according to social, cultural, moral and personal standards. Some think it is mind control to use behavior modification techniques to change one’s own behavior, whether by self-discipline and autosuggestion or through workshops and clinics. Others think that advertising and sexual seduction are examples of mind control. Still others consider it mind control to give debilitating drugs to a woman in order to take advantage of her while she is drugged. Some consider it mind control when the military or prison officers use techniques that belittle or dehumanize recruits or inmates in their attempt to break down individuals and make them more compliant. Some might consider it mind control for coaches or drill instructors to threaten, belittle, physically punish, or physically fatigue by excessive physical exercises their subjects in the effort to break down their egos and build team spirit or group identification.”
“A term with such slack in its denotation is nearly useless. In narrowing down the denotation the first thing to do is eliminate as examples of mind control those activities where a person freely chooses to engage in the behavior. Controlling one's thoughts and actions, whether by self-discipline or with the help of others, is an interesting and important topic, but it is not the same as brainwashing or programming people without their consent.”
“Love and fear may not be enough, however; so guilt must be used, too…What religion doesn't use guilt and fear to get people to police their own thoughts? Even some therapists use similar methods to control their patients…Are the recruits, the converts to the faith, and the patients willing victims? How would we tell the difference between a willing victim and an unwilling victim? If we cannot do that, then we can't distinguish any true cases of mind control.”
“Recruiters and other manipulators are not using mind control unless they are depriving their victims of their free will. A person can be said to be deprived of his free will by another only if that other has introduced a causal agent which is irresistible. How could we ever demonstrate that a person's behavior is the result of irresistible commands given by a religious, spiritual, or personal growth leader? It is not enough to say that irrational behavior proves a person's free will has been taken from them.”
“the tactics of the recruiters differ substantially from those of kidnappers or inquisitors. Recruiters generally do not kidnap or capture their recruits, and they are not known to use torture as a typical conversion method. This raises the question of whether their victims are controlled without their consent. Some recruits are not truly victims of mind control and are willing members of their communities. Similarly, many recruits into mainstream religions should not be considered victims of mind control. To change a person's basic personality and character, to get them to behave in ways contradictory to lifelong patterns of behavior, to get them to alter their basic beliefs and values, would not necessarily count as mind control. It depends on how actively a person participates in their own transformation. You and I might think that a person is out of his mind for joining Scientology, Jehovah's Witnesses, or Jim Roberts' The Brethren, but their "crazy beliefs and behaviors" are no wilder than the ones that millions of mainstream religious believers have chosen to accept and engage in.”
“It seems then, that if we define mind control as the successful control of the thoughts and actions of another without his or her consent, mind control exists only in fantasy.”
Extracts From: "Apostasy" in Wikipedia. Version: 15 March 2008
Bromley and Shupe, while discussing the role of anecdotal atrocity stories told by apostates, propose that these are likely to paint a caricature of the group, shaped by the apostate's current role rather than by his experience in the group, and they question the apostates' motives and rationale. James R. Lewis and David G. Bromley claim that the onus of pathology experienced by former members of new religious movements should be shifted from these groups to the coercive activities of the anti-cult movement. [Bromley, David G. et al., The Role of Anecdotal Atrocities in the Social Construction of Evil; Bromley, David G. et al. (eds.), Brainwashing Deprogramming Controversy: Sociological, Psychological, Legal, and Historical Perspectives (Studies in Religion and Society), p. 156, 1984]
Massimo Introvigne in his Defectors, Ordinary Leavetakers and Apostates defines three types of narratives constructed by apostates of new religious movements:
Type I narratives characterize the exit process as defection, in which the organization and the former member negotiate an exiting process aimed at minimizing the damage for both parties.
Type II narratives involve a minimal degree of negotiation between the exiting member, the organization he intends to leave, and the environment or society at large, implying that the ordinary apostate holds no strong feelings concerning his past experience in the group.
Type III narratives are characterized by the ex-member dramatically reversing his loyalties and becoming a professional enemy of the organization he has left. These apostates often join an oppositional coalition fighting the organization, often claiming victimization.
Introvigne argues that apostates professing Type II narratives prevail among exiting members of controversial groups or organizations, while apostates that profess Type III narratives are a vociferous minority.
Daniel Carson Johnson, in his Apostates Who Never Were: The Social Construction of Absque Facto Apostate Narratives, characterizes the stories told by apostates as stories of captive involvement in the past with the targeted religious group, and stories of rescue and redemption in the present. He asserts that these narratives are what confirm the apostate role, and that the stories are not recitations of real-world experiences and happenings but are socially constructed and shaped along the lines dictated by an established literary form called the "apostate narrative". He advises social scientists studying the subject to consider the possibility that substantial portions, and perhaps entire accounts, have nothing to do with real-world happenings or experiences. [Johnson, Daniel Carson, "Apostates Who Never Were: The Social Construction of Absque Facto Apostate Narratives", in Bromley, David G. (ed.), The Politics of Religious Apostasy: The Role of Apostates in the Transformation of Religious Movements, New York: Praeger, 1998, pp. 134-135. ISBN 0-275-95508-7.]
Dr. Lonnie D. Kliever (1932-2004), Professor of Religious Studies at Southern Methodist University, in his paper The Reliability of Apostate Testimony about New Religious Movements, which he wrote at the request of Scientology, claims that the overwhelming majority of people who disengage from non-conforming religions harbor no lasting ill-will toward their past religious associations and activities, but that there is a much smaller number of apostates who are deeply invested in discrediting, and perform actions designed to destroy, the religious communities that once claimed their loyalties. He asserts that these dedicated opponents present a distorted view of the new religions and cannot be regarded as reliable informants by responsible journalists, scholars, or jurists. He claims that the unreliability of apostates is due to the traumatic nature of disaffiliation, which he compares to a divorce, but also to the influence of the anti-cult movement, even on those apostates who were not deprogrammed or did not receive exit counseling. (Kliever, Lonnie D., The Reliability of Apostate Testimony About New Religious Movements, 1995.)
Gordon Melton, while testifying as an expert witness in a lawsuit, said that when investigating groups one should not rely solely upon the unverified testimony of ex-members, and that hostile ex-members would invariably shade the truth and blow minor incidents out of proportion, turning them into major incidents. Melton also follows the argumentation of James R. Lewis and David G. Bromley (above) and claims that as a result of their study, the [psychological] treatment (coerced or voluntary) of former members largely ceased, and that a (perceived) lack of widespread need for psychological help by former members of new religions would in itself be the strongest evidence refuting early sweeping condemnations of new religions as causes of psychological trauma. (Melton, Gordon J., Brainwashing and the Cults: The Rise and Fall of a Theory, 1999.)
Bryan R. Wilson, who was a professor of Sociology at Oxford University, writes that apostates of new religious movements are generally in need of self-justification, and seek to reconstruct their past and to excuse their former affiliations, while blaming those who were formerly their closest associates. Wilson utilizes the term atrocity story, [a story] that is in his view rehearsed by the apostate to explain how, by manipulation, coercion or deceit, he was recruited to a group that he now condemns. (Wilson, Bryan R. (Ed.) The Social Dimensions of Sectarianism, Rose of Sharon Press, 1981) Wilson also challenges the reliability of the apostate's testimony by saying that "the apostate [is] always seen as one whose personal history predisposes him to bias with respect to his previous religious commitment and affiliations, [so] the suspicion must arise that he acts from a personal motivation, to vindicate himself and to regain his self-esteem, by showing himself to have been first a victim, but subsequently a redeemed crusader." (Wilson, Bryan R. Apostates and New Religious Movements, Oxford, England, 1994)
Stuart A. Wright explores the distinction between the apostate narrative and the role of the apostate, asserting that the former follows a predictable pattern in which the apostate utilizes a "captivity narrative" that emphasizes manipulation, entrapment and becoming a victim of "sinister cult practices". These narratives provide a rationale for a "hostage-rescue" motif in which cults are likened to POW camps, and deprogramming is seen as a heroic hostage rescue effort. He also makes a distinction between "leavetakers" and "apostates", asserting that despite the popular literature and lurid media accounts of stories of "rescued or recovering 'ex-cultists'", empirical studies of defectors from NRMs "generally indicate favorable, sympathetic or at the very least mixed responses toward their former group." (Wright, Stuart, A., Exploring Factors that Shape the Apostate Role, in Bromley, David G., The Politics of Religious Apostasy, pp. 95-114, Praeger Publishers, 1998. ISBN 0-275-95508-7)
Miscellaneous Quotes
"The apostate is generally in need of self-justification. He seeks to reconstruct his own past, to excuse his former affiliations, and to blame those who were formerly his closest associates. Not uncommonly the apostate learns to rehearse an 'atrocity story' to explain how, by manipulation, trickery, coercion, or deceit, he was induced to join or to remain within an organization that he now forswears and condemns." Bryan Wilson, The Social Dimensions of Sectarianism
"Others may ask, if the group is as transparently evil as he now contends, why did he espouse its cause in the first place? In the process of trying to explain his own seduction and to confirm the worst fears about the group, the apostate is likely to paint a caricature of the group that is shaped more by his current role as apostate than by his actual experience in the group." David Bromley, Anson Shupe, and J.C. Ventimiglia, The Role of Anecdotal Atrocities in the Social Construction of Evil," in Bromley and Richardson, Brainwashing Deprogramming Controversy, p. 156
"Most former members do not become apostates. They remain — in sociological terms suggested by David Bromley and others — "defectors" (members who somewhat regret having left an organization they still perceive in largely positive terms), or "ordinary leave takers" with mixed feeling about their former affiliation. However ordinary leave takers (and, to some extent, defectors) remain socially invisible, insofar as they do not like or care to discuss their genuine representatives of the former members. In fact, quantitative research shows that even in extremely controversial groups, apostates normally represent less than 15% of former members." Massimo Introvigne, Religious Liberty in Europe: Apostate
"Neither the objective sociological researcher nor the court of law can readily regard the apostate as a creditable or reliable source of evidence. He must always be seen as one whose personal history predisposes him to bias with respect to both his previous religious commitment and affiliations, the suspicion must arise that he acts from a personal motivation to vindicate himself and to regain his self-esteem, by showing himself to have been first a victim but subsequently to have become a redeemed crusader. As various instances have indicated, he is likely to be suggestible and ready to enlarge or embellish his grievances to satisfy that species of journalist whose interest is more in sensational copy than in a objective statement of the truth." Bryan R. Wilson, Apostates and New Religious Movements
"The dramatic import of each apostate's story is reinforced in its significance, to the detriment of objective and ethically neutral enquiry into religious phenomena of the kind undertaken by academic sociologists. Contemporary religious bodies, operating in a context of rapid social change and changing perceptions of religious and spiritual belief, are likely to be particularly susceptible to the disparagement and misrepresentation which occurs through the circulation and repetition of the accounts of apostates." Bryan Wilson, Apostates and New Religious Movements