=Paper=
{{Paper
|id=Vol-1895/AIC16_paper9
|storemode=property
|title=The Brain-Mind-Computer Trichotomy: Hermeneutic Approach
|pdfUrl=https://ceur-ws.org/Vol-1895/paper9.pdf
|volume=Vol-1895
|authors=Péter Érdi
|dblpUrl=https://dblp.org/rec/conf/aic/Erdi16
}}
==The Brain-Mind-Computer Trichotomy: Hermeneutic Approach==
Péter Érdi
Center for Complex Systems Studies,
Kalamazoo College, Kalamazoo, Michigan, USA
and
Institute for Particle and Nuclear Physics
Wigner Research Centre for Physics,
Hungarian Academy of Sciences, Budapest, Hungary
Abstract. A unifying framework, the brain-mind-computer trichotomy, is suggested and analyzed by adopting a hermeneutic approach. We argue that the brain is a hermeneutic device, and that hermeneutics is also necessary to understand situations and other minds. Intentional dynamics is a possible method to set this unifying framework. Specifically, computational studies suggest that schizophrenia, as a "disconnection syndrome", can be interpreted as the result of a broken hermeneutic circle.
1 Dichotomies
The term "brain" is often associated with the notions of "mind" and of "computer". The brain-mind-computer problem has been treated within the framework of three separate dichotomies: the brain-mind problem, the brain-computer analogy/disanalogy, and the computational theory of mind.
1.1 The brain-mind problem
First, the brain-mind problem is related to the age-old philosophical debate between monists and dualists. Attempts to "solve" the brain-mind problem can be classified into two basic categories:
1) materialistic monism, leading in its ultimate consequences to some kind of reductionism; and
2) interactionist dualism, which is more or less some type of Neo-Cartesian philosophy.
The classification is, obviously, a crude oversimplification: a wide spectrum of monistic theories exists, from Skinner's radical behaviorism [50] and Patricia Churchland's eliminative materialism [14] through Smart's physicalism [51] to Bunge's emergent materialism [10] (see also the controversial book of Deacon [17]).
Interactionist dualism has always been an influential viewpoint since Descartes defined the interaction between the spatially extended body and a non-corporeal mind. Though the modern version of dualism was elaborated by two intellectual heroes of the twentieth century (Sir Karl Popper and Sir John Eccles [45]), it has still been criticized or even ignored by mainstream philosophers of mind, both by functionalists and by biology-oriented thinkers. Bickle [8] suggested that philosophers should adopt a "ruthless reductionist" approach by learning molecular and cellular neurobiology. The multiple realizability thesis (say [1]) emphasizes the importance of hierarchical organization from molecules to social interactions. Any non-reductionist physicalist theory should tell something about the mechanism of "downward causation".
"Downward causation" is a notion which suggests that higher-level systems influence lower-level configurations. Classical molecular biology deals exclusively with upward mechanisms of causation (from simple events to more complicated ones) and completely neglects the explanatory role of downward causation. Since we know that both molecules and genes form complicated networks of feedback loops, it is difficult to defend the concept that there is nothing else in science than a linear chain of elementary steps leading from causes to effects [59]. Methodologically successful reductionism is never complete. As Popper suggested, there is always some "residue" to be explained.
"Downward causation" was suggested by Sperry [52, 53] to explain the brain-mind problem, stating that mental agents can influence neural functioning. Sperry was criticized on the grounds that the postulate that physiological mechanisms of the brain are directly influenced by conscious processes is unclear [19, 55]. Alternatively, it was cautiously suggested by János Szentágothai, in a somewhat overlooked paper, that the nervous system can be considered as being open to various kinds of information, and that there would be no valid scientific reason to deny the existence of downward causation, or more precisely, a two-way causal relationship between brain and mind [54].
In a similar way, Campbell and Bickhard [11] argue that "organization principles" should have some priority, since our "best physics tells us that there are no basic particulars, only fields in process". The relationship among free will, downward causation and the emergence of complexity is discussed from a broad perspective in an edited book [43].
Twenty years ago it was argued [22] that the philosophical tradition of hermeneutics, i.e., the "art of interpretation", which is a priori neither monist nor dualist, can be applied to the brain. Even more is stated: on one side, the brain is an "object" of interpretation, and on the other side, it is itself an interpreter: the brain is a hermeneutic device. In a similar vein, in The Metaphorical Brain 2, Michael Arbib [2] argued that our theories of the brain are metaphors, while the brain itself represents the world through schemas, which may themselves be viewed as metaphors.
1.2 The brain-computer analogy/disanalogy
Second, the problem of the brain-computer analogy/disanalogy was a central issue of early cybernetics, and in some sense it was revived in the era of the neurocomputer boom. More precisely, the two sides of the metaphor ("computational brain" versus "neural computer") should be the subject of a brief discussion. There are several different roots of the early optimism related to the power of the brain-computer analogy. We recall two of them. First, both elementary computing units and neurons were characterized as digital input-output devices, suggesting an analogy at even the elementary hardware level. Second, the (more or less) equivalence had been demonstrated between the mathematical model of the "control box" of a computer, as represented by the state-transition rules for a Turing machine, and that of the nervous system, as represented by the McCulloch-Pitts model. Binary vectors of "0"s and "1"s represented the state of the computer and of the brain, and their temporal behavior was described by the updating rules of these vectors. In his posthumously published book The Computer and the Brain, John von Neumann [44] famously emphasized the particular character of "neural mathematics": "...The logics and mathematics in the central nervous system, when viewed as languages, must structurally be essentially different from those languages to which our common experience refers..."
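To make this state-transition description concrete, here is a minimal Python sketch (not from the paper; the weights, thresholds, and toy ring network are illustrative assumptions) of a McCulloch-Pitts network synchronously updating a binary state vector:

```python
import numpy as np

def mcculloch_pitts_step(state, weights, thresholds):
    """One synchronous update of a McCulloch-Pitts network:
    unit i fires (1) iff its weighted input reaches its threshold."""
    return (weights @ state >= thresholds).astype(int)

# Toy three-unit network (illustrative numbers only): each unit
# excites the next one in a ring, so activity circulates.
W = np.array([[0, 0, 1],
              [1, 0, 0],
              [0, 1, 0]])
theta = np.ones(3)
x = np.array([1, 0, 0])
for t in range(6):
    x = mcculloch_pitts_step(x, W, theta)
    print(t, x)
```

The binary state vector together with its deterministic update rule is precisely the sense in which the early analogy identified the brain's state transitions with those of a Turing machine's control box.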
Arguments for the computer-brain disanalogy were listed by Conrad [15]. Digital computers are programmed from outside; they are structurally programmable; they have low adaptability; they work by discrete dynamics; their physical implementation is in principle irrelevant; they exhibit sequential processing; and their information processing happens mostly at the network level. Brains are self-organizing devices; they are structurally non-programmable; they work by both discrete and continuous dynamics; their functions depend strongly on the physical (i.e., biological) substrate; their processing is parallel; and information is processed at both the network and the intraneuronal level.
Though it was suggested more than two decades ago [3], it still looks useful to consider the brain as a metaphor for sixth-generation computing. Instead of being a single universal machine, such a computing device is "composed of different structures, just as the brain may be divided into regions, such as cerebellum, hippocampus, motor cortex, and so on."
We now know (as part of the collective wisdom, but see e.g. also [46]) that: (i) brains are not digital computers; (ii) the brain does not have a central processing unit, but rather uses cooperative, distributed computation; (iii) memory organization is based on dynamical (as opposed to static) principles; (iv) the brain uses a combination of discrete- and continuous-time dynamics; and (v) the synaptic organization of the brain is unique, and may be the key element of the biological substrate of human intelligence.
1.3 The computational theory of mind
Third, the computational theory of mind (and classical cognitive science) holds that the computational metaphor is the final explanation of mental processes. The classical version of the theory suggests that the mind executes Turing-style (symbolic) computation. As is well known, the birth of formal AI was the Dartmouth Conference held in the summer of 1956 (an important year, in many respects) and organized by John McCarthy. The goal was to discuss the possibilities of simulating human intelligent activities (use of language, concept formation, problem solving). The perspectives of the cyberneticians and the AI researchers were not separated immediately. Some of McCulloch's papers also belong to the early AI works, as the titles reflect: "Toward some circuitry of ethical robots or an observational science of the genesis of social evaluation in the mind-like behavior of artifacts", or "Machines that think and want".
"Connectionism" [49] emerged as an ambitious conceptual framework for a general brain-mind-computer theory, but it is based on principles of "brain-style computation" that ignore many of the "real brain" data. The connectionist movement is thus directed more to the engineers of near-future-generation computer systems and to cognitive psychologists. An attempt to integrate the symbolic and connectionist perspectives was given by Gary Marcus [41].
There are recent debates about the meaning of the statement that the "mind computes". "Embodied cognition" seems to be a radical alternative [13] to classical cognitive science. The central hypothesis of embodied cognitive science is that cognition emerges from the interaction of the brain, the whole body, and the environment. To relate classical and embodied cognition we should answer the question: what does it mean to understand a phenomenon? A pragmatic answer is to synthesize the behavior from elements. Many scientists believe that if they are able to build a mathematical model, based on knowledge of the mechanism, that simulates a phenomenon and predicts some other phenomena within the same model framework, then they understand what is happening in their system. Alternatively, instead of building a mathematical model, one may wish to construct a robot. Embodied cognitive science now seems to be an interface between neuroscience and robotics: the features of embodied cognitive systems should be built both into neural models and into robots, and the goal is to integrate sensory, cognitive and motor processes.
It is not yet clear whether there is any reason to deny that (i) a more general computational framework would be able to integrate the dynamic interaction of the mind with its environment, and that (ii) it is possible to build neuromorphic and brain-based robots by combining computational neuroscience and traditional robotics [40].
2 Hermeneutics
Hermeneutics is a branch of continental philosophy which treats the understanding and interpretation of texts. For an introduction for non-philosophers, see [42]. One of the most important concepts in hermeneutics is the hermeneutic circle. This notion means that the definition or understanding of something employs attributes which already presuppose a definition or understanding of that thing. The method is in strong opposition to the classical methods of science, which do not allow such circular explanations. Hans-Georg Gadamer (1900-2002) writes [35]: "Understanding always implies a preunderstanding which is in turn prefigured, by the determinate tradition in which the interpreter lives and that shapes his prejudices." (The Nobel-prize winning physicist Steven Weinberg [60] wrote: "...A physicist friend of mine once said that in facing death, he drew some consolation from the reflection that he would never again have to look up the word 'hermeneutic' in the dictionary.")
2.1 Hermeneutics of the brain
Ichiro Tsuda [57, 58] applied the principles of hermeneutics to brain processes by using chaos as a mechanism of interpretation. He suggested that (i) a particular chaotic phenomenon, namely chaotic itinerancy, may be identified with what he calls the hermeneutic process; and (ii) in opposition to the idea that "the brain is a computer, the mind is a programmer", "...the brain can create even a programmer through the interpretation process expressed by chaotic itinerancy..." [58].
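Chaotic itinerancy is commonly illustrated with globally coupled logistic maps of the Kaneko type, one of the model classes in which the phenomenon was originally described. The sketch below is only a hedged illustration (the parameter values are assumptions; whether a particular setting actually shows itinerant wandering between quasi-attractors must be checked numerically):

```python
import numpy as np

def gcm_step(x, a=1.8, eps=0.1):
    """One step of a globally coupled logistic-map lattice:
    x_i <- (1 - eps) * f(x_i) + (eps / N) * sum_j f(x_j),
    with the local chaotic map f(x) = 1 - a * x**2."""
    fx = 1.0 - a * x**2
    return (1.0 - eps) * fx + eps * fx.mean()

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, size=16)
for t in range(2000):
    x = gcm_step(x)
    # In the itinerant regime the elements transiently form
    # synchronized clusters, dissolve, and regroup over time.
```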
In [22] it was asked how, if at all, two extreme approaches to any living system, i.e. the "device approach" and the "philosophical approach", could be reconciled. It was suggested, by turning to the philosophical tradition, that hermeneutics, i.e. the "art of interpretation", which is neither monist nor dualist a priori, can be applied to the brain. Further, it was stated that the brain is both the "object" of interpretation and the interpreter itself: therefore the brain is itself a hermeneutic device. For our own dialog with Tsuda see [25].
2.2 The brain as a hermeneutic device
The brain can be considered as different types of devices. Among others, the brain can be seen as a thermodynamic device, a control device, a computational device, an information storing, processing and creating device, or a self-organizing device.
The device approach is strongly related to the dynamic metaphor of the brain [23]. Dynamical systems theory offers a conceptual and mathematical framework to analyze spatiotemporal neural phenomena occurring at different levels of organization. These include oscillatory and chaotic activity both in single neurons and in (often synchronized) neural networks, the self-organizing development and plasticity of ordered neural structures, and learning and memory phenomena associated with synaptic modification. Systems exhibiting high structural and dynamic complexity are candidates for being thought of as hermeneutic devices. The human brain, which is structurally and dynamically complex, is thus qualified to be a hermeneutic device. One of the characteristic features of a hermeneutic device is that its operation is determined by circular causality.
Circular causality, in essence, is a sequence of causes and effects whereby the explanation of a pattern leads back to the first cause and either confirms or changes that first cause; for example, A causes B, B causes C, and C causes or modifies A. The concept itself had a bad reputation in legitimate scientific circles, since it was somehow related to the use of "vicious circles" in reasoning. It was reintroduced to science by cybernetics, with its emphasis on feedback. In a feedback system there is no clear discrimination between "causes" and "effects", since the output influences the input. Roughly speaking, negative feedback reduces the error or deviation from a goal state and therefore has stabilizing effects. Positive feedback, which increases the deviation from an initial state, has destabilizing effects.
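As a minimal illustration of the two feedback signs (a toy scalar system, not a neural model; the gain values are arbitrary assumptions), consider the one-line update x <- x + gain * (x - goal):

```python
def feedback_trajectory(x0, goal, gain, steps=10):
    """Iterate x <- x + gain * (x - goal).

    gain < 0: negative feedback, the deviation from the goal shrinks
              (stabilizing);
    gain > 0: positive feedback, the deviation from the goal grows
              (destabilizing).
    """
    x, xs = x0, [x0]
    for _ in range(steps):
        x = x + gain * (x - goal)
        xs.append(x)
    return xs

print(feedback_trajectory(2.0, 1.0, gain=-0.5))  # 2.0, 1.5, 1.25, ... -> 1.0
print(feedback_trajectory(2.0, 1.0, gain=+0.5))  # 2.0, 2.5, 3.25, ... diverges
```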
Systems with feedback connections and connected loops can be understood based on the concepts of circular and network causality. Leaving aside the clear and well-organized world of linear causal domains characterizing "simple systems", we find ourselves in the jungle of complex systems [24]. Natural, technological and social systems are full of feedback mechanisms.
Circular causality (a key concept of cybernetics) was analyzed in order to establish how self-organized neural patterns relate to intentional behavior [34]. In many cases, specific neural circuits implement feedback control loops which regulate specific functions.
Analyzing the question of whether the technical or "device approach" to the brain and the "philosophical approach" can be reconciled, it was concluded that the brain is a physical structure which is controlled and also controls, learns and teaches, processes and creates information, recognizes and generates patterns, organizes its environment and is organized by it. It is an "object" of interpretation, but it is also itself an interpreter. The brain not only perceives but also creates new reality: it is a hermeneutic device [22].
2.3 Neural hermeneutics
Frith [32] is working on establishing a scientific discipline of "neural hermeneutics", dealing with the neural basis of social interaction. The key element of this approach is the assumption that representations of the external world can be shared with others, and that these shared representations may be the basis of predicting others' actions during interactions. Recently, active inference and predictive coding were offered [33] as the basic mechanisms/algorithms of social communication. Social communication is based on the internal models the communicating partners have about each other, and appropriately updating these internal models implies a reduction in the prediction error.
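A heavily simplified sketch of such a prediction-error-reducing update (the scalar "beliefs" and the fixed learning rate are illustrative assumptions, not Friston and Frith's actual formulation):

```python
def update_belief(belief, observation, learning_rate=0.3):
    """Predictive-coding style update: move the internal model toward
    the observed signal in proportion to the prediction error."""
    prediction_error = observation - belief
    return belief + learning_rate * prediction_error, prediction_error

# Two agents, each modeling the other's (scalar) state: as they keep
# observing one another, the prediction errors shrink and the
# internal models converge toward shared representations.
a_state, b_state = 0.0, 1.0
a_model_of_b, b_model_of_a = 0.5, 0.5
for step in range(20):
    a_model_of_b, err_a = update_belief(a_model_of_b, b_state)
    b_model_of_a, err_b = update_belief(b_model_of_a, a_state)
    print(step, round(err_a, 4), round(err_b, 4))
```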
3 Towards the algorithms of neural/mental hermeneutics
Understanding situations needs hermeneutic interpretation:
– logic, rule-based algorithms, and similar computational methods are too rigid to interpret ill-defined situations;
– hermeneutics, the "art of interpretation", can do it;
– hermeneutics emphasizes the necessity of self-reflexive interpretation and adopts circular causality.
Biological systems contain their own descriptions, and therefore they need special methods. Hermeneutics emphasizes the necessity of self-reflexive interpretation. Both natural science as "objective analyzer" and (post)modern art reiterate the old philosophical question: what is reality? As was mentioned, the human brain is not only capable of perceiving what is called objective reality, but can also create new reality. It is a hermeneutic device.
There are only preliminary ideas about the algorithms of neural and mental hermeneutics. "Can complexity scientists bridge, in the words of C. P. Snow, the two cultures of academia - the humanities and the sciences - to create a more thoroughgoing explanation of human cognition? More specifically, can the tools of hermeneutics, mathematics and computer simulation be integrated to assemble better and more useful models of human social understanding than currently exist? These are the two provocative and ambitious questions - the former the broader, and the latter the more specific - that frame the intent and focus of Klüver and Klüver's recent book, Social Understanding"; see the review [12] of the book [39].
Somewhat in parallel with the arguments of this paper, and motivated by Walter Freeman's findings and theory of the action-perception cycle [34, 30], Robert Kozma is working on understanding the neural mechanisms of the intentional perception-action cycle [38, 37]. It is stated that knowledge and meaning are created in the brain by a circular intentional dynamics, where the "meaningful stimulus is selected by the subject and the cerebral cortex creates the structures and dynamics necessary for intentional behavior and decision-making".
4 Schizophrenia: a broken hermeneutic circle
4.1 Hermeneutics, cognitive science, schizophrenia
Gallagher's analysis [36] implies: (i) Hermeneutics and cognitive science are in agreement on a number of things. An example is the way we know objects: the interpretation of objects needs "schema theory" [4]. (ii) Hermeneutics can contribute to cognitive science. The basis of the argument is that understanding situations (as was mentioned earlier) needs hermeneutic interpretation. The usual critique is that logic, rule-based algorithms, and other similar computational methods are too rigid to interpret ill-defined situations, but hermeneutics can do it. ("Mental models", which also help to analyze situations, should also be mentioned. Mental models have played a fundamental role in thinking and reasoning, and were proposed in a revolutionary suggestion by Kenneth Craik (1914-1945) [16]. The idea that people rely on mental models can be traced back to Craik's suggestion that the mind constructs "small-scale models" of reality that it uses to predict events.) (iii) Cognitive science also has something to offer to hermeneutics, particularly for understanding other minds. The most popular notion today is the theory of mind, or more precisely the "theory of others' minds". The most effective method of cognitive science to understand other minds, i.e. to show empathy, is to simulate other minds by using analogical thinking [9]. The neural basis of the theory of mind now seems to be related to mirror neurons, which are a key structure of imitation and possibly of language evolution [5]. A failure to attribute self-generated actions to the patient himself (what we may label as the lack of ability to close the hermeneutic circle) can be characteristic of schizophrenic patients [6].
Independently of my own interest in hermeneutics, in a collaborative project we adopted combined behavioral, brain imaging and computational approaches to associative learning in healthy subjects and schizophrenia patients, to explain their normal and reduced performance in an associative learning paradigm. The working hypothesis we adopted was that schizophrenia is a "disconnection syndrome", as was suggested among others by Friston and Frith [31], and our aim was to qualitatively and quantitatively understand the functional bases of these disconnections. Control loops at the chemical, network and regional levels might be the neural bases of the interpreting, iterative mechanisms. Specifically, the impairment of the cognitive control exerted by the prefrontal cortex on hippocampal processes implies uncertainties in the task to be solved, and will result in poorer performance in learning and recall processes. While the breaking of the circle may lead to schizophrenic symptoms, combined pharmacological and psychotherapeutic strategies should act to repair the circle. For the technical details, see [18, 26, 28, 27, 7].
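Purely as a toy illustration of this working hypothesis (it is not the model used in the cited studies), impaired prefrontal control can be caricatured as extra noise injected into the error signal of a simple error-driven associative learner, yielding a noisier, less reliable learning curve:

```python
import numpy as np

def associative_learning_curve(n_trials, learning_rate, noise_sd, seed=0):
    """Toy Rescorla-Wagner style learner: the association strength w
    is driven toward the target (1.0) by the prediction error.
    Weakened prefrontal control is modeled here, purely for
    illustration, as added noise on the error signal."""
    rng = np.random.default_rng(seed)
    w, curve = 0.0, []
    for _ in range(n_trials):
        error = 1.0 - w + rng.normal(0.0, noise_sd)
        w += learning_rate * error
        curve.append(w)
    return curve

healthy = associative_learning_curve(50, learning_rate=0.2, noise_sd=0.05)
impaired = associative_learning_curve(50, learning_rate=0.2, noise_sd=0.5)
```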
5 Conclusion
The brain-mind-computer trichotomy is suggested to be treated by a unifying framework based on the hermeneutic approach. We argue that the brain is a hermeneutic device, and that hermeneutics is also necessary to understand situations and other minds. A broken hermeneutic circle may lead to pathological behaviours, such as schizophrenia.
What we see is that the mathematics of hermeneutics must be somewhat different from what we use to describe the physical world. The frameworks of mathematical models of complex systems and of cognitive systems should be unified by elaborating the algorithms of neural and mental hermeneutics. But this will be a different story.
References
1. Aizawa, K. and Gillett, C., 2009: Levels, Individual Variation, and Massive Multiple Realization in Neurobiology. In: J. Bickle (ed.), Oxford Handbook of Philosophy and Neuroscience. Oxford University Press, New York, 529-581.
2. Arbib MA, 1989, The Metaphorical Brain 2. John Wiley & Sons, New York
3. Arbib MA, 1994, The brain as a metaphor for sixth generation computing. In: Computing with biological metaphors. Ed. Paton, R. Chapman & Hall, London, pp. 105-123.
4. Arbib, M.A., Érdi, P., Szentágothai, J.: Neural Organization: Structure, Func-
tion, Dynamics. The MIT Press, Cambridge, Mass. (1997)
5. Arbib, M.A.: The mirror system, imitation, and the evolution of language. In:
Nehaniv, C., Dautenhahn, K. (eds.): Imitation in Animals and Artefacts. MIT
Press, Cambridge, MA 229-280 (2002)
6. Arbib, M.A., Mundhenk, T.N.: Schizophrenia and the Mirror System: An Es-
say, Neuropsychologia 43, 268-280 (2005)
7. Bányai M, Diwadkar V, Érdi P: Model-based dynamical analysis of functional disconnection in schizophrenia. NeuroImage 58(3), 870-877, 2011
8. Bickle J, 2003: Philosophy and Neuroscience: A Ruthlessly Reductive Account. Kluwer Academic Publishers.
9. Barnes, A., Thagard, P.: Empathy and analogy. Dialogue: Canadian Philo-
sophical Review, 36, 705-720 (1997)
10. Bunge M, 1980, The mind-body problem. Pergamon Press
11. Campbell, R and Bickhard, MH: Physicalism, Emergence and Downward Causation. Axiomathes 21(1), 33-56 (2011)
12. Castellani, B.: Review of Klüver, J. and Klüver, C.: Social Understanding: On Hermeneutics, Geometrical Models and Artificial Intelligence (Theory and Decision Library A). Journal of Artificial Societies and Social Simulation. http://jasss.soc.surrey.ac.uk/15/2/reviews/4.html
13. Chemero, A., 2009, Radical Embodied Cognitive Science, Cambridge: MIT
Press.
14. Churchland PS, 1986, Neurophilosophy: Toward a Unified Science of the Mind-Brain. The MIT Press, Cambridge, Mass.
15. Conrad M, 1989, The brain-machine disanalogy. BioSystems 22(197-213)
16. Craik, K.: The Nature of Explanation. Cambridge Univ. Press. (1943)
17. Deacon, TW. Incomplete Nature: How Mind Emerged from Matter. New
York: W.W. Norton & Company. 2011.
18. Diwadkar, V., Flaugher, B., Jones, T., Zalányi, L., Keshavan, M.S., Érdi,
P.: Impaired Associative Learning in Schizophrenia: Behavioral and Compu-
tational Studies. Cognitive Neurodynamics 2(207-219) (2008)
19. Edelman GM, 1978, Group selection and phasic reentrant signalling: A theory of higher brain function. In: The Mindful Brain: Cortical Organization and the Group-Selective Theory of Higher Brain Function, eds. Edelman GM and Mountcastle VB, pp. 55-100, The MIT Press, Cambridge, Mass.
20. Érdi P, 1988, Neurobiological approach to computing devices. BioSystems
21(125-133).
21. Érdi P, 1990, The brain-mind-computer trichotomy. In: Connectionism:
bridge between mind and brain? Dalenoort GJ and Scheerer E, eds.: ZIF, Biele-
feld; pp. 7.
22. Érdi P: The brain as a hermeneutic device. Biosystems. 1996;38(2-3):179-
89.
23. Érdi, P.: On the ’Dynamic Brain’ Metaphor. Brain and Mind 1,119-145,
(2000)
24. Érdi, P.: Complexity Explained. Springer Publ. (2007)
25. Érdi P and Tsuda I: Hermeneutic approach to the brain: Process versus device. Theoria et Historia Scientiarum VI(2), 307-321 (2002)
26. Érdi P., Ujfalussy B., Zalányi L, Diwadkar VA.: Computational approach to
schizophrenia: Disconnection syndrome and dynamical pharmacology. In: A
selection of papers of The BIOCOMP 2007 International Conference L. M.
Ricciardi (ed.) Proceedings of the American Institute of Physics 1028, 65-87
27. Érdi P, Diwadkar V.: The schizophrenic brain: A broken hermeneutic circle. Neural Network World 19(413-427) 2009
28. Érdi P, Bányai M, Ujfalussy B and Diwadkar V: The schizophrenic brain: A broken hermeneutic circle. Some new insights and results. The 2011 International Joint Conference on Neural Networks (IJCNN), San Jose, CA, USA, 3024-3027, 2011.
29. Freeman W.J. (1999): Consciousness, intentionality and causality. In: Reclaiming Cognition: The Primacy of Action, Intention and Emotion, ed. by R. Nunez and W.J. Freeman, Imprint Academic, Bowling Green, Ohio, pp. 143-172.
30. Freeman W.J. (2004): How and why brains create meaning from sensory information. International Journal of Bifurcation and Chaos 14(2), 515-530.
31. Friston, K.J., Frith, C.D.: Schizophrenia: A disconnection syndrome? Clin.
Neurosci. 3, 88-97 (1995)
32. Frith, C.: Making Up the Mind: How the Brain Creates Our Mental World.
Blackwell Publ. (2007)
33. Friston, K.J., Frith, C.D.: Active inference, communication and hermeneu-
tics. Cortex. 2015 Jul;68:129-43.
34. Freeman, W.J.: Consciousness, Intentionality, and Causality. Journal of Con-
sciousness Studies. 6, 143-172 (1999)
35. Gadamer H-G: Truth and Method. Sheed and Ward: London, 1976.
36. Gallagher, S.: Hermeneutics and the Cognitive Sciences. J. of Consciousness
Studies 10-11, 162-174 (2004)
37. Davis, J.J., Gillett, G., Kozma, R. (2015): Revisiting Brentano on Consciousness: A striking correlation with ECoG findings about the Action-Perception Cycle and the Emergence of Knowledge and Meaning. Mind and Matter
38. Kozma R. (2007): Intentional systems: Review of neurodynamics, modeling, and robotics implementations. Physics of Life Reviews 5(1), 1-21.
39. Klüver, J and Klüver, C: Social Understanding: On Hermeneutics, Geometrical Models and Artificial Intelligence. Springer-Verlag: Berlin, 2010
40. L. Krichmar and H. Wagatsuma, Eds., Neuromorphic and Brain-Based Robots. Cambridge University Press, 2011.
41. Marcus, GF: The Algebraic Mind: Integrating Connectionism and Cognitive Science. The MIT Press, Cambridge, Mass. 2001
42. John C. Mallery, Roger Hurwitz, Gavan Duffy, 1987, Hermeneutics: From Textual Explication to Computer Understanding? In: The Encyclopedia of Artificial Intelligence, Stuart C. Shapiro, editor, John Wiley & Sons, New York
43. Murphy, N, Ellis, G and O’Connor, T, 2009: Downward Causation and the
Neurobiology of Free Will. Berlin and Heidelberg: Springer Verlag.
44. Neumann, J von, 1958, The Computer and the Brain. Yale Univ. Press, New
Haven
45. Popper KR and Eccles JC, 1977, The self and its brain. Springer Verlag,
Berlin.
46. Potter, S. M. (2007). What can Artificial Intelligence get from Neuroscience?
In 50 Years of Artificial Intelligence: Essays Dedicated to the 50th Anniversary
of Artificial Intelligence, M. Lungarella, J. Bongard, and R. Pfeifer (eds.) (pp.
174-185). Berlin: Springer-Verlag.
47. Rosen R, 1993, Drawing the boundary between subject and object: com-
ments on the mind-brain problem. Theoret. Medicine 14(89-100).
48. Rössler OE, 1987, Endophysics. In: Casti J, Karlquist A, eds., Real Brains - Artificial Minds, North-Holland, New York.
49. Rumelhart DE, McClelland JL., 1986, Parallel distributed processing: explo-
rations in the microstructure of cognition, vols 1, 2., MIT Press, Cambridge,
Mass.
50. Skinner, B.F. (1974). About Behaviorism. New York: Knopf.
51. Smart, J.J.: Physicalism and Emergence. Neuroscience 6, 109-113.
52. Sperry RW, 1969, A modified concept of consciousness. Psychol. Rev. 76(6), p. 532
53. Sperry RW, 1980, Mind-brain interaction: mentalism yes; dualism, no. Neu-
roscience 5(195-206)
54. Szentágothai J, 1984, Downward causation? Ann Rev. Neurosci. 7(1-11)
55. Szentágothai J, Érdi P, 1983, Outline of a general brain theory. KFKI-1983-
117., Centr. Res. Inst. Phys. Hung. Acad. Sci., Budapest
56. Szentágothai J, and Érdi P., 1989, Self-organization in the nervous system. J.
Soc. Biol. Struct. 12:367-384.
57. Tsuda I., 1984 A hermeneutic process of the brain. Prog. Theor. Phys. Suppl.,
79:241-259.
58. Tsuda I., 1991, Chaotic Itinerancy as a Dynamical Basis of Hermeneutics in Brain and Mind. World Futures, 32:167-184.
59. Van Regenmortel, MHV: Reductionism and complexity in molecular biology. EMBO Rep. 2004 Nov; 5(11): 1016-1020. doi: 10.1038/sj.embor.7400284
60. Weinberg S, 1996, Sokal's Hoax. The New York Review of Books, Volume XLIII, No. 13, pp 11-15, August 8.