

    Reviewer A

    This fine paper summarizes a complex topic in an excellent way, one related in turn to a number of other important subjects. As such, it is virtually impossible, within the limited space available, to discuss exhaustively all questions related to intentionality. However, I feel that at least two further important points could briefly be taken into consideration without significantly increasing the length of the paper. Both points concern section 4, on the neurobiology of intentionality. The first deals with the problem of the origin of intentionality. The description of the brain operations underlying intentionality given in that section shows that intentionality consists in a kind of general ‘coherence’ characterizing a large set of very different neurobiological processes, emerging in turn from a complex interrelation between different microscopic and macroscopic contexts. Accordingly, we could qualify the emergence of intentionality as a sort of ‘phase transition’, akin to (or different from?) other phenomena occurring in the physical world, such as the passage from the paramagnetic to the ferromagnetic state, or from the conducting to the superconducting state. In recent years the author has given experimental evidence, in a number of interesting studies, of the occurrence of phase transitions in brain activity, and can therefore be considered a leading authority in this field. It would thus seem advisable to insert in section 4 some references to the problem of the origin of intentionality, as well as citations of the author's main contributions in this domain. This would help most readers grasp the relationship between the problem of studying and defining intentionality and other problems dealt with, so far, by physicists, biologists, and mathematicians.
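    The phase-transition analogy invoked here can be made concrete with a minimal sketch (a standard textbook example, not drawn from the paper under review): the mean-field magnetization equation m = tanh(m/T), whose solution bifurcates away from zero below the critical temperature, illustrating how a macroscopic order parameter can emerge from microscopic interactions.

```python
import math

def magnetization(T, m0=0.5, iters=2000):
    """Fixed-point iteration of the mean-field equation m = tanh(m / T)."""
    m = m0
    for _ in range(iters):
        m = math.tanh(m / T)
    return m

# Above the critical temperature T_c = 1 the only stable solution is
# m = 0 (the paramagnetic state); below it a nonzero order parameter
# appears (the ferromagnetic state) -- a simple phase transition.
for T in (1.5, 0.5):
    print(f"T = {T}: m = {magnetization(T):.4f}")
```

    The analogy, of course, is only suggestive: the point of the reviewer's comment is that brain activity may exhibit transitions of this general kind, not that it obeys this particular equation.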

    The second point deals with the final part of section 4, where the author refers to the concept of an ‘attractor landscape’. This concept has been widely used in a number of abstract mathematical models of brain operation and of cognitive processing (most of them formulated in neural-network language). However, it must be remarked that the usefulness of this concept for describing biological brain operation has not so far been fully proved. A simple example, well known to the author, will clarify matters. Suppose we deal with a dynamical system and are interested in investigating the nature of its attractors. Once we have proved that, for instance, its dynamics is characterized by a chaotic attractor (deterministic or stochastic: the distinction does not matter here), we have without doubt obtained an important piece of information about the possible scenarios underlying the dynamics of the system. In this context, however, the concept of ‘attractor landscape’ becomes useless, as the intrinsic complexity of the attractor itself prevents us from using the concept for any practical purpose. In fact, the latter is used mainly in simple (and perhaps rather artificial) situations, such as elementary examples of systems undergoing pitchfork bifurcations (the ones introduced in describing symmetry breaking since the times of Landau). It is well known that the concept of ‘attractor landscape’ is used (as a metaphor) in some simple neural network models (typically those of associative memory), even though this metaphor does not allow a complete understanding of the dynamical behavior of these disordered systems. Since the author's previous work shows that he does not share the ‘philosophy’ underlying these popular models, it seems advisable to add, in the final part of section 4, a few words, placed appropriately, to highlight the deep difference between the framework adopted by the author and that of most ‘neural-attractor modelers’. Perhaps expressions such as ‘dynamical attractor’ or ‘dynamical scenario’ could be useful in this context (attractors in brain dynamics may perhaps consist in dynamical processes, but they are surely not static states).
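    The reviewer's contrast can be illustrated with a minimal sketch of the pitchfork bifurcation in its normal form, dx/dt = rx - x^3 (a standard textbook example, assumed here for illustration only): for r < 0 the ‘landscape’ has a single point attractor at the origin, while for r > 0 it splits into a symmetric pair. This is exactly the simple gradient-system setting in which the attractor-landscape picture works, and it is far removed from a chaotic attractor.

```python
def settle(r, x0=0.1, dt=0.01, steps=20000):
    """Integrate dx/dt = r*x - x**3 (normal form of the pitchfork
    bifurcation) with forward Euler until the state settles near
    an attractor."""
    x = x0
    for _ in range(steps):
        x += dt * (r * x - x**3)
    return x

# For r < 0 the only attractor is the fixed point x = 0; for r > 0
# the origin loses stability and a symmetric pair +/- sqrt(r) of
# point attractors appears. The "landscape" metaphor applies because
# this is a gradient system with static attracting states.
for r in (-1.0, 1.0):
    print(f"r = {r:+.1f}: x -> {settle(r):+.4f}")
```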

    Reviewer B

    I have two suggestions: one is to move the last paragraph of his neurobiology section to introduce that section. The other is more pervasive. I urge that Freeman use the term "Intentionality" as he has done so superbly in his historical review from St. Thomas through Heidegger (and Merleau-Ponty), but reserve the terms "Intention" and "Intentional" for the more ordinary conscious use, as they are used in the law and by John Searle. It does no good to dwell on confusions that may have been committed here and there. Freeman has the tools to clarify the issue: they are all there in his review.

    Commentary on Reviewer B comments

    I have incorporated your suggestion to shift the paragraph ending the section on Neurobiology to the beginning, with appropriate modifications.

    Regarding the suggestion that I refrain from using "intention" in senses other than those of lawyers and cognitivists, it must be recognized that people can, do, and will use "intention" with differing meanings. My intention here is to list and describe the salient uses as I perceive them in different contexts.

    Clearly there is opportunity here for confusion, so I have introduced the principal qualifiers "Thomist" and "Husserlian" at the relevant usages of "intention".

    Renaissance in the 1600s?

    Your last sentence in the first full paragraph under the heading "The History of Intentionality" is historically inaccurate and problematic. You write of Aquinas:

    "His doctrine provided the foundation for bioscience, medicine, law, and all other fields of intellect in Western Europe for the next 400 years leading to the Renaissance."

    First, your chronology is off. Aquinas lived in the 13th century. The Renaissance that you refer to, I take it, is the classical Renaissance of Florentine Italy. The Renaissance had already started by the end of the 13th century, with figures like Dante. Your chronology suggests that the Renaissance did not start until the 17th century, which is the Age of Reason.

    Second, you imply that Aquinas was the foundation for the Renaissance by your phrase "leading to". This is inaccurate. If anything, it would be more proper to speak of the Renaissance as a reaction against Aquinas and the Scholastic schoolmen. The Renaissance is often viewed as a return to the classical humanities and Plato.

    Finally, your assertion that Aquinas was the basis of Western European thought is also inaccurate. The importance of Aquinas as a thinker has been a matter of past and contemporary debate within the academy. Saying that he provided 'the foundation' neglects Ibn Sina, Ibn Rushd, and the other Arabic philosophers through whose works the Latin West acquired much of the classical Hellenistic corpus.

    Renaissance: to be more accurate

    When does the Renaissance begin? If we look at Italy, basically Florence (but not only: a complete survey should also consider Siena, Rome, Venice, Messina, Mantova, Ferrara…), the period called the ‘Renaissance’ has its literary pioneers in Dante (born 1265) and Petrarca (1304); they are coeval with Aquinas. In architecture and the fine arts the period arises a century later, with Leon Battista Alberti (1404), Piero della Francesca (1415), Leonardo (1452), Michelangelo (1475), Raffaello (1483) and other great intellectuals, not only in Italy. In fact, if we consider the painters contemporary with Dante (mainly Giotto and Cimabue), their style can still be called ‘medieval’. So the question is: what is the main feature distinguishing the Renaissance from the Middle Ages, or whatever you call the preceding era? I would answer, with most scholars, that it is the centrality of man: man is rediscovered in all his ‘humanity’ (hence the Renaissance is often called Humanism); in the Renaissance man becomes the ‘heart’ of the universe, the main subject of poems, paintings, and speculations, not in contrast with a theological, theocentric perspective (which still represented the dominant context of that era), yet situated at the very center of creation, in a very ‘human’ way. Paradigms of this view are Leonardo’s famous ‘Vitruvian Man’, the ‘Annunciata’ of Antonello da Messina, and Michelangelo’s ‘Adam’ in the Sistine Chapel, whose finger touches God’s finger in a person-to-person relationship. Yet this centrality derives from a renewed study (and rediscovery) of the ‘classics’ - Cicero, Seneca, Plato, Ovid, Homer, Catullus and so forth - which is why another name for this period is ‘Classicism’.

    What about philosophy? We might say that Aquinas belongs to the era of transition between the Middle Ages and the Renaissance: he is in effect the ‘translator’ into Latin (the common European language of the period) of those books of Aristotle that were not known until then because they were absent from Western European libraries. These books ‘arrived’ at the University of Paris around 1240, after a long ‘migration’ from Eastern Europe via North Africa and Muslim Spain (Andalusia). In fact, until the 13th century only Aristotle’s books of logic were known in Western Europe; important texts such as the Physics, the Metaphysics and the Psychology were not. Aristotle’s Psychology was the main source for Aquinas’ theories of the soul and of intentionality; Aquinas interpreted Aristotle’s conception of the relationship between soul and body differently from Ibn Sina, Al-Farabi and other very important Muslim philosophers (and also differently from other important medieval thinkers such as Duns Scotus and William of Ockham), thereby giving an interpretative frame that remained the ‘orthodox’ interpretation in Western Christian thought until the ‘Renaissance’. I believe that in philosophy, although the Italian philosophers we refer to when we consider the 15th century are the Florentine Platonists such as Marsilio Ficino or Pico della Mirandola, we could more properly call ‘Renaissance’ that stream of thought whose main authors are Descartes (1596), Hobbes (1588), Locke (1632) and Leibniz (1646), who carried forward a revolution in thinking that began with Galileo (1564) and unfolded throughout the 17th and 18th centuries. If Galileo ‘reacted’ against Aristotelian cosmology, and Descartes introduced a brand-new metaphysics, their philosophical background was still that of the Scholastics. The reaction against Aquinas’ and the Scholastics’ methodology began a little later, first with Hume (1711) and Kant (1724), then with ‘early Phenomenology’ and ‘Functionalism’.
With specific regard to intentionality, later phenomenologists such as Heidegger and Merleau-Ponty reverted to an interpretation very close to that of Aquinas. If we then consider psychology in particular (the so-called ‘Psychologia rationalis’), we could say even more: we could affirm that in this field Western thought relied on Aquinas’ conceptual framework and did not progress very much until Wundt (1832); in this sense Aquinas’ thought really represents a foundational reference for the whole of Western European culture.

    Reply to "Renaissance in the 1600s?"

    Aquinas was far more than a funnel inserting Aristotelian and Arabic writings into Western Europe. He received the finest education available to him in the natural and social sciences of his era as well as theology and used it to transform radically the philosophical landscape of his century. While I am not an expert in Scholasticism, in that part of his work that is relevant to contemporary neuroscience, his Treatise on Man, his comprehension of biology and his critique of Platonic doctrine, in my opinion, far exceed the writings of his predecessors in cogency and clarity. His dedication to humanism is undeniable, though not of the pre-Christian strain.

    It is clearly correct to state that the Renaissance appeared and evolved in the 400 years following the seminal work of Aquinas, so it is self-evident that my term “leading to” denotes a process that began perhaps even before his early death and continued apace. What I stated is that Thomist doctrine provided the philosophical and conceptual foundation for the emergent technologies that included medicine, law, economics, navigation, and industry, not merely for “Western European thought”, after Aquinas had Christianized the doctrines he had received in translations and had adapted them for practical uses by his contemporaries. These technologies enabled the accumulated wealth that supported the flowering of the Renaissance arts.

    History aside, the great value I find in the work of Aquinas is his formulation of the process of perception as the creation of form from the knowledge within the self and not by the incorporation of forms in what we now call “information” by the self from outside the brain and body. In explicitly disavowing the Platonic explanation of perception he rediscovered the early Socratic dialectic of recursively probing, grasping, assimilating, and probing again in search for truth. That insight was more recently rediscovered by pragmatists and phenomenologists and expressed in John Dewey’s experiential ‘acting into the stimulus’ and Merleau-Ponty’s ‘action-perception cycle’. It was certainly not to be found in the later dialogues of Plato, nor in the writings of Aristotle, Galen, Descartes or Husserl.

    From my limited knowledge I do not know whether that insight appears in Arabic texts, though that bears looking into. An important channel for Thomas Aquinas (1225-1274) may have been the widely known writings of Maimonides (1138-1204), a Spanish Jewish philosopher whose thought was strongly influenced by Arab scholars through the Moorish occupation of the Iberian Peninsula. For an instructive example from physiology, we credit William Harvey (1578-1657) with the discovery of the lesser circulation, but it is well documented that the discovery properly belongs to the Arabic scholar Ibn al-Nafis (1208-1288), who described it three centuries before Harvey's publication of De Motu Cordis in 1628.

    I think you have Thomas exactly backwards, if such murky stuff can be so deftly mishandled. By "you" I mean all of you. Karl Rahner's doctoral thesis, Spirit in the World, is a radical document for modernity, post-modernity, what not, in the terseness of its thesis, which is directly from Aquinas and probably from Aristotle: there is no knowledge without the phantasm (the mental image of the sensible object). That means, as Thomas writes in the passage from the Summa Theologica that Rahner comments on, knowledge occurs in the imagination. Indeed, there can only be an "in" in a place, and imagination is a place, it is the very essence of time and place in cognitive terms--perhaps, Thomistically speaking, it is the material aspect of the human subject which allows us to engage the material world at large--since we evidently are not angels who know spiritually and thus all at once and everything. From world to sense to common sense to imagination: and there we transship to the intellect--but the movement really stops there, for as soon as we imagine something else, that act of knowledge is as if it had never been, albeit we can (try to) revisit it in a new act of knowledge (depending on how carefully we were paying attention when we thought it the first time). Freeman writes above: "... his formulation of the process of perception as the creation of form from the knowledge within the self and not by the incorporation of forms in what we now call “information” by the self from outside the brain and body." That is a formula for wishing or pretending or projection. It would not be receptive. You are probably trying to avoid the (supposedly Platonist) doctrine of illumination whereby the form comes whistling across the universe and lodges in your head. "Wow, that was some feisty information, dude." How does the knowledge get within the self? The better question is, how do we use it once it gets there? The answer is the same--the imagination. 
Someone reminded me yesterday that this holds not just for the visual, but for touch, smell, anything sensible. When we pay attention to it, we are holding it in our mind's eye/ear/hand (can you grasp that?). Perhaps "attention" is a better term than "imagination", since we seem to reserve the latter term for wishing or pretending, a topic Thomas does not seem, in my scant acquaintance, to have addressed--yet such plasticity of imagination is of the nature of human freedom. "Die Gedanken sind frei." I say "better term" because we here are searching for biological/cybernetic/mechanical explanations of cognition. Hume contributed an oar to this regatta with his term "theater of the mind". I can tell, from moment to moment, when I am rapt in gazing at this screen and when I am reflecting on what I see. My rule is that you cannot simultaneously do what you're doing and know what you're doing. That should make the search of the brain for the mind easier. Speaking as a BA in Biology, as well as one in Economics and Urban Studies, and a JD, it seems to me that the failure of the search so far has largely been due to a failure to define adequately what is being searched for. So I suggest, with Thomas backing me up: look for the imagination, which we might also call the attention, or wakefulness. One clue is that there is a noticeable time lag between your eyes seeing something and your mind registering (in what book?) what it is. Then we can embark on the search for the concept. You can't search for the concept in the brain until you realize that it is not part of the act of knowledge. Try this now: the concept, in Thomistic terms, the what-ness, the quod est, is available to us only in the readiness ("readiness is all", Hamlet), the skill, that is available to us once we categorize (with a common noun, a concept) what we have sensed. The rule of that concept, the verbal definition, is not the same thing. 
To "see" that rule requires a separate act of cognition, and it might turn out that the way we define a concept is not the way we use it in practice. Cognition affirms an object as existing and thus subject to our action via a skill. (Albeit most of the time we exercise skills without thought--automatically, as it were.) Diseases for which there are no cures or treatments are not diseases, and we do not notice what we have absolutely no use for. The ancient question was about the one and the many. The many are represented in/by the universal, which finds a new member in the one in this act of knowledge of this sensible object in this momentary state of the imagination, this one being not the first nor likely the last of this category to be encountered. Encountered where: in the imagination/attention/wakefulness. It is useful not to assume that the ancients were a bunch of nitwits, but also that they were perfectly capable of intellectual negligence or malice: as much so as we. Finally, let me mention a plausible attempt at brain science I read, about autism, and brain scanning that seemed to monitor activity in brain tissue creating self-reference, self-absorption, self-imagery, nostalgia--ring a bell? Test subjects were given arithmetic problems. Autistic people would never "power-down" their self-absorption tissues enough to allow them to do the math. Now if you could trace the brain activity in/by which the decision to power-up or power-down that self-absorption, given that it is probably fantasically, cosmically complex, like with hundreds of factors, you'd be on the trail of the imagination/attention/wakefulness, and then you could, that terrain discovered, trace how a concept is formed, reformed, used, etc. To study the laws, first you must discover the courthouse, or more generally, the forum where the law works.
