Talk:Symbol grounding problem
Given that Harnad is THE person to write about the symbol grounding problem, of course the article is acceptable pretty much as written. Nonetheless, I have a couple of suggestions for clarity and one comment of substance.
The abstract is really dense and does not invite readers (particularly naive readers) to go further. I would rewrite completely.
- [SH: Slightly reworked, but I cannot see how to make it any simpler while still giving the gist of the actual content: an abstract should not be just an advertisement, but should actually present the gist of the content of the paper.] --Harnad 22:10, 11 July 2007 (EDT)
The section on the Chinese room needs a brief description of what the "room" is, the task, and Searle's argument.
- [SH: The "room" turns out to be unimportant and redundant. Searle first formulated the argument as himself, ignorant of Chinese, in a room, receiving Chinese symbols as input, looking up symbol-manipulation rules written on the wall, based only on symbol shape, not meaning, and then, according to the rules, sending Chinese symbols as output. The argument was that there is no "meaning in the room." But the right way to put it would have been to have Searle memorize the rules (hence no wall, no room). The point would be the same: no meaning; Searle still does not understand Chinese, and Searle is all there is to "the system." Hence the room is irrelevant to the argument. Now, I could say this too, in the article (I've said it in the publications I cite), but would it add to the article?]--Harnad 22:10, 11 July 2007 (EDT)
The comment of substance: I would take out the stuff on consciousness. Given that we don't know a thing (scientifically) about consciousness, pointing to it as a possible solution to any problem is like pointing to empty space.
- [SH: Consciousness is not pointed to as the solution to the symbol grounding problem. The solution to the symbol grounding problem is sensorimotor grounding. Consciousness is a criterion for Searle's Chinese Room argument. Without appealing to the fact that Searle does not understand Chinese -- a conscious criterion -- there would be no Chinese Room Argument! For then Searle would be "speaking" Chinese, hence "understanding" Chinese, regardless of whether he was conscious of it or not! I agree that there is little to be said about consciousness "scientifically." Nevertheless, consciousness not only exists, but it is critical not only for making and understanding the Chinese Room Argument, but for understanding the continuing difference between grounding and meaning. For without consciousness, all there is is grounding...]--Harnad 22:10, 11 July 2007 (EDT)
- [In common with other reviewer(s) I believe Harnad is the ideal person to write on this subject and, on the whole, I feel he explains the concepts very clearly and uncontroversially, with perhaps one significant omission: there is no reference to the earlier, strongly related, philosophical problem of *intentionality* from Brentano (i.e. the relationship between 'mental acts' and the 'external world'). In the case of cognitive science and artificial intelligence, intentionality can be construed as an attempt to solve the 'symbol grounding problem', which, as Harnad explains, is exactly the problem of how to connect the symbolic elements of internal computation to the external objects and states of affairs which they are supposed to represent. I think this omission is serious as (i) it forms a bridge between two disciplines (philosophy & cognitive science/AI), (ii) it highlights the importance and relevance of consciousness in the debate around 'symbol grounding' and (iii) given Searle's earlier work on speech acts and intentionality, it gives a wider context to Searle's famous intervention - via the Chinese Room Argument - in the debate about symbol grounding and machine understanding.]
- [SH: I have added an Appendix on Brentano, as requested by the referee. The problem of intentionality and the symbol grounding problem are not the same problem, hence the solution to the symbol grounding problem is not a solution to the problem of intentionality. However, they are related; and as Brentano is an influential thinker, I have added an Appendix to explain the relation between the two problems.] -- Harnad 03:38 Wed Aug 29
- [Less seriously, perhaps Harnad may also wish to consider including a reference to Bringsjord and Noel's 2002 paper on the 'missing thought experiment' in the context of the symbol grounding problem, as the work purports to show that, even in a Total Turing Test-passing robot, meaning would still be absent. Bringsjord, S. & Noel, R. (2002). Real robots and the missing thought-experiment in the Chinese room dialectic. In Preston, J. & Bishop, J.M. (Eds.), Views into the Chinese Room. Clarendon Press, Oxford.]
- [SH: I don't think that article is important enough to warrant treatment in an encyclopedia entry.] -- Harnad 03:38 Wed Aug 29
Voss: Proposed solutions to the SGP
A good overview is given in
- Taddeo, Mariarosaria & Floridi, Luciano (2005). The symbol grounding problem: A critical review of fifteen years of research. Journal of Experimental and Theoretical Artificial Intelligence, 17(4), 419-445. Online version
The authors later proposed a solution to the SGP:
- Taddeo, Mariarosaria & Floridi, Luciano (2007). A praxical solution of the symbol grounding problem. Minds and Machines, 17(4), 369-389. Online version
Others have also claimed to have solved the SGP; see:
- Steels, Luc (2006). The symbol grounding problem has been solved. So what's next? In De Vega, M., Glenberg, A. & Graesser, A. (Eds.), Symbols, embodiment and meaning. Academic Press, New Haven. Online version
One can argue that the proposed solutions are not valid, but they should at least be mentioned in the article. -- Voss 14:31, 12 November 2010 (CST)