Richard David Ruquist
Guest
|
Post by Richard David Ruquist on Dec 4, 2003 10:07:52 GMT -5
Tarvo, I have come across a new paper in the Cornell archives by Lisewski that purports to define consciousness on the basis of virtual reality and set theory. I am not endorsing the views of this paper; in fact, I am just about to read it, and it is not at all clear that I will be able to understand it. But it seems that it should be right up your alley, so to speak. So if you care to read it, I would be very much interested in your opinion of what it has to say about consciousness. How does it relate to your work? Here is a copy of the abstract followed by a link to the archives:

The concept of strong and weak virtual reality
Authors: A. M. Lisewski
Comments: 15 pages
Subj-class: Logic in Computer Science; Computational Physics; Adaptation and Self-Organizing Systems

By means of set theory, we reformulate the traditional view on virtual reality which is based on the qualities of immersion, presence, and of interactivity. We also explore resulting implications of this reformulation. For this purpose, we argue that the traditional view on virtual reality is closely related with Sommerhoff's systemic definition of consciousness. We then introduce a representation of Sommerhoff's first order self-awareness through sets. We further assume that these sets, which we call events, originally form a universe for ZFC set theory. Our concept of strong virtual reality characterizes all those collections of events which are elements of the original universe. We then logically weaken strong virtual reality to obtain the concept of weak virtual reality. Its definition characterizes collections of virtual reality mediated events altogether forming a collection *larger* than the original universe for ZFC. By giving reference to Aczel's relative consistency result on his non-wellfounded ZFC⁻+AFA set theory we indicate that this definition is not empty. Further, with Baltag's Structural Theory of Sets (STS) we are able to show that Sommerhoff's first *and* second order self-awareness as well as our concept of virtual reality are adequately modeled by the infinitary modal logic of STS. Within the STS framework we propose a structural unfolding process from strong to weak virtual reality. We argue that this process may become real in human history. This argument is supported by empirical evidence and by theoretical aspects known from physics.

arxiv.org/abs/cs.LO/0312001
|
|
|
Post by tkorrovi on Dec 4, 2003 16:44:41 GMT -5
Sommerhoff's description of consciousness includes only one ability of consciousness, i.e. awareness (including self-awareness). Such a mere representation can easily be described by set theory, which may be useful, but it cannot be considered artificial consciousness, because the other objectively known abilities of consciousness are missing. (Artificial consciousness aims to understand the natural processes by implementing artificial consciousness, not by merely studying different aspects of consciousness; otherwise it would include all of mathematics, linguistics, etc.) Sommerhoff also admits this: "Second, representations of the first and second kind do not contain representations of merely possible objects, events etc., as these are the elements of the subject's imagination. Sommerhoff argues that the latter category is certainly necessary for processes such as thought, but it is not a necessary condition for what we mean by being conscious about the world, the self-in-the-world and about events."

As I explained, a form of imagination is necessary to predict, and the ability to predict includes many abilities, such as dynamism and control of external processes, which do not come from awareness, or for which awareness alone is not sufficient. Why did Sommerhoff say that imagination is not a necessary condition for "what we mean by being conscious"? I don't know. In the medical professions "being conscious" may mean so-called phenomenal consciousness, i.e. a patient is considered conscious if he shows that he is aware of what happens around him; this form of consciousness can be defined as just awareness. Otherwise, if we talk about consciousness or "being conscious", it should be interpreted in the widest possible sense, i.e. the "totality of thoughts and feelings", or "psychological consciousness". So the subject of this paper is not psychological consciousness, artificial consciousness, or consciousness in the common sense.
|
|
|
Post by ruquist on Dec 4, 2003 22:27:52 GMT -5
Tarvo,
The only consciousness I am personally concerned with is awareness. You say that is the medical definition of consciousness, if I read your post correctly. And you say that that form of consciousness is easily treated by set theory.
But then you say:
"So the subject of this paper is not psychological consciousness, artificial consciousness or consciousness in a common sense. "
So I am perplexed. It seems very contradictory. Certainly both psychological and common-sense consciousness must include awareness. Artificial consciousness must also include it if it is to be at all useful.
So what's wrong with that paper? You seem to be dismissing it while agreeing that what it did was easy.
I just do not understand.
Regards,
Richard
|
|
|
Post by tkorrovi on Dec 5, 2003 7:55:04 GMT -5
Richard,
You wrote, "I have come across a new paper in the Cornell archives by Lisewski that purports to define consciousness on the basis of virtual reality and set theory." Psychological consciousness includes awareness, but this doesn't mean that this paper purports to define psychological consciousness, artificial consciousness, or consciousness as such. Sommerhoff himself said that he excluded some abilities that are necessary for thought, so he clearly restricted his theory to phenomenal consciousness. I didn't say that there is something wrong with the paper, but it isn't what you thought it is.
|
|
|
Post by ruquist on Dec 5, 2003 10:22:51 GMT -5
Tarvo,
Please forgive me for saying this, but you are telling me what I think, right after I just got done telling you that the awareness aspect of consciousness is all I am personally interested in.
", but it isn't what you thought it is. "
I am quite happy to see a paper that treats phenomenal consciousness correctly. So how does it relate to your treatment of consciousness? Is it a different aspect of consciousness?
Yours,
Richard
|
|
|
Post by tkorrovi on Dec 5, 2003 11:35:37 GMT -5
Richard,
I don't tell you what you think; you just used the word "consciousness" in "I have come across a new paper in the Cornell archives by Lisewski that purports to define consciousness on the basis of virtual reality and set theory" and expected me to understand it as "phenomenal consciousness". That is not something you can expect: if a word is used without making it more exact, then it must be interpreted in the most general sense, i.e. the "totality of thoughts and feelings", or "psychological consciousness".
> awareness aspect of consciousness is all I am personally interested in.
Artificial consciousness must implement all abilities of consciousness that are objectively known, so unless you are interested in how awareness fits into a system that is meant to satisfy that criterion, what you are interested in is not artificial consciousness. Your paper explains how to analyze awareness, not how to implement it. It may be useful for analyzing the awareness aspect of artificial consciousness, but then there is the question of how to apply these conclusions to the whole system.
But if you only want to talk about awareness, or about systems that implement awareness, then I don't want this forum to fall into the hole of spreading out to almost everything. If we include everything that may somehow be related to artificial consciousness, then we must also include all mathematical logic, all linguistics, all neurobiology, and maybe also all love and feelings. So artificial consciousness includes them only as much as they are clearly necessary for artificial consciousness (that mathematics may be partly necessary to analyze artificial consciousness doesn't mean that everything in mathematics belongs in this forum). Nobody would be able to deal with such a variety of issues, and the topic would simply be lost.
|
|
|
Post by ruquist on Dec 5, 2003 23:07:14 GMT -5
OK. You are not interested in my kind of consciousness. But you have yet to convince me that your program has anything to do with any kind of consciousness. To me it sounds more like the use of spin networks to derive space and time.
|
|
|
Post by tkorrovi on Dec 6, 2003 9:15:45 GMT -5
The difference is that spin networks have an unchanging structure, while absolutely dynamic systems have a changing structure: every change is a change in structure and is based only on changes in structure. The reason it is done this way is that changes are made so that any system can emerge within a bigger system, which gives the necessary dynamism and enables all necessary possibilities to be created under certain conditions. I have explained many times what absolutely dynamic systems have to do with artificial consciousness, so I will just say that the properties they have give them the potential to satisfy all the criteria of artificial consciousness there may be; this is the only such system known.

Neural networks en.wikipedia.org/wiki/Neural_networks have restricted functionality. You may consider cellular automata en2.wikipedia.org/wiki/Cellular_automata, but not everything can emerge there, and there is no known reason why they must be artificial consciousness: they have a fixed grid, and in theory as well as in practice the structures disappear quite quickly. Then there are genetic algorithms en2.wikipedia.org/wiki/Genetic_algorithms (sometimes also called artificial life), but there we must have fixed rules for how to generate possibilities, which restricts the functionality. Then there is also swarm intelligence, where primitive creatures interact, but the restricted behaviour of a single one still restricts the functionality of the whole system. These are almost all the AI systems en2.wikipedia.org/wiki/AI that are not based on language processing.

Absolutely dynamic systems are not for simulating physics; they were created as an unrestricted artificial intelligence system, though they may simulate physics. I guess that's enough for one post.
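To illustrate the point about fixed-grid cellular automata, here is a minimal sketch in Python (a toy Game of Life step; the grid size, seed, and step count are arbitrary choices of mine, not part of any of the systems discussed). A random pattern on a small fixed grid typically collapses to little or nothing within a few dozen steps:

```python
import random

def life_step(grid):
    """One synchronous step of Conway's Game of Life on a fixed toroidal grid."""
    n = len(grid)
    new = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            # Count the eight neighbours, wrapping around at the edges.
            live = sum(grid[(i + di) % n][(j + dj) % n]
                       for di in (-1, 0, 1) for dj in (-1, 0, 1)
                       if (di, dj) != (0, 0))
            # A live cell survives with 2 or 3 neighbours; a dead cell
            # becomes live with exactly 3.
            new[i][j] = 1 if live == 3 or (grid[i][j] and live == 2) else 0
    return new

random.seed(0)
n = 10
grid = [[random.randint(0, 1) for _ in range(n)] for _ in range(n)]
counts = []
for step in range(30):
    counts.append(sum(map(sum, grid)))
    grid = life_step(grid)
# Typically the live-cell count falls sharply as structures die off
# or freeze into small static remnants.
print(counts)
```

The rules are fixed in advance and the grid never changes shape, which is exactly the restriction being criticized: nothing can emerge that the fixed local rule does not already permit.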
|
|
|
Post by tkorrovi on Dec 7, 2003 20:48:58 GMT -5
I don't try to convince, I only explain. These were my answers to your questions:
I don't argue that it is conscious; I argue that it is a proposed mechanism for artificial consciousness, which means that at present there are no reasons to say that it doesn't satisfy the artificial consciousness criterion, i.e. "artificial consciousness is an artificial system theoretically capable of achieving all known objectively observable abilities of consciousness".

I formulated one such ability, and this was the ability to create a necessary rule in all circumstances where it is possible, based on the information the system gathers; this would be possible because of the dynamism of the system. The other ability was formulated by people trying to define intelligence, and this was the ability to predict external events in all possible environments. Prediction means that processes that fit the environment would survive; this comes from selectively deleting systems, and also from generating all possible systems that fit with the others. This includes the subject, which means that it must be able to act so that it can predict external events; this would be useful in teaching such systems. The system has not been taught much yet, but as little as it is, it satisfies the criteria. Theoretically such systems are supposed to satisfy the criteria, but for example neural networks or other weak AI systems don't, because their functionality is restricted. The most important property is that absolutely dynamic systems are absolutely dynamic, i.e. within a bigger system any system can emerge (certain very small systems may be an exception), where a system is points connected to each other.

It wasn't meant to predict everything, but to predict when it is possible (for a human) to predict, and to do this in all possible environments. This was not formulated only by me, so next time I would write it more exactly: "the ability to predict the external events in all possible environments when it is possible to predict".

Prediction doesn't explain imagination, but imagination is necessary for prediction. It is necessary to consider different possibilities to find out what would happen.
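The generate-and-select idea above can be shown with a small sketch in Python. This is only an illustration of the principle, not the actual mechanism of absolutely dynamic systems: the rule format (a map from the last symbol to the next) and the two-symbol alphabet are my assumptions. All possible rules are generated, and those contradicted by observed events are deleted; the surviving rules are the ones that fit the environment and can predict:

```python
from itertools import product

def surviving_rules(history, alphabet=(0, 1)):
    """Generate every memory-1 rule (a map: last symbol -> next symbol),
    then delete the rules contradicted by the observed history.
    The survivors are the rules that fit the environment."""
    rules = [dict(zip(alphabet, outputs))
             for outputs in product(alphabet, repeat=len(alphabet))]
    for prev, nxt in zip(history, history[1:]):
        # Selective deleting: a rule that mispredicts any observed
        # transition does not survive.
        rules = [r for r in rules if r[prev] == nxt]
    return rules

# An alternating environment: 0, 1, 0, 1, ...
history = [0, 1, 0, 1, 0, 1]
survivors = surviving_rules(history)
print(survivors)                   # [{0: 1, 1: 0}] -- "flip the last symbol"
print(survivors[0][history[-1]])   # 0 -- the surviving rule's prediction
```

Of course, here the space of possible rules is fixed in advance, which is exactly the restriction of genetic algorithms mentioned earlier; the claim about absolutely dynamic systems is that the possibilities themselves are not generated from fixed rules.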
|
|
|
Post by tkorrovi on Dec 9, 2003 11:53:04 GMT -5
If there is quantum consciousness, then it seems to me that if any process propagates as a pilot wave, then this includes quantum consciousness, and it depends on the entanglement of the particles. This is just how I can conceive of it. But concerning how we can model it, cellular automata don't seem to be a proper way, also because they have a regular grid and therefore cannot model any non-locality.
|
|
|
Post by tkorrovi on Jan 4, 2004 13:43:11 GMT -5
I don't know why you say that I dismiss your paper, but what importance do you think the relations between strong virtual reality (which comes from a collection of events in some universe) and weak virtual reality (which comes from the collection of all events), formulated in terms of set theory, have for artificial consciousness? We may say that prediction is an event of weak virtual reality, but how does that help when it doesn't give us any mechanism for how prediction happens? In simple words, what this paper says is that our world of imagination is larger than the world that surrounds us, which is self-evident; it then describes these two worlds in terms of set theory, which is only a formal description, though of course it enables relations between the two descriptions to be found. I don't deny its importance for set theory, but why do we need such relations for artificial consciousness?

It is also self-evident that artificial consciousness must, among the rest, implement so-called "weak virtual reality", because there is a lot it cannot implement otherwise. And "strong virtual reality" is part of it, so if it implements one then it implements the other. As far as I can see, that is all we can get from it for the definition of artificial consciousness. And set theory only enables us to analyze such systems; it doesn't give any means of actually implementing such a system, so we cannot use it for that. We may use set theory to analyze an AC system (because of the ability to analyze graphs in a particular way), but then again this theory doesn't give us any means of detecting the elements of strong virtual reality and weak virtual reality in the real system. If we had such a mechanism, then we could analyze the results with set theory, and only then might we find out whether the theory presented in your paper would be useful for that. I say again that I don't deny the importance of that paper for set theory and possibly computer emulation, but what matters here is its importance for artificial consciousness.
As I understood it, the only importance you considered was this "definition of consciousness" there (which is not defined in, or the subject of, this paper), and we talked about that. If you see anything else, you could say so; or do you think it is self-evident that this paper is important for artificial consciousness?