Black Bane

Do tulpae have a separate consciousness?


Yes, I think that a tulpa's conscious mind is separate from my conscious mind.


Orange juice helps with concentration headaches.

Guest

You aren't your brain; you are a figment of it. Creating a tulpa causes the brain to produce another such figment.

 

So in other words, yes, a tulpa isn't necessarily you.



 

5* post, right there.

 

It's funny: people don't really realize that we are just as much a hallucination as our tulpae. The only reason we're here is to let the brain react to stimuli.


Your tulpa is as sentient as you are free-willed, which is not at all. It's all just an illusion. But is there really a difference between reality and a perfect illusion? I don't believe there is, at least for our subjective selves...


IT BEGINS

Guest

They report having subjective experiences, their memories differ from the host's, and they can do other things while the host is doing something else. The host does not make up responses for them. They have their own goals and desires, and they can differ in opinion from the host.

 

For all intents and purposes, that seems very much like what you would call a separate consciousness. It at least passes the Turing test and if we are to believe that people with tulpae are not roleplaying, then yes.


Difficult question. Could a machine with artificial intelligence so advanced that it seems to have full consciousness actually have one? I think it's the same question here.

I believe it's an illusion fully administered by your subconscious. If there is no difference between an illusion and a real thing, since both can be represented the same way by our brain, then the brain can also "simulate" a sentience that is unpredictable; we see it do that every day. You just need a lot of practice, or "tulpaforce," for that to happen. That's my opinion.



 

Technically, it's impossible for them to have memories of their own. If they did, where do those memories come from?

They have all OUR memories. However, they also have access to memories that we no longer remember, which is REALLY A LOT of stuff. Our subconscious picks up and registers things we don't even know we ever saw.

 

Maybe they can imagine things and make original creations just like we can, but based on OUR past experience, just as ours are.

Guest


Depends how you define consciousness. I define it as "what it is like to be", or having senses and experiences.

Being of the Strong AI school, I do think an AI could be conscious. The full reasoning is long, but it boils down to this: if you say philosophical zombies are possible, that implies partial zombies are also possible, which would mean some potentially nonsensical things about our own consciousness.

 


You can have memories of dreams, lucid dreams, and daydreams. Most of our memories come from sensory experience, but we can also imagine and remember completely novel things, usually by interpolating on sensory experience. A tulpa's memories would be entirely like that, even if they draw only on your own memories.

Instead of considering them "your" memories, consider them your brain's memories, with your self/personality being some particular pattern in the brain rather than the whole of it. A tulpa would be able to have its own memories of its experiences in its imagined worlds, which you don't remember; it would also have access to memories you once formed but can no longer recall.



 

I define consciousness as a state you acquire after a certain amount of brain development. When your brain is developed enough to make you "know that you know," you have consciousness. If a tulpa knows that it knows it is a tulpa, I can assume it has consciousness. But does the tulpa actually have its own point of view? I don't think we can answer that. A schizophrenic person also experiences vivid hallucinations that are apparently sentient and autonomous, but does that mean he is creating entire consciousnesses every time he hallucinates?

 

Since you are from the Strong AI school, do you believe that, to attain perfect artificial intelligence, we need to write code so vast and complex that it can evaluate every possible condition that might exist? Or do we need to rethink the way we create artificial intelligence?

 


 

But when your tulpa has just been created, it can't have memories of its own, unless you implanted them in it, in which case you would also have those memories. After that, it can create novel things the way you said, by interpolating between sensory experiences, memories and knowledge it acquires from now on, and memories that were already in your subconscious, which you will not know about yet. However, it is probable that the things the tulpa creates also get "sent" to your subconscious and remain there, just like a long-forgotten memory.

