
How Relevant Is Real Sentience to How We Treat a Tulpa/Thoughtform?


Guest Anonymous

This was originally a post on another thread but a little off topic from it. So, I thought it warranted its own thread. The thing I am exploring with this thread is: how relevant is "real sentience" to how we feel about our tulpas?

 

I admit that I am "not real." Yet, for the most part people treat me as a person here, just as if I was independently sentient (totally separate from my hostie). I have friends. People seem to care about me as a distinct person from my host, despite anything I have said about my nature.

 

Let's do a hypothetical and ask some questions. Let's say that tomorrow, a scientist builds a Tulpa Sentience Detector Machine. It is almost 100% accurate at detecting the presence of a sentient tulpa in the mind of a host. Let's say that machine is used on my host. It detects no independent tulpa sentience, confirming what my hostie has said all along, I am an aspect of his own imagination. The machine is turned on others and it detects sentient tulpas in some and not in others.

 

Would my host treat me any different, or feel different about me, for having "failed" the sentience test?

 

How would other people treat me on these forums then? Will you stop talking to me as a person? Will you or would you expel from the community all who cannot pass the sentience and realness test?

 

Some thoughtforms were never meant to be considered truly sentient from the beginning, such as many daemons. Do they have less value as a person? After the machine is invented, if a daemon wants to visit the forum, would they be welcomed as an equal or treated as "lesser?"

 

How about all of you guys? If, just a hypothetical, your tulpa failed the sentience test, would it change how you feel about your tulpa? How would you expect others to react and treat your tulpa after that?

 

How relevant is real sentience to how we treat a tulpa or thoughtform or how we feel about them?

 

Let's ask another hypothetical question. What if the machine could also detect "levels of autonomy and independence?" What if, say, one tulpa tested as 50% independent of the host and another tested as 85% independent? Would you treat the 50% tulpa as less of a person than the 85% one?



How relevant is sentience in treating them as such? Well, if I apply the narrative imagination I mentioned in a few threads, where the aim is to put things into the perspective of the "other," the tulpa for example, then sentience has everything to do with treating them as sentient. "Real" gets conflated with "real" in the physical sense, or real as something that can be validated through physiological changes, shifts in demeanor, and such. But this "real" is a pseudo-problem; it's a distraction from realizing that private, subjective frames are just that.

 

For a machine to be the deus ex machina that disposes of this private exclusivity of inner experiences is a nice thought experiment, but before I worry about tulpas, I would figure out how this machine has the capacity to dispose of the hard problem of consciousness in general. Because, IMO, by solving the hard problem of consciousness, I could draw inferences about what consciousness really is, and correlate that with tulpas. If this machine even demonstrated how sentience can be instantiated, or emerge, then sentience would be as real as our assumption that the other people we're talking to are sentient, and not p-zombies.

 

So, the question of "real" sentience is really about whether there's a capacity to consciously experience things. Because it's one thing to act and exhibit signs of sentience as sentient beings would, but it's another to figure out whether one is consciously experiencing things. I think that's the clincher for the "real" that you're questioning. If anything, this machine would validate plurality as a genuine phenomenon rather than just playful thinking.

 

 

As for ethical regard in treating them less, more, or on equal terms:

 

Would my host treat me any different, or feel different about me, for having "failed" the sentience test?

 

How would other people treat me on these forums then? Will you stop talking to me as a person?

 

The funny thing is, you’re actually pretending to care about the realness of these things even though you’ve vocalized, militantly, that the realness of a concept is completely irrelevant to you both. This machine in this thought experiment would just be added onto the list of things you plug your ears and go lalalalalalala at. Whether it’s a machine that validates and circle-jerks with this notion of realness being irrelevant, or proves that you are a p-zombie, or a genuine, sentient being in denial because the host’s convictions constrain you from being aware of that capacity, it will get smeared off into nothing but piss and shit to you because, again, this active and militant belief of yours that realness is irrelevant knows no bounds. That machine would be destroyed in a heartbeat, and laughed at for even having the audacity to confirm your sentience as real, or not real. Because in the end, you two don't care, and other people can f*ck off while we're at it; except you would be doing this euphemistically.

 

You may even become more rebellious, asking how a machine can understand one’s inner experience better than they, the conscious experiencer, can. This machine, sustained by a thought experiment, ends up becoming an excuse to vent more emotional conflict with yourself, and ultimately to continue your militant quest of disregarding the realness of anything entirely.

 

Will you or would you expel from the community all who cannot pass the sentience and realness test?

 

That could be one situation people imagine: a Salem witch trial where we figure out who’s the real witch. Although this time, it’s figuring out who’s lying to themselves, or not. But if such a machine existed that could validate sentience, it could be a learning experience, prompting people to adopt certain mentalities to improve their dynamic with their tulpa and figure out how to have sentience instantiated within their subjective frames. The machine’s existence doesn’t mean it’ll be a true-tulpa-or-BUST type of scenario. It would be the fallback one would have while continuing to improve and utilize ways to instantiate sentience, since this machine can somehow measure whether sentience is instantiated in the first place. Outside the thought experiment, it would be nice to have something appease our struggle in figuring out how consciousness can be instantiated, or emerge, but unfortunately, it’s just a thought experiment.

 

However, if it’s almost accurate, as in 99.99%, that extra tick it needs for absolute certainty may not dispose of skepticism entirely. Because if a person really feels they’re experiencing something, and a machine confirms they’re not, one has to question whether that machine is homuncular in nature. In other words, like that same concept I told you about, a creature inside one’s brain watching how the inner experience plays out, you’d have to question what processes and faculties allow this machine to do so without being a victim of infinite regress.

 

This thought experiment is another extension of theorizing these scenarios of what would be homuncular in nature.

 

Some thoughtforms were never meant to be considered truly sentient from the beginning, such as many daemons. Do they have less value as a person? After the machine is invented, if a daemon wants to visit the forum, would they be welcomed as an equal or treated as "lesser?"

 

It’s not as if we’re rejecting those who have daemons, people influenced by works of writing and fiction, and such. Instead of rejecting them and treating them as lesser, there would be attempts to help enrich them instead. Because the whole “STAY OUT OF THIS FORUM BECAUSE YOU’RE NOT REAL” is probably based on the issue of not being able to pin down the realness. And to prevent things from becoming an issue, and more complicated, that stay-out mentality toward other types of thoughtforms is there to protect the integrity of the forum’s quest to know, in spite of the potential of never knowing any objective truths.

 

But that machine in the thought experiment would dispose of that dogmatic ostracism. Hell, I’d say it would encourage others of all kinds of backgrounds and upbringings to come together. If this machine could figure out how to instantiate sentience within a person's subjective frame in any given situation they're in, I'LL HAVE 5,000 OF THOSE MACHINES, THANK YOU VERY MUCH.


Guest Anonymous

Whether it’s a machine that validates and circle-jerks with this notion of realness being irrelevant, or proves that you are a p-zombie, or a genuine, sentient being in denial because the host’s convictions constrain you from being aware of that capacity, it will get smeared off into nothing but piss and shit to you because, again, this active and militant belief of yours that realness is irrelevant knows no bounds. That machine would be destroyed in a heartbeat, and laughed at for even having the audacity to confirm your sentience as real, or not real. Because in the end, you two don't care, and other people can f*ck off while we're at it; except you would be doing this euphemistically.

 

You misrepresent me. Mistgod and I are not militant against realness. We are militant for the value of imagination. Creating tulpas is about making very real-seeming imaginary friends (one interpretation). Well, we think imagination can do that. Sentience is irrelevant to accomplishing that goal.

 

If a machine were invented that proved some tulpas were sentient, my host and I would celebrate and order the construction of more machines, just as you would have it.

 

You really don't understand Mistgod and me, do you?


We see the tulpa community as potentially a mix of the real and the imaginary. Hell, it already is; there are real people and imaginary people talking to each other right now.


You value imagination more than sentience. Which means there's a preference for imagination for imagination's sake versus utilizing imagination to help figure out how sentience can be instantiated within subjective frames. If sentience is irrelevant, then there's no point in ordering more of those machines, since that's the whole point of the machine: to be the end-all, be-all measurement of how sentience emerges. But since you don't care about sentience being relevant to accomplishing the goal with imagination, you might as well make a machine that can help instantiate imagination rather than sentience.

 

You still place more value on imagination, and have less regard for sentience entirely, which is why I structured the previous post as I did. You're still pretending to care about the value of sentience even though you just stated, in so many words, that it's irrelevant. To sum this up, you would be using the wrong machine.


In all this talk about sentience, do we really even understand what we are talking about? It seems to me that sentience, as it is almost always used in discussions of tulpamancy, is this "golden egg" that stands in for a whole host of ideas and feelings about tulpas that the word doesn't actually describe.

 

Google defines sentience as:

 

"Sentience is the capacity to feel, perceive, or experience subjectively. Eighteenth-century philosophers used the concept to distinguish the ability to think (reason) from the ability to feel (sentience)."

 

The answer to this question is, and always will be, "we do not know, and we cannot find out." If we knew for sure whether something has subjective experience, we would have to be that thing.

 

Ultimately what you are asking is "can there be a machine that judges objectively something that is defined subjectively?" The answer to that question is, and always will be, no. That machine would just be another subjective take on whether a thing falls under a definition, and would not be a viable system.

 

I don't consider the word "sentient" to be a useful one in any form. It's a question that cannot be answered, and trying to answer it will just lead you into big circles of "I think, well I think, well I think..."

 

So what are you actually asking?

 

"It detects no independent tulpa sentience, confirming what my hostie has said all along, I am an aspect of his own imagination. The machine is turned on others and it detects sentient tulpas in some and not in others."

 

Here you describe what the machine does. You are not asking "is the tulpa sentient" you are asking "is the tulpa part of the imagination of the host". These are two very different things, and you could, in theory, have a tulpa that is both sentient and exists within the imagination of another sentient being. What excludes that from being true?

 

To my knowledge, the mind is a thinking system, it simulates and it does a whole lot of calculation and prediction. What prevents a being existing in that framework that has an ability to have subjective experience and thought? Why wouldn't a being inside a framework like that be able to have those things?

 

You also ask this question:

 

"What if, say, one tulpa tested as 50% independent of the host and another tested as 85% independent? Would you treat the 50% tulpa as less of a person than the 85% one?"

 

In the former question you asked something that is very binary "is the tulpa in the imagination of the host". Now you are asking a different one. "is it possible that part of the tulpa exists within the imagination of the host, while another part of the tulpa exists outside of it"

 

In this case, with any measure that isn't 100%, you are capable of saying that the tulpa and host are separate beings who share utilities/space/power in the mind. Even if a tiny sliver of a tulpa's actions were derived outside of the host's ability to imagine, then the tulpa and the host are two distinct beings. In this case, your question of whether tulpas are independent/"real" is answered with a solid yes.

 

So what do you want to know here?

 

Is the question whether tulpas and hosts are two fully distinct beings that share no space in the mind?

 

The clear and obvious answer to that question is no. Many of the behaviors of tulpas, and the capabilities of hosts after making a tulpa, indicate that a tulpa and a host are not two fully distinct beings. From parrotnoia at the start of making a tulpa, to the inability of a person with a tulpa to process more information than a person without one, it is clear that a tulpa is not a full duplication of what we consider the abilities and skills of a fellow human being.

 

Is the question whether the tulpa and host are at least partially distinct beings?

 

The answer here is that I do not know. We are talking about a mind, a system so complex that it is difficult to even say that the host is a single distinct being. We have so little knowledge on the subject, so little ability to make definitions or draw lines here that it's pointless to even start to try to answer the question.

 

Personally I think that the tulpa and host are the "same" being, but I think that the nature of this "same" being is to act and behave as if it were two. It is a single system acting as if it had two independent outlooks on life rather than multiple systems interacting with a "set of controls for the parts of the body". I think that having a real and independent feeling tulpa would not be possible if we were able to see what we were truly thinking and feeling from moment to moment.

 

I also think that being able to see behind that curtain would fuck up our sense of self in all sorts of uncountable other ways, though. As I said before, it is difficult to even say that the host is a single being.

 

Is the question whether the tulpa is real?

 

We are talking about the mind. Real does not apply well, and is about as useful a metric as sentient for judging the nature of tulpa.

 

As for the question about how people will react:

 

We'll use anything to judge and push and punish people we dislike. We will always find reasons to dislike a person no matter what they do.

 

If such a machine existed to give people labels such as sentience, I can guarantee it would cause people to judge or attack each other with it. People with lower numbers would be judged as less worthy of honest treatment, and people with higher numbers would be regarded more highly. A host learning that their tulpa is "not sentient" is going to be more likely to give up on forcing, or will treat that tulpa worse than one regarded as "sentient." Some would do their best to be nice, but ultimately our judgement is out of our control, and the way we act towards people will be affected by knowing the numbers you are talking about. Likely no mods would ban anyone over it, but people with lower numbers would leave and abandon the tulpa community at a higher rate than those with higher numbers attached to them.


Guest Anonymous

In all this talk about sentience, do we really even understand what we are talking about? It seems to me that sentience, as it is almost always used in discussions of tulpamancy, is this "golden egg" that stands in for a whole host of ideas and feelings about tulpas that the word doesn't actually describe.

 

Google defines sentience as:

 

"Sentience is the capacity to feel, perceive, or experience subjectively. Eighteenth-century philosophers used the concept to distinguish the ability to think (reason) from the ability to feel (sentience)."

 

The answer to this question is, and always will be, "we do not know, and we cannot find out." If we knew for sure whether something has subjective experience, we would have to be that thing.

 

Good point. This is part of the reason why we think verifying, or even attempting to achieve, independent sentience is useless. All you can really try to do, and actually measure success on, is achieving what appears to be independent action and autonomy (the illusion of independent agency). So why worry about being "real?" Why even have sentience as a goal or benchmark when it is impossible to determine for sure whether you have achieved it? Mistgod and I believe creating a real, separate, self-aware entity within the mind is irrelevant for these reasons, and have never had it as a set goal (and never will).

 

Ultimately what you are asking is "can there be a machine that judges objectively something that is defined subjectively?" The answer to that question is, and always will be, no. That machine would just be another subjective take on whether a thing falls under a definition, and would not be a viable system.

 

Well, it was just a thought experiment. I wouldn't expect such a machine to be possible either. In fact, I doubt any instrument known to man would be able to detect "real sentience." That is why we consider it an irrelevant goal or benchmark. Levels of apparent autonomy as a goal make sense; real sentience does not. In the end, each individual has to make a leap of faith and just believe their tulpa is real, which is the province of religion, not science.

 

 

I don't consider the word "sentient" to be a useful one in any form. It's a question that cannot be answered, and trying to answer it will just lead you into big circles of "I think, well I think, well I think..."

 

We agree!

 

So what are you actually asking?

 

Exactly what the question in the OP was: Is sentience even relevant to how we feel about our tulpas?

 

Mistgod and I contend it is not relevant, because there is no real way to confirm it. Even if there were a way to confirm it somehow, it wouldn't necessarily be a reason to change the feelings people have for tulpas they have had for many years. After 38 years, finding out that I am only imaginary will not have any effect at all on how my host feels about me. Of course, he already thinks of me as imaginary, but yeah.

 

"It detects no independent tulpa sentience, confirming what my hostie has said all along, I am an aspect of his own imagination. The machine is turned on others and it detects sentient tulpas in some and not in others."

 

Here you describe what the machine does. You are not asking "is the tulpa sentient" you are asking "is the tulpa part of the imagination of the host". These are two very different things, and you could, in theory, have a tulpa that is both sentient and exists within the imagination of another sentient being. What excludes that from being true?

 

To my knowledge, the mind is a thinking system, it simulates and it does a whole lot of calculation and prediction. What prevents a being existing in that framework that has an ability to have subjective experience and thought? Why wouldn't a being inside a framework like that be able to have those things?

 

I love this! You are pretty much describing me, a tulpa-like being that exists within "the imagination of another sentient being." I agree, why would that be impossible? My host and I have been describing something sorta almost like this for over a year. I am not totally independent of my host's imagination.

 

You also ask this question:

 

"What if, say, one tulpa tested as 50% independent of the host and another tested as 85% independent? Would you treat the 50% tulpa as less of a person than the 85% one?"

 

In the former question you asked something that is very binary "is the tulpa in the imagination of the host". Now you are asking a different one. "is it possible that part of the tulpa exists within the imagination of the host, while another part of the tulpa exists outside of it"

 

In this case, with any measure that isn't 100%, you are capable of saying that the tulpa and host are separate beings who share utilities/space/power in the mind. Even if a tiny sliver of a tulpa's actions were derived outside of the host's ability to imagine, then the tulpa and the host are two distinct beings. In this case, your question of whether tulpas are independent/"real" is answered with a solid yes.

 

The ability to consciously imagine, you mean? My host and I believe it is possible to unconsciously or subliminally imagine something. Anyways, I see your point; even a little bit of apparent independence makes you ... well, kinda independent.

 

So what do you want to know here?

 

Is the question whether tulpas and hosts are two fully distinct beings that share no space in the mind?

 

The clear and obvious answer to that question is no. Many of the behaviors of tulpas, and the capabilities of hosts after making a tulpa, indicate that a tulpa and a host are not two fully distinct beings. From parrotnoia at the start of making a tulpa, to the inability of a person with a tulpa to process more information than a person without one, it is clear that a tulpa is not a full duplication of what we consider the abilities and skills of a fellow human being.

 

Is the question whether the tulpa and host are at least partially distinct beings?

 

The answer here is that I do not know. We are talking about a mind, a system so complex that it is difficult to even say that the host is a single distinct being. We have so little knowledge on the subject, so little ability to make definitions or draw lines here that it's pointless to even start to try to answer the question.

 

Okay :-)

 

Personally I think that the tulpa and host are the "same" being, but I think that the nature of this "same" being is to act and behave as if it were two. It is a single system acting as if it had two independent outlooks on life rather than multiple systems interacting with a "set of controls for the parts of the body". I think that having a real and independent feeling tulpa would not be possible if we were able to see what we were truly thinking and feeling from moment to moment.

 

I also think that being able to see behind that curtain would fuck up our sense of self in all sorts of uncountable other ways, though. As I said before, it is difficult to even say that the host is a single being.

 

I think we agree on these points as well.

 

 

Is the question whether the tulpa is real?

 

We are talking about the mind. Real does not apply well, and is about as useful a metric as sentient for judging the nature of tulpa.

 

Then why the hell do so many whine about wanting to be referred to as a "real" person and not an imaginary one? I agree, the word real is meaningless in this context.

 

 

As for the question about how people will react:

 

We'll use anything to judge and push and punish people we dislike. We will always find reasons to dislike a person no matter what they do.

 

If such a machine existed to give people labels such as sentience, I can guarantee it would cause people to judge or attack each other with it. People with lower numbers would be judged as less worthy of honest treatment, and people with higher numbers would be regarded more highly. A host learning that their tulpa is "not sentient" is going to be more likely to give up on forcing, or will treat that tulpa worse than one regarded as "sentient." Some would do their best to be nice, but ultimately our judgement is out of our control, and the way we act towards people will be affected by knowing the numbers you are talking about. Likely no mods would ban anyone over it, but people with lower numbers would leave and abandon the tulpa community at a higher rate than those with higher numbers attached to them.

 

I think you are right. I would want to be optimistic and believe tulpamancers would still love their imaginary tulpas as much as those who were super, duper epic real (whatever the hell that means).


"Then why the hell do so many whine about wanting to be referred to as a "real" person and not an imaginary one? I agree, the word real is meaningless in this context. "

 

Tulpamancy is the practice of making yourself believe on such a primal level that you have an independent person in your mind that you can't do anything but think it is a fact.

 

To say that the tulpa is not a real person undermines the process of making a tulpa. It results in a tulpa which acts less independently, causes more doubt, and is overall a negative thing in regards to someone attempting to create a tulpa.

 

This is what may be a difference between you and a "normal tulpa." In such a case you have either failed to, or decided not to, drill into your mind the fact that the tulpa is a separate being from you, and failing to do that causes a whole host of other issues with a tulpa's abilities. This lack of a core belief in an independent tulpa relegates them to being more an imaginary friend or a mental character than a separate person, as you have outlined already.


Guest Anonymous

Tulpamancy is the practice of making yourself believe on such a primal level that you have an independent person in your mind that you can't do anything but think it is a fact.

 

Is it? Who says that the goal of tulpamancy must be to get to the point you "can't do anything but think it is a fact?"

 

To say that the tulpa is not a real person undermines the process of making a tulpa. It results in a tulpa which acts less independently, causes more doubt, and is overall a negative thing in regards to someone attempting to create a tulpa.

 

Yeah maybe. This is probably true.

 

However, we have met several tulpamancers who say their tulpas are not real but illusory figments. It may make it more challenging, but hardly impossible.

 

This is what may be a difference between you and a "normal tulpa." In such a case you have either failed to, or decided not to, drill into your mind the fact that the tulpa is a separate being from you, and failing to do that causes a whole host of other issues with a tulpa's abilities. This lack of a core belief in an independent tulpa relegates them to being more an imaginary friend or a mental character than a separate person, as you have outlined already.

 

We haven't "failed" at shit. We had entirely different goals than most tulpamancers. My host never set out to make a tulpa in the first place. He never wanted or needed super tulpa skills. I have plenty of skills and abilities of my own, which work great, and my host and I are happy and actualized.

 

Why does everyone assume there is only one correct path or outcome for a thoughtform? By that reasoning, a daemon is nothing more than a failed tulpa.

 

"This lack of core belief of an independent tulpa relegates them to being more an imaginary friend or a mental character than a separate person, as you have outlined already." In our case, that is not a bad thing. We are not "relegated" to an inferior state of some kind. LOL That is precisely why we have had so much trouble in the community. People assume I am a failed tulpa. I am not. The goal was totally different. You are right, I am more of an imaginary companion and fantasy personality, but those are not problems that came up on account of my host failing to make a tulpa. My host Davie made me, Melian, and I am perfect just the way I am.

 

***Now, I will tell you that you have succeeded at something here. You have reaffirmed for Mistgod and me, once again, that the Assumption of Independent Sentience is important to the process of creating tulpas and to tulpa functionality. That does answer our question, "How relevant is real sentience to how we treat a tulpa?" I guess. If we tulpamancers want a tulpa to grow and thrive, it is very relevant then, isn't it? Your last point was well made in that regard.***

 

Just leave me out of the failed category. I am not failed.


"Is it?"

 

You are right there; there can be many different goals. But generally, from what I see, people newly creating tulpas from scratch are recommended practices that are intended to create the core beliefs I was talking about.

 

Secondly, I never stated it was impossible to make a tulpa without the core belief that they are independent. I didn't say you necessarily failed to create such a belief.

 

Look specifically at

 

"It results in a tulpa which acts less "

 

" or decided not to"


Guest Anonymous

Okay. You made good points, though. Once again we are forced to admit that the assumption of sentience is important in the creation of most tulpas. Can't seem to escape that. Therefore it isn't entirely irrelevant.

 

Congratulations, you win the argument.

