
(Almost) Sharing Qualia


Lucilyn

Recommended Posts

The title says nothing about tulpas versus hosts.

 

Tulpa: Almost sharing qualia

Host: Having full capacity to experience qualia, i.e., having internal, conscious experience of the things and phenomena going on.

 

Here’s an analogy about ‘more than, less than’:

 

- For example, something can’t be more or less artificial than something else. It would be artificial either way. Especially with AI – even if the programming is better for one AI than another, they’re both still artificial; they can never have the capacity that allows us to consciously experience things.

 

- P-zombies – entities that seem to act like sentient beings, but lack conscious experience. If a tulpa gets conceptualized as almost sharing qualia, but without the potential for the same capacity for qualia, i.e., internally subjective conscious experience, then they are indirectly being compared to the host, whom anyone else would assume has sentience and the capacity for qualia.

 

@tulpa001:

 

- I acknowledge what you’re saying. It’s incontestable that qualia is what separates us from p-zombies, obviously. I wanted to use the back office of the mind as a metaphor to focus on what allows qualia to be instantiated, or manifested. In other words, the processes and what have you that make up one of the hard problems of consciousness (e.g., how subjectivity comes to be).

 

- Yes, the metaphor falls short at some point because it’s just a reminder that none of us can know, until the hard problems of consciousness are resolved, how qualia comes to be. That’s what I’m focused on, and when I see the thread that talks about ‘Almost’ Sharing Qualia, I want to focus more on whether or not others believe a tulpa can share the same back office, metaphorically speaking, to have their own means of internal, subjective experience. On anything else about qualia and ‘conscious’ experience, I am in full agreement.

 

- The specific concern, for me, is how we’re splitting the metaphor off from the basic term of qualia, and taking qualia out of the equation. I’m just looking at one specific part that somehow allows qualia and a first-person POV to come to be: the ineffable processes that have yet to be proven due to the hard problems of consciousness.

 

- Which, when combined with a general assumption in neurological science, I acknowledge is ad hoc like you mentioned (e.g., potentially baseless since there’s no empirical support related to tulpas). In spite of this, however, I think one can acknowledge that whatever goes on within their everyday cognition is contingent on each person’s mind, obviously.

 

- Thus, with switching, I think what may have been difficult for me to point out is:

 

o Rather than focusing on just qualia, something we’re aware of, I wanted to introduce the back office metaphor for that ineffable phenomenon that somehow instantiates qualia. I feel that a tulpa is obviously tied down to this, as I can’t imagine a tulpa being an other-worldly being. Now, just because they’re tied down to those same neurological underpinnings (which I acknowledge can have different pathways, and what have you), I can’t rely on a mechanical model of it, because I acknowledge the probability of a tulpa having the capacity to tap into those ineffable, mental phenomena and having their own POV.

 

o Because if they don’t have a different POV from the host, then switching and possession lose their meaning. In other words, if it’s the same POV that one transitions into, then ‘POV’ is being used wrongly, and another word is needed to resolve this conflict. It’s a lack of better wording on my end. My apologies.

 

o In essence, in my little theory:

 

The brain somehow has probable ineffable phenomena that can create multiple POVs. We don’t know how to explain this empirically because the hard problems of consciousness remain unsolved to this day.

 

Which means a tulpa may well have a POV of their own, and be conscious of experiences. In other words, they’re not ‘quasi-sentient,’ or ‘almost sentient,’ as the thread seems to imply (e.g., p-zombies).

 

 

That ‘artificial’ example is used to express how, when compared to ‘sentient’ beings, one cannot be ‘almost’ or just ‘kinda’ sentient, as that would imply p-zombies are not just conceivable but could actually be actualized.

 

If this is the case, then switching would be invalidated, since it rests on the premise of there being another conscious evaluator that can shift from one POV to another (e.g., mental to this reality and vice versa).



I still don't get where you're getting the separation of host and tulpa thing from. No one said tulpas "can't experience qualia", just that no two people can share the exact same qualia because by definition they would have to be the exact same person. Was gonna write more but it was a repeat of what I said in my last post, so..

Hi! I'm Lumi, host of Reisen, Tewi, Flandre and Lucilyn.

Everyone deserves to love and be loved. It's human nature.

My tulpas and I have a Q&A thread, which was the first (and largest) of its kind. Feel free to ask us about tulpamancy stuff there.


qua·li·a

ˈkwälēə/

noun (Philosophy)

plural noun: qualia; noun: quale

 

the internal and subjective component of sense perceptions, arising from stimulation of the senses by phenomena.

 

 

I think we're using qualia as a shoehorn for more specific things tied to conscious experience. This definition is talking about the bigger picture, including those ineffable phenomena that allow qualia to be. If a tulpa cannot completely share those internal and subjective components going on in the brain, then how is one supposed to believe they have the capacity to be part of the phenomenon that allows them to be another conscious evaluator?

 

The root of this issue is how some may be undermining the full capacity of qualia. That it's 'oh, just a person's subjectivity,' rather than also the phenomena in the brain that instantiate it. One has to question whether the brain can instantiate a variant; another conscious evaluator. If a tulpa can only 'almost' share those ineffable phenomena, then this whole tulpa phenomenon becomes a fringe concept between p-zombies and potential conscious evaluators with an assumed, different POV. When you say no two people can share the same qualia, I get that you're acknowledging a tulpa as potentially deserving the label of a 'person,' but I also feel it's because we naturally know that, person-to-person physically, they can't be the same person (e.g., solipsism, and other subjectivist ideologies that may support that).

 

But if it's now a matter of mental phenomena, trying to use that same physical person-to-person reference causes some conflict, because again, (to you) 'it's the same person' if qualia can be shared completely. There's a difference between sharing and not splitting. Sharing is about having a portion of something. By saying they can't share completely, that means they can't share a portion of those same ineffable phenomena. It also assumes that a tulpa isn't part of whatever allows the mind to spur the experience of them having the capacity of being a potential, conscious evaluator. Because by your logic, if they could share completely, they would be the same person. When it comes to physicality, people will assume it's the same person, albeit with a few nuances in behavior (unless they had prior knowledge of there being a tulpa, and gave the benefit of the doubt). But internally, between host and tulpa, if a tulpa can only 'almost' share, or never completely be a part of, those ineffable, mental phenomena that give a host their individuality, then treating a tulpa as sentient is just wishful thinking.

 

Undermining the full horizon of qualia is what tramples out the novelty here.


I really don't get what you mean. You acknowledged my definition of qualia but then followed it with logic using your own definition again. The "bigger picture" stuff you're talking about is what's being shared, and the personal, entirely-subjective-to-the-experiencing-individual qualia is what can't be, without "being the same person" - which is a confusing way of saying only one individual. Two people can't be one person; my tulpas aren't me, if at any point it wasn't clear that's not what I was saying.

All of the peripheral experience of the qualia - memories and thought patterns and whatever else have you - is what can be shared, and what this thread's about. It's only "almost" sharing qualia because there's still the unique-to-the-individual part that, again, can't be shared because it's unique to the individual. In our case, Lucilyn experienced the song in a more exciting and enjoyable manner than she first did, but did not personally have the same connections to it, and so didn't share the exact same experience as Tewi. But she did share a large part of Tewi's experience, which had its origin in Tewi's perspective/thought patterns and not her own.

That is the entirety of what I, she, and the thread were trying to say. For clarification's sake I'm talking about it a little deeper, but Lucilyn certainly didn't think it through so intensely before making the thread. So if you're at all replying to us, you're definitely reading too far into it, because it was never meant to be read that far into. But if you're just discussing more intently - that's alright, just note that whatever she said before now wasn't trying to be thorough about the idea.

 

And I try not to get into the huge philosophical / theoretical discussions anymore, because I don't see a point other than "food for thought" entertainment. I get hung up on trying to make everyone understand each other (including myself) rather than enjoying the discussion, since my overall goal here is to help people understand things, not necessarily share my own ideas. If that makes sense. Basically, writing posts like this is a little stressful. But uh, anyways, carry on with the discussion. Lucilyn made the thread just to give people something interesting to think about (which it looks like successfully turned into a discussion), and I'm iffy on how deep into it I want to get.



For example, something can’t be more or less artificial than something else. It would be artificial either way. Especially with AI – even if the programming is better for one AI than another, they’re both still artificial; they can never have the capacity that allows us to consciously experience things.

Not quite. Something can be more artificial or less artificial than something else. One key point is that even if artificiality is a yes-or-no phenomenon, there are things in this world that are less artificial by dint of being natural.

 

Artificial intelligence of the sort of philosophical interest has the capacity for self-modification. Unlike artifice, whose product is universally considered artificial, self-modification is usually seen as organic. Consequently, a self-evolved artificial intelligence is artificial in name only.

 

The concept of consciousness is an interesting unknown about artificial intelligence. The belief that it is impossible to artificially induce the phenomenon can *only* be substantiated by belief in a soul_object that possesses both the property of providing the phenomenon of consciousness and that of being impossible to create artificially.

 

The first property is required or there is no restriction on what can and cannot be made to be conscious. The second property is required or we can artificially create a soul_object and place it in a computer.

 

The soul_object cannot be made out of any physical phenomenon hitherto discovered. All such phenomena can currently be reconfigured and altered by artificial means, with the exception of black holes. However, black holes do not seem decent candidates for being the source of soul_objects at this time.

 

An argument is also to be made about the current completeness of our understanding of physics. Basically, the four basic forces, the elements of matter, and the elements of exotic matter appear to account for *all* observable phenomena of local extent. Local here being within talking distance of planet Earth. One would, for example, be justifiably surprised if an alien entity that operates beyond the laws of physics were to warp in and start threatening planet Earth. The primary argument against this is a series of handwaves concerning the historical inaccuracy of scientific models. This is an appeal to probability, which does not work so well in light of the fact that physics is a mature science.

 

Scientifically observable phenomena are generally restricted to phenomena that demonstrate frequency and consistency. So, for example, miracles, even if they did exist, would be firmly outside the domain of science, as they are neither frequent nor consistent. However, souls cannot be considered miracles by this definition. Humans are both frequent and consistent, and it is alleged that all humans have souls.

 

Consequently, to say that a soul does not follow the laws of physics is to make an argument in the face of existing evidence. It proposes some easily accessible physical phenomenon that has so far somehow evaded scientific detection, when nothing else has.


- I acknowledge what you’re saying. It’s incontestable that qualia is what separates us from p-zombies, obviously. I wanted to use the back office of the mind as a metaphor to focus on what allows qualia to be instantiated, or manifested. In other words, the processes and what have you that make up one of the hard problems of consciousness (e.g., how subjectivity comes to be).

Then our misunderstanding is an easy one. The back office of the brain does not contain qualia; it produces qualia. Indirectly, in this case. The human subconscious mind instantiates consciousness. Each consciousness will have its own exclusive sensation, despite being created and supported by the same engine, in the case of host and tulpa.

 

Your continuing discussion of this back office leads me to note that there is a weakness of oversimplification here, beyond just the above. Not only is there an underlying engine which supports the conscious mind, but aspects of how the conscious mind is capable of experience are understood, and these do not come out through a flat metaphor of an office space. The processes of self-modification, trial and error, and self-imaging (self-reflection) are key. Particularly self-reflection, which is the process of creating a symbolic map of the internal components of the self. Sensitivity, the ability to respond to stimulus, also factors in.

 

The brain somehow has probable ineffable phenomena that can create multiple POVs. We don’t know how to explain this empirically because the hard problems of consciousness remain unsolved to this day.

Point of view is not reliant on the preexistence of qualia or consciousness. So the hard problems of consciousness are completely irrelevant to the brain's ability to process multiple perspectives.

 

--In fact, computers can process multiple perspectives in a provable manner in an environment supposedly devoid of conscious awareness. This is a trivial truth exposed by a basic study of computer science. It is fundamental to the entire discipline.
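This claim about computers can be made concrete with a toy sketch. Everything below (the names, the event, the "likes" sets) is invented purely for illustration: a single program evaluates the same event from two independent "perspectives", where a perspective is nothing but private state plus an evaluation rule, and no conscious awareness is involved anywhere.

```python
# A "perspective" is private state (what this evaluator cares about)
# plus a rule for judging events against that state.
def make_perspective(name, likes):
    def evaluate(event):
        # Score the event by how many of its qualities this perspective values.
        score = sum(1 for tag in event["tags"] if tag in likes)
        return f"{name} rates '{event['what']}' at {score}"
    return evaluate

# One shared event, judged from two distinct perspectives.
event = {"what": "a song", "tags": {"nostalgic", "upbeat"}}

pov_a = make_perspective("A", {"nostalgic"})  # values familiarity
pov_b = make_perspective("B", {"calm"})       # values a quality absent here

print(pov_a(event))  # A rates 'a song' at 1
print(pov_b(event))  # B rates 'a song' at 0
```

The same input yields different evaluations because each perspective carries its own state, which is all the computer-science sense of "perspective" requires.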

 

--The brain frequently handles multiple perspectives simultaneously. When two perspectives collide, cognitive dissonance results. The human's ability for empathy proves that a single conscious awareness can accept an additional perspective into itself for processing. This seems a key tool used by the human mind for processing situations: to artificially invoke several mental "persons" then calculate their behaviour in a hypothetical scenario.

 

--Perspective is inherent in our methods for organising thought and knowledge. You have discussed first person perspectives, which are tied to an individual observer, and a default perspective, which is the one tied to yourself. But there are also role-perspectives, such as the way you think when you are being a scientist, and the way you think when you are being a parent. These perspectives are non-identical, even inside the same person.

 

There are also tool-perspectives, such as the general-purpose "bird's eye view" objective perspective. There is also the mechanistic functional perspective, useful for determining the exact behaviours of things. There is also an extensional perspective, which defines things through membership in a group, and an intensional perspective, which defines things through the properties they possess. There are numerous other such tool-perspectives, currently mostly being explored in the field of mathematics.
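The extensional/intensional distinction has a direct analogue in programming, which may make it more concrete. The animal data below is made up for illustration: the same collection is defined once by listing its members and once by a property its members satisfy.

```python
# Extensional perspective: define a group by enumerating its members.
mammals_by_listing = {"cat", "dog", "whale"}

# Some illustrative trait data for the intensional definition.
traits = {
    "cat":   {"fur", "nurses_young"},
    "dog":   {"fur", "nurses_young"},
    "whale": {"nurses_young"},
    "snake": {"scales"},
}

def is_mammal(animal):
    # Intensional perspective: a defining property, not a membership list.
    return "nurses_young" in traits[animal]

mammals_by_property = {a for a in traits if is_mammal(a)}

# Two non-identical perspectives that pick out the same things.
assert mammals_by_property == mammals_by_listing
```

The two definitions are different ways of thinking about one group, echoing the point that perspectives can differ even when they concern the same object.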

 

--Oh yeah. And also, if perspective couldn't be handled separately from qualia, then p-zombies would be an impossible construct. They would malfunction at every turn, making their ability to simulate consciousness non-existent.

Host comments in italics. Tulpa's log. Tulpa's guide.


Luminesce:

 

I was fixated on the title of the thread, not really the person that started it. It wasn’t my intention to question Lucilyn as that would be an attack. It’s just the concept I’m focused on.

 

I was just comparing to see differences and similarities. I was looking at your idea of qualia, and the need to prevent the implication that it’s the same person vs. someone who is conscious in their own right (e.g., a tulpa). Take experiences in lucid dreaming, where one encounters dream characters that appear sentient. They think and act as such, and one has to question whether the same logic applies: that qualia has to be this split of exclusivity, or whether qualia is an inclusive thing. If it’s the former, then it’s a form of dualism, a split and potentially unchallengeable exclusiveness of having individual subjectivity. But it raises the question: if one or the other cannot share that POV (e.g., how one sees this reality), especially in the context of switching, then how can switching even be possible?

 

It’s one thing to acknowledge a representation of a different POV, but it’s another to acknowledge whether or not both parties in question can transition in and out of those altered states of awareness. If the tulpa cannot share the first-person POV typical of visualizing this reality of phenomenal events (e.g., via our eyes), then the tulpa cannot shift into that POV while the host shifts into the mental phenomena. Think of it this way: if a tulpa cannot share the POV of the body’s eyes to apply context to this reality, then the metaphorical model of how these POVs are transitioned has its limits in describing access to physical things, in other words, the eyes. If a tulpa cannot, in theory, transition to fixate more on this reality, and become a dominant conscious evaluator of it while the host turns inward, then they would need a new pair of eyes for their own use. But obviously, they’re pinned down to the physicality of the body.

 

The more mental experiences can be whatever the person wants them to be, since those experiences are more malleable than actual physicality. But if qualia in the sense of events in this reality cannot be shared, and yet can be shared in the ‘bigger picture’ sense that you seem to acknowledge as well, then ‘qualia’ is being used in a different context. Which means it’s somehow being used as an umbrella term, and thus the confusion.

 

 

War:

 

With the explanation you provided of things actually being more or less artificial than one another, along with your inference that self-reflection seems probable even by artificial means:

- Even if one sees AI and such going through the motions of looking as if they’re sentient, it doesn’t mean they have conscious awareness exactly like our means of being conscious. If this were the case, then there’s an implication of the hard problems of consciousness being solved to the point where consciousness can be shifted in and out of anything. In other words, just because your conception of self-modification would be deemed organic doesn’t necessarily mean that AI will have the same implication of ‘organic.’

 

- If it could, then it would imply that the matter this AI is limited to has the potential to become conscious (e.g., panpsychism and/or proto-panpsychism). Or at least some kind of subjectivist ideology where there’s a preference of mind over matter. But one still has to acknowledge how non-experiential matter (e.g., the AI) somehow gains experiential properties (e.g., qualia, and other attributes of sentient entities).

 

- However one twists it, the AI will still be considered an example of a p-zombie, because unless humanity has mastery over what instantiates its own means of qualia and consciousness, it’s just people being “woo’d” by the implication of self-modification and other attributes that seem apparent in the AI, while shrugging off whether or not there’s a similar ‘back office’ that instantiates it, or an unconscious totality of processes, etc.

 

 

 

Point of view is not reliant on the preexistence of qualia or consciousness. So the hard problems of consciousness are completely irrelevant to the brain's ability to process multiple perspectives.

 

Point of view is contingent on a conscious evaluator (or evaluators). How does one expect the brain to apply context to this reality without a conscious evaluator? Saying POV isn’t contingent on consciousness and qualia just tramples the whole point of consciousness and qualia being there in the first place. The hard problem of consciousness cannot be shrugged off just by assuming POV is independent of qualia and consciousness.

 

Your continuing discussion of this back office leads me to note that there is a weakness of oversimplification here, beyond just the above. Not only is there an underlying engine which supports the conscious mind, but aspects of how the conscious mind is capable of experience are understood, but do not come out through a flat metaphor of an office space.

 

Yeah, it is a metaphor, and I’m not expecting the metaphor to be the realized phenomenon, given I acknowledged that humanity has yet to master what instantiates the very things that allow us to have qualia and consciousness. It’s just to simply state and acknowledge the ineffable occurrences that would be related to the hard problems of consciousness.

 

The processes of self-modification, trial and error, and self-imaging (self-reflection) are key. Particularly self-reflection, which is the process of creating a symbolic map of the internal components of the self. Sensitivity, the ability to respond to stimulus, also factors in.

 

Yes, those would be part of the many processes that can be chalked up as the back office of the mind, i.e., the unconscious totality of processes that contribute to qualia and consciousness. I’m not trying to push this metaphor beyond its limits into something transcendental. This is just us preaching to the choir over examples of processes, modalities, etc. typical of sentient beings such as us. Those processes you mentioned require a POV, and that POV requires some intertwined occurrence of qualia, consciousness, and whatever instantiates these qualities.

 

 

--Oh yeah. And also, if perspective couldn't be handled separately from qualia, then p-zombies would be an impossible construct. They would malfunction at every turn, making their ability to simulate consciousness non-existent.

 

P-zombies end up being a probable construct because you’re splitting POV from qualia. You assume POV can occur regardless of there being foundations of qualia and consciousness. That seems the perfect example of a p-zombie: something that seems to have a POV, but lacks the conscious awareness to apply context and such, common in sentient beings.

 

It’s that extreme cut between POV and qualia that causes the confusion in this thread. I get that people use dualism as a metaphor, but when one states POV can exist without qualia and consciousness, the very rudiments that allow a POV to even be apparent, then it’s just trampling, or undermining, the vastness of what consciousness and qualia are. That list of perspectives is just several modalities a person can shift into, i.e., altered states of awareness. I agree those modalities can be accessed and transitioned into; I just don’t agree on POV existing without qualia and consciousness. That’s just an example of p-zombies, and I don’t believe such a concept can even be actualized. It can be conceivable (since we have the processes to create that thought experiment in our heads), but it’s just a thought experiment; nothing more, nothing less.


On the AI stuff:

Even if one sees AI and such going through the motions of looking as if they’re sentient, it doesn’t mean they have conscious awareness exactly like our means of being conscious. If this were the case, then there’s an implication of the hard problems of consciousness being solved to the point where consciousness can be shifted in and out of anything.

Hmmm. So, how does consciousness shift in and out of a human brain?

 

If it could, then it would imply that the matter this AI is limited to has the potential to become conscious (e.g., panpsychism and/or proto-panpsychism). Or at least some kind of subjectivist ideology where there’s a preference of mind over matter. But one still has to acknowledge how non-experiential matter (e.g., the AI) somehow gains experiential properties (e.g., qualia, and other attributes of sentient entities).

So, what kind of matter is the human brain made out of? How is this matter isolated from the same problem suffered by the matter of the computer as you demonstrate?

 

However one twists it, the AI will still be considered an example of a p-zombie, because unless humanity has mastery over what instantiates its own means of qualia and consciousness, it’s just people being “woo’d” by the implication of self-modification and other attributes that seem apparent in the AI, while shrugging off whether or not there’s a similar ‘back office’ that instantiates it, or an unconscious totality of processes, etc.

Is it safe to assume anyone is sentient? I would not want to be wooed by the cries of pain and suffering of my torture victims. There is something strangely unsafe-feeling about just assuming that an agent who behaves with passion and asserts their rights is not sentient.

 

I don't understand how the human mind works. By the above logic, we must assume it has no back office.

 

Point of view is contingent on a conscious evaluator(s). How does one expect the brain to apply context of this reality without a conscious evaluator?

Actually, no. Point of view is established by any form of evaluator. AI being a basic example. Unless you are asserting that AI is conscious?

 

...the vastness of what consciousness and qualia are.

That's being poetic, but vastness here is not appropriate even as metaphor. Qualia is an artifact, a side effect of consciousness. All things in the universe would proceed as normal without it. Including cognition.



Interesting. It's clear that when we use qualia, we tend to break it down not only into things it may be, but into things it may not be as well. Instead of answering your replies here, I'll make more threads in the future in which we can actually go into discussion about this. But give me a few days, and feel free to discuss in them like anyone else. Also, the back-office metaphor should be taken literally as just that, a metaphor; but if one doesn't want to believe there are occurrences and such that are unconscious, and what have you, then okay.


  • 2 weeks later...

Just fixed an old link in the draw your tulpas thread, so I was looking at said picture I drew, and it brought to mind the song Lucilyn was talking about. So I listened to it, and had to come back here. I'd actually never listened to it myself before.

 

So, I don't hear it even close to how Lucilyn does and only slightly similar to Tewi, maybe. Describing how exactly it made me feel would be hard, but in words it was something like "Realizing how big of a deal she is". Not gonna try to explain that, but the point is I immediately heard/experienced the song differently from them, too. Interesting on its own, maybe, right?

 

If I had to guess, I'd say how we experienced the song was dependent on our relationship to Tewi (and so "her" song), including her own. How she feels toward/sees herself, how Lucilyn does (she focused more on the song itself and how Tewi felt about it), and how I do (apparently I automatically thought of the song as if describing Tewi and not the other way around). I'd say those line up with how Tewi feels about her pseudo-past, what Lucilyn focuses on just in general (enjoyment, experience, sharing those), and how I feel about Tewi (she's rather important in my life).

 

No argument or question here, just adding something to the thread.

 

 

Extremely related to the thread's subject matter though, I feel like mentioning that I occasionally find myself envisioning scenarios where, in the future, people can record and share their qualia (I never quite used that word though) with each other. As simply as uploading and sharing Youtube videos, if not a little less casual. It still has some sort of hub (or maybe many) with divided subject matter/genre/whatever-have-you, from extreme sports to creating or experiencing music/arts to philosophy/ideas and literally everything else, I guess. I don't think I have any real attachment to the fantasy - I don't want to make it come true, nor do I even necessarily consider it an appropriate activity for humans to partake in, realistically speaking. But nonetheless, ideas like that come to mind every once in a long while, of "recording" experiences so that others may "experience" them later. The exact mechanics of that are questionable too, since you would be experiencing it just as they did, yet it's implied you'd somehow gain something/make your own connections to it. Perhaps afterward? Well, it doesn't really need to make sense. But it's an interesting idea.


