A Captivating Journey Through Psychiatry, Philosophy, and Psychology
#Psychiatry #Philosophy #Psychology #MentalHealth #Hippocrates #Plato #Aristotle #Stoicism #Epicurus #Sociology #Durkheim #Marx #Weber #Ethics #PhilosophicalCounseling #HistoryOfThought #MindAndBody #HumanExperience #CriticalThinking #PhilosophyOfMind #MentalWellBeing
Can you think of an emotion or mental state where someone will tend to avoid pleasure? Is there such a thing as a kind of ascetic drive?
I'm thinking there is something missing: we have emotions that drive one both away from (e.g. fear, pain) and toward (e.g. anger) negative stimuli, but I can only think of emotions or mental states that drive us toward positive stimuli (e.g. desire). Maybe satiation counts? What else could?
#philosophy #philosophyofmind #psychology
@psychology @philosophy
The AI community displays "womb envy":
AI researchers often confuse computation with consciousness—but they are not the same. Computation is medium-independent—it can run on paper, a calculator, or a computer. But consciousness? It isn’t just an algorithm. It only appears in biological systems, meaning it depends on life itself.
AI and Producing Consciousness | Dr. Bernardo Kastrup https://www.youtube.com/watch?v=zMaSxj60JAw
#AI #Consciousness #ArtificialIntelligence #MindVsMachine #PhilosophyOfMind
Reducing felt experience requires not preemptively dismissing the solutions
Annaka Harris has a new audiobook out which she is promoting. I haven’t listened to it, but based on the interviews and spots like the one below, it appears that she’s doubling down on the conclusion she reached in her book from a few years ago: that consciousness is fundamental and pervasive.
https://www.youtube.com/watch?v=nP2swgDVl5M
Harris starts off by discussing the profound mystery of consciousness. But she clarifies that she isn’t thinking about higher order thought, like the kind in humans, but something more basic: “felt experience.” She takes this to be something that can exist without thought, and so discusses the possibility of it existing in plants and other organisms that don’t trigger most people’s intuitions of a fellow consciousness.
As I’ve noted in a couple of recent posts, the hard problem of consciousness seems specific to a particular theory of consciousness, that of fundamental consciousness, the idea that manifest conscious experience is exactly what it seems and nothing else, that there is no appearance / reality distinction or hidden complexities. I’m sure Harris, like so many others, will argue that there’s no choice but to accept fundamental consciousness. How else to explain the mystery?
But like David Chalmers and many others, she starts off by dismissing the possible solution, “higher order” processing. Without that, felt experience, the feelings of conscious experience, do look simple and irreducible. But that’s only because we’ve chosen to isolate something that didn’t evolve to be isolated, that has a functional role to play in organisms.
Harris’ example of the decisions vines make about where to grow is a good one. In most biological descriptions, this behavior is automatic, done without any volition. She wonders whether it might nonetheless involve felt experience. But she doesn’t seem to wonder if similar behavior in a Roomba, self driving car, or thermostat involves similar types of feelings. (Some panpsychists do admit that their view implies experience in these types of systems, but in my experience most resist it.)
Many animal researchers have similar intuitions: that the observable behavioral reactions in relatively simple animals must involve feeling, since similar reactions in us are accompanied by them (at least in healthy, mentally complete humans). Of course, similar to most panpsychists, they typically resist the implication for machines, often gesturing at some unknown biological ingredient or principle that will distinguish the systems they want to credit with feelings from those they don’t.
My take is that the solution is to reject the theory of fundamental consciousness. What’s the alternative? A reductive theory. But how do we reduce felt experience? Remember, to do a true reduction, the phenomenon must be broken down into components that are not that phenomenon. If anywhere in the description we have to include the overall phenomenon itself, we’ve failed.
Along those lines, I think part of the explanation of what feelings are is that they are composed of automatic reactions that can be either allowed or overridden. So if an animal sees a predator and always automatically reacts by running away, that in and of itself isn’t evidence of fear. On the other hand, if sometimes the animal can override its impulse to run away, maybe because there’s food nearby and it judges the risk to be worth it, then we have an animal capable of feeling fear.
So a feeling is a perception, a prediction, of an impulse which an organism uses in its reasoning to decide whether to inhibit or indulge the impulse. This means the higher order thinking Harris immediately excludes from her consideration is actually part of the answer. That answer, incidentally, also explains why we evolved feelings.
An organism is generally only going to have feelings if they provide a survival advantage, but that advantage only exists if they have some reasoning aspect to make use of it. Note that this reasoning aspect doesn’t have to be as sophisticated as what happens in humans, or even mammals or birds necessarily, although the sophistication makes it easier to detect. It just needs to be present in some incipient form to act as one endpoint in the relationship between it and the impulse, the relationship that we refer to as a “feeling”.
This requirement for a minimal level of reasoning seems to rule out felt experience in simple animals, plants, robots, and thermostats. It also gives us an idea of what a technological system would need in order to have it: a system of automatic reactions that can be optionally overridden by other parts of the system simulating possible scenarios, even if only a second or two into the future.
Figuring out how to do this is not trivial. None of the current systems people wonder about are capable of it. But while it’s hard, it’s not the utter intractability of the hard problem of consciousness. Once we dismiss fundamental consciousness, that problem seems to no longer exist.
Unless of course I’m missing something?
The Hidden Trap of the Mind: How to Escape Illusion and Find True Freedom
https://youtu.be/BN1dvdlxix0
#SpiritualAwakening #MindTrap #BreakFree #Consciousness #IllusionOfSelf #TrueFreedom #AwakeningJourney #SelfRealization #OvercomeSuffering #LetGoOfEgo #HigherAwareness #PhilosophyOfMind #SpiritualGrowth #InnerPeace #EscapingIllusion #Mindfulness #WisdomPath #EnlightenmentNow #DetachAndAwaken #FreedomFromMind
Philosophy of mind – Psychoanalysis, behaviourism and the science of the mind
#philosophyofmind #psychoanalysis #behaviourism #sigmundfreud #gilbertryle #bfskinner #dualism #ReneDescartes #mind #philosophy #psychology #historyofphilosophy
https://philosophyindefinitely.wordpress.com/2020/08/04/psychoanalysis-behaviourism-and-the-science-of-the-mind/
Where exactly are our thoughts? After all, if the brain is opened, nothing even resembling external objects is seen; some investigation must therefore be done into what mind really is...
#philosophy #philosophyofmind #mind #brain #mindbody #reason #Perception #intentionality #Qualia #senseperception
https://philosophyindefinitely.wordpress.com/2020/07/24/mind-and-body/
Block (1995): Consciousness isn’t just information access—it’s experience. Dehaene et al. (2021) argue that machines might achieve “global workspace” awareness. But without qualia, is it consciousness—or a convincing simulation? Which matters more: functionality or feeling?
#Consciousness #AI-ethics #PhilosophyOfMind
https://library.oapen.org/bitstream/handle/20.500.12657/47279/1/9783030541736.