A happy little solipsism
I’ve mentioned a few times that I will discuss the concept of Enlightenment in some depth. That will require more than one post. Here’s the first one.
I don’t think of Enlightenment as an end-state; I think of it as a process of reaching higher and higher levels of awareness. I also tend to think of it in evolutionary terms: a human’s awareness is in some sense more expansive than a frog’s awareness. That’s not to say that anything is “bad” or “wrong” about the frog; frogs are fine just the way they are. It’s just an observation that the human has more of something than the frog.
When I say “evolution”, I mean it in a very broad sense that includes but is not limited to the Darwinian sense of the word. Someone who has gained a lot of wisdom from a life of interesting experiences is more “evolved” than someone who has not had those opportunities, even if they are at the same point of Darwinian evolution. That’s why the name of my website refers to “evolving”.
One of the first big milestones of personal evolution is growing out of something called “naïve realism”: the assumption that our senses provide us with direct awareness of objects as they “really” are.
Despite the name, naïve realism is openly endorsed by numerous modern philosophers and scientists. I’ve heard it passionately argued.
The belief that nothing exists outside your own mind — surely there must be some way of demonstrating that it was false? Had it not been exposed long ago as a fallacy? There was even a name for it, which he had forgotten. A faint smile twitched the corners of O’Brien’s mouth as he looked down at him.
‘I told you, Winston,’ he said, ‘that metaphysics is not your strong point. The word you are trying to think of is solipsism.’
George Orwell (Eric Blair), 1984
This is a counterintuitive exchange for Orwell to have written, given that in the real world, Inner Party members think the same way Winston Smith does. The fact that they sometimes impose “reality bubbles” on the rest of us does not imply that they deny there is also some “objective reality” “outside” of our minds. In fact, they sometimes forget their own duplicity and get caught in their own reality bubbles, finding it difficult to maintain separate mental models of reality: one for themselves, and one for their dupes. Winston Smith, and by extension Eric Blair and a lot of other intellectuals, are not the only ones who don’t understand metaphysics:
- It’s not a “belief” that nothing exists outside your own mind; it’s lack of any means to know one way or the other.
- No one has ever come up with a way to demonstrate that anything inside our minds corresponds to anything outside of our minds.
- No one has ever come up with a way to demonstrate that there is any “outside of our minds”.
- René Descartes tried, and failed. “I think, therefore I am” is an invalid inference. What is this “I”, and where did it come from? The correct inference is “Thinking is going on, therefore thinking is going on”. You can stop right there.
- This has nothing to do with “metaphysics”.
- What’s the problem with a solipsism?! You don’t have to assume that there is any reality “outside of your mind” in order to be perfectly functional and happy. Quite the opposite: you can become super-functional by subscribing to a purely pragmatic view of reality whereby success, including and in particular Darwinian biological success, rather than certainty, is the goal.
Most of the time, your experiences match your expectations closely enough that you’re able to live your life as well as you have up to now. Not knowing the nature of “ultimate reality” has had no impact on your life at all. Solipsism is not a problem!
Prof. Donald Hoffman goes so far as to claim that evolutionary selective pressure favors not those minds with the most accurate representation of reality, but instead, those with the most successful representation of reality, which is probably a function of variables like efficiency. You can’t fit the complexity of the Kosmos into your brain, so you make a lot of functional over-simplifications. Some of them might be grossly inaccurate, but to the extent they get the job done, it’s pointless to care other than to be aware of the possibility in order to change or discard them when they stop working.
Prof. Hoffman uses the analogy of a software user interface: you drag the virtual folder to the virtual trash can. But there is no physical folder and no physical trash can; the software is all electronic circuits that don’t look like folders or garbage cans.
I have a feeling that religious mythology serves to create mental models that are more functional than the default, without regard to whether or not they correspond to anything “outside” of the practitioner’s mind. By “functional”, I mean that they help process experiences, typically the bad ones like conflict, suffering, and death. This is why Muslims would still be winning the “Clash of Civilizations” even without the help of our profoundly atheistic ruling class. It’s not so much that they’re winning as simply displacing a civilization that is dysfunctional and dying after discarding some critical “cultural technology”.
The problems start when your mental model doesn’t help you process your experiences, because your experiences take you completely by surprise, and usually an unpleasant one at that.
This happens commonly because of
- incomplete data
- inaccurate data
- translation errors in the process of interpreting what the data mean
The fact that our experiences do sometimes get out of sync with our mental models of reality seems to imply that something other than our own mental models is driving the process, but that assumption adds nothing to our knowledge of what that “something” might be. “There is this thing called a ‘hocnonest’. I don’t know what it looks like, sounds like, or anything at all about it, other than that it exists.” In other words, we’re still trapped in a solipsism. All we really know is how well our mental models have served us up to now. What we’re really measuring them against is our experiences, or “new data”.
Translation is the process of turning raw data into something meaningful to you, ultimately, qualia (that is, subjective conscious experiences, like red, or the sound of middle C on a piano). For example, if you look at an apple, you’re not experiencing the apple directly; you experience neurons reacting to a chain of events that started when photons were emitted by the apple and some of them reached your eyes. Then there is an extremely complex translation process from that into neural impulses. There are possibilities for mistakes and data loss along the way.
Someone might reasonably ask “ah…but how do you know that?” The answer is that I don’t; that’s my mental model of how the process works based on other people’s experiments. It’s not particularly accurate; a lot of details are missing, and I might have made some mistakes. It’s good enough to serve my purposes, until the day I get unexpected feedback.
Imagine trying to get home again using a defective map, while being absolutely 100% certain that the map is a perfect representation of the experiences you will have as you try to reach your destination. You keep running into the same obstacles and falling off the same cliffs, no matter how many times you retrace your path and try over again.
This is how people
- make bad decisions that mess up their own and other people’s lives
- never learn from their bad decisions, because they never understood what caused the problem
- get stuck in their problems
- develop personality disorders like borderline personality disorder or narcissistic personality disorder
Ask yourself why people fall in love with their beliefs, or get angry when their beliefs are challenged.
If you assume that your mental model of reality IS reality, then it never occurs to you that it could be mistaken. So, whenever you find evidence that doesn’t fit your model, what do you do? You discard the evidence!
That’s called “confirmation bias”.
This is how you get people who have beliefs that aren’t based on any evidence at all. They might have formed a belief because of something a high-status person told them, or because of purely conditioned associations. When they experience contrary evidence, they discard the evidence, rather than the belief, out of hand: “facts don’t matter”. Some people never realize that their mental models of reality are defective because they don’t care.
I often hear them expressing sentiments like “I want this to be true”, after hearing a lurid and obviously fictional story that validates one of their political beliefs and offers them a pretext to do something violent. These are the ones who will go so far as to manufacture “evidence” in the form of hoaxes in order to try to “prove” what they already want to believe. I don’t quite understand the psychology of this phenomenon, but it seems to have something to do with “This is true. I can keep it true by creating evidence for it and making other people believe it too”.
I used to call this “magical thinking”, but I gave up on that expression because many people use it to mean something else, which I don’t want to confuse with my intended meaning.
This habit of cherry-picking data according to whether they validate cherished beliefs is how you get all sorts of irrational behaviors, even without any symptoms of mental illness:
- ignore evidence that the building you are in is burning down and you need to evacuate. Instead, wait for an authority figure to tell you so
- be aware of evidence that a sexual predator is sexually abusing children, but do nothing about it, or even try to help hide the evidence, even without any personal stake in the situation
- join a cult and rationalize why the cult leader’s prophecies never happen as predicted
- remain in an abusive relationship with someone who has a personality disorder or is mentally ill, after buying into the same mind-space
- be needy for the approval of others, especially high-status individuals
- consciously or unconsciously rig an experiment or falsify the data
- believe whatever the government and media claim without requiring confirmation from independently obtained evidence, even with ready access to such evidence.
These are all common behaviors. It’s not uncommon for firemen to find the charred remains of nightclub and dance-hall patrons still positioned around tables next to their drinks, because nobody told them they needed to evacuate. It’s not unusual for investigations into child sexual exploitation to discover numerous people who were aware of what was going on but either did nothing or even took steps to cover it up. It’s not that they wanted children to be sexually abused; it’s that they literally think that if there is a discrepancy between what they believe to be reality and the evidence, then the evidence is wrong. It’s as if they believe that draining the mercury out of a barometer will make it start raining. In cults, usually only one key central figure has a “dark triad” personality; most cult members have perfectly normal psychology.
Some professional scientists are among the worst offenders of all! They tend to be over-confident about themselves and their beliefs. They’ll go through all the motions of removing their own biases with double-blind experiments, then blatantly rig the experiment. Scientists are some of the favorite subjects for “mentalists”, entertainers who use cognitive illusions to mystify their audiences; the scientists assume they aren’t prone to cognitive illusions, so they can’t figure out how the mentalist keeps tricking them over and over again.
Edward Bernays had the idea of using scientific authority to promote propaganda. And since scientists are not particularly resistant to cognitive illusions, they make perfect dupes for tricking other dupes.
Sometimes the wisest thing you can say is “I don’t know”. Sometimes you get ahead by unlearning information and discarding beliefs.
Whenever you have a problem, there is a process you can go through to either resolve it quickly and efficiently, or realize that your time and attention would be better spent elsewhere, instead of getting stuck in it. It makes you more resourceful and more effective. It also makes you more persuasive and helpful within your social circle, because you can show them how to solve their own problems instead of telling them, and then they’re more likely to see it for themselves instead of rejecting your advice out-of-hand.
There’s another process that you can go through to start responding more to the evidence of your senses, and less to purely conditioned associations, so that you can let go of beliefs that keep resulting in unpleasant surprises. It takes a little more work than the other one, but accomplishing it is another milestone on the way to Enlightenment.
Subscribers will have a chance to learn how to perform these processes to increase their resourcefulness and ability to adapt to new situations.