Tuesday, June 28, 2011
Saturday, June 25, 2011
Sunday, June 19, 2011
Logic has spent the greater part of its existence trying to be a priori, meaning that knowledge derived from logic is supposed to require no sensory experience in order to work.
Bertrand Russell was obsessed with logic and with reducing mathematics to logic, thus giving math the status of a priori knowledge.
I think that both logic and math are fundamentally empirical things. In physics, you only know that your math is correct by appeal to experimental data. Similarly, how do we know that two and two equal four? How do we even know that the concepts exist? It FEELS like there is some underlying principle, and certainly, specialized areas of the brain have evolved to handle numbers, so we do have some degree of neuropsychological data to back that idea up.
But I think that it's essentially a multi-faceted illusion. We know that two “exists” because I look at a table and see an apple while seeing a separate apple in the same visual field. Thus, I call this impression “two apples.” Likewise, we have the ability to hypothesize about a surrounding environment for which we do not currently have sensory data. If I see an apple and then turn around, I assume that the apple is still behind me. So I can fill my visual field with apples, count them all, then turn to another pile and continue counting. This ability to assume the existence of objects outside of the visual field is what allows mathematics to function without sensory data.
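The counting argument can be put in very concrete terms. A minimal sketch (the piles here are obviously made up): counting one pile, then turning around and continuing through a second pile, gives the same answer as recounting everything at once, and that agreement is the empirical check behind 2+2=4.

```python
# Two apples in view, two apples assumed to persist behind me.
pile_in_view = ["apple", "apple"]
pile_behind_me = ["apple", "apple"]

# Count the first pile, then turn around and continue counting.
running_count = len(pile_in_view) + len(pile_behind_me)

# The empirical check: recounting the combined pile gives the same answer.
combined = pile_in_view + pile_behind_me
print(running_count, len(combined))  # 4 4
```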
But for math to work, it must come back to 2+2=4. If it doesn't, we can never “know” if it's “correct.”
Likewise with logic. It is not a priori. It is fundamentally linguistic. Our language operates on certain rules that evolved to fulfill the needs of communication, but there are many aspects of language that never evolved rules simply because there was never any need.
Our complex language has a multitude of semantic issues, such as contradictions. For example, I can say “The King of Mexico is a 6-foot-tall woman.”
This sentence works. It doesn't make any sense, but I am able to say it. If we take the words at their ordinary meanings, with no woman secretly masquerading as a king, then a king can't be a woman, and in any case Mexico doesn't have a king.
There are similar issues in the world of logic. One of the most famous is Russell's Paradox, which has to do with sets.
Sets are exactly what they sound like: collections of things. The set of all spoons is a logical construct that includes all spoons everywhere. If we create a set of all things that are not spoons, it includes cell phones, dogs, Mars, and chocolate cake: anything that isn't a spoon.
But what about sets of sets? Let's say, the set of all sets of things with ears. Thus, we have a set that includes the set of rabbits, the set of dogs, the set of people, ANYTHING with ears. All of the groups within the parent set are still separate sets unto themselves, but they are also members of the larger set of sets of things with ears.
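Python happens to model this nesting directly, which makes the idea easy to see. A quick sketch (the animal names are just illustrative; inner sets must be frozen to sit inside another set):

```python
# Each of these is a set unto itself.
rabbits = frozenset({"peter", "roger"})
dogs = frozenset({"rex", "fido"})
people = frozenset({"alice", "bob"})

# The parent set: the set of all sets of things with ears.
sets_of_things_with_ears = {rabbits, dogs, people}

# The inner sets remain separate sets...
print(len(dogs))                         # 2
# ...while also being members of the larger set.
print(dogs in sets_of_things_with_ears)  # True
```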
But let's say we have the set of all sets that don't include themselves. That set would necessarily be included in itself, because it doesn't contain itself... but as soon as we include it, it suddenly does contain itself, and must be excluded again.
A famous practical example is the barber who shaves all of the men in a town who don't shave themselves. If he doesn't shave himself, then he must shave himself; and if he does shave himself, then he mustn't. Either way, contradiction.
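The barber's rule can even be checked mechanically. A minimal sketch: the rule demands that the barber shaves himself exactly when he doesn't, i.e. `b == (not b)`, and a brute-force search over both truth values shows that no consistent answer exists.

```python
# Rule: the barber shaves x if and only if x does not shave himself.
# Applied to the barber himself, that demands: shaves_self == (not shaves_self).
# Try both possible truth values and keep any that satisfy the rule.
consistent = [b for b in (True, False) if b == (not b)]
print(consistent)  # [] -- no consistent assignment; the rule is contradictory
```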
What this does is reveal not problems with logic, but fundamental problems with language, and logic is built on language, not the other way 'round.
Russell's problem is that “the set of all sets” refers to nothing. There is no such thing as a set. A set is an internal hypothetical construct based on the assumption that there are things that we do not currently sense, but could sense if we explored.
So for example, the set of all apples is actually a statement that there are apples that I am not currently seeing, but could see in the future. We verbalize this as the nebulous idea of apples.
Plato tried something similar with his idea of the perfect essence of the apple. Again, he was describing an evolutionary predisposition to believe that a thing I have seen also exists in places that I am not currently seeing. This sort of programming fosters curiosity, since there be apples in them thar hills.
All of logic is linguistic, just as math is linguistic, and language is empirical. “Is and is not” is logically impossible simply because we've never seen something both be and not be at the same time. It sounds impossible to be and not be, but that's because our language itself is based on our empirical experiences.
The best a logician can say is that “be and not be” is an ineffable internal concept that represents something beyond empirical reality. And anything beyond empirical reality is philosophically void, intellectually bankrupt, and wholly useless.
I'll only briefly talk about that. Language as we use it represents an internal world and an external world. When I say “a bear,” I'm referring to the sensory experiences of a bear, and the word “bear” evolved from common usage of the term between people to communicate either the actual persistence of an empirical entity within the visual field, or the idea of a bear.
But if I say “I'm angry,” what I'm communicating is an internal state. The word “angry” did not evolve to communicate this internal state directly, since internal states are necessarily ineffable. I cannot show you what “angry” means to me.
Instead, “I'm angry” actually represents a set of behaviors. When I'm angry, I feel the need to punch things and scream.
I don't actually. I'm not psychotic, but for the sake of argument.
When I see other people punching things and screaming, everyone uses the word “angry” to describe that set of behaviors, and since I feel the urge to act the same way, I assume that their similar, empirically observable behavior represents similar internal sensations.
As such, we have a disconnect between what a word represents and what it means.
I find this absurd. It's as though I believe myself to be scum after finding out that my great-great-grandfather was a serial killer. It has NO effect on me. It does not change me. I am still the "me" that is defined by my actions and words. My past does not define me in the slightest.
This sort of personal or moral essentialism drives me up a wall. People who identify as Irish or black are insane. You are not Irish. You are you. You are whatever you decide of yourself. To subscribe to some socio-historical narrative and then define yourself by it is barking mad!
It forces you to accept that the behaviors of your ancestors are somehow the right behaviors. That you have a responsibility to continue some tradition, regardless of how stupid that tradition may be.
History is an illusion. All that exists is the present. We are defined by our present and our memories, and at any moment we can choose to change. The fact that my Great^5000 Grandmother was chimp-like (remember, she wasn't a chimp; chimps didn't exist yet) means nothing to me.
If anything, it forces me to marvel at the mechanisms in nature that are able to produce me --a GPS-using, latte-making, music-listening, blog-writing marvel-- from the bug-chewing, lice-infested, poop-throwing raw materials of that long-dead ancestor. That's fucking amazing!
I feel downright special knowing that I am the intellectual zenith of evolution on this planet. I am not separate from it. I sprang from it. If you want a spiritual sense of one-ness with the universe, there it is! I am not somehow fundamentally separate. I have no transcendent soul. I am not special in comparison to other animals because some nebulously defined creator made me.
I am a gear in an astounding machine. I am made from the same stuff as my dog or my girlfriend. They and I are one. And just as I can choose to change, my body and the genetics that comprise me are continuing to change over vast time scales. That doesn't diminish me. My Great^5000 Grandmother was a chimp-like thing, but I am not. And that's all that matters.
Saturday, June 18, 2011
I hate tablets because all of the tablet fanatics out there, perhaps intoxicated by Steve Jobs's exaltations, believe that tablets will one day be the predominant form of computing. At least for now, this is total nonsense.
Every time you talk to a developer, they talk about one of two things: "specific considerations" of tablet development, if they're careful about the words they use; or "limitations" of tablet development, if they aren't. I prefer to use the term "limitations" of the tablet platform. Essentially, until we have neural connections with our computers, tablets are a much more limited form of communication between user and device. That is undeniable. There are simply fewer channels of interaction.
Moreover, the media community and even the tablet developers continue to make those limitations even worse! First was Apple's exclusion of Flash, which was a real bummer, since the iPad is best at media consumption, and on the web that means Flash. Then, it was the media companies trying to block pretty much everything from mobile browsers. Even on Android, you can't watch music videos on YouTube. You can't watch Hulu.
That is in no way a comparable experience. It's not even different. It's inferior.
When Windows runs on tablets, I'll bite. Then a tablet will be an actual computer, not a toy.