Friday, March 30, 2012

Mass Effect 3 Disappointment Was Expected

As this article, weirdly on LiveScience, shows, some people seem to think that video games are a new form of storytelling. They are not. They do not "free" storytelling from its shackles, because storytelling is inherently a one-way street. If the person listening to the story becomes involved, they must be involved within limits, or else it turns into role-playing... which is why role-playing games are called role-playing games. If the ending changes, it is not the same story.

World of Warcraft is not a story. Yes, there is an overarching story that drives many of the major events, but millions of players have nothing to do with that. They spend all of their time wandering around the open world, "writing" their own story. Yes, there is a narrative to it, but a story must have a purpose; it must have some end. And in games, that end must be pre-determined. Even in games with multiple endings, there is always one ending that counts as the true ending, usually the one that takes the longest to achieve and requires matching a set number of variables in the game. Unless you count Silent Hill's "dog" ending.

For example, if we wanted a game to be a real interactive story, there would never be anything like a "game over" state. Every ending would be a true ending. But in many games, you can die within the first thirty seconds of game time. To call that a legitimate ending would be absurd. There must be limits. In a real story, there are no limits, only a creative vision. It is obvious why this is so: a player can easily exceed the limits of a created world, because any created world must necessarily be finite.

Games will never be pure stories because they must stay within the limits of game rules. For example, I love Halo. I read the Halo novels like The Fall of Reach. While far from sci-fi genius, they were decent fluff. The problem is that the stories didn't make ANY sense in relation to the game. The novels' description of the super-soldiers (one of whom you play in the game) is so extreme that they would have been an unstoppable juggernaut. Strong enough to lift cars. Able to run at 60mph.

Perhaps a more famous example, at least in the world of video games, is that of Final Fantasy VII. This was an era-defining game. It is frequently held up as the first truly cinematic attempt at making a video game. It also includes one of the most famous deaths in all of video-gamedom: Aeris. Only, according to the rules set out by the game, it doesn't make sense. On any given play-through, Aeris could have died dozens of times; you could always resurrect her with a spell.

Why should her scripted death be any different? Because it was required by the story. Some games try to get around this by making in-game deaths permanent; one that sticks in my mind is the (old) game Gain Ground. But there, permanent death was part of the game. Everyone understood the rules because they were communicated explicitly to the player. In a narrative game, if deaths are always permanent, the player is left in a state of confusion. Should they restart? Is this death avoidable? Is a death at this point necessary?

You'd have crazed players replaying the same part of a game over and over in an attempt to save everyone's life. Conversely, you'd have players who charge through a game with reckless abandon and are left with no one alive, crippling the story's possibilities in the other direction.

No, I'm sorry. Games are and always will be an imperfect story medium. They are no more liberating from the shackles of story requirements than Choose Your Own Adventure books.

Granted, there are advantages to this. A game should not be seen as a story. It should be seen as a game with rules, where certain narrative bits are essentially rewards for completing tasks. To go back to the Silent Hill example, that dog ending doesn't make any sense. It's absurd beyond reason. And that's great! A real story would be bound by the limitation of having only a single ending, which in terms of the story isn't much of a limitation, but is still a limitation as regards entertainment. In a Choose Your Own Adventure sort of concept, the author can write tons of endings, some of which make no sense, simply because it's funny.

Working toward surprises, hidden rewards, and undiscovered narrative elements is all part of experiencing a set narrative more fully. For example, people love to discover unfinished versions of books or movies. Deleted scenes, unfinished scripts: all of them are gobbled up by dedicated fans. They do this because it is the exact same process as going through a game and finding hidden bits of the story. It is a narrative that isn't simply delivered. It is a narrative discovered. That is what makes games both less and more than a book or a movie.

Wednesday, March 28, 2012

A Second Thought On The New iPad

I was reading over a few reviews of the New iPad, and many of them share a similar sentiment: the new iPad is cool and all, but is, in many ways, a "meh" sort of product. Why is it "meh"? Because it doesn't blow the earlier versions out of the water. The reviewers are actually upset that the new version of a product doesn't render the old version obsolete and useless.

Seriously? Maybe I'm a total curmudgeon. Maybe I'm old and cantankerous long beyond my years. But this just seems absurd to me. These reviewers actually desire the need to buy a new, expensive toy every year. I'm all for consumerism; after all, it drives the American economy. But this just seems ridiculous to me.

I love PCs because it takes forever for a PC to become obsolete. I've been rocking the same laptop for four years, and many parts of my desktop are over a decade old. I very much enjoy indulging in the newest bits of technology (I was a serious, overclockin', water coolin', SLI'n PC gamer for awhile), but when an industry is predicated on people flipping out over the next shiny thing every year, and they get angry when you don't provide said flip-out-worthy shiny thing, that industry is broken.

No wonder you can't actually get anything done on an iPad. It's not meant to have anything done on it! It's meant to be a casual distraction. And with iPad versions having a lifespan similar to fruit flies, serious developers aren't given a stable-enough platform on which to do things. This might help to explain why the majority of both paid and free apps on the App Store are games.

I don't mean to be a cranky jerk. I don't mean to rain on people's parades. Man, if I wanted to go after conspicuous consumption, the iPad is a pretty minor offense. I think it irks me because computers are so great. They are a democratizing force. And instead of seeing computers heralded for their potential, they're being used to sell wan distractions to a bleary-eyed populace. Instead of the people who worked to turn computers into a foundation of modern society, it's Steve Jobs, a man who sold "magical" stuff, who gets candlelight vigils.

The iPad could be great, but in the mad rush to sell simple bits of wow to as many people as possible, and to have an annual orgasm over the newest iProduct, it's not happening. I don't want to be hit with "wow" every year. I want a product that does something I will want it to do for me, every day, for the next ten years.

Sunday, March 25, 2012

My Experience With The New iPad (3)

I'm not a big fan of tablets. I think that the "post-PC" era isn't post-anything. I don't like Apple. Truly, I am a massive curmudgeon. As such, my following, short criticisms of the iPad should be seen through that lens. Still, I hope to explain myself.

I got a chance to play with a new iPad today. The beefiest version of it, loaded with LTE internet access. As is always the case with Apple products, it felt nice to hold. It was firm, with an excellent tactile quality to it that absolutely every other company on the planet lacks.

The screen is very cool. At normal holding distances, the resolution gels in such a way as to make it a perfectly-solid, moving image. Colors and contrast are the best that I've ever seen out of an LCD. In fact, I loved the screen so much, I wish that it was slightly bigger and attached to a laptop.

Other than that, the thing that struck me as I caressed this beautiful thing, and as I swiped around and jumped in and out of programs, is that it is a toy. Pure and simple, it is a toy. I see potential for it to be used for other things (a medical environment really sticks in my mind), but other than potential, it is an expensive toy.

I could never imagine writing on this. Doing graphics work on it. Editing photos. Creating a presentation. Doing deep, serious research. In fact, aside from casual media consumption, I could not see myself using this for much of anything.

That's not to say that I won't ever. The platform works. The platform is popular. With those two variables accounted for, cool things will undoubtedly happen in the future. For example, whether or not Microsoft, Sony, and Nintendo want to admit it, I think that the New iPad, more than anything else I have held, screams loud and clear that this is the future of gaming.

Yes, we will always have dedicated gaming systems, but they will be a comparatively niche market. The PlayStation Vita, Nintendo 3DS, and any future home gaming systems will need to make massive strides in every element of their business model to compete with this.

This again confirms my belief that tablets are not the future. The argument that they are is essentially a marketing pitch, fueled by magazine writers searching for something, anything, to write about. This damned iPad cost nearly $1,000 after all was said and done, and what it lets me do is play old PlayStation games and browse the web. Both are things that I can do on a $400 laptop... along with much more.

I understand that tablets can do many things that laptops aren't terribly good at. They conserve power. They turn on instantly. They can be thrown into a bag more easily. But even then, with a cell phone that sports a 4.5" screen, I already have a tablet. The always-with-me computer that acts as my clock, planner, navigator, and impromptu gaming device is already with me. I don't need another one.

I wasn't impressed with the iPad, the iPad 2, or any of the two million Android tablets littering the market. They don't feel like a step forward to me; they feel like a step back. A step back to an era when computers were obsolete two days after you bought them and everything about their use was a concession of some sort. I don't want that. It took us a long time to get our computers to the point they're at today. The desperate need for the next shiny thing from Apple distracts people from the cheap, useful, powerful tools that modern computers have become. They are epic, life-changing tools. Tablets are not.

And why get up in arms? Why not let people spend their money as they see fit? Because we are already carrying more personal debt than any other country on Earth. We are already consuming more resources than any other country on Earth. It is annoying to see an entire industry geared up to seduce the market with yet another product that it doesn't need, and, if sales of Android tablets are any indication, doesn't even really want.

Thursday, March 22, 2012

Support The Middle Class

An economy is predicated on the strength of its middle class. By middle, I mean those individuals in the 40th to 60th percentile of income.

I have a decent economics background, but I don't claim to have incredibly deep knowledge. But for the purposes of this post, that doesn't matter. A focus on the middle class is something that pretty much all mainstream economists agree on. That is why you have both parties talking about supporting the "working man," or "salt of the Earth," or the "Everyman," or "John Q. Public," or any other such ridiculous euphemism for the 40-60.

The problem is that we know full well what kinds of legislation help the 40-60. We can find examples in dozens of first- and second-world nations: unions, labor protections, cheap education, low income inequality, etc. These are known things, and yet huge debates somehow rage in the public sphere. Indeed, if not for the recent hay-making about female contraception, this would undoubtedly be the primary talking point in contemporary politics.

Unfortunately, the debate actually has nothing to do with what is ostensibly being debated. It is predicated on unspoken, underlying assumptions about the state of the world. The argument is actually entirely concerned with value judgments about segments of the population, and you can see it easily in the toxic words used to describe the "enemy."

Take medical care, for example. The actual debate is about whether people deserve to die or not. The conservative viewpoint is that if you do not have the money to pay for medicine, you do not deserve the medicine. The progressive viewpoint is that everyone deserves medicine. There are major logical problems with both viewpoints.

The progressive viewpoint cannot answer the million-dollar pill problem. Basically, if a pill is invented that guarantees an extra year of life, but that pill costs one million dollars, who gets it? Obviously, not everyone can. There is no answer in the progressive viewpoint. The conservative viewpoint handles this easily: the ones who get the pill are the ones who can afford it. No judgments. No morals. If you've got the money, you get the pill.

But outside of raw hypotheticals, the conservative viewpoint isn't very strong. It says that people who cannot afford medication deserve to die if their condition is life-threatening. That isn't a very easy sell to an emotional public. It's actually rather heartless.

So instead of stating their actual viewpoints, conservatives couch their arguments in dodge words. They try to argue that their plans will offer more medical care. Their plans will grow jobs. Thus we have the now-famous, and incomprehensibly ridiculous, statement from Governor Mitch Daniels in his response to the 2012 State of the Union Address.

"We do not accept that ours will ever be a nation of haves and have nots; we must always be a nation of haves and soon to haves."
- Mitch Daniels


Statements like this reveal either abject cluelessness or intellectual dishonesty of a Brobdingnagian level. Their ideas will achieve none of the things that they claim. And of course they won't; no one thinks that they would. But the argument isn't actually about that. It's about fueling ideology.

And that isn't to say that progressives aren't frequently just as ideological as the conservatives. Frequently, they are even more so, rejecting evidence in favor of an idealized world view. And don't even get me started on libertarians/communists. But that's beside the point.

I was sent on this tear by two recent articles: one discussing the rise of the Mexican middle class, and how the border wars near the US are as alien to cosmopolitan Mexico as they are to SoHo, and a second discussing how China now outpaces the United States in smartphone sales and activations. Why is this happening?

There is only one reason: we are sending all of our manufacturing to these two places.

Manufacturing is critical because it provides an economic base that doesn't require pre-investment on the part of employees. You show up, get the job, and learn as you go. Degree-requiring positions, which I argue frequently don't even need the degree, require a large investment on the part of the applicant. We cannot afford to lose an economic base that requires nothing more than two hands and a desire to learn.

As education gets more expensive, the need to have an extant support system to see you through the necessary schools becomes ever more important. In our "idea" economy, we are speeding toward a system where it will be nearly impossible to rise from the lower economic classes to the higher ones.

Furthermore, manufacturing is important because ideas, in the "idea economy," are an infinite good. Once created, they are everywhere. The creation of an idea has value, but the idea itself does not. The idea must then be attached to something that does have inherent value, and that means physical goods. Value creation, real value creation that conjures value from the ether, requires making something non-corporeal, corporeal.

As such, we take time and physical labor, two non-corporeal things, and apply them to corporeal materials that have a set value; the rearrangement of those materials injects the value of the work and time into them. Raw materials are made more valuable than they were alone, thus creating genuine, palpable value that wasn't there previously, thus growing the economy.

---

There are three pillars to a strong society, and they have existed since the beginning of civilization: the market, the welfare state, and the educational system. There is a fourth pillar, the military, but it is concerned not with the internal workings of the state but with other states and possible threats.

The system works by generating money and value in the free market, which then fuels the welfare state, which gives people the foundation necessary to aspire to education, which then fosters ideas and growth in the free market.

We suck at all three pillars. Our market is wildly unfair, our welfare system is broken and overly expensive, and our educational system deprives those in the greatest need and prices the majority of people out of higher education.

An unfair market is not necessarily bad. It is supposed to be wild and woolly. The problem is that when the market becomes unstable and unregulated, it can put the stability of the entire system at risk. In a perfect cycle, this would be a small risk; even if the free market causes the collapse of a business, the welfare state is there to provide for those ejected into poverty. They can find their feet and head back out, knowing that the state has their back, as it were. Our current system is generating poverty1 and leaving the newly poor with little protection.

The welfare state is perfect when it provides all elements of basic life to all people in a society. Three meals a day, a fixed address, and access to cleaning and grooming. If the farthest that people can fall is a dorm room with three meals a day, then they can always get back up again. We aren't doing this. We half-ass everything. Instead of developing and providing food directly to people, we use the inefficient food stamp programs.2

Instead of providing to everyone, we kinda'-sorta' provide to families and children, but then rip our support away at arbitrary times, thus negating all of the benefits that we provided earlier. Our welfare state shoots itself in the foot constantly.

Our educational pillar isn't as hosed as the other parts. Our public education system is actually quite good and our universities and colleges are the best in the world. The situation for particular demographics is awful, though, and things are getting worse.

Those who need the education the most are getting the least. As our society becomes more delineated between the haves and the have-nots, money is flowing out of schools with large populations of underprivileged kids. When budget cuts need to happen, the schools are the first to feel the pain. Scientifically proven programs like Head Start get gutted as step one. These issues matter little for wealthy schools and for families that send their children to private institutions. But for everyone else, things are getting worse.

Again, the economic arguments that I am making are pretty widely accepted. Unless you are an Austrian School lunatic like Ron Paul, these are things that are understood. Yet we have legions of people willing to argue their heads off that these concepts are completely mistaken. Those economists with their fancy degrees don't know shit! But what can one do? Again, the wellspring of this resistance is not logical. There is no arguing.

What really blows my mind is the blockheadedness of opponents. "It's my money!" they yell, without any comprehension about what money is. "Taxes strangle business!" is another popular line.

Let's assume that these people are correct. They are "job creators" and/or the "productive class." What do they think allows them to be job creators and/or productive? The hoi polloi. The legion of socioeconomic bricks that make up the vast majority of the economy. By strengthening them, you become stronger.

Everyone benefits when the working class is made stronger and richer. The economy is more stable and more easily absorbs the fluctuations inherent to a free market. By resisting support for the working class, we are shooting ourselves in our collective foot. We are knee-capping our economy just as competitors are rising with an eye on our crown.

But, yet again, this all has little to do with logic. It has to do with the poor being scum, trash, unworthy of love. And all of their pain, their loss, their suffering, and their want is perfectly deserved. Deserved because how could it not be? Any other world would be unimaginable.


--------------------------------------------

1: According to the US census, the poverty rate plunged by half from 1959 to 1972, then leveled off. We saw spikes during the 1980s and early 1990s. The poverty rate in 2010 was the highest since 1993, and it's continuing to go up. Along with this, income disparity is the highest in the Western world (we're just below Cameroon), a majority of bankruptcies are triggered by medical issues, and as the working class earns less, corporations earn record profits.


2: I don't mean inefficient in the way that conservative pundits mean it. As the system is designed, it works very well. I mean that all programs rely on people going to for-profit businesses to buy food with government money. That means that a percentage of tax dollars is going directly into the pockets of grocery stores instead of into the stomachs of the people who need the food.


A better system is government-purchased food, bought for cheap in bulk, and then distributed free to anyone who wants it. I argue that this would result in growth for grocery stores since people could get simple food for free and save their money for the more elaborate and expensive foods available at private companies.

Sunday, March 11, 2012

Southern Poverty Law Center Addresses Men's Rights

The Southern Poverty Law Center, famous for studying and keeping tabs on hate groups, has published a detailed analysis of the garbage being spewed by the "Men's Rights" movement, such as it is. Almost needless to say, their thoughts are not kind.

I feel vindicated by this, given my experiences with the men's rights movement after my first videos and posts on the subject. I was subjected to the most awful comments of my online writing career. Nothing else I have written has elicited a response of such a nature. Not even close.

These men, and a few women, are angry. And for someone who is used to listening to generally level-headed people make attempts at cogent conversation, the vitriol that I received was jarring. It hurt. For a long time, I tried to change minds. I cared what these people thought of me.

I began to stop caring about their comments after it hit me that they are, by and large, no better than racists. While I will not tolerate a white supremacist, I don't pay them much emotional mind. I don't care what they think; I only care insofar as those thoughts may trigger behavior.

In a not-too-distant past, I would have loathed these people. Now, I simply feel sad. What strange history caused them to become this way? How terrible the world they inhabit must be. I don't mean to downplay their awfulness. They should be attacked (not physically, obviously), argued with, belittled, and otherwise reduced. Their hate cannot be allowed to perpetuate. But at the same time, responding with hate does little to help us. It doesn't help us advance our own perspective on the human condition, which needs to be regarded dispassionately, even when it turns so dark.

Friday, March 09, 2012

It Is Not A Post-PC World

The seemingly endless talk about the "post-PC" era is beginning to drive me up a wall. There is no such thing. It reminds me of the hollow buzzwords thrown about around "cloud" computing, even though EVERY idea in the "cloud" has existed, and in many cases been implemented, for a couple of decades. All that was missing was the bandwidth.

Likewise, I think that "post-PC" is a buzzword: the technology might be changing shape, but it is still the same technology. It's the same protocols and the same concepts, all simply shoehorned into a different interface. It may sound like I am belittling the advances that allowed touch to work, of which there are many, but I am not. I am saying that the advances had less to do with paradigm and more to do with form.

For example: touch and tablets. The iPhone was the first fully-touch interface that worked. Was there something paradigmatically different about it? No! Windows Mobile, Palm, and Symbian all could have done something similar, even on older technology; they simply couldn't see past their old interface ideas and look at the problem from a true UX perspective. It was a problem of perspective, not technology.

Even if the pundits are correct, and the primary form of computing in the future is a tablet of some shape, it is still not the post-PC era. We will still have a dock at home, with a keyboard and most likely a mouse. All that will have happened is that the PC has been made modular, a form we have already seen in experiments like the Asus Transformer tablet, Motorola's WebTop, and the recent announcement of full Linux running on cell phones. But even there, that is a transition that won't happen for some time, since all it will do, at least initially, is increase cost and reduce performance.

My own experiences with consumers also do not support the new post-PC party line. No matter what cell phones or tablets people own, at the center of their technological world is a laptop or desktop. I know only a single person who actually relies on their iPhone for computing. Truly, nothing I have experienced indicates that the post-PC era is post-anything, much less the PC.

So from whence comes this talk of the PC's end? Magazine and internet writers in desperate need of something to cover are assuredly part of the drive. Sales of PCs aren't the shooting star that cell phone and tablet sales have become, which makes the numbers game especially conducive to this argument. But I don't think that either of those is truly the root of the debate.

I think that it is marketing, plain and simple. Marketing, specifically, from the very companies that conveniently have post-PC products to sell us. Apple started it; Google, Samsung, and Motorola all jumped on board. Now Microsoft is making similar noises. That marketing people from these companies both birthed and are now driving the post-PC conversation leads me to believe that they have... ulterior motives. A recent computer-purchasing trip revealed them to me.

My mother's computer recently, how shall I put it, shit the bed. So of course I, being the technowonder that I am, was called upon to make the situation right. After analysis of the nearly-seven-year-old computer, I decided that a new rig1 was needed. I didn't have the time to design and build her a computer, so I hit the interpipes looking for a good one.

The old computer was a Sony Vaio, purchased in 2005 for about $950. It had decent hardware for the time, including a PCI Express slot for an external video card. It served my mom (who was the kind of user to simply install things until the computer stopped) in a more or less problem-free manner. If her room hadn't been filled with dust and cigarette smoke, the computer would be 100% usable even now.

Think about that. A computer that is nearly seven years old still does pretty much everything she could ask of it. Compare that to the early 1990's. Imagine buying a computer in 1990, then trying to use that same computer in 1997.

In 1990, you probably would have had a 20MHz Intel 486 (or an AMD/Cyrix if you were a rebel), 4MB of memory, and a 50MB hard drive. The idea of external video processing was a dream, and CD storage of data was something that only the techno-elite had.

In 1997, you would have had a 200MHz Pentium MMX with perhaps as much as 64MB of memory and a 2GB hard drive; CDs and CD burners were becoming common, and Vérité Quake had been out for a year. The two systems were in different leagues. The old system would have been essentially useless.

This was the era that birthed the complaint "I buy a computer and it's immediately obsolete!" Because, at the time, this was somewhat true. Computer hardware remained up-to-date for fleetingly short times. Today, people repeat the complaint, but it doesn't mean the same thing. Today, the complaint means "I just bought this shiny thing and now there's a shinier thing!"

And that is the root of this marketing-fed nonsense known as the post-PC era. A good computer from five years ago will do most of what you could want of it today. That is not sexy. That is not a money-maker. That doesn't drive sales numbers up. Companies don't like it when the turn-over rate on their products is measured in decades. That's why the "planned obsolescence" of the American auto industry included cars that rusted into piles after five years.

Initially, companies didn't need to worry about this. Through the likes of Dell, Gateway, and Compaq, they commoditized high-turn-over computers below $1,500, then $1,000, then $500, and finally $200. Then, to the various computer companies' horror, even cheap, crap-box computers started lasting for two, three, five years, meaning that razor-thin per-unit profits had to be stretched out over longer periods of time.

For example, the new computer that I bought included an AMD FX-4100, a 1GB Radeon HD 6670, 8GB of memory, and half-a-terabyte of hard drive space. Total cost: $600. That hardware will last, easily, for five years. So I could buy a high-powered computer, capable of doing anything that I want... or I can buy a tablet that is capable of doing little more than surfing the web. And, importantly, it will be obsolete in a very short time: awful for me, but great for the company.

Look at the way companies treat their cell phones, even though those already have a natural life-span of two years. They try their damnedest to get away with not releasing the newest versions of operating systems for them, even when the hardware in the phone could easily power them. They want to make the phone obsolete, because they are desperate to stop smartphones from turning into PCs. Glorious, beautiful, cheap, user-friendly, democratized PCs.2

This reality is being intentionally obfuscated by the companies who want our money, but it is also being unintentionally ignored by users, because everyone has a computer. Computers are simply part of life. But to say that we are moving past this common aspect of everyday life is like saying that we live in a post-refrigerator world because refrigerators don't get coverage in the news like they did in 1910.

We are not in the post-PC era. We are in the PC's golden age. It is an age when a powerful computer that will last for years and do everything that a user could ask of it, including games, can be had for the price of an iPad.

This golden age is not sexy. It is not highly profitable. It is not predicated on accelerated obsolescence. In many ways, it is an era in the control of the users, not of the companies trying to define and control a new zeitgeist. And that is why it's being ignored.

--------------------------

1: Rig is an ultrageek term for a computer. It stems from us geeks usually custom-building our own "rigs."
2: Apple doesn't feel the need to do this since they play the "this new thing is shinier than my old thing" market like a fiddle.