Computer Mysticism

Last weekend I installed two different versions of Windows on two computers. One was a brand-new PC I built myself, and one was an HP that needed a reinstall. One needed a VPN connection to the MIT network to validate. The other one needed to have its proprietary drivers backed up and restored.

There’s a certain magic to computers when you start getting into the low-level stuff. I don’t mean programming-wise. Reinstalling Windows is more of a mystical art than a straightforward process.

Ancient forum tomes are filled with archaic tutorials. Software is a moving target, and complex formulas and hacks are prone to break down over time.

But even worse is the amount of superstition that gets poured into computer maintenance. Each user has rituals they are convinced ward off errors. We see this in all sorts of technology use: people have rites designed to improve buffering speed, battery life, and disk readability. I know a group of people with a running joke that involves standing on one foot while doing any complex computer maintenance to make it work.

The reclusive Linux alchemists mix their own potions (disdaining the draughts pushed by the shops in town), but use indecipherable notation in their recipes. Elixirs are delicate brews, and the average person doesn’t have the same instincts that let alchemists be productive.

Yet after going through the ordeal of reinstalling Windows or constructing a computer from scratch (and having it work!), you have a lingering feeling of power. The minor incongruities and annoyances that plague modern software usage no longer make you feel helpless. You are an empowered user, able to conquer any confounding roadblock. You may not be a mage, but you aren’t completely powerless under the whims of the wizards in the grand Corporate Tower.


The Future of the Source Engine

Valve’s Source and GoldSrc engines and Epic’s Unreal engines have had a long, acrimonious feud. GoldSrc and the Unreal Engine both debuted in 1998, in Half Life and Unreal respectively, and both games were considered revolutionary at the time. Unreal blew technical and graphical expectations out of the water; Half Life left a legacy as one of the most influential games in the FPS genre.

Unreal Engine screenshot
GoldSrc screenshot

Fast forward 6 years. Valve, in the meantime, has released Team Fortress Classic and Counter-Strike, two more revolutionary games. The Unreal and Unreal 2 engines (the latter released 2 years prior) had become extremely popular platforms for game developers, mostly because of their modularity and room for modification.

In 2004, Valve debuts the Source engine with Half Life 2, a groundbreaking game that completely demolishes the competition and sets a long-lasting legacy in story, gameplay, and graphics. For comparison, Unreal Tournament 2004 was published the same year.

Unreal Engine 2 screenshot Source screenshot

In another 7 years, Unreal Engine 3 has been released, and games like Gears of War and Batman: Arkham City have been developed on it. Valve has just published Portal 2, its first widely supported game. The Source engine has evolved over the years, picking up many graphical upgrades along with compatibility with the major game consoles.

Batman: AC screenshot

However, it becomes readily apparent that the visual styles of these two engines have diverged in the years since 1998. The Unreal line of engines has supported games like Bioshock and Mass Effect, but has also borne the brunt of AAA games. Such games are known for their muted brown-grey color palette, uninteresting stories, and factory-made gameplay. Unreal Engine games are commonly criticized for character models that look “plastic” (a result of game developers setting specular too high on materials), awkward character animations, and overuse of lens flares and bloom.

Games on the Source engine, on the other hand, consistently revolutionize some aspect of gaming. For example, Team Fortress 2, Portal, and Left 4 Dead are widely known for innovative gameplay. Unfortunately, Valve has lagged behind in terms of pushing the graphical frontier. Half Life 2 was smashingly good for its time, much in the same way that Halo stunned the gaming world back in 2001. However, every Source game since its debut has looked more and more aged.

Even worse, developers are driven away from using the Source engine due to a set of tools that have barely evolved since they were developed in 1998. Hammer, the level creation program, and Face Poser, the character animation blender, are unwieldy and unfinished; Source SDK tools are notorious for their bugs and frequent crashes.

Conversely, the Unreal toolset is streamlined and easy to jump into, which has drawn amateur and professional developers alike. The editor lets you pop right into the game to see changes, whereas the Source engine still requires maps to be compiled (which can take minutes) before the most recent revision can be played. Unreal’s deformable meshes dwarf the Source engine’s awkward displacement system.

However, I have a feeling that a couple of factors are going to come together and boost both engines out of the recent stigma they have incurred. The biggest factor is that at some point the AAA game industry is going to collapse. The other critical event is Half Life 3.

Yes! Do I know something you don’t? Have I heard a rumor lurking on the Internet about this mysterious game? No. But I do know history, and that is more useful than all the forum threads in the universe.

Half Life was released in 1998. Half Life 2 was released in 2004. Episode 2 was released in 2007. Half Life 2 took 6 years to develop, despite sitting on a back burner for some of that time. By extrapolation, Half Life 3 should be nearing release in the next 2 years. However, circumstances are different.
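The back-of-the-envelope math behind that extrapolation can be sketched in a few lines (a toy illustration only; the release years are the ones given above):

```python
# Release years as stated in the post.
releases = {"Half Life": 1998, "Half Life 2": 2004, "Episode 2": 2007}

# Half Life 2 followed Half Life after a 6-year development cycle.
dev_cycle = releases["Half Life 2"] - releases["Half Life"]

# Apply the same cycle to the most recent release, Episode 2.
predicted_hl3 = releases["Episode 2"] + dev_cycle
print(predicted_hl3)  # 2013
```

Which lands the naive prediction squarely in the near future, consistent with "the next 2 years."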

The Source engine was developed FOR Half Life 2. Graphics were updated, but the toolset remained the same. In the time between HL2 and now, Valve has been exploring other genres: Team Fortress 2, Portal 2, and Left 4 Dead 2 all took a portion of the company’s resources. In addition, the last few years have been spent intensively on developing Dota 2 (which, by the way, was the cause of the free release of Alien Swarm). The second Counter-Strike was contracted out. So Half Life 3 has been a side project, no doubt going through constant revisions and new directions.

However, unless Valve is going to release Day of Defeat 2 or Ricochet 2 (yeah right) in 2013, production on Half Life 3 is going to kick into high gear. There is one fact that drives me to believe even more heavily in this theory.

Since 2011, and probably even earlier, Valve has been pumping a huge amount of effort into redesigning their entire suite of development tools. It had become readily apparent to everyone at the company that the outdated tools were making it impossible to develop games efficiently.

“Oh yeah, we’re spending a tremendous amount of time on tools right now. So, our current tools are… very painful, so we probably are spending more time on tools development now than anything else and when we’re ready to ship those I think everybody’s life will get a lot better. Just way too hard to develop content right now, both for ourselves and for third-parties so we’re going to make enormously easier and simplify that process a lot.”
-Gabe Newell

Because both TF2 and Portal 2 have been supported continuously since their release, they have been the first to see the effects of this new tool development. Valve seems to have used these games as testing grounds, not only for their Free to Play business model and Steam Workshop concept, but also for new kinds of development tools. First, the Portal 2 Puzzle Maker changed the way that maps were made. In the same way that Python streamlines the programming process, the Puzzle Maker cuts out the tedious technical parts of making a level.

The second tool released was the Source Filmmaker. Although it doesn’t directly influence the way maps are made, it’s obviously been the subject of a lot of thought and development. The new ways of thinking about animation and time introduced by the SFM are probably indicative of the morphing paradigms in the tool development section at Valve.

Don’t think that Valve is going to be trampled by any of its competitors. Despite Unreal Engine’s public edge over the Source engine, especially with the recent UE4 reveal, the AAA game industry is sick, and no other publisher has a grip on the PC game market quite like Valve does. And although 90% of PC gamers pirate games, PC game sales are hardly smarting. In fact, the PC game market is hugely profitable, racking up $19 billion in 2011, just a few billion shy of the collective profits of the entire console market. Yet the next best thing to Steam is, laughably, EA’s wheezing digital content delivery system Origin.

(Numbers source)

Anyways, here’s hoping for Half Life 3 and a shiny new set of developer tools!

Why Richard Stallman is Wrong

I listened to an interview with Richard Stallman, and I truly believe he is wrong regarding the ethics of proprietary software and especially the fundamental beliefs behind computer and Internet usage.

Fundamentally, his assumptions are flawed. He says that people should be able to use computers for free, but that doesn’t mean that having people pay to improve the experience is evil. I can decide to gnaw through a tree on my property for free, but I can also pay to have it cut down. Similarly, a user should be able to do anything they want for free, but should also be able to pay to improve the experience, do it faster, or change the feel. Morality only enters the picture when the development of proprietary software begins to interfere with the development of open-source software, and I think that if proprietary software were somehow banned, the rate of open-source development would not increase by much.

Stallman is fine with software developed for a single client, where the developer is paid for the development of free software rather than for the software itself. However, that is fundamentally the same as distributing proprietary software. The cost of proprietary software represents the effort that went into making it, as well as upkeep for the company, including other workers’ salaries and continued research and development. I do agree that such costs can get out of hand, and that a ridiculous amount of money can end up going to those higher up on the corporate ladder. However, that is a necessary evil to keep high-quality proprietary software pumping out at a rate faster than free software can be developed.

Although he demands that the functionality of ebooks mirror that of books, he doesn’t seem to make the same connection regarding proprietary software and its real world parallel: non-free services. Although you should be able to live in a house and use public transportation for minimal costs, you almost always buy furniture and hire services to make your life more comfortable. Similarly, proprietary software allows users to improve the aspects of their experience that they want to.

As I said before, Stallman discusses ebooks, and how you should be able to do the same with an ebook as you can with a regular book. However, as a completely different medium, you can’t just demand something like that. Suppose I demand that JPEGs be viewable in the same resolution as the paintings at a museum, for free. That doesn’t even make sense. Being a completely different medium, we need to approach ebooks in a completely different fashion. It would be nice to be able to easily share ebooks or sell them used. However, for an ebook to exist in an economic and material singularity similar to that of a paper book, proprietary software is absolutely necessary. Using Stallman’s logic, I can say that if you want a book to be freely available, write it yourself!

In some ways, open source philosophy (or at least Stallman’s) is like Communism. Everybody pools their resources and in return everybody gets the same, free software. However, as we see with many actual implementations of Communism, somebody who contributes resources may not need all the products. If I spend time coding, I want a video editor, not a database manipulator. The obvious solution is to have both developed and then have those who want the video editor to give their share of resources to that developer, and those who wanted the database software to the other.
