Tuesday, July 30, 2013

One Year Later: "Call Me Maybe"


It can be a fool's errand to explain in objective terms why we like the songs we do. Oftentimes, we like what we like, and that's that.

And I think that's okay.

There's no need to feel guilty about why you like a tune; it happens. We get obsessed with songs. That's not to say that we shouldn't discuss music critically, try to parse out why we like certain songs, or ask whether there may or may not be ethical problems with some of them. But when it comes to explaining why a melody does or doesn't attract us, this seems to be one of the least explainable and most subjective questions in music criticism, especially in radio pop, where most of the songs do their darnedest to find the line between playful and irritating.

So, now that I've spent two paragraphs and change rationalizing what I'm about to say, here's what I'm getting at: I think "Call Me Maybe" is wonderful, and I would even call it one of the finest pop songs of recent years.

Seriously. Like, seriously. When "Call Me Maybe" spent nine consecutive weeks at the top of the pop charts, it was one of those rare, luminous moments when I felt like I was in on the national party that is Top 40 radio. I'd even go so far as to profess love for "Call Me Maybe"; luff, even—I luff "Call Me Maybe."

Yeah, I'm being a little over-the-top here. But I sincerely think it's a great song; no guilt or irony here. However, a lot of folks who know me find my adoration for such a piece of pop fluff to be inexplicable, given my other musical preferences (R.E.M., Miles Davis, and, um, progressive rock, for the most part). So, in answer to my friends' year-long befuddlement and in celebration of the rough anniversary of my first hearing of the song, here are five reasons why "Call Me Maybe" is great.

As I said before, there's a certain ineffable quality to the song—and every pop song—that defies explanation, much less my own infatuation with it. Listen, it's almost as mysterious to me as it is to my friends why I like "Call Me Maybe" as much as I do. But I'll do my best to quantify the tangible good in the song. Plus, it's been a year since its pop dominance; I say it's about time to reexamine its excellence. So here we go: five reasons.
  1. It's a freaking catchy song. Holy cow, is it catchy. I defy you to name a bigger earworm from last year. Once you hit that little skip in the melody at "Now you're in my way," you know there's no way that tune is ever going to let go. Not that all good songs are catchy, mind you. I happen to think that "Call Me Maybe" toes the aforementioned line between playfulness and irritatingness handily, but I know a lot of people who find that same catchiness annoying. To be clear: it would not be good if mnemonic tenacity were its only strength. That's why I've got four more reasons.
  2. It's short and sweet. Clocking in at just 3 minutes and 13 seconds, "Call Me Maybe" is brief enough to drop in, sprinkle its charm on listener ears, and leave without overstaying its welcome. Even if you dislike the song, you've got to admit that it's at least over quickly. Not only that, but the song has an airy sweetness to its melody that makes it go down easy. Its lightheartedness gives it a nimble sound that makes it a joy to experience. There's nothing heavy about the song whatsoever, and this is, I think, what rescues the lyrics from their would-be triviality; the song never tries to be anything but trivial, right down to its musical arrangement. It would be annoying if the song were a "Someone Like You"-esque, stone-cold-serious, bleeding-heart ballad. But it isn't. It's fun and light and nothing but fluff...
  3. ...except for that "call me" moment. Okay, here's where I get a little overly analytical. There's this one moment in the chorus—the most important moment, I might argue—when Carly Rae sings, "But here's my number/So call me, maybe?" And yeah, duh, of course she sings that, since it's the title of the song and everything. But it's the way she sings that pair of lines, specifically the way she sings the "call me" part, that is subtly great. At the word call, her voice cracks ever-so-slightly, and the tune takes a momentary shift to a minor chord. It's this fantastic moment of vulnerability in the middle of all the song's fluffiness, and it proves that there's a heart at the center of this song. Robert Christgau says that a cornerstone of pop music is "sadness made pretty," and if this is true, "Call Me Maybe"'s moment of vulnerability is just the touch to make the song essential pop.
  4. Then there's the music video. Dat music video. Lest anyone think Ms. Jepsen was too serious about "Call Me Maybe"'s content, the video introduces a vicious streak of self-parody to the song's tone. Please, do yourself a favor and watch it. Mute the sound if you must, but... just watch it. It's one of the most hilarious and self-aware music videos to come out of the music video renaissance of the YouTube era.
  5. The response of the culture at large. Every once in a while, some bit of pop culture ephemera hits all the right chords to take on a mutated alternate life outside of its original context that goes way beyond anything the original creators envisioned. Such is the case with "Call Me Maybe," which has been subject to an unusual outgrowth of Internet parody, tribute, and mimicry. My three favorites: NPR's spoken-word rendition, Cookie Monster's "Share It Maybe," and, most gloriously, this arrangement for choir and orchestra.
Oh, and bonus reason #6, for all you bandwagoners out there: "Call Me Maybe" even got a fair amount of critical acclaim. Pitchfork (I mean, goodness, hipper-than-thou Pitchfork) ranked it the 29th best song of 2012, Rolling Stone called it the 50th best, and The Guardian called it the best song of last year. And that's not even all the accolades it collected. Do a quick Google search on last year's year-end lists. "Call Me Maybe" pops up surprisingly often.

But listen, I know lots of people who hate it. Maybe you even hate it. And that's okay. You like what you like. But let me know why you hate it in the comments. I'll be happy to discuss.

Until next time.

Monday, July 29, 2013

Pacific Rim: There's a Review in Here Somewhere


A paradox lies at the heart of Pacific Rim, the newest (and loudest) addition to director Guillermo del Toro's filmography. It's a movie that is determined to be both outsized and small-scale, to be enjoyed both ironically and sincerely. It's this paradox that makes me feel as affectionately towards the film as I do, and yet, it's probably what will also keep a lot of folks from liking it at all. So basically, if you can embrace this paradox, you'll have a great time. If not, well... you might want to go to Redbox instead (or hey, how about Monsters University? I hear it's a good 'un).

On the one hand, in terms of sheer size, Pacific Rim looks like it's trying to be the biggest, baddest summer movie it can be. If Jaws made the summer blockbuster's promise of "You're gonna need a bigger boat," Pacific Rim takes the promise to its logical conclusion by supersizing everything. I mean, the whole premise of the movie is that robots the size of the Empire State Building fight extra-dimensional kaiju monsters on the open sea, and the film's creative team makes full use of that size. The movie ain't called "Pacific Rim" for nothing; given the size of the combatants, it doesn't take much for the action to involve the entire Pacific Ocean. To get much bigger than that, you're gonna need a bigger planet.

As soon as the camera fixes on its first robot (roughly one minute into the movie), we know we aren't in the world of subtlety or realism anymore, just pure blockbuster logic, and it's refreshing that del Toro's direction acknowledges what only a few such enlightened blockbusters realize: that hey, this isn't the real world. In one of the movie's most deliciously outsized moments, for example, one of the robots picks up a battleship and wields it as a club. Forget momentum, center of gravity, principles of acceleration, or fuel economy; it's a giant robot using a less-giant thing to hit a more-giant thing. This movie is, at its best, about the beauty in the blockbuster formula's potential to transcend the plausible for the sake of the visceral and the huge. And when Pacific Rim does just that, it's glorious.

On the other hand, though, Pacific Rim can feel frustratingly small in the context of some of the more notable action blockbusters of recent years. That's to say, while the action and concept and scale are huge, the human plot and emotional stakes of the film are straightforward and by-the-numbers—often boringly so—especially when compared to the superhero epics that have come to dominate the summer season. Unlike most recent action blockbusters, Pacific Rim is a standalone movie with no explicit connection to a prequel, sequel, or character mythology to draw depth from. It has no Themes-with-a-capital-T like Nolan's Batman trilogy, and it pays only the faintest lip service to the Whedon-esque (or, in the case of The Avengers, Whedon-created) quippy-yet-angsty heroes from the Marvel films. Its characters broadcast their motives with unambiguous sincerity and have backstories explained by brief montages or flashbacks. There are no antiheroes, only heroes. In short, very little goes on under the hood of Pacific Rim when it comes to its characters, so it can't help but feel small comparatively.

Which is a little ironic. So much of the ethic in modern superhero movies has to do with making the heroes seem small, incompetent, and petty. Look at the middle section of Thor, when Thor has lost his powers and bumbles through everyday life on Earth; the movie makes a point of turning the hero into a fool. Or think of any of Nolan's Batmans, where a major idea is how blurred the line between hero and villain is—again, a "hero" turned small and criminal. And I haven't seen Man of Steel yet, but I'm told that movie tries to make Superman into a similarly ambiguous hero who doesn't always live up to the larger-than-life tropes of the character. Yet for all their undermining of larger-than-life-ness, these movies still end up feeling "epic"—in the sense that they have a sweeping tone and big stories to tell, not that they're necessarily good.

Pacific Rim actually does go the larger-than-life route with its characters; the good guys are actually purely good, and there is no ambiguity that the bad guys (who aren't really characters, but oh well) need to be exterminated. It is, in a sense, a return to the traditions of the '80s action heroes or even epic heroes like Beowulf. And in a lot of ways, that's a very charming idea. I'm getting tired of angsty superheroes. However, in a blockbuster with the scale of Pacific Rim, it doesn't take long for "charming" to morph into "quaint" and then "dull." So it's somewhat disappointing that the movie spends as much time with its protagonists as it does. Not that Pacific Rim is a character study or anything, but the broad, unencumbered human drama feels increasingly superfluous and small as the movie goes on.

In re-reviewing 2001: A Space Odyssey for his "Great Movies" feature, Roger Ebert observes that 2001's dialogue "exists only to show people talking to one another," to prove that human society does indeed exist in the mostly silent and isolated 2001. This may be the only thing Pacific Rim and 2001: A Space Odyssey have in common. Pacific Rim's characters often feel like plot placeholders, only there to provide humanistic, logical reasons for the action to go on and to assure the audience that yes, humanity is still human despite the fact that it uses skyscraper-sized robots to fight its wars.

I would say that the boring characters are a weakness in the movie, but that's where we get back to the whole "paradox" thing. It pretty much is a flaw that Pacific Rim has boring, generic characters; in narrative filmmaking, character is one of the primary concerns, so if your film has unengaging characters, yeah, there might be something wrong with it. But with Pacific Rim, it's a little bit more complicated.

The problem is, it could not be the movie it wants to be with more fleshed-out characters. The movie's main attraction is no doubt the ridiculously oversized robot-kaiju action and del Toro's gung-ho embrace of that oversizedness in his direction, but this main attraction would deflate if the "flaw" of simplistic characterization were fixed. If the movie gave us more intricately designed characters, it would stumble on the host of ethical questions that regular dramas run up against whenever they encounter violence, such as the human toll of such violence, the bleakness of war, etc. Such ethical questions would only make the over-the-top direction seem callous to the human cost of the ridiculous action, which would, of course, only undercut the sincere joy we feel from the action sequences' visceral beauty*.

Instead, del Toro is smart enough to give the humans just enough humanity to justify the action scenes, but not enough to distract from the thrills and kills. It's directing that's aware enough of its wide-eyed commitment to a single kind of movie experience—the no-holds-barred action movie—to enhance that experience. Brilliantly, Pacific Rim manages to land at the point right before self-awareness becomes parody or, worse, nihilism. And in that way, you can sincerely enjoy the spectacle.

See, so on one level you have to view Pacific Rim with the ironic detachment of knowing that the movie is kind of dumb and therefore not worth deconstructing for flaws, while at the same time you have to be able to buy into the central idea that there is something beautiful at the heart of the bigger-is-better action aesthetic. You need to accept the movie's dumbness so the flaws don't bother you, but you also need to embrace it, hook, line, and sinker, to let the direction amaze you. And sometimes that's kind of hard to do.

And yeah. That's about all I have to say about that. You may have noticed that I didn't even bother trying to parse out del Toro's influences, the kaiju and mecha genres of old. I'm afraid my experience with those genres is limited to one viewing of Godzilla ten years ago, my childhood obsession with Mighty Morphin Power Rangers, and this SNES game. I'll leave the influence-spotting to the experts. Other than that, thanks for reading! Let me know what you think in the comments. That comment box is there for a reason.

Until next time.


*This is, I think, one of the many problems with Michael Bay's direction. His movies flirt so much with antiheroes and human flaws that they make the over-the-top action feel misanthropic rather than exuberant.

Thursday, July 25, 2013

Race and Hip: The History


I'm frustrated, but let's start by saying something positive, shall we? There are plenty of reasons to read John Leland's Hip: The History. So many, in fact, that I'll bullet point a few of them for easy consumption.
  • As the on-point title indicates, it's a nonfiction book about the history of "hipness" in America, which according to Leland stretches all the way back to colonial times. I mean, come on; doesn't that sound awesome? It is.

  • It's a nonfiction book about hipness published in 2004, making the book a sort of historical artifact itself. Apparently trucker hats were the cool thing to wear nine years ago.

  • Hip: The History is not nearly as smug as its title implies. Leland obviously knows way more than most of us on the subject of cool, but there's rarely a note of condescension or cleverness. "There's something inescapably nerdy about compiling a history of hip," he writes in the preface, and he apparently means it, since the nerd historian (read: endearing nerd historian) is how Leland comes across.
  • It never once uses Wes Anderson- or Portland-derived signifiers to describe "hipsters," a welcome relief from the inexplicable and increasingly stale pigeonholing that a lot of people seem obsessed with perpetuating.
  • It's surprisingly educational. Leland traces an often unpredictable path of hipness through American history, and he rarely spares the historical details to flesh out the journey. The book is full of some seriously interesting historical and cultural connections. In addition to hitting the obvious "hip" touchstones (jazz, beat poetry, drug culture), Leland spends a great deal of time exploring more nonstandard portions of the hip story, including the history of blackface, the connections between hardboiled detective novels and gangsta rap, the rhythms of Chuck Jones cartoons, and the role of the dot-com boom in shaping hipsterdom. It's all pretty fascinating and insightful.
  • Speaking of insightful, Leland does a wonderful job of examining race as a significant factor in hipness for most of the book (most—we'll get to the rest in a second). If the book has a thesis, it's the idea that hipness is the interplay between the white and black races throughout American history, with an emphasis on interplay. While the book doesn't minimize the damaging role white supremacy, slavery, and outright racism play in the story of America, Leland argues (quite effectively) that "hip" consists of contributions from both white and black culture. His analysis interrogates clear-cut racial narratives such as "the white boy who stole the blues" to find the cultural swirl between the races that yielded "cool"—the blues, for example, was created by American slaves drawing on both their African heritage and the European musical traditions taught to them by their white owners.
Just keep all those things in mind as I go on, because I want to make it clear that I thought this book was overall pretty great. For fourteen chapters, Leland examines the cultural history of America with style, depth, and nuance, avoiding easy answers and classifications while still providing a convincing case why "hipness" is important. And it's all good. The problem is that there are fifteen chapters, not fourteen.

Chapter 15, "Everybody's Hip: Superficial Reflections on the White Caucasian," is meant to bring readers up to speed with the state of hipness in the present day (which, for today's readers, means nine years ago). And in that chapter, Leland says this:
"The trucker hat[*] and other post-hip accessories play with the meaning of whiteness in a multicultural world. They make white visible. Without the black/white dichotomy to anchor it, and without numerical dominance to give it weight, whiteness is up for grabs. Especially in cities that are now 'majority-minority,' or less than half non-Hispanic white, whiteness is no longer the baseline, something taken for granted; it's something to be explored, turned sideways, debated for its currency. It's a mask, like burnt cork or 'blackting,' the slang for acting black ... Post-hip treats whiteness the way fashion and entertainment have historically treated blackness. It swaths white identity not in race pride but in quotation marks. Whiteness doesn't define you, you define it—and you don't have to be white to wear it ... The history of the white negro [i.e. how white culture mimics black culture] has not come to the end, but without the binary opposition of black and white, it has come to an end."
Long quotation, I know. So I'll be to the point: I just... think Leland's wrong. Dead wrong.

Listen, I was fourteen and living in the American South in 2004, not fifty-five and living in the East Village like John Leland, so maybe I missed out on some gloriously transcendent cultural moment when the Millennial generation shrugged off America's history of racial tension. I could have missed that. My life exposure is limited; I get that. But to say that contemporary culture is "without the binary opposition of black and white"? I just don't understand that.

I realize that in this quotation Leland talks mostly about whiteness. But it's only a page earlier that he writes, "For the post-hip generation, the black and white poles that for so long defined race have given way to a kaleidoscope of color, race, and ethnicity," including all ethnicities in the discussion. To be clear, I agree that "race" is a somewhat fluid concept that resists hard definitions; what I'm about to say has nothing to do with what I think races are or aren't in a scientific sense. I'm just talking about how society views race. And that's my problem with what he's saying. We aren't in this "kaleidoscope" utopia of race, or at least we don't act like we are. We may have gotten better than previous decades, but can we really say that America isn't still huddled around reductive definitions of white and black?

Racial relations in America are still broken and binary. I'm white, I'm privileged, I'm mostly free from any sort of racial discrimination, and even I can see that. I don't want to speak with any sort of authority on the issue of race, because I am certainly not in a position to do such a thing. But I don't think I'm stepping over any line by saying that race (particularly that binary white/black definition Leland mentions) is still a big issue in the United States.

It's still widely acceptable for a person (even our president) with only one parent of African heritage to be viewed as "black." Doesn't that show a culture that tends toward "black and white poles" rather than a kaleidoscope? We just saw a major court case painted in racial terms of "white" and "black" by the mainstream media, even though one of the individuals involved was of mixed race. I don't really see a free-flowing definition of race there.

Or fine, let's stay within American pop culture, the parameters of Leland's book. Can anyone listening to the new Kanye West album really say that racial identity isn't a part of the pop culture conversation anymore? And I can't even count how many people, even my own peers, I've heard place blame on hip hop for the corrosion of morality, responsibility, grammatical proficiency, etc. in black youngsters—always black, even though kids of all races listen to rap, and always hip hop, as if it's the only musical genre that uses slang and occasionally promotes negative behavior. Or, on the other end of the spectrum, consider Vampire Weekend, a band that has been plagued with criticisms of being "too white" for their world music stylings since before they even dropped their first album**. I'm not saying there aren't nuances to any of these examples, nor am I saying whether or not race should be a factor in pop culture (that's a whole new conversation). I'm just saying it is a factor, and we need to be open about that.

Hip: The History has plenty of great things to say about race and its role in culture. It's one of the best things about the book, actually, and that's why I devoted a whole bullet point to it at the beginning of this post. And that's why I'm so disappointed. For a book that, prior to its final chapter, does such a good job avoiding easy answers on America's racial issues, dismissing race's role in modern culture just strikes me as awfully reductive and naïve. John Leland is throwing in the towel on an issue that's not at all in the past. "Well," he might as well be saying, "good thing we don't have to worry about that anymore." But we do.

Look, I realize that there's something kind of pompous and futile about ragging on a passage in a book almost a decade old. And I'm really trying to do my best to avoid that pompousness. Normally, I wouldn't even say anything about it; it's just a little anecdote about a cultural moment nine years ago. The thing is, though, I still hear stuff like this today. I keep hearing people (even myself not so long ago) say things about how racism isn't such a big problem anymore or how we live in a post-racial society, and I've just gotten to the point where I can't get behind sentiments like that anymore.

Race is something we need to talk about. It's something we need to talk about with sensitivity and nuance. It isn't something to ignore or dismiss with quick, all-inclusive answers. So... let's not do that.

And that's all, folks. Sorry. Had to get that out of my system. I'll get out of the pulpit now. Next time, it's back to carefree pop culture talk, I promise.


*I know, right? Who knew??

**Just for a moment can we sit back, put aside our differences, and reflect on how freaking good Modern Vampires of the City is? Man, I could listen to that album all day.

Wednesday, July 24, 2013

Spirit Tracks: The First Lazy Zelda


The Legend of Zelda: Spirit Tracks is not without its charms. It's a Zelda game, for Pete's sake, an entry in a series that has enough charm to jeopardize Professor Flitwick's job security (or, erm... something... yeah, I'll just see myself out). The game has plenty of side quests and collectibles; the Tower of Spirits sections are inventive and fun; the controls are tight and intuitive, even more so than the DS's previous (and otherwise superior) Zelda contribution, The Phantom Hourglass; the art style is characteristically vibrant, though nothing too divergent from what's in Phantom Hourglass. And you get to ride a train, which is pretty cool if only for the fact that it's a freaking train and trains are awesome.

So yes, I liked some things about Spirit Tracks. But nothing in the game was quite captivating enough to fight the overall feeling of developmental complacency I got from this title. It's a lazy take on a Nintendo property that can do much, much better. I've been a Zelda fan since my squeaky pre-teen years, played every canonical Zelda game*, and beaten all but one (damn you, The Adventure of Link), and out of all of them, Spirit Tracks is the only one I would say is lazily designed.

Why Spirit Tracks? Plenty of reasons: repetitive design, unnecessary similarity to Phantom Hourglass, and way-too-low difficulty level, to name three. In this post, however, I'm going to focus on a fourth problem, one that has a little more bearing on the series as a whole and the direction in which it's been going for a while. The problem is how Spirit Tracks treats in-game exploration. In short, Spirit Tracks uses exploration as a reward rather than rewarding exploration.

Let me explain. In the game, Link has to collect items called Force Gems in order to gain access to certain areas. Here's how it works: as in The Wind Waker and The Phantom Hourglass, the only method of travel between Spirit Tracks's various towns and temples is by vehicle, this time by train. Ignoring the fact that mandated train travel over solid ground lacks the internal consistency of TWW and TPH's forced boat voyages (I get that Link can't walk on water**, but what besides the DS's technical specs is keeping him from ditching the train and walking across solid ground?), this mechanic is problematic because it never gives the player true freedom to explore Hyrule. See, in addition to unleashing an ancient demon on the kingdom, the villainous forces at work in Spirit Tracks have also destroyed the majority of Hyrule's train rails. You maniacs.

Not to fear, though; Link finds rail maps that allow him to ride his train into the game's four main areas, which is fine. Phantom Hourglass did basically the same thing with its sea charts. However, once you choo-choo your way into these areas, you find that Link's ability to explore is still heavily limited, since the rail maps only restore the most basic sections of the track. That's where the Force Gems come into play. If you want to explore off the beaten path at all (and I mean at all—the rail maps only give you access to the bare minimum of locations you need to beat the game), you have to complete side quests for the game's assorted NPCs, who then reward you with Force Gems that restore small portions of track, giving you access to more of the game map.

Don't get me wrong; I love me some Zelda side quests. Side quests are one aspect of the Zelda franchise that I think has consistently improved over the course of the series, and the games have used them as engaging ways to flesh out the in-game universe and ancillary NPCs, to often great effect. Plus, they're just fun. My problem with the side quests in Spirit Tracks isn't the side quests themselves (though they do get a little repetitive) but with what you earn by completing them. With the exception of the two major collectibles quests (stamps and rabbits) and maybe one or two other mini-games I'm forgetting about, the only reward you directly receive for doing side quests in this game is Force Gems—i.e. access to a new bit of map to ride your train through.

Okay, fine, someone might say. It's a little frustrating, but how does this make the game lazy? The short answer is that it makes the player earn something that every other Zelda game integrates into its core play experience. But when has this blog been about short answers? Onward!

Exploration has long been a central component of the Zelda experience. Think back to the series's roots—as a matter of fact, let's go all the way back to ground zero, The Legend of Zelda on the NES. This game requires the player to explore. From the opening minutes, nearly the entire world is available for exploration if you dare, and with next-to-no in-game guidance, you have to spend a good deal of time just wandering around the labyrinthine forests and mountains of Hyrule to find your way to the next dungeon or item. There isn't even an overworld map. Just look at it.



If you glance at the upper-left portion of the screen, you can see what the game manual calls the "radar," which is really just that gray box with the green dot in the center. The radar shows you roughly where you are in the overworld (e.g. if you're near the bottom-left of the map, the green dot will be at the bottom-left of the box), but crucially, it shows you absolutely nothing about the terrain of the overworld itself. This game was released in 1986—graphically primitive times in the video game industry, to be sure, but not so primitive that the developers couldn't have just filled in that gray box with colored pixels to give us a rough world map. But they didn't. In leaving that box gray, the developers invite the player to fill it in themselves. That blank radar helps to generate mystery, the intrigue of what may lie ahead that provides the impetus for exploration. In an interview that I can now only find cited on Wikipedia (though I'm sure I first read it elsewhere), Zelda series mastermind Shigeru Miyamoto said this about his inspiration for the first Zelda game:
"When I was a child, I went hiking and found a lake. It was quite a surprise for me to stumble upon it. When I traveled around the country without a map, trying to find my way, stumbling on amazing things as I went, I realized how it felt to go on an adventure like this."
The very act of exploration, of freely wandering in an unknown area, was an integral part of what Miyamoto wanted to capture in the original Zelda. And it's intoxicating. One of the biggest pleasures of that NES classic is the unparalleled level of freedom the game gives you in exploring. It's a masterpiece of nonlinear game design. You can enter the eighth dungeon before completing the first if you want. You can accumulate items and heart containers by wandering the overworld, or you can plunge right into getting that first Triforce fragment. It's up to you. And, importantly, such detours and out-of-the-box thinking always yield rewards—an unexplored alcove of the map might contain the entrance to a dungeon or a store where you can purchase more powerful armor. You can go anywhere you want and be confident that anywhere will be worth your while.

That idea of an open world that allows you the freedom to go anywhere (and rewards you for doing so) has been key to every Zelda game since. That's not to say that every Zelda game allows the same level of freedom as the original. To the contrary, starting with A Link to the Past and Link's Awakening, the series began taking cues from Metroid by building obstacles into the overworld that you could only pass after finding certain items. In doing so, the series lost some of its nonlinearity, but it never completely let go of the central joy of exploration. Even in relatively linear games like Twilight Princess or the Oracle titles, there is always the opportunity to stray from the current story objective to wander, and such wanderings always yield something cool or useful.

It's also important to note that when these games do take away core elements such as player freedom or nonlinearity, they gain other elements to fill out the experience, such as deeper storytelling or a greater emphasis on puzzle solving. Think about that awesome moment in Ocarina of Time when you rush back to the Temple of Time after finding the three Spiritual Stones, only to see Zelda and Impa fleeing Ganondorf on horseback.


The game sacrifices some of that core exploratory urge by setting a linear path for us (Zora realm to Hyrule Castle to Temple of Time), but in return, it gives us some thrilling action and raises the emotional stakes of our quest. So, Zelda games don't have to be about exploration and freedom (even though most of them are, on some level). But if the developers take away those elements, they have to replace them with something else, or the game's remaining elements will feel thin.

That's exactly what happens with Spirit Tracks. Instead of letting exploration be a foundational element of the gameplay experience (as it is in other Zeldas), Spirit Tracks takes away exploratory opportunities only to parcel them out as rewards for quests. Whereas all other Zelda games provide pieces of hearts, collectibles, and item upgrades for completing side quests, Spirit Tracks only offers its piecemeal game map. Other Zelda games have exploration and side quests with rewards; Spirit Tracks combines the two into one. Even this might be an interesting take on the Zelda formula if the new sections of tracks held interesting pieces of the world to explore, but aside from a few mostly barren stations and opportunities to bag a few more rabbits, the new stretches of track are just shortcuts to other already uncovered parts of the map, full of the same dull, drive-by scenery that fills the rest of the Spirit Tracks world. The game short-changes side-questing and exploration by taking away the usual incentives of both.

This is what I mean when I say that Spirit Tracks is lazy. It offers less content than a normal Zelda game without making the effort to fill in the gaps with new mechanics.

Please don't mistake this post for what it is not. This isn't some punk-rock call for the Zelda series to get back to basics or "return to its roots." Although I wouldn't be against a self-conscious throwback in the vein of Mega Man 9 & 10 (such a prospect would, in fact, thrill me to no end), that can't be the permanent direction Zelda takes if Nintendo wants it to continue to be as consistently successful as it has been. A series can't thrive solely on nostalgia. It needs to innovate, and I'm all for some innovation in the Zelda franchise, which is in a lot of ways still doing victory laps after the one-two punch of A Link to the Past and Ocarina of Time***. The problem with Spirit Tracks isn't that it changes the Zelda formula—the problem is that it takes away key components of that formula without contributing anything new.

Now, let's see something new. Zelda Wii U and A Link Between Worlds, I'm looking at you.

Until next time, tell me whatcha think in the comments. If I'm full of it, please let me know. Thanks a bundle for reading, everyone!



*C'mon, we all know the CD-i ones don't count. I don't think I've ever even laid eyes on a CD-i console myself, such is its apparent vileness.

**Unless, of course, he could somehow find those nifty boots from Zelda II. Isn't it about time to bring those back, Nintendo? 

***For a good example of a successful innovation of the Zelda formula, look no further than Skyward Sword, which all but does away with nonlinearity/open-world exploration and instead beefs up on puzzle solving to the point where the "overworld" is as much of a dungeon as the temples themselves.

Monday, July 22, 2013

Lost Purpose: Why I Blame Season Two for the Backlash Against Lost's Series Finale

 
So, I'm rewatching Lost with my wife (who *gasp* has never seen it before!), and at the time of this writing, I'm in the throes of Season Two's midsection. I've revisited my favorite episodes from time to time before, but this is the first time I've made the effort to go back and watch the show again in its entirety. And now that I've left the first season behind for the second time, I've realized that the first year of Lost was an anomaly in the show. Like, it's really different from Season Two. Yeah, yeah, duh. Lost evolved in such a way that every season was a way different experience than the previous one. I know. But now that I'm rewatching everything, it strikes me that the shift between the first and second seasons is special. In terms of how it affects the show in the long run, it's pretty seismic, and I almost feel that Lost in its first season is a fundamentally different show than it is in the other five. It's not, of course, but there's a reason why I'm tempted to make such judgments. Two reasons, actually: episode structure and viewing hook.

Before I explain, a quick declaration of allegiances. First, I think Lost is one of the great achievements of scripted television. Its mixture of science fiction, humanism, and existential mystery is not without precedent on TV (see also The X-Files, The Twilight Zone, and, especially, The Prisoner), but its particular treatment of community and faith within these genre trappings is something we probably won't see again any time soon, certainly not on network television. And yeah, I really like the way Lost ended. It moved me and brought us more closure than I think most fans will admit*. That being said, I recognize that Lost is a flawed show, sometimes deeply so, and those flaws extend to the series finale (really, if your show isn't called Freaks and Geeks, it ain't perfect). Second, I would readily rank Lost's first season among the greatest seasons of television ever, so some of my commentary might be tainted by what I see as the dip in quality, however slight, between "Exodus" and "Man of Science, Man of Faith."

Really, though, I don't think the difference between Seasons One and Two has much to do with difference in quality. Later seasons also diverge in quality, but I don't sense the same shift. It does, however, have everything to do with why a lot of people didn't like the series finale. But I'm getting ahead of myself.

So yeah. Episode structure. What's the first thing people say about Lost when they aren't defending or raging against the resolution of the Island's mysteries? What I hear most often is how compelling the show's cliffhangers are. People don't always say that in so many words, but it's in the subtext of a surprising number of reactions to Lost. "It's addictive" is a common sentiment, as well as "I watched [insert large number] episodes back-to-back" and "I just had to know what would happen next." There are many reasons why people feel compelled to binge-watch a TV series, but without a doubt, the use of episode-ending cliffhangers is one of the most effective. It's also something that the writers at Lost excelled at. Before beginning my rewatch of the series, I would have even gone so far as to call cliffhangers a Lost trademark, right up there with flashbacks. But upon reviewing the show, I noticed something interesting: Season One doesn't have that many cliffhangers.

Sure, all the ones you remember are there: Claire's kidnapping, Boone's injury, and most iconically, Rousseau's transmission that "the others are dead" in the pilot. But... that's pretty much it. In fact, by my count, only about one third of the season's twenty-five episodes end on an honest-to-goodness cliffhanger**. Given that "cliffhanger" is a somewhat subjective term, that ratio is a little debatable, but even the most liberal definition will only get you around ten or eleven episodes. Far more common is for Season-One episodes to end in moments of introspection or character development. Locke watches his wheelchair burn in "Walkabout." Sawyer can't burn his letter in "Confidence Man." Walt and Michael bond over the raft in "Born to Run." For the most part, Season One follows this structure: an event on the Island causes a character to experience some sort of existential crisis stemming from his/her pre-Island life, the action to resolve the Island event escalates while flashbacks explain the character's crisis, and the episode ends with the character transcending/coming to terms with the crisis. Cut to "L O S T" title card. Roll credits.

The beginning of Season Two throws this structure out the window. For one, Season Two loves cliffhangers, and I realize now that it's really with this season that the episode-capping cliffhanger became a Lost staple. Just look at that opening trio of episodes, which announces with panache that Season Two is different. The first two episodes end with definite cliffhangers (Desmond holds Locke at gunpoint, Michael and Sawyer find Jin fleeing the "others"), and the third ends with, if not a cliffhanger, a moment of such tension (Locke's manning of the hatch computer) that it could hardly be called introspective. No longer do episodes end with character montages, soft music, and melancholy. Instead, they end with adrenaline and some new Island mystery. Just to be clear: melancholy, montages, and the rest have not disappeared from Lost in Season Two or any of the seasons that follow. We continue to get plenty of character development throughout the show's run, and I'd argue that it's the most consistent aspect of the show. However, starting with the second season, these elements fit much differently into Lost's episode-by-episode structure, much less often serving as episode cappers or even climaxes than they do in Season One.

There's that old bit of advice about public speaking: end with what you want the audience to remember. Well, I'd argue that advice works for storytelling, too, and particularly for television, which gets the unique opportunity to "end" every week. That's just how we humans process information; we dwell most on what occurred most recently. When a speech or movie or book or TV episode ends with a particular word, phrase, or image, it says to the audience, "This, yes, this, is what we're here to tell you about. This is the most important piece of what you just experienced." In short, how something ends reveals its focus. The effect is compounded on TV, where endings aren't just endings but persuasive devices to maintain an audience. A TV show wants viewers to come back next week or next season, and it tells them so in the final minutes of an episode. The last scene of a TV episode not only brings the episode to a close but says to the audience, "If you come back, you'll get more of this." An episode's ending makes a promise to its audience. Sometimes it's something like "You'll get to find out who shot J.R. Ewing" and other times it's merely "You'll get to see Cliff Huxtable be funny," but a promise is made nonetheless. This promise is the viewing hook.

So what does Lost want its audience to remember from week to week? What is its focus? What promises does it make to its audiences from episode to episode? Well, that depends on whether you're watching Season One or Seasons Two-through-Six.

The majority of the time, first-season episodes end with shots of characters bonding, finding things out about themselves, or just hanging out. Heck, there are even pop-music montages, courtesy of Hurley's Walkman. This tells the audience that these people are what is most important. More than any other season of Lost, Season One is focused on character. In the first year, the characters are what this show is about—not just how the characters move the plot along from one Island mystery to another but who these characters are and the demons that afflict and motivate them. When each episode ends, we see the centrality of these characters to the show, and just as importantly, the show promises us that returning will give us a deeper connection with the characters.

Now, that promise, that importance of character, definitely exists in the other five seasons of Lost, and we even get the occasional episode that ends with a Season One-esque montage or character moment. But the show becomes much less explicit about that character centrality. By having the majority of episodes end with cliffhangers (and especially cliffhangers revolving around the show's mythology and unresolved mysteries), the Lost writers change the conversation with their audience. Structurally, plot becomes the most important aspect of the show, and the viewing hook shifts from bonding with the characters to feeling intrigue. Instead of character connections, the show begins to promise excitement and Island mysteries (and implied solutions to those mysteries). While the characters always reside at the heart of Lost (and are what keep it great throughout its run), the post-Season One episode structure of mystery-based cliffhangers elevates numbers, infertility, black smoke, and electromagnetism to a point where they appear to be the most important element of the show rather than how these mysteries affect the characters. They become the viewing hook and therefore the reason why viewers continue to watch the show. And all that starts in Season Two.

Is it any wonder, then, that a sizable portion of the Lost audience is upset that the show ended in a way that emphasized the character connections and downplayed the ongoing mysteries? The show's structure had, for all intents and purposes, lied to them about what to expect from the show. Beginning with Season Two, the show promises the audience one thing and eventually gives them another. This is not a fatal error (the show does, I think, answer many more of its mysteries than people give it credit for, anyway), especially since it devotes its whole first season to establishing the rules that it eventually sticks to. But it is a problem that the show temporarily changes the rules along the way. Five years is a long time to misstate one's purpose.

And that's about all I've got for now. Looks like it's another long post, folks. Don't let all that length go to waste. Tell me what you think! Do you love Lost? Hate it? Have you lost your love for it? Again, I've only rewatched halfway through Season Two; those of you who have rewatched the series in its entirety, am I remembering wrong about the trajectory of future seasons? Inquiring minds want to know.

Until next time.


*I will (and have) argued that point ad nauseam.

**"Cliffhanger" should not be mistaken for tension, of which Season One has plenty. An underlying tension, brought to the surface only occasionally by jarring, episode-ending events, is what makes Lost's first season addicting even without the heavy use of cliffhangers.

Friday, July 19, 2013

Why Christians Should Read More Modern Fiction


Yesterday, RELEVANT Magazine published an article on its website titled, "Why Christians Should Read More Fiction." In this article, author Paul Anderson makes the fine point that Christians should, well, read more fiction. He begins by sharing the very real frustration of how narrow the Christian conversation about literature can be. According to Anderson (and I will vouch for this, too), Christian discussions about books begin by hitting the "evangelical heavyweights" of Lewis, Chesterton, Tolkien, and Piper before breaking into theological arguments about The Shack and finally affirmations of Narnia's brilliance. So true. But then Anderson says this:
"And that’s when I start wailing on the bullhorn. 'But guys, what about fiction?! What about Faulkner? Melville! Does the name Steinbeck ring any bells?!' I cry, as the crowd trickles away, whispering about the guy with crazy eyes and a copy of East of Eden in his hand."
And as much as I applaud the overall sentiment (yes, Christians most definitely should read more fiction!), the first thing I thought was, "But Paul, what about modern fiction?! What about Jonathan Franzen? Vonnegut! Does the name Toni Morrison ring any bells?!" Alright, I guess I might have waved around a bullhorn and clutched a copy of Swamplandia!, too.

My point is, look at the names Anderson drops. John Steinbeck, the most recent of the bunch, died in 1968. Herman Melville didn't even make it to the 20th century. William Faulkner lived until 1962. You know what happened between 1962 and 2013? The collapse of the USSR. The Beatles. Postmodernism. A lot of the Civil Rights movement. Two whole waves of feminism. Watergate. Hip hop. 9/11. The rise of the personal computer. I could go on.

And, okay, I'm being a little glib. Anderson has a BA in English Literature, so he has no doubt read plenty of modern authors. This is no slight against his reading habits. Maybe it's a coincidence that he only names authors who haven't published new work in fifty years. Maybe if he had gone on with his bullhorning, he would have mentioned Gabriel García Márquez next. But, intentional or not, the authors Anderson names speak to another kind of narrowness in Christian literacy, which is that many Christians do not read contemporary secular literature*.

I got my undergraduate degree at a private Christian university where I majored in English and therefore had oodles of time to talk with other Christians about the books they read. What I noticed is that there are two basic literary canons that many Christians (and I should probably specify "Evangelicals," since that was the primary inclination of my alma mater) seem to adhere to. The first Anderson identifies as the "evangelical heavyweights:" Tolkien, Lewis, and other explicitly doctrinal authors. These are authors whom most Evangelicals have read or at least been familiarized with through quotes or summaries. Then there's the second canon, which is what Anderson appeals to. This consists of "classic" literature, which I know is a broad and ambiguous classification but that I mean to signify works published within the period of time that spans from Homer's writings to the first half of the 20th century. William Shakespeare, Charles Dickens, Jane Austen, and yes, Faulkner, Steinbeck, and Melville tend to fall into this camp. These are authors whom Evangelicals with a deeper interest in literature tend to latch onto. Other authors from the time period who (tellingly) are often omitted from this canon: James Joyce, Virginia Woolf, Voltaire.

We Christians can sometimes be classicists. And there is plenty of great stuff in the classics. But by only reading classic literature, we not only miss out on great literary works, but also feed that other uncomfortable Christian tendency, that of looking back a little too often at the "good ol' days" when there is so much going on around us today.

I love what Anderson has to say about art: "One thing that unifies its many forms is the desire to distill the chaos of existence, in all its wonder and tragedy, into a single, unified image of beauty." The thing is, the "chaos of existence" is just that: chaotic. It isn't static. The turbulent, existence-defining forces we experience today are not identical to the ones that we experienced yesterday or the ones we experienced ten years ago or the ones Victorians experienced over a century ago. I mean, I said I was being glib earlier, but I wasn't being that glib when I mentioned all the monumental things that had changed over the past fifty years. Of course the art of fiction taps into universal questions of what it means to be human (that's why we still read Herman Melville one hundred and fifty years down the line), but it's also very specific. Every piece of fiction is a unique distillation of the specific chaos of the cultural moment in which it was written. To kind of cross media for a second, a friend of mine once remarked that Seinfeld would never have worked in the 21st century, since so many of its episodes hinge on problems of miscommunication (or lack of communication) that today's internet and cell-phone conveniences could easily solve. Seinfeld's "The Bubble Boy" is universal in its depiction of human frustration in the face of chaos (those moops, man), but it is also specific to that cultural instant when Jerry couldn't just look up directions on his iPhone. Getting back to fiction, we can read a comedy of manners centered on the social mores of Victorian England and enjoy the universal ideas of class and joy and otherness, but we only see how these universals apply to the chaos of that Victorian moment. When we read classic literature, we are looking back at what the chaos used to be. Modern literature distills beauty for us out of the specific chaos we live in today, iPhones and all.

Now, just to be absolutely clear: this is not an indictment of older literature. My favorite novel of all time is Adventures of Huckleberry Finn, which is about to enter its fourteenth decade of existence. That's pretty "classic." Steinbeck, Faulkner, Melville, and many other older writers have given us a plethora of literary masterpieces, and you are not wasting your time by reading any of them. Reading older literature is just as important as reading recent works. We can learn from the past, finding beauty in the present world from the way the past authors found beauty in their own chaotic worlds. In fact, all literature is, in a sense, "older" literature, since it all applies to moments in the past. Even if Jonathan Franzen were to write a novel today and I read it tomorrow, it would still be a novel about the past, since I will always have to read something after it's written. But the closer something is written to the present, the more directly it can find beauty in the chaos of the present world we experience. There's something to be said for directness.

One more point I want to be clear on: I like the RELEVANT article. It has a lot of great things to say about why Christians should read fiction in general. This post is not a refutation of what Paul Anderson writes, just an addition to it, a "yes, and..."

Speaking of which, feel free to add your own "yes, and..." in the comments. Or if you think I'm completely out-of-line, contribute some buts and howevers. I'd be happy to discuss any of this post (or related issues) with anyone. Again, I owe you all my most enthusiastic gratitude for reading my ramblings. I know it can't be all that pleasant.

Until next time.


*There's a certain self-defeating irony in the fact that I'm writing this on a blog whose only other literature-themed post is about Beowulf. Okay, yeah, touché.

Wednesday, July 17, 2013

How Monsters University Breaks the Pixar Mold

 
Monsters University opened in the U.S. on June 21. I saw it on July 4. Now I'm writing about it on July 17. I never claimed this blog would be timely. On the other hand, four weeks and change after the release date is better than eight years or one thousand years, so maybe I'm getting better at being contemporary.

As promised in my last post, this write-up will be shorter than the ones for Beowulf and Me and You and Everyone We Know. And that isn't me condensing myself. I don't have much to say about the first eighty-five minutes of the movie. It's the kind of movie the folks at Pixar can apparently make in their sleep (full of vibrant, detailed animation, endearing characters, creative set design, tightly scripted dialogue, etc.), and I would rank it somewhere in the middle of the great Pixar pantheon*. There isn't a lot I could say about most of Monsters University that couldn't also be said of the Toy Story trilogy, A Bug's Life, Cars, or especially the original Monsters, Inc.

Notice I said most of Monsters University. Because what I do want to talk about is the fifteen minutes that ends the film. And oh man, that ending.

Pixar movies have always flirted with dashing the ambitions of their characters. Just look at the opening minutes of Up or the first half of Ratatouille or the third act of Finding Nemo, and you will see Pixar baiting its audience with the very real possibility that things will not end happily. It's this willingness to allow characters the possibility of failure that creates the emotional heft that Pixar movies have become known for (and that Cars 2, the studio's weakest film, lacked). But never has a Pixar film completely committed to the idea that a protagonist just isn't meant to get what he wants. At least, not until Monsters University.

At the end of the movie, Mike Wazowski finds that, despite his lifelong dream of being a scarer, despite the movie's entire plot revolving around this ambition, he is not scary enough to be one. So he doesn't become one. End of story. (Yeah, I know there's that montage of him climbing the ranks at the scare factory, so it's a mostly happy ending, but at the end of the day, he still doesn't get to be a scarer.) Think about it. That would be like if the first Toy Story ended with Woody and Buzz flying back to Andy only to find that he didn't want them anymore, or if Finding Nemo ended with Marlin realizing that Nemo was better off without him. The closest thing to a moment with as much devastating finality as the end of Monsters University is the incinerator scene in Toy Story 3, but even then a deus ex machina saves the toys. Regardless of how much soul searching and potential disappointment a character goes through, every Pixar movie eventually gives the characters what they want in the end. Monsters University is the first time that a Pixar movie has not blinked when it stared at the opportunity to disappoint a character permanently. It is the first to tell kids that they might not be good enough to pursue whatever dream career they have been pursuing**.

And why does Monsters University of all movies, a film that for most of its running time just seems content with giving its audience a good time, end up doing this? I think it's able to do so because it's Pixar's first prequel. Prequels have a safety net that other movies lack. A moviegoer never has to worry too much about characters in a prequel because the earlier position in the franchise timeline guarantees that everyone will be in good enough shape to make it to the later installment. The filmmakers of Monsters University seem to realize this, giving the MU story much smaller stakes than those in Monsters, Inc. And this is smart; any big, life-threatening stakes would have been a cheat that we audience members would have quickly seen through, since we know that Mike and Sulley are a-okay in the next movie. MU's prequelness also gives the filmmakers the freedom to crush Mike's dreams without seeming too cruel or pessimistic; after all, we know he comes out on top by the time Monsters, Inc. rolls around. A prequel's inherent lightheartedness allows Monsters University the opportunity to be unflinchingly honest with its characters without alienating the viewers.

Monsters University would have been a good movie without the ending. I just want to be clear on that. It's a funny movie, sometimes a very funny movie, especially when it plays around with our perceptions of previously established characters—strait-laced Sulley is a douche, the banished Abominable Snowman is a careless mailroom grunt, villainous Randall is, well, mild-mannered. But during those final few scenes, during Mike's disappointment, it becomes a great movie, however briefly so.

Actually, was this any shorter than my other two posts? Anyway, feel free to discuss the length of this post to your hearts' content in the comments. Or, you know, you could discuss Monsters University. Or not comment at all. It's up to you.

Until next time.


*I don't think Pixar has ever made a "bad" movie (even Cars 2 succeeds in being fun in my book), but for those of you who are curious, here is how I would rank them, Monsters University included.
  1. Toy Story 
  2. WALL-E
  3. Up
  4. The Incredibles
  5. Toy Story 2
  6. Finding Nemo
  7. Ratatouille
  8. Toy Story 3
  9. Monsters, Inc.
  10. Monsters University
  11. A Bug's Life
  12. Cars
  13. Brave
  14. Cars 2

**No, I don't think the other Pixar movies are any worse for giving their characters happy endings. Those endings are all well-deserved and naturally reached. But Monsters University provides a refreshing counterpoint to the sometimes-blind optimism of the "you can do anything if you set your mind to it" school of thought. And yeah, I know that we technically knew from Monsters, Inc. that Mike becomes a technician, not a scarer, but I was still surprised at how much of a plot point they make his failure. I mean, a lesser film could have just had him change careers between the movies or something.

Tuesday, July 16, 2013

Me and You and Everyone We Know


Halfway through Me and You and Everyone We Know, I thought I had it pegged. I was wrong, and I'll explain why. But let me give some context first.

Sometime during the past two years, too gradually for me to pin on a specific date, I got bored with "indie" movies. Now, people can quibble a lot about the word indie, whether such-and-such movie really qualifies as indie or not when so-and-so actor worked on it and whatever studio distributed it, but I'm not here to get into credibility debates, which are pretty dull anyway. Moreover, the term indie has described so many different things over the years that it's basically meaningless as anything but a genre distinction. In 2013, indie isn't a studio, or lack thereof; it's a style. So when I say that I got bored with indie movies, what I really mean is that I got tired of romantic comedies whose primary emotions are ennui and sincerity, where the men are stammering and disaffected, and the women are outgoing and have eclectic hobbies, and the children are precocious, and the dialogue lets the audience know it's sophisticated because it references literature and uses the word "fuck," and everything about the cinematography and soundtrack is just a little too precious.

Of course, in addition to being a horrible trainwreck of a run-on, most of that last sentence reeks of an oversimplified view that dismisses a diverse collection of films made by hardworking, talented people. Not that such trends don't exist in certain brands of cinema (I'm pretty sure they do, and I am tired of them), but it's unfair and disrespectful to dismiss huge swaths of (mostly) American film output based just on aesthetic criteria. Plenty of good things have actually come out of this genre, and to tell you the truth, a lot of movies that I like quite a bit share many of those "indie" qualities I listed above.

Nonetheless, such was my mindset going into writer/director/actress Miranda July's 2005 film.

Here's the movie's synopsis on Netflix, which appears beneath its pastel-saturated poster: "Eccentric Christine seeks emotional connections in the modern world while newly single shoe salesman Richard copes with his recent separation and his teenage son experiences a sexual awakening." Quirky woman, disaffected man, precocious teen: indie flick, right?

Well, as it turns out, sort of. The first half of the movie or so didn't do a whole lot to change the impression I had gleaned from the cover and Netflix blurb. In its early goings, Me and You and Everyone We Know seems to aspire to a romantic comedy take on the Magnolia-style film that became popular around the turn of the new millennium, where casual acquaintance and synchronicity interconnect the lives of a large cast of characters. It would be cumbersome to describe the considerable number of characters and subplots in the movie (a cumbersomeness that Me and You avoids through clear and efficient storytelling, which itself should have clued me in that something special was going on), but the focus initially remains on quirky Christine (Miranda July) and sad-sack Richard (John Hawkes), which is what gives the movie such strong indie rom-com vibes early on. The two first meet in what I assumed to be a take on the typical "meet-cute," with a perhaps overly clever conversation on how the street signs mirror the relationship they might have together. Before long, the script gives the two a few more of these Holly Golightly-esque chats, and the movie is well on its way to indie-pop territory.

But then something happens in the final forty-five minutes of the movie that blew me away: Me and You and Everyone We Know ceases to aspire to indiedom and instead achieves the sublime. The film pulls away from the supposedly central romance and lets the rest of the characters' stories unfold in deeply human ways that push toward the surreal and the emotionally naked. For example, the beginning of the movie shows Richard's teenaged son, Peter, goofing around on an internet chat room; there he meets an anonymous user, to whom he describes, at the suggestion of his six-year-old brother, Robby, a practice Robby calls "pooping back and forth forever" (which is exactly what it sounds like). Robby then begins to log on to the chat room by himself and continue talking to the anonymous user (who has taken quite a liking to the idea of "pooping back and forth forever"). The movie gets a lot of comic mileage out of the situation, but the payoff of this subplot, when Robby meets the chat room companion (a thirty-ish woman who curates an art museum), is absolutely transcendent and indicative of the beauty this movie dishes out in its second half. When the two meet, there is no screaming or revulsion, but instead a tender understanding and a completely non-ironic embrace. This ending somehow manages to frame the two as wistful lost souls finding peace (read: real human beings) without reneging on the movie's promise to deliver on the ick-factor and comedy of a six-year-old meeting a thirty-year-old with a fecal fetish for a blind date. These are two characters finding unexpected profundity in a situation whose premise could easily have been exploited solely for comedy.

What's striking is that while nearly every character gains this profundity by the end of the film, often in equally bizarre ways that contrast with the film's initial realist/indie aesthetic, these resolutions never feel jarring or ridiculous. This won't make sense to anyone who hasn't already seen it, but that it doesn't seem out of place for a movie that begins with a marriage falling apart to end with the image of a child literally causing the sun to rise speaks to the astoundingly natural way that director Miranda July manages to transform the lives of her characters. This is the stuff of avant-garde cinema, grown like a weed out of conventional meet-cute scenes.

Upon finishing the movie, then, my question became: why start with the indie trappings at all? This is a movie that dresses itself up in indie pop sensibilities but by the end refuses to be pop in any sense of the word. Why even bother with all the meet-cute stuff at the beginning? The answer, I think, partially lies with a sympathy for the audience. You don't want to throw viewers into a world that ignores conventional rules of science and human interaction; no matter how beautiful the images you create, your audience will always be alienated without some kind of lifeline. The conventional human drama provides that lifeline. I'm thinking of Charlie Kaufman's great Synecdoche, New York, a movie I saw recently that reaches the same kind of avant-garde profundity at the end as Me and You. Synecdoche doesn't give off the indie rom-com vibes, but it still takes time to establish realistic character motivations and conflicts before diving into the stages-within-stages and actors playing actors.

But the more I think about the indie false start in Me and You and Everyone We Know, the more I think there's more to that story mode than just hooking the audience. And here's what I've realized: that was never the story mode it was trying to adopt. Only my expectations of the movie's plot set it up as such. I read the plot synopsis and heard the opening notes from the soundtrack, and I assumed I knew what I was seeing. Looking back at those first scenes, however, I realize that the surreal bent exists throughout the movie, albeit in a subtler form. If I'm not looking for a meet-cute, Christine and Richard's conversation about their prospective relationship is less cutesy and more just plain weird. Really, it's pretty strange that these two people launch, out of the blue, into the sort of conversation that contemplates their own mortality, but it's right in keeping with the movie's ultimate bent toward sublimity. Here are two characters passing the time by passing through time, stepping out of their own realistic situation into a metaphysical head space. I didn't notice this because I was looking for the clunky artificiality of genre tropes instead of something with its own personality altogether. I put the conventions there by looking for them.

In short, it's unfair to a movie to judge it by its genre, especially since a lot of movies actually resist classification by genre.

And, well, that's all I've got. Sorry, it's another long post. Please don't hate me. Thank you so much for slogging your way to the end, and I'd love for you to share what you think in the comments. Readers are wonderful. Oh, and I promise I'll do a short one next time.

Until then.