Harry Nichols and the Philosophy of Fandom (Crashing a Fandom Part I)

It has been a woefully long time since I’ve posted here. Suffice it to say that between my seven-month-old baby, my insanely busy summer break, my music habit, and losing 80 lbs (!), my capacity for committing nerdly thought to words has been limited. However, things are settling. The school year is back on, and I’m falling into patterns and rhythms again.

As I announced to my friends on Facebook, I’ve decided to read through the Harry Potter books. Despite being the perfect age for them when they came out, I was incredibly resistant to the series as a youngster. However, my wife is a fan, and we plan to read them to our baby. I want to convert to fandom for the same reason parents sometimes convert to religions when they have kids: to present a united front. With the routine of the school year coming together, I have managed to squeeze out enough time to knock out the first two, and I kind of want to kick myself for my earlier reluctance.

Before I begin, I feel it worth mentioning that I am very aware of my outsider status. I have seen (most of) the movies, but many aspects of the series have remained mysterious to me, or have been lost to the years since I saw the films last. Some spoilers I know, but there are enough secrets left to feel like I’m seeing it through fresh eyes. And yet, bringing up these fresh experiences with experienced fans is a bit daunting. As a fundamentally social person, I want to share my experiences, but I feel self-conscious talking about reading the books for the first time. I’m acutely aware that all of my opinions, hypotheses, and judgments are falling on the ears of people who have been inundated with the series since childhood. Their opinions on it are informed by decades of thinking about the complete narrative, much as mine are for other fandoms. I haven’t found others to be condescending, but I am so used to being the one with the great big beard that it’s actually a little uncomfortable to be so green on a subject.

Despite these feelings, I’ve had some initial judgments to share. Ultimately, I think I’ve loved these first two books. I say “I think” because I’ve had to re-evaluate my critical approach to stories a bit lately (perhaps another reason why now is the time to get into the series). Once upon a time, I demanded absolute perfection from a narrative. Plot holes had to be minuscule or nonexistent. Dialogue and narration had to be natural, free of cliche, and understated. But with those expectations came the inevitable fact that many of my own most beloved tales can’t measure up. Take Star Wars, for instance. Star Wars is probably nearer and dearer to me than any other lore. And yet, I can’t deny its imperfection. Even Empire, my favorite in the series, is built around a giant plot hole involving either a surprisingly short training regimen for the Jedi, or an absurdly long time spent floating among space junk. If even the crown jewel of that series doesn’t really stand up to my old level of expectation, a level that it partially created by impressing its finer qualities upon me in my formative years, then I have to admit that I am rigged to reject stories. Reading books and watching films becomes not so much a process of enjoyment as one of scrutiny. In doing so, I deny myself the social experience of loving those stories because I demand they meet unattainable standards. Thus, I have committed to taking in this series as a blank slate. I have resolved not to pass judgment on its quality until it is entirely over, and I have promised myself that I will allow for the flaws I recognize as an adult, which I might have overlooked as a kid.

Surely, J.K. Rowling was not an amazing writer in her earliest Potter years. I cringed a bit when, in both stories, the climaxes consisted mainly of the villain fully explaining their complete master plan to our protagonist, who was surely soon to meet his demise. But her early books were fun, whimsical, and uniquely captivating. At the same time, I was surprised to find that many of her characters are beautifully flawed, and unexpectedly three-dimensional to the adult reader, even when viewed through Harry’s limited perspective. I find the psychology of the adults in the tales particularly interesting (*cough* Gilderoy Lockhart, anybody?). This, I’m sure, is to be expected; as a teacher myself, I can relate more readily to McGonagall’s day-to-day life than to Ron’s. I actually found it quite funny how relatable teaching at Hogwarts seemed to be. It got at a universal truth that served as a grounding for me when I most wanted to question the stories’ value. Even wizard children are, first and foremost, kids. The series, with all of its fantasy and occasional absurdity and squeaks, is a commentary on that age. Kids try harder than anyone to make a meaningful impact on their world, as illustrated by Harry, Ron, and Hermione’s constant meddling in affairs the adults deem beyond the scope of their power.

The first books also carve out an interestingly consistent meld of genres. Start with a base of equal parts fantasy and mystery, then sprinkle in trace amounts of horror (more for Chamber than Sorcerer’s Stone, but notable amounts in both). It takes a considerable amount of artistry to make satisfying, yet distinct, stories out of the same materials, but Rowling does it. They aren’t perfect, but then again, I had decided that demanding perfection isn’t worth the effort. Rowling shows talent despite her flaws.

Enjoyment of the series aside, there is a tiny part of me that feels like an imposter in this endeavor. I lack the nostalgia that is key to any burning fandom. My appreciation is adopted, rather than an intrinsic part of my makeup. I think among fans, there is an expectation that in order to call yourself a true fan of something, it has to reach down into your very being. You have to have been molded by the art in some way fundamental to your identity. This, of course, essentially means that there is a window in which to become a true fan of anything. Once your formative years have passed, you are welcomed into fandoms as a tourist, but not a native. Considering this, I wish that I had given the Harry Potter series, flaws and all, a fairer shake when I was younger. The ride has been enjoyable so far, and I look forward to getting on with the rest of the series.

Having completed what I’m told is the “childhood Potter,” I plan to take a short break, as I do with all series. I’ll be back with more when I dive into a few of the adolescent works. Hit me up in the comments.

On the Staying Power of Final Fantasy VII

I forget exactly how I came upon Final Fantasy VII back in the day. I had a PlayStation and had gamed since elementary school, but I don’t remember the exact moment I first heard someone say the game’s title, or first saw it played by someone else. The game creeps its way into my memory with considerable stealth for something I hold in such high regard. It’s actually the same with a couple of other works of art close to my heart. I have no recollection of the first time I saw Star Wars, for instance. It’s just kind of always been there.

Many will argue that FF7 is one of the greatest games of all time. The discussion I see fewer people having with any great depth is why. Beyond its graphics and mechanics, what gives a game lasting power over people? Nostalgia certainly plays a part in it, but if it were just nostalgia, individual games would still die out over generations. The Final Fantasy series is pretty loaded with games that were once acclaimed, but with less staying power than FF7. Why has that installment, despite its undeniable flaws, continued to live on in popular culture, its copies passed down to the new generation like beloved records? My answer: an artful story that becomes increasingly relevant the further along we get.

In case you need a brush-up (and if you do, who even are you?): Final Fantasy VII placed the gamer in a pseudo-futuristic world ravaged by Shinra Inc., a malevolent company harvesting the planet’s life force for energy. The story is long and becomes pretty complicated, but basically, Cloud, our mercenary protagonist, struggles at first against Shinra, but ultimately against Sephiroth, a powerful entity hell-bent on becoming a god at the expense of the planet.

Over three discs, the player is immersed in an expansive world quite beyond what had been done elsewhere in the mid-to-late 90s. Though exploration is a major facet of why FF7 was so successful, the game is really geared toward narrative, which, again, was a change of pace from the popular PlayStation fare of the day. Characters were crafted and given arcs. The game conveys thematic statements. It makes heavy use of flashback, and dedicates hours of game time to backstory for various characters, not only in cutscenes, but in gameplay as well. This, in turn, makes the gamer feel more invested in each character’s journey while simultaneously coloring the events of the main conflict.

The plot itself, as previously mentioned, becomes incredibly complex and even a little postmodern. Upon replaying FF7 this past year, I was surprised by how well I followed it as a tween. By adopting narrative techniques that had been pioneered by generations of writers and artists before it, Final Fantasy VII was among the first games to make people wonder if games could be more than just kid stuff, if their complexity could elevate them to actual art.

Final Fantasy VII is also lent a little extra staying power because its characters, events and themes actually feel more relevant today than they did in 1997.

The game’s environmentalist messages hit harder in 2017, when a growing body of reports continues to indicate that human activity has caused irreversible global warming. There is a sense of tense foreboding, particularly in the early hours of gameplay, that the planet is on the brink of death, that everybody is waiting for it to collapse. Still, you find all manner of people populating Midgar and the surrounding world, some concerned at the state of things, but most simply concerned with their own getting-by. Particularly in America, where our recent election highlighted a clear discord between collectivists and individualists, exploring the world of the game and interacting with these diverse people feels particularly poignant.

In fact, Cloud’s major character arc can be adequately summarized as a maturation from individualism to collectivism. Early in the game, Cloud, in all his bad-ass, hard-bitten mercenary-ness, is concerned only with getting paid and expresses no interest in the cause for which AVALANCHE fights. Through his interactions with Aeris and the discovery of his origins, he is transformed by the end of the game. He realizes that achieving your individual agenda requires a certain amount of collective thinking and empathy.

Speaking of that recent election, the gold-headed and immoral President Shinra bears a bitter resemblance to our own turd cannon of a president.

Trump wishes he had that jawline and Tom Selleck ‘stache.

The resemblance adds a layer of unplanned irony, but also gives the entire game an air of prophecy. Played in 2017, Final Fantasy VII actually feels very little like a fantasy at all. Its fantastical elements are merely superficial. The thematic material it grapples with is very much rooted in our present lives.

Finally, FF7 breaks away from previous Final Fantasy titles by setting its tale in a semi-futuristic world possessed of technologies we are ever closer to obtaining. In keeping with the paradox of progress, few seem to be enriched by the technology, and mostly it appears to do a considerable amount of evil at the behest of Shinra Inc. Hojo, a major figure in the game, embodies the traditional mad scientist, throwing ethical concerns to the wind by splicing Jenova cells into human clones. His character reflects a culture’s growing fascination and concern with technology’s incremental advance toward omnipotence, a fascination that the 20 years of technological advancement since the game’s release have only heightened.

Haters are quick to point to any number of flaws in the game. The graphics are terrible for the most part. Much of the dialogue is clunky. Some of the characters aren’t aging well. These flaws granted, it was still a game that made you think in the age of Spyro the Dragon and Crash Bandicoot (as much fun as those games were).

I realize that as a gamer, I have been chasing this experience pretty much ever since I finished FF7 for the first time back in the day. I’ve played many games I’ve enjoyed, but few with the same sticking potential. I’m a few generations behind in consoles, but I’ve recently picked gaming back up, and I’d love to know what titles of similar quality I’ve missed. Hit me up in the comments.

The Difficulty in Finding Engaging Text for Seventh Graders

Seventh grade is a terribly awkward year in every sense. For students, wild hormones combine with a sudden awareness that other people have thoughts about them, turning them effectively into paralyzed, self-imprisoned weirdos. Familiar with their childhood selves, yet inexplicably called to an adult world they don’t really understand, they live in constant tension centered on social concerns. It behooves the writers of curriculum to keep this in mind when picking study materials for an age group so heavily in the middle of things.

Of course, seventh grade is awkward for the teachers as well. Not only do the sometimes bizarre actions of students put us in uncomfortable situations as their caretakers and disciplinarians, but simply trying to accommodate the odd little buggers and hold their interest is uniquely difficult. It’s out of a concern for their interest that I historically haven’t taught full-length novels, relying instead on short fiction to convey most of my key learning.

But then again, I teach English, god damn it. I want to read a novel with my students. The question simply becomes: which one? The answer is harder to come by than expected.

Even Charlotte Doyle looks bored with The True Confessions of Charlotte Doyle

When I think back on my own seventh grade experience, I find that I can recall the titles I read surprisingly well, better even than those of some college courses I took. They were pretty well-worn middle school fare: Avi’s The True Confessions of Charlotte Doyle, Jacob Have I Loved by Katherine Paterson, and April Morning by Howard Fast all stick out in my mind. Though vastly different stories, they all shared one commonality: they bored the hell out of me. There’s an entire canon of novels for middle school that is slowly falling out of favor with young readers. Even the most recent of those listed, Charlotte Doyle, was written in a markedly different world than the one in which kids now find themselves. The times have changed, and I want to teach something that reflects the world my students see around them. And so, my struggle begins.

Thanks to their thoroughly in-between nature, twelve- and thirteen-year-olds often come up against the wall of appropriateness, which prevents them from discovering meaningful, potentially motivating content. Parents are often careful about what they allow kids of this age to consume. As a teacher, I need to be extra careful to avoid offending families of varied sensitivity to sex, language, and violence. This year, in particular, I have felt my pedagogical creativity stifled by that wall, to the point where I needed to pose a question: in protecting our kids from inappropriate material, are we not also smothering the sparks of their interest? Which is the greater good, to shield students from the world’s nasty bits, or to engage their developing minds, possibly helping them connect their own struggles to their studies?

Should kids of that age be allowed to read, watch, and play absolutely anything? Of course not. But I think at times those who would censor school materials focus on the superficial. Take, for example, the unit I sought to develop this year on Sherman Alexie’s The Absolutely True Diary of a Part-Time Indian. The book is a modern classic of young adult literature. However, because our protagonist at one point proudly admits to masturbating, and refers on occasion to erections to describe interest in things only occasionally sexual (e.g., having a “hard-on for words”), I was shot down.

My problem with this is twofold. First, these sexual references are superficial. They are the surface language through which the book’s deep thematic material is delivered, rather than the thematic material itself. Second, the book, being about a kid’s upbringing on a Native American reservation, contains several ties not only to the historical content covered in seventh grade social studies, but to the world these kids experience daily. In the year that the indigenous people of Standing Rock had to suffer to stave off the desecration of their sacred grounds, what could have been more pertinent than reading a book which takes as its backdrop the genocide and relocation of Native Americans?

Disappointed, I turned first to the internet for alternative narratives reflecting Native American reservation life. Nothing even came close to covering the material with quite the depth or quality of Alexie. (If someone knows of one, please point me in the right direction.) So eventually, I began to look outside the subject I had planned to cover. What other novels out there might I find that are modern, relevant, gripping, connected to other subject areas, and appropriate?

The solution is, unsurprisingly, hard to find. I looked into a new novel called The Last True Love Story by Brendan Kiely. It’s got relevance, being set around Ithaca, NY, where my students live. It has a connection to other content, being itself a sort-of YA Odyssey, drawing on students’ knowledge from 6th grade. And yet, like every epic hero, it has its tragic flaw: a bunch of naughty words. Common Sense Media rates it 14+, the same death blow dealt to my beloved Alexie unit that never was. Raunchy language, again, is superficial. It is the delivery vehicle for the book’s meaning. And certainly, most of these kids have never heard or used such language before, right?

I didn’t just look into these two books, either. Finding a book that is both meaningful to the students and appropriate has proven unexpectedly difficult for me this year. Perhaps I shouldn’t be surprised. By seventh grade, kids are no longer living school-appropriate lives. Why should a novel that reflects those lives be any different? We should make a distinction between material that is objectionable on a superficial level and material that is objectionable on a thematic level. Allowing the former would create so much more opportunity to draw students in and encourage thoughtfulness, while still shielding them from material they are not yet ready to process.

Getting Old in Punk Pop

Blink 182’s most recent album, California, begins with a starkly honest set of lines:

 There’s a cynical feeling saying I should give up / You’ve said everything you’ll ever say

The lines resonated with me. As a songwriter with power pop leanings myself, I sometimes worry about the easily observable tendency of artists to produce less material as they get older. While I don’t feel anywhere near that point just yet, I’m a long-term thinker, and I constantly work to stave off the day I run out of ideas, however far in the future it may be. I want my craft to last into my old age, and I don’t want it to get stupid along the way.

The muses of pop-punk are ephemeral, however. As with all forms of pop, melody is king. Particularly with power pop, lively energy and a certain angst are a part of the sound. Of course, we all eventually start running short of lively energy as we get older, and that angst has a tendency to fade as well, at least towards the subjects that are usually sung about: romantic relationships, rejection of authority, and youthful stupidity.

Needless to say, I eagerly listened on to hear what Mark Hoppus, a man who rose to fame writing songs about making prank calls, pubescent sexual tension, and grandfathers shitting their pants, would have to say about continuing to perform with Blink 182 into his 40s. After 2003’s self-titled album (which I love) and 2011’s Neighborhoods (which grows on me more and more), I was led to expect some musical growth. The opening lines were promising for the lyrical content to follow. There is definitely an untapped wealth of inspiration in the tension created when you outgrow your craft, and punk pop music, at least the way Hoppus defined it, requires such a youthful point of view to maintain.

Instead, California turned out to be an absolute regression. Rather than nodding to where he has been, using his wisdom and experience to comment on his former self, he tries simply to become his former self. “Teenage Satellites,” a forgettable mid-album track, tells a story of completely cliched young-and-wild love that feels forced. The single, “She’s Out of Her Mind,” heavily recalls “The Rock Show,” even as its music video remakes the iconic streaking scenes of “What’s My Age Again?” The song romanticizes its subject’s instability like only a horny 17-year-old can, except these guys are 45.

There are a few less regressive songs on the album. “Rabbit Hole” is about self-perpetuating misery, but still, it smacks of pubescent mopey-ness. There’s also a streak of Golden State worship in the album’s lyrics, most of which feels out of place on an album otherwise populated with the imaginings of man-children seeking their youth through music. “California” has a few flirtations with honest nostalgia about a past spent in the title state, but it spends so much time on cliches of Californian life that it sounds like it was sponsored by the board of tourism.

One mustn’t forget the joke songs, either. As Hoppus belts in one song, “I wanna see some naked dudes / that’s why I built this pool.” Just like in their heyday, Blink has provided a few 30-second splashes of sophomoric entertainment, like icing on a cake of sadness. Clearly, for Hoppus et al., the solution to running out of ideas is to take all your old ideas and do them again, regardless of how goofy it looks on you now.

Green Day, another of my childhood favorites, put out an album recently as well. They aged a bit better than Blink 182 throughout the 2000s, releasing American Idiot, which turned their signature punk pop into a more progressive stadium rock sound, ordaining their youthful songs of rebellion with lofty concepts. Billie Joe Armstrong still sang about young love, but he began to create characters, and while the band went through less extreme stylistic changes than Blink, they still seemed to mature better. They used their political anger to fuel change in subject matter and maturity.

I maintain that American Idiot is one of the more important albums of the George W. Bush era. It was a rare evolution in what is usually a very stagnant genre. Post-American Idiot, though, they released several duds as they failed to grow further. 21st Century Breakdown played like an album of American Idiot b-sides. ¡Uno!, ¡Dos!, ¡Tré! sought a return to more traditional albums, stripped of all higher concepts, but came out incredibly boring. And now we have Revolution Radio, another album heavily derived from their American Idiot developments.

Lyrically, Armstrong is still writing ironic, politically angsty-sounding lyrics. He has a few entertaining and memorable quips, like “I found my soul / beneath the sofa cushions.” They don’t always land, though, and often leave the listener puzzling over their meaning. Other tracks, like “Bouncing Off the Wall,” are pretty generic songs of rebellion, not really worth remembering.

As with all post-2000 Green Day albums, you kind of get lost in the steady stream of distorted eighth notes and power chords. Tre Cool uses only his kick drum, hi-hat, and snare for everything, with extremely rare exceptions, and songs are punctuated with chants of “hey ho” that sound sampled at this point. In the middle, the temptation to switch the record gets strong, despite a few new arranging tricks, such as the verses of the album’s opener, “Somewhere Now,” which contrast ’60s-esque ringing guitar arpeggios with stomping stadium rock choruses. The hooks are certainly more ornate and interesting than those on California, but at the end of the day, it’s another f*cking Green Day album. They may have risen to a musical peak beyond Blink, but they’ve been there for 13 years, with only very minor changes since.

At this point, I haven’t seen proof of a power pop artist continuing thoughtfully through their career, innovating while still staying true to the basic tenets of their genre. It’s possible that the kind of excitement you get from a well-crafted punk pop tune just isn’t meant to be made by those who aren’t in the budding years of their lives. That truth is particularly painful because, for so many creative types, myself included, the genre serves as a gateway to a glimpse of back-in-the-day. It’s only natural to want to generate that feeling for others, and it requires care and vision to attempt once certain benchmarks of your life have passed.

Top Ten Tuesday: Horror Flicks that Don’t Rely on Jump Scares

I love horror movies of all kinds, but lately I’ve come to really appreciate those that can disturb their audience without over-reliance on that age-old horror tool, the jump scare. I wrote a thing about it here, but in essence, when a film doesn’t rely heavily on jumps and jolts, it allows its atmosphere and themes to really shine through, and those tend to be more memorable and enduring for the audience. So here’s a quick list for #TopTenTuesday of great films that focus on deeper thematic horror beyond just jumps. Not that there isn’t still a jump or two here or there, you know, for good measure.

#10: The Invitation

There’s a current trend in mainstream comedy that my colleague and I have dubbed cringecore: basically, comedy excruciatingly derived from awkward situations. The Invitation seems a bit like the black sheep of that family, in that it follows what might be the most awkward dinner gathering in history. Rather than turning funny, however, things get pretty messed up. The horror here comes in the audience’s unraveling of the relationship history among the attendees, culminating in a pretty thrilling climax. It’s enthralling for all the right reasons.

#9: A Girl Walks Home Alone at Night

Filmed in noir-reminiscent black and white, this Persian-language film is an excellent, fresh take on vampire tropes. Visually memorable and feminist in nature, the movie has been largely praised by critics who, like me, find it awesome to see such a vivid, humanist portrait of what is so often treated only in broad strokes. Add to this that the movie is also just downright creepy, thanks to its use of darkness and shadow, and of course to a haunting performance by Sheila Vand, and we have a winner.

#8: It Follows

A lot has been said about this movie. It’s definitely a darling of the indie horror circuit, largely for its unsettling, mildly surreal atmosphere. Some interesting reading on that atmosphere can be found here. It also predated Stranger Things as a scary love letter to the 80s, with its haunting synth-heavy soundtrack. All elements collide in just the perfect way to turn what could have been a lame cautionary tale about teenage sex into something much more nuanced and artistic.

#7: Antichrist

Lars von Trier’s first entry in his unofficial “depression trilogy” is laden with his signature use of rich metaphor and symbolism, which some find overbearing. It covers a topic oft dealt with in horror: grief. Willem Dafoe and Charlotte Gainsbourg play parents dealing with the death of their young son, who foolishly decide to figure things out at an isolated cabin in the woods. The film was the subject of controversy upon release, drawing rightful accusations of misogyny in its thematic messages. However, taken as a trilogy with its successors, Melancholia and Nymphomaniac, one can start to see the message with more nuance: the assessment that mankind by its very nature is evil, because nature itself is evil. It’s pretty heavy metal.

#6: We Need to Talk About Kevin

Be warned. This one is definitely going to harsh your mellow. It’s a story that gets at the deep-seated horrors of parenting, with enough ambiguity to keep you thinking about it for years. Without giving away spoilers, it also gets at some of the darkest trends of real-life horror observable in current events: truly an American horror film. It’s always weird to see John C. Reilly giving serious performances (forever Dr. Steve Brule to me), and yet he does so admirably here. However, the real credit is due to an intense, heartbreaking performance by Tilda Swinton. If you haven’t seen it, you need to. Just make sure it’s already been a crappy day, because things won’t get better afterwards.

#5: The Babadook

The Babadook offers a more allegorical approach to the subject of child rearing. Essie Davis portrays Amelia, a widow struggling at once to cope with the death of her husband and to manage her particularly difficult son, Sam, who shows some symptoms of autism, although it is never explicitly discussed. Sam’s obsession with a childhood monster becomes a supernatural pressure point for Amelia, whose unraveling illustrates with incredible foreboding the toll her life is taking on her. It’s always nice to see women of horror at work, and director Jennifer Kent sets the bar high.

#4: Alter Ego

I’m sure at this point the list has become pretty predictable, so I wanted to throw a wild card on here for you. Takashi Shimizu (director of Ju-On, among other memorable J-horror films) was a “supervising director” on this low-budget affair in 2002, and let me first say that most would probably not consider this film on the same critical level as the others on this list. The production is cheap, the acting isn’t great, and the script, well. Let’s just say it follows suit. HOWEVER: I found it to have a certain B-film charm. The film follows a group of girls at a photo shoot on the campus of an empty school. One by one, malicious doubles appear to pick them off. Shimizu makes the most of what he’s got, using some simple but genuinely freaky special effects to make the doubles memorable, wacky, and surreal. With its short run time of just about an hour, I recommend carving out a moment for this one, particularly if you like quirky, weird horror. Find it on Shudder.

#3: Contracted

Here’s a novel idea: a zombie movie that focuses on the horrors of a single person becoming a zombie, rather than the apocalyptic hordes-of-undead stuff of traditional zombie lore. If the purpose of a jump scare is instantaneous payoff, Contracted seems to follow an opposite philosophy: the film’s horror is all about the slow degradation of its protagonist. You may need to look past a few hard-to-swallow moments, but this one rewards, overall. Extra kudos to the makeup team for subtly turning the protagonist into a pile of death over time.

#2: Beyond the Walls

Okay, technically Beyond the Walls is a three-part miniseries rather than a film, but the pieces can easily be taken as a whole. It’s got an intriguing premise: a woman who inherits a mysterious house discovers a door inside its walls leading to a strange, other-worldly place. I’ll leave off the spoilers there. The story is imaginative and dark, and there are a handful of cool visual moments that lend some memorable creepiness to it all, all without a single jump scare. There are a few unexplained moments and plot peculiarities, but on the whole, this one satisfies.

#1: The Witch

There was a lot of hype surrounding this one, and for once, it was all deserved. I can’t recommend this film enough to any horror fan. It is subtle, artistic, dark, and heavily atmospheric, while featuring some outstanding performances. The film is about a Puritan family who are excommunicated and left to live on their own. Slowly, they succumb to their own flaws and are preyed upon until the film’s disturbing and memorable climax. The film is a wealth of possible interpretations, and I’m sure many will put their spin on it in the years to come. If you are a horror fan, you kind of have to check this one out.

Why No Alien Sequel Will Ever Be as Scary as the Original

The Alien series has walked a crooked line through the sci-fi genre. What began as a sci-fi/horror franchise quickly gave way to sci-fi action leanings with the release of Aliens and the ensuing sequels. Eventually the franchise crossed over into B-film territory (Alien vs. Predator, anyone?) before director Ridley Scott attempted a return to seriousness with Prometheus. In terms of genre, it is truly a meandering franchise.

Of course, for my tastes, straight horror is the way to go. The new trailer for Alien: Covenant looks promising in that regard. What few glimpses it offers seem to make excellent use of shadow and dark atmosphere, contrasted with a blisteringly exposed chest-bursting scene, all of which signal a return to the techniques of tasteful horror film making.

But sequels to Alien are, by their nature, flawed. I’ll go so far as to say that no Alien sequel will ever be as unnerving as the original. What ruins further installments is the viewer’s foreknowledge of the Alien universe as he or she experiences each new film. Scott seems hell-bent on clearly delineating the Alien mythos with his continued installments, thus reducing the audience’s discomfort with each sequel.

I wasn’t yet born when Alien was released in 1979, but I can imagine what it might have been like to be in that audience. For someone who had never heard of the Xenomorph, the experience of watching one reproduce must have been, well, alienating. The discovery of the downed alien spacecraft and the room of eggs on LV-426 conveys a sense of dread, mystery, and foreboding, accentuated by the copious darkness of the atmosphere. The original chest-bursting scene, seen with no foreknowledge of what was about to take place, would have induced an incredible, helpless anxiety, like being on a roller coaster you aren’t sure is safe. The entire process, from face-hugger to chest-burster to the acid blood, is a horrific exploration that the crew of the Nostromo desperately does not want, but, thanks to the claustrophobic confines of the ship, has to endure. It is a plot designed to make its viewer feel estranged from normalcy and hopeless. In order to achieve that effect, the viewer must also find it strange and new.

H.R. Giger’s original artwork would eventually become the Xenomorph.

In addition to the strangeness of the Xenomorph life cycle, one should also consider the appearance of the beast itself. Surrealist artist H.R. Giger designed the Xenomorph to be at once horrifying and alluring. Importantly, it lacks eyes, preventing audience empathy and projection, and adding a level of unpredictability to the creature’s actions. As whatculture.com notes, there is also a certain rapey sexuality in the construction of the Xenomorph, such as its phallic inner mouth or the shape of its elongated dome. As some have pointed out, it is a creature meant to embody rape itself, in form and deed. An audience’s first view of the Xenomorph is at once familiar and unusual, in a most unsettling way that requires repeated viewings to get used to. And yet, with the years, sequels, and community formed around the film, a collective desensitization has been accomplished.

Of course, this phenomenon goes along with any monster that stars in multiple installments of a franchise. With Alien, however, the very purpose of the film is to immerse its audience in an uncomfortable strangeness. Without that strangeness, the film makes considerably less of an impact. James Cameron brilliantly circumvented this problem with Aliens, either intentionally or unintentionally, by turning the franchise from horror to action. In doing so, he created a different kind of satisfying experience, but not a scary one. We horror addicts have been hard pressed to find a post-1979 Alien experience anywhere near as nightmare-inducing.

With this in mind, the only way Alien: Covenant can live up to its horror origins is by introducing aspects of the universe the fans haven’t seen yet. Doing that risks losing the original film’s concision, however, and could easily tip over into ridiculousness. I maintain a cautious optimism about the new film, but I expect it will seem more like an impersonation of the original rather than a unique installment worthy of consideration on its own.

On the Morality of Resurrecting Dead Artists

Over the holidays, I spoke with a friend who works in the CGI end of the film industry. It was just after Rogue One had come out, and we had a lot to hash out. Of course, one aspect of the movie he felt compelled to discuss was the CGI likeness of Peter Cushing which appeared in several scenes. He felt that, for all the time and energy apparently spent on it, it should have looked better. By contrast, I found myself wondering not how it could have looked better, but whether it should have been done in the first place.

Cushing, who portrayed the memorable villain Grand Moff Tarkin in Episode IV, passed away in 1994. While the role was not entirely a CGI fabrication (actor Guy Henry performed the role, and his facial movements were translated to a digital Cushing mask placed over his face), there is something truly odd, in a uniquely 21st-century way, about watching these new scenes featuring what appears to be the deceased Cushing.

Several other articles online consider the ethics of “resurrecting” a dead performer in this way. One piece from The Independent briefly touches on the potential future complications of creating perfect facsimiles of the dead, an idea inspired by the Tarkin scenes in Rogue One. Another considers the question of consent: is it enough to merely have the blessing of an estate, when it is unclear what a deceased person would have individually felt about, for all intents and purposes, working after their death? Would they have agreed to act at all without the autonomy that comes from working in the flesh?

My perspective on this matter draws on these questions, but adds yet another: does re-creating dead actors violate the very aesthetic experience that is the movie?

Consider the different kinds of aesthetic experiences an audience might enjoy from a single film or play. There’s composition, which has to do with whether the story is well told, with compelling characters, dialogue, and the like, well suited to the film’s purpose. An audience member might also enjoy strictly visual aspects of a film, savoring the kinds of pictures and images it evokes. These things considered, few aesthetic elements have quite the significant, visceral impact on an audience member’s investment in the work as performance. For as long as theater and film have existed, people have marveled at the gifts of good actors. We enjoy watching them in almost the same way we become engrossed in the finely tuned athletic ability of an Olympic hero. When an artist’s reputation precedes them, our knowledge of their skill serves to enhance our enjoyment of the performance.

Everyone who likes movies can point to at least one moment in their movie-going life when they were moved specifically by a reputably good actor’s performance, either in addition to, or sometimes even in spite of, the other aesthetic elements of a film. I can point to several if asked, but for some reason, whenever my brain looks up the term “good acting,” it goes first to Sean Penn’s performance in 2008’s Milk. He was given many accolades for his work in that film, and I particularly enjoyed watching him just vanish into his new personality, a skill which Joe Morgenstern of The Wall Street Journal called a “marvelous act of empathy.”

Sean Penn, as it turns out, is actually the perfect example on which to build my case for questioning the morality of recreating dead stars, precisely because of the mountain of praise he has garnered over his career.

An artist must build legacy and reputation. Penn’s reputation as a thespian, while by no means perfect (forever Spicoli), has grown with him. If he were to die today, I would evaluate him holistically as a good actor. That ability to evaluate his skill and work becomes much more difficult with the potential of his continued work beyond his death. Of course, one could say that it wasn’t Peter Cushing who acted in Rogue One, but someone else wearing his likeness. To that I would say, simply, think of the future. There will come a time, perhaps not too long from now, when we are able to create consciousness, and then models of specific people’s consciousnesses. The question of an artist’s reputation becomes blurrier here, where we could have an intelligent Penn facsimile act in films virtually forever. Who’s to say whether the original Sean Penn would grow and change in the same way, whether his artistry would run the same course? Who knows whether an intelligent Penn facsimile could even act as well, without the pressure of his own mortality weighing in the back of his mind, as it does on us? The conclusion of our lives often provides a nice arc to the craft of lifelong artists, and adds an appreciable change in perspective and performance along the way. And yet, you could not say that that isn’t Sean Penn acting in 2048’s remake of Fast Times at Ridgemont High.

I doubt very much that it would even be socially acceptable to discern between the two Penns if both had genuine intelligence. To say “I recognize only the real Sean Penn as an actor” would be dreadfully ignorant. Thus, we would be forced to consider both Penns one and the same because, aside from the excusable physical makeup of their bodies, they would be. As a result, evaluating the artist holistically would be subject to so much more than the original Penn would be able to control. Is it morally correct, as a person who is not Sean Penn, to recreate Sean Penn, thus complicating his legacy? I’m not certain it is.

Of course, speculating about the impact of an intelligent, synthetic Sean Penn overlooks a massive trend in filmmaking, and indeed in all corporate activity, to minimize costs. A digital Penn that is just sub-human would have the added financial appeal of not being as expensive as an actual, intelligent Sean Penn. In this case, programmers would be able to use partially constructed consciousnesses to create compelling performances without the light behind the eyes that would mean a big paycheck. You know, kind of like most hosts on Westworld. The acting career of Sean Penn would now become even more dubious, as people would not be able to definitively say that this new computer mock-up isn’t Penn, nor would they be able to say it is. Rather than having a new arc or life to it, Sean Penn’s career would become stagnant, without the subtle changes that come with additional experience. Whatever Penn’s original wishes for his reputation might have been would become irrelevant, mowed down by the Hollywood profit machine.

To return to my earlier point, either of these alterations to the artist’s legacy would have a bearing on the audience’s aesthetic experience of the film in front of them. Without the idea of reputation, our ability to appreciate the artist’s performance as representative of his or her overall craft would be significantly diminished, thus reducing our appreciation of the production overall. Compelling though it might be, we’d know that we’re still only looking at a greatest hits mash-up of performances, rather than an authentic new work from a respected artist. The practice of resurrecting the dead to act in new films would leave a hole in the overall aesthetic value of a movie, however one might calculate it. As someone who finds works of art central to the health of a society, I cannot find a practice with such consequences ethical.

As an afterthought, we should consider why we feel the need to resurrect the dead for new performances in the first place. Why did Coachella, some years back, project Tupac on stage? Why create a digital mask of a relatively minor, albeit memorable, Star Wars character?

In my opinion, the answer is simple: nostalgia.

Nostalgia is so powerful that it has inspired an entire modern movement of looking backward in film and television. Think of the numerous reboots we’ve seen in mainstream film and TV in the past few years. Even mediocre works of the past, like Full House, have enjoyed a resurgence because companies like Netflix knew they could count on nostalgia to make the venture profitable. Disney, in particular, has learned that nostalgia is an incredible salesman for its movies. But nostalgia, as fun as it can be to play around with, is not a substitute for breaking new artistic ground. Films that rely on nostalgia as their primary motive only cause people to look backward, hardening them against the acceptance of change and distorting their perspective on where they’ve been before. When taken too many at a time, nostalgia films are bad for you. Seeing as the resurrection of the dead serves primarily this purpose, I’m forced to question the morality of the practice further.

A Case for Rogue One as the Best Star Wars Film Since Return of the Jedi

WARNING: SPOILERS AHEAD!

I am not an easy Star Wars fan to please. Unlike some, I do not consider any of the prequels worthy of the franchise title, including Revenge of the Sith, with its occasional worthwhile moments. Although I liked The Force Awakens overall, I was the only one of my friends to express more-than-minor reservations about its copious recycling of plot devices and, at times, heavy-handed fan service. The flaws still sit in my mind as a disqualifying blemish on what could have been the best Star Wars ever, given the creative talents, fresh vision, and genuinely moving writing the project blended together. However, watching The Force Awakens and dealing with its imperfections taught me something about going to see new Star Wars movies in general. It taught me to lighten up. It turns out that when I stopped seeing newer Star Wars films in the shadow of my favorite in the franchise, The Empire Strikes Back, they became far more enjoyable. Crazy, right?

I went to Rogue One on opening night with this philosophy in mind. Predictably, I enjoyed it, but not in the same way I enjoyed The Force Awakens. With Episode VII, my enjoyment was one of dismissal, of seeing around problems in the film to get at its notable beauty. Expecting the same from Rogue One, I was surprised to discover there was very little dismissing necessary. The movie was surprisingly close to my model of the perfect Star Wars. It was a departure from the plot points of earlier installments, for one. It was also dark in subject and style. It featured new and compelling characters (Chirrut Imwe is one of my new favorites, a fitting counterbalance to the old-school Yoda of Empire). It even provided new insight into the already established story of A New Hope and the overall Star Wars universe. To me, Rogue One emulated The Empire Strikes Back in multiple deep, important ways without resorting to blind imitation of Empire’s plot points. The resulting film gave me hope that Disney is, in fact, attuned to what matters not only to the casual, I-just-want-to-see-spaceships-go-boom fans, but to the hardcores who love the story as well.

Rogue One even showed a kind of visual taste notably absent in the more egregious failures of the franchise. In one of my favorite moments of The Empire Strikes Back, Admiral Piett walks in on Darth Vader’s meditation session and catches a brief glimpse of the ugliness underneath the mask. For original audiences, the already dark material presented in the film’s opening act veers into the realm of horror here. The audience becomes privy to the idea that there is more to the story than what we see on the surface, foreshadowing later revelations about the Empire and the ugliness of Luke’s parentage.

Rogue One had a moment so similar, it may have been intended as a kind of homage. When an attendant comes to inform Vader of Director Krennic’s arrival, the audience gets a similar peek at Vader in a cloudy bacta tank.

Even though we’ve already seen Vader’s deformity explicitly in Revenge of the Sith, the return to mystery in this scene is welcome, as it also represents a return to visual artistry. One of the overarching criticisms expressed by Star Wars purists about the prequels concerns their emphasis on visual vividness over story. Sure, the prequels have lightsabers and spaceships, but by and large, the movies just don’t look like Star Wars movies. Even worse, they seem to replace compelling story material with overt computer effects. By showing us only Vader’s silhouette in the tank, Disney rejects George Lucas’ later fixation on visual scintillation, recognizing once again what Lucas once discovered accidentally: sometimes the best way to engage your audience is to let their imaginations do some work, rather than serving up your saga on a crystal clear CGI platter. Of course, Rogue One involves a bit of less-than-tactful CGI (COUGH zombie Tarkin COUGH), but it doesn’t replace the story. If you find it unsettling, it is soon over, and you can get back to enjoying the film.

Rogue One certainly has its flaws. I wasn’t amused by the film’s occasional moments of fan service, for instance. Still, its flaws presented themselves only momentarily and then flitted away, unlike the imperfections in The Force Awakens, which were ever present due to their structural nature. Rogue One easily trumps the prequels as well, thanks to its return to a more conservative artistic vision, competent script, and compelling characters. Given these considerations, Disney’s latest Star Wars offering starts to look like the best post-Jedi installment in the franchise, at least from a certain point of view.

I suspect this idea will be hard to swallow for many fans. After all, Star Wars is first and foremost a saga, the story of a family across generations. To rank a story that is technically not part of that saga fourth best in the series seems blasphemous. But as all true fans know, the universe can be as captivating as the main story. The idea that a film about events outside of the saga, a film which breaks away from Star Wars tradition in several ways, can compete with most installments in the overarching narrative perhaps suggests that there is something freeing about taking new risks with old material. It is a lesson that filmmakers may well heed as they continue on the saga with Episodes VIII and IX.

On Teaching in the Fake News Age

Stanford University recently published a study of middle-school-to-college-age students with some shocking findings. As the authors noted, large swaths of students were unable to distinguish credible news sources from fake and sponsored news sources online. The study, given to students in twelve states and from schools of varying affluence, goes in depth to detail some of the astonishingly poor reasoning the kids use in asserting that highly questionable sources, like sponsored ads, tweets from interest groups, and even photos with absolutely no attribution, are credible and therefore fit to be used in forming opinions.

In recent months, fake news has been treated in the media as a new issue suddenly bubbling up from the dark underbelly of the internet. While I reject the idea that it is a new concept, I do believe that it has grown in volume and in resemblance to actual news with the advent of the internet, and more specifically, social media. Social media is something of a wild frontier; it has existed just long enough for people to become wise to its power, but it hasn’t been around long enough for government to get too involved in it. Online, everyone can play the demagogue. What we say garners the attention of our followers. The more radical our statements, the more attention (good and bad) we get, and the more attention we get, the more visibility our posts earn. To add to this, we can say virtually whatever we want. No one filters us, or forces us to support our claims. We are allowed to speak our minds in the form of social media posts, blogs, and personal websites, and we’re free to give these posts as much visible similarity to actual news as we want.

And people should be able to do that. All of it. I’ll admit, I signed that Change.org petition to get Mark Zuckerberg to take action against fake news on Facebook. I also immediately regretted it. When a nation elects a demagogue thanks, in part, to misinformation, there is a problem to address. However, we must be cautious in how we address it. Simply forbidding less-than-truthful claims from becoming public is a form of censorship, an idea that should not be turned to in haste or panic. I believe we need to look elsewhere for a solution, which is where Stanford’s recent study comes in.

The students who participated in the study are just as susceptible to fake news and post-truthism as adults, if not more so. The difference is that students are more malleable than adults. Their world view is still forming. While the adults in their lives will no doubt play a large part in the formation of their opinions, they have less rigid schemas at stake than a 40-year-old die-hard Republican or Democrat. Students who are unable to tell the difference between real and fake news represent a clear deficiency in our curriculum, but also our brightest hope for correcting this issue.

I teach 7th grade English in upstate New York, so upon reading about Stanford’s study, I pored over the Common Core Learning Standards to see what they said on the topic. I had some vague memory of bias being mentioned in the standards, and I was certain I’d discover standards devoted to preventing the bleak scenario described by the Stanford researchers. What I discovered did less to ease my concern than I’d hoped.

I found two seventh grade standards which, when taken together, serve to bolster students’ defenses against misleading articles. Informational Text Standard 8 requires kids to “Trace and evaluate the argument and specific claims in a text, assessing whether the reasoning is sound and the evidence is relevant and sufficient to support the claims.” A teacher who addresses this standard thoroughly might teach his or her kids to discredit articles based on a lack of substantial evidence for their claims. However, through this process, the students are never taught to consider the agenda or allegiances of the online source from which the article comes. Standard 9 asks that students “Analyze how two or more authors writing about the same topic shape their presentations of key information by emphasizing different evidence or advancing different interpretations of facts.” This can be used to detect bias in an article, which is important, but again, it does not require that students evaluate the credibility of the larger source. Dismayed, I expanded my search to include all informational reading standards for grades 6-12, and discovered that while the standards on evaluating arguments grow in rigor across the years, the same lack of source consideration remains. Furthermore, these standards do nothing to combat the larger issue of fake news, which is the fabrication and gross recontextualization of events, which are then presented as facts in order to induce a visceral reaction. When you make up facts, it’s easy to use perfectly fine logic to come to any conclusion you wish.

I considered the idea that the standards I sought might appear elsewhere in the NYS curriculum, so I turned to the standards for Social Studies. At the 7th grade level, the Social Studies standards don’t even mention the internet as a medium for receiving information, focusing entirely on historical content. I am willing to entertain the idea that these standards exist somewhere, but if it’s a struggle even to locate them, I can’t imagine they are taught well in our schools. I encourage anyone who can locate standards relating to this issue in any content area to contact me.

In short, there is a gigantic, gaping hole in the Common Core Learning Standards where students should be taught to consider their sources, particularly digital ones. The standards stop just short of this, helping students to dispel individual arguments (provided that they suffer flaws in logic and reasoning), but not the publications they come from. This deficit is almost certainly a reflection of who wrote the standards: people who did not grow up with the internet, and thus did not entirely realize its alluring pull toward impulsive, wild thinking. Thus, I am calling on my fellow teachers to redouble our efforts on this issue. We need to teach our students how to evaluate their sources, and we need to teach it this academic year. The next four years have the potential to wreak havoc on our world socially and environmentally. We must do all we can to ensure our youth are equipped to handle the sea of information that will make up their daily lives if we are to fight for our continued prosperity and well-being.

I implore lawmakers to work as quickly as possible to prepare students for the future in this respect. Federal lawmakers should continue to push for a nationwide curriculum to address this issue. In the event that federal measures fail, state legislators should ensure they are ready to pick up the slack. In New York, standards are currently under revision. Plans to present the new standards to the Board of Regents are set for early 2017, at which time there may be an opportunity for public comment, according to NYSED.gov. At that time, I encourage everyone to voice concerns over this lapse, which the proposed revisions still fail to address.

In my admittedly short career as an educator, never before has the answer to the age-old question, “Why do I need to know this?” been so urgent and tangible. Why? Because look at the news, that’s why. These events serve as a sobering reminder of why quality public education is the foundation of a healthy democracy. We cannot afford to let our educational institutions trail behind the times so severely. Teachers, administrators, and lawmakers: the time to look ahead is now.

Aliens Probably Exist(ed)?!

In my commitment to actual news post-election, I subscribed to The New York Times. I use it as a kind of methadone for my social media addiction, and so I often find myself perusing it on my phone when I feel that itch for a digital dopamine fix. This past weekend, I came across a very interesting, very credible opinion piece from June. Before proceeding, you should read it in its entirety. I promise it’s worth it.

The article asserts that, thanks to technological advancements in astronomy, we are now able to say that we are almost certainly not the only civilized species to have existed in cosmic history. Yes, there is some speculation involved, and no, it doesn’t speak to whether or not we are currently alone, but it is definitely still cause for reflection.

Much like one of my favorite TV FBI agents, I want to believe. I always have, to a certain extent. Maybe you could call it more of a faith than a conviction. I’m no conspiracy theorist, but we have known for long enough that we are just a tiny speck in the universe, and that anything could be going on out there. To many of us, it seemed not only plausible, but likely, that intelligent life existed elsewhere in the universe. Still, it was all conjecture. This is actual, numerical calculation. As the article states, we no longer have to reduce the matter to pessimism or optimism. We don’t have to subscribe to crazy conspiracy theories. Intelligent life outside Earth has almost certainly existed.
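For the mathematically curious, here is a back-of-the-envelope version of the argument as I understand it. This is my own rough sketch, not the op-ed’s exact math, and the planet count (roughly 10^22 habitable-zone planets in the observable universe) is a figure I’m recalling from the piece, so treat the exponents as illustrative. If p is the probability that any one such planet ever produces a technological civilization, the expected number of civilizations across cosmic history is

E = N_{\text{planets}} \cdot p, \qquad N_{\text{planets}} \sim 10^{22}

For humanity to be the only one ever, we would need

E \le 1, \quad \text{which requires} \quad p \lesssim 10^{-22}

In other words, pessimism demands that the per-planet odds of a civilization arising be worse than about one in ten billion trillion. Grant anything even slightly more generous, and we are almost certainly not the first.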

The implications of this idea are staggering. Mankind, particularly those of us who have predominantly subscribed to Western thought, has operated for all but the last few moments of its existence under an assumed self-importance based on the notion that there are no other species quite like us. We’ve just been handed math that strongly suggests that assumption is bullshit. The ground underneath our perception of the universe has been ripped out like a tablecloth from under plates and glasses, and somehow, bizarrely, nothing has fallen off the table. Nobody is talking about this. Everyone is carrying on as normal.

Perhaps that is what blows my mind the most. This piece, published in June, only just now crossed my path. I regularly read about space, science, and science fiction. The internet knows I like this shit. How have I only just now seen this? How has nobody been talking about this? It isn’t as dramatic as a bunch of flying saucers landing in DC, granted. It’s also not as real, ominous, and imminent as many current events. But it’s still a radical shift in our understanding of our own place in the universe. This is a discovery that has the potential to seriously reconfigure people’s philosophies, to alter the undercurrents that influence our decisions every day, if we only stop to consider its implications. How is this not bigger news?