Getting Old in Punk Pop


Blink 182’s most recent album, California, begins with a starkly honest set of lines:

 There’s a cynical feeling saying I should give up / You’ve said everything you’ll ever say

The lines resonated with me. A songwriter with power pop leanings myself, I sometimes worry about the easily observable tendency of artists to produce less material as they get older. While I don't feel anywhere near that point just yet, I'm a long-term thinker, and I constantly work to avoid the day I run out of ideas, however far in the future it may be. I want my craft to last into my old age, and I don't want it to get stupid along the way.

The muses of pop-punk are ephemeral, however. As with all forms of pop, melody is king. Particularly with power pop, lively energy and a certain angst are a part of the sound. Of course, we all eventually start running short of lively energy as we get older, and that angst has a tendency to fade as well, at least towards the subjects that are usually sung about: romantic relationships, rejection of authority, and youthful stupidity.

Needless to say, I eagerly listened on to hear what Mark Hoppus, a man who rose to fame writing songs about making prank calls, pubescent sexual tension, and grandfathers shitting their pants, would have to say about continuing to perform with Blink 182 into his 40s. With 2003's self-titled album (which I love) and 2011's Neighborhoods (which grows on me more and more), I was led to expect some musical growth. The opening lines were promising for the lyrical content to follow. There is definitely an untapped wealth of inspiration in the tension created when you outgrow your craft, and punk pop music, at least the way Hoppus defined it, requires such a youthful point of view to maintain itself.

Instead, California turned out to be an absolute regression. Rather than nod to where he has been, using his wisdom and experience to comment on his former self, he tries simply to become his former self. "Teenage Satellites," a forgettable mid-album track, tells a story of completely cliched young-and-wild love that feels forced. The single, "She's Out of Her Mind," heavily recalls "The Rock Show" even as its music video remakes the iconic streaking scenes of "What's My Age Again?" The song romanticizes its subject's instability like only a horny 17-year-old can, except these guys are 45.


There are a few less regressive songs on the album. "Rabbit Hole" is about self-perpetuating misery, but still, it smacks of pubescent mopiness. There's also a streak of Golden State worship in the album's lyrics, most of which feels out of place on an album otherwise populated with the imaginings of man-children seeking their youth through music. "California" has a few flirtations with honest nostalgia about a past spent in the title state, but it spends so much time on cliches of Californian life that it sounds like it was sponsored by the board of tourism.

One mustn't forget the joke songs, either. As Hoppus belts in one song, "I wanna see some naked dudes / that's why I built this pool." Just like in their heyday, Blink has provided a few 30-second splashes of sophomoric entertainment, like the icing on a cake of sadness. Clearly, for Hoppus et al., the solution to running out of ideas is to take all your old ideas and do them again, regardless of how goofy it looks on you now.

Green Day, another of my childhood favorites, put out an album recently as well. They aged a bit better than Blink 182 throughout the 2000s, releasing American Idiot, which turned their signature punk pop into a more progressive stadium rock sound, adorning their youthful songs of rebellion with lofty concepts. Billie Joe Armstrong still sang about young love, but he began to create characters, and while the band went through less extreme stylistic changes than Blink, they still seemed to mature better. They used their political anger to fuel a change in subject matter and maturity.

I maintain that American Idiot is one of the more important albums from the George W. Bush era. It was a rare evolution from what is usually a very stagnant genre. Post-American Idiot, they released several duds, as they failed to grow more. 21st Century Breakdown played like an album of American Idiot b-sides. ¡Uno!, ¡Dos!, ¡Tré! sought a return to a more traditional album, stripped of all higher concepts, but came out incredibly boring. And now we have Revolution Radio, another album heavily derived from their American Idiot developments.


Lyrically, Armstrong is still writing ironic, politically angsty lyrics. He has a few entertaining and memorable quips, like "I found my soul / beneath the sofa cushions." They don't always land, and often leave the listener puzzling at their meaning. Other tracks, like "Bouncing Off the Wall," are pretty generic songs of rebellion not really worth remembering.

As with all post-2000 Green Day albums, you kind of get lost in the steady stream of distorted eighth notes and power chords. Tre Cool uses only his kick drum, hi-hat, and snare for everything, with extremely rare exceptions, and songs are punctuated with chants of "hey ho" that sound sampled at this point. In the middle of the album, the temptation to switch records gets strong, despite a few new arranging tricks, such as the verses of the album's opener, "Somewhere Now," which contrast '60s-esque ringing guitar arpeggios with stomping stadium rock choruses. The hooks are certainly more ornate and interesting than those on California, but at the end of the day, it's another f*cking Green Day album. They may have risen to a musical precipice beyond Blink, but they've been there for 13 years, with only very minor changes since.

At this point, I haven't seen proof of a power pop artist continuing thoughtfully through their career, innovating while still staying true to the basic tenets of their genre. It's possible that the kind of excitement you get from a well-crafted punk pop tune just isn't meant to be made by those who aren't in the budding years of their lives. That truth is particularly painful because, for so many creative types, myself included, the genre serves as a gateway to a glimpse of back-in-the-day. It's only natural to want to generate that feeling for others, and it requires care and vision to attempt once certain benchmarks of your life have passed.

Top Ten Tuesday: Horror Flicks that Don’t Rely on Jump Scares

I love horror movies of all kinds, but lately I've come to really appreciate those that can disturb their audience without over-reliance on that age-old horror tool, the jump scare. I wrote a thing about it here, but in essence, when a film doesn't rely heavily on jumps and jolts, it allows its atmosphere and themes, which tend to be more memorable and enduring for the audience, to really shine through. So here's a quick list for #TopTenTuesday of great films that focus on deeper thematic horror beyond just jumps. Not that there isn't still a jump or two here or there, you know, for good measure.

#10: The Invitation


There's a current trend in mainstream comedy that my colleague and I have dubbed cringecore: comedy derived from excruciatingly awkward situations. The Invitation seems a bit like the black sheep of that family, in that it follows what might be the most awkward dinner gathering in history. Rather than turning funny, however, things get pretty messed up. The horror here comes in the audience's unraveling of the relationship history among the attendees, culminating in a pretty thrilling climax. It's enthralling for all the right reasons.

#9: A Girl Walks Home Alone at Night


Filmed in a noir-reminiscent black and white, this Persian-language film is an excellent, fresh take on vampire tropes. Visually memorable and feminist in nature, the movie has been largely praised by critics, who, like me, agree that it's awesome to see such a vivid, humanist portrait of what is so often treated only with broad strokes. Add to this that the movie is also just downright creepy, thanks to its use of darkness and shadow, and of course to a haunting performance by Sheila Vand, and we have a winner.

#8: It Follows


A lot has been said about this movie. It's definitely a darling of the indie horror circuit, largely for its unsettling, mildly surreal atmosphere. Some interesting reading on that atmosphere can be found here. It also predated Stranger Things as a scary love letter to the '80s with its haunting synth-heavy soundtrack. All elements collide in just the perfect way to turn what could have been a lame cautionary tale about teenage sex into something much more nuanced and artistic.

#7: Antichrist


Lars von Trier's first entry in his unofficial "depression trilogy" is laden with his signature use of rich metaphor and symbolism, which some find overbearing. It covers a topic oft dealt with in horror: grief. Willem Dafoe and Charlotte Gainsbourg play parents who, dealing with the death of their young son, foolishly decide to work things out at an isolated cabin in the woods. The film was the subject of controversy upon release, drawing rightful accusations of misogyny in its thematic messages. However, taken as a trilogy with its successors, Melancholia and Nymphomaniac, one can start to see the message with more nuance: that mankind by its very nature is evil, because nature itself is evil. It's pretty heavy metal.

#6: We Need to Talk about Kevin


 

Be warned. This one is definitely going to harsh your mellow. It's a story that gets at deep-seated horrors of parenting, with ambiguity enough to keep you thinking about it for years. Without giving away spoilers, it also gets at some of the darkest trends of real-life horror observable in current events: truly an American horror film. It's always weird to see John C. Reilly giving serious performances (forever Dr. Steve Brule to me), and yet he does so admirably here. However, the real credit is due to an intense, heartbreaking performance by Tilda Swinton. If you haven't seen it, you need to. Just make sure it's already been a crappy day, because things won't get better afterwards.

#5: The Babadook


The Babadook offers a more allegorical approach to the subject of child rearing. Essie Davis portrays Amelia, a widow struggling at once to cope with the death of her husband and to manage her particularly difficult son, Sam, who shows some symptoms of autism, although it is never explicitly discussed. Sam's obsession with a childhood monster becomes a supernatural pressure point on Amelia, whose unraveling illustrates with incredible foreboding the toll her life is taking on her. It's always nice to see women of horror at work, and director Jennifer Kent sets the bar high.

#4: Alter Ego


I'm sure at this point, the list has become pretty predictable, so I wanted to throw a wild card on here for you. Takashi Shimizu (director of Ju-On, among other memorable J-horror films) was a "supervising director" on this low-budget affair in 2002, and let me first say that most would probably not consider this film on the same critical level as others on this list. The production is cheap, the acting isn't great, and the script, well. Let's just say it follows suit. HOWEVER: I found it to have a certain B-film charm. The film follows a group of girls at a photoshoot on the campus of an empty school. One by one, malicious doubles appear to pick them off. Shimizu makes the most of what he's got, using some simple but genuinely freaky special effects to make the doubles memorable, wacky, and surreal. With its short runtime of just about an hour, I recommend carving out a moment for this one, particularly if you like quirky, weird horror. Find it on Shudder.

#3: Contracted


Here’s a novel idea: a zombie movie that focuses on the horrors of a single person becoming a zombie, rather than the apocalyptic hordes-of-undead stuff of traditional zombie lore. If the purpose of a jump scare is instantaneous payoff, Contracted seems to follow an opposite philosophy: the film’s horror is all about the slow degradation of its protagonist. You may need to look past a few hard-to-swallow moments, but this one rewards, overall. Extra kudos to the makeup team for subtly turning the protagonist into a pile of death over time.

#2: Beyond the Walls


Okay, technically Beyond the Walls is a three-part miniseries rather than a film, but the pieces can easily be taken as a whole. It's got an intriguing premise: a woman who inherits a mysterious house discovers a door inside its walls leading to a strange other-worldly place. I'll leave the spoilers out from there. The story is imaginative and dark, and there are a handful of cool visual moments that lend some memorable creepiness to it all, without a single jump scare. There are a few unexplained moments and plot peculiarities, but on the whole, this one satisfies.

#1: The Witch


 

There was a lot of hype surrounding this one, and for once, it was all deserved. I can't recommend this film enough to any horror fan. It is subtle, artistic, dark, and heavily atmospheric while featuring some outstanding performances. The film is about a Puritan family banished to live on their own. Slowly, they succumb to their own flaws and are preyed upon until the film's disturbing and memorable climax. The film is a wealth of possible interpretations, and I'm sure many will put their spin on it in the years to come. If you are a horror fan, you kind of have to check this one out.

 

Why No Alien Sequel Will Ever Be as Scary as the Original


The Alien series has walked a crooked line in the sci-fi genre. What began as a sci-fi/horror franchise quickly gave way to sci-fi-action leanings with the release of Aliens and the ensuing sequels. Eventually the franchise crossed over into B-film territory (Alien vs. Predator, anyone?) before director Ridley Scott attempted a return to seriousness with Prometheus. In terms of genre, it is truly a meandering franchise.

Of course, for my tastes, straight horror is the way to go. The new trailer for Alien: Covenant looks promising in that regard. What few glimpses it offers seem to make excellent use of shadow and dark atmosphere, contrasted with a blisteringly exposed chest-bursting scene, all of which signals a return to the techniques of tasteful horror filmmaking.

But sequels to Alien are, by their nature, flawed. I’ll go so far as to say that no Alien sequel will ever be as unnerving as the original. What ruins further installments is the viewer’s foreknowledge of the Alien universe as he or she experiences each new film. Director Ridley Scott seems hell-bent on clearly delineating the Alien mythos with his continued installments, thus reducing the audience’s discomfort with each sequel.

I wasn't yet born when Alien was released in 1979, but I can imagine what it might have been like to be in that audience. For someone who'd never heard of the Xenomorph, the experience of watching one reproduce must have been, well, alienating. The discovery of the downed alien spacecraft and the room of eggs on LV-426 conveys a sense of dread, mystery, and foreboding, accentuated by the copious darkness in the atmosphere. Watched with no previous knowledge of what was about to take place, the original chest-bursting scene would have induced an incredible, helpless anxiety, like being on a roller coaster you aren't sure is safe. The entire process, from face-hugger to chest-bursting to the acid blood, is a horrific exploration that the crew of the Nostromo desperately does not want, but thanks to the claustrophobic confines of the ship, has to endure. It is a plot designed to make its viewer feel estranged from normalcy and hopeless. In order to achieve that effect, the viewer must also find it strange and new.

H.R. Giger’s original artwork would eventually become the Xenomorph.

In addition to the strangeness of the Xenomorph life cycle, one should also consider the appearance of the beast itself. Surrealist artist H.R. Giger designed the Xenomorph to be at once horrifying and alluring. Importantly, it lacks eyes, preventing audience empathy and projection, and adding a level of unpredictability to the creature’s actions. As whatculture.com notes, there is also a certain rapey sexuality in the construction of the Xenomorph, such as its phallic inner mouth, or the shape of its elongated dome. As some have pointed out, it is a creature which is meant to embody rape itself, in form and deed. An audience’s first view of the xenomorph is at once familiar and unusual in a most unsettling way that requires repeated viewings to get used to. And yet, with the years, sequels, and community formed around the film, a collective desensitization has been accomplished.


Of course, this phenomenon goes along with any monster that stars in multiple installments of a franchise. With Alien, however, the very purpose of the film is to immerse its audience in an uncomfortable strangeness. Without that strangeness, the film makes considerably less of an impact. James Cameron brilliantly circumvented this problem, either intentionally or unintentionally, by turning the franchise from horror to action. In doing so, he created a different kind of satisfying experience, but not a scary one. We horror addicts have been hard-pressed to find a post-1979 Alien experience anywhere near as nightmare-inducing.

With this in mind, the only way Alien: Covenant can live up to its horror origins is by introducing aspects of the universe the fans haven’t seen yet. Doing that risks losing the original film’s concision, however, and could easily tip over into ridiculousness. I maintain a cautious optimism about the new film, but I expect it will seem more like an impersonation of the original rather than a unique installment worthy of consideration on its own.

On the Morality of Resurrecting Dead Artists


Over the holidays, I spoke with a friend of mine who works in the CGI end of the film industry. It was just after Rogue One had come out, and we had a lot to hash out. Of course, one aspect of the movie he felt compelled to discuss was the CGI likeness of Peter Cushing which appeared in several scenes. He felt that, for all the time and energy apparently spent on it, it should have looked better. By contrast, I found myself wondering not how it could have looked better, but whether it should have been done in the first place.

Cushing, who portrayed the memorable villain Grand Moff Tarkin in Episode IV, passed away in 1994. While the role was not entirely a CGI fabrication (actor Guy Henry performed the part, and his facial movements were translated to the digital Cushing mask placed over his face), there is something truly odd, in a uniquely 21st century way, about watching these new scenes featuring what appears to be the deceased Cushing.

Several other articles online consider the ethics of “resurrecting” a dead performer in this way. One piece from The Independent briefly touches on the potential future complications of creating perfect facsimiles of the dead, an idea inspired by the Tarkin scenes in Rogue One. Another considers the question of consent: is it enough to merely have the blessing of an estate, when it is unclear what a deceased person would have individually felt about, for all intents and purposes, working after their death? Would they have agreed to act at all without the autonomy that comes from working in the flesh?

My perspective on this matter draws on these questions, but adds yet another: does re-creating dead actors violate the very aesthetic experience that is the movie?

Consider the different kinds of aesthetic experiences an audience might enjoy from a single film or play. There's composition, which has to do with whether the story is well told, with compelling characters, dialogue, etc. well-suited to the film's purpose. An audience member might also enjoy strictly visual aspects of a film, savoring the kinds of pictures and images it evokes. These things considered, few aesthetic elements have as significant and visceral an impact on an audience member's investment in the work as performance does. For as long as theater and film have existed, people have marveled at the gifts of good actors. We enjoy watching them in almost the same way we become engrossed in the finely tuned athletic ability of an Olympic hero. When an artist's reputation precedes them, our knowledge of their skill serves to enhance the enjoyment of the performance.

Everyone who likes movies can point to at least one moment in their movie-going life when they were moved specifically by a reputably good actor's performance, either in addition to, or even sometimes in spite of, other aesthetic elements in a film. I can point to several if asked, but for some reason whenever my brain looks up the term "good acting," it goes first to Sean Penn's performance in 2008's Milk. He was given many accolades for his work in that film, and I particularly enjoyed watching him just vanish into his new personality, a skill which Joe Morgenstern of The Wall Street Journal called a "marvelous act of empathy."

Sean Penn, as it turns out, is actually the perfect example on which to project my case for questioning the morality of recreating dead stars precisely because of the mountain of praise he has garnered over his career.

An artist must build legacy and reputation. Penn's reputation as a thespian, while by no means perfect (forever Spicoli), has grown with him. If he were to die today, I would evaluate him holistically as a good actor. This ability to evaluate his skill and work becomes much more difficult with the potential of his continued work beyond his death. Of course, one should say that it wasn't Peter Cushing who acted in Rogue One, but someone else wearing his likeness. To that I would say, simply, think of the future. There will come a time, not too long from now, when we are able to create consciousness, and then models of specific people's consciousnesses. The question of an artist's reputation becomes blurrier here, where we can have an intelligent Penn facsimile act in films virtually forever. Who's to say if the original Sean Penn would grow and change in the same way, if his artistry would run the same course? Who knows if an intelligent Penn facsimile could even act as well, without the pressure of his own mortality weighing in the back of his mind, as it does on us? The conclusion of our lives often provides a nice arc to the craft of lifelong artists, and adds an appreciable change in perspective and performance along the way. And yet, you could not say that that isn't Sean Penn acting in 2048's remake of Fast Times at Ridgemont High.

I doubt very much that it would even be socially acceptable to distinguish between the two Penns if both had genuine intelligence. To say "I recognize only the real Sean Penn as an actor" would be dreadfully ignorant. Thus, we would be forced to consider both Penns one and the same because, aside from the physical makeup of their bodies, they would be. As a result, evaluating the artist holistically would be subject to so much more than the original Penn would be able to control. Is it morally correct, as a person who is not Sean Penn, to recreate Sean Penn, thus complicating his legacy? I'm not certain it is.

Of course, speculating about the impact of an intelligent, synthetic Sean Penn overlooks a massive trend in filmmaking, and indeed all corporate activities, to minimize costs. A digital Penn that is just sub-human will have the added financial appeal of not being as expensive as an actual, intelligent Sean Penn. In this case, programmers will be able to use partially constructed consciousness to create compelling performances without the light behind the eyes that would mean a big paycheck. You know, kind of like most hosts on Westworld. The acting career of Sean Penn would now become even more dubious, as people would not be able to definitively say that this new computer mock-up isn't Penn, nor would they be able to say it is. Rather than having a new arc or life to it, Sean Penn's career would become stagnant, without the subtle changes that come with additional experience. Whatever Penn's original wishes for his reputation might have been would become irrelevant, mowed down by the Hollywood profit machine.

To return to my earlier point, either of these alterations to the artist’s legacy would have a bearing on the audience’s aesthetic experience of the film in front of them. Without the idea of reputation, our ability to appreciate the artist’s performance as representative of his or her overall craft would be significantly diminished, thus reducing our appreciation of the production overall. Compelling though it might be, we’d know that we’re still only looking at a greatest hits mash-up of performances, rather than an authentic new work from a respected artist. The practice of resurrecting the dead to act in new films would leave a hole in the overall aesthetic value of a movie, however one might calculate it. As someone who finds works of art central to the health of a society, I cannot find a practice with such consequences ethical.

As an afterthought, we should consider why we feel the need to resurrect the dead for new performances in the first place. Why did Coachella some years back project Tupac on stage? Why create a digital mask of a relatively minor, albeit memorable Star Wars character?

In my opinion, the answer is simple: nostalgia.

Nostalgia is so powerful that it's inspired an entire modern movement of looking backward in film and television. Think of the numerous reboots we've seen in mainstream film and TV in the past few years. Even mediocre works of the past, like Full House, have enjoyed a resurgence because companies like Netflix knew they could count on nostalgia to make the venture profitable. Disney, in particular, has learned that nostalgia is an incredible salesman for its movies. But nostalgia, as fun as it can be to play around with, is not a substitute for breaking new artistic ground. Films that rely on nostalgia as their primary motive only cause people to look backward, hardening them against the acceptance of change and distorting their perspective on where they've been before. When taken too many at a time, nostalgia films are bad for you. Seeing as the resurrection of the dead serves primarily this purpose, I'm forced to question the morality of the practice further.

 

A Case for Rogue One as the Best Star Wars Film Since Return of the Jedi

 


WARNING: SPOILERS AHEAD!

I am not an easy Star Wars fan to please. Unlike some, I do not consider any of the prequels worthy of the franchise title, including Revenge of the Sith with its occasional worthwhile moments. Although I liked The Force Awakens overall, I was the only one of my friends to express more-than-minor reservations about its copious recycling of plot devices and, at times, heavy-handed fan service. The flaws still sit in my mind as a disqualifying blemish on what could have been the best Star Wars ever, given the creative talents, fresh vision, and genuinely moving writing the project blended together. However, watching The Force Awakens and dealing with its imperfections taught me something about going to see new Star Wars movies in general. It taught me to lighten up. It turns out, when I stopped seeing newer Star Wars films in the shadow of my favorite in the franchise, The Empire Strikes Back, they became far more enjoyable. Crazy, right?

I went to Rogue One on opening night with this philosophy in mind. Predictably, I enjoyed it, but not in the same way I enjoyed The Force Awakens. With Episode VII, my enjoyment was one of dismissal, of seeing around problems in the film to get at its notable beauty. Expecting the same from Rogue One, I was surprised to discover there was very little dismissing necessary. The movie was surprisingly close to my model of the perfect Star Wars. It was a departure from the plot points of earlier installments, for one. It was also dark in subject and style. It featured new and compelling characters (Chirrut Imwe is one of my new favorite characters, a fitting balance to the old-school Yoda of Empire). It even provided new insight into the already established story of A New Hope and the overall Star Wars universe. To me, Rogue One emulated The Empire Strikes Back in multiple deep, important ways without resorting to blind imitation of its plot points. The resulting film gave me hope that Disney is, in fact, attuned to what matters not only to the casual, I-just-want-to-see-spaceships-go-boom fans, but also to the hardcore fans who love the story.

Rogue One even showed a kind of visual taste notably absent in the more egregious failures of the franchise. In one of my favorite moments of The Empire Strikes Back, Captain Piett walks in on Darth Vader's meditation session and catches a brief glimpse of the ugliness underneath the mask. For original audiences, the already dark material presented in the film's opening act veers into the realm of horror here. The audience becomes privy to the idea that there is more to the story than what we see on the surface, foreshadowing later revelations about the Empire and the ugliness of Luke's parentage.


Rogue One had a moment so similar, it may have been intended as a kind of homage. When an attendant comes to inform Vader of Director Krennic's arrival, the audience gets a similar peek at Vader in a cloudy bacta tank.


Even though we've already seen Vader's deformity explicitly in Revenge of the Sith, the return to mystery in this scene is welcome, as it also represents a return to visual artistry. One of the overarching criticisms expressed by Star Wars purists about the prequels concerns the emphasis on visual vividness over story. Sure, the prequels have lightsabers and spaceships, but by and large, the movies just don't look like Star Wars movies. Even worse, they seem to replace compelling story material with their overt use of computer effects. By showing us only Vader's shadow in the vat, Disney rejects George Lucas' later fixation on visual scintillation, recognizing once again what Lucas once discovered accidentally: sometimes the best way to engage your audience is to let their imaginations do some work, rather than serving up your saga on a crystal-clear CGI platter. Of course, Rogue One involves a bit of less-than-tactful CGI (COUGH zombie Tarkin COUGH), but it doesn't replace the story. If you find it unsettling, it is soon over, and you can get back to enjoying the film.

Rogue One certainly has its flaws. I wasn't amused by the film's occasional moments of fan service, for instance. Still, its flaws presented themselves only momentarily and then flitted away, unlike the imperfections in The Force Awakens, which were ever-present due to their structural nature. Rogue One easily trumps the prequels as well, thanks to its return to a more conservative artistic vision, competent script, and compelling characters. Given these considerations, Disney's latest Star Wars offering starts to look like the best post-Jedi installment in the franchise, at least from a certain point of view.

I suspect this idea will be hard to swallow for many fans. After all, Star Wars is first and foremost a saga, the story of a family across generations. To rank a story that is technically not part of that saga fourth best in the series seems blasphemous. But as all true fans know, the universe can be as captivating as the main story. The idea that a film about events outside of the saga, a film which breaks away from Star Wars tradition in several ways, can compete with most installments in the overarching narrative perhaps suggests that there is something freeing about taking new risks with old material. It is a lesson that filmmakers may well heed as they continue on the saga with Episodes VIII and IX.

 

On Teaching in the Fake News Age

Stanford University recently published a study of middle school to college age students with some shocking findings. As the authors noted, large swaths of students were unable to distinguish credible news sources from fake and sponsored news sources online. The study, given to students in twelve states and in schools of varying affluence, goes in depth to detail some of the astonishingly poor reasoning the kids use in asserting that highly questionable sources like sponsored ads, tweets from interest groups, and even photos with absolutely no attribution are credible and therefore fit to be used in forming opinions.

In recent months, fake news has been treated in the media as a new issue suddenly bubbling up from the dark underbelly of the internet. While I reject the idea that it is a new concept, I do believe that it has grown in volume and in resemblance to actual news with the advent of the internet, and more specifically, social media. Social media is something of a wild frontier; it has existed just long enough for people to become wise to its power, but it hasn’t been around long enough for government to get too involved in it. Online, everyone can play the demagogue. What we say garners the attention of our followers. The more radical our statements, the more attention (good and bad) we get, and the more attention we get, the more visibility our posts earn. To add to this, we can say virtually whatever we want. No one filters us, or forces us to support our claims. We are allowed to speak our minds in the form of social media posts, blogs, and personal websites, and we’re free to give these posts as much visible similarity to actual news as we want.

And people should be able to do that. All of it. I’ll admit, I signed that Change.org petition to get Mark Zuckerberg to take action against fake news on Facebook. I also immediately regretted it. When a nation elects a demagogue thanks, in part, to misinformation, there is a problem to address. However, we must be cautious in how we address it. Simply not allowing less than truthful claims to become public is a form of censorship, which is an idea that should not be turned to in haste or panic. I believe we need to look elsewhere for a solution, which is where Stanford’s recent study comes in.

The students who participated in the study are just as susceptible to fake news and post-truthism as adults, if not more so. The difference is that students are more malleable than adults. Their worldview is still forming. While the adults in their lives will no doubt play a large part in the formation of their opinions, they have less rigid schemas at stake than a 40-year-old die-hard Republican or Democrat. Students who are unable to tell the difference between real and fake news represent a clear deficiency in our curriculum, but also our brightest hope for correcting this issue.

I teach 7th grade English in upstate New York, so upon reading about Stanford's study, I pored over the Common Core Learning Standards to see what they said on the topic. I had some vague memory of bias being mentioned in the standards, and I was certain I'd discover that there are standards devoted to preventing the bleak scenario described by the Stanford researchers. What I discovered did less to ease my concern than I'd hoped.

I found two seventh grade standards which, when taken together, serve to bolster students' defenses against misleading articles. Informational Text Standard 8 requires kids to "Trace and evaluate the argument and specific claims in a text, assessing whether the reasoning is sound and the evidence is relevant and sufficient to support the claims." A teacher who addresses this standard thoroughly might teach his or her kids to discredit articles based on a lack of substantial evidence for their claims. However, through this process, the students are never taught to consider the agenda or allegiances of the online source from which the article comes. Standard 9 asks that students "Analyze how two or more authors writing about the same topic shape their presentations of key information by emphasizing different evidence or advancing different interpretations of facts." This can be used to detect bias in an article, which is important, but again, does not require that students evaluate the credibility of the larger source. Dismayed, I expanded my search to include all informational reading standards for grades 6-12, and discovered that while standards on evaluating arguments based on reason grow in rigor across the years, the same lack of source consideration remains. Furthermore, these standards do nothing to combat the larger issue of fake news, which is the fabrication and gross recontextualization of events which are then presented as facts in order to induce a visceral reaction. When you make up facts, it's easy to use perfectly fine logic to come to any conclusion you wish.

I considered the idea that the standards I sought might appear elsewhere in the NYS curriculum, so I turned to the standards for Social Studies. At the 7th grade level, the Social Studies standards don’t even mention the internet as a medium for receiving information, focusing entirely on historical content. I am willing to entertain the idea that these standards exist somewhere, but if it’s a struggle even to locate them, I can’t imagine they are taught well in our schools. I encourage anyone who can locate standards relating to this issue in any content area to contact me.

In short, there is a gigantic, gaping hole in the Common Core Learning Standards where students should be taught to consider their sources, particularly digital ones. The standards stop just short of this, helping students to dispel individual arguments (provided that they suffer from flaws in logic and reasoning), but not the publications they come from. This deficit is almost certainly a reflection of who wrote the standards: people who did not grow up with the internet, and thus did not entirely realize its alluring pull toward impulsive, wild thinking. Thus, I am calling on my fellow teachers to redouble our efforts on this issue. We need to teach our students how to evaluate their sources, and we need to teach it this academic year. The next four years have the potential to wreak havoc on our world socially and environmentally. We must do all we can to ensure our youth are equipped to handle the sea of information that will make up their daily lives if we are to fight for our continued prosperity and well-being.

I implore lawmakers to work as quickly as possible to prepare students for the future in this respect. Federal lawmakers should continue to push for a nationwide curriculum to address this issue. In the event that federal measures fail, state legislators should ensure they are ready to pick up the slack. In New York, standards are currently under revision. Plans to present the new standards to the Board of Regents are set for early 2017, at which point there may be time for public comment, according to NYSED.gov. At that time, I encourage everyone to voice concerns over this lapse, which the proposed revisions still fail to address.

In my admittedly short career as an educator, never before has the answer to the age-old question, "Why do I need to know this?" been so urgent and tangible. Why? Because look at the news, that's why. These events serve as a sobering reminder of why quality public education is the foundation of a healthy democracy. We cannot afford to let our educational institutions trail behind the times so severely. Teachers, administrators, and lawmakers: the time to look ahead is now.

 

 

 

Aliens Probably Exist(ed)?!


In my commitment to actual news post-election, I subscribed to The New York Times. I use it as a kind of methadone for my social media addiction, and so I often find myself perusing it on my phone when I feel that itch for a digital dopamine fix. This past weekend, I came across a very interesting, very credible opinion piece from June. Before proceeding, you should read it in its entirety. I promise it's worth it.

The article asserts that, thanks to technological advancements in astronomy, we are now able to say that we are almost certainly not the only civilized species to have existed in cosmic history. Yes, there is some speculation involved, and no, it doesn’t speak to whether or not we are currently alone, but it is definitely still cause for reflection.

Much like one of my favorite TV FBI agents, I want to believe. I always have, to a certain extent. Maybe you could call it more of a faith than a conviction. I'm no conspiracy theorist, but we have known for long enough that we are just a tiny speck in the universe, and that anything could be going on out there. To many of us, it seemed not only plausible, but likely that intelligent life existed elsewhere in the universe. Still, it was all conjecture. This is actual, numerical calculation. As the article states, we no longer have to reduce the matter to pessimism or optimism. We don't have to subscribe to crazy conspiracy theories. Intelligent life outside Earth has almost certainly existed.

The implications of this idea are staggering. Mankind, particularly those of us who have predominantly subscribed to Western thought, has operated for all but the last few moments of its existence under an assumed self-importance based on the notion that there are no other species quite like us. We've just been handed math to strongly suggest that that assumption is bullshit. The ground underneath our perception of the universe has been ripped out like a tablecloth from under plates and glasses, and somehow, bizarrely, nothing has fallen off the table. Nobody is talking about this. Everyone is carrying on as normal.

Perhaps that is what blows my mind the most. This piece, published in June, only just now crossed my path. I regularly read about space, science, and science fiction. The internet knows I like this shit. How have I only just now seen this? How has nobody been talking about this?  It isn’t as dramatic as a bunch of flying saucers landing in DC, granted. It’s also not as real, ominous, and imminent as many current events. But it’s still a radical shift in our understanding of our own place in the universe. This is a discovery that has the potential to seriously reconfigure people’s philosophies, to alter the undercurrents that influence our decisions every day, if  we only stop to consider its implications. How is this not bigger news?

 

 

 

 

 

Revisiting The Blair Witch Project

 


Let's talk about found footage horror for a moment. It's a subgenre that has enjoyed a considerable heyday over the past two decades or so. Launched somewhat by the cult favorite Cannibal Holocaust, and more so by the release of 1999's The Blair Witch Project, found footage went on to see thousands of releases throughout the 2000s and 2010s. It became the popcorn flick of the horror genre, the role previously filled by slashers in the '80s and '90s. A quick examination of the finances makes it obvious why found footage horror became notable. The Blair Witch Project was made for $60,000 and returned a comparatively incredible $1.5 million in its opening weekend (not to mention its $140 million lifetime gross, according to boxofficemojo.com). In its opening weekend, The Blair Witch Project returned about 2,500% of its budget. It's easy to understand why found footage horror gained the attention of the Hollywood money machine, and why, for a good minute there, it had such a hell of a run. The Blair Witch Project remake was released recently, marking a milestone for the genre, so I'd like to take a moment to return to this ever-divisive film for a new critical look.

I first saw The Blair Witch Project when it came out on VHS. I was 11 years old. It inspired me, as it did many others, to grab my dad's camera and make a parody film, beginning a hobby that I would pursue for the next decade. I found the movie laughable at that age. What could possibly be scary about a bunch of idiots screwing themselves over in the forest? I cringed at the entire middle act of the film, which was essentially a half-hour-long screaming match. The impression colored my idea of the found footage genre in the ensuing decade and beyond. Upon further review, I will admit that my critical faculties at age 11 might not have been as sharp as I thought. Actually, I feel prepared to say that of the many found footage films I've seen, The Blair Witch Project is probably the most artfully done.

The film, by necessity, is probably the most conservative horror film I’ve ever seen in terms of actual screen time it dedicates to its monster. With virtually no budget, it’s easy to see why this is the case. The witch (or what-have-you) is left 100% to the viewer’s imagination, which is a stark contrast to many of the found footage films The Blair Witch Project inspired. Leaving the monster to the audience’s imagination is a hallmark of many beloved classic horror films, and allows the viewer to appreciate the film’s use of atmosphere, which requires much more subtlety of a film crew.

The Blair Witch Project is actually a very patient depiction of seeping panic, and how it can cause a group of perfectly decent people to behave monstrously. Although the woods are (maybe) stalked by some unseen evil, what ultimately undoes our protagonists is distrust and betrayal. Mike kicks the group's map, their only guidance, into a creek because it is "useless," an expression of frustration at Heather's inability to navigate. As tensions set in, they all begin to subscribe to the idea of Heather's, and then each other's, incompetence. Sure, the arguing and bickering gets tiresome, and as they get more agitated, the camera work becomes nauseating, but it's a pretty realistic, convincing depiction of a frightening idea: Just below the surface of each and every one of us, there is a panicked half-wit waiting to emerge when enough goes wrong.

The film even deals relatively well with a fundamental problem all found footage movies must tackle, and it's something that has always bothered me about the found footage premise: Why, when faced with life-threatening scenarios, do people continue to film, rather than devote the whole of their energy to survival? The Blair Witch Project is rife with conflict over the continued filming. One of the film's major conflicts is, paradoxically, the film's very existence in the first place. The fact that Heather keeps the cameras rolling at times of stress is a major factor in the fallout and ultimate death of our protagonists. Heather's dedication to her craft answers the question that so often goes entirely unanswered in found footage, and even elevates the film to a level of postmodern irony. What, after all, is more horrific? The fact that these terrible things happened to a bunch of students, or the irony that in trying to share their experience with the world, these same students caused those terrible things to happen to them?

The Blair Witch Project, for all the mainstream attention it garnered, is a surprisingly deep work of fiction. Is it perfect? Of course not. But set against the backdrop of the entire found footage movement, it sets itself apart as an experience and a work of art.

On Thematic Ties between the 2016 Election and Black Mirror

I’ve been doing a lot of two things recently: watching Black Mirror, and reading the news. Both leave me with the same kind of feeling. I’ve decided that feeling comes because both sources are hitting on a common theme at the moment, in that bizarre netherworld where art and life cease imitating each other and become the same thing.

In Black Mirror's notorious first episode, "The National Anthem," the British Prime Minister is spurred on by a kidnapper and social media to publicly have sex with a pig. As the PM humiliates himself on national TV, the viewer sees only the increasingly sickened reactions of the masses watching on, paradoxically powerless to look away and yet the movers of the entire travesty. If it weren't for the people who gave audience to the kidnapper's actions, he wouldn't have a platform on which to terrorize others. In this moment we become aware of who the true villains of the episode are: the people who failed to realize, through massive diffusion of responsibility, their own culpability in the horror they now observe. The whole thing points to what is perhaps the greatest theme of the show: the idea that with new technology, everyday people must adopt new definitions of responsibility, or suffer terrible consequences.

As frightening as it is to think about the direct ties between an episode of Black Mirror and real, modern life, we mustn't ignore them. One exists in the recent spotlight cast on fake and heavily spun news, widely consumed through social media. In 2016, more than half of adult voters received their news through social media, myself included. Of course, not all news on social media is false. However, even if we like to think of ourselves as critical news consumers, we cannot excuse our roles as partial perpetrators in our own misinformation this election. If you are like me, you were probably lured into a false sense of security by election predictors which forecast a tremendous likelihood of a Clinton win. I also don't doubt that more than a few of the articles in which I partook were more slanted than I wanted to admit at the time, granting liberties to favored parties, and likely taking things said and done by right-wing politicians out of context. Surfing Facebook, I have seen a lot of information shared by Trump supporters which decontextualizes and reimagines facts to the point of falsehood. The New York Times recently published a case study that highlights the avenues by which some of these pieces become overblown. In mindlessly consuming and sharing these materials, we are guilty of the same sin as those thoughtless masses in "The National Anthem." We have acted on impulse, and allowed these pieces, with their hyperbolized, exciting titles, to manipulate us. We have given our racist uncle's blog the same visibility (and therefore, in the minds of some, credibility) as a long-standing, established publication. Now, as we look on at the travesty we've created, we feel just a little dirty, but we aren't sure why.

The greatest lesson we need to learn from this is that social media, and Facebook in particular, is an unreliable source. Even when a credible article appears in our feeds, we must remember that it has been picked for us based on what Facebook knows of our beliefs. It will inflate our own feelings of self-righteousness, which can blind us to facts that might nuance or change our views in the future. Always seek out information for yourself before posting or sharing anything. In the post-truth world, we have a personal responsibility to fight the spread of falsehoods through this new medium.

In Black Mirror episodes, the characters come to grips with the flaws of their technology only after it has become integral to their society, beyond the point where warnings are useful. In this way, I hope my analogy to the show will fail. Our response to these events will shape how people interact with these technologies in the generations to come. The internet can be intoxicating, and we are teenagers waking up to behold the carnage of our first rager. Here’s hoping we learn from it.