A Case for Rogue One as the Best Star Wars Film Since Return of the Jedi




I am not an easy Star Wars fan to please. Unlike some, I do not consider any of the prequels worthy of the franchise title, including Revenge of the Sith with its occasional worthwhile moments. Although I liked The Force Awakens overall, I was the only one of my friends to express more-than-minor reservations about its copious recycling of plot devices and, at times, heavy-handed fan service. The flaws still sit in my mind as a disqualifying blemish on what could have been the best Star Wars ever, given the creative talents, fresh vision, and genuinely moving writing the project blended together. However, watching The Force Awakens and dealing with its imperfections taught me something about seeing new Star Wars movies in general. It taught me to lighten up. It turns out that when I stop viewing newer Star Wars films in the shadow of my favorite in the franchise, The Empire Strikes Back, they are far more enjoyable. Crazy, right?

I went to Rogue One on opening night with this philosophy in mind. Predictably, I enjoyed it, but not in the same way I enjoyed The Force Awakens. With Episode VII, my enjoyment was one of dismissal, of seeing around problems in the film to get at its notable beauty. Expecting the same from Rogue One, I was surprised to discover there was very little dismissing necessary. The movie was surprisingly close to my model of the perfect Star Wars. It was a departure from the plot points of earlier installments, for one. It was also dark in subject and style. It featured new and compelling characters (Chirrut Imwe is one of my new favorites, a fitting counterbalance to the old-school Yoda of Empire). It even provided new insight into the already established story of A New Hope and the overall Star Wars universe. To me, Rogue One emulated The Empire Strikes Back in multiple deep, important ways without resorting to blind imitation of Empire's plot points. The resulting film gave me hope that Disney is, in fact, attuned to what matters not only to the casual, I-just-want-to-see-spaceships-go-boom fans, but to the hardcores who love the story as well.

Rogue One even showed a kind of visual taste notably absent in the more egregious failures of the franchise. In one of my favorite moments of The Empire Strikes Back, Admiral Piett walks in on Darth Vader's meditation session and catches a brief glimpse of the ugliness underneath the mask. For original audiences, the already dark material of the film's opening act veers into the realm of horror here. The audience becomes privy to the idea that there is more to the story than what we see on the surface, foreshadowing later revelations about the Empire and the ugliness of Luke's parentage.


Rogue One had a moment so similar, it may have been intended as a kind of homage. When an attendant comes to inform Vader of Director Krennic's arrival, the audience gets a similar peek at Vader in a cloudy bacta tank.

Even though we've already seen Vader's deformity explicitly in Revenge of the Sith, the return to mystery in this scene is welcome, as it also represents a return to visual artistry. One of the overarching criticisms Star Wars purists level at the prequels concerns their emphasis on visual vividness over story. Sure, the prequels have lightsabers and spaceships, but by and large, the movies just don't look like Star Wars movies. Even worse, they seem to replace compelling story material with overt computer effects. By showing us only Vader's shadow in the vat, Disney rejects George Lucas' later fixation on visual scintillation, recognizing once again what Lucas once discovered accidentally: sometimes the best way to engage your audience is to let their imaginations do some work, rather than serving up your saga on a crystal clear CGI platter. Of course, Rogue One involves a bit of less-than-tactful CGI (COUGH zombie Tarkin COUGH), but it doesn't replace the story. If you find it unsettling, it is soon over, and you can get back to enjoying the film.

Rogue One certainly has its flaws. I wasn't amused by the film's occasional moments of fan service, for instance. Still, its flaws presented themselves only momentarily and then flitted away, unlike the imperfections in The Force Awakens, which were ever present due to their structural nature. Rogue One easily trumps the prequels as well, thanks to its return to a more conservative artistic vision, a competent script, and compelling characters. Given these considerations, Disney's latest Star Wars offering starts to look like the best post-Jedi installment in the franchise, at least from a certain point of view.

I suspect this idea will be hard for many fans to swallow. After all, Star Wars is first and foremost a saga, the story of a family across generations. To rank a story that is technically not part of that saga fourth best in the series seems blasphemous. But as all true fans know, the universe can be as captivating as the main story. The idea that a film about events outside of the saga, a film which breaks away from Star Wars tradition in several ways, can compete with most installments in the overarching narrative suggests that there is something freeing about taking new risks with old material. It is a lesson that filmmakers would do well to heed as they continue the saga with Episodes VIII and IX.



On Teaching in the Fake News Age

Stanford University recently published a study of middle-school- to college-age students with some shocking findings. As the authors noted, large swaths of students were unable to distinguish credible news sources from fake and sponsored news sources online. The study, given to students in twelve states and from schools of varying affluence, goes in depth to detail some of the astonishingly poor reasoning the kids use in asserting that highly questionable sources, like sponsored ads, tweets from interest groups, and even photos with no attribution whatsoever, are credible and therefore fit to be used in forming opinions.

In recent months, fake news has been treated in the media as a new issue suddenly bubbling up from the dark underbelly of the internet. While I reject the idea that it is a new concept, I do believe that it has grown in volume and in resemblance to actual news with the advent of the internet, and more specifically, social media. Social media is something of a wild frontier; it has existed just long enough for people to become wise to its power, but it hasn’t been around long enough for government to get too involved in it. Online, everyone can play the demagogue. What we say garners the attention of our followers. The more radical our statements, the more attention (good and bad) we get, and the more attention we get, the more visibility our posts earn. To add to this, we can say virtually whatever we want. No one filters us, or forces us to support our claims. We are allowed to speak our minds in the form of social media posts, blogs, and personal websites, and we’re free to give these posts as much visible similarity to actual news as we want.

And people should be able to do that. All of it. I'll admit, I signed that Change.org petition urging Mark Zuckerberg to take action against fake news on Facebook. I also immediately regretted it. When a nation elects a demagogue thanks, in part, to misinformation, there is a problem to address. However, we must be cautious in how we address it. Simply forbidding less-than-truthful claims from becoming public is a form of censorship, an idea that should not be turned to in haste or panic. I believe we need to look elsewhere for a solution, which is where Stanford's recent study comes in.

The students who participated in the study are just as susceptible to fake news and post-truthism as adults, if not more so. The difference is that students are more malleable than adults. Their worldview is still forming. While the adults in their lives will no doubt play a large part in the formation of their opinions, they have less rigid schemas at stake than a 40-year-old die-hard Republican or Democrat. Students who are unable to tell the difference between real and fake news represent a clear deficiency in our curriculum, but they are also our brightest hope for correcting it.

I teach 7th grade English in upstate New York, so upon reading about Stanford's study, I pored over the Common Core Learning Standards to see what they said on the topic. I had some vague memory of bias being mentioned in the standards, and I was certain I'd discover standards devoted to preventing the bleak scenario described by the Stanford researchers. What I discovered did less to ease my concern than I'd hoped.

I found two seventh grade standards which, when taken together, serve to bolster students' defenses against misleading articles. Informational Text Standard 8 requires kids to "Trace and evaluate the argument and specific claims in a text, assessing whether the reasoning is sound and the evidence is relevant and sufficient to support the claims." A teacher who addresses this standard thoroughly might teach his or her kids to discredit an article based on a lack of substantial evidence for its claims. However, through this process, the students are never taught to consider the agenda or allegiances of the online source from which the article comes. Standard 9 asks that students "Analyze how two or more authors writing about the same topic shape their presentations of key information by emphasizing different evidence or advancing different interpretations of facts." This can be used to detect bias in an article, which is important, but again, it does not require that students evaluate the credibility of the larger source. Dismayed, I expanded my search to include all informational reading standards for grades 6-12, and discovered that while the standards on evaluating arguments grow in rigor across the years, the same lack of source consideration remains. Furthermore, these standards do nothing to combat the larger issue of fake news: the fabrication and gross recontextualization of events, which are then presented as facts in order to induce a visceral reaction. When you make up facts, it's easy to use perfectly sound logic to reach any conclusion you wish.

I considered the idea that the standards I sought might appear elsewhere in the NYS curriculum, so I turned to the standards for Social Studies. At the 7th grade level, the Social Studies standards don’t even mention the internet as a medium for receiving information, focusing entirely on historical content. I am willing to entertain the idea that these standards exist somewhere, but if it’s a struggle even to locate them, I can’t imagine they are taught well in our schools. I encourage anyone who can locate standards relating to this issue in any content area to contact me.

In short, there is a gigantic, gaping hole in the Common Core Learning Standards where students should be taught to consider their sources, particularly digital ones. The standards stop just short of this, helping students to dispel individual arguments (provided that they suffer flaws in logic and reasoning), but not the publications those arguments come from. This deficit is almost certainly a reflection of who wrote the standards: people who did not grow up with the internet, and thus did not fully grasp its alluring pull toward impulsive, wild thinking. Thus, I am calling on my fellow teachers to redouble our efforts on this issue. We need to teach our students how to evaluate their sources, and we need to teach it this academic year. The next four years have the potential to wreak havoc on our world socially and environmentally. We must do all we can to ensure our youth are equipped to handle the sea of information that will make up their daily lives if we are to fight for our continued prosperity and well-being.

I implore lawmakers to work as quickly as possible to prepare students for the future in this respect. Federal lawmakers should continue to push for a nationwide curriculum that addresses this issue. In the event that federal measures fail, state legislators should be ready to pick up the slack. In New York, the standards are currently under revision. Plans to present the new standards to the Board of Regents are set for early 2017, at which time there may be an opportunity for public comment, according to NYSED.gov. I encourage everyone to voice concerns then over this lapse, which the proposed revisions still fail to address.

In my admittedly short career as an educator, never before has the answer to the age old question, “Why do I need to know this?” been so urgent and tangible. Why? Because look at the news, that’s why. These events serve as a sobering reminder why quality public education is the foundation of a healthy democracy. We cannot afford to let our educational institutions trail behind the times so severely. Teachers, administrators, and lawmakers: the time to look ahead is now.




Aliens Probably Exist(ed)?!


In my commitment to actual news post-election, I subscribed to The New York Times. I use it as a kind of methadone for my social media addiction, and so I often find myself perusing it on my phone when I feel that itch for a digital dopamine fix. This past weekend, I came across a very interesting, very credible opinion piece from June. Before proceeding, you should read it in its entirety. I promise it's worth it.

The article asserts that, thanks to technological advancements in astronomy, we are now able to say that we are almost certainly not the only civilized species to have existed in cosmic history. Yes, there is some speculation involved, and no, it doesn’t speak to whether or not we are currently alone, but it is definitely still cause for reflection.

Much like one of my favorite TV FBI agents, I want to believe. I always have, to a certain extent. Maybe you could call it more of a faith than a conviction. I'm no conspiracy theorist, but we have known for long enough that we are just a tiny speck in the universe, and that anything could be going on out there. To many of us, it seemed not only plausible but likely that intelligent life existed elsewhere in the universe. Still, it was all conjecture. This is actual, numerical calculation. As the article states, we no longer have to reduce the matter to pessimism or optimism. We don't have to subscribe to crazy conspiracy theories. Intelligent life outside Earth has almost certainly existed.
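At its core, the op-ed's argument is simple expected-value arithmetic. The specific numbers below (roughly 2×10^22 potentially habitable planets, and the resulting threshold probability) are my own paraphrase of published estimates, not figures quoted in this post, so treat this as an illustrative sketch rather than the article's exact math:

```python
# Assumed figure (not from this post): exoplanet surveys suggest on the
# order of 2e22 potentially habitable planets in the observable universe.
habitable_planets = 2e22

def expected_civilizations(p: float) -> float:
    """Expected number of technological civilizations ever, if each
    habitable planet independently produces one with probability p."""
    return habitable_planets * p

# Even an absurdly pessimistic per-planet probability still yields
# tens of millions of civilizations across cosmic history.
print(expected_civilizations(1e-15))

# Humanity is plausibly the first only if p falls below roughly
# 1 / habitable_planets -- about 5e-23, a staggeringly tiny number.
pessimism_line = 1 / habitable_planets
print(f"{pessimism_line:.1e}")
```

The point of the sketch is that the burden of proof flips: you would need an almost inconceivably small per-planet probability for Earth to be the only planet that ever produced a civilization.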

The implications of this idea are staggering. Mankind, particularly those of us steeped in Western thought, has operated for all but the last few moments of its existence under an assumed self-importance based on the notion that there are no other species quite like us. We've just been handed math that strongly suggests that assumption is bullshit. The ground underneath our perception of the universe has been ripped out like a tablecloth from under plates and glasses, and somehow, bizarrely, nothing has fallen off the table. Nobody is talking about this. Everyone is carrying on as normal.

Perhaps that is what blows my mind the most. This piece, published in June, only just now crossed my path. I regularly read about space, science, and science fiction. The internet knows I like this shit. How have I only just now seen this? How has nobody been talking about this? It isn't as dramatic as a bunch of flying saucers landing in DC, granted. It's also not as real, ominous, and imminent as many current events. But it's still a radical shift in our understanding of our own place in the universe. This is a discovery that has the potential to seriously reconfigure people's philosophies, to alter the undercurrents that influence our decisions every day, if we only stop to consider its implications. How is this not bigger news?






Revisiting The Blair Witch Project



Let's talk about found footage horror for a moment. It's a subgenre that has enjoyed a considerable heyday over the past two decades or so. Launched somewhat by the cult favorite Cannibal Holocaust, and more so by the release of 1999's The Blair Witch Project, found footage went on to see thousands of releases throughout the 2000s and 2010s. It became the popcorn flick of the horror genre, the role previously filled by slashers in the 80s and 90s. Given a quick examination of the finances, it's no wonder found footage horror became notable. The Blair Witch Project was made for $60,000 and returned a comparatively incredible $1.5 million in its opening weekend (not to mention its $140 million lifetime gross, according to boxofficemojo.com). In its opening weekend alone, The Blair Witch Project returned about 2,500% of its budget. It's easy to understand why found footage horror gained the attention of the Hollywood money machine, and why, for a good minute there, it had such a hell of a run. The Blair Witch Project remake was released recently, marking a milestone for the genre, and so I'd like to take a moment to return to this ever divisive film for a new critical look.
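Those return figures are easy to sanity-check with a few lines of Python, using the budget and gross numbers cited above from boxofficemojo.com:

```python
# Budget and gross figures for The Blair Witch Project (1999),
# as reported by boxofficemojo.com.
budget = 60_000
opening_weekend_gross = 1_500_000
lifetime_gross = 140_000_000

# Opening-weekend return as a percentage of the production budget.
opening_return_pct = opening_weekend_gross / budget * 100
print(f"Opening weekend return: {opening_return_pct:.0f}% of budget")  # 2500%

# Lifetime multiple on the original budget: over 2,300x.
print(f"Lifetime gross: {lifetime_gross / budget:.0f}x budget")
```

A 25x return in a single weekend, before the long tail of the theatrical run, is the kind of math that gets a subgenre greenlit for two decades.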

I first saw The Blair Witch Project when it came out on VHS. I was 11 years old. It inspired me, as it did many others, to grab my dad's camera and make a parody film, beginning a hobby that I would pursue for the next decade. I found the movie laughable at that age. What could possibly be scary about a bunch of idiots screwing themselves over in the forest? I cringed at the entire middle act of the film, which was essentially a half-hour-long screaming match. The impression colored my idea of the found footage genre for the ensuing decade and beyond. Upon further review, I will admit that my critical faculties at age 11 might not have been as sharp as I thought. Actually, I feel prepared to say that, of the many found footage films I've seen, The Blair Witch Project is probably the most artfully done.

The film, by necessity, is probably the most conservative horror film I’ve ever seen in terms of actual screen time it dedicates to its monster. With virtually no budget, it’s easy to see why this is the case. The witch (or what-have-you) is left 100% to the viewer’s imagination, which is a stark contrast to many of the found footage films The Blair Witch Project inspired. Leaving the monster to the audience’s imagination is a hallmark of many beloved classic horror films, and allows the viewer to appreciate the film’s use of atmosphere, which requires much more subtlety of a film crew.

The Blair Witch Project is actually a very patient depiction of seeping panic, and how it can cause a group of perfectly decent people to behave monstrously. Although the woods are (maybe) stalked by some unseen evil, what ultimately undoes our protagonists is distrust and betrayal. Mike kicks the group’s only guidance into a creek because it is “useless,” an expression of frustration at Heather’s inability to navigate. As tensions set in, they all begin to subscribe to the idea of Heather’s, and then each other’s, incompetence. Sure, the arguing and bickering gets tiresome, and as they get more agitated, the camera work becomes nauseating, but it’s a pretty realistic, convincing depiction of a frightening idea: Just below the surface of each and every one of us, there is a panicked half-wit waiting to emerge when enough goes wrong.

The film even deals relatively well with a fundamental problem all found footage movies must tackle, one that has always bothered me about the premise: why, when faced with life-threatening scenarios, do people continue to film, rather than devote the whole of their energy to survival? The Blair Witch Project is rife with conflict over the continued filming throughout. One of the film's major conflicts is, paradoxically, the film's very existence in the first place. The fact that Heather keeps the cameras rolling at times of stress is a major factor in the fallout and ultimate death of our protagonists. Heather's dedication to her craft serves to answer the question that often goes entirely unanswered in found footage, and even elevates the film to a level of postmodern irony. What, after all, is more horrific? The fact that these terrible things happened to a bunch of students, or the irony that in trying to share their experience with the world, these same students caused those terrible things to happen to themselves?

The Blair Witch Project, for all the mainstream attention it garnered, is a surprisingly deep work of fiction. Is it perfect? Of course not. But set against the backdrop of the entire found footage movement, it sets itself apart as an experience and a work of art.

On Thematic Ties between the 2016 Election and Black Mirror

I’ve been doing a lot of two things recently: watching Black Mirror, and reading the news. Both leave me with the same kind of feeling. I’ve decided that feeling comes because both sources are hitting on a common theme at the moment, in that bizarre netherworld where art and life cease imitating each other and become the same thing.

In Black Mirror's notorious first episode, "The National Anthem," the British Prime Minister is spurred on by a kidnapper and social media to publicly have sex with a pig. As the PM humiliates himself on national TV, the viewer sees only the increasingly sickened reactions of the masses watching on, paradoxically powerless to look away and yet the movers of the entire travesty. If it weren't for the people who gave audience to the kidnapper's actions, he wouldn't have a platform on which to terrorize others. In this moment, we become aware of who the true villains of the episode are: the people who failed to realize, through a massive diffusion of responsibility, their own culpability in the horror they now observe. The whole thing points to what is perhaps the greatest theme of the show: the idea that with new technology, everyday people must adopt new definitions of responsibility, or suffer terrible consequences.

As frightening as it is to think about the direct ties between an episode of Black Mirror and real, modern life, we mustn't ignore them. One exists in the recent spotlight cast on fake and heavily spun news, widely consumed through social media. In 2016, more than half of adult voters received their news through social media, myself included. Of course, not all news on social media is false. However, even if we like to think of ourselves as critical news consumers, we cannot excuse our roles as partial perpetrators of our own misinformation this election. If you are like me, you were probably lulled into a false sense of security by election predictors that forecast a tremendous likelihood of a Clinton win. I also don't doubt that more than a few of the articles I consumed were more slanted than I wanted to admit at the time, granting liberties to favored parties and likely taking things said and done by right-wing politicians out of context. Surfing Facebook, I have seen a lot of information shared by Trump supporters which decontextualizes and reimagines facts to the point of falsehood. The New York Times recently published a case study that highlights the avenues by which some of these pieces become overblown. In mindlessly consuming and sharing these materials, we are guilty of the same sin as those thoughtless masses in "The National Anthem." We have acted on impulse and allowed these pieces, with their hyperbolized, exciting titles, to manipulate us. We have given our racist uncle's blog the same visibility (and therefore, in the minds of some, credibility) as a long-standing, established publication. Now, as we look on at the travesty we've created, we feel just a little dirty, but we aren't sure why.

The greatest lesson we need to learn from this is that social media, and Facebook in particular, is an unreliable source. Even when a credible article appears in our feeds, we must remember that it has been picked for us based on what Facebook knows of our beliefs. It will inflate our own feelings of self-righteousness, which can blind us to facts that might nuance or change our views in the future. Always seek out information for yourself before posting or sharing anything. In the post-truth world, we have a personal responsibility to fight the spread of falsehoods through this new medium.

In Black Mirror episodes, the characters come to grips with the flaws of their technology only after it has become integral to their society, beyond the point where warnings are useful. In this way, I hope my analogy to the show will fail. Our response to these events will shape how people interact with these technologies in the generations to come. The internet can be intoxicating, and we are teenagers waking up to behold the carnage of our first rager. Here’s hoping we learn from it.