Saturday, September 17, 2016

perspectives on how Western art history became its own problem across media, a long-form guess at the ways in which the ideology of Euro-American Romanticism wore itself out

First off, HT to DZ over at Mockingbird for highlighting this first article at Another Week Ends:

DZ spent a paragraph or two on it, and since this is Wenatchee The Hatchet I'm going to write ... let's say about 4,500 words on the subject.  And the subject is how people came to hate poetry, what this can tell us about the history of poetry and poetry criticism, and how this connects to the history of Romantic era ideological commitments to what art is and what art ought to achieve, commitments that festooned art and arts criticism with burdens we still have to deal with in poetry, art history, and also music and musicology. My general proposal is that a common thread in all these things is an unexamined burden brought in by postmillennialist modes of apocalyptic thought, a burden I believe should be rejected. But to get there we first need to survey the ideological troubles with what Romantic era optimistic apocalyptic rhetoric proposed, and for that ... we can finally start off with the article DZ mentioned at Another Week Ends.

Making a poem was never quite as simple as making a table, because it required inspiration and passion, but it did involve studying techniques and following rules. Indeed, the laws of poetry were natural laws, which had been discovered by the Greeks and could be learned from their example. [emphasis added] The English poet Alexander Pope agreed, writing in his “Essay on Criticism”:

Those RULES of old discover’d, not devis’d,
 Are Nature still, but Nature Methodiz’d;
 Nature, like Liberty, is but restrain’d
 By the same Laws which first herself ordain’d.

That was published in 1711, so clearly not much had changed in the previous two millennia. But turn to Percy Shelley’s essay “A Defense of Poetry,” written in 1821, and you will discover that the meaning of the word poetry had undergone a fantastic transformation. Poetry, Shelley says, is “connate with the origin of man,” and “a poet participates in the eternal, the infinite, and the one.” Poetry comprises every creative activity of human nature, including the arts, politics, and science: “The institutors of laws, and the founders of civil society, and the inventors of the arts of life” are all in some sense poets, since they shape reality in the light of their vision. Shelley even speaks of “the poetry in the doctrines of Jesus Christ,” as if Christianity itself were just one enormous poem.

The Romantics, faced with a disenchanted universe, attempted to discover a new source of enchantment in the human imagination, and poetry became a metaphor for that creative, life-enhancing power. [emphasis added] Poetry used to mean poems. Now poems began to seem like just one habitation, and far from the grandest, of the force that is poetry. Naturally, this fateful division between poetry and poems had enormous consequences for the way poems were written. After all, if poetry is ineffable and infinite, there is no reason it should be bound by the mechanical laws of meter and rhyme. In the modern age, poetry became antinomian.

Thus we find Emerson arguing, in his essay “The Poet,” that “it is not metres, but a metre-making argument, that makes a poem,—a thought so passionate and alive, that, like the spirit of a plant or an animal, it has an architecture of its own, and adorns nature with a new thing.” The metaphor of growth cancels out the old metaphor of craft. [emphasis added] For Horace, a poem was something you had to learn how to make, at the expense of great effort. For Keats, “if Poetry comes not as naturally as the Leaves to a tree it had better not come at all.”


For Lerner, as his use of the term the social suggests, that hope is not just individual and spiritual, but collective and political. Poetry is linked, in his vision, to the possibility of a total redemption of human society, of the kind Marxism used to call “the revolution.” In particular, his fusion of aesthetic, political, and spiritual messianism brings to mind the work of Walter Benjamin, the 20th-century German Jewish theorist. Lerner’s previous book, the novel 10:04, was saturated in the Benjaminian concept of redemption: the idea that the world as we know it carries within itself the possibility for transformation. Key to this vision is the idea that salvation will come from within, from a rearrangement of the world, rather than through an external power or a god.

... The Hatred of Poetry is a subtle inquiry into poetry’s discontents, and a moving statement of poetry’s potential. It can also be read, though, as an example of the dead end into which modern poetic theory has been led by its grandiose aspirations. [emphasis added] As long as we focus on what poetry isn’t and can’t be, how can we rediscover what it once was, and might be again?

It's not just in the realm of poetry that we see laments about the burden of 19th century Romantic ideological tropes about what art is and ought to be. If the trouble with poetry may be likened to a theme poetry has historically celebrated, the beauty of a fair maiden, then the trouble is that no real woman can possibly compare to the impossible standard of the manic pixie dream girl, and this has become the at times tacit standard by which poetry is judged by those who judge poetry.

Something similar could be said for film criticism. Things have gotten to a point where film critics can praise a film at least five hours long for its compelling realism and representational approach, while the rest of the movie-going public, not interested in that sort of film-as-compelling-art trope, wants to go to a movie that does not replicate their day job.  If what the world of cinema needs is an unquestionably authentic and realistically naturalistic presentation of the world as it is, then film critics should be writing reviews and film criticism about an eight-hour shift worked in a grocery store, presented by way of surveillance footage.  You can't get more cinema verite than that, can you?

When critical traditions insist upon instantiations of artistic ideals that may never have existed in the arts themselves, at least as presented by scholars or theorists, we run into the whole non-tradition of the American symphony in the 19th century that you'll never get to hear ... though for reasons that will have to wait for some other blog post.  Romantic ideological commitments in the realm of the arts and arts criticism have in some sense left us no choice but to endure people complaining that there aren't any new ideas, as though new ideas were the point of art.

And so we find that there are historians and arts critics who feel lately that one of the most damaging things about arts histories is ... arts histories themselves and the unexamined ideologies that go with them.

The art world likes to ask big art-centric questions like "Can art change the world?" We usually answer "Yes." I usually disagree. Art can't stop famine in sub-Saharan Africa or eradicate Zika. But art does change the world incrementally and by osmosis. Typically by first changing how we see, and thereby how we remember. Raymond Chandler invented early-20th-century L.A.; Francis Ford Coppola forged our vision of the Vietnam War; Andy Warhol combined clashing colors that were never together before and that palette is now ubiquitous; God creating Adam looks the way Michelangelo painted it; Oscar Wilde said "the mysterious loveliness" of fog didn't exist before poets and painters. That's big. But art as we now know it has narrowed. These days our definition of it is mainly art informed by other art and art history. Especially in the last two centuries — and tenaciously of late — art has examined its own essences, ordinances, techniques, tools, materials, presentational modes, and forms. To be thought of as an artist someone must self-identify as one and make what they think of as art. This center cannot hold. Why? It is far too tight to let real art breathe. [emphasis added]

Our art history is organized teleologically — it's an arrow. Things are always said to be going forward, and progress is measured mainly in formal ways by changes in ideas of space, color, composition, subject matter, and the like. [emphasis added] Artists and isms follow one another in a Biblical begatting based on progress toward a goal or a higher stage. Cubism was "a race toward flatness"; Suprematism was "the zero point of painting"; Rodchenko said he made "the last painting"; Ad Reinhardt one-upped him saying he was "making the last painting which anyone can make." In this system synthetic shifts and tics combine into things we call movements like Cubism, Constructivism, Futurism, Art Nouveau, Color Field, etc. The problem is anyone who doesn't fall into this timeline is out of luck. This paradigm has been in place for 200 years.

It's beyond time for a new generation of art historians not only to open up the system and let art be the garden that it is, home to exotic blooms of known and unknown phenomena. It's time to work against this system. [emphases added] We can't say painting is dead just as women and artists of color started to show up in art history. Our art history has stiffened into an ideology that clear-cuts a medium, pronounces it dead (like undertakers) and moves on like conquistadors to the next stage. The idea that art has an overall goal of advancing or perfecting its terms and techniques is made up. Imagined. Idiotic. Except to those benefiting from this intellectual fundamentalism. Someday, people will look back at this phase of art history the way we look back at manifest destiny and colonialism.

Ah, yes, it's so easy to just insist that this changes, to call for replacing a pedagogy inspired by some kind of Herder-inspired German idealism of the Romantic era, but teaching the history of the whole human race as an art-making species across the entire planet over its whole existence is a time-consuming and expensive proposition.  Even if we were to talk about just the history of music in the Western world since 1900 there are problems, problems Kyle Gann has blogged about at length.

... With so many niches and such an explosion in the number of composers, there should have been more books, not none. Just because we don’t have a central musical style anymore doesn’t mean we can’t have a central narrative whose primary outlines everyone could accede to. And how can we have a meaningful new-music world at all without a narrative?

At the request of my department chair – and he so rarely asks me for anything, I could hardly have turned him down – I am teaching a 20th-century music history survey course, or rather, music since 1910. I’ve been dreading it, and my fears are so far confirmed. First of all, I have long been convinced that you can’t do the entire 20th century in a survey course. To me, third-semester music history should be 1900-1960, and the fourth semester should take over after that. Not only is there way too much material, there’s no unifying idea to the first and second halves of the century. The year 1976 seems to remain a popular stopping point for many professors and textbooks, and I wonder if anyone (besides me) has ever taught a 20th-century class in which the last three decades got as much attention as the first three.

... In 1967, musicologist Leonard Meyer published a fiery book that was widely read at the time: Music, the Arts, and Ideas. In it he predicted “the end of the Renaissance,” by which he meant that there would cease to be a musical mainstream, and that instead we would settle into an ahistorical period of stylistic stasis in which a panoply of styles would coexist. This seemed an outrageous forecast at the time, but Meyer’s prescience has been greatly confirmed.

The new generation of composers is conflict-averse, its discourse reduced to a broadly tolerant pragmatism. However much the young composers believe they have blessedly transcended ideology and partisanship, though, they have nevertheless inherited some of the previous attitudes in a less articulated form. Instead of distinct categories, what we have is a continuum of opinions along the accessibility/difficulty scale: how much should the composer keep the audience in mind? What should be the relation, if any, to pop music? Is the educated elite of academia a sufficient audience? Should the composer ignore all questions of perceptibility and follow his pleasure? Is there, indeed, any way to predict what music will go over well with an audience and what won’t? Does the long tail phenomenon of internet distribution render all such questions moot? What is most typical of American music at the moment, I would argue, is a large-scale, implicit, almost publicly unarticulated debate on the social use of music, of what it is made for.

Since it was reading Gann's blog that introduced me to Meyer I'll just quote some stuff from Meyer as to the nature of the problem of perspective and the plurality of artistic styles.

Music, the Arts, and Ideas
Leonard B. Meyer
Copyright (c) 1967, 1994 by The University of Chicago
ISBN 0-226-52143-5

pages 179-180

Although diversity had been growing since the seventeenth century, the fact was seldom squarely faced. The very ideology that nurtured pluralism tended, until recently, to eclipse its presence and obscure its significance. To believe in progress, in a dialectic of history, or a divine plan was to acknowledge, at least tacitly, the existence of a single force or principle to which all the seeming diversity would one day be related. [emphasis added] To accept the Newtonian world view, or later the theory of evolution, was almost inevitably to subscribe to monism and to look forward to a time when all phenomena would be reduced to, or subsumed under, one basic, encompassing set of laws. The notable achievements of science were taken as proof that Truth was One. Behind the manifest variety of phenomena and events lay, it was supposed, the latent unity of the universe which would eventually be discovered and embodied in a simple, all-embracing model. Because the oneness of things was what was real, surface diversity and incongruity could be disregarded.

But this picture of the world is, as we have seen, no longer entirely convincing. [emphases added] The inevitability of progress, the reality of either a divine or natural purpose in things, the existence of a single set of categorical cultural norms, and, above all, the possibility of discovering some single fixed and final truth--all these beliefs have been questioned and found wanting. Not only has no unified conceptual model of the universe been forthcoming but diversity within as well as between fields has increased enormously over the past fifty years. And our awareness of this diversity has been intensified by the remarkable revolution in communication.

In an ideological climate in which determinism is doubted and teleology is suspect, in which causation is complex and laws are provisional, and in which reality is a construct and truths are multiple--in such a climate it is increasingly difficult to escape and ignore the pervasive presence of pluralism. Impelled by the human desire for simplicity, economy, and elegance, the search for an overarching unity will unquestionably continue. But at the same time it is necessary to recognize that the "dissonance" of intellectual and cultural diversity will probably not be resolved, in the foreseeable future, into a single, consonant "chord of nature."

The world is too big, and the humans who have lived within it are too diverse, for it all to be boiled down in the ways Romantic ideological schools of thought assumed it could be.  But it's frankly too easy to kick the dead while they're dead.  If German idealism played a disproportionately large role in the Western conception of art and art history, and if we've had a couple of centuries to start recognizing the colonialist/imperialist implications of that, there's another problem, one not necessarily being squarely faced by any artists and art historians I'm currently aware of: the push for a truly global conception of art and art history that can encompass the entire world is the sort of thing that would seem the proper domain and concern of a truly global ruling class.  Only people with an interest in running the global arts scene, or in having a place within it as a market or as a ... kind of priestly practice, would seem to want to insist on having some space at the table.

The recent back and forth about Lionel Shriver's speech suggests that debates about what people at the table should get to do and who should be at the table, this metaphorical/sociological table of who gets officially recognized as artist/writer/musician, revolve around this kind of concern for art and arts history as something encompassing the span of humanity across time and planet.  Anything that could be identified as art but was nonetheless not made in an "art for the sake of art" kind of way probably can't be given admittance to the club.  Thanks to generations of Cold War propaganda for capitalism and for socialism or communism, we've got a whole army of historians and critics who have been trained to think of those with political, ideological or religious differences as temperamentally and intellectually incapable of even making art, whatever art may be.

The Romantics made a lot of noise about rejecting rules and restrictions and casting off the petty constraints of society but there may have been more bluster than substance to that.  As Meyer put it in writing about the Romantic era in music:

Style and Music: Theory, History, and Ideology
ISBN 0-226-52152-4

page 201
... the Romantic repudiation of convention (and especially of neo-Aristotelian aesthetics, which had been associated with the ancien regime), coupled with the denigration and weakening of syntactic relationships, highlighted the presence of diversity. As a result, the basis of coherence and unity became an issue: How did disparate and individualized themes, diverse modes of organization, and contrasts of expression--all intensified by the valuing of originality--form an organic whole? How did the several parts of a set of piano pieces or the different movements of a symphony or chamber work constitute a cohesive composition?

page 220
Put aphoristically: radical individualism seeks to undermine the norms on which its expression depends. [emphasis added]
The valuing of originality and individuality was reciprocally related to the denigration of convention. A convention is a shared, common property; it belongs to the compositional community, not to the individual. And it does not seem too far-fetched to suggest that the emphasis on the importance of novel musical ideas was related to the concern of the elite egalitarians with the power of possession. Musical ideas constituted the main "capital" possessed by composers, and these ideas could be made manifest only to the extent that they were in some way different--that is, original.

pages 344-345
There is, then, an inherent incompatibility between radical originality and individual expression because the latter depends on deviation from shared norms for its delineation. Therefore, to the extent that the prizing of originality leads to the abrogation of such norms, the delineation of individual expression either becomes attenuated or requires ever more radical departures from whatever norms are still prevalent. [emphasis added] Thus, especially in those styles of twentieth-century music in which constraints have been affected by a compelling concern with originality, originality ceases to be connected with individual expression.

Meyer made an observation in passing that Richard Taruskin transformed into an entire essay ("The Scary Purity of John Cage" was the title, if memory serves): in purely ideological terms you couldn't get more Romantic than John Cage, who had all the ideological imperatives about music for which the Romantic theorists and admirers of poetry pined. Yet fans of Romantic era literature and art tend to abominate Cage even though, as an expression of what the Romantic era philosophers who wrote about art would seem to have wanted, Cage arrived at creating musical works-as-philosophy, works that transformed whatever you happened to be hearing during the duration of a performance of 4'33" into the sublimest of all musical experiences (if you're into that kind of thing, at least).

The assumption of some kind of teleological destiny for the arts, based on residual European art history theories predicated on 19th century European views, may not yet have gone by the board, but if we are going to drop all of that stuff we might want to play with a few ideas.  Whether we're looking at Marxist theory, the kind of postmillennialist Christian impulse that drove the Social Gospel types in the 19th century, or the kind that inspires Christian reconstructionists these days, if there's a common thread in criticism of art history theorizing it's that the teleological approach is one of the problems.

Let's go all the way back to that Atlantic feature about poetry with the stuff about Walter Benjamin and Marx:

For Lerner, as his use of the term the social suggests, that hope is not just individual and spiritual, but collective and political. Poetry is linked, in his vision, to the possibility of a total redemption of human society, of the kind Marxism used to call “the revolution.” In particular, his fusion of aesthetic, political, and spiritual messianism brings to mind the work of Walter Benjamin, the 20th-century German Jewish theorist. [emphasis added] Lerner’s previous book, the novel 10:04, was saturated in the Benjaminian concept of redemption: the idea that the world as we know it carries within itself the possibility for transformation. Key to this vision is the idea that salvation will come from within, from a rearrangement of the world, rather than through an external power or a god.

... Poetry is a figure for the unalienated labor and uncommodified value that Marx thought would exist after the revolution. This is a 21st-century artist’s Marxism, one that no longer hopes for real revolution, but looks to the imagination for anticipations of what a perfected world would look and feel like. [emphasis added]

That teleological approach could be pinned on some kind of Christian apocalyptic, but if we're going to do that then let's be careful.  This would be the point at which it matters whether the kind of apocalyptic interpretation of history we're looking at is premillennial, postmillennial or amillennial in disposition.  Yes, this kind of stuff could, theoretically, actually matter.  The average premillennialist Christian in America has perhaps still been trained to await a Secret Rapture and an end of the world in as little as a few months.  That these are not the kinds of people who are going to care about a teleological approach to arts history is probably the nicest and most succinct way to put it.

Whether in a Marxist form, an explicitly Christian form, or even a deistic form, the long-term influence of postmillennialist optimism as an informing ideological variable in art theory, art history, and criticism may need to be explicitly abandoned.  Maybe it's a bit much to say "need to be", so I'll just say I explicitly reject postmillennialism in its Christian, Marxist, and deistic varieties.

At the risk of making a possibly wildly controversial statement about Christians, the arts, and the avant garde: is it possible that so many of the innovators of the last 120 years came from Christian traditions that could be described as historically amillennial because those traditions (traditional Catholic and Orthodox teaching seems non-millenarian in practical ways) were more open to invention and innovation than nationalist traditions steeped in a more postmillennialist train of thought? Remember that essential to this proposal is the observation that, yes, a Christian who is an amillennialist still affirms and awaits the return of Christ, but not in a way that imagines we'll hand the world to Jesus on a silver platter because of our success at Christianizing it; it's the postmillennialist optimism that has presented itself as Christian but has historically been implemented as nationalism or patriotism that I am explicitly skeptical about.

College students can really like to imagine that they have transcended genre or are not beholden to this or that tradition.  The ideological fetishes of Romanticism are still very much with us.  If you have no problem admitting you work in fairly traditional idioms in a traditional way with traditional methods, that almost seems to defy the whole point of being at a liberal arts college studying the arts.  If you like to write sonnets, the writing teacher may tell you it's time to move on.  Music students seem to want to cast off sonata and fugue as soon as they can pass the test that requires them to say they know what that stuff is.

And yet it seems to me that the 19th century theorists and pundits botched sonata and fugue by interpreting them in terms of their own stereotypes and expectations.  It's been interesting to read that a composer like Angelo Gilardino can refer to sonata forms as obsolete, as though they had become obsolete on scholarly or historical grounds even before he began to compose music for the guitar; thus the guitar can be thought of as an instrument whose body of work lacks sonatas and fugues, even though all the prestige of the mainstream classical scene seems built around a literature that presupposes the sonata and the fugue, those venerable 18th century approaches to thematic development, as foundational to the Western canon.

If that's the case, then how could the guitar gain the respectability Segovia wanted for our instrument if its practitioners regard the forms of the mainstream canon as inimical to the instrument?  But that's a hobby horse I don't need to sit on too long for this already long post.  I'm just proposing that the 19th century Romantics (or maybe even 18th century Romantics) had a blinkered and provincial view of stuff they considered universal.  The trouble is that the contemporary post-industrial West is probably not in a different position.  Try as we might, we are not primed to imagine a truly abstracted and global human race.  And yet that ambition is in some sense a holdover from Romantic ideology, an ideology that may in some sense be found bitterly and desperately wanting in light of its own criteria of and for artistic greatness.  As Meyer put it, Romanticism insisted on the repudiation of conventions, but maybe the repudiation of all convention drained the arts of the very means of expressing the individual that the Romantics admired.  The Romantics were busy disguising their conventionality in the hopes it wouldn't be noticed, and it wasn't until the 20th century that artists and musicians and writers actually cast off the constraints many Romantics only pretended to cast off.  History showed what many Romantics thought about that.  The punchline may be that the late Romantics had the misery of observing artists who actually did what the Romantics pretended to themselves they were doing.

from the manic pixie dream girl through waif fu to the murderous ingénue: Ex Machina, The Witch and film critics who will fall for the manic pixie dream girl as long as she stabs the patriarchy

When I coined the term “Manic Pixie Dream Girl” in an essay about the movie “Elizabethtown” in 2007, I never could have imagined how that phrase would explode. Describing the film’s adorably daffy love interest played by Kirsten Dunst, I defined the MPDG as a fantasy figure who “exists solely in the fevered imaginations of sensitive writer-directors to teach broodingly soulful young men to embrace life and its infinite mysteries and adventures.”

That day in 2007, I remember watching “Elizabethtown” and being distracted by the preposterousness of its heroine, Claire. Dunst’s psychotically bubbly stewardess seemed to belong in some magical, otherworldly realm — hence the “pixie” — offering up her phone number to strangers and drawing whimsical maps to help her man find his way. And as Dunst cavorted across the screen, I thought also of Natalie Portman in “Garden State,” a similarly carefree nymphet who is the accessory to Zach Braff’s character development. It’s an archetype, I realized, that taps into a particular male fantasy: of being saved from depression and ennui by a fantasy woman who sweeps in like a glittery breeze to save you from yourself, then disappears once her work is done.
When I hit “publish” on that piece, the first entry in a column I called “My Year of Flops,” I was pretty proud of myself. I felt as if I had tapped into something that had been a part of our culture for a long time and given it a catchy, descriptive name — a name with what Malcolm Gladwell might call “stickiness.”

Now that we've had a decade of negative regard for the manic pixie dream girl, that backlash could possibly explain why, in spite of authors who are alert to the sexual stereotypes in waif fu, there is a bare, slight modification of the waif fu trope for which film critics, or at least some film critics, have fallen.

Whether it's Hit Girl (Mindy Macready), Ava from Ex Machina, or Thomasin from The Witch, if a Hollywood reaction to criticism of the manic pixie dream girl has taken a shape, it is that the preternaturally beautiful female has stopped being the manic pixie dream girl and has become the murderous ingénue, and for some reason this slight pivot is enough to win over film critics as though it were an insightful, revelatory and revolutionary iteration of cinematic girl power.

Basically nothing substantial about the objectification process of the female has changed EXCEPT what film critics have given themselves license to publish as to the significance of the narrative perspective on the old femme fatale trope.

This is most easily documented in the case of The Witch.


Throughout the film, Thomasin’s family is picked off one by one until she’s the only one left (a particularly gory moment near the end sees her father William gored by the horns of a demonic goat named Black Phillip). She then signs herself over to the devil and joins a coven of witches dancing in the woods; the film closes on Thomasin levitating and laughing with delight. In an interview, Eggers said he didn’t initially approach his screenplay of The Witch as Thomasin’s story, but that he eventually realized she had to be the heart of the film.

The original draft was about how the titular witch manifested herself to different members of the family, meaning the film spent roughly equal time with everyone. “But through working on the second draft with my producers, Thomasin became the protagonist,” he said, adding that the film still works as an ensemble piece. In the story, the witch and her demonic partners take several forms: a goat, a raven, a rabbit, a beautiful woman, and a disfigured crone. While most of the other family members are besieged by these figures, Thomasin is targeted instead with suspicion from her parents and siblings, who come to think she’s in league with evil forces. “It was not my intention to make a story of female empowerment,” Eggers said, “but I discovered in the writing that if you’re making a witch story, these are the issues that rise to the top.”
The film’s exploration of patriarchal power was the key to unlocking Thomasin’s story. As a woman in the 17th century, she’s entirely stripped of agency. She exists only to work and help her family, and eventually be married off and bear more children. As The Witch progresses, it becomes clear that the campaign being waged against her family is targeted at freeing her so that she can join the coven in the woods. The idea that she’s been liberated is an intentionally muddy one—when she submits to Satan near the end of the film, he takes the form of a man—but there’s a giddy sense nonetheless that she has triumphed.

When asked about The Witch’s deeper commentary at a press conference, the actress Anya Taylor-Joy said she thought the film had a “happy” ending—because joining the coven is the first choice Thomasin gets to make on her own. Eggers is careful to communicate the darkness of Thomasin’s coercion, but doesn’t shy away from the fact that she’s leaving a repressive society behind. When he started thinking about The Witch, his focus was on the unknown, on “understanding where all this stuff comes from, the origins of the clichés—how they’re powerful, how they’re part of everyday life.” But he’s surprised and happy with the way his story evolved, and how it can speak to important modern issues despite being set centuries ago. Thomasin and The Witch seem destined to enter the great canon of horror films that includes the likes of Carrie, The Descent, and A Nightmare on Elm Street: stories that terrify by tapping into the immense power and fury of isolated women.
Normally, the fall of the main character in the final scene of a horror movie would be a director’s gloomy or gleeful surrender to evil. But The Witch presents Thomasin’s conversion as a victory for her: Embracing Satan allows her to escape from the physical hardship, moral hypocrisy, and gendered violence that’s tortured her thus far. (Given how few people in the Calvinist universe actually belong to the divine elect, hedging your bets by becoming a cursed, uberpowerful immortal is just good sense.) I can’t overstate just how shocking this moment feels, when you realize that the movie has up until now perpetrated a fundamental deception about its own point of view. All along, Eggers has stood on the Devil’s side; the triumph of the forces he’s trained us to dread and fear actually constitutes a happy ending. This hugely daring reversal could read as a middle finger to viewers, who’ve spent the past hour and change sympathizing with the pilgrims and rooting against the dark hosts. But don’t have such a limiting, orthodox view of what a horror movie ought to accomplish! Let the film’s ending serve as a reminder—as a certain goat might say—how delicious heresy can be. [emphases added]

Perhaps religious education has declined to the point where people don't realize that the primary difference between Protestants and Catholics over the centuries in the realm of diabology was simply who each side decided the Antichrist was; in nearly all other respects, as the historian Jeffrey Burton Russell has put it in his numerous books on the history of thought about the Devil in Judeo-Christian traditions, Catholics and Protestants agreed.  The largely reflexive reaction on the part of American film critics to cast the Puritan legacy in diabolical terms can seem like overcompensation once you have some understanding that Western Christian diabology has been one of the handful of areas in comparative religion where the Christians largely affirmed the same core ideas.  Be that as it may, film critics who might blush at stereotypes about people of color don't blush quite as much if the stereotypes involve, let's not finesse this too much, white trash Protestants.

An ever so slightly more nuanced version of this sentiment about the dread state of being female in a Puritan context ...

Would the director and talented, fresh-faced actress Anya Taylor-Joy consider Thomasin’s final resting place in Satan’s blood-soaked embrace a “happy” ending? Taylor-Joy answered: yes, because it was the first choice she really got to make. Yes, because it meant empowerment. Yes, because society left her no other option: if she went back to the plantation, she’d face the same accusations; and she couldn’t very well run a farm on her own with nothing but her dead family’s corpses for fertilizer.
Already, red flags were firing. How can Thomasin’s story be one of female empowerment when, as the final scenes imply, she chooses Satan because she literally has no other choice? If the story had painted her ultimate destiny as a clear decision between the life she lived with her family and dancing naked in the woods around a flame, that would be one thing. But Thomasin is no Carrie (of the Stephen King novel), who, despite ending up worse off in many ways, at least chose to be up there of her own volition.

The author quoted above went on to talk about how terrible the plight of women was in Puritan era America. 

But not everyone quite bought into the girl power ...

By Will Leitch
February 19, 2016
The Witch is the sort of horror movie that gets a ton of praise for its dogged resistance to conventional scary movie tropes. An indie hit out of Sundance last year, The Witch is the type of film that’s a success at film festivals but tends to evaporate once released into the wild; what works in the relentless hustle of a festival can feel airless when introduced to the elements of regular human audiences. The Witch is wrapped up in its own views of religion, of sin, of feminine power, but more than anything else, it is wrapped up in itself.


The parents are seen as tormented but also cruel and vengeful in a way that’s easily mocked from the distance of 450 years; Eggers is much more interested in their suffering than their plight. The family begins to crumble as William starts to wonder if he is reliving the life of Job, and we are invited to revel in the family’s strife and even perhaps suspect William and his brood may have it coming to them a little. [emphasis added] William is seen as a decent but deeply misguided man, and the movie briefly flirts with the notion that God is somehow punishing him. Except we know there’s a witch: We see her in the first five minutes of the movie, and the possibility of her reemergence is the central driver of tension the rest of the way. Something is legitimately tormenting this family, and it is not God. We should feel more sympathy for William than Eggers allows us to. Sure, he’s got some outdated views—he’s 500 years old— but there’s still a witch trying to kill his family, cut the guy a break.

It's been a while since we've had a link to Cinemagogue, so a link is overdue ...

What can be read as a fairly classic cautionary tale by a movie critic who's also a pastor in the Reformed tradition and is able to take the Reformed idiom of the characters seriously has been read as a girl power ode by other film critics.  Whereas people looking for a pagan girl power cheerleading anthem see Thomasin's signing herself over to the dark side as a victory, a Christian can see that decision not as a rejection of a father's obstinacy and self-righteous self-determination but as the logical outgrowth of it.  Thomasin merely embodies further the sins of the father who chose his own path and, by dint of being the father, forced the rest of his family to join his fate.

We'll come back to the theme of the daughter and the father almost immediately but first we need to shift back to those manic pixie dream girls.

Now if the manic pixie dream girl role was secured by the likes of Zooey Deschanel, Kirsten Dunst and Natalie Portman a decade ago, in this decade the murderous ingénue has been championed by Chloe Grace Moretz, Alicia Vikander and more lately Anya Taylor-Joy.  Each of these actresses is conventionally beautiful enough to end up playing manic pixie dream girls somewhere in the future; instead, they have become known for being cast as avenging angels literally stabbing the patriarchs who embody the privilege and power that run the world as we know it. These femmes fatales, these deadly debutantes, are sympathetic not because they aren't murderous schemers (and, really, still manic pixie dream girls) but because they enact revenge fantasies against the big dicks with God complexes who feel entitled to reorder the world around themselves.  So, yeah, it's easy to root for these murderous ingénues, perhaps, but the image of the feminine has not necessarily changed so much as the frame around the portrait.

Making a femme fatale a sympathetic protagonist does absolutely nothing to alter the trope, but the shift in narrative perspective alone seems to be enough to convince some film critics that the deadly debutante is a fantastically subversive thing. 

Let's propose that the chilly remove we can observe in these films with the murderous ingénue is a sign that filmmakers don't necessarily want to come out and say they're rooting for the murderous ingénue ... but if you do once you've paid for your ticket, well, hey, girl power.

Perhaps there's some kind of subtext in the film criticism dealing with the trope of the murderous ingénue.  Critics aren't necessarily just writing about films featuring a murderous ingénue like Ava or Thomasin or Hit Girl; they're writing about the frustration of being unable to assimilate into the mainstream of cultural power and influence, the frustration that an art form more than a century old seems to have so few women headlining and defining its culture.  The patriarchy being complained about is not really the old Puritan-era patriarchal system that denied women a voice (a voice which, if we could interview women from that era by the magic of a time machine, they might not have considered necessary in the way we do).  The patriarchy at play is the one perceived to exist right now, the patronage empire that can greenlight one Bayformers movie after another and that keeps the Star Trek franchise alive decades after the end of the Cold War, the Cold War being a necessary historical component to understanding why anyone made the franchise to begin with.

There could be more than just a few things to say about the abjection of the past necessary for this interpretation, but perhaps we can say for the moment that films like The Witch and Ex Machina can function as Turing tests that ask you who you think the protagonist of the film is, without committing to the idea that many film critics who have reviewed these films simultaneously commit to: the assumption that once you've settled who the protagonist is you've established who the hero is, as if the protagonist and the hero were one and the same.

Let's not be too hasty.  We've already had years in which to observe the formulation of the murderous ingénue since Chloe Grace Moretz played Hit Girl in Kick-Ass.

Chloe Grace Moretz went on to keep playing the murderous ingénue type in Let Me In (the American remake of Let the Right One In); by reprising Hit Girl for Kick-Ass 2; and by starring in a remake of Carrie. The trajectory is short and it tends toward domestication and remakes.  I hope Moretz can shake off the murderous ingénue typecasting before she ages out of the part.

Meanwhile, Anya Taylor-Joy has since shown up in a film described as a respectable second-tier Ex Machina, the sci-fi film Morgan.

... Morgan’s biggest downside, really, is simply that last year’s Ex Machina got here first, tackling many of the same issues (and some of the same scenery) in a more audience-friendly, immediately satisfying way. Still, that second-banana status shouldn’t negate this film’s virtues, most notably the impressive sense of chilly remove that lingers past the final enigmatic frames. 

Is playing a type that Vikander got to first edgy enough?  I haven't seen Morgan yet and may not get around to it, but an advantage of steeping yourself in a little film criticism is that you can keep up with films you can't afford to go see and can observe patterns here and there. Taylor-Joy may be the latest actress to benefit from the murderous ingénue trope and, well, that conveniently gives me three murderous ingénues to correspond to three manic pixie dream girls from the earlier decade.

Twenty years ago Joss Whedon's Buffy the Vampire Slayer was considered wry and inventive, and the interval since has given feminists and film critics time to discern the long-term limitations of Whedon being a one-trick pony.  Waif fu has been shown to be not so different from earlier tropes.  The risk in the current cultural moment is that film critics who by now ought to know better, because they review movies for a living, are falling for the murderous ingénue perhaps only because they were so saturated by the waif fu/manic pixie dream girl trope in the previous decade that they don't understand they're seeing the same stuff refracted through a different narrative prism.  The light that shines through, and the resulting rainbow, are unchanged.  And make no mistake, the kinds of endings we get in Ex Machina or The Witch can still fit comfortably into waif fu.

The reason we shouldn't be so eager to celebrate these films and these murderous ingenues as odes to girl power is because a collective cultural venting of frustration now can be blind to trajectories. The trajectory of the cinematic universe is short and tends toward repetition and tropes.

What the murderous ingénue shows us, whether it's Hit Girl or Ava or Thomasin, is that she is the daughter who magnifies in her vice those things her father regarded as virtue. In the case of Hit Girl (the cinematic version, not the comic), she is the daughter who lives out the quest for vengeance, justice and murder she received from her father.  Ava embodies the insatiable ambition of her creator and his quest to revolutionize and overthrow whatever the old order might be; it just so happens that he is the old order.  For Thomasin, if we take the Puritans a bit more seriously on their own terms than the average American film critic might want to, her turn to Satan is just a more explicit form of the rejection of social formation as a necessary component of individual and spiritual identity that she got from ... her father. The reason we shouldn't celebrate the murderous ingénue as some kind of stab at a patriarchy is that she is her father's daughter.

If we wanted any more vivid proof of how readily an actress who has played the murderous ingénue can pivot over to what will probably be waif fu ... Alicia Vikander (whose turn as Ava in Ex Machina was engrossing and charming) is going to be playing Lara Croft.

Meet the new boss.  Same as the old boss.

Friday, September 16, 2016

a few thoughts on Lionel Shriver's speech on cultural appropriation and the problem of privileged people talking about privilege

So for those who sometimes keep track of what writers say about writing, and of things like debates about the legitimacy or illegitimacy of cultural appropriation, there was this speech, from which I'll only quote an excerpt:
The author of Who Owns Culture? Appropriation and Authenticity in American Law, Susan Scafidi, a law professor at Fordham University who for the record is white, defines cultural appropriation as “taking intellectual property, traditional knowledge, cultural expressions, or artifacts from someone else’s culture without permission. This can include unauthorised use of another culture’s dance, dress, music, language, folklore, cuisine, traditional medicine, religious symbols, etc.”

What strikes me about that definition is that “without permission” bit. However are we fiction writers to seek “permission” to use a character from another race or culture, or to employ the vernacular of a group to which we don’t belong? Do we set up a stand on the corner and approach passers-by with a clipboard, getting signatures that grant limited rights to employ an Indonesian character in Chapter Twelve, the way political volunteers get a candidate on the ballot?

Shriver's counterpoint didn't seem like much of a counterpoint because, even to a kind of stick-in-the-mud Presbyterian Calvinist sort like me, who's moderately conservative about both politics and religion, this rebuttal from Shriver seems to depend on a bad faith understanding of why people would be upset about whatever they define cultural appropriation as being.  I'll get to that in a bit, but if the core of Shriver's rebuttal is to say that writers can do whatever they want because nobody should have to give writers permission, that's not much of a defense.  It's the kind of defense that presupposes the liberty of the writer to write whatever he/she/it insists upon writing.  While some writers would (and did) say this smacks of privilege, the problem here isn't that this is a privilege a writer "shouldn't" have by dint of being a writer; it's that this is the kind of decision-making power every writer has, and it isn't the real point to make if you want to defend writers creating characters of races or religious beliefs or sexualities they don't personally possess.

Of course, a few people took issue with Shriver's talk and ...

Among the invited opponents: Yassmin Abdel-Magied, a writer who’d walked out during Shriver’s talk. “The stench of privilege hung heavy in the air, and I was reminded of my ‘place’ in the world,” she wrote in The Guardian. Abdel-Magied rightly accuses Shriver of insensitivity, but also sets a restrictive, overly political vision for what literature should be
Cultural appropriation is both a real phenomenon—responsible for rock-n-roll and the Washington Redskins alike—and a ripe target for criticism and mockery, since the concept renders nearly every garment or foodstuff fraught.
The author of the New Republic piece made the obvious but necessary observation that cultural appropriation is basically how all art happens and how culinary innovations occur; we might benefit from proposing that not all cultural appropriation is done in the same way for the same reason.  It might be worth revisiting an example I've used a few times at this blog, of how Bubber Miley quoted from a Chopin piano sonata at the end of an early Ellington recording.  Miley didn't get permission from Chopin because Chopin was dead.  Cultural appropriation in which an American musician who isn't white cribs from a dead Polish composer who composed nationalist piano music a century earlier has never been what the people who talk about the badness of cultural appropriation seem to be concerned about.

The stench of privilege could be heavy in the air anywhere writers can afford to go to writers' conferences to hear any writer hold forth on the sacred nature of the art form.  I don't have much of a history of going to conferences for writers and I'm not opposed to them--that said, having never been a vocational artist, I think the delusion many vocational artists and writers work from is imagining that there can be anything but two modes of relationship between the vocational artist and formal and informal power: you're either a servant of the ruling class or a participant in the ruling class, whatever or whoever the ruling class may be.  Those are the only two possibilities, and any liminal space between them presupposes the binary. 

The only "possible" way to be outside of this binary is to be an amateur who never becomes a vocational artist or writer.  Alas, we seem to live in a moment in which writers who have the privilege of writing about privilege held by others don't seem to think of privilege as being something they, too, possess. There may be tens of thousands of people who can afford to go to liberal arts colleges and get degrees in the humanities who have been able to convince themselves they are the proletariat when they aren't.  For college graduates, even people with just undergraduate degrees, to think of themselves as working class is absurd.  People who graduated from high school, or didn't graduate from high school, and are out in the work force could be proletariat if they're not making huge sums of money, but the vocational writer is in a cultural sense a part of the priesthood of culture-builders.

That is, in a sense, the substance of a potential critique of Shriver's whole approach.  The people who have the privilege of writing stuff that gets monetized absolutely have a privilege, and if it's one they take for granted or presume upon then they can come along and declare, as Shriver does, that a writer shouldn't need anyone's permission for what kinds of characters to put into a book.  Sure.  To propose otherwise is to propose a kind of censorship. 

The most basic problem with this definition of cultural appropriation ...

In the United States, cultural appropriation almost always involves members of the dominant culture (or those who identify with it) “borrowing” from the cultures of minority groups.

Is that it assumes its own definition.  What's more, the problem is that, as Conor Friedersdorf was writing not so long ago about a culture of victimhood:

... Ferris Bueller is a stand-in for every kid who has performed victimhood to avoid school or homework. I don’t mean to suggest there are no real victims. Quite the contrary. The argument is that huge percentages of the population will, if given the opportunity, exaggerate their victimhood in order to get the gains that come with it. Many people will even fall for their own act to a degree. None of us are immune. I’m often tempted to view myself as an aggrieved party in some dispute.
This aspect of the culture isn’t a race thing, it’s a human nature thing. You can’t set up a system where status accrues to victims and then let people determine their own victim status. [emphasis added] Insofar as this is true of black and brown people on college campuses, it’s only because they’re no different from white people on college campuses, who participate just as much in victim culture, and many people off campus. Every human is vulnerable to the perverse incentives of “victimhood culture.” [emphasis added] 

And this is apparently how people who have a lot of power and influence can still see themselves as victims. Even if we decided there was nothing at all to contest in the definition of what "cultural appropriation" even is, there's no certainty that the people who are considered the dominant culture won't see themselves as, somehow, the victims or the minority besieged by the masses of every other culture that isn't precisely them.  The proverbial one percent will feel lonely and at risk from the ninety-nine percent by dint of being a numeric minority.  It's always been in the nature of empires to assimilate all those cultural elements the masters of said empire decided not to destroy. 

But let's get back to the "cultural appropriation almost always" because, beyond the abstract difficulty of defining cultural appropriation in largely pejorative terms up front, there's a historical matter. 

Let's just return to "Black and Tan Fantasy", one of Duke Ellington's early works.

Check out 3:00 moving forward, and then compare it to ... Chopin's Piano Sonata No. 2, movement 3

So if cultural appropriation almost always involves members of the dominant culture (or those who identify with it) "borrowing" from the cultures of minority groups what are we expected to make of one of Duke Ellington's trumpet-players borrowing a riff from Chopin for the end of "Black and Tan Fantasy?"

How about when the composer George Walker composed a set of variations on "O Bury Me Beneath the Willow" as the second movement for his first piano sonata?

If you want to hear a version of this old bluegrass song ... you could do worse than the Carter family, probably.

Start listening at 7:30ish for a pretty abstract take on the folk song with some fine variations

At this point it's probably not even "necessary" to mention John Coltrane's quartet playing "My Favorite Things"

None of these cases can, based on the definition of cultural appropriation quoted above, even qualify as cultural appropriation.  The problem with this polemical definition of cultural appropriation is the double standard built into it.  George Walker, as an African American composer, can take a lovely old bluegrass/cowboy song chorus and fashion some great variations on it and that's not cultural appropriation?  It's not as if there were never black cowboys, even if we stereotypically equate cowboys with white males.  Would it count as "cultural appropriation" that a man trained as a classical pianist and composer appropriated a folk song without getting permission?  Well, if you insist, but those kinds of appropriations have been happening for as long as humans have been around. 

So it's not without cause that Phoebe Maltz Bovy wrote, "Cultural appropriation is both a real phenomenon—responsible for rock-n-roll and the Washington Redskins alike—and a ripe target for criticism and mockery, since the concept renders nearly every garment or foodstuff fraught."  An ideologically implemented set of objections to cultural appropriation runs aground on the long tradition of white and black musicians in the United States and across the world borrowing from each other's musical cultures and sharing musical ideas and ideals.  The point is easier to make in music because it happens to be easier to demonstrate here at this blog.

But for fiction there's another reason to be skeptical about the kind of agitation that insists novels not traffic in stereotypes for want of some kind of permission to do so.  Fiction is fiction.  It's not as though we're talking about journalism, which, at least in theory, is obliged to present the facts that can be discovered and also the truth that may be discovered (which, it must be said, is not necessarily ever or always exactly the same thing).  The field of anthropology can set for itself a commitment to reliable, or aspires-to-be-reliable, ethnographies of people in times and places, but fiction is fiction. 

While it might be nice if some authors displayed more social responsibility about whether they traffic in stereotypes when they invent characters, the case that they are obligated to do so is wanting.  It can seem as though Socialist Realism retains vitality as an idea not on the basis of depicting a realistic triumph of the working class but through a spiritual rebirth, in the form of some writers embracing the idea that people only have the moral license and legitimate opportunity to write about their own group.  And that's great if we're talking about journalism!  Yes, by all means, people who want to tell the story of what they and their group have been through should have a First Amendment protected right to share their stories.

There is a different way to object to Abdel-Magied than the way Rod Dreher approached it this week:

It's possible to suggest that Shriver presented a speech that comes off like the entitlement of a smug, lecturing writer, and that Abdel-Magied makes a comparably terrible case in equally bad faith, a case that depends on a double standard that has not, as yet, been justifiably explained.  It may be too much to propose that cultural appropriation back and forth, done in ways that are respectful to the people and traditions involved, is something we should all be able to agree to strive for.  Figuring out how Gentiles and Jews who identified themselves as believers in Christ could peacefully and lovingly co-exist in spite of their substantial cultural differences was not just a little sidebar concern if we go back and read the New Testament. 

I don't actually blame Abdel-Magied for walking out of the speech.  I read the transcript of Shriver's speech and feel bad for anyone who paid money to hear writers talk about writers being able to do whatever they want because ... writers!  It's just that anyone who can attend a writers' conference is already in a position of privilege compared to the people who aren't vocationally writers, certainly compared to anyone who can't read at all.  Compared to everyone the world over who can't, couldn't or never will be able to read, Abdel-Magied writes from a position of spectacular and unacknowledged privilege.

The problem is that the world will never be a utopia and never be equal.  Well, there is the hope that one day truth and justice shall reign across the entire planet and, as a Christian, I can tell you that's an apocalyptic utopian hope of the sort that Christians, at least, confess is only possible when Jesus Christ comes to reign as God and king of the cosmos and when every tear is wiped away and, you get the basic idea.  For people who are actually religious we understand that that level of justice Abdel-Magied alludes to is only possible when a god who created the entire universe/multiverse decides to bring that utopia about.  Until then, injustice and inequality will always reign, most frequently these days through those who would insist to us they can bring it about, unfortunately.

So, somewhat contra Rod Dreher, I wasn't convinced by Abdel-Magied, but Shriver's speech was boilerplate sanctimony from the writers-talking-about-writers genre. 


Noah Millman had a response and it was pointedly different from Dreher's, though both obviously blog at The American Conservative:
Nonetheless, I have a question for Ms. Shriver. I agree that the whole point of writing fiction is trying on new hats, new masks.

But what if the mask you want to wear is... Batman's?

Millman swerves into Lone Ranger territory ...

The point is that while Tonto is intellectual property, Comanche-ness is not. It's a matter of courtesy (as well as good defensive marketing) to consult with the Comanche nation before representing one of their number on screen. Consulting Universal on representing Tonto is a matter of law. And so anybody who feels a kind of ownership of the Native American identity runs the risk of feeling: Something I own was used without permission.

That, I suspect, is what really rankles those who gnash their teeth when someone lectures them about how art is all about borrowing and exchanging freely. That's exactly what art is, but our whole edifice of intellectual property law is increasingly designed not to facilitate that borrowing and exchange, but to frustrate it, in the service of protecting the value of incumbent cultural products — the ones owned by corporations. [emphasis added]

The solution, though, isn't to build more walls, so that everyone sticks to their cultural knitting. That will just exacerbate existing baleful trends. Rather, what's needed is to restore the artistic commons, before the only culture we know is one we'll have to pay a fee to join. [emphasis added]

In other words, insisting on putting a stop to cultural appropriation is counterproductive because cultures are not intellectual property.  As the definition and enforcement of intellectual property has tilted in favor of corporate juggernauts able to enforce their interests, complaining about cultural appropriation is a far lesser concern than what Millman pretty directly articulates: the reality that our culture as a whole has been steadily reaching a point where there's functionally no public domain for the culture we can most easily get.

Now there's a substantial body of public domain work in literature and art and music, but that gets us back to the Western artistic canon.  It may be that not everyone is happy that the history of Western art, literature and music seems to be those proverbial dead white guys and gals, but at least it's public domain!  By contrast, if something has been recorded by a record company, the record company owns the copyright and the rights to its reproduction and distribution.  Perhaps one of the advantages of the old Western canon these days is how much of it is public domain.  Pride & Prejudice & Zombies can be a thing because Austen's book was published centuries ago.  But as we were reminded with the passing of Prince, he was choosy about how people had access to his music, and he literally had every right to be. 

As a guitarist and a composer who's fond of 18th century music, I'd say Millman will want to remember that we already have a spectacular public domain for the arts.  Perhaps a blogger at The American Conservative could note a potential irony here: the wealth of the Western artistic canon that's literally free for cultural appropriation of every possible sort can be the very body of work most objectionable to those who wish the canon weren't so white and patriarchal.  All right ... but at least you don't have to pay a licensing fee to get permission to rewrite Jane Austen or Dostoevsky ... or do you?