Sunday, September 25, 2016

HT Mockingbird, a Slate piece at Browbeat on how The Iron Giant is not about guns or war but about sin

http://www.mbird.com/2016/09/another-week-ends-knuckled-mascots-poetry-haters-holy-fools-q-tip-effects-and-well-loved-waterboys/

Also on the film front, finally a confrontational headline from Slate we can all get on board with: “Everyone Misunderstood Brad Bird’s The Iron Giant: It’s Not About Guns. It’s About Sin.” Amen.

Indeed.  Slate has evolved over the last sixteen years into one of those publications whose provocative titles, whether declarative statements or rhetorical questions, remind me of the polemics of ... Christian bloggers. :s

It's still amazing to consider that 1999 alone gave us Toy Story 2, South Park: Bigger, Longer & Uncut, Princess Mononoke, and The Iron Giant.

another incubation phase

There's not quite as much writing here at the blog lately but it's not because no writing is being done.  Some of the writing has shown up elsewhere.  Some of the writing will show up here eventually. There's a long-form analytical series that's been taking shape in the last month or so ever since I read Rod Dreher's interview with J. D. Vance about Hillbilly Elegy. I'll refrain from dropping any more hints as to the content and scope of the pending series of posts because the incubation process has just ended a phase.  What's in the egg is probably not yet ready to hatch.

There are other long-term incubations going on, too, for stuff I've been meaning to write about scholarly approaches to 18th century music vs 19th century understandings of the same.  There's stuff I've been working on about problems I have with Francis Schaeffer's narrative approach vs an actual music history, or history of music.

I've been meaning for months to discuss guitar sonatas again and get into more specific stuff, but there's some secondary literature I feel obliged to read that I'm not done reading yet.

For now I'll just say that sonata form can't be considered obsolete if you can make a case that the entire 19th century conceptual framework for discussing sonatas misunderstood and misconstrued sonata as a form rather than as a kind of thought process, one that continued the developmental economy of the fugue in homophonic rather than polyphonic terms. The implications that deconstructing 19th century assumptions about sonata and fugue can have for the integration of vernacular American styles into 18th century developmental processes seem self-evident to me, but there comes a point at which you have to make the case, through some music itself, that ragtime and sonata form can be successfully integrated by abandoning stereotypical notions of what sonata "ought" to be in the terms of 19th century German idealism. There are some experiments that are possible in the arts if the arts are not regarded as a substitute for traditional religion. That experiments fusing jazz with 18th century developmental processes (processes that have been presented, rather unfortunately, as actually fixed "forms") have been going on over the last half century on both sides of the Iron Curtain/Cold War divide is something I've been meaning to write about.  But I haven't.

So there's a lot of what you might call pre-writing going on before the writing that might show up here can take place.

Wednesday, September 21, 2016

Lasting Legacy LLC finally inactive in Washington state as of 9/13/2016, expiration date 4/30/2016

Now that that corporation in Arizona is officially up and running ... it looks like there's finally been time to render inactive Lasting Legacy LLC here in Washington state.

https://www.sos.wa.gov/corps/search_detail.aspx?ubi=603199549
LASTING LEGACY LLC
UBI Number 603199549
Category LLC
Active/Inactive Inactive
State Of Incorporation WA
WA Filing Date 04/17/2012
Expiration Date 04/30/2016
Inactive Date 09/13/2016
Duration Perpetual
Agent Name CT CORPORATION SYSTEM 
Address
505 UNION AVE SE STE 120
OLYMPIA WA 985010000

Governing Persons
Member,Manager
DRISCOLL , MARK
23632 HIGHWAY 99 STE F441
EDMONDS , WA 98026 
Member
DRISCOLL , GRACE
23632 HIGHWAY 99 STE F441
EDMONDS , WA 980269211

On Mission LLC has been inactive in Washington state since April 1, 2016
https://www.sos.wa.gov/corps/search_detail.aspx?ubi=603258287

as has OMCRU Investments LLC

https://www.sos.wa.gov/corps/search_detail.aspx?ubi=603258278

But fear not, Lasting Legacy LLC is over in Maricopa county according to ...

http://ecorp.azcc.gov/Details/Corp?corpId=L20752916

and things OMCRU are over here ...

http://ecorp.azcc.gov/Details/Corp?corpId=R20632652

For Mark Driscoll Ministries ...
http://ecorp.azcc.gov/Details/Corp?corpId=F20337947

for The Trinity Church ...

http://ecorp.azcc.gov/Details/Corp?corpId= 20491878

or a remarkably similar link may get you there if you search for that entity.

Saturday, September 17, 2016

perspectives on how Western art history became its own problem across media, a long-form guess at the ways in which the ideology of Euro-American Romanticism wore itself out

First off, HT to DZ over at Mockingbird for highlighting this first article at Another Week Ends:

http://www.mbird.com/2016/09/another-week-ends-knuckled-mascots-poetry-haters-holy-fools-q-tip-effects-and-well-loved-waterboys/

DZ spent a paragraph or two on it and, since this is Wenatchee The Hatchet, I'm going to write ... let's say about 4,500 words on the subject.  And the subject is how people came to hate poetry; what this can tell us about the history of poetry and poetry criticism; and how this connects to the history of Romantic era ideological commitments about what art is and what art ought to achieve, commitments that festooned art and arts criticism with burdens we still have to deal with in poetry, art history, and also music and musicology. My general proposal is that a common thread in all of this is an unexamined burden brought in by postmillennialist modes of apocalyptic thought, a burden I believe should be rejected. But to get there we first need to survey the ideological troubles with what Romantic era optimistic apocalyptic and its rhetoric proposed, and for that ... we can finally start off with the article DZ mentioned at Another Week Ends.

http://www.theatlantic.com/magazine/archive/2016/10/why-poetry-misses-the-mark/497504/
...


Making a poem was never quite as simple as making a table, because it required inspiration and passion, but it did involve studying techniques and following rules. Indeed, the laws of poetry were natural laws, which had been discovered by the Greeks and could be learned from their example. [emphasis added] The English poet Alexander Pope agreed, writing in his “Essay on Criticism”:

Those RULES of old discover’d, not devis’d,
 Are Nature still, but Nature Methodiz’d;
 Nature, like Liberty, is but restrain’d
 By the same Laws which first herself ordain’d.



That was published in 1711, so clearly not much had changed in the previous two millennia. But turn to Percy Shelley’s essay “A Defense of Poetry,” written in 1821, and you will discover that the meaning of the word poetry had undergone a fantastic transformation. Poetry, Shelley says, is “connate with the origin of man,” and “a poet participates in the eternal, the infinite, and the one.” Poetry comprises every creative activity of human nature, including the arts, politics, and science: “The institutors of laws, and the founders of civil society, and the inventors of the arts of life” are all in some sense poets, since they shape reality in the light of their vision. Shelley even speaks of “the poetry in the doctrines of Jesus Christ,” as if Christianity itself were just one enormous poem.


The Romantics, faced with a disenchanted universe, attempted to discover a new source of enchantment in the human imagination, and poetry became a metaphor for that creative, life-enhancing power. [emphasis added] Poetry used to mean poems. Now poems began to seem like just one habitation, and far from the grandest, of the force that is poetry. Naturally, this fateful division between poetry and poems had enormous consequences for the way poems were written. After all, if poetry is ineffable and infinite, there is no reason it should be bound by the mechanical laws of meter and rhyme. In the modern age, poetry became antinomian.

Thus we find Emerson arguing, in his essay “The Poet,” that “it is not metres, but a metre-making argument, that makes a poem,—a thought so passionate and alive, that, like the spirit of a plant or an animal, it has an architecture of its own, and adorns nature with a new thing.” The metaphor of growth cancels out the old metaphor of craft. [emphasis added] For Horace, a poem was something you had to learn how to make, at the expense of great effort. For Keats, “if Poetry comes not as naturally as the Leaves to a tree it had better not come at all.”

...

For Lerner, as his use of the term the social suggests, that hope is not just individual and spiritual, but collective and political. Poetry is linked, in his vision, to the possibility of a total redemption of human society, of the kind Marxism used to call “the revolution.” In particular, his fusion of aesthetic, political, and spiritual messianism brings to mind the work of Walter Benjamin, the 20th-century German Jewish theorist. Lerner’s previous book, the novel 10:04, was saturated in the Benjaminian concept of redemption: the idea that the world as we know it carries within itself the possibility for transformation. Key to this vision is the idea that salvation will come from within, from a rearrangement of the world, rather than through an external power or a god.


... The Hatred of Poetry is a subtle inquiry into poetry’s discontents, and a moving statement of poetry’s potential. It can also be read, though, as an example of the dead end into which modern poetic theory has been led by its grandiose aspirations. [emphasis added] As long as we focus on what poetry isn’t and can’t be, how can we rediscover what it once was, and might be again?

It's not just in the realm of poetry that we see laments about the burden of 19th century Romantic ideological tropes about what art is and ought to be. If the trouble with poetry may be likened to a theme poetry has historically celebrated, the beauty of a fair maiden, then the trouble has been that no real woman can possibly compare to the impossible standard of the manic pixie dream girl, and that this impossible standard has become the at times tacit measure by which those who judge poetry judge it.

Something similar could be said for film criticism: things have gotten to a point where film critics can praise a film that is at least five hours long for its compelling realism and representational approach, while the rest of the movie-going public, not being interested in that sort of film-as-compelling-art trope, wants to go to a movie that does not replicate their day job.  If what the world of cinema needs is an unquestionably authentic and realistically naturalistic presentation of the world as it is, then film critics should be writing reviews and film criticism about someone's eight-hour shift in a grocery store as captured by surveillance footage.  You can't get more cinema verite than that, can you?

When critical traditions insist upon instantiations of artistic ideals that may never have existed in the arts themselves, at least as presented by scholars or theorists, we run into the whole non-tradition of the American symphony in the 19th century that you'll never get to hear ... though the reasons for that will have to wait for some other blog post.  Romantic ideological commitments in the realm of the arts and arts criticism have in some sense left us no choice but to endure people complaining that there aren't any new ideas, as though new ideas were the point of art.

And so we find that there are historians and arts critics who have lately come to feel that one of the most damaging things about arts histories is ... arts histories, and the unexamined ideologies that go with them.

http://www.vulture.com/2016/09/tyranny-of-art-history-in-contemporary-art.html

The art world likes to ask big art-centric questions like "Can art change the world?" We usually answer "Yes." I usually disagree. Art can't stop famine in sub-Saharan Africa or eradicate Zika. But art does change the world incrementally and by osmosis. Typically by first changing how we see, and thereby how we remember. Raymond Chandler invented early-20th-century L.A.; Francis Ford Coppola forged our vision of the Vietnam War; Andy Warhol combined clashing colors that were never together before and that palette is now ubiquitous; God creating Adam looks the way Michelangelo painted it; Oscar Wilde said "the mysterious loveliness" of fog didn't exist before poets and painters. That's big. But art as we now know it has narrowed. These days our definition of it is mainly art informed by other art and art history. Especially in the last two centuries — and tenaciously of late — art has examined its own essences, ordinances, techniques, tools, materials, presentational modes, and forms. To be thought of as an artist someone must self-identify as one and make what they think of as art. This center cannot hold. Why? It is far too tight to let real art breathe. [emphasis added]

...
Our art history is organized teleologically — it's an arrow. Things are always said to be going forward, and progress is measured mainly in formal ways by changes in ideas of space, color, composition, subject matter, and the like. [emphasis added] Artists and isms follow one another in a Biblical begatting based on progress toward a goal or a higher stage. Cubism was "a race toward flatness"; Suprematism was "the zero point of painting"; Rodchenko said he made "the last painting"; Ad Reinhardt one-upped him saying he was "making the last painting which anyone can make." In this system synthetic shifts and tics combine into things we call movements like Cubism, Constructivism, Futurism, Art Nouveau, Color Field, etc. The problem is anyone who doesn't fall into this timeline is out of luck. This paradigm has been in place for 200 years.
...

It's beyond time for a new generation of art historians not only to open up the system and let art be the garden that it is, home to exotic blooms of known and unknown phenomena. It's time to work against this system. [emphases added] We can't say painting is dead just as women and artist of color started to show up in art history. Our art history has stiffened into an ideology that clear-cuts a medium, pronounces it dead (like undertakers) and moves on like conquistadors to the next stage. The idea that art has an overall goal of advancing or perfecting its terms and techniques is made up. Imagined. Idiotic. Except to those benefiting from this intellectual fundamentalism. Someday, people will look back at this phase of art history the way we look back at manifest destiny and colonialism.

Ah, yes, it's so easy to just insist that this change, this pedagogy inspired by some kind of Herderian German idealism of the Romantic era, but teaching the history of the whole human race as an art-making species, across the entire planet, over its whole existence, is a time-consuming and expensive proposition.  Even if we were to talk about just the history of music in the Western world since 1900 there are problems, problems Kyle Gann has blogged about at length.

http://www.artsjournal.com/postclassic/2007/04/the_newmusic_narrative_interru.html

... With so many niches and such an explosion in the number of composers, there should have been more books, not none. Just because we don’t have a central musical style anymore doesn’t mean we can’t have a central narrative whose primary outlines everyone could accede to. And how can we have a meaningful new-music world at all without a narrative?

http://www.artsjournal.com/postclassic/2013/09/the-end-of-music-history.html

At the request of my department chair – and he so rarely asks me for anything, I could hardly have turned him down – I am teaching a 20th-century music history survey course, or rather, music since 1910. I’ve been dreading it, and my fears are so far confirmed. First of all, I have long been convinced that you can’t do the entire 20th century in a survey course. To me, third-semester music history should be 1900-1960, and the fourth semester should take over after that. Not only is there way too much material, there’s no unifying idea to the first and second halves of the century. The year 1976 seems to remain a popular stopping point for many professors and textbooks, and I wonder if anyone (besides me) has ever taught a 20th-century class in which the last three decades got as much attention as the first three.

http://www.artsjournal.com/postclassic/2013/08/state-of-the-confusion.html

... In 1967, musicologist Leonard Meyer published a fiery book that was widely read at the time: Music, the Arts, and Ideas. In it he predicted “the end of the Renaissance,” by which he meant that there would cease to be a musical mainstream, and that instead we would settle into an ahistorical period of stylistic stasis in which a panoply of styles would coexist. This seemed an outrageous forecast at the time, but Meyer’s prescience has been greatly confirmed.
...

The new generation of composers is conflict-averse, its discourse reduced to a broadly tolerant pragmatism. However much the young composers believe they have blessedly transcended ideology and partisanship, though, they have nevertheless inherited some of the previous attitudes in a less articulated form. Instead of distinct categories, what we have is a continuum of opinions along the accessibility/difficulty scale: how much should the composer keep the audience in mind? What should be the relation, if any, to pop music? Is the educated elite of academia a sufficient audience? Should the composer ignore all questions of perceptibility and follow his pleasure? Is there, indeed, any way to predict what music will go over well with an audience and what won’t? Does the long tail phenomenon of internet distribution render all such questions moot? What is most typical of American music at the moment, I would argue, is a large-scale, implicit, almost publicly unarticulated debate on the social use of music, of what it is made for.

Since it was reading Gann's blog that introduced me to Meyer I'll just quote some stuff from Meyer as to the nature of the problem of perspective and the plurality of artistic styles.

MUSIC, THE ARTS, AND IDEAS
Leonard B. Meyer
Copyright (c) 1967, 1994 by The University of Chicago
ISBN 0-226-52143-5


pages 179-180

Although diversity had been growing since the seventeenth century, the fact was seldom squarely faced. The very ideology that nurtured pluralism tended, until recently, to eclipse its presence and obscure its significance. To believe in progress, in a dialectic of history, or a divine plan was to acknowledge, at least tacitly, the existence of a single force or principle to which all the seeming diversity would one day be related. [emphasis added] To accept the Newtonian world view, or later the theory of evolution, was almost inevitably to subscribe to monism and to look forward to a time when all phenomena would be reduced to, or subsumed under, one basic, encompassing set of laws. The notable achievements of science were taken as proof that Truth was One. Behind the manifest variety of phenomena and events lay, it was supposed, the latent unity of the universe which would eventually be discovered and embodied in a simple, all-embracing model. Because the oneness of things was what was real, surface diversity and incongruity could be disregarded.

But this picture of the world is, as we have seen, no longer entirely convincing. [emphases added] The inevitability of progress, the reality of either a divine or natural purpose in things, the existence of a single set of categorical cultural norms, and, above all, the possibility of discovering some single fixed and final truth--all these beliefs have been questioned and found wanting. Not only has no unified conceptual model of the universe been forthcoming but diversity within as well as between fields has increased enormously over the past fifty years. And our awareness of this diversity has been intensified by the remarkable revolution in communication.

In an ideological climate in which determinism is doubted and teleology is suspect, in which causation is complex and laws are provisional, and in which reality is a construct and truths are multiple--in such a climate it is increasingly difficult to escape and ignore the pervasive presence of pluralism. Impelled by the human desire for simplicity, economy, and elegance, the search for an overarching unity will unquestionably continue. But at the same time it is necessary to recognize that the "dissonance" of intellectual and cultural diversity will probably not be resolved, in the foreseeable future, into a single, consonant "chord of nature."

The world is too big, and the humans who have lived in it are too diverse, to be boiled down in the kinds of ways Romantic ideological schools of thought assumed could happen.  But it's frankly too easy to kick the dead while they're dead.  If German idealism played a disproportionately large role in the Western conception of art and art history, and we've had a couple of centuries to start recognizing the colonialist/imperialist implications of that, there's another problem, one not necessarily being squarely faced by the artists and art historians I'm currently aware of--the push for a truly global conception of art and art history that can encompass the entire world is the sort of thing that would seem the proper domain and concern of a truly global ruling class.  Only people with an interest in running the global arts scene, or in having a place within it as a market or as a ... kind of priestly practice, would seem to want to insist on having some space at the table.

The recent back and forth about Lionel Shriver's speech suggests that debates about what people at the table should get to do and who should be at the table, this metaphorical/sociological table of who gets officially recognized as artist/writer/musician, revolve around this kind of concern for art and arts history as something encompassing the span of humanity across time and planet.  Anything that could be identified as art but that was not made in an "art for the sake of art" kind of way probably can't be given admittance to the club.  Thanks to generations of Cold War propaganda for capitalism and for socialism or communism, we've got a whole army of historians and critics who have been trained to think of those with political, ideological or religious differences as temperamentally and intellectually incapable of even making art, whatever art may be.

The Romantics made a lot of noise about rejecting rules and restrictions and casting off the petty constraints of society but there may have been more bluster than substance to that.  As Meyer put it in writing about the Romantic era in music:

STYLE AND MUSIC: THEORY, HISTORY AND IDEOLOGY
LEONARD B. MEYER
THE UNIVERSITY OF CHICAGO PRESS
COPYRIGHT (C) 1989 BY LEONARD B. MEYER
ISBN 0-226-52152-4


page 201
... the Romantic repudiation of convention (and especially of neo-Aristotelian aesthetics, which had been associated with the ancien regime), coupled with the denigration and weakening of syntactic relationships, highlighted the presence of diversity. As a result, the basis of coherence and unity became an issue: How did disparate and individualized themes, diverse modes of organization, and contrasts of expression--all intensified by the valuing of originality--form an organic whole? How did the several parts of a set of piano pieces or the different movements of a symphony or chamber work constitute a cohesive composition?


page 220
Put aphoristically: radical individualism seeks to undermine the norms on which its expression depends. [emphasis added]
...
The valuing of originality and individuality was reciprocally related to the denigration of convention. A convention is a shared, common property; it belongs to the compositional community, not to the individual. And it does not seem too far-fetched to suggest that the emphasis on the importance of novel musical ideas was related to the concern of the elite egalitarians with the power of possession. Musical ideas constituted the main "capital" possessed by composers, and these ideas could be made manifest only to the extent that they were in some way different--that is, original.

pages 344-345
There is, then, an inherent incompatibility between radical originality and individual expression because the latter depends on deviation from shared norms for its delineation. Therefore, to the extent that the prizing of originality leads to the abrogation of such norms, the delineation of individual expression either becomes attenuated or requires ever more radical departures from whatever norms are still prevalent. [emphasis added] Thus, especially in those styles of twentieth-century music in which constraints have been affected by a compelling concern with originality, originality ceases to be connected with individual expression.
 

Meyer made an observation in passing that Richard Taruskin transformed into an entire essay ("The Scary Purity of John Cage" was the title, if memory serves): in purely ideological terms you couldn't get more Romantic than John Cage, who held all the ideological imperatives about music for which the Romantic theorists and admirers of poetry pined. Yet fans of Romantic era literature and art tend to abominate Cage even though, as an expression of what the Romantic era philosophers who wrote about art would seem to have wanted, Cage arrived at creating musical works-as-philosophy that transformed whatever you happened to be hearing during a performance of 4'33" into the sublimest of all musical experiences (if you're into that kind of thing, at least).

The assumption of some kind of teleological destiny for the arts, based on residual European art history theories predicated on 19th century European views, may not yet go by the board, but if we are going to drop all of that stuff we might want to play with a few ideas.  For instance, whether we're looking at Marxist theory or the kind of postmillennialist Christian impulse that drove the Social Gospel types in the 19th century or that inspires Christian reconstructionists these days, if there's a common thread in criticism of art history theorizing it's that the teleological approach is one of the problems.

Let's go all the way back to that Atlantic feature about poetry with the stuff about Walter Benjamin and Marx:
http://www.theatlantic.com/magazine/archive/2016/10/why-poetry-misses-the-mark/497504/

For Lerner, as his use of the term the social suggests, that hope is not just individual and spiritual, but collective and political. Poetry is linked, in his vision, to the possibility of a total redemption of human society, of the kind Marxism used to call “the revolution.” In particular, his fusion of aesthetic, political, and spiritual messianism brings to mind the work of Walter Benjamin, the 20th-century German Jewish theorist. [emphasis added] Lerner’s previous book, the novel 10:04, was saturated in the Benjaminian concept of redemption: the idea that the world as we know it carries within itself the possibility for transformation. Key to this vision is the idea that salvation will come from within, from a rearrangement of the world, rather than through an external power or a god.

... Poetry is a figure for the unalienated labor and uncommodified value that Marx thought would exist after the revolution. This is a 21st-century artist’s Marxism, one that no longer hopes for real revolution, but looks to the imagination for anticipations of what a perfected world would look and feel like. [emphasis added]

That teleological approach could be pinned on some kind of Christian apocalyptic, but if we're going to do that then let's be careful.  This would be the point at which it matters whether the kind of apocalyptic interpretation of history we're looking at is premillennial, postmillennial or amillennial in disposition.  Yes, this kind of stuff, theoretically, could actually matter.  The average premillennialist Christian in America has perhaps still been trained to await a Secret Rapture and an end of the world in as little as a few months.  The nicest and most succinct way to put it is that these are not the kinds of people who are going to care about a teleological approach to arts history.

Whether in a Marxist form, an explicitly Christian form or even a deistic form, the long-term influence of postmillennialist optimism as an informing ideological variable in art theory, art history and criticism may need to be explicitly abandoned.  Maybe it's a bit much to say "need to be", so I'll just say I explicitly reject postmillennialism in its Christian, Marxist, and deistic varieties.

At the risk of making a possibly wildly controversial statement about Christians, the arts and the avant garde: is it possible that so many of the innovators of the last 120 years came from Christian traditions that could be described as historically amillennial because those traditions were more open to invention and innovation (traditional Catholic and Orthodox teaching seems more non-millenarian in practical ways) than nationalist traditions that have been steeped in a more postmillennialist train of thought? Remember that essential to this proposal is the observation that, yes, a Christian who is an amillennialist still affirms and awaits the return of Christ, but not in a way that imagines we'll hand the world to Jesus on a silver platter because of our success at Christianizing it; it's the postmillennialist optimism that has presented itself as Christian but has historically been implemented as nationalism or patriotism that I am explicitly skeptical about.

College students can really like to imagine that they have transcended genre or are not beholden to this or that tradition.  The ideological fetishes of Romanticism are still very much with us.  If you have no problem admitting you work in fairly traditional idioms in a traditional way with traditional methods that almost seems to defy the whole point of being at a liberal arts college studying the arts.  If you like to write sonnets the writing teacher may tell you it's time to move on.  Music students seem to want to cast off sonata and fugue as soon as they can pass the test that requires them to say they know what that stuff is. 

And yet it seems to me that the 19th century theorists and pundits botched sonata and fugue by interpreting them in terms of their own stereotypes and expectations.  It's been interesting to read that a composer like Angelo Gilardino can refer to sonata forms as obsolete, as though they were obsolete on scholarly or historical grounds, even before he began to compose music for the guitar; thus the guitar could be thought of as an instrument with a body of work that lacks sonatas and fugues, even though all the prestige of the mainstream classical scene seems built around a body of literature that presupposes the sonata and the fugue, those venerable 18th century approaches to thematic development, as foundational to the Western canon.

If that's the case then how could the guitar gain the respectability Segovia wanted for our instrument if its practitioners regard the forms of the mainstream canon as inimical to the instrument?  But that's a hobby horse I don't need to sit on too long for this already long post.  I'm just proposing that the 19th century Romantics (or maybe even 18th century Romantics) had a blinkered and provincial view of stuff they considered universal.  The trouble is that the contemporary post-industrial West is probably not in a different position.  Try as we might, we are not primed to imagine a truly abstracted and global human race.  And yet that aspiration is in some sense a holdover from Romantic ideology, an ideology that may be found bitterly and desperately wanting in light of its own criteria of and for artistic greatness.  As Meyer put it, Romanticism insisted on the repudiation of conventions, but maybe the repudiation of all convention drained the arts of the means to express the individual in the way the Romantics admired.  The Romantics were busy disguising their conventionality in the hope it wouldn't be noticed, and it wasn't until the 20th century that artists, musicians and writers actually cast off the constraints many Romantics pretended to cast off.  History showed what many Romantics thought about that.  The punchline may be that the late Romantics had the misery of observing artists who actually did what the Romantics had pretended to themselves they were doing.

from the manic pixie dream girl through waif fu to the murderous ingenue: Ex Machina, The Witch and film critics who will fall for the manic pixie dream girl as long as she stabs the patriarchy


http://www.salon.com/2014/07/15/im_sorry_for_coining_the_phrase_manic_pixie_dream_girl/

When I coined the term “Manic Pixie Dream Girl” in an essay about the movie “Elizabethtown” in 2007, I never could have imagined how that phrase would explode. Describing the film’s adorably daffy love interest played by Kirsten Dunst, I defined the MPDG as a fantasy figure who “exists solely in the fevered imaginations of sensitive writer-directors to teach broodingly soulful young men to embrace life and its infinite mysteries and adventures.”

That day in 2007, I remember watching “Elizabethtown” and being distracted by the preposterousness of its heroine, Claire. Dunst’s psychotically bubbly stewardess seemed to belong in some magical, otherworldly realm — hence the “pixie” — offering up her phone number to strangers and drawing whimsical maps to help her man find his way. And as Dunst cavorted across the screen, I thought also of Natalie Portman in “Garden State,” a similarly carefree nymphet who is the accessory to Zach Braff’s character development. It’s an archetype, I realized, that taps into a particular male fantasy: of being saved from depression and ennui by a fantasy woman who sweeps in like a glittery breeze to save you from yourself, then disappears once her work is done.
When I hit “publish” on that piece, the first entry in a column I called “My Year of Flops,” I was pretty proud of myself. I felt as if I had tapped into something that had been a part of our culture for a long time and given it a catchy, descriptive name — a name with what Malcolm Gladwell might call “stickiness.”
...

Now that we've had a decade of negative regard for the manic pixie dream girl, that could possibly explain why, in spite of authors who are alert to the sexual stereotypes in waif fu, film critics, or at least some of them, have fallen for a bare, slight modification of the waif fu trope.

Whether it's Mindy's Hit Girl, or Ava from Ex Machina, or Thomasin from The Witch, if a Hollywood reaction to criticism of the manic pixie dream girl has taken shape, it's that the preternaturally beautiful female has stopped being the manic pixie dream girl and has become the murderous ingénue, and for some reason this slight pivot has been enough to win over film critics as though it were an insightful, revelatory and revolutionary iteration of cinematic girl power.

Basically nothing substantial about the objectification process of the female has changed EXCEPT what film critics have given themselves license to publish as to the significance of the narrative perspective on the old femme fatale trope.

This is most easily documented in the case of The Witch.

http://www.theatlantic.com/entertainment/archive/2016/02/robert-eggers-the-witch-female-empowerment/470844/

...

Throughout the film, Thomasin’s family is picked off one by one until she’s the only one left (a particularly gory moment near the end sees her father William gored by the horns of a demonic goat named Black Phillip). She then signs herself over to the devil and joins a coven of witches dancing in the woods; the film closes on Thomasin levitating and laughing with delight. In an interview, Eggers said he didn’t initially approach his screenplay of The Witch as Thomasin’s story, but that he eventually realized she had to be the heart of the film.

The original draft was about how the titular witch manifested herself to different members of the family, meaning the film spent roughly equal time with everyone. “But through working on the second draft with my producers, Thomasin became the protagonist,” he said, adding that the film still works as an ensemble piece. In the story, the witch and her demonic partners take several forms: a goat, a raven, a rabbit, a beautiful woman, and a disfigured crone. While most of the other family members are besieged by these figures, Thomasin is targeted instead with suspicion from her parents and siblings, who come to think she’s in league with evil forces. “It was not my intention to make a story of female empowerment,” Eggers said, “but I discovered in the writing that if you’re making a witch story, these are the issues that rise to the top.”
...
The film’s exploration of patriarchal power was the key to unlocking Thomasin’s story. As a woman in the 17th century, she’s entirely stripped of agency. She exists only to work and help her family, and eventually be married off and bear more children. As The Witch progresses, it becomes clear that the campaign being waged against her family is targeted at freeing her so that she can join the coven in the woods. The idea that she’s been liberated is an intentionally muddy one—when she submits to Satan near the end of the film, he takes the form of a man—but there’s a giddy sense nonetheless that she has triumphed.

When asked about The Witch’s deeper commentary at a press conference, the actress Anya Taylor-Joy said she thought the film had a “happy” ending—because joining the coven is the first choice Thomasin gets to make on her own. Eggers is careful to communicate the darkness of Thomasin’s coercion, but doesn’t shy away from the fact that she’s leaving a repressive society behind. When he started thinking about The Witch, his focus was on the unknown, on “understanding where all this stuff comes from, the origins of the clichés—how they’re powerful, how they’re part of everyday life.” But he’s surprised and happy with the way his story evolved, and how it can speak to important modern issues despite being set centuries ago. Thomasin and The Witch seem destined to enter the great canon of horror films that includes the likes of Carrie, The Descent, and A Nightmare on Elm Street: stories that terrify by tapping into the immense power and fury of isolated women.


 http://www.slate.com/blogs/browbeat/2016/02/22/a24_s_new_horror_film_the_witch_has_enchanted_critics_but_mainstream_audiences.html
 ...
Normally, the fall of the main character in the final scene of a horror movie would be a director’s gloomy or gleeful surrender to evil. But The Witch presents Thomasin’s conversion as a victory for her: Embracing Satan allows her to escape from the physical hardship, moral hypocrisy, and gendered violence that’s tortured her thus far. (Given how few people in the Calvinist universe actually belong to the divine elect, hedging your bets by becoming a cursed, uberpowerful immortal is just good sense.) I can’t overstate just how shocking this moment feels, when you realize that the movie has up until now perpetrated a fundamental deception about its own point of view. All along, Eggers has stood on the Devil’s side; the triumph of the forces he’s trained us to dread and fear actually constitutes a happy ending. This hugely daring reversal could read as a middle finger to viewers, who’ve spent the past hour and change sympathizing with the pilgrims and rooting against the dark hosts. But don’t have such a limiting, orthodox view of what a horror movie ought to accomplish! Let the film’s ending serve as a reminder—as a certain goat might say—how delicious heresy can be. [emphases added]


Perhaps religious education has declined to the point where people don't realize that the primary difference between Protestants and Catholics over the centuries in the realm of diabology was simply who decided who was the Antichrist; in nearly all other respects, as the historian Jeffrey Burton Russell has put it in his numerous books on the history of thought about the Devil in Judeo-Christian traditions, there was agreement among Catholics and Protestants.  The largely reflexive reactions on the part of American film critics to cast the Puritan legacy in diabolical terms can seem like overcompensation when you have some understanding that Western Christian diabology has been one of the handful of areas in comparative religion where the Christians largely affirmed the same core ideas.  Be that as it may, film critics who might blush at stereotypes about people of color don't blush quite as much if the stereotypes involve, let's not finesse this too much, white trash Protestants.

An ever so slightly more nuanced version of this sentiment about the dread state of being female in a Puritan context ...

https://killscreen.com/articles/the-witch-isnt-an-empowerment-narrative-and-thats-why-its-great/

...
Would the director and talented, fresh-faced actress Anya Taylor-Joy consider Thomasin’s final resting place in Satan’s blood-soaked embrace a “happy” ending? Taylor-Joy answered: yes, because it was the first choice she really got to make. Yes, because it meant empowerment. Yes, because society left her no other option: if she went back to the plantation, she’d face the same accusations; and she couldn’t very well run a farm on her own with nothing but her dead family’s corpses for fertilizer.
 
Already, red flags were firing. How can Thomasin’s story be one of female empowerment when, as the final scenes imply, she chooses Satan because she literally has no other choice? If the story had painted her ultimate destiny as a clear decision between the life she lived with her family and dancing naked in the woods around a flame, that would be one thing. But Thomasin is no Carrie (of the Stephen King novel), who, despite ending up worse off in many ways, at least chose to be up there of her own volition.

The author quoted above went on to talk about how terrible the plight of women was in Puritan era America. 


But not everyone quite bought into the girl power.

https://newrepublic.com/article/130182/witch-suffer-little-children

By Will Leitch
February 19, 2016
...
The Witch is the sort of horror movie that gets a ton of praise for its dogged resistance to conventional scary movie tropes. An indie hit out of Sundance last year, The Witch is the type of film that’s a success at film festivals but tends to evaporate once released into the wild; what works in the relentless hustle of a festival can feel airless when introduced to the elements of regular human audiences. The Witch is wrapped up in its own views of religion, of sin, of feminine power, but more than anything else, it is wrapped up in itself.

..


The parents are seen as tormented but also cruel and vengeful in a way that’s easily mocked from the distance of 450 years; Eggers is much more interested in their suffering than their plight. The family begins to crumble as William starts to wonder if he is reliving the life of Job, and we are invited to revel in the family’s strife and even perhaps suspect William and his brood may have it coming to them a little. [emphasis added] William is seen as a decent but deeply misguided man, and the movie briefly flirts with the notion that God is somehow punishing him. Except we know there’s a witch: We see her in the first five minutes of the movie, and the possibility of her reemergence is the central driver of tension the rest of the way. Something is legitimately tormenting this family, and it is not God. We should feel more sympathy for William than Eggers allows us to. Sure, he’s got some outdated views—he’s 500 years old— but there’s still a witch trying to kill his family, cut the guy a break.
...

It's been a while since we've had a link to Cinemagogue, so a link is overdue ...

http://cinemagogue.com/2016/02/28/the-witch-agency-for-eating-babies/

What can be read as a fairly classic cautionary tale by a movie critic who's also a pastor in the Reformed tradition, and who can take the Reformed idiom of the characters seriously, has been read as a girl power ode by other film critics.  Whereas people looking for a pagan girl power cheerleading anthem see Thomasin's signing herself over to the dark side as a victory, a Christian can see that decision not as a rejection of her father's obstinacy and self-righteous self-determination but as the logical outgrowth of it.  Thomasin merely embodies further the sins of her own father, the sins that led him to choose his own path and, by dint of his being the father, forced the rest of his family to join his fate.

We'll come back to the theme of the daughter and the father almost immediately but first we need to shift back to those manic pixie dream girls.

Now if the manic pixie dream girl role was secured by the likes of Zooey Deschanel, Kirsten Dunst and Natalie Portman a decade ago, in this decade the murderous ingénue has been championed by Chloe Grace Moretz, Alicia Vikander and, more lately, Anya Taylor-Joy.  Each of these actresses is conventionally beautiful enough to end up playing manic pixie dream girls somewhere in the future; they are still actresses fit to play the manic pixie dream girl, but they have become known by being cast as avenging angels literally stabbing the patriarchs who embody the privilege and power that runs the world as we know it.  These femme fatales, these deadly debutantes, are sympathetic not so much because they aren't murderous schemers (and, really, still manic pixie dream girls) but because they enact revenge fantasies against the big dicks with God complexes who feel entitled to reorder the world around themselves.  So, yeah, it's easy to root for these murderous ingenues, perhaps, but the image of the feminine has not necessarily changed so much as the frame around the portrait.

Making a femme fatale a sympathetic protagonist does absolutely nothing to alter the trope, but the shift in narrative perspective alone seems to be enough to convince some film critics that the deadly debutante is a fantastically subversive thing. 

Let's propose that the chilly remove we can observe in these films with the murderous ingénue is a sign that filmmakers don't necessarily want to come out and say they're rooting for the murderous ingénue ... but if you do, once you've paid for your ticket, well, hey, girl power.

Perhaps there's some kind of subtext in the film criticism dealing with the trope of the murderous ingénue.  The critics aren't necessarily just writing about films featuring a murderous ingénue like Ava or Thomasin or Hit Girl; they're writing about the frustration of being unable to assimilate into the mainstream of cultural power and influence, the frustration that an art form more than a century old seems to have so few women headlining and defining the culture.  The patriarchy being complained about is not really the old Puritan era patriarchal system that didn't allow women a voice (a voice that, if we could interview women from that era by the magic of a time machine, they might not have considered necessary in the way we do).  The patriarchy at play is the one perceived to exist right now, the patronage empire that can greenlight one Bayformers movie after another and that keeps the Star Trek franchise alive decades after the end of the Cold War, the Cold War being a necessary historical component to understanding why anyone made the franchise to begin with.

There could be more than just a few things to say about the abjection of the past necessary for this interpretation, but perhaps we can say for the moment that films like The Witch and Ex Machina can function as Turing tests that ask you who you think the protagonist of the film is, without committing to the idea that many film critics who have reviewed these films simultaneously commit to, which is the decision that once you've settled who the protagonist is you've established who the hero is, as if the protagonist and the hero were one and the same thing.

Let's not be too hasty.  We've already had years in which to observe the formulation of the murderous ingénue since Chloe Grace Moretz played Hit Girl in Kick-Ass.

Chloe Grace Moretz went on to keep playing the murderous ingénue type: in a remake of Let the Right One In, by reprising Hit Girl for Kick-Ass 2, and by starring in a remake of Carrie.  The trajectory is short and it tends toward domestication and remakes.  I hope Moretz can shake off the murderous ingénue typecasting before she obviously ages out of the part.

Meanwhile, Anya Taylor-Joy has since shown up in a film described as a respectable second-tier Ex Machina, the sci-fi film Morgan.

http://www.thestranger.com/film/2016/08/31/24533750/morgan-remember-synthetic-humanoids-will-kill-you

... Morgan’s biggest downside, really, is simply that last year’s Ex Machina got here first, tackling many of the same issues (and some of the same scenery) in a more audience-friendly, immediately satisfying way. Still, that second-banana status shouldn’t negate this film’s virtues, most notably the impressive sense of chilly remove that lingers past the final enigmatic frames. 

Is playing a type that Vikander got to first edgy enough?  I haven't seen Morgan yet and may not get around to it but an advantage of steeping yourself in a little film criticism is that you can keep up with films you can't afford to go see and can observe patterns here and there. Taylor-Joy may be the latest actress to benefit from the murderous ingénue trope and, well, that conveniently lets me have three murderous ingenues to correspond to three manic pixie dream girls from the earlier decade.

Twenty years ago Joss Whedon's Buffy the Vampire Slayer was considered wry and inventive, and we've had those two decades as an interval in which feminists and film critics have been able to discern the long-term limitations of Whedon as a one-trick pony.  Waif fu has been shown to be not so different from earlier tropes.  The risk in the current cultural moment is that film critics who by now ought to know better, because they review movies for a living, are falling for the murderous ingénue perhaps only because they were so saturated by the waif fu/manic pixie dream girl trope in the previous decade that they don't understand they're seeing the same stuff through the refraction of a different narrative prism/trope.  The light that shines through, and the resulting rainbow, is unchanged.  And make no mistake, the kinds of endings we get in Ex Machina or The Witch can still fit comfortably into waif fu.

The reason we shouldn't be so eager to celebrate these films and these murderous ingenues as odes to girl power is that a collective cultural venting of frustration now can be blind to trajectories.  The trajectory of the cinematic universe is short and tends toward repetition and tropes.

What the murderous ingénue shows us, whether it's Hit Girl or Ava or Thomasin, is that she is the daughter who magnifies in her vice those things her father regarded as virtue. In the case of Hit Girl (the cinematic version, not the comic) she is the daughter who lives out the quest for vengeance, justice and murder she received from her father.  Ava embodies the insatiable ambition of her creator and his quest to revolutionize and overthrow whatever the old order might have been; it just so happens that he is the old order.  For Thomasin, if we were to take the Puritans a bit more seriously on their own terms than the average American film critic might want to, her turn to Satan is just a more explicit form of the rejection of social formation as a necessary component of individual and spiritual identity, a rejection she got from ... her father. The reason we shouldn't celebrate the murderous ingénue as some kind of stab at a patriarchy is that she is her father's daughter.

If we wanted any more vivid proof of how readily an actress who has played the murderous ingénue can pivot over to what will probably be waif fu ... Alicia Vikander (whose turn as Ava in Ex Machina was engrossing and charming) is going to be playing Lara Croft.

http://variety.com/2016/film/news/tomb-raider-release-date-alicia-vikander-1201810114/

Meet the new boss.  Same as the old boss.

Friday, September 16, 2016

a few thoughts on Lionel Shriver's speech on cultural appropriation and the problem of privileged people talking about privilege

So for those who sometimes keep track of what writers say about writing, and of things like debates about the legitimacy or illegitimacy of cultural appropriation, there was this speech, from which I'll only quote an excerpt:

https://www.theguardian.com/commentisfree/2016/sep/13/lionel-shrivers-full-speech-i-hope-the-concept-of-cultural-appropriation-is-a-passing-fad
...
The author of Who Owns Culture? Appropriation and Authenticity in American Law, Susan Scafidi, a law professor at Fordham University who for the record is white, defines cultural appropriation as “taking intellectual property, traditional knowledge, cultural expressions, or artifacts from someone else’s culture without permission. This can include unauthorised use of another culture’s dance, dress, music, language, folklore, cuisine, traditional medicine, religious symbols, etc.”


What strikes me about that definition is that “without permission” bit. However are we fiction writers to seek “permission” to use a character from another race or culture, or to employ the vernacular of a group to which we don’t belong? Do we set up a stand on the corner and approach passers-by with a clipboard, getting signatures that grant limited rights to employ an Indonesian character in Chapter Twelve, the way political volunteers get a candidate on the ballot?

Shriver's counterpoint didn't seem like much of a counterpoint because, even to a stick-in-the-mud Presbyterian Calvinist sort like me, who's moderately conservative about both politics and religion, this rebuttal from Shriver seems to depend on a bad faith understanding of why people would be upset about whatever they define cultural appropriation as being.  I'll get to that in a bit, but if the core of Shriver's rebuttal is to say that writers can do what they want to do because nobody should have to give writers permission, that's not much of a defense.  It's the kind of defense that presupposes the liberty of the writer to write whatever he/she/it insists upon writing.  While some writers would (and did) say this smacks of privilege, the problem here isn't that this is a privilege a writer "shouldn't" have by dint of being a writer; it's that this is the kind of decision-making power every writer has, and pointing to it isn't the real point you want to make if you want to defend writers creating characters of races or religious beliefs or sexualities they don't personally possess.

Of course, a few people took issue with Shriver's talk and ...
https://newrepublic.com/article/136820/novels-arent-political-statements-theyre-not-apolitical-either

 ...
Among the invited opponents: Yassmin Abdel-Magied, a writer who’d walked out during Shriver’s talk. “The stench of privilege hung heavy in the air, and I was reminded of my ‘place’ in the world,” she wrote in The Guardian. Abdel-Magied rightly accuses Shriver of insensitivity, but also sets a restrictive, overly political vision for what literature should be
...
Cultural appropriation is both a real phenomenon—responsible for rock-n-roll and the Washington Redskins alike—and a ripe target for criticism and mockery, since the concept renders nearly every garment or foodstuff fraught.
...
The author of the New Republic piece made the obvious but necessary observation that cultural appropriation is basically how all art happens and how culinary innovations occur; we might benefit from proposing that not all cultural appropriation is done in the same way or for the same reason.  It might be worth revisiting an example I've used a few times at this blog, how Bubber Miley quoted from a Chopin piano sonata at the end of an early Ellington recording.  Miley didn't get permission from Chopin, because Chopin was dead.  Cultural appropriation in which an American musician who isn't white cribs from a dead Polish composer who composed nationalist piano music a century earlier has never been what the people who talk about the badness of cultural appropriation seem to be concerned about.

The stench of privilege could be heavy in the air anywhere writers can afford to go to writers' conferences to hear any writer hold forth on the sacred nature of the art form.  I don't have much of a history of going to conferences for writers and I'm not opposed to them--that said, having never been a vocational artist, I think the delusion many vocational artists and writers work from is imagining that there can be more than two modes of relationship between the vocational artist and formal or informal power:  you're either a servant of the ruling class or a participant in the ruling class, whatever or whoever the ruling class may be.  Those are the only two possibilities, and any liminal space between them presupposes the binary. 

The only "possible" way to be outside of this binary is to be an amateur who never becomes a vocational artist or writer.  Alas, we seem to live in a moment in which writers who have the privilege of writing about privilege held by others don't seem to think of privilege as being something they, too, possess. There may be tens of thousands of people who can afford to go to liberal arts colleges and get degrees in the humanities who have been able to convince themselves they are the proletariat when they aren't.  For college graduates, even people with just undergraduate degrees, to think of themselves as working class is absurd.  People who graduated from high school, or didn't graduate from high school, and are out in the work force could be proletariat if they're not making huge sums of money, but the vocational writer is in a cultural sense a part of the priesthood of culture-builders.

That is, in a sense, the substance of a potential critique of Shriver's whole approach.  The people who have the privilege of writing stuff that gets monetized absolutely have a privilege, and if it's one they take for granted or presume upon, then they can come along and declare, as Shriver does, that a writer shouldn't need to ask anyone's permission about what kinds of characters to put into a book.  Sure.  To propose otherwise is to propose a kind of censorship. 

The most basic problem with this definition of cultural appropriation ...
http://racerelations.about.com/od/diversitymatters/fl/What-Is-Cultural-Appropriation-and-Why-Is-It-Wrong.htm

In the United States, cultural appropriation almost always involves members of the dominant culture (or those who identify with it) “borrowing” from the cultures of minority groups.

Is that it assumes its own conclusion.  What's more, there's the problem that, as Conor Friedersdorf was writing not so long ago about the rise of a culture of victimhood:

http://www.theatlantic.com/politics/archive/2015/09/readers-defend-the-rise-of-the-microaggressions-framework/405772/

... Ferris Bueller is a stand-in for every kid who has performed victimhood to avoid school or homework. I don’t mean to suggest there are no real victims. Quite the contrary. The argument is that huge percentages of the population will, if given the opportunity, exaggerate their victimhood in order to get the gains that come with it. Many people will even fall for their own act to a degree. None of us are immune. I’m often tempted to view myself as an aggrieved party in some dispute.
 
This aspect of the culture isn’t a race thing, it’s a human nature thing. You can’t set up a system where status accrues to victims and then let people determine their own victim status. [emphasis added] Insofar as this is true of black and brown people on college campuses, it’s only because they’re no different from white people on college campuses, who participate just as much in victim culture, and many people off campus. Every human is vulnerable to the perverse incentives of “victimhood culture.” [emphasis added] 

And this is apparently how people who have a lot of power and influence can still see themselves as victims.  Even if we decided there was nothing at all to contest in the definition of what "cultural appropriation" even is, there's no certainty that the people who are considered the dominant culture won't see themselves as, somehow, the victims, the minority besieged by the masses of every culture that isn't precisely their own.  The proverbial one percent will feel lonely and at risk from the ninety-nine percent by dint of being a numeric minority.  It's always been in the nature of empires to assimilate all those cultural elements the masters of said empire decided not to destroy. 

But let's get back to the "cultural appropriation almost always" part, because beyond the abstract difficulty of defining cultural appropriation in largely pejorative terms up front, there's a historical matter. 

Let's just return to "Black and Tan Fantasy", one of Duke Ellington's early works.
https://www.youtube.com/watch?v=sX4VK_CGIuw

Check out 3:00 moving forward, and then compare it to ... Chopin's Piano Sonata No. 2, movement 3
https://www.youtube.com/watch?v=y0mAbw-niI8
0:11-0:17

So if cultural appropriation almost always involves members of the dominant culture (or those who identify with it) "borrowing" from the cultures of minority groups what are we expected to make of one of Duke Ellington's trumpet-players borrowing a riff from Chopin for the end of "Black and Tan Fantasy?"

How about when the composer George Walker composed a set of variations on "O Bury Me Beneath the Willow" as the second movement for his first piano sonata?

If you want to hear a version of this old bluegrass song ... you could do worse than the Carter family, probably.
https://www.youtube.com/watch?v=YCniFuHlPG0

Start listening at 7:30ish for a pretty abstract take on the folk song with some fine variations
https://www.youtube.com/watch?v=S6pRXxyrgxw

At this point it's probably not even "necessary" to mention John Coltrane's quartet playing "My Favorite Things"
https://www.youtube.com/watch?v=YHVarQbNAwU

None of these cases can, based on the definition of cultural appropriation quoted above, even qualify as cultural appropriation.  The problem with this polemical definition of cultural appropriation is the double standard built into it.  George Walker, as an African American composer, can take a lovely old bluegrass/cowboy song chorus and fashion some great variations on it and that's not cultural appropriation?  It's not like there were never black cowboys, even if we stereotypically equate cowboys with white males.  Would it count as "cultural appropriation" that a man trained as a classical pianist and composer appropriated a folk song without getting permission?  Well, if you insist, but those kinds of appropriations have been happening for as long as humans have been around. 

So it's not without cause Phoebe Maltz Bovy wrote, "Cultural appropriation is both a real phenomenon—responsible for rock-n-roll and the Washington Redskins alike—and a ripe target for criticism and mockery, since the concept renders nearly every garment or foodstuff fraught."  An ideologically implemented set of objections to cultural appropriation runs aground on the long tradition of white and black musicians in the United States and across the world borrowing from musical cultures and sharing musical ideas and ideals.  The point is easier to make in music because it happens to be easier to demonstrate here at this blog.

But for fiction there's another reason to be skeptical about the kind of agitation that insists that novels not traffic in stereotypes without some kind of permission to do so.  Fiction is fiction.  It's not as though we're talking about journalism, which, at least in theory, is obliged to present the facts that can be discovered and also the truth that may be discovered (which, it must be said, is not necessarily ever or always exactly the same thing).  The field of anthropology can set for itself a commitment to reliable, or aspires-to-be-reliable, ethnographies of people in particular times and places, but fiction is fiction. 

While it might be nice if some authors displayed more social responsibility about whether they traffic in stereotypes when they decide to invent characters, the case that they are obligated to do so is wanting; it can seem as though Socialist Realism retains vitality as an idea not on the basis of depicting a realistic triumph of the working class but through a spiritual rebirth, in the form of some writers embracing the idea that people only have the moral license and legitimate opportunity to write about their own group.  And that's great if we're talking about journalism!  Yes, by all means, people who want to tell the story of what they and their group have been through should have a First Amendment protected right to share their stories.

There is a different way to object to Abdel-Magied than the way Rod Dreher approached it this week:

http://www.theamericanconservative.com/dreher/blasphemy-of-the-left-file/

It's possible to suggest that Shriver presented a speech that comes off like the entitlement of a smug, lecturing writer and that Abdel-Magied makes a comparably weak case in equally bad faith, a case that depends on a double standard that has not, as yet, been justifiably explained.  It may be too much to propose that cultural appropriation back and forth, done in ways that are respectful to the people and traditions involved, is something we should all be able to agree to strive for.  Figuring out how Gentiles and Jews who identified themselves as believers in Christ could peacefully and lovingly co-exist in spite of their substantial cultural differences was not just a little sidebar concern if we go back and read the New Testament. 

I don't actually blame Abdel-Magied for walking out of the speech.  I read the transcript of Shriver's speech and feel bad for anyone who paid money to hear writers talk about writers being able to do whatever they want because ... writers!  It's just that anyone who can attend a writers' conference is already in a position of privilege compared to people who aren't vocationally writers, let alone compared to anyone who can't even read.  Compared to everyone the world over who can't, couldn't or never will be able to read, Abdel-Magied writes from a position of spectacular and unacknowledged privilege.

The problem is that the world will never be a utopia and never be equal.  Well, there is the hope that one day truth and justice shall reign across the entire planet and, as a Christian, I can tell you that's an apocalyptic utopian hope of the sort that Christians, at least, confess is only possible when Jesus Christ comes to reign as God and king of the cosmos and when every tear is wiped away and, well, you get the basic idea.  Those of us who are actually religious understand that the level of justice Abdel-Magied alludes to is only possible when a god who created the entire universe/multiverse decides to bring that utopia about.  Until then, injustice and inequality will always reign, most frequently these days through those who would insist to us they can bring justice about, unfortunately.

So, somewhat contra Rod Dreher, I wasn't convinced by Abdel-Magied, but Shriver's speech was boilerplate sanctimony of the writers-talking-about-writers variety. 


POSTLUDE

Noah Millman had a response and it was pointedly different from Dreher's, though both obviously blog at The American Conservative:

http://theweek.com/articles/649088/whos-kemosabe
...
Nonetheless, I have a question for Ms. Shriver. I agree that the whole point of writing fiction is trying on new hats, new masks.

But what if the mask you want to wear is... Batman's?

Millman swerves into Lone Ranger territory ...

...
The point is that while Tonto is intellectual property, Comanche-ness is not. It's a matter of courtesy (as well as good defensive marketing) to consult with the Comanche nation before representing one of their number on screen. Consulting Universal on representing Tonto is a matter of law. And so anybody who feels a kind of ownership of the Native American identity runs the risk of feeling: Something I own was used without permission.

That, I suspect, is what really rankles those who gnash their teeth when someone lectures them about how art is all about borrowing and exchanging freely. That's exactly what art is, but our whole edifice of intellectual property law is increasingly designed not to facilitate that borrowing and exchange, but to frustrate it, in the service of protecting the value of incumbent cultural products — the ones owned by corporations. [emphasis added]

The solution, though, isn't to build more walls, so that everyone sticks to their cultural knitting. That will just exacerbate existing baleful trends. Rather, what's needed is to restore the artistic commons, before the only culture we know is one we'll have to pay a fee to join. [emphasis added]

In other words, insisting on putting a stop to cultural appropriation is counterproductive because cultures are not intellectual property.  As the definition and enforcement of intellectual property has tilted in favor of corporate juggernauts able to enforce their interests, complaining about cultural appropriation is a far lesser concern than what Millman pretty directly articulates: the reality that our culture as a whole has been steadily reaching a point where there's functionally no public domain for the culture we can most easily get.

Now there's a substantial body of public domain work in literature and art and music, but that gets us back to the Western artistic canon.  It may be that not everyone is happy that the history of Western art, literature and music seems to be those proverbial dead white guys and gals, but at least it's public domain!  By contrast, if it's been recorded by a record company, the record company owns the copyright and the rights to its reproduction and distribution.  Perhaps one of the advantages of the old Western canon these days is precisely how much of it is public domain.  Pride & Prejudice & Zombies can be a thing because Austen's book was published two centuries ago.  But as we were reminded with the passing of Prince, he was choosy about how people had access to his music, and he literally had every right to be. 

As a guitarist and a composer who's fond of 18th century music, I'd say Millman will want to remember that we already have a spectacular public domain for the arts.  Perhaps a blogger at The American Conservative could note a potential irony here: the wealth of the Western artistic canon that's literally free for cultural appropriation of every possible sort can be the part most objectionable to those who wish the canon weren't so white and patriarchal.  All right ... but at least you don't have to pay a licensing fee to get permission to rewrite Jane Austen or Dostoevsky ... or do you?

Saturday, September 10, 2016

Alastair Roberts at Mere Orthodoxy "Sometimes Narratives Betray the Cause", starting with the Elizabeth Holmes narrative and moving toward the observation that evangelicals can be even worse at buying into iconic narratives


https://mereorthodoxy.com/sometimes-narratives-betray-cause/


With Elizabeth Holmes' narrative of founding a pioneering tech start-up looking to be more thoroughly narrative than anything else, Alastair Roberts has written about the propensity of the charms of a narrative to bludgeon out processes of confirmation.  Whether Roberts would wish it or not, this can be thought of as a kind of part 2 to "Rob Bell and Don Draper: The Ad Man's Gospel".

Nick Bilton’s Vanity Fair article is definitely worth a read. Perhaps the most striking dimension of it for me was his attention to the role played by ‘narrative’ in Holmes’ rise and in the credence that people gave to her. Perhaps more than anything else, Holmes’ success lay in a story, a story about a revolutionary new technology that would transform the way blood testing is conducted, and a story about a young woman excelling in the male world of innovation and technology.

Roberts mentioned that Holmes was defended for a time on the grounds that feminists needed icons of pioneers in male-dominated industries and that the need for the icon blinkered people's ability to see that there were unraveling threads in Holmes' narrative.  Roberts, however, went on to state that evangelicals and Christians are, if anything, even worse about depending on these sorts of narratives.

This is certainly not just a problem for other movements: Christians can be as bad at this as any others, and often are much worse. Many of the prominent stories of the ‘persecution’ of Christians in the West that are publicized in the Christian press, for instance, turn out to be distorted, stories of employees breaching company policies, harassing or mistreating others, or of professing Christians making an unpleasant nuisance of themselves. We believe the stories that we are told without closely examining them because we want to believe them. They so effectively symbolize our narrative that their truthiness suffices to demonstrate their veracity.

The same can be true of attractive testimonies and iconic figures who represent us. As a young teen, I remember my church using the story of Hansie Cronje, the captain of the South African cricket team, in some of its evangelistic literature. Not only was Cronje a dynamic and popular sports figure, he was also a clean cut, ‘born again’ Christian. Unfortunately, only a few years later Cronje was discovered to have been involved in match-fixing, in a scandal that threw the entire sport into crisis. Rumors of serial adultery also surfaced.

Here in Seattle, almost two years after Mark Driscoll decided to resign, and in the year after he hit the conference circuit to share retroactively how God released him from a ministry he'd spent a decade saying he didn't plan to abandon, it seems unlikely that we who are Christians can remind ourselves too many times that buying into an iconic narrative has drawbacks.  Mark Driscoll certainly presented quite a narrative of the history of how he planted Mars Hill ten years ago in Confessions of a Reformission Rev.  Back around 2005 he mentioned on the Midrash that he was planning to write a history of Mars Hill.  Eleven years on, it seems ... embarrassing and bewildering to think that a church that was merely ten years old could possibly have been significant enough to merit a history written by one of its co-founding pastors.  Twenty years on there seems reason to write a history of Mars Hill, but even ten years ago, when I read the published book Mark Driscoll wrote, I was disappointed to see that this was not really a history of Mars Hill so much as a "how I did it" book written by Mark Driscoll about what was ultimately his own narrative, not the story of the formation of a Christian community.  It was a decade ago, then, that the seeds were planted of my thinking that it would be a good idea to record the history of Mars Hill in some way that was not just promotional copy for one guy's branded narrative.

One of the most pervasive frustrations I have had over the last ten years is discovering that Christians and non-Christians, left and right, have a huge investment in committing to some stereotyped, branded narrative.  Sometimes these tropes converge closely enough to the things that happened and to the people involved that they can pass for what happened. 

What has also been difficult is that in Mark and Grace Driscoll we have a couple that could not, by Mark Driscoll's own account, have been more explicitly and professionally trained to formulate and master narrative.  Driscoll trained in speech communication and his wife trained in public relations. 

https://web.archive.org/web/20120115012617/http://pastormark.tv/2012/01/12/a-blog-for-the-brits

...
I have a degree in communications from one of the top programs in the United States. So does my wife, Grace. We are used to reporters with agendas and selective editing of long interviews. Running into reporters with agendas and being selectively edited so that you are presented as someone that is perhaps not entirely accurate is the risk one takes when trying to get their message out through the media.

What made Mark Driscoll's case unusual within the history of American Christianity was not so much that there was what we could call a spin-doctored narrative of the sort that has unraveled lately for Elizabeth Holmes.  No, what made it unusual was that this narrative was formulated for the record by a man who pretty much told us along the way that he got training to create this kind of narrative and that his wife worked in public relations.  To translate it into Driscollian terms, this was a guy who told us his training, and his wife's, was in spin-doctoring.  By the time Mark Driscoll was giving lectures on the methods and significance of engaging mass media platforms across media types, this was a guy speaking as a vocational propagandist at every level except for calling himself a minister of propaganda.  Of course, in hindsight, this is obviously what he was and what he may yet hope to be again. 

In a culture that loves stars and icons, we can desire our own stars and icons like the nations, putting our trust in them. Christian culture so often recklessly invests its credibility, witness, and energy in fickle celebrities and prominent leaders, leaders that all too frequently are revealed to have feet of clay. As Christians we so often have narratives that we are invested in and attracted by, the sorts of narratives that disable the immune system of our critical faculties, just when we might most need them.

At such times, we can benefit both from the development of communities of internal critique and from receptivity to external critics, which may require overcoming our urge to circle the wagons. Scandals are hardly ever without advance warning signs, if we pay attention, and listen to those warnings. Those warnings will often come from people we instinctively dislike. The warnings will run directly against what we want to believe. They will offend our sense of truthiness. But they should be heeded nonetheless.

Developing communities of healthy internal critique is difficult too. How often do you see evangelical Christians prepared to break ranks and sharply challenge someone in their immediate circles? The lack of examples and exemplars of such behavior perpetuates and intensifies a culture where prevailing narratives and icons are upheld uncritically. We all like to criticize the other side, yet are reluctant to ask tough and searching questions of our own. We don’t like to challenge our friends and to risk the possibility that they react against or marginalize us. We feel uncomfortable and defensive in places where everyone is rendered vulnerable to challenge and criticism.

How often do we see evangelical Christians prepared to break ranks and sharply challenge someone?  Well, yes, the lack of examples of such behavior is depressing, but it's possible, at the risk of pointing this out, that the decline of Mark Driscoll may be one of the very few case studies in which criticism from within the broadly Reformed Christian scene provided such an example. 

There may be an advantage in being on the margins of a community yet well-connected enough within it to raise questions about the group narrative.  It can be perilously easy to present prophetic activity as this sort of activity, and that would be, as Roberts is more likely than others to be aware, a kind of Hollywood-style narrative in itself about prophets "speaking truth to power" from the margins of society.  Prophets were generally an accepted class within ancient societies and, along with or in competition against sages, would advise power.  Prophets within the biblical canon have a well-attested history of engaging in polemic with other prophets whom they accused of corruptly condoning power.  Thanks to all of the polemics and tools of narrative refinement, we have a whole host of elites within subcultures who can fashion themselves as mediators for the narratives of groups that internally see themselves as oppressed underdogs while being overlords and spin doctors within those communities. 

One of the difficulties of internal critique is that all too often the people who most need to be subjected to scrutiny are immersed in what some call a victimhood culture.  College students who have the privilege of writing about privilege may not be able to grasp that, compared to people who only graduated from high school or didn't graduate from high school at all, their position is itself one of privilege; and it isn't that difficult to come across writers who have been to liberal arts colleges who have convinced themselves that they are ... well, actually, as Alastair Roberts put it in a recent podcast, people now go to college so they can be in the middle class.  If undergraduate college education has become a prerequisite for being middle class, does this mean it's become the entry requirement or the maintenance requirement or both?  For the moment, before getting back to my earlier point, I'll say that Christopher Hitchens' remark that the trouble with religious moderates is that they have too rarely stood up against the abuses of the demagogues within their traditions is something I have tried to take to heart.

Now, back to the matter of what some call a victimhood culture: the trouble with it, which authors like Friedersdorf have addressed, is that the idiom of this culture appropriates for individuals narratives of group victimhood that can essentially be invoked by anyone.  I made a long-form case that Mark Driscoll has been able to appropriate the first-person industrial complex and the power of an emotionally charged narrative to make use of the victimhood culture.  I also pointed out that one of Mark Driscoll's gambits in asserting and implicitly defending some of his most tendentious readings of biblical texts has been to wrap his interpretation so tightly inside stories about himself and his children that in order to contest his interpretation you have to either drill down into exegetical minutiae that bore non-scholars or directly attack the mercenary deployment of his children in the narrative, in a way that is almost pre-built to make you seem like a jerk.

http://wenatcheethehatchet.blogspot.com/2012/10/esther-as-godless-woman-then-christ_4449.html

I don't think it's really possible to over-emphasize the need for an analytic approach that explores how these narratives are constructed and the defensive and offensive uses to which they can be put in polemical contexts.  Mark Driscoll proved good at this for a time; the controversies that he did and did not directly engage in 2012 could be particularly instructive here.  From what I've read about the Elizabeth Holmes story of rise and fall, there may be a comparable eagerness for the star to rush to discuss any controversy that deals with the persona and its narrative, and a reticence to discuss the brass tacks of the actual content under contention. 

Leonard Meyer's postlude to Music, the Arts, and Ideas suggested that the future is no longer a source of shared optimism but that we still have access to the past; this past, however, would not be historical research so much as histories of ethno-mythic fabrication.  As groups consolidated their respective identities, they would formulate mythologies of heroes and villains by inventing selective histories, not to discover history as such but to formulate histories with the aim of confirming existing prejudices and aims. 

This would not just be seen in American Christian fundamentalists fabricating a set of Founding Fathers who were all Trinitarian Protestant Christians; it could also be found in attempts to arrive at mythic American Indians who had multiple categories for sexuality and gender, while ignoring altogether the practices of slavery and the caste systems in place.  These propagandistic histories are not necessarily pure fiction so much as polemics that take the useful "tithe" of historically verifiable elements and present them as though they were the sum of the history.  That the Founding Fathers were more Christian in some generic sense than a Richard Dawkins of today is not that hard to propose, just as more flexible categories of sexuality in Native American groups compared to Victorian white Christians aren't that hard to establish, either.  The problem is that left and right alike are more attentive to the propagandistic fabrications of "them" than to the swiftness with which "we" embrace these kinds of ad hoc, just-so historical narratives.  The propensity for people left and right to take refuge in histories manufactured for the sake of promulgating contemporary political agendas is just the human condition, as Roberts so generally noted.