Saturday, September 08, 2018

incubation phase for some musical posting

yes ... regular/loyal readers know how overdue I am to blog about a couple of musical works, I figure.

I do plan to get to that but I have to set some time aside for intensive score study.  A thoroughly conventional, not-music-related day job and other obligations have slowed things down on that front.  Plus ... I do compose music, and this year has been a productive one on the compositional side of things.

another variant on the decline of American prestige and influence via The Atlantic

Now, to be sure, the proposal that the United States has been in decline is an old idea.  Nobody could campaign on "Make America Great Again" without the supposition of an American decline.  Depending on whom you read, the very idea of American decline seemed hard to take seriously in the present ... if the present were 2012.

A mere one president later ...

Commentators are once more worrying over America’s waning preeminence. A New Yorker headline in January suggested Donald Trump was “Making China Great Again.” When the president withdrew from the Iran deal in May, Washington Post columnist Anne Applebaum lamented that “the era of American hegemony” had been “remarkably brief.” And just last week, reflecting on the G7 summit, the New York Times exhorted Americans “to recognize that this president has transformed ‘America First’ into ‘America Alone,’ and that this is the last place that a great and powerful nation wants to be.”

 In Nairobi National Park, a succession of concrete piers rises over the heads of rhinos and giraffes, part of a $13.8 billion rail project that will link Kenya’s capital with the Indian Ocean. It’s a project with the ambition and scale of global leadership, and the site safety posters are in the language of its engineers and builders: Chinese. 

Four hundred miles further north, in one of Kenya’s city-sized refugee camps, there’s another sign of what global leadership used to look like: sacks of split peas, stamped USAID; a handful of young, quiet Americans working on idealistic development projects. I saw both this month, but one already looks like a relic of the past. The baton of global leadership is being passed from the U.S. to China.

In Africa, the evidence is everywhere. China will put nearly $90 billion into the continent this year, the United States nothing close. China is betting big on economic partnerships and dependencies along its new Silk Road, christened “One Belt, One Road.” The U.S., meanwhile, spends many of its dollars on expensive wars, to the detriment of soft-power projects like USAID, or domestic welfare programs like Medicaid.

America’s global influence is certain to decline relatively in the years ahead; it is the inevitable consequence of the return of the Middle Kingdom. As that happens, the U.S. should be more deliberate about the policy choices it makes. It’s a lesson I’ve seen my own country—which was once an empire, too—learn the hard way. On the way down from global hegemony, Britain came around too slowly to investing in domestic welfare. The U.S. should apply those lessons sooner.

The time is ripe. Its 45th president swung to power on the backs of voters worn out by the burden of expensive wars, tired of wartime austerity, and fed up with rising inequality. America has spent nearly $6 trillion on sustaining long-running conflicts in Afghanistan and Iraq. Median wages haven’t gone up in decades. Its health-care inequality is a byword in failure, infant mortality barely better than that of developing countries, and some states’ death rates are soaring because of “diseases of despair.”
It’s clear that many voters gave up on the American empire. When they voted in 2016, they didn’t care for the international institutions the U.S. had so carefully constructed after World War II: NATO; the United Nations; the World Bank. They didn’t care for their country to protect the liberal world order, to lead the “Free World.” Voters on the left and the right showed their readiness for a policy turn inwards. They wanted a country focused on domestic policies. (These are my own views, and not those of my organization.)

A similar thing happened in Britain after World War II. In 1945, the Labour leader Clement Attlee campaigned on bettering the lives of Britons at the bottom. He promised welfare over warfare: a national health service, social security, public housing. It won him the election, scoring an upset win against the man who had just brought Britain its finest victory in a global war, Winston Churchill.

But in the tumultuous years that followed, Attlee wasn’t able or willing to fully scale down spending on the army and the Empire. When Churchill came back after him in 1951, India and other colonies had already won their independence, but the over-spending on foreign intervention and the military remained. The result was a delay of the inevitable decline of the Empire, but also a half-baked welfare state, which couldn’t provide for its citizens the promises that Attlee envisaged.

During a series of international conflicts from the early 1950s to the early 1970s, Britain continued to lose not only territory in Africa, the Middle East, and Asia, but also vast amounts of money and human capital, which could have otherwise been deployed to the betterment of its people. In Cyprus, Kenya, Oman, Yemen, the Suez Canal, the British possessions in Southeast Asia, and elsewhere, Britain spent vast amounts in a futile effort to retain some of its imperial power.

I don’t long for the days of the British Empire. My family spent its vainglorious reign digging ironstone from the ground. The imperial sun never shone down the mine shafts of northeast England. But I know the end of the Empire did not mean the end of Britain, or that of the wellbeing of its citizens. Quite the contrary: the Britain I grew up in provided me and my family with educational opportunities and health care we’d never have known had Britain not attempted to build a welfare state at home.

Those of us on the global sidelines, America’s anxious auxiliaries, know a collapse in the instruments of a nation’s power when it happens. In Britain that collapse was precipitated by the left’s loathing of imperialism. In the United States, it has come from the right’s loathing of “globalism.”
America remains a global power, but in the world’s capitals, policymakers are now puzzling out which alliances and organizations will shape the future. Entropy rules over empire.

President Trump posed in Churchill’s armchair on a recent visit to Britain. A bust of Winston Churchill sits once more in the Oval Office. But in terms of America’s position on the world stage, Trump’s legacy may more resemble the one that Attlee set in motion. And Attlee is remembered and respected today not for an empire lost, but for a welfare state founded.

Now, based on things written by people I know to have supported Trump as president, his willingness to openly and defiantly turn his back on the "postwar order" of a pax Americana formulated in the wake of World War II is one of the things they like most about him.  Trump supporters see him as someone willing to say openly what blue-state voters and even progressives might have been willing to say so long as it didn't involve any policy-impacting changes: that the era of American colonialism and imperialism doesn't seem worth it.  It's just that such claims were probably not supposed to be made by someone like Trump, who ran on a GOP rather than a DNC ticket.  

But what that sort of repudiation would entail is a voluntaristic decline in prestige, influence and power-brokering status.  At the level of what a less imperialist United States would actually mean, it would include (among other things) less clout, less prestige, and a weaker connection to the international order that America in so many ways functionally shaped in the post-World War II period.   One of the things that may separate leftists from the traditionally liberal is whether this declinist take on American influence could be considered acceptable or an unmitigated disaster.  Ironically, people on the old left and the old right might have some overlap in proposing that the age of American colonialist/imperialist interventionism should end. 

Where xenophobic panic seems to kick in is when American journalists and pundits become afraid that Russia hacked the election (as in actually did so effectively, as distinct from tried to do something) or ... that China did.  With the Cold War legacy being what it was, it is in some ways a lot easier to suspect the Russians than the Chinese, though that sort of script most likely has a lazy inertia to it.  If it were a matter of the sheer number of ships, a navy-to-navy battle between the United States and China would at this point probably not go in our favor on purely numeric grounds, but it doesn't seem as though Chinese naval power can be projected at anywhere close to the level that United States naval power can be.  

But that's the thing about these administrative flip-flops: it has begun to seem as though red and blue partisans think America is in a dangerous decline because the other team won the executive branch.  It makes a kind of emotional sense for those to whom red-state and blue-state partisanship is a religious conviction, but for those of us who are not in that sort of camp it seems as paranoid as the 1980s-era ravings of some Hal Lindsey-style eschatology guru.  For a whole lot of Americans the impact is going to be gradual and indirect.  

Sometimes the panic manifests in sloganeering that can seem wildly racist if you stop and think about it.  I saw some non-white people writing after Trump won that we should remember to set back the calendars fifty years.  That seemed a bit ... melancholic ... but depending on what your skin color is, the rhetoric made a certain kind of sense.  Some white progressives said that we needed to set the clock or calendar back five hundred years.  It wasn't until as recently as ... 2004 that the American Indian Probate Reform Act became effective.  If you want some downbeat reading material ... go read up on that.  

Oh ... well ... that'd be great for Native Americans, who would suddenly no longer have been largely wiped out by epidemics of disease!  Since half my lineage is Native American, setting the calendar back five hundred years would be GREAT for magically bringing back more Native Americans.  The implicit assumption that setting the calendar back five hundred years has to be bad is not a given.  We'd have a world that was not reliant on fossil fuels to power our entire global technocratic economic system and, like I wrote earlier, there'd be a lot more Native Americans.   There's a temptation toward the total abjection of the past, or of any life that isn't mediated in contemporary Western technological terms.  Is human life worth living without access to artificial light and indoor plumbing?  Most definitely.  Is life worth living if you can't read any books?  Well, I'd struggle imagining that one for myself but ... yes.  

A world in which America isn't the uncontested leader of the so-called free world is still the world, and arguably still a world worth living in. 

But that hardly means that whoever takes up the mantle of leader of the world is going to be any better.  However bad the United States can often be, it's not at all clear that whichever power rises to be the next "leader of the world" will be an improvement.  

links for the weekend

animal preservation can turn out to be about more than just habitat preservation; cumulative knowledge is a variable, according to an article in The Atlantic about ungulate migratory patterns

Ecologists have long speculated that ungulates—hooved animals like deer, bison, and sheep—also learn to migrate, since many species seem to adopt the movement patterns of their mothers and peers. By studying the translocated bighorns, using data gleaned from their collars, Kauffman’s team has finally confirmed this longstanding assumption.

To an extent, ungulates can find emerging greenery through local smells and sights. “But they also possess excellent spatial memory,” says Jesmer. “They can remember when a path greened up and time their movements to go to that area the next spring.” Their mental maps are the foundations of migrations. They’re the difference between an animal that’s just going after nearby shoots, and one that’s moving long distances across the terrain in anticipation of greenery that it knows will arrive.

That knowledge takes time to accrue, which the team showed by studying both the bighorns and five groups of translocated moose. The more time these animals spent in a new place, the better their surfing ability was, and the more likely they were to migrate. Jesmer thinks this process likely occurs over generations: Individuals learn to move through the world by following their mothers, and then augment that inherited know-how with their own experiences. “Each generation, you get this incremental increase in knowledge,” Jesmer says. For sheep, he says, learning how to effectively exploit their environment takes around 50 to 60 years. Moose need closer to a century.

That knowledge allows the animals to find plants early, when they’re young, tender, and more easily digested. And by eating high-quality plants, they can more easily pack on the fat and protein that gets them through harsh winters. “When they lose that knowledge, their populations will suffer,” says Jesmer.

Wildlife conservation isn’t just about raising the numbers on a population count. It’s also an act of cultural preservation. When rangers stop poachers from killing an elephant matriarch, they’re also saving her memories. When conservationists preserve routes over which bighorn sheep can travel, they’re keeping the animals’ traditional knowledge alive for future generations.

over at The New Republic, Josephine Livingstone sounded off on the New Yorker/Steve Bannon flare-up.

The New Yorker announced this Labor Day that Steve Bannon—the architect of Donald Trump’s ethno-nationalist campaign—would appear as a headline guest at its October festival, to be interviewed by editor David Remnick. Later that day, Remnick rescinded Bannon’s invitation in a memo circulated to staff. Between these announcements a streak of rage burned across Twitter, resulting in the withdrawal of several celebrity guests from the festival.
All this happened in a single day, on the internet, and then it was done. Was this just a flurry of nonsense on a sleepy summer’s holiday, or was this actual lightning hitting the ground? Twitter is a repository for the real opinions of real people, but it is also a virtual space that exists in parallel to reality traditionally conceived. It’s governed by its own strange weather. But in this case the online storm pointed to factors that exist outside the online discourse, including a growing distaste for the media-political bubble in which people like Remnick and Bannon live.

But an interview does not equal endorsement, he insisted. Bannon has historical significance, since he helped Trump get elected: The New Yorker is “hardly pulling him out of obscurity,” Remnick noted. He compared his proposed interview to Dick Cavett interviewing Lester Maddox and George Wallace, and Oriana Fallaci meeting with Henry Kissinger and Ayatollah Khomeini. Still, he acknowledged that “many of our readers, including some colleagues, have said that the Festival is different, a different kind of forum.” He eventually concluded that a written profile would be a more appropriate treatment for this important, though awful, man.
For his part, Bannon has explained that he accepted the invitation because he “would be facing one of the most fearless journalists of his generation.” He later called Remnick “gutless” for cowing to the “howling online mob.”
However, this framing of the Festival obscures certain stakes at play. First up, the money. Events are a great way for magazines to make money, especially in an era of declining ad sales. Lots of publications hold charity-style benefit dinners and forums where guests bat around “ideas.” An evening with Jack Antonoff at the New Yorker Festival, including a live concert and interview, will set you back $177. A Haruki Murakami event with fiction editor Deborah Treisman costs the same. In a 2014 article on the Festival at the business site BizBash, Rhonda Sherman, the magazine’s director of editorial promotion, said, “The New Yorker simply would not put on the New Yorker Festival if it were not profitable.”
Fundraising is a necessary part of the magazine publishing machine, and nobody could blame The New Yorker for wanting to generate cash. But it also means that the invitation to Bannon didn’t come from a place of editorial purity—from a desire simply to interrogate him. This is not to say that Remnick solicited Bannon with the cynical intention of extracting cash from curious punters. But it does mean that the reverberations of Bannon’s appearance would have been felt in the magazine’s coffers.
The second factor obscured by the cloud of indignation concerns cultural, rather than literal, capital. David Remnick and Steve Bannon are captains of two different elites. Remnick heads The New Yorker, which nestles atop the American pyramid of intellectual prestige. Bannon helped to turn Donald Trump—denizen of reality television, the dark mirror to journalistic high-mindedness—into the most powerful man in the world. They are like prefects of different boarding school houses. Each derives part of his power by opposing the other.
Last year, Digiday reported that The New Yorker’s opposition to Trump led to a boom in subscriptions. Subscribing to the magazine, which often features caricatures of Trump on its cover, represents to some readers an act of resistance. The New Yorker’s unabashed intellectualism, commitment to deep inquiry, and skepticism of conservative politics is the kind of bandwagon decent liberals want to get on.
For his part, Bannon referred to the media as “the opposition party” at the 2017 Conservative Political Action Conference. The press are, Bannon said, “corporatist globalist media that are adamantly opposed to a economic nationalist agenda like Donald Trump has.” In the months since that CPAC appearance, Trump has sculpted his hatred for the media into an ideological issue that pits his supporters against all those who speak with journalistic authority. Bannon lies at the origin of this bit of propaganda.
The proposed meeting between Remnick and Bannon thus represented much more than the political conundrum about “platforming” odious people. It would have seen two public figures at the pinnacle of their respective clans, coming together to create a spectacle that would generate money for Remnick’s magazine and a mixture of prestige and notoriety for Bannon. The merit of the event’s content (whatever it would have been—we’ll never know) need barely come into it. The interview was compromised from the start.
A potentially quotidian observation, perhaps: people who make their livings in the press, pulling an interview that amounts to a ... publicity stunt, don't come across as the most convincing sorts when appealing to principle.  Not that the point can't or shouldn't be made, of course.  

Scott Timberg has been writing about the demise of the alternative/weekly press since the news that The Village Voice was shutting down came up.

The alt-weekly papers have been closing steadily over the last ten years, it seems, although up here in Puget Sound The Stranger (for worse and better) is still going.  The alternative press doesn't always seem like a very accurate or plausible term for whatever that branch of the press is supposed to be.  I say that with The Stranger in particular in mind because over the last twenty years it began to seem that men like Dan Savage and Mark Driscoll have vastly more in common with each other than whatever they think separates them.  Yes, one is ostensibly more blue state and the other ostensibly more red state, but as their respective histories of over-the-top punditry in Puget Sound suggested to me, guys who can't resist telling people how to get off and why they should can show up across the entire political or religious/areligious spectrum.  
And as I was gloomily musing back in 2016, an alternative press that was as shocked by the Trump victory as the mainstream press may have been equally a failure at doing its job.  The future of journalistic practice may be no safer in what's left of the alternative press than in the mainstream press.  There was a review of Timberg's book at Arts Fuse a few years back; the author pointed out that Timberg came from a line of writers and journalists and teachers, and that there's an irony to his not realizing the ways in which the shifts in economics and policy that gutted the midwestern industrial job base would have a trickle-down effect on arts coverage.  
I could translate it a different way: middle-aged guys who remember the 1990s as being great were young enough then not to need medical coverage or stable work, and so didn't have to think about how the dot-com era bubble was not all that great for people looking to get into more traditional work.  Journalism jobs were already drying up and withering away in that period, and if Timberg had already landed his sweet gigs at regional papers it wouldn't necessarily have felt like things were declining back then; now he likely has a clearer sense of the downward spiral.
Yet ... as I guess I have been establishing at length at this blog, you can do journalistic blogging as the need arises.  You can do a lot of writing at a blog about the arts.  The kicker, however, is that I do all this on my own time and don't make money at it ... nor am I necessarily saying I want to make money off this blog.  I've refused to monetize this blog for a variety of reasons.  
Someone once told me that the downside for blogs and bloggers is that the institutional press basically only takes itself seriously.  The election of Trump could have caused an existential crisis for the credibility of the institutional press, which alternately either didn't see that victory coming or leveraged it for consumer cultivation work (i.e. The New Yorker seeing a subscription boom).
But the Frankfurt school didn't predict Trump would win. That's a fabrication of the sort that shows up in the "here's my book report on an old book" style of journalism.  I've seen it enough in evangelical and conservative Protestant writing to have some idea what it looks like in a magazine like The New Yorker.  By all means read Adorno, though, if you can navigate his style.  
Over at The Baffler there's a perhaps predictably cynical take on the Kaepernick ad with Nike.
The ad itself is a fascinating piece of communication whose implications speak volumes. It’s spare—a black and white photograph of Kaepernick’s face emblazoned with the copy “Believe in something. Even if it means sacrificing everything.” Kaepernick’s mere image alongside what is otherwise fairly boilerplate Nike-speak in the “Just Do It” vein is catnip to his supporters and an affront to conservatives. There is, at present, no reason for any company to endorse him as an athlete, which means that Nike (which has had him under contract all along) is forking over a hefty payday, a shoe, and potentially a line of apparel to someone on the basis of his activism. In the most simplistic branding terms, this decision means that social justice work is good, and its critics are therefore bad. Nike has trained the spotlight on Kaepernick when it could’ve easily remained silent.

But it’s just as instructive to look at what the ad didn’t say. It cosigns the Nike brand to Kaepernick’s determination and integrity, not the substance of his “something”—which, by his own admission, evolved over time as he gained a more sophisticated understanding of politics and activism. His message, which is perhaps best described as an inchoate structural critique of racist violence, is wholly absent; we have to settle for generic motivational copy that could easily apply to sports, or any other demanding endeavor off the field. It is impossible to agree or disagree with the ad. Nike pointedly does not decry white supremacy, police violence, the carceral state, or environmental racism—all themes Kaepernick has touched on via his public statements and charitable work. Much like the “Equality” campaign from last year or the much-praised utterances of LeBron James, its premier athlete, Nike here demonstrated clear limits to just how far it is willing to go.

Viewed in the context of the charged psychic minefield of brand symbolism, the embrace of the Kaepernick ad as an unconditional triumph is a gesture of self-preservation. The current state of debate surrounding putative loyalty to the national anthem and the NFL—both patriotic brands cultivating a similarly charged sort of signification among a very different consumer demographic—requires us to interpret the Nike-branded message as a token of progress because otherwise we would have to admit how cut off we are from any real version of dissent or meaningful opposition. Our own capacity to trust Nike belies an underlying sickness that we would rather not address. That we are okay with a politics mediated by brands puts the onus on us—which is to say, where it should ultimately belong. Unless Nike stuns everyone by expanding its partnership with Kaepernick to the point of adopting his worldview to influence corporate practices, we should view these efforts neutrally. Having Kaepernick around is good for the discourse; but our own ready inclination to pat Nike on the back for the culture-war troubles it’s now fending off largely by design points to some disquieting truths about ourselves.
Being pro-Kaepernick doesn’t require you to be anti-capitalism. Nor does seeing value in the ad make you a sinister sell-out. Ideally, though, the ad’s appearance can serve as a teachable moment, burnishing Nike’s and Kaepernick’s respective brands while highlighting the consumer psychology at work in establishing and cultivating our loyalty to consumer brands: their agendas, their putative virtues, or their capacity for political action. Corporations wield real power. But brands are a figment that we feed every day—and if we ever plan to reckon with them, we must also truly reckon with ourselves.

Had that been an article from what is colloquially known as the alt-right, or from a conservative publication, the commentary might have opened with "the virtue signalling is strong with this one."  That might even be true ... although, as that goes, virtue signalling is so strong among those who call out virtue signalling that it's not like there are any "good guys" on that front.  Jesus taught against doing good deeds to gain the adulation of people, and people have been finding loopholes in that instruction ever since he taught it.  

But ... it is important to consider your brand loyalties, what the brands actually "say," and what your loyalty to a brand says about you.  As a former Mars Hill member who decided to leave, one of the key realizations I had about what I believed (and I would say I'm in many respects a stick-in-the-mud, moderately conservative evangelical sort) is that there was basically nothing about what I believed that had to be realized by way of membership at Mars Hill.  To put it in market terms, I began to feel that Mars Hill had stopped being about the product and had become altogether more about the branding, and that what had begun as a "we," a Christian community exploring what city life could be like, had morphed into Mark Driscoll's "my story" as a synecdoche for an entire community.  

Over at ArtsFuse I saw a piece about a book on a connection between what's known as neoliberalism and a neoclassical tendency in jazz.

It's probably going to be stuck on my "to get to ... maybe" list.  I still have a book by Ephraim Radner I haven't even started yet, and there's no such thing as a "fast" Adorno reading program.  I did finish reading a certain book by Joseph Campbell, though, and it was one of the lamer books I've read through, but I'm waiting to write about that until some other time. 

at LitHub Kevin Young writes on how plagiarists think and ... throws out a few axioms I'm not sure I take at face value

This is the first defense of the plagiarist: I only did it once, and by accident. The second defense is plagiarized from the first: I only did it once. Yet as Mallon reminds us, "Plagiarism is something people may do for a variety of reasons but almost always something they do more than once." The "unconscious stealing plea" goes hand in hand with the idea of only doing it once--not simply that I did it just that one time, but rather, in that one instance too, I was so unconscious as to be blameless.

To rewatch the episode now is to see the hoaxer plead innocence in a way familiar enough that it may seem plagiarized from some clichéd script. Indeed, all this was happening within months of James Frey, "Nasdijj," and JT LeRoy implosions. Where the fake memoirist plays at suffering, the plagiarist, like the impostor, often performs innocence. In Viswanathan’s case this doesn’t mean just "not guilty" of the charges before her, but innocence as a permanent state--one feminine, youthful, American. Such enforced innocence--gendered and often raced--may explain why, in the press material of the young author, she is regularly referred to as a "starlet," a term usually reserved for cinema. No matter her actual age, a starlet performs youthfulness--and matching beauty. Such youthfulness quickly if quietly signifies newness, freshness, and originality in turn, an approachable prodigy.

The early notices and prepress frame Viswanathan as a "girl wonder" a century after the iconic figure’s heyday. "A clever novel by a promising author. . . one of the hottest young talents in fiction," says the Boston Globe.  There’s a sense too of the author as somehow a new invention: the "young adult" (or YA as it’s known) Indian author; or more exactly, the Indian YA one. But by far the biggest suggestion of all is that the fictional Opal is true to life, a double who’s her and not-her. Opal Mehta is plagiarized from Viswanathan.

I don’t mean to substitute Freud’s couch for the Today show’s, yet we must be able to see the ways our culture’s cult of innocence and youth is also the culture of plagiarism. Newness at all costs yields pressure not just on the potential author but also on the culture that cannot be honest about its recycling, much less its trash.


Now that's an interesting proposal, if a deliberately provocative one--that our cultural cult of innocence and youth is also a culture of plagiarism. Perhaps in traditional and parochial societies the way to be wise beyond your years was to simply go to all the people who were considered wise, find out what you could from them, and share that wisdom as seemed fitting.  Ancient societies had wisdom traditions and so, for instance, in Jewish literature we have the book of Proverbs (which is necessarily best appreciated in conjunction with the Book of Job and the Book of Ecclesiastes and a few other books ... ).  Proverbs was robust enough a book that Christians didn't see much need to formulate a new non-Jewish wisdom literature and so they canonized it. 

But in our era and place to be wise beyond your years might also bring with it some obligation to not show your debts. 

But why can't this culture be honest about its recycling?  Copyright law?  That might be a variable, but I've written in the past about how we could look at Generation X as a case study in a generation that had to participate in the arts world in the wake of modifications to copyright law that extended effective copyright terms.  If you don't mind working on and with a lot of properties that are work-for-hire and trademarked by corporations, you can get a Christopher Nolan making some, I think, pretty solid Batman movies.  But that entails being willing to play with the toys in the toybox that someone hands to you, and not everyone is willing to do that.  That may be a variable that isn't being considered for a LitHub article, because the rules for YA and higher-brow literature are not the same as they would be for, say, comics.  That's something to keep in mind as an article like this one moves along. 

To plagiarize another is to steal a bit of that person’s soul--almost as much as soul music was stolen. It is also to steal, in other words, a culture. Whether we think plagiarism a crime--the only crime in literature to some--or at best a minor offense, all depends on if we’re the ones being plagiarized. Or on whether we see writing as work. But why would Viswanathan steal her own culture? Why plagiarize in spirit and image, if not words, exactly those things that India-born Viswanathan may have actually experienced? It’s a puzzle--she seems to do so for ease, of course, but also perhaps as a way of becoming American, claiming an Americanness that, the book’s plot seems to suggest, is always at a remove. There’s a sense that Opal Mehta is somehow living a plagiarized existence, trying to own what doesn’t belong to her: Harvard, full-blooded humanity, a "life."

Opal Mehta must become a type for Viswanathan to live.

I have said plagiarism is about class, but it’s really race disguised as class. This is true of Viswanathan--not to mention Opal Mehta--who to some may come to represent the fast track that Harvard, or getting into it, has come to mean. When I was in school there, the saying went that the hardest thing about Harvard was getting in; if at all true then, getting in has become only tougher, with just over 2000 students accepted from a pool of more than 34,000 applicants for the class of 2016.

Eh ... I don't know.  Race disguised as class seems like it might not be far enough of a step; race disguised as class could still be a way to disguise the class issue.  To put it another way, when I read essays by contributors at NewMusicBox who talk about how post-genre we are, sometimes those authors talk at considerable length about how just not-white they are.  The fact that they made it into prestigious schools is secondary, or not something to be mentioned at all until maybe the biographical note at the bottom of the page.  There can be a set of norms and terms that can be construed as indicating class.  To pick a simple example from the recent film adaptation of Crazy Rich Asians, Rachel asks her boyfriend just how much money his family has. When he says, to the effect, "we live comfortably," Rachel smiles and says that she has learned that any time anyone says "we live comfortably" that's a euphemism for being wealthy.  There have been quips and jokes to the effect that the aristocrats of America aren't exactly like the aristocrats of Europe; they tend not to want to call attention to their aristocratic rank out on the street most of the time.  Perhaps I could even joke that aristocratic bearing is something to be saved for flaunting on social media, however aristocratic rank in social media terms may be defined. 

If anything in the last ten years I have begun to wonder whether a focus on race has been taken up so as to avoid class considerations altogether.  Academics have a capacity to focus on the "one percent" or the "elite" being a strictly financial elite and thereby avoid any consideration that they, too, are part of what could be called a ruling caste.  Or that's how it plays until it's time to fret about American anti-intellectualism as if it couldn't possibly be construed in class conflict terms but only in terms of the cognitive floor loathing and resenting the cognitive ceiling which ... can't really be construed in class terms, can it? 

If the axioms about plagiarism are applicable to Harvard then, frankly, they will have to be about class at every level--the classes you have to take if you get into a place like Harvard, the classes you have to be of or not of in order to get into Harvard, and, whatever your race, the question of whether you have enough "class" to get into Harvard ... it still seems class permeates the matters of Harvard even if race is, obviously, a significant issue.  Whether admissions policies discriminate against Asians and Asian Americans, or whether the suits related to that have some kind of alt-right deep pockets behind them (something I've seen mentioned in some coverage and editorializing in the last month or so), is a bit harder to say. 

But it seems relatively clear that if we're talking about people getting in as students to places like Harvard, class is going to be the big deal. I could see class disguised as race disguised as class explaining things ... but I'm not convinced that talking about class somehow becomes talking about race disguised as class.  Is Beyonce not one of the top-selling artists of our era?  Was Michael Jackson not the King of Pop?  Aretha Franklin was the Queen of Soul.  John McWhorter has written a few times in the last few years that, yes, blacks have some bad experiences and there's a lot that can be better, but that he's lived long enough to see that some things are actually better now, and he's frustrated that authors like Coates can literally sell a narrative that things are still as bad as they were half a century ago--there's room for more nuance. 

This next generalization from the article ... not quite so sure about that, either.
While all writers fall in love with different books and writers who influence them, few are faithful; great writers tend to go beyond a singular influence in that strange alchemy of many influences that creates originality. Plagiarists tend to be monogamous. They return to one or two texts to craft their plagiarisms.

Having, ahem, been a writer who chronicled things going on in a megachurch (Mars Hill) during a period in which its former co-founder Mark Driscoll found himself embroiled in a plagiarism controversy, it seems like a potentially safer way to articulate this proposed axiom might be to say that plagiarists tend to be drawn to a body of work that is, however large, still small enough and documentable enough for contemporary specialist scholars to track relatively quickly in our day.  If we modify the axiom to "they return to one or two bodies of literature ..." then that DOES seem like an accurate description of how plagiarists can behave. 

There are some observations about how from the 18th century on into our day the West has been fixated on originality and genius.  Yeah, yeah.  Oh, and the 18th century was rife with hoaxes.  Sure.  It's at these sorts of observations it seems useful to point out that Haydn became one of the most popular composers in the Western world during his lifetime but he did not actually publish all that much.  Robbins Landon has written that most of Haydn's works that were known during his lifetime were published by way of bootleg editions.  Now that Haydn is more of a musician's composer who isn't given much attention compared to the genius mythologies associated with Beethoven and Mozart, it's funny to think that Haydn was a formative influence on both composers (and a friend to Mozart), and that one of the most bootlegged popular musicians of two centuries ago can be a respectable subheading in shorter music history books that students might skim through to get a class over with. 

What seems different about our era is the capacity to annotate and cite, but also that we are living in an era in which so much of our popular culture is under copyright and trademark.  It's at this point that I confess to having little sympathy for the different sides of the issue.  I didn't feel all that bad for the guys who lost the "Blurred Lines" case because I don't like the song, first of all, and because the panic about whether or not you can copyright a genre seemed in bad faith.  Of course ... I am a classical guitarist and compose for that instrument, and I might do something like arrange a movement from one of Byrd's masses for solo guitar.  That's to say that to me anyone sufficiently steeped in musical literacy to strip-mine public domain literature for ideas doesn't have to worry about this or that new song being copyright protected.  If there's a reason to worry it's about stuff like corporate juggernauts attempting to have everything under some kind of access-point control.  As some writers have put it, when you have stuff on your Prime membership that streams you don't own it; you're just renting it for an indefinite period of time if you stream. 

Now maybe what plagiarists want to get that they don't have is what could be known as "authenticity".  Or maybe "authority".  The plagiarism controversy that swirled around a certain former megachurch preacher here in the Seattle area could probably be a case study of a person wanting authenticity and authority, or leveraging what was perceived as being both.  A plagiarism controversy enveloping a celebrity preacher can cast doubt on whether the authority the preacher has is second- or even third-hand, and that raises doubts about the authenticity of the instruction.  Unless, of course, you're a person who is set on rejecting ideas like "authenticity" or "authorship".  I've seen those kinds of arguments and they are largely unpersuasive because, if we live in an era of presidential ghostwriters, might we not want to be able to say that so-and-so's ideas, however good or bad we think they are, are that actual person's ideas? 

We live in the era of Trump, and in the era of Trump polemics against authorship and authenticity that may have made sense to readers of Barthes seem less germane now.  It may even make sense to people committed to what Theodore Gracyk has called "ontologically thick" music.  But I'm a classical guitarist.  Classical guitarists, pedantic obsessions with technique and tone production notwithstanding, are playing some of the most "ontologically thin" music around.  Go dig up a Sor etude and see whether you get much more than black dots on lines and spaces.  There's often very little by way of instruction in post-19th century terms as to "how" this music is supposed to sound. One of my friends joked that in newer scores you can see all this explicit detail while a Haydn string quartet will just start with "Allegro" and not much else.  The authentic master of a Beatles album and the authentic autograph of a Beethoven bagatelle are both "authentic" but the amount of information conveyed isn't the same in quality or quantity.  The former is far more specific and concrete and "ontologically thick".  In that sense we're far worse off in the age of mechanical reproduction than musicians might have been before this age. 

Why does that matter?  Well, if someone is steeped in rock or pop or hip hop or country, steeped in musical traditions in which it matters whether that country song is being played on a Telecaster or a Stratocaster in order for it to be legitimate, then making arguments against an "authenticity" paradigm predicated on technological reproduction is going to mean very close to nothing for those of us whose music education is more half and half--certainly for those of us who read scores and play with invertible counterpoint, the idea that the music is reducible to a set of timbres or a specific human voice is ludicrous.  In that sense I sometimes wonder whether the new musicology is setting itself against a conception of Western art music that has less to do with the actual scholarship and historiography of Western art music than with some ideals of authenticity in performance and conception that are part and parcel of popular music. 

On the whole the axioms presented by Young about plagiarists didn't really convince me but they are interesting to think about.  If we wanted to put forth a theory about what the plagiarist attempts to do by plagiarism, it could be an effort to leverage a halo effect.  If the author's own halo isn't powerful enough to inspire the desired sense of awe in a prospective audience then, well, there are these other authors whose work the plagiarist can appropriate so as to attain the right level of awe and gravitas.  That seems simple enough and relatively accurate to how and why plagiarists seem to plagiarize.  Plagiarists seem to aspire to an upward class mobility for which certain types of appropriation make sense.  You don't crib from people in order to ruin a literary career, even if plagiarists can ruin their literary careers ... they don't set out with social mobility going only downward in mind. 

Which is another reason why I suspect it's still more than likely about class even if some authors think it's about race disguised as class. 

from The Tablet, Jane Kallir proposes "It is a safe bet that art history's next grand narrative will not be written in the West"

Over at Tablet, which was the publication within which I saw an article about cultural appropriation as a realization of sumptuary codes that in earlier epochs would have been the purview of the aristocracy, there's a longform piece on the connection between the "art scene" and the middle class.

Historians sometimes speak of “the long 19th century”—a continuation of the superficial stability seen in the late 1800s, which in 1914 was finally shattered by World War I. Almost two decades into the 21st century, we are now experiencing a comparable breakdown of the apparent verities with which many of us grew up. The so-called postwar consensus that led to the formation of the European Union and its attendant international alliances is starting to unravel. Nativist anti-immigrant movements have gained traction in countries (including the United States) formerly considered bastions of human rights. Income inequality has risen to extremes not witnessed since the 1920s. Far from being immune to these external stressors, the art world is very much a product of larger socio-economic forces that determine what gets seen, sold and valued, aesthetically as well as monetarily. In art, the long 20th century, associated with modernism and its postmodern dénouement, has ended. The future of art will be shaped by a very different set of circumstances.

The advent of modernism coincided with Europe’s transition from rule by a land-based aristocracy to industrial capitalism. Industrialization created great extremes of wealth and poverty, but it also spawned a significant middle class. In the first half of the 20th century, the two world wars and the Great Depression leveled the playing field for this emergent class by wiping out large stores of accumulated wealth. To ameliorate capitalism’s harsher side effects and to ward off the threat of communism, governments in Europe and the United States created social safety nets, regulated industry and instituted various forms of progressive taxation. Between 1913 and 1948, income inequality dropped by 10 percent in the United States. The three decades following WWII were characterized by rapid growth and low unemployment throughout the developed world. Per capita income rose at rates unequaled before or since. Many blue-collar workers, traditional members of the proletariat, earned middle-class wages.

Modernism is inseparable from the rise of the Western middle class. In 19th-century Europe, the bourgeoisie created a vast new market for art, previously a luxury enjoyed mainly by aristocrats. Cities, especially, became cultural hubs replete with museums, galleries, concert halls, theaters and publishing houses. The direct patronage that had characterized the aristocratic age was replaced by a wider distribution system that depended on intermediaries to connect artists with consumers. Critics, art historians and curators augmented the promotional efforts of commercial art dealers by legitimizing artists and educating the public. As the middle class expanded in the second half of the 20th century, advances in mass communications further broadened the audience for art.

Although modernism was sustained, financially and intellectually, by the middle class, the European avant-garde was, from the outset, rife with what the historian Peter Gay calls “bourgeoisophobia.” The bourgeoisie were said to represent everything that was wrong with contemporary society: poor taste, superficiality, pedantry, prudery, materialism and the pursuit of profit above all else. It was up to artists to save the world from middle-class philistinism. Regardless of political orientation, partisans of the avant-garde agreed that capitalism was inimical to art. As Gay points out, the modern era was the first time in history that artists rejected their own economic support base.

At this point I interrupt the flow of the article to point out, thanks to having gone through about half a dozen books by Theodor Adorno in the last couple of years, that there were artists committed to what can be known as a post-Wagnerian art religion, in which artists were the priests and prophets of a new kind of religion of universal enlightened humanism.  The canon in Western arts that evolved in the wake of this art religion had bourgeois elements, but within the realm of the arts as practiced there was a kind of elitist/aristocratic impulse that was eager not to be too sullied with middle-class domesticity.  Artists as a group were not altogether committed to the idea of really casting off the aristocratic elements of a nascent art religion that was supposed to supplant Christendom.  But a more damning way to put this would be to say that the new liberal humanist art religion of the West simply shifted the old theological concepts of clericalism and sacralism from official religion to this new art religion.  In the sense that Adorno observed this shift, his continual lambasting of Western liberal artistic traditions as having completely assimilated into the new bourgeois order is relatively easy to understand.  Now back to the article.

For most of the 20th century, the American avant-garde shared their European colleagues’ disdain for the corrosive influence of money. Battle lines were drawn between “high” art and “kitsch,” defined by the critic Clement Greenberg as an “ersatz culture” designed to entertain the ignorant masses. Cranked out mechanically for commercial gain, kitsch included illustrations, comics, popular music and Hollywood films. Perhaps even more dangerous than such lowbrow amusements were middlebrow vehicles like The New Yorker, which Greenberg accused of watering down avant-garde material for sale to the “luxury trade.” As Russell Lynes (editor of the resoundingly middlebrow Harper’s magazine) noted in his 1949 essay, “Highbrow, Middlebrow, Lowbrow,” midcentury American intellectuals feared that the democratization of culture would precipitate a disastrous leveling of standards. The job of the highbrow, according to Lynes, was “to protect the arts from the culture mongers, and [to spit] venom at those he suspects of selling the Muses short.” Nonetheless, many of those “culture mongers” looked for guidance to the highbrows, who were distinguished from middlebrows by a fastidious devotion to “art for art’s sake” and a contempt for commerce.

I.e. "art for the sake of art" is what Adorno called the bourgeois art religion, only it might be more apt to say that this art religion was not necessarily strictly bourgeois unless we use that term in a knowingly pejorative sense that conflates the sociological patronage role of the new bourgeois with the older establishment role played by the aristocracy in arts patronage.  You have to have come across enough really old left highbrow art criticism to start appreciating these distinctions, maybe.  Another landmark essay about the emerging middlebrow would be Dwight Macdonald's "Masscult and Midcult".  Interestingly, Macdonald hated the emerging "melting pot" cultural dynamic because, as he put it, instead of getting a truly pluralistic arts scene in which the contributions of Jews, Serbs, Croats, Germans, Italians, Poles, Russians, English and so on could all be appreciated, the "melting pot" steeped and stewed everything in the arts into a kind of boiled-out gruel.  That's not literally how he put it, but when I read "Masscult and Midcult" I was struck by how specific he was about which ethnic contributions to the arts would be boiled out of the then-emerging American "melting pot" approach to culture.  There could be a case to be made that what's considered a concern about cultural appropriation, or the emergence of what conservatives resentfully call "multiculturalism", might be a kind of revenge on the "melting pot" conception of the arts that Macdonald wrote against at least half a century ago.

The art world—an amalgam of critics, art historians, curators, collectors, dealers and artists who collectively set aesthetic standards—was very much an artifact of modernism. Of course not all members of this clique were bona fide highbrows. In a capitalist society, it was impossible to avoid contact with the marketplace, and no artist really wanted to starve in a garret. Peter Gay suggests that the diatribes and manifestoes generated by the avant-garde were at least partly designed to convert a skeptical middle class into paying customers. This does not mean, however, that bourgeoisophobia was a total sham. Until the final decades of the 20th century, artists who achieved financial success risked being branded sellouts. The art world administered litmus tests to assess purity and ran interference between its anointed darlings and the commercial sphere. It took a quintessential outcast, Andy Warhol, to openly embrace the celebrity culture Clement Greenberg had reviled. When Warhol began painting dollar bills in 1962, it was a far more transgressive act than anyone today can imagine. This was the same year that Milton Friedman published Capitalism and Freedom, which subsequently established free-market ideology as governing economic practice.

So tempted to write about Jacques Ellul's The Empire of Non-Sense here, about his take on how the critical and scholarly establishments formulated a range of theories to make themselves indispensable to arts consumption while evading the reality of their dependence on the consumer castes to validate their reason for being since in the older aristocratic era of arts patronage the educational gap between producers and consumers largely didn't exist in the high arts ... but I basically already just did that so discussing the Ellul book can, once again, be saved for some other time. 

Beneath the seemingly apolitical doctrine of “art for art’s sake,” the paradigms set forth by the 20th-century art world were influenced by the power dynamics of the age. It almost goes without saying that the art world’s key players were male, white, and largely Eurocentric in their cultural orientation. Whereas art history had traditionally been written to affirm the supremacy of European achievements, the postwar American art world shifted the pinnacle in the United States. To challenge the still-dominant School of Paris, Robert Motherwell titled a 1951 exhibition of his Abstract-Expressionist cohort “The School of New York.” Building upon the theories of abstraction developed by Alfred Barr, founding director of the Museum of Modern Art, Clement Greenberg hypothesized a formalist trajectory that positioned America as rightful heir to Europe’s modernist legacy. Cultural hegemony followed global political hegemony. Even as the dominant modernist narrative was being written, there were art historians who recognized that it was inaccurate. The narrative was too focused on France, at the expense of countries like Austria, Germany, Russia, and Italy that had been sidelined by various 20th-century political events. Nor was it correct to build the narrative so exclusively around formalism; modernism was far messier, far more multifaceted than that. And then there were the many artists who were left out of the narrative entirely: women, people of color, socio-economic outliers and citizens of nations outside the Western orbit. Curators are today making valiant efforts to correct these mistakes, a goal most effectively achieved through monographic presentations or deep dives into previously overlooked cultural phenomena. To the extent that such exhibitions retain a central narrative, the story is tightly focused on a specific artist or theme.

This has begun to seem like one of the faultlines in Western arts criticism, scholarship and journalism--did the center of gravity for the arts in the West really shift away from Europe to the United States in the post-World War II period? Bear in mind the article I'm looking at is discussing the art world as the market of visual arts, plastic arts, and so on, but the question is pertinent to the arts across the board.  

It is, however, difficult to mount encyclopedic exhibitions without an overarching art-historical narrative, as is made clear by the Metropolitan Museum’s “Like Life,” from this summer. A ramble through 700 years of polychrome figurative sculpture, “Like Life” followed recent trends by aggressively breaching once sacrosanct high/low boundaries. It includes animatronic dolls, anatomical medical models, a “breathing” wax figure of “Sleeping Beauty” from Madame Tussaud’s, and an effigy of the utilitarian philosopher Jeremy Bentham constructed over his actual skeleton. Works by Jeff Koons, “Michael Jackson and Bubbles,” and “Buster Keaton,” are paired, respectively, with an elaborate Meissen porcelain tableau and a primitive 15th-century religious carving. The show might better have been titled “Looks Like.” But superficial visual similarities tell us nothing about the idiosyncratic contexts within which the works were created. Such spectacles aim to entertain at the expense of the individual artists.

That Jeff Koons even has a career has been taken by folks with liberal and conservative sympathies alike as a sign of the sorry state the Western art scene has devolved to in the last twenty years.  Skipping ahead a bit ... 

Some segments of the art world may simply have expanded beyond sustainability. According to artnet News, “The number of global [art] fairs has roughly tripled since 2005, from 68 events to somewhere between two and three hundred.” There are currently 320 biennials worldwide. In Europe, the number has risen from less than 30 in the late 1980s, to 136; in Asia, there are 82, up from around 20 in the late 1990s. Fifteen years ago, it seemed every museum was building a new wing; now overall attendance has dropped, and many institutions are grappling with budget deficits. Ten years ago, five galleries opened for every one that closed; today, for the first time in recent memory, more galleries are closing than opening. [emphasis added] Amy Cappellazzo, Sotheby’s chairman of global fine arts, has suggested that even auctions could one day become obsolete. With rampant guarantees and lone buyers bidding against the reserve, this is often already the de facto case in the upper echelons of the auction market.

The art world’s expansion over the past quarter-century was stimulated by a combination of globalization and an influx of baby-boomer consumers, then in their peak earning years. As boomers age out of the market, they are not being replaced in comparable numbers. On average, millennials earn 20 percent less than boomers did at their age. Since the 1970s, financial deregulation, supply-side tax cuts, the evisceration of labor unions, and slowing economic growth have greatly eroded middle-class incomes. [emphasis added] By 2010, 20 percent of America’s national income was going to 1 percent of the population. Thomas Piketty (whose book Capital in the Twenty-First Century is the source of most of the economic statistics cited herein) warns that, inasmuch as the rate of return on capital has historically exceeded the rate of growth, income inequality is likely to become self-perpetuating unless governments step in to reverse it. So far, despite rampant populist rhetoric, little is being done to rein in our resurgent oligarchy.

Many compare the current economic scene to the 19th century’s Gilded Age, and it is therefore hardly surprising that the art world is being overwhelmed by the superrich. To the extent that it was essentially a middle-class phenomenon, one may question whether there still is an art world. The ascetic highbrows have been replaced by “thought leaders,” who kowtow to wealth and equate the “marketplace of ideas” with the financial markets. Any pretense of a firewall between art and money has been abandoned. The roles of dealer, curator, and artist have blurred, compelling artists to promote themselves. [emphasis added] High on the food chain we see Damien Hirst collaborating with Sotheby’s and luxury mogul François Pinault; lower down, artists milk sketchy celebrity contacts on Instagram. Meanwhile, with the end of the “American Century,” nations in the Middle East and Asia are exerting more influence on the global conversation. Just as America’s Gilded Age magnates collected Italian Renaissance paintings and portraits of British aristocrats, newly minted billionaires in other parts of the world are scooping up Western masterpieces. The recently opened Louvre Abu Dhabi suggests a long-range agenda, repositioning these works in a broader context to legitimize the full panoply of world cultures. It is a safe bet that art history’s next grand narrative will not be written in the West. Things change.

While conservative polemics against multiculturalism have often seemed to camp out on the idea that all this new multicultural artistic activity pales compared to the great "universal" achievements of the Western liberal arts tradition, there's a cautionary note to consider about the nature of the art scene multiculturalism wants to play a larger part in.  If a neoliberal order has let the market define things and herd behaviors have created a bubble in the arts, then are we sure it's going to be a good thing for marginalized groups to participate in that bubble?  It's not that these groups don't deserve a chance to contribute to the arts scene out of the principle of contributing to art. No, what I'm trying to get at is that the push for that can be inferred from the traditional liberal art-religious grounds that were being formulated in the "long 19th century". The progressive case can legitimately be that if that art religion is serious then it should be open to all those groups that were marginalized by the effects of imperialism and colonialism.  And within the confines of traditional Western liberalism that makes sense; it's even a way to force the real-world caretakers of the Western liberal arts religions to be held accountable for taking those ideals seriously enough to give more artists who are not (yet) in the arts canons a hearing or a viewing.  But ... 

if we're living in a neoliberal order ... there could be a sense in which a progressive impulse for multiculturalism is trying to win a hearing on the basis of a fundamentalism of art religion that has been displaced by a more neoliberal market fundamentalism.  To use a vivid but possibly sketchy analogy, progressive multicultural impulses for more representation in the arts in a neoliberal order might be like someone who is set on the original theatrical versions of Star Wars being the "real" films when George Lucas settled in his mind that "special edition" is what he "really" wanted and that's just that.  The multicultural push may be a sound and even admirable appeal to have the game played by rules that stopped being used a generation or two ago.  

If we got more education focused on a wider conception of the arts then more art by those outside the European-white-dudes canon could get a hearing ... but if the larger arts worlds are in bubble conditions then what we might want to consider as an alternative is the possibly unappealing stance of a Paul Hindemith or a John Philip Sousa: that it's the amateurs, all the people who never make a living making art, who are the actual lifeblood of any given arts culture.  There will be vocational artists, of course, but the two composers I mentioned expressed reservations about what would happen to musical culture when technology and industrialization ran their course. They were fusty old fogeys at one level, even reactionary on some issues, but the core critique of the net and cumulative effects of market forces and education might be worth keeping in mind.  

When Hindemith bitterly quipped that all American musical education was really good for was making music teachers who made more music teachers, rather than actual musicians for whom music was part of life rather than a profession, he may have been right.  It can be easy to push for more, or a more justly defined, set of parameters for music education.  But we do have to ask why we want to educate kids to know about music and to have musical literacy.  So that it can be their day job?  If that's the case, that could be the ultimate acquiescence to what's lately called a neoliberal order.  But the idea that music is something people should be able to do even if it's only ever a "loss leader" in terms of their economic lives has never come up much when I read advocacy for arts education. Let me put it this way: back when I was at Mars Hill I sang in a choir, played music with friends and discussed music.  The earliest years of what used to be Mars Hill almost had an arts-community vibe to it.  It was not necessarily about making money making music or literature.  Sure, I have come around to having a few trenchant criticisms of what Mars Hill became, but the case study is meant to show that the church or synagogue or religious community has historically been one of the realms in which music was made.  Festivals are another.  Parties still another.  

There's an axiom in military history that warns you are likely to end up losing today's war because you're so busy preparing to fight the last one.  Arts education may have put itself in a comparable position in the last thirty years.  If that's true, and academics themselves will have to hash that out because I'm not officially an academic, then maybe a broader and more explicitly American conception of arts history is needed in an American context ... although I think the sun has probably basically set on the American era in the last twenty years ... 

For instance, at a personal level I just don't see the future of art music as being symphonic.  Even if we set aside concerns about the future of the NEA or NEH, the symphony has become relatively marginal to mainstream musical life.  Hip hop has become the highest-selling style of music and while it's not exactly my favorite style of music I don't think it's all crap the way I did thirty years ago when I was first hearing it.  I find Katy Perry annoying enough that her Millennial Whoop makes me relieved to hear ... Bobby Brown and MC Hammer, which I never, ever imagined was going to be possible.  Ed Sheeran makes me far, far more appreciative of Hall & Oates now than I ever thought I would be ... although as balladeers go even Sheeran isn't all that horrible, excluding that one terribly, terribly over-played song.

I've been thinking about arts and academics a lot because when I was younger I wanted to go into academics and get into the arts.  Over the last twenty years I've become grateful that wasn't possible.  I've also read debates back and forth about whether popular music can or should be more thoroughly integrated into musicology and music education.  The more I think about it the more it seems that if we're mostly using instruments tuned in equal temperament, why shouldn't we treat everything in the equal-tempered instrumental idiom as sharing things in common?  But ... the narratives surrounding the extra-musical or non-musical aspects of what is associated with music tend to get pride of place in debates.  That's the kind of thing I was thinking about when I wrote about how hegemony seemed to be in the eye of the complainer.
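To spell out what "sharing things in common" means here: in twelve-tone equal temperament every semitone is the same frequency ratio, so a piano, a fretted guitar, and a synthesizer all draw from one shared pitch grid regardless of genre.  A minimal sketch of that formula (the 440 Hz reference is a common convention, not a universal one):

```python
# 12-tone equal temperament (12-TET): each semitone multiplies
# frequency by the twelfth root of 2, so every key, every chord, and
# every style played on these instruments shares one pitch grid.

A4 = 440.0  # common reference pitch in Hz (a convention, not universal)

def equal_tempered_freq(semitones_from_a4: int) -> float:
    """Frequency of the pitch n equal-tempered semitones above A4."""
    return A4 * 2 ** (semitones_from_a4 / 12)

# C5 sits three semitones above A4; the octave A5 is twelve above.
c5 = equal_tempered_freq(3)    # about 523.25 Hz
a5 = equal_tempered_freq(12)   # exactly double A4, 880 Hz
```

Whether the music is a Haydn quartet or a hip hop track, once it is realized on equal-tempered instruments it lives in this same tuning system, which is the pragmatic case for studying them side by side.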

I admit to still being a little worried about paradigms I've seen like Afrological and Eurological.  I can get how and why the terms are useful to describe post-World War II concert music in the Northern Hemisphere that involves improvisation as a structural consideration or element of performance, but ... I am not sure the narrow scholastic focus is helpful.  That improvisation has permeated the Western European musical traditions is not that hard to establish.  The Baroque era was full of improvised riffs over popular ground basses, after all.  Someone has probably written a few dissertations on handlings of La folia over the last four centuries. That riff is so steeped in the guitar literature that ... I'll put it this way: I've shown a score or two to guitarists who reflexively thought I was doing some kind of variation on the ground bass La folia when what I was really doing was what I thought was an absurdly straightforward twelve-bar blues!  

I shouldn't have been surprised that the figuration for that twelve-bar blues resembled La folia, or that classical guitarists would hear that first rather than the i, iv, V chord changes outlining the twelve-bar form I was thinking of as I composed the piece.  But, at the risk of using a case study of my own musical activity, a fusion of the twelve-bar blues with an ancient and popular European ground bass is easily done, perhaps especially if you're so steeped in both musical idioms you just fuse them as a matter of course. 
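The mix-up above can be made concrete by setting the two harmonic skeletons side by side.  A minimal sketch, using the textbook Roman-numeral frames (a minor-key twelve-bar blues and the standard "later folia"; real settings vary both freely, especially at the cadences):

```python
# Textbook harmonic frame of a minor-key twelve-bar blues, one chord
# per bar. Real blues settings substitute freely (e.g. V in bar 12).
TWELVE_BAR_MINOR_BLUES = [
    "i", "i", "i", "i",
    "iv", "iv", "i", "i",
    "V", "iv", "i", "i",
]

# The standard sixteen-bar "later folia" frame, one chord per bar;
# the closing bars are often varied to cadence on i.
LA_FOLIA = [
    "i", "V", "i", "VII",
    "III", "VII", "i", "V",
    "i", "V", "i", "VII",
    "III", "VII", "i", "V",
]

# Both frames pivot on the minor tonic and its dominant, which is
# plausibly enough for a guitarist steeped in the folia literature to
# hear a folia before hearing a blues.
shared_chords = set(TWELVE_BAR_MINOR_BLUES) & set(LA_FOLIA)
```

The overlap is only i and V, but since those are the structural pillars of both frames, a minor-key blues figuration over them can read as folia variation to ears trained on the guitar repertoire.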

My impression is that what the new musicology folks would like to have happen is something like that.  The old musical canons of the West won't go away but the way we relate to them and, to put it bluntly, make use of them, will probably change.

But along the way it seems like we might also want to consider, here in the West, that there's some sense in which our collective story has been told and the sun may be setting on the pax Americana, for want of a better or more accurate phrase ... because even the bluest of blue state voters doesn't necessarily want to concede that the last century could be construed as the age of American imperialism.  It's interesting to read articles in which authors propose that whatever the futures of the arts may be they won't be written in the West, because it confirms a sense I've had over the last ten years that when it comes to the arts in the West even the most ostensibly progressive-minded people are still, in the sense that they want the history of the arts to be thought of in American or European terms, in some way reactionary.  Just as with the legacy of the European liberal arts, American arts can leverage a presumed imperial incontestability of global status that they blanch at considering if you point it out in the bluntest possible terms.  The imperium is supposed to be tacit rather than explicit, and perhaps what makes the alt-right galling to progressives is that they make the value of the imperium too explicit for blue state sympathies to stomach.  A thought for a weekend.