Saturday, August 05, 2017

links for the weekend, Atlantic Monthly pieces about men posing as women authors and women finding they'd rather work for men in corporate settings; and stuff like the distinction between mastery and brainstorming in the creative process

Over at The Atlantic there were a couple of features that stood out about men and women.  One of them was the following piece about how men have taken up feminine pseudonyms to publish novels.
Almost 10 years ago, Martyn Waites, a British crime writer, was having coffee with his editor. Waites, who was at something of a loose end project-wise, was looking for new ideas. His editor, though, was looking for a woman. Or, more specifically, a high-concept female thriller writer who could be the U.K.’s Karin Slaughter or Tess Gerritsen.
“I said I could do it,” Waites recalls. His editor was skeptical. But then Waites outlined an idea for a book based on a news story he’d once read, about a serial killer targeting pregnant women and cutting out their fetuses. The concept, he admits somewhat bashfully, was a gruesome one.
“That’s exactly what we’re looking for,” was his editor’s response.
That idea became The Surrogate, a crime thriller published in 2009, and Waites simultaneously became Tania Carver, his female alter ego. Before he started writing, he embarked on a period of research, reading novels by popular female crime writers, and made “copious notes” about their various heroes and villains. Waites was an actor before he was a writer, and “Martyn” and “Tania” soon became different personas in his head, almost like characters. He’d sit down to write as Tania and then realize the concept was much better suited to Martyn. Martyn books, he explains, “were more complex, more metaphorical. The kind of things I like in writing.” Tania books were simpler: mainstream commercial thrillers aimed at a female audience. And they rapidly became more successful than any of Waites’s previous books had been.
The case of a male author using a female pseudonym to write fiction was relatively unheard of when Tania Carver emerged, but the explosion of female-oriented crime fiction in the last five years has led to an increasing number of male authors adopting gender-neutral names to publish their work. Last month, The Wall Street Journal’s Ellen Gamerman considered the phenomenon, interviewing a number of writers who fessed up to being men: Riley Sager (Todd Ritter), A.J. Finn (Daniel Mallory), S.J. Watson (Steve Watson), J.P. Delaney (Tony Strong), S.K. Tremayne (Sean Thomas). The trend is ironic, Gamerman pointed out, because the history of fiction is littered with women writers adopting male or gender-neutral pseudonyms to get their work published, from the Brontë sisters to J.K. Rowling.

Another is about women and men in the corporate world and how women find they don't like working for women in high-powered contexts.
After 16 months, Shannon decided she’d had enough. She left for a firm with gentler hours, and later took time off to be with her young children. She now says that if she were to return to a big firm, she’d be wary of working for a woman. A woman would judge her for stepping back from the workforce, she thinks: “Women seem to cut down women.”
Her screed against the female partners surprised me, since people don’t usually rail against historically marginalized groups on the record. When I reached out to other women to ask whether they’d had similar experiences, some were appalled by the question, as though I were Phyllis Schlafly calling from beyond the grave. But then they would say things like “Well, there was this one time …” and tales of female sabotage would spill forth. As I went about my dozens of interviews, I began to feel like a priest to whom women were confessing their sins against feminism.
Their stories formed a pattern of wanton meanness. Serena Palumbo, another lawyer, told me about the time she went home to Italy to renew her visa and returned to find that a female co-worker had told their boss “that my performance had been lackluster and that I was not focused.” Katrin Park, a communications director, told me that a female former manager reacted to a minor infraction by screaming, “How can I work when you’re so incompetent?!” A friend of mine, whom I’ll call Catherine, had a boss whose tone grew witheringly harsh just a few months into her job at a nonprofit. “This is a perfect example of how you run forward thoughtlessly, with no regard to anything I am saying,” the woman said in one email, before exploding at Catherine in all caps. Many women told me that men had undermined them as well, but it somehow felt different—worse—when it happened at the hands of a woman, a supposed ally.
Even a woman who had given my own career a boost joined the chorus. Susannah Breslin, a writer based in Florida, yanked me out of obscurity years ago by promoting my work on her blog. So I was a bit stunned when, for this story, she told me that she divides her past female managers into “Dragon Ladies” and “Softies Who Nice Their Way Upwards.” She’d rather work for men because, she says, they’re more forthright. “With women, I’m partly being judged on my abilities and partly being judged on whether or not I’m ‘a friend,’ or ‘nice,’ or ‘fun,’” she told me. “That’s some playground BS.”
Other women I interviewed, meanwhile, admitted that they had been tempted to snatch the Aeron chair out from under a female colleague. At a women’s networking happy hour, I met Abigail, a young financial controller at a consulting company who once caught herself resenting a co-worker for taking six weeks of maternity leave. “I consider myself very pro-woman and feminist,” Abigail said. Nevertheless, she confessed, “if I wasn’t so mindful of my reaction, I could have been like, ‘Maybe we should try to find a way to fire her.’” [emphasis added]

It's fascinating how Phyllis Schlafly seems implicitly not a part of the sisterhood the article implies. Women can be described as a historically marginalized group, but also as a group in which the late Phyllis Schlafly might not have been a welcome participant. This kind of writing reminded me of Hanna Rosin's proposal a few years back that what many women writers regard as the sisterhood might more properly be described as rich white ladies who can afford to live in New York on the income they make as writers, and that the actual sum of women the world over does not really reflect this sisterhood. The odds of a woman like Margaret Thatcher being celebrated by authors at Slate's Double X seem comfortably close to zero, for instance.

One of the things social psychologist Roy Baumeister has written about in his book Is There Anything Good About Men? is that male social systems tend to develop in ways that ensure the membership or participation of any one male in the system is contingent. To put it in blunter, practical terms: unless you can prove there's a good reason to keep you on the team and that you fulfill the requisites of participation, there's no good reason to keep you around. A tension between this and women's participation at the higher levels of the corporate world may be that while women want to contribute at the higher levels of corporate life at which society is often guided, the competition and viciousness of that world is often alienating.

It's possible that what this article circles around but doesn't get at directly is that the social dynamics inside a corporate context call for a type of socialization that requires brutal pragmatism, for better and worse, and that perhaps women are socialized to refuse the kind of compartmentalization that men are socialized to accept within post-industrial Western societies. It involves a gender stereotype, yes, but the stereotype is that men can hate you at a personal level and still work with you, in the grudging concession that you're good enough at what you do to be the best person for the job, personal animosities notwithstanding. Perhaps what women have been observing is that in a corporate setting where they work for women this capacity to differentiate is arrived at with greater difficulty or, to go by the sum of the complaints, never arrived at when the aggressive supervisor is female, the proverbial Dragon Lady.

That conflation of professional and personal socialization may not really be "playground BS"; it may simply be how women in Western contexts have been socialized from birth.

There may be a flip side to this in the male stereotype, the guy who can cut you loose because you're just not getting the job done but can recognize that you're a nice enough person who would be a promising employee in another context at some other company. 

Of course ... there's also the proverbial good old boy network, in which men who by outsider standards would be regarded as completely unfit for a job, and not even competent, are granted huge levels of power, access, and privilege because the right person knows them and decides to give them a job. While it's possible to point to examples of this sort of corporate culture, it can also exist conspicuously in non-profit settings and can be just as bad--after all, it's not as though no one these days can think of churches where sub-par employees held influence and power within an organization exceeding their provable abilities because someone upstairs decided they were on mission. It seems safe to guess that there's nothing about being male or female that exempts managers from engaging in nepotism, cronyism, and good-old-boy networking to the detriment of an organization.

Over at Quartz, a little piece proposes that pursuing creativity as a goal in itself is very likely a waste of time compared to simply (irony alert) acquiring a comprehensive mastery of the field you're working in.

Why mastery beats creativity—every time
The idea that comes out of nowhere. The eureka moment. If we could figure out how to get there faster and automate the process, humankind would be forever changed, right? This is something we can’t stop obsessing about as a society—but maybe we’re thinking too hard about it.
Two years ago, the New York Times reported on a whimsical new trend on college campuses: studying creativity itself. Schools were suddenly offering minors in creative thinking and asking their students to problem-solve for problem-solving’s sake. The classes seemed to make the students more confident, and had benefits that were tangible if slight: one student figured out a quicker way to re-shelve DVDs at his library job.
And yet this worship of creativity has haunted me since I first read the article. Aren’t we thinking of “creativity” too broadly here? Is it truly something we can study on its own, divorced from the problems and distractions and flash cards of the real world?
So yes, a creative studies minor can be useful for the first part of “being creative”—the convergent phase. But when it comes to the divergent phase, learning to be broadly “creative” isn’t enough. In the divergent phase, where you generate ideas in the first place, existing knowledge is incredibly important. In other words, “being creative” starts to depend heavily on what we already know.
This prior knowledge of a system or field may be the most important aspect of “creativity”—much more so than convergent thinking.
Some of the most compelling experimental evidence describing brain activity patterns during the “divergent” phase of a creative task implicates the medial temporal lobe and hippocampus, which is the part of the brain that humans use when making, storing, and accessing memories—and the hippocampus lights up like a firecracker during memory recall. Evidence of hippocampal activation during the “divergent” thinking part of the creative process may indicate that subjects are calling upon existing knowledge to complete the task, in order to ultimately generate unique or novel outputs. The mathematician Terry Tao hinted at the same end point, albeit less neurologically, when he said that the ability to apply and intuit arises from mastery.
This is why learning to brainstorm and listening to the Muse isn’t enough when it comes to studying creativity. In order to “be creative,” in order to problem-solve with the best of them, we need to work on becoming not just artists—but experts.  [emphasis added]



In the category of "we're better than average", there were a couple of links at ArtsJournal about how people who pursue artistic experiences and participate in their local arts communities are more likely to be charitable givers.

Michael Lind's argument over at The Smart Set is that we should stop bragging about charitable generosity and work toward an economic system in which soliciting that sort of charity becomes less and less necessary. Certain types of cranky folks on the internet might say the above links about artists and arty types as altruists are pharisaical virtue signaling. It's not as though coverage of the Boyle Heights situation made it seem as if everybody thought artists, and the gentrification associated with arts venues, were actually making life easier for longtime residents, but let's leave that for the time being. We can note in passing that an engagement with the arts and a concern for the arts scene could be found in someone like a Zhdanov, too, so we should be extremely cautious about how virtuous we think it is to be into the arts, as if that were a thing to be regarded as great in itself. After all, ArtsJournal links are probably not going to be touting the remarkable musicianship and cogent philosophical musings of a band like Rush. Which gets us to ... .

There's more being written about somebody's book about progressive rock, and it's been taken up as an opportunity for authors to vent about a musical genre they don't exactly enjoy--once again over at The Atlantic, this time by James Parker, in a piece that could faintly come across as penned by what's colloquially known in critical circles as a "rockist":

Money rained down upon the proggers. Bands went on tour with orchestras in tow; Emerson, Lake & Palmer’s Greg Lake stood onstage on his own private patch of Persian rug. But prog’s doom was built in. It had to die. As a breed, the proggers were hook-averse, earworm-allergic; they disdained the tune, which is the infinitely precious sound of the universe rhyming with one’s own brain. What’s more, they showed no reverence before the sacred mystery of repetition, before its power as what the music critic Ben Ratliff called “the expansion of an idea.” Instead, like mad professors, they threw everything in there [emphasis added]: the ideas, the complexity, the guitars with two necks, the groove-bedeviling tempo shifts. To all this, the relative crudity of punk rock was simply a biological corrective—a healing, if you like. Also, economics intervened. In 1979, as Weigel explains, record sales declined 20 percent in Britain and 11 percent in the United States, and there was a corresponding crash in the inclination of labels to indulge their progged-out artistes. No more disappearing into the countryside for two years to make an album. Now you had to compete in the singles market.
Some startling adaptations did occur. King Crimson’s Robert Fripp achieved a furious pop relevance by, as he described it, “spraying burning guitar all over David Bowie’s album”—the album in question being 1980’s Scary Monsters (And Super Creeps). Yes hit big in 1983 with the genderless cocaine-frost of “Owner of a Lonely Heart.” And Genesis, having lost ultra-arty front man Peter Gabriel, turned out to have been incubating behind the drum kit an enormous pop star: the keening everyman Phil Collins.
These, though, were the exceptions. The labels wanted punk, or punky pop, or new wave—anything but prog. “None of those genres,” grumbled Greg Lake, retrospectively, “had any musical or cultural or intellectual foundation … They were invented by music magazines and record companies talking together.” Fake news! But the change was irreversible: The proggers were, at a stroke, outmoded. Which is how, to a remarkable degree, their music still sounds—noodling and time-bound, a failed mutation, an evolutionary red herring. (Bebop doesn’t sound like that. Speed metal doesn’t sound like that.) [emphasis added]
I feel you out there, prog-lovers, burning at my glibness. And who knows? If the great texts of prog had inscribed themselves, like The Lord of the Rings, upon my frontal lobes when they were teenage and putty-soft, I might be writing a different column altogether. But they didn’t, and I’m not. The proggers got away with murder, artistically speaking. And then, like justice, came the Ramones.
What might make this sort of condescension toward progressive rock and its fans more diplomatic would be if Parker had demonstrated enough musical history to show us that this kind of thing has happened before. There's plenty of music from the eighteenth century that has simply not stood the test of time, even within the "classical" tradition. Why? Haydn's complaint about one of his contemporaries, for instance (citation pending but, trust me, I'll eventually dig it up), was that the composer flitted from one idea to the next and made nothing of his themes, so there was nothing to treasure in the heart.

To put that lament in more 21st century terms, some composers don't respect the cognitive constraints of the human brain as much as they regard their own virtuosity; they show off their awesome chops and make the mistake of thinking that because they can see how it all holds together on the page, the audience (who must surely show some gratitude) must be able to hear it. Like forgotten eighteenth century symphonies with too many ideas for any one hook to take hold, progressive rock could be considered a comparable dead end. It's just more fair-minded to the aspirations of the musicians themselves to suggest that the problem was less one of the tunes than that, as a friend of mine from college put it, there are too many tunes, and in too few songs do the prog rockers commit to the tunes they have.

Both James Parker's review of a book about progressive rock and Richard Brody's recent write-up of Die Hard reminded me of a piece by Arthur Krystal.
I'm not complaining—OK, I am complaining, but not because reviewers find fault, but because given a chance to perform they forget they're rendering a service to the reader, not one to themselves. [emphasis added] A flawed book gives no one license to flog it in print. If there are mistakes, why not sound regretful when pointing them out instead of smug? If the book doesn't measure up to expectations, why not consider the author's own expectations with regard to it? While no one wants shoddy work to escape detection, a critic must persuade not only the impartial reader but also the biased author—as well as his biased editor and biased family—that the response is just.

And tone matters, tone is crucial. Even writers who check their personalities at the door often condescend without meaning to. Perhaps it can't be helped. There's a reason, after all, that a judge's bench overlooks the courtroom: Sentences must appear as if passed down from on high. I'm not saying only Buddhists should review, but wouldn't it be nice if the superior attitude, the knowing asides, and the unshakeable convictions could disappear from the world of print? From personal experience, I can tell you that my own books have been discussed by people who had no idea what most of my essays were about, but whose pontifical airs demonstrated (as if further proof were needed) that lack of knowledge is never an obstacle to self-esteem.

I got to the end of Brody's write-up of Die Hard and it seemed the world would have been no worse a place if he'd never bothered to watch the film. Brody, over time, has come across as the kind of arts critic who can look down on the half-century of Star Trek as mass culture without being willing to simultaneously repudiate the social and political ideals it has stood for. If there is a vice to which professional critics in Anglo-American journalism seem particularly prone, it's that they would rather review films in which they can revel in their powers of introversive and extroversive observation about the sum of cinematic art than review a movie whose moralizing agenda is 1) patently obvious in its presentation and 2) just possibly not the moralizing lesson they would wish to have presented. When a reviewer at Salon said of Christopher Nolan's The Dark Knight Rises that Nolan was a fascist, there was apparently no real need to resort to evidence; the assertion was enough. Now, it's possible Nolan has a political view you or I would disagree with, but that a reviewer at Salon could so confidently assert Nolan was a fascist filmmaker suggests that when journalists despair of Trump fans riffing viciously in comboxes, they may not fully appreciate the extent to which they themselves have been leading the way, just with the insulation of institutional imprimatur. Just another idea to consider for the weekend.

If even professional critics can be found guilty of writing bad reviews as a service to themselves rather than to the reader, as Arthur Krystal put it, then how shocked should we be that on the internet trolls are trolls? How shocked should we be if a sea of people who are not professional arts critics are even more self-referential or self-serving in spraying vitriolic comments about books or movies and any and all creative people involved in them? If the biggest topic about the American remake of Ghost in the Shell was the whitewashing involved, that may signal that the professional critical scene is not really engaging with the ideas of the manga or the anime so much as being, well, skin deep.

It's not necessarily just pop culture criticism or contemporary journalism; it can happen in academia. Kyle Gann has blogged about how he actually likes Clementi better than Mozart much of the time. If we wanted to lionize a composer for studious devotion to the intensive development of a small set of thematic ideas, you'd think Clementi would be more highly regarded in academic musicology, but, nope, he's no Beethoven. While as a guitarist and composer I've found myself benefiting from studying Haydn's and Clementi's sonata forms, in more formal academic circles Beethoven and Mozart are held to be the better and more profound artists. That's a shame, and not because I really dislike Beethoven, or even because I dislike all Mozart (though I have to admit I am bored with most of it). It's that the museum nature of the academic canon excludes as it embraces. I have felt over the last fifteen years that being a guitarist has been a kind of advantage in a new music (i.e. classical music with modifier) scene. But more on that, perhaps, some other time.

Finally, longtime readers know that this blog has featured an awful lot of material about the rise and fall of what used to be called Mars Hill. In a somewhat fiery period between 2013 and late 2014 there was this thing going on where we might quote something here, and then a week after the quoting happened the material would go down. Then, with the use of the Wayback Machine, the material could be brought back up for public consideration, to keep information in public view. Then websites might go down or get modified, and robots.txt would get introduced to preclude crawling by search engines and archiving sites. For better or worse, Wenatchee The Hatchet managed to document a lot of material faster than Mars Hill admins and leadership could take it down, so a lot of raw material for historical research (emphasis on raw!) has been preserved here for the public record. But the challenge has been that when so much of the history of the former church lived in the virtual reality of cyberspace rather than in books, a lot of material has been purged in ways that make it hard to recover. Well, for those who might be curious about what robots.txt is/does ...
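For readers who haven't run into the file before, a quick sketch: robots.txt is a plain text file served from a site's root that tells crawlers which paths they may fetch. The example below (using a hypothetical example.com, not any actual Mars Hill domain) shows the maximal directive, which tells every compliant crawler to stay away from every path on the site:

```text
# Served from the site root, e.g. http://example.com/robots.txt (hypothetical domain)
# "User-agent: *" addresses every compliant crawler at once;
# "Disallow: /" forbids fetching any path on the site.
User-agent: *
Disallow: /
```

Because the Wayback Machine's crawler honored robots.txt, and for years applied it retroactively to snapshots it had already captured, adding a file like this could render a site's previously archived pages inaccessible, which is the pruning effect described above.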
