
Building fantasy cultures underuses scientific knowledge - Discuss

AwesomeButton

Proud owner of BG 3: Day of Swen's Tentacle
Patron
Joined
Nov 23, 2014
Messages
16,232
Location
At large
I just finished reading this article, and while reading I thought about how underused culture-construction is in the crafting of fantasy settings. In most RPGs we get fantasy "peoples", "races", or "nations" which reason and act according to the worldview of the modern Western/globalized man or woman, or which make a cheap play on the stereotypes we already know them for. We are practically in the position of applauding a game where elves speaking only in the present tense passes for a fresh idea to distinguish them culturally from humans.

A few steps better are games which try to portray the outlook of Western medieval people, but these are also pretty shallow, usually limited to simple verbal demonstrations of "vassal obligations", "feudal spirit", etc. The notable exceptions from the overall sloppiness I can think of are Witcher 3 and Darklands, but I won't get into analysis here. Just read the article, it's rather good, and share opinions.

Isn't the derivative feeling that prevails over most fantasy settings in RPGs caused mostly by lack of education that could easily be remedied (by reading, or hiring consultants)? Wouldn't even the pitiful casual "I want to ride a horse and blow things up" experience benefit a little from better developed fantasy settings?

Article:
THE LUXURY OF TEARS
People in richer societies cry more. Matthew Sweet probes the reasons

When he became a father, Charles Darwin began taking notes on the emotional development of his children. Such record-making was part of the rhythm of the household. He logged the weather, his farts and sneezes, and the behaviour of the earthworms he kept in a jar on the piano. His offspring were too compelling a source of data to ignore.

Willie was his first-born. Darwin tickled his feet with a spill of paper and watched for laughter. Annie arrived 14 months later. Darwin observed the moment when she first responded to her own reflection in the polished case of his fob watch; and her consternation when a wafer biscuit became stuck to her hand. It was the crying, though, that most aroused his curiosity. Darwin kept a careful record of these outbursts, noting when eyes were dry, when filled with tears – and concluded that though we may wail from the moment we emerge from the womb, it takes time to develop the facility for weeping. “I first noticed this fact”, he wrote, “from having accidentally brushed with the cuff of my coat the open eye of one of my infants, when 77 days old, causing this eye to water freely; and though the child screamed violently, the other eye remained dry, or was only slightly suffused with tears.” Crying, he decided, “required some practice”.

Darwin collected most of these data in the 1840s, when his children were young. It was the calmest decade of his life. The voyage of the Beagle was behind him; he had settled with his wife, Emma, in Down House, a comfortable villa in rural Kent. The reading public was devouring the published account of his South American travels, which detailed his tortoise-steak dinners on the Galapagos, his excavation of a fossilised giant sloth and his reflections on the human specimens he encountered on the way. On his desk lay the notes for “On the Origin of Species”. When it appeared in 1859, the book erupted like a cultural Krakatoa. Over 150 years later, we are still living through its aftershock. It looms so large that we are inclined to forget its author’s other published work. Most of all, perhaps, the book nourished by his domestic explorations in Kent, and which, through more subtle channels, also exerted a profound effect upon the future.

“The Expression of the Emotions in Man and Animals” (1872) compared the emotional subjects with whom Darwin shared his home with ones he had encountered on his travels – and hundreds more he would never meet. In a spectacular example of Victorian crowdsourcing, he fired off hundreds of letters and questionnaires to correspondents all over the world. He implored a biologist in Brazil to tell him whether South American monkeys wrinkled their eyes “when they cry from grief or pain”. From a phalanx of missionaries and doctors he drew reports on the weeping habits of the Australian Aboriginals. James Brooke, the Rajah of Sarawak, supplied emotional intelligence on the Dayaks of Borneo. Tristram Speedy, guardian to Prince Alamayu Simeon of Abyssinia, gave a long-distance lesson in east African passions.

From their data, Darwin mined a number of influential conclusions. Emotions, he suggested, were facilitated by the act of expressing them. We don’t cry because we are upset, rather the act of crying informs us that we are upset. In our neuroscientific age, when we’re apt to regard a small firing in the brain as the first stage of all human processes, the proposal may seem bizarre, but it is not quite abandoned – a Japanese study from 2007 drizzled its subjects with artificial tears and found, as Darwin would have expected, that many experienced feelings of sadness.

Darwin’s most tenacious idea, however, was a cultural one. A correspondent in New Zealand had told him the story of a Maori chief who “cried like a child because the sailors spoilt his favourite cloak by powdering it with flour”. He had observed similar behaviour from the deck of the Beagle, notably that of a Fuegian man, recently bereaved, who “alternately cried with hysterical violence, and laughed heartily at anything which amused him”. Civilisation, he reasoned, had bred emotional temperance, and humans who lived beyond its borders were subject to fits of passion. “Savages weep copiously from very slight causes,” he concluded. “Englishmen rarely cry, except under the pressure of the acutest grief.”

Today, the remark sounds ludicrous. Darwin, though, did not write from a position of ignorance. He knew the pressure of the acutest grief. In 1851, his beloved daughter Annie sank from the world under the weight of tuberculosis. She was ten. (“We have lost the joy of the Household”, wrote her father, “and the solace of our old age.”) And those savages? The language now offends, but the assumption it carries – that the inhabitants of rich Western nations shed fewer tears than citizens of the developing world – held firm until the beginning of the present decade.

When I first visited Down House 20 years ago, I was ready to be moved. I was the sort of boy who fished in drains for sticklebacks and kept his fingers crossed during the Lord’s Prayer. Darwin was my childhood hero. Standing in the modest little space of his study, it was easy to imagine the great man feeding slides into the microscope, squinting at his water-damaged notebooks, furrowing his appropriately simian brow. Easy, too, to feel moved to tears. Muslims weep before the Ka’aba, Jews at the Wailing Wall. In Darwin’s workroom, atheists can shed hot rational tears where a doubt-wracked Victorian naturalist sat down to revise the relationship between humanity and the universe.

The tears that swam in my eyes, however, were produced by something much more mundane. A three-legged stool – a low, plain, unremarkable thing mounted on brass castors so that its owner could scoot between his writing desk and his work-table without breaking the line of his thought. Its wooden sides were lined with scuff-marks, as if someone had dragged it hard against the wall. This, the guide explained, is exactly what had happened. Darwin rarely worked past lunchtime, giving over the rest of the day to activities with his family. Once the morning’s studies were concluded, he surrendered the three-legged stool to his children, who punted it up and down the hallway, pretending that it was a boat. The Beagle, perhaps. This is the image that stirred my emotions – Charles Darwin, genius and really good dad.

In the context of “The Expression of the Emotions in Man and Animals”, though, my response seemed an affront to Darwinism, brought to the very room in which its ideas were formulated. I hadn’t cried from grief, acute or otherwise, but from a sentimental sense of connection with a great thinker whom I admired. Darwin’s writings, however, seemed not to accommodate the possibility of such an emotion. Perhaps, in 1872, people just felt things differently.


Brad Pitt in “Babel” (2006)

We sound different from our ancestors. We wear different clothes, observe different philosophies, follow different ideas. Could certain ways of feeling have vanished along with Mother Bailey’s Quieting Syrup and Capstan Filters, yielding to fresh moods and senses? A new generation of scholars working on the history of the emotions believes passionately that this is the case, and wants us to see our feelings not simply as what happens when a neurological circuit lights up in our brains, but as the products of bigger cultural and historical processes. Their first contention: the very idea of the emotions is a surprisingly young one.

“The concept arrived from France in the early 19th century as a way of thinking about the body as a thing of reflexes and twitches, tears and shivers and trembles, that supplanted an older, more theological way of thinking,” says Tiffany Watt Smith, once a director at the Royal Court Theatre, now a researcher at the Centre for the History of the Emotions at Queen Mary University of London. Before the discourse of the emotions took hold, she argues, people spoke of other phenomena – “passions”, “moral sentiments”, “accidents of the soul” – that were not always located within the human body. Ill winds blew no good upon the ancient Greeks, carrying flurries of unhappiness through the atmosphere. Fourth-century Christian hermits were plagued by acedia, a form of religious despair spread by demons that patrolled the desert between 11am and 4pm. Non-human organisms could also be afflicted by passions: in the Renaissance, palm trees became lovesick and horticulturists brokered arboreal marriages by entwining the leaves of proximate specimens.

Watt Smith has compiled “The Book of Human Emotions”, in which she gives the histories of 156 human feelings. Many of these you will have experienced – popular headline acts such as guilt, indignation and apathy. Others seem familiar, despite the exotic names – such as basorexia, the sudden urge to kiss someone; or matutolypea, the ill-temper that flourishes between the alarm-clock and the day’s first cup of coffee. For Anglophone readers, some of her subjects are mysteries locked behind the door of someone else’s culture. Amae, a Japanese term that describes the comfort felt when you surrender, temporarily, to the care and authority of a loved one. Liget, an angry enthusiasm that buzzes in the Ilongot tribe of the Philippines, pushing them to great feats of activity – sometimes agricultural, sometimes murderous. Awumbuk, a feeling of emptiness after visitors have departed, is experienced by the Baining people of Papua New Guinea. Departing guests, they theorise, leave behind a kind of heaviness as they go. (A bowl of water left out overnight absorbs this force.)

Perhaps the most revealing entries are those on emotions that remain felt, but which have been substantially reconstructed. In the early 19th century, for instance, nostalgia was considered a terminal condition. Men in their 20s were thought particularly susceptible. During the American civil war, doctors scribbled the word on dozens of death certificates. In the 1830s, French medical authorities warned that excessive attachment to lost people and lost places could reduce the sufferer to a state of decrepitude. “Little by little his features become drawn; his face is creased with wrinkles; his hair falls out, his body is emaciated, his legs tremble under him; a slow fever saps his strength…his discourse becomes incoherent; his fever becomes ever greater, and soon he succumbs.”

By the middle of the 19th century, it had fallen from the diagnostic repertoire. Technology, doctors asserted, had cured it. If you pined for the scenes of youth, you could get on a train and spend an afternoon running about in them. If you were sick with the melancholy remembrance of your childhood nurse, you could send her a telegram enquiring after her health. “Happily,” reflected one specialist, “nostalgia diminishes day by day; by descending little by little among the masses, instruction will develop the intelligence of people, making them more and more capable of struggling against the disease.”

For most of us, that struggle is over – so over, in fact, that we consider nostalgia a pleasure, and cultivate the condition without suffering anything worse than a hefty bill from eBay. Science, too, has changed its tune. In 2012, an American study measured the therapeutic value of the feeling: “Nostalgia makes people feel loved and valued and increases perceptions of social support when people are lonely.” The following year, British researchers found something even more tangible: “on a physical level”, their leader reported, “nostalgia literally makes us feel warmer.”

The Centre for the History of the Emotions is at the heart of London’s East End, on a campus that enfolds one of Britain’s oldest Jewish cemeteries. Its academics share their findings on a collective blog. Some work on despair, others disgust. One is considering shame and anger in early-modern Spain. Another is examining Busman’s Stomach, a gastric disorder affecting bus drivers in the 1930s, ascribed to carbon-monoxide fumes but probably psychosomatic. (The Marxist biologist J.B.S. Haldane prescribed reading Lenin as a cure.) A founder member, Jules Evans, has found an unexpected enthusiasm in Korea for the ideas of the Greek Stoics, who treated emotions like beliefs or opinions that could be revised. He is now engaged in a global survey of religious ecstasy, which includes a study of the alarmingly transmissible passions of Islamic State and al-Qaeda. Queen Mary’s researchers enjoy happy relations with a small but thriving coalition of similar institutions – the Max Planck Institute in Berlin, the Nordic Network for Intimacy Research, and the Australian Research Council’s Centre of Excellence for the History of Emotions, the biggest and most lavishly funded, with sites in Adelaide, Brisbane, Melbourne, Perth and Sydney.

The director of the London Centre, Thomas Dixon, seems a man in excellent emotional health – friendly, full of infectious enthusiasm for his work, and, when I arrive, waiting at the door of his building with milky tea and two kinds of cake. He ought to look content. His last publication, “Weeping Britannia”, a critical history of British emotional restraint, was one of the most lauded history books of 2015. Its thesis is excitingly revisionist. It takes that most familiar of emotional concepts – the British stiff upper lip – and reveals that it was a historical blip. The phrase, it seems, was coined in America, and only became fully associated with Britain during the first half of the 20th century, as an increasingly militarised and imperial national culture absorbed the shock of global conflict. Britain before this period is, he suggests, better characterised as a wet-cheeked, passionate nation in which tears enjoyed an elevated status. Politicians advanced their arguments by shedding parliamentary tears. Monarchs wept to demonstrate their sincerity. Fans of Henry Mackenzie’s influential novel “The Man of Feeling” (1771) read how its gentleman-hero offered his “tribute of tears” to beggars, orphans and prostitutes, and they attempted to emulate his sensitivity. “Weeping”, Dixon writes, “was a moral and religious activity; something to be cultivated, tutored, practised, learned, performed.” It is a history of which Darwin seems to have known little.


Diane Kruger in “Joyeux Noel” (2005)

In a poem reflecting on the traumas of the French revolution, William Blake asserted that “the tear is an intellectual thing”. This has become a mantra for Dixon and his colleagues. Blake’s line suggests that weeping is not where thought ends – a state to which we are reduced once more complex processes have broken down – but evidence that we have computed a mass of cognitive data about the world around us and come to a considered conclusion. It’s a challenge to the language of continence and incontinence that attends our conversations about emotion – our tendency to apologise for our tears as we might apologise for losing control of our bladders. “When our bodies respond emotionally”, says Dixon, “we are thinking with our bodies”. The history of the emotions, therefore, is like the history of ideas.

When Dixon began his work on weeping, he put Charles Darwin’s study of Willie, Annie and their siblings at the top of his reading list. He even intended to follow Darwin’s example and use his son as an experimental subject – but the stresses of early parenthood soon put paid to that. Perhaps it was just as well. In 2011, experimental evidence emerged that disrupted the assumptions of “The Expression of the Emotions in Man and Animals” and supported the deeper historical evidence that Dixon was accumulating.

In the 20th century, most research followed the path hacked by Darwin. In 1906, the American psychologist Alvin Borgquist considered the fruits of his own global survey of explorers and missionaries, and asserted that "tears are more frequently shed among the lower races of mankind than among civilised people." Borgquist's terminology may not have survived the 20th century, but the assumption of Western emotional dryness proved extraordinarily durable. A narrative built up across the disciplines. The Dutch historian Johan Huizinga published "The Waning of the Middle Ages" (1919), which tracked the disappearance of the wild, carnivalesque side of medieval life – the kind you see in Bruegel paintings in which masked grotesques caper around Flemish squares. The German sociologist Norbert Elias published "The Civilising Process" (1939), which described the growth of a culture of politeness – a new empire of the cooler emotions that colonised and pacified older, more chaotic ways of being. (A key event in the Elias timeline is the 16th-century debut of the fork.)

The modern era was seemingly content with this narrative. Then, in 2011, a team of Dutch clinical psychologists produced a study that consigned it to the out-tray of history. Ad Vingerhoets and his team examined data from 37 countries – the results of interviews in which respondents had told stories of their lachrymal lives. Their conclusions would have brought tears to Darwin’s eyes. “Individuals living in more affluent, democratic, extroverted, and individualistic countries,” they wrote, “tend to report to cry more often.” Although people enduring unenviable economic circumstances might be more plagued by depression, those from richer cultures shed more tears. Australasian and American men emerged as the weepiest in the world; their Nigerian, Bulgarian and Malaysian counterparts the most dry eyed. Women in Sweden outcried those in Ghana and Nepal. The female populations of countries where gender equality was highest wept more copiously than those where it was lower. The evidence also showed – contrary to centuries of stereotyping – that the inhabitants of colder climates wept more frequently than those who lived in warmer zones. Tears, the study suggested, were not evidence of primitivism, as they had been for Darwin. They were not even good indicators of distress. Rather than being the habit of the wretched of the Earth, weeping appeared to be an indicator of privilege – a membership perk enjoyed in some of the world’s most comfortable and liveable societies. “If you live in really distressing and difficult circumstances, crying is a luxury,” says Dixon. “We know when we have been bereaved, we might be so shocked or traumatised that tears don’t come. So perhaps we should see tears as a sign of moderate grief, of bearable negative emotion. If you are enduring extreme distress or extreme hardship, that is not the time for tears.”

In countries visited by war or famine, the observation might not seem so counterintuitive. Dorte Jessen, head of the Jordan arm of the World Food Programme’s response to the Syrian refugee crisis, has spent over a decade looking into the tearless eyes of those in the direst need. During the 2011 famine in the Horn of Africa, she was based in the sprawling refugee camp in Dadaab, Kenya, 60 miles from the border with Somalia. Early in her assignment, she recalls, she watched a mother and her two young children receiving emergency rations – sachets containing a sweet mixture of peanut paste, vegetable fat and cocoa. Just a few steps from the distribution point, the mother ripped open one of the packets and handed it to her oldest child. “They didn’t talk or express any emotion. They just kept walking,” Jessen notes. “Once you are past a certain point of exhaustion, there is simply no energy to spare to get emotional.”

In 1890, the philosopher William James drew a distinction between the “crying fit” – a psychological event accompanied by “a certain pungent pleasure” – and the much less bearable sensation of “dry and shrunken sorrow”. Some experiences, it seems, are too bleak for tears. Former inmates of Nazi concentration camps have reported, sometimes guiltily, that they did not weep during their ordeal. At a war-crimes trial in May 2015, Susan Pollock, a Hungarian Holocaust survivor, recalled her dry eyes as she watched her mother being despatched to the gas chamber. “I wasn’t crying,” she said. “I just wanted to recede into myself, never to be seen.”

We might imagine that on the flood map of world history, certain infamous fields in eastern Europe would register as brackish lakes. But the greater volume of those tears, I suspect, would have been shed not by the victims but by those who came as an act of remembrance, once the furnaces had cooled. Pollock considered herself temporarily dehumanised by her life in the concentration camp. But in an Auschwitz or a Treblinka, what would be the purpose of tears? They would be as superfluous as a pair of diamond earrings.

You don’t have to invoke Holocaust and famine to discern the luxurious nature of weeping, or the existence of an economy of emotion in which some are privileged to demonstrate their feelings and others are not. It’s there on the literary and historical record, legible in the scorn that Odysseus feels for his son Telemachus until his boy has given him a big Greek man-hug and produced proper princely tears; in the response of the 19th-century French doctors who were baffled to observe the nervous, weepy symptoms of hysteria – the malady of refined females – in muscular working-class railway workers who had been injured on the tracks. It’s there, too, in contemporary culture. If politicians cry, the value of their tears is assessed by commentators who act like jewellers, squinting for signs of fakery and paste. When, in the first week of this year, Barack Obama wept as he urged Congress to support tighter gun controls, the cameras caught every scintilla of his emotion, their clicks cascading like rain on the White House roof. Sympathetic observers celebrated these as moral tears, saltwater proof of the truth of his arguments. His enemies went on Twitter and Fox News to make allegations about onion juice on his fingers.

Statesmen and -women occupy a professional field in which the expression of emotion is permissible. For others, tears are unaffordable. The medical dispatchers who answer emergency calls receive training to help them discount their emotions as they advise those who have swallowed pills, or whose babies have stopped breathing. We would consider it a dereliction of duty if surgeons, nurses, police officers and soldiers wept during working hours. They have surrendered their right to cry in the same way that other employees might sign away their expectation of fixed hours or sick pay. Their restraint gives us the space to express our pain or gratitude, which we buy from them through taxation.


Saoirse Ronan in “Brooklyn” (2015)

As the world develops, so do its passions – often in ways that are not immediately comprehensible to its inhabitants. Social media are generating new rituals of anger and indignation. Reports of weeping icons have become bullets in the propaganda war between Ukraine and Russia. In Japan, an activity called rui-katsu (tear-seeking) has developed, in which customers gather to watch DVD weepies or pay for a wet-eyed escort to come to the workplace to embrace them. In China, older people bristle at the new emotional culture that they perceive leaking in from the West – exemplified by the moment in 2010 when a contestant on the reality show “Fei Cheng Wu Rao” asked a prospective partner if she’d be happy to go out for a date on a bicycle. “I’d rather cry in a BMW,” was the reply. It was an off-the-cuff remark, but seemed to articulate something about the future of China – and, therefore, the future of the world.

The history of the emotions is a young discipline. It is at the very beginning of its investigation into the long story of our feelings. “Are we”, asks Thomas Dixon, “writing the history of something that has always been the same fundamentally in the human mind being expressed and interpreted in different ways? Or are we, as most of us who do it would think, discovering the historicity of the human mind? Discovering that you can’t feel just any old way; you only feel the way you do because of your language, your experience, your family, your upbringing, your social institutions, your political institutions.” If we could acquire that perspective upon ourselves, how would it change us? Who would we become, if we could chart the flow of oceans of tears, measure the dry breadth of those famine lands, experience our feelings as events in history as well as in our bodies? People, hopefully, who might look back upon these times, their scuffed old artefacts and antique ideas, and shed a generous and sentimental tear.

Matthew Sweet presents BBC Radio 3’s “Free Thinking” and Radio 4’s “The Philosopher's Arms”. He is the author of “The West End Front”
 

Bester

⚰️☠️⚱️
Patron
Vatnik
Joined
Sep 28, 2014
Messages
11,094
Location
USSR
There's no way for the marketing department to check whether some unconventional setting would resonate with the audience. And especially with everyone so easily "triggered" nowadays, you take one step left or right and your company is ruined.

Or in one word:

trigger warning, don't open
capitalism
 

Neanderthal

Arcane
Joined
Jul 7, 2015
Messages
3,626
Location
Granbretan
Players don't want a well-simulated setting that abides by its own rules and peculiarities, from which roleplaying situations can arise organically. They don't want any kind of logic, reasoning or realism to exist in a fantasy setting because they think it would ruin it, when actually the mundane exists to highlight the fantastic, and you can't escape realistic aspects anyway. All they want is a thin layer of renaissance fayre bullshit draped over a blatant power fantasy. If anybody even asks a dev to do a little bit better, you get legions of apologists and fanboys falling over themselves to excuse the dev and champion more streamlining and less detail.

A well-realised RPG from when I was a lad a few decades ago would feel far more foreign than most of the staid, unexciting fantasy RPGs of today.
 

Lurker47

Savant
Joined
Jul 30, 2017
Messages
721
Location
Texas
The go-to argument against this is that it'd ruin the mysticism of a lot of fantastical elements. Races, alchemy, and fields of magic could certainly benefit from lore and elaboration but I don't sympathize with that claim when it comes to things like Gods.

Lovecraftian gods, who are terrifying primarily because we lack comprehension of them, can't exactly be elaborated on without missing the entire point. I'd love to find a god like that who fits the archetype but has some actual development to them, whom you could actually talk to (optimistic, I know) and whose dialogue is anything beyond "lol xd so dark mysterious cryptic" or something that absolutely ruins their mystique. That seems kind of difficult, but I think the pay-off would be worth it.

Back on topic: the ruining of mysticism isn't based purely on laziness. I'd figure a lot of devs worry that the world would come off as dull and lackluster if it lacked an otherworldly feeling right off the bat: if you could learn everything instantly, they'd have to pace that whole kind of experience. To make the world feel otherworldly, then, they'd have to make something, well, truly otherworldly, where you're constantly learning new things that are not just logical and interesting but unique and flashy enough to give that casual fantasy vibe for the mainstream market.

And that really seems like too much effort for triple-A gaming, especially since it's something tied to writing. Fantasy is a lot more tied to its preconceptions than sci-fi (a genre with no real limitations besides "fantastical and semi-logical" would be hard to market otherwise), so those connotations and expectations have led fantasy to largely devolve into wish-fulfillment.
 

AwesomeButton

Proud owner of BG 3: Day of Swen's Tentacle
Patron
Joined
Nov 23, 2014
Messages
16,232
Location
At large
I think the bar is set pretty low on both ends: the audience doesn't ask for originality, and the huge mass of players is satisfied playing through the same storyline (you are a half-god, or can become a god, or are somehow special, and the world's fate hangs on you realizing your supernatural potential) in pretty much identical settings. On the other hand, game designers do not appear to demand much in terms of originality from either themselves or other designers. PoE II being set in a region of highly varying climates and cultures seems to be a notable exception in recent years, and we've yet to see how well it will be executed.
 

Tigranes

Arcane
Joined
Jan 8, 2009
Messages
10,350
People say they want 'escapism', but what that really means is that they want essentially the same comfortable groove of emotional orgasms. They complain that they don't want real-world politics*, or the mundane work of eating and shitting, or game mechanics that enforce thinking or failure, not because they want a fantasy lala land as different from reality as possible, but because they want the comforts of the world they know neatly sanitised of everything difficult. For many, it's about gratifying the frustrations pent up from the real world, not venturing into a truly different world that surprises them.

And so, for exactly the same reasons that they want 'escapism', they also want their escapist world to be samey, predictable, and basically the real world dressed up with pauldrons and titties each as ginormous as the other. Actually basing your setting off Arab cultures or anything that they don't have a 2-second heuristic for understanding becomes frustrating for these players, because now they don't know how to get their gratifications (whether because trying to be a 'hero' doesn't work out the same way, or because they can't dress up in the same way they do in every game, or whatever).

This is very difficult to improve because even though you say
caused mostly by lack of education that could easily be remedied
(1) there's no financial incentive or room for companies to educate their employees or their employees to educate themselves; (2) these guys are often working with very ingrained intuitions for what feels fun or interesting, which comes from the same place of 'reality except everyone loves me', and so it's not easy for them to see it as a problem; (3) even if they overcame all this, they can't educate the audience who will whine and punish them with sales.

The solution has been the same as it always was: you need people who are broadly read and have a broad set of interests/experiences come into video game writing, not some dickface that's been playing DA2 all their life. For MCA and many others it is history, and even if it's familiar history like, I dunno, WW2 America, it matters when you actually study it. People make fun of creative writing degrees (and often for good reason) but a proper humanities education is traditionally what is supposed to provide this, as well.

*Though games inserting real world politics 1:1 is usually a dumb idea for other reasons
 

pomenitul

Arbiter
Joined
Sep 8, 2016
Messages
979
Location
μεταβολή
To be fair, I'd wager most people want both the comfortable trappings of familiarity and a twist on their beloved, time-tested formula. For instance, I'm fond of narrative arcs that are mostly recognizable at the outset (aside from a keenly concealed bit of foreshadowing or two) but that slowly, inexorably, imperceptibly slide into a wholly different register. By the time the audience realizes what happened, the shift has already, irrevocably taken place and we have no choice but to adapt to this new tone – an exciting aesthetic experience, when done right.
 

AwesomeButton

Proud owner of BG 3: Day of Swen's Tentacle
Patron
Joined
Nov 23, 2014
Messages
16,232
Location
At large
PC RPG Website of the Year, 2015 Make the Codex Great Again! Grab the Codex by the pussy Insert Title Here RPG Wokedex Divinity: Original Sin 2 A Beautifully Desolate Campaign Pillars of Eternity 2: Deadfire Steve gets a Kidney but I don't even get a tag. Pathfinder: Wrath
Tigranes, I think what you describe is one type of player, the one looking for the candy and the shortest route to the candy, whether that candy is the story or the explosion effects.

But there is another type of player, the one for whom studying the game's rules, including the game world's rules, and dissecting them is a game in itself.

"What is supposed to provide this" is well said. In humanities half the work, or more, towards getting a good education is entirely dependent on the student. The rest is teachers who spark your interest in a field, and then it's up to you to read and explore.
 

Neanderthal

Arcane
Joined
Jul 7, 2015
Messages
3,626
Location
Granbretan
I remember playing KoDP when it first came out and having to adjust my mindset to that of the Orlanthi, xenophobic, warlike, cow obsessed, traditional and omen ridden. Stealing and battle not unfriendly but a natural part of culture that all clans should indulge in. Refreshing.
 

AwesomeButton

Proud owner of BG 3: Day of Swen's Tentacle
Patron
Joined
Nov 23, 2014
Messages
16,232
Location
At large
PC RPG Website of the Year, 2015 Make the Codex Great Again! Grab the Codex by the pussy Insert Title Here RPG Wokedex Divinity: Original Sin 2 A Beautifully Desolate Campaign Pillars of Eternity 2: Deadfire Steve gets a Kidney but I don't even get a tag. Pathfinder: Wrath
That's kind of what I'm missing - I want fantasy cultures to require that I get into them a little, and this knowledge to then be of use to me in the game. Remember Amaunator's ritual in BGII for example. You can't progress until you learn it. Knowledge of a fantasy culture's customs, not as a character stat, but as actual knowledge of details conceived by designers, and counterintuitive to us, the players, and to our culture.
 

Iznaliu

Arbiter
Joined
Apr 28, 2016
Messages
3,686
I remember playing KoDP when it first came out and having to adjust my mindset to that of the Orlanthi, xenophobic, warlike, cow obsessed, traditional and omen ridden. Stealing and battle not unfriendly but a natural part of culture that all clans should indulge in. Refreshing.

That kind of stuff was only really possible because they were using a pre-existing setting; tabletop RPGs are/were a lot more creative than cRPGs, and they didn't have to do all the legwork themselves.
 

Tigranes

Arcane
Joined
Jan 8, 2009
Messages
10,350
I remember playing KoDP when it first came out and having to adjust my mindset to that of the Orlanthi, xenophobic, warlike, cow obsessed, traditional and omen ridden. Stealing and battle not unfriendly but a natural part of culture that all clans should indulge in. Refreshing.

That kind of stuff was only really possible because they were using a pre-existing setting; tabletop RPGs are/were a lot more creative than cRPGs, and they didn't have to do all the legwork themselves.

That's what most games should be doing.

If you think about it, it is completely unreasonable to ask video game writers to come up with a whole new world and make it coherent and intelligent. Even at the places where writing gets the most support within the industry, e.g. Obsidian, Bioware, CD Projekt, you've got a few people whose published work usually amounts to a couple of other games, and they aren't given the whole 5 years to write the game but anything from a plot written in one night to, at best, a few months of sketching out that is supposed to then support a gazillion words of dialogue and reactivity and level design and whatnot. No wonder it's full of random stuff you think up on the shitter; that's all there's time for.

I'd much rather have them learn to pick good source material - no, don't give me a fucking game based on a superhero comic book in an endless spiral of derivation, pick some nice moment in history, do some research, maybe a good p&p system, and then work on it.
 

Bohrain

Liturgist
Patron
Joined
Aug 10, 2016
Messages
1,446
Location
norf
My team has the sexiest and deadliest waifus you can recruit.
You don't see it because the audience doesn't demand it. And the typical consumer is probably blissfully unaware of what life was like back when literacy wasn't the norm, most kids didn't see age 10, minute schedules didn't exist, and so forth.
CRPGs are a niche genre as it is; I highly doubt this is going to change anytime soon.
 

Neanderthal

Arcane
Joined
Jul 7, 2015
Messages
3,626
Location
Granbretan
Yeah they are niche an developers ought to face that an start trying to master that niche rather than seek ever broader acceptance through streamlining an dumbing down.

AwesomeButton has a twofold problem as I see it: 1, Characterless, culture free worlds that have no opportunities of exploration. 2, Every quest has to be marked, logged, explained an rewarded in name of accessibility. No catching Finnegan having an affair, no assembling the Guarda Revanche etc, which is a bloody pity.

Devs need to start respecting their audience, an fuck a broader audience.
 

J1M

Arcane
Joined
May 14, 2008
Messages
14,626
I remember playing KoDP when it first came out and having to adjust my mindset to that of the Orlanthi, xenophobic, warlike, cow obsessed, traditional and omen ridden. Stealing and battle not unfriendly but a natural part of culture that all clans should indulge in. Refreshing.

That kind of stuff was only really possible because they were using a pre-existing setting; tabletop RPGs are/were a lot more creative than cRPGs, and they didn't have to do all the legwork themselves.
It is very possible to present mysterious new cultures, as long as you don't dress them up with words the brain cannot read or pronounce, a sin Pillars of Eternity is especially guilty of. When you can't read the arcane name of something, it gets stored in memory as "weird name", and when there's more than one of those they all get jumbled together and the brain stops caring.

There is plenty of time allocated to writing RPGs these days. The primary issue is that too much of it is spent on volume of words, and not enough is spent on quality and brevity.
 

huskarls

Scholar
Joined
Aug 7, 2016
Messages
112
Despite all the armchair psychology here, Warhammer, Witcher, and Fallout are all widely popular series that attempt to display some foreignness instead of pseudo-medieval fantasy with modern values. I imagine it's a production problem, since all the writers are in their own corners and it's hard to create an overall immersion(TM) unless you have a very solid idea of your setting beginning at pre-production so you can brief everyone. Which is why games based off of TRPGs are so successful at this, proving prestigious video game writers can indeed colour in the lines. PoE even had weekly meetings, but ultimately it just turned into one guy writing not-wood-elves, one guy doing flavour for this kingdom, one writing a quest, and so on, with no grand idea of a specific atmosphere.
 

AwesomeButton

Proud owner of BG 3: Day of Swen's Tentacle
Patron
Joined
Nov 23, 2014
Messages
16,232
Location
At large
PC RPG Website of the Year, 2015 Make the Codex Great Again! Grab the Codex by the pussy Insert Title Here RPG Wokedex Divinity: Original Sin 2 A Beautifully Desolate Campaign Pillars of Eternity 2: Deadfire Steve gets a Kidney but I don't even get a tag. Pathfinder: Wrath
Yeah they are niche an developers ought to face that an start trying to master that niche rather than seek ever broader acceptance through streamlining an dumbing down.

AwesomeButton has a twofold problem as I see it: 1, Characterless, culture free worlds that have no opportunities of exploration. 2, Every quest has to be marked, logged, explained an rewarded in name of accessibility. No catching Finnegan having an affair, no assembling the Guarda Revanche etc, which is a bloody pity.

Devs need to start respecting their audience, an fuck a broader audience.
That's essentially asking them to forgo profits :) There has to be a way, like in other, non-interactive mediums, for a smart author to provide both fun for the superficial consumer and depth for those who can appreciate it. But this would require more skill from the author.

Despite all the armchair psychology here, Warhammer, Witcher, and Fallout are all widely popular series that attempt to display some foreignness instead of pseudo-medieval fantasy with modern values. I imagine it's a production problem, since all the writers are in their own corners and it's hard to create an overall immersion(TM) unless you have a very solid idea of your setting beginning at pre-production so you can brief everyone. Which is why games based off of TRPGs are so successful at this, proving prestigious video game writers can indeed colour in the lines. PoE even had weekly meetings, but ultimately it just turned into one guy writing not-wood-elves, one guy doing flavour for this kingdom, one writing a quest, and so on, with no grand idea of a specific atmosphere.
Where does your insight into their process come from? Can you tell about it in more detail?
 

YES!

Hi, I'm Roqua
Dumbfuck
Joined
Feb 26, 2017
Messages
2,088
I just finished reading this article, and while reading, I thought how underused culture-construction is in the crafting of fantasy settings. In most RPGs we get fantasy "peoples", "races", or "nations" which reason and act according to the worldview of the modern Western/Globalized man/woman, or make a cheap play on the stereotypes we know them by. We are practically in the position to applaud a game where elves speaking in present tense only is considered a fresh idea to distinguish them culturally from humans.

A few steps better are games which try to portray the outlook of western medieval people, but these are also pretty shallow, usually limited to simple verbal demonstrations of "vassal obligations", "feudal spirit", etc. Notable exceptions from the overall sloppiness I can think of are Witcher 3 and Darklands, but I won't get into analysis here. Just read the article, it's rather good, and share opinions.

Isn't the derivative feeling that prevails over most fantasy settings in RPGs caused mostly by lack of education that could easily be remedied (by reading, or hiring consultants)? Wouldn't even the pitiful casual "I want to ride a horse and blow things up" experience benefit a little from better developed fantasy settings?

Article:
THE LUXURY OF TEARS
People in richer societies cry more. Matthew Sweet probes the reasons

When he became a father, Charles Darwin began taking notes on the emotional development of his children. Such record-making was part of the rhythm of the household. He logged the weather, his farts and sneezes, and the behaviour of the earthworms he kept in a jar on the piano. His offspring were too compelling a source of data to ignore.

Willie was his first-born. Darwin tickled his feet with a spill of paper and watched for laughter. Annie arrived 14 months later. Darwin observed the moment when she first responded to her own reflection in the polished case of his fob watch; and her consternation when a wafer biscuit became stuck to her hand. It was the crying, though, that most aroused his curiosity. Darwin kept a careful record of these outbursts, noting when eyes were dry, when filled with tears – and concluded that though we may wail from the moment we emerge from the womb, it takes time to develop the facility for weeping. “I first noticed this fact”, he wrote, “from having accidentally brushed with the cuff of my coat the open eye of one of my infants, when 77 days old, causing this eye to water freely; and though the child screamed violently, the other eye remained dry, or was only slightly suffused with tears.” Crying, he decided, “required some practice”.

Darwin collected most of these data in the 1840s, when his children were young. It was the calmest decade of his life. The voyage of the Beagle was behind him; he had settled with his wife, Emma, in Down House, a comfortable villa in rural Kent. The reading public was devouring the published account of his South American travels, which detailed his tortoise-steak dinners on the Galapagos, his excavation of a fossilised giant sloth and his reflections on the human specimens he encountered on the way. On his desk lay the notes for “On the Origin of Species”. When it appeared in 1859, the book erupted like a cultural Krakatoa. Over 150 years later, we are still living through its aftershock. It looms so large that we are inclined to forget its author’s other published work. Most of all, perhaps, the book nourished by his domestic explorations in Kent, and which, through more subtle channels, also exerted a profound effect upon the future.

“The Expression of the Emotions in Man and Animals” (1872) compared the emotional subjects with whom Darwin shared his home with ones he had encountered on his travels – and hundreds more he would never meet. In a spectacular example of Victorian crowdsourcing, he fired off hundreds of letters and questionnaires to correspondents all over the world. He implored a biologist in Brazil to tell him whether South American monkeys wrinkled their eyes “when they cry from grief or pain”. From a phalanx of missionaries and doctors he drew reports on the weeping habits of the Australian Aboriginals. James Brooke, the Rajah of Sarawak, supplied emotional intelligence on the Dayaks of Borneo. Tristram Speedy, guardian to Prince Alamayu Simeon of Abyssinia, gave a long-distance lesson in east African passions.

From their data, Darwin mined a number of influential conclusions. Emotions, he suggested, were facilitated by the act of expressing them. We don’t cry because we are upset, rather the act of crying informs us that we are upset. In our neuroscientific age, when we’re apt to regard a small firing in the brain as the first stage of all human processes, the proposal may seem bizarre, but it is not quite abandoned – a Japanese study from 2007 drizzled its subjects with artificial tears and found, as Darwin would have expected, that many experienced feelings of sadness.

Darwin’s most tenacious idea, however, was a cultural one. A correspondent in New Zealand had told him the story of a Maori chief who “cried like a child because the sailors spoilt his favourite cloak by powdering it with flour”. He had observed similar behaviour from the deck of the Beagle, notably that of a Fuegian man, recently bereaved, who “alternately cried with hysterical violence, and laughed heartily at anything which amused him”. Civilisation, he reasoned, had bred emotional temperance, and humans who lived beyond its borders were subject to fits of passion. “Savages weep copiously from very slight causes,” he concluded. “Englishmen rarely cry, except under the pressure of the acutest grief.”

Today, the remark sounds ludicrous. Darwin, though, did not write from a position of ignorance. He knew the pressure of the acutest grief. In 1851, his beloved daughter Annie sank from the world under the weight of tuberculosis. She was ten. (“We have lost the joy of the Household”, wrote her father, “and the solace of our old age.”) And those savages? The language now offends, but the assumption it carries – that the inhabitants of rich Western nations shed fewer tears than citizens of the developing world – held firm until the beginning of the present decade.

When I first visited Down House 20 years ago, I was ready to be moved. I was the sort of boy who fished in drains for sticklebacks and kept his fingers crossed during the Lord’s Prayer. Darwin was my childhood hero. Standing in the modest little space of his study, it was easy to imagine the great man feeding slides into the microscope, squinting at his water-damaged notebooks, furrowing his appropriately simian brow. Easy, too, to feel moved to tears. Muslims weep before the Ka’aba, Jews at the Wailing Wall. In Darwin’s workroom, atheists can shed hot rational tears where a doubt-wracked Victorian naturalist sat down to revise the relationship between humanity and the universe.

The tears that swam in my eyes, however, were produced by something much more mundane. A three-legged stool – a low, plain, unremarkable thing mounted on brass castors so that its owner could scoot between his writing desk and his work-table without breaking the line of his thought. Its wooden sides were lined with scuff-marks, as if someone had dragged it hard against the wall. This, the guide explained, is exactly what had happened. Darwin rarely worked past lunchtime, giving over the rest of the day to activities with his family. Once the morning’s studies were concluded, he surrendered the three-legged stool to his children, who punted it up and down the hallway, pretending that it was a boat. The Beagle, perhaps. This is the image that stirred my emotions – Charles Darwin, genius and really good dad.

In the context of “The Expression of the Emotions in Man and Animals”, though, my response seemed an affront to Darwinism, brought to the very room in which its ideas were formulated. I hadn’t cried from grief, acute or otherwise, but from a sentimental sense of connection with a great thinker whom I admired. Darwin’s writings, however, seemed not to accommodate the possibility of such an emotion. Perhaps, in 1872, people just felt things differently.


Brad Pitt in “Babel” (2006)

We sound different from our ancestors. We wear different clothes, observe different philosophies, follow different ideas. Could certain ways of feeling have vanished along with Mother Bailey’s Quieting Syrup and Capstan Filters, yielding to fresh moods and senses? A new generation of scholars working on the history of the emotions believes passionately that this is the case, and wants us to see our feelings not simply as what happens when a neurological circuit lights up in our brains, but as the products of bigger cultural and historical processes. Their first contention: the very idea of the emotions is a surprisingly young one.

“The concept arrived from France in the early 19th century as a way of thinking about the body as a thing of reflexes and twitches, tears and shivers and trembles, that supplanted an older, more theological way of thinking,” says Tiffany Watt Smith, once a director at the Royal Court Theatre, now a researcher at the Centre for the History of the Emotions at Queen Mary University of London. Before the discourse of the emotions took hold, she argues, people spoke of other phenomena – “passions”, “moral sentiments”, “accidents of the soul” – that were not always located within the human body. Ill winds blew no good upon the ancient Greeks, carrying flurries of unhappiness through the atmosphere. Fourth-century Christian hermits were plagued by acedia, a form of religious despair spread by demons that patrolled the desert between 11am and 4pm. Non-human organisms could also be afflicted by passions: in the Renaissance, palm trees became lovesick and horticulturists brokered arboreal marriages by entwining the leaves of proximate specimens.

Watt Smith has compiled “The Book of Human Emotions”, in which she gives the histories of 156 human feelings. Many of these you will have experienced – popular headline acts such as guilt, indignation and apathy. Others seem familiar, despite the exotic names – such as basorexia, the sudden urge to kiss someone; or matutolypea, the ill-temper that flourishes between the alarm-clock and the day’s first cup of coffee. For Anglophone readers, some of her subjects are mysteries locked behind the door of someone else’s culture. Amae, a Japanese term that describes the comfort felt when you surrender, temporarily, to the care and authority of a loved one. Liget, an angry enthusiasm that buzzes in the Ilongot tribe of the Philippines, pushing them to great feats of activity – sometimes agricultural, sometimes murderous. Awumbuk, a feeling of emptiness after visitors have departed, is experienced by the Baining people of Papua New Guinea. Departing guests, they theorise, leave behind a kind of heaviness as they go. (A bowl of water left out overnight absorbs this force.)

Perhaps the most revealing entries are those on emotions that remain felt, but which have been substantially reconstructed. In the early 19th century, for instance, nostalgia was considered a terminal condition. Men in their 20s were thought particularly susceptible. During the American civil war, doctors scribbled the word on dozens of death certificates. In the 1830s, French medical authorities warned that excessive attachment to lost people and lost places could reduce the sufferer to a state of decrepitude. “Little by little his features become drawn; his face is creased with wrinkles; his hair falls out, his body is emaciated, his legs tremble under him; a slow fever saps his strength…his discourse becomes incoherent; his fever becomes ever greater, and soon he succumbs.”

By the middle of the 19th century, it had fallen from the diagnostic repertoire. Technology, doctors asserted, had cured it. If you pined for the scenes of youth, you could get on a train and spend an afternoon running about in them. If you were sick with the melancholy remembrance of your childhood nurse, you could send her a telegram enquiring after her health. “Happily,” reflected one specialist, “nostalgia diminishes day by day; by descending little by little among the masses, instruction will develop the intelligence of people, making them more and more capable of struggling against the disease.”

For most of us, that struggle is over – so over, in fact, that we consider nostalgia a pleasure, and cultivate the condition without suffering anything worse than a hefty bill from eBay. Science, too, has changed its tune. In 2012, an American study measured the therapeutic value of the feeling: “Nostalgia makes people feel loved and valued and increases perceptions of social support when people are lonely.” The following year, British researchers found something even more tangible: “on a physical level”, their leader reported, “nostalgia literally makes us feel warmer.”

The Centre for the History of the Emotions is at the heart of London’s East End, on a campus that enfolds one of Britain’s oldest Jewish cemeteries. Its academics share their findings on a collective blog. Some work on despair, others disgust. One is considering shame and anger in early-modern Spain. Another is examining Busman’s Stomach, a gastric disorder affecting bus drivers in the 1930s, ascribed to carbon-monoxide fumes but probably psychosomatic. (The Marxist biologist J.B.S. Haldane prescribed reading Lenin as a cure.) A founder member, Jules Evans, has found an unexpected enthusiasm in Korea for the ideas of the Greek Stoics, who treated emotions like beliefs or opinions that could be revised. He is now engaged in a global survey of religious ecstasy, which includes a study of the alarmingly transmissible passions of Islamic State and al-Qaeda. Queen Mary’s researchers enjoy happy relations with a small but thriving coalition of similar institutions – the Max Planck Institute in Berlin, the Nordic Network for Intimacy Research, and the Australian Research Council’s Centre of Excellence for the History of Emotions, the biggest and most lavishly funded, with sites in Adelaide, Brisbane, Melbourne, Perth and Sydney.

The director of the London Centre, Thomas Dixon, seems a man in excellent emotional health – friendly, full of infectious enthusiasm for his work, and, when I arrive, waiting at the door of his building with milky tea and two kinds of cake. He ought to look content. His last publication, “Weeping Britannia”, a critical history of British emotional restraint, was one of the most lauded history books of 2015. Its thesis is excitingly revisionist. It takes that most familiar of emotional concepts – the British stiff upper lip – and reveals that it was a historical blip. The phrase, it seems, was coined in America, and only became fully associated with Britain during the first half of the 20th century, as an increasingly militarised and imperial national culture absorbed the shock of global conflict. Britain before this period is, he suggests, better characterised as a wet-cheeked, passionate nation in which tears enjoyed an elevated status. Politicians advanced their arguments by shedding parliamentary tears. Monarchs wept to demonstrate their sincerity. Fans of Henry Mackenzie’s influential novel “The Man of Feeling” (1771) read how its gentleman-hero offered his “tribute of tears” to beggars, orphans and prostitutes, and they attempted to emulate his sensitivity. “Weeping”, Dixon writes, “was a moral and religious activity; something to be cultivated, tutored, practised, learned, performed.” It is a history of which Darwin seems to have known little.


Diane Kruger in “Joyeux Noel” (2005)

In a poem reflecting on the traumas of the French revolution, William Blake asserted that “the tear is an intellectual thing”. This has become a mantra for Dixon and his colleagues. Blake’s line suggests that weeping is not where thought ends – a state to which we are reduced once more complex processes have broken down – but evidence that we have computed a mass of cognitive data about the world around us and come to a considered conclusion. It’s a challenge to the language of continence and incontinence that attends our conversations about emotion – our tendency to apologise for our tears as we might apologise for losing control of our bladders. “When our bodies respond emotionally”, says Dixon, “we are thinking with our bodies”. The history of the emotions, therefore, is like the history of ideas.

When Dixon began his work on weeping, he put Charles Darwin’s study of Willie, Annie and their siblings at the top of his reading list. He even intended to follow Darwin’s example and use his son as an experimental subject – but the stresses of early parenthood soon put paid to that. Perhaps it was just as well. In 2011, experimental evidence emerged that disrupted the assumptions of “The Expression of the Emotions in Man and Animals” and supported the deeper historical evidence that Dixon was accumulating.

In the 20th century, most research followed the path hacked by Darwin. In 1906, the American psychologist Alvin Borgquist considered the fruits of his own global survey of explorers and missionaries, and asserted that “tears are more frequently shed among the lower races of mankind than among civilised people.” Borgquist's terminology may not have survived the 20th century, but the assumption of Western emotional dryness proved extraordinarily durable. A narrative built up across the disciplines. The Dutch historian Johan Huizinga published “The Waning of the Middle Ages” (1919), which tracked the disappearance of the wild, carnivalesque side of medieval life – the kind you see in Bruegel paintings in which masked grotesques caper around Flemish squares. The German sociologist Norbert Elias published “The Civilising Process” (1939), which described the growth of a culture of politeness – a new empire of the cooler emotions that colonised and pacified older, more chaotic ways of being. (A key event in the Elias timeline is the 16th-century debut of the fork.)

The modern era was seemingly content with this narrative. Then, in 2011, a team of Dutch clinical psychologists produced a study that consigned it to the out-tray of history. Ad Vingerhoets and his team examined data from 37 countries – the results of interviews in which respondents had told stories of their lachrymal lives. Their conclusions would have brought tears to Darwin’s eyes. “Individuals living in more affluent, democratic, extroverted, and individualistic countries,” they wrote, “tend to report to cry more often.” Although people enduring unenviable economic circumstances might be more plagued by depression, those from richer cultures shed more tears. Australasian and American men emerged as the weepiest in the world; their Nigerian, Bulgarian and Malaysian counterparts the most dry eyed. Women in Sweden outcried those in Ghana and Nepal. The female populations of countries where gender equality was highest wept more copiously than those where it was lower. The evidence also showed – contrary to centuries of stereotyping – that the inhabitants of colder climates wept more frequently than those who lived in warmer zones. Tears, the study suggested, were not evidence of primitivism, as they had been for Darwin. They were not even good indicators of distress. Rather than being the habit of the wretched of the Earth, weeping appeared to be an indicator of privilege – a membership perk enjoyed in some of the world’s most comfortable and liveable societies. “If you live in really distressing and difficult circumstances, crying is a luxury,” says Dixon. “We know when we have been bereaved, we might be so shocked or traumatised that tears don’t come. So perhaps we should see tears as a sign of moderate grief, of bearable negative emotion. If you are enduring extreme distress or extreme hardship, that is not the time for tears.”

In countries visited by war or famine, the observation might not seem so counterintuitive. Dorte Jessen, head of the Jordan arm of the World Food Programme’s response to the Syrian refugee crisis, has spent over a decade looking into the tearless eyes of those in the direst need. During the 2011 famine in the Horn of Africa, she was based in the sprawling refugee camp in Dadaab, Kenya, 60 miles from the border with Somalia. Early in her assignment, she recalls, she watched a mother and her two young children receiving emergency rations – sachets containing a sweet mixture of peanut paste, vegetable fat and cocoa. Just a few steps from the distribution point, the mother ripped open one of the packets and handed it to her oldest child. “They didn’t talk or express any emotion. They just kept walking,” Jessen notes. “Once you are past a certain point of exhaustion, there is simply no energy to spare to get emotional.”

In 1890, the philosopher William James drew a distinction between the “crying fit” – a psychological event accompanied by “a certain pungent pleasure” – and the much less bearable sensation of “dry and shrunken sorrow”. Some experiences, it seems, are too bleak for tears. Former inmates of Nazi concentration camps have reported, sometimes guiltily, that they did not weep during their ordeal. At a war-crimes trial in May 2015, Susan Pollock, a Hungarian Holocaust survivor, recalled her dry eyes as she watched her mother being despatched to the gas chamber. “I wasn’t crying,” she said. “I just wanted to recede into myself, never to be seen.”

We might imagine that on the flood map of world history, certain infamous fields in eastern Europe would register as brackish lakes. But the greater volume of those tears, I suspect, would have been shed not by the victims but by those who came as an act of remembrance, once the furnaces had cooled. Pollock considered herself temporarily dehumanised by her life in the concentration camp. But in an Auschwitz or a Treblinka, what would be the purpose of tears? They would be as superfluous as a pair of diamond earrings.

You don’t have to invoke Holocaust and famine to discern the luxurious nature of weeping, or the existence of an economy of emotion in which some are privileged to demonstrate their feelings and others are not. It’s there on the literary and historical record, legible in the scorn that Odysseus feels for his son Telemachus until his boy has given him a big Greek man-hug and produced proper princely tears; in the response of the 19th-century French doctors who were baffled to observe the nervous, weepy symptoms of hysteria – the malady of refined females – in muscular working-class railway workers who had been injured on the tracks. It’s there, too, in contemporary culture. If politicians cry, the value of their tears is assessed by commentators who act like jewellers, squinting for signs of fakery and paste. When, in the first week of this year, Barack Obama wept as he urged Congress to support tighter gun controls, the cameras caught every scintilla of his emotion, their clicks cascading like rain on the White House roof. Sympathetic observers celebrated these as moral tears, saltwater proof of the truth of his arguments. His enemies went on Twitter and Fox News to make allegations about onion juice on his fingers.

Statesmen and -women occupy a professional field in which the expression of emotion is permissible. For others, tears are unaffordable. The medical dispatchers who answer emergency calls receive training to help them discount their emotions as they advise those who have swallowed pills, or whose babies have stopped breathing. We would consider it a dereliction of duty if surgeons, nurses, police officers and soldiers wept during working hours. They have surrendered their right to cry in the same way that other employees might sign away their expectation of fixed hours or sick pay. Their restraint gives us the space to express our pain or gratitude, which we buy from them through taxation.


Saoirse Ronan in “Brooklyn” (2015)

As the world develops, so do its passions – often in ways that are not immediately comprehensible to its inhabitants. Social media are generating new rituals of anger and indignation. Reports of weeping icons have become bullets in the propaganda war between Ukraine and Russia. In Japan, an activity called rui-katsu (tear-seeking) has developed, in which customers gather to watch DVD weepies or pay for a wet-eyed escort to come to the workplace to embrace them. In China, older people bristle at the new emotional culture that they perceive leaking in from the West – exemplified by the moment in 2010 when a contestant on the reality show “Fei Cheng Wu Rao” asked a prospective partner if she’d be happy to go out for a date on a bicycle. “I’d rather cry in a BMW,” was the reply. It was an off-the-cuff remark, but seemed to articulate something about the future of China – and, therefore, the future of the world.

The history of the emotions is a young discipline. It is at the very beginning of its investigation into the long story of our feelings. “Are we”, asks Thomas Dixon, “writing the history of something that has always been the same fundamentally in the human mind being expressed and interpreted in different ways? Or are we, as most of us who do it would think, discovering the historicity of the human mind? Discovering that you can’t feel just any old way; you only feel the way you do because of your language, your experience, your family, your upbringing, your social institutions, your political institutions.” If we could acquire that perspective upon ourselves, how would it change us? Who would we become, if we could chart the flow of oceans of tears, measure the dry breadth of those famine lands, experience our feelings as events in history as well as in our bodies? People, hopefully, who might look back upon these times, their scuffed old artefacts and antique ideas, and shed a generous and sentimental tear.

Matthew Sweet presents BBC Radio 3’s “Free Thinking” and Radio 4’s “The Philosopher's Arms”. He is the author of “The West End Front”

This is a long-standing complaint of mine. I've read what you wrote but not the article yet, but Witcher 3 is a horrible example: it is revisionist history that makes absolutely no sense at all. Notice humans are bad and all the races are oppressed. Notice how the people saving everyone from the monsters are considered monsters themselves: hated, underappreciated, and underpaid. Just like modern revisionists want us to see their heroes, such as BLM protestors, college retards, etc. Also, notice how the ordinary people and all the "races" are usually the ones who create the monsters that the witchers, who are also hated, have to save them from. It is as clear an analogy as you can get to regular people being the cause of all that is bad in modern society, with their lack of white guilt and refusal to tear down institutionalized racism and capitalist pigism. The only way to make the monsters go away is to accept you are bad and join the cult.

My biggest complaint in fantasy is the monotheistic morals in all of them instead of real polytheism. If you worship the god of war, who wants you to wage war and rewards you for giving him war, you would not see war as bad or immoral. It would be a good thing, not something to be avoided. Even if you didn't worship the god of war, chances are your god wants sacrifices and other weird shit, and you don't see the god of war, or war itself, as bad. If war were bad, the god of war would be the bad god used to scare your children. War is bad when other people's soldiers wage it on you, but good when your soldiers wage it on others. There is no good and evil; that is a notion crazy people have caught. There are good things, like increasing your wealth, having more slaves, having a good harvest, etc. And bad things, like a god forcing you to kill your family, which in turn forces you to go on your twelve labors, etc.

The only game I can think of that did a good job of making an accurate setting that didn't reward our modern-day morals is Teudogar and the Alliance with Rome. Let your slave go? That just fucks you and the slave, as it would in that setting.

If you read old sci-fi and fantasy, you see an honest attempt by sane and intelligent writers to make a sensible and alien world. Elves are different and alien. They aren't just humans that love trees. They think differently than humans. Why else have elves if they are just like our old tribal savages, only better looking? The same goes for aliens in sci-fi. RPGs are horrible at aliens. They always have completely homogeneous societies and are just like some human culture at some point in time. Hell, feudal Japan is far more alien, with a more alien mind and culture, than any alien race in any RPG I can think of. Aliens always have the good ones, who have the same modern left-wing sensibilities and are woke. Aliens aren't.

The issue is leftism stifling art. In order to be a successful, non-hated artist you have to have left-wing ideology, and that ideology is based on a cult philosophy of "join us". Anyone can join; you just need to think as they do and accept what your betters tell you to think. Of course elves and dwarves would be hated, because humans hate everything and are bad. Of course 90% of the stories are about saving the earth mother from the evil humans who haven't been woke. Of course Vikings had a society filled with warrior-woman shield-maidens, and the non-evil Vikings loved homosexuals and transsexuals and had perfect Christian values, but don't call them Christian, because Christians are evil, even though some of their values are current woke values.

Left-wing entertainment is about the surface, the superficial, and about how to be a good cultist. It will always be devoid of complex imagination and will always reflect modern society, as it must celebrate wokeness and recruit the gullible.

Sci-fi and fantasy took a big shit in the '60s. Look at the stark difference between "The Mote in God's Eye" and its sequel to see how joining the fellow travelers can change writers from real sci-fi to fluff girls.
 

AwesomeButton

This is a long-standing complaint of mine. I've read what you wrote but not the article yet, but Witcher 3 is a horrible example: it is revisionist history that makes absolutely no sense at all. Notice humans are bad and all the races are oppressed. Notice how the people saving everyone from the monsters are considered monsters themselves: hated, underappreciated, and underpaid. Just like modern revisionists want us to see their heroes, such as BLM protestors, college retards, etc. Also, notice how the ordinary people and all the "races" are usually the ones who create the monsters that the witchers, who are also hated, have to save them from. It is as clear an analogy as you can get to regular people being the cause of all that is bad in modern society, with their lack of white guilt and refusal to tear down institutionalized racism and capitalist pigism. The only way to make the monsters go away is to accept you are bad and join the cult.
Your reading is very incomplete and one-sided here, also needlessly politicized. Like I said, I won't go into analysis of why I think Witcher 3 succeeds as a setting.

The issue is leftism stifling art. In order to be a successful, non-hated artist you have to have left-wing ideology, and that ideology is based on a cult philosophy of "join us". Anyone can join; you just need to think as they do and accept what your betters tell you to think. Of course elves and dwarves would be hated, because humans hate everything and are bad. Of course 90% of the stories are about saving the earth mother from the evil humans who haven't been woke. Of course Vikings had a society filled with warrior-woman shield-maidens, and the non-evil Vikings loved homosexuals and transsexuals and had perfect Christian values, but don't call them Christian, because Christians are evil, even though some of their values are current woke values.

Left-wing entertainment is about the surface, the superficial, and about how to be a good cultist. It will always be devoid of complex imagination and will always reflect modern society, as it must celebrate wokeness and recruit the gullible.

Sci-fi and fantasy took a big shit in the '60s. Look at the stark difference between "The Mote in God's Eye" and its sequel to see how joining the fellow travelers can change writers from real sci-fi to fluff girls.
I am skeptical towards the view that everything has some political motive, and even more skeptical towards the conspiracy theory that "leftists are trying to influence our culture by inserting their message into videogames". This is too reminiscent of other scares raised in other periods, and just as baseless.
 


Hi, I'm Roqua
I am skeptical towards the view that everything has some political motive, and even more skeptical towards the conspiracy theory that "leftists are trying to influence our culture by inserting their message into videogames". This is too reminiscent of other scares raised in other periods, and just as baseless.

McCarthyism was certainly not baseless, and neither is its modern-day equivalent. The methods are the same; only the cause has changed.

Your reading is very incomplete and one-sided here, also needlessly politicized. Like I said, I won't go into analysis of why I think Witcher 3 succeeds as a setting.

If you can't explain how I am wrong, that means I'm right.
 
