This post has the following content warnings:
some dath ilani are more Chaotic than others, but
Permalink

Broom swiftly sets them down.

Permalink

Presumably no one has lit Otolmens's oracle on fire but possibly no one has told him not to worry about it. 

 

She lights the papers. Presumably Security, now using a scry rather than an invisible person, has been watching over Keltham's shoulder and they'll have a lengthy deanonymizing debrief later.

Permalink

"Right then.  Leaving all that aside," until their plot-induced lack of mutual communication blows up catastrophically on the whole group later, "I thought today I'd try to speedrun a couple of years' worth of dath ilani lessons for children about how fairness in negotiation works.  On the theory that, first of all, Cheliax could stand to get a glimpse of how dath ilani's children's training works in general; and second, that maybe if an adult with average dath ilani intelligence hears about children's training in the abstract, they can just imagine that they went through that training themselves?"

"The reason I'm picking 'fairness' as the topic is because I'm going to be using those structures to negotiate equity, and those procedures do tend to hope that everyone has - mutual knowledge, common knowledge, stuff that everyone knows that everyone knows - about how 'fairness' works.  Before I start, if I can ask the group - what does the term that 'fairness' translates into, in Taldane, mean to people here?"

Permalink

.....a concept for stupid people who think they deserve more than they can claim and hold, Meritxell doesn't say. 

 

     "Getting what was agreed upon."

     "Trades where - neither side is getting cheated."

     "Rules that are applied consistently or impartially."

     "Everyone gets what they earned."

      

Permalink

"How can you tell how much somebody has earned?  If you make a one-of-a-kind magical item, what price should it sell for, so that neither side is being cheated?"

Permalink

"Whatever you can get someone to buy it for," says Meritxell.

Permalink

"My shirt is a one-of-a-kind relic from another plane; it has no standard market price.  In real life, I plan to never sell it, ever, though I might sell the ability to do science to it.  Suppose however that, relative to how wealthy I expect to someday be, my shirt, one of my only memories of dath ilan, is worth one million gold pieces to me.  In the sense that, if some insidious force was otherwise going to steal my shirt from me, I wouldn't pay any more than a million to protect it."

"Now suppose somebody else has a very weird magical spell that can take any relic of dath ilan, and immediately convert it into ten million gold pieces, no questions asked."

"Any price greater than a million gold and less than ten million gold is a mutually beneficial trade, in the sense that both of us are better off making the trade at that price, than not trading with each other at all.  But if my shirt sells for only a million and a thousand gold, I'm only a thousand gold better off, and the other person is around nine million gold better off.  If my shirt sells for ten million minus a thousand, the other guy has profited by a thousand and I've profited by a bit less than nine million."

"Trading at all, at any price in the range, is mutually beneficial; we're both better off.  But on top of that event, there's another event, a question of the exact price, in which my being one gold piece better off makes the other person one gold piece worse off."

"How do we set that price, then?  Aren't we locked into an adversarial game where it's my interest to say, 'I'll only sell at ten million minus one', and it's their interest to say, 'I'll only buy at one million plus one'?  Why would we say anything else, when saying anything else just makes the other person better off at our own expense?  But if we both think like that, the trade doesn't occur at all."

"What price is fair?  Or to put it another way, how can two people like that agree on a trade at all?  How does Golarion, how does Cheliax, think about that?"
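Keltham's shirt example can be sketched numerically (an illustration using the values from the text; the function name is mine):

```python
# Seller values the shirt at 1,000,000 gp; the buyer's spell converts any
# dath ilan relic into 10,000,000 gp.  Any price strictly between these
# two values is mutually beneficial; the price only decides the split.
SELLER_VALUE = 1_000_000
BUYER_VALUE = 10_000_000

def surplus(price):
    """Return (seller_gain, buyer_gain) for a sale at `price`."""
    return price - SELLER_VALUE, BUYER_VALUE - price

for price in (1_001_000, 5_500_000, 9_999_000):
    s, b = surplus(price)
    assert s > 0 and b > 0  # every price in the open range benefits both
    print(f"price {price:>9,}: seller +{s:,}, buyer +{b:,}")
```

Note that the total surplus is nine million gold at every price; moving the price only slides gold from one party's gain to the other's, which is exactly the adversarial sub-game Keltham describes.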

Permalink

It hasn't come up in their books about Taldor yet but Carissa's met Taldane adventurers and in fact tried to trade them things and the answer is the same there as in Cheliax. "You barter. You say 'why, I don't see why I should give up this shirt for a coin less than twenty million gold', which communicates 'I'm open to negotiating a trade but would need to be persuaded it's the best trade I can get'. And the other person says 'twenty million! I have a hard time believing even such a sentimental item is worth more to you than an entire week of Nefreti Clepati's time, during which she could make a dozen duplicates of the shirt and make you a personal demiplane besides, and that would only be eight million gold. And really it seems to me like this trade is worth your time even for one duplicate of the shirt and the personal demiplane, and that would be only eight hundred thousand gold, which is what I'm offering.' Which communicates 'I'm open to negotiating a trade but would need to be persuaded it's the best trade I can get'.

And you say 'my, imagine what Nefreti Clepati would say if you tried to lowball her prices like that! I'm not sure this conversation is worth my time, if my shirt is worth so little to you.' Which communicates - and you might be bluffing, you're allowed to bluff when you're doing this - that the quoted price is well outside the range that's worth it to you, and they'd better indicate that they think there's overlap between their willingness to pay and your willingness to sell.

And you iterate on this and then end up settling somewhere, the exact place depending on how competent at bartering you are and on the range of trades you both like."

Permalink

"...fascinating."

Permalink

Consider - as a dath ilani might consider it - the problem of a dath ilani cast into a strange new universe, who must trade with the aliens found there.

(It is in fact quite a common trope, in dath ilani science fiction.  But it wouldn't particularly occur to Keltham to classify this situation as that trope.  Cheliax is way too legible.  They have a currency of 'gold pieces' that they cheerfully translated for him into unskilled-labor-years.  Golarion would need to be a lot weirder before it was good Trade With Aliens science fiction.)

The aliens, one may suppose, have a biological-evolutionary setup similar enough to dath ilan's that they have epidemics, caused by viruses and bacteria and parasites.  Suppose the aliens don't know about viruses and bacteria and parasites; or vaccines or antibiotics or filtering masks or possibly even sterilization.  Nor about how one should use experiments to determine whether a disease is airborne or waterborne or touch-transmitted or transmitted through wastewater contamination or is carried by smaller or larger animals.

The dath ilani, then, knows something which this alien Civilization might find of great value.  The alien Civilization can perhaps pay for this knowledge, with some alien means of payment.

Perhaps the alien Civilization, being nonhuman (or just non-dath-ilani) tries to be stingy about it; to lowball the dath ilani; to buy their knowledge at a cost of, say, a pile of shiny metal, or title to one island in the ocean.  Depending on the exact backstory of how the aliens came to try this, and whether it was in some sense the fault of that whole civilization or just a part of it, even a non-Keltham dath ilani might well say, "Screw you, pay me."

That, too, the dath ilani are taught; in Golarion terms, the difference between Lawful Good and Lawful Stupid.

But then how high does the price need to be, exactly, for the dath ilani to agree to the trade?  By what system do you determine an answer to that?

Permalink

The notion of a fair agreement, a fair trade, a fair division of gains from trade, a fair price, plays a central role in any civilization that relies on its citizens' conscious understanding of their activities.  Dath ilan teaches the Law (mathematical structure) underpinning fairness, very carefully, and from childhood.  After all, if lots of people ended up with widely different notions of what was fair, Civilization would stop trading with itself.

In turn, the notion of 'fair trade' relies on understanding the notion of trade in the first place.

Permalink

'Jellychips', a staple of dath ilani lessons to young children, are small lumps of edible flavored gel.  Jellychips come in distinct appearances, colors, shapes, and flavors; almost always, everything with exactly the same appearance has exactly the same flavor.  Ten jellychips might mass as much as one peanut; they're meant to implement a burst of tasty flavor that's just enough to be present and pleasant.  They're tiny so that children don't end up getting all of their calories from economics lessons.

To teach the notion of trade, you begin by passing out jellychips to children, and let them experiment a bit to find out which of their favorite flavors have which external appearance.  Then ask the children to write down which flavors they like more or less than others.  Compare the lists; observe to the children that different children tend to like different flavors best.  (There is in fact a jellychip selection algorithm, based on previously observed food preferences among the kids, which makes sure that this happens.)  Observe to them that, by trading jellychips with each other, they could all end up with more of the kind of jellychips they want.

Let them trade, a bit, as they desire.  So long as they haven't been introduced to any formal concepts of 'fairness' to complain about, this part usually does not go too poorly, among dath ilani children.  They'll find jellychips that they have, and don't want; and look for somebody else who wants those, and has some they want; and trade 1-for-1.  If you let them play longer they'll start to notice triangular trades that no two children can complete, and do those too, but still usually 1-for-1.

Permalink

When the first rush of trading has died down, introduce to the children the concept of a multi-agent optimal arrangement: an arrangement such that it's not possible to redistribute the jellychips in any way that leaves all of the children better off simultaneously.  Ask them if they think their current arrangement got there.

Now the kids have a concept of a social goal to aim for, a way in which they can be collectively winning at trade or performing subpar; and the arguments will become a bit more heated.

(Especially if you've sorted all the kids to have a certain sort of personality, and usually therefore all be boys; because sometimes different children learn different things, and some of those things are best learned by similar children all together.)

It usually doesn't take long for one boy to start telling another that they need to make a trade, in order to get the classroom* into what they've figured as the optimal arrangement.

Of course the older kids immediately step in at this point, and remind everyone that, by the definition of multi-agent-optimality, you should never need to force somebody to trade in order to get to a jellychip arrangement that's better for everyone; the target state should be better for the person who's making the trade too.


(*)  Not actually a 'room' in the sense of being indoors; children need to be exposed to outdoor light levels in childhood in order to not grow up nearsighted.  The surface area required for children to spend enough of their day outdoors is currently the limiting factor on the urban density of the Great City (called also Central City and Default).  This is one of the places where public will and private incentives are in conflict, since there's a pressure towards ever-greater urban density in the center; but if this were permitted, soon it would be mostly childless people who could afford to live in Civilization's dense center.  For that and other reasons, it's been decided that it's better to limit the Great City's density and keep Civilization more spread out.  To find a solid expanse of skyscrapers, you'd need to visit a major city with few or no children per capita, like Big Quiet, or Erotown.

Permalink

After this enlightenment, an adult Watcher comes forth, and argues to the younger children that the whole point of trading things is that different people put different values on the same goods: if you-1 like black jellychips and have blue jellychips, and you-2 have black jellychips and like blue jellychips, then you can both do better by trading jellychips with each other.

This, the Watcher argues solemnly, is the point of trade, and the whole reason why people trade with each other: because they get different enjoyments from owning the same things, so that they can both become better off by passing the same fixed goods back and forth between themselves.

The younger children are asked if they first-order believe that.

None will say 'yes', at that point.  The most overeager ones will say 'No!' but then be unable to explain why not.  Most kids will give the brief Baseline comeback that colloquially translates to 'I probably would have believed it, if I wasn't pretty sure you were trolling me, though I haven't seen anything that I suspect is the real argument against it'.

(A dath ilani childhood tends to make one grow up suspicious of things that grownups say with great solemnity.  Civilization considers this a desirable outcome, which is good, because it sure is the outcome they're getting.)

Regardless of their answers, the children are then asked whether people who all got the same enjoyments from the same goods would never trade with each other.  And so that pathway of learning continues.

Permalink

On a separate track through the lattice of knowledge, a new idea may now be introduced on those foundations, the notion of a fair trade between black jellychips and blue jellychips. 

It begins by showing the children a way to rearrange their understanding of jellychip preferences, as a quantitative relation and not just an ordering, through the concept of indifferences, which state equalities of preference.  Not just, "I like purple jellychips more than black jellychips, and black jellychips more than blue jellychips" but "I am indifferent between having 5 purple jellychips, or 6 black jellychips, or 8 blue jellychips."

But then, of course, you might be able to execute multi-agent-beneficial trades that aren't 1-to-1.  If someone is indifferent between having 6 black jellychips and 8 blue jellychips, then trading 7 blue jellychips for 6 black jellychips will leave them better off than before.  Right?

A lot of children will say 'No!' at this point, and try to find some reason why that couldn't possibly be valid, because they know how economics lessons work, by this point in their lives.  They expect that somebody's about to lead them down a pathway that takes them down to 6 black jellychips and then 5 purple jellychips and so on until they only have one jellychip left.  

But you can, with a bit more work, convince them that it's totally valid to want 6 black jellychips more than 7 blue jellychips, and valid to trade things according to your wants, and tell them that in fact this does not necessarily always expose them to a set of clever trades that take them down to 1 jellychip which, it will then be proven to them, they must want to trade for nothing.  That's not actually going to happen!  You're thinking it's going to be the point of the economics lesson, but it's not!  Adults actually trade 7 hours of labor for 6 fancy shirts all the time, without ending up with 0 shirts, and this is isomorphic.

The children are then asked if they think they can get to a more multi-agent-optimal state by trading uneven numbers of jellychips amongst themselves.

The children approach this warily; or with a burst of initial enthusiasm that fades, after many children prove rather suspicious of attempts to get them to trade more jellychips for less jellychips.

Dath ilan having an average Intelligence of 16 or 17, it doesn't take long for somebody to point out that, even if one person likes some jellychips more than others, that's no reason for them to end up with fewer jellychips.  Other kids also like some jellychips more than others.  Why shouldn't they be the ones to end up with fewer jellychips, and I be the one who ends up with more, if that's how we're going to play it?

Why yes, Keltham was the first one to say it in those terms, in his own class, when he was very young.

Permalink

Suppose that Keltham is indifferent between 3 black jellychips and 4 blue jellychips, and that Limyar is indifferent between 2 blue jellychips and 3 black jellychips.  Suppose they both start with 12 black and 12 blue jellychips.

Then for Keltham to trade his 12 blue jellychips, for 10 black jellychips from Limyar, would leave them both better off.

And for Limyar to trade his 12 black jellychips, for 9 blue jellychips from Keltham, would leave them both better off.

And for Keltham to trade his 12 blue jellychips for Limyar's 12 black jellychips would leave them both better off.

All three of these are mutually beneficial trades.

But which of them is fair?  Or fairest?

If you're the sort who agrees to just any trade that's mutually beneficial - like Limyar, in this classroom, had been earlier arguing people ought to do - then you know what Keltham is going to do to you?

That's right.  Keltham is going to offer 9 blue jellychips for your 12 black jellychips, you're going to accept, Keltham is going to carry out the trade, and then Keltham is going to angrily throw another 3 blue jellychips at you and yell that you're being stupid.
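The Keltham/Limyar example can be checked directly (a sketch of mine, expressing each boy's stated indifference as "value of one black jellychip, measured in blue"):

```python
# Keltham: indifferent between 3 black and 4 blue -> black worth 4/3 blue.
# Limyar:  indifferent between 3 black and 2 blue -> black worth 2/3 blue.
KELTHAM_BLACK_IN_BLUE = 4 / 3
LIMYAR_BLACK_IN_BLUE = 2 / 3

def keltham_gains(blue_given, black_received):
    """Keltham's net gain, in blue-jellychip units, from the trade."""
    return black_received * KELTHAM_BLACK_IN_BLUE - blue_given

def limyar_gains(blue_received, black_given):
    """Limyar's net gain, in blue-jellychip units, from the trade."""
    return blue_received - black_given * LIMYAR_BLACK_IN_BLUE

# The three trades from the text: (blue from Keltham, black from Limyar).
for blue, black in ((12, 10), (9, 12), (12, 12)):
    assert keltham_gains(blue, black) > 0  # Keltham strictly better off
    assert limyar_gains(blue, black) > 0   # Limyar strictly better off
```

All three trades pass, which is the point: mutual benefit alone doesn't pick out a price, any more than it did for the shirt.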

Permalink

If you step out and look at that problem from a wider angle, it's pretty much the same issue that holds between the dath ilani and the alien civilization, considering the price of medical knowledge.

If the alien civilization offers some tiny lowball offer - like, say, a supply of food and water - in exchange for every last scrap of your knowledge, and there's no other civilization around to trade with, you and they will both be better off if you accept, compared to if you don't.

But if you accept offers like that one, food and water is the most you can expect to be offered, if the aliens are less Lawful Neutral than Keltham.

(Even if there's two alien factions around to trade with, you can't quite rely on them bidding against each other.  What if they coordinate with each other instead?  There's a noticeable amount for both of them to gain, if they both agree to offer you only food and water, instead of a higher price.)

Permalink

Another game is now introduced to the children, played with a single flavor of jellychip.  It is not, in dath ilan, called the 'Ultimatum Game', but the actual name they have for it is the 'Final Trade Offer Game', which isn't all that different.

One child gets a dial, with settings from 0 to 12.  Another child gets a button.  The first child picks a setting on the dial and locks it in.  The second child then chooses whether to press the button.  If the second child presses the button, the first child gets as many jellychips as the dial indicates; the second child gets jellychips equal to 12 minus the number on the dial.  If the second child doesn't press the button, they both get nothing.

Which is to say: the first child proposes a division of a gain of 12 jellychips, where they get some part, and the other child gets the rest.  The second child can approve the division, or refuse it; and if they refuse, both get nothing.

If you run this lesson on dath ilani children, virtually everyone offers a 6:6 jellychip split and everyone accepts it.

At least, that's what they do on round zero, the initial round where they try things the simple way to verify their starting assumptions.  Then they start experimenting.  It's not so much that they're being selfish, and trying to figure out what they can get away with; it's that they're figuring there must be some clever point to this game, and you're not going to find it if you just offer 6:6 every time.

Some kids try out accepting splits of 7:5.  Other kids are like, ok then, and offer them 7:5 splits, which usually get rejected if, like, people are going to make a thing out of that, right.  Some try offering compacts to trade 7:5 splits for 5:7 splits, but there's no guarantee that any two kids will be matched up again in the future.

At this point the older kids step in and say that the point of the game is drifting away from the reality it's intended to model, and everybody nods and waits for the next part.  (Of course there's a next part.  There's been a weird game and no stunning insight about it has been presented yet.  They've been to lessons before.  Older people aren't going to make you execute a weird pointless procedure and then not have some stunning insight to offer you as payment; kids would stop going to lessons, if that bargain was often violated.)

Before the next part, though, the older child teaching asks what the kids think is probably the ideal or correct thing you're supposed to do if somebody offers you a 7:5 split, not as a game, but in real life.

Keltham, of course, said to reject the offer.  Some other kids agreed the offer should be rejected.  Some claimed that you should accept it, but everyone should be angry at the person and whoever went next with them should offer them 7:5.  Limyar claimed that you should always accept it, even if the other person offers 11:1, because everyone would end up with fewer jellychips if you rejected than if you accepted, so rejecting the offer couldn't be multi-agent-optimal.  Keltham asked Limyar if he actually believed that.  Limyar said no but he was going to go on saying it anyways to annoy Keltham.

The kids argue about it for a while, and then the demonstration moves on.

Permalink

The next stage involves a complicated dynamic-puzzle with two stations, that requires two players working simultaneously to solve.  After it's been solved, one player locks in a number on a 0-12 dial, the other player may press a button, and the puzzle station spits out jellychips thus divided.

The gotcha is, the 2-player puzzle-game isn't always of equal difficulty for both players.  Sometimes, one of them needs to work a lot harder than the other.

Now things start to heat up.  There's an obvious notion that if one player worked harder than the other, they should get more jellychips.  But how much more?  Can you quantify how hard the players are working, and split the jellychips in proportion to that?  The game obviously seems to be pointing in the direction of quantifying how hard the players are working, relative to each other, but there's no obvious way to do that.

Somebody proposes that each player say, on a scale of 0 to 12, how hard they felt like they worked, and then the jellychips should be divided in whatever ratio is nearest to that ratio.

The solution relies on people being honest.  This is, perhaps, less of a looming unsolvable problem for dath ilani children than for adults in Golarion.

Once this solution is produced and tried once, the older children congratulate the kids on having solved the first layer.  On to the second layer!

In the second layer, some children get handed sealed cards before each game, telling them whether to be honest about it, or to try to grab a little more for themselves.  (Though remember, say the older children, that this is all only a game; we are trying to ask how Civilization can be robust to bad people, not teach you to be bad people yourselves; the thing is, you see, that on scales much larger than this class, there really will be some bad people.)

And that means the child who sets the dial, or the child who presses the button, can't trust the other to be honest.  Even if the other child's sealed card didn't say to be dishonest, the first child has no way of knowing that.

(Dishonest people really do complicate things, don't they?  Just the fact that they exist makes things harder on everyone else, because they don't know who the dishonest people are.  But that's part of the difficulty of constructing an adult Civilization, one that has to scale to numbers beyond two dozen or sixteen gross.)

Permalink

The children start having to think harder, at this point.  There are kids playing hard on puzzle-games, and hearing estimates of the other player's labor-effort that don't sound quite right; proposing splits afterwards, and seeing those splits rejected, and both getting nothing.  Some of the kids start to get angry at each other.  Others are trying to come up with a brilliant general solution; and, if they're wise, they know they haven't found one.  Some children are not so wise, but they can't get anyone else to go along with their brilliant general solution.

Keltham plays through with as much cold and steely determination as a seven-year-old can muster, offering exactly what he thinks is fair, rejecting anything he thinks is less than fair; feeling awful when the other kid yells at him that he was being honest, but not swerving from his course.  He can trust himself; he cannot trust the other.  When his card tells him to be dishonest, Keltham gives ridiculously huge estimates for his own labor, and hopes the other child is wise enough to know that Keltham is, must be, lying.  Sometimes he's told to be dishonest and he has to pick the split himself, and then he gives a huge estimate and pretends he believed the other kid's huge estimate.  Sometimes the other kid doesn't catch on in time, and then Keltham has to offer an unfair split or tap out of the game and metagame entirely, which feels like failing even more.  Sometimes the unfair split gets rejected, and sometimes it gets accepted, which is worse.  Keltham sets aside all his unearned chips to redistribute after the lesson ends.  It's a good thing this is only a game, because living life like this would be awful.

Lessons end for the day.  It is sometimes good to let children dwell for a time on problems that don't have known solutions yet, or realize how awful life can become when not everyone has deduced the governing Law.

Permalink

(Children actually do better, dath ilan has found, if you try having them play this elaborate game without having previously introduced the concept of a multi-agent-optimal boundary, or the notion of the Ultimatum Game, or the question of fair trades between unequal numbers of jellychips.  Then they just play and negotiate, without a concept that they are Failing To Reach Multi-Agent Optimality, or the notion that children who disagree with them are Refusing To Make Mutually Beneficial Trades, or that the offered trade was Unfair.  The children are less distracted by ideas they don't know how to operate, goals they don't know how to succeed at, and ways to argue that people who disagree with them are doing some particular thing objectively incorrectly.  There is a valley of competence as a function of knowledge in this case, where knowing just a little can hurt you.)

Permalink

When the children return the next day, the older children tell them the correct solution to the original Ultimatum Game.

It goes like this:

When somebody offers you a 7:5 split, instead of the 6:6 split that would be fair, you should accept their offer with slightly less than 6/7 probability.  Their expected value from offering you 7:5, in this case, is 7 * slightly less than 6/7, or slightly less than 6.  This ensures they can't do any better by offering you an unfair split; but neither do you try to destroy all their expected value in retaliation.  It could be an honest mistake, especially if the real situation is any more complicated than the original Ultimatum Game.

If they offer you 8:4, accept with probability slightly further below 6/8, so that they do even worse in their own expectation by offering you 8:4 than by offering 7:5.

It's not about retaliating harder, the harder they hit you with an unfair price - that point gets hammered in pretty hard to the kids, a Watcher steps in to repeat it.  This setup isn't about retaliation, it's about what both sides have to do, to turn the problem of dividing the gains, into a matter of fairness; to create the incentive setup whereby both sides don't expect to do any better by distorting their own estimate of what is 'fair'.

They play the two-station puzzle-games again.  There's less anger and shouting this time.  Sometimes, somebody rolls a continuous-die and then rejects somebody's offer, but whoever gets rejected knows that they're not being punished.  Everybody is just following the Algorithm.  Your notion of fairness didn't match their notion of fairness, and they did what the Algorithm says to do in that case, but they know you didn't mean anything by it, because they know you know they're following the Algorithm, so they know you know you don't have any incentive to distort your own estimate of what's fair, so they know you weren't trying to get away with anything, and you know they know that, and you know they're not trying to punish you.  You can already foresee the part where you're going to be asked to play this game for longer, until fewer offers get rejected, as people learn to converge on a shared idea of what is fair.

Sometimes you offer the other kid an extra jellychip, when you're not sure yourself, to make sure they don't reject you.  Sometimes they accept your offer and then toss a jellychip back to you, because they think you offered more than was fair.  It's not how the game would be played between dath ilan and true aliens, but it's often how the game is played in real life.  In dath ilan, that is.
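The acceptance rule the older children teach can be sketched as follows (an illustration of mine; the "slightly less than" margin is an assumed small constant, and the text leaves its exact size unspecified):

```python
EPSILON = 0.01  # the assumed "slightly less than" margin

def accept_probability(theirs, fair=6):
    """Probability of accepting an offer where the proposer keeps `theirs`
    out of 12, when `fair` would be an even split."""
    if theirs <= fair:
        return 1.0                      # fair or generous: always accept
    return fair / theirs - EPSILON      # unfair: accept with p < fair/theirs

# The proposer's expected jellychips from each possible greedy offer:
for theirs in (6, 7, 8, 11):
    expected = theirs * accept_probability(theirs)
    assert expected < 6 or theirs == 6  # no unfair offer beats offering 6:6
```

The proposer's expectation falls as the offer gets greedier (5.93 at 7:5, 5.92 at 8:4, and so on), which implements the point the Watcher hammers in: the rejecter isn't retaliating harder, just ensuring the proposer can never expect to profit by distorting their estimate of what's fair.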

Permalink

After that came the part where Keltham's learning-group was introduced to their first sophisticated trading-game, with tokens that produced varying quantities of jellychips, but only when held in proximity to other tokens, and large enough groups of tokens could produce more tokens.

Despite their best efforts and the lesson they'd just learned - and, since they were still young boys, after a lot of shouting, beyond a certain point - the nascent market had soon shut down almost entirely over refused trades, caused by disagreements about what was 'fair'.

During the game's post-mortem, it was eventually figured out (with some nudging and hinting from the supervising older-children) that children with rarer tokens had tended to think that, in any fair division, a token's value ought to be weighted by its scarcity; children with tokens that produced lots of jellychips (even if they required some other tokens to be nearby to work) tended to think that direct jellychip production was the obvious starting anchor for weighing economic value; and children with tokens that produced other tokens argued themselves to have the only goods that mattered in the long run, and that you'd need a lot of lesser tokens to trade fairly for one of those.

This begins the pathway of learning that leads to market prices, the other way of setting prices; in which larger Civilization has a collective interest in seller prices ending up close to the marginal cost of production, so that as many trades as possible occur.

Children who master the complications here have officially passed Financial Literacy Layer 2, and can have their own investment accounts*, which was the main reason Keltham was going through this whole lesson-sequence at age seven instead of age eight.


(*)  Having an investment account in dath ilan is the equivalent of having a 'bank account' in other places, rather than a mark of greater financial maturity than that.  Dath ilan doesn't particularly use, as a store of value, currency-denominated packaged bank debt with fixed returns.  Value is stored mostly in equities.  When you write a check against your investment account, divisible fractions of equity are automatically sold out of it in some medium of exchange, and automatically reinvested in the receiver's account, according to the (simple) autoselling and autobuying algorithms on both sides.  If you want to pay for less volatility in your assets, you buy insurance on the equity, so that somebody agrees to buy the asset from you if it drops below 80% of its original-purchase price; and the price of insurance like that is a risk signal.

Permalink

When it comes to selling knowledge to aliens, to be clear, Financial Literacy Layer 2 is not going to get you there.  If the answer across every plausible premise were trivial and similar, that trope wouldn't be such a staple of dath ilani economic fiction.

Thankfully, Golarion is not nearly weird enough for Cheliax to be composed of aliens in the relevant sense; the Chelish have money and will tell you how many unskilled-labor-hours it corresponds to.  The most you have to worry about is that somebody gave them a dishonesty card - which does mean you have to do your own calculations about what's fair, and not just ask them.

When you are not dealing with alien aliens, when setting prices with those aliens is not the point of the story, a normal dath ilani would consider the solution obvious.  There comes a saturation point beyond which individuals cannot realistically use any more money to become happier themselves, for usual reasons of just-noticeable-differences being a mostly-constant fraction of how much you already have, which implies utility roughly logarithmic in wealth.  If the aliens offer to pay you that much, asking them to cough up more would mean that a number of poorer aliens would all have to give up chunks of utility that loom larger for them, so that you could get much smaller amounts of utility; and even if that's fair, it isn't Good.  If the aliens offer an ultimatum for less, turn them down with very high probability; they're trying to give you far less than you're worth.  Would Civilization offer less than a billion labor-hours to an alien bearing knowledge of how to eliminate whole swathes of diseases hitting large sections of the population?  (No.)
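The "just-noticeable-differences are a constant fraction of what you already have" argument can be checked numerically: if utility is logarithmic in wealth, then doubling wealth buys the same utility gain at every scale, so a fixed sum of money matters far more to a poorer party.  A minimal sketch (the wealth figures are arbitrary illustrations, not from the text):

```python
import math

def log_utility(wealth: float) -> float:
    """Utility roughly logarithmic in wealth, per the constant-JND argument."""
    return math.log(wealth)

# Doubling wealth yields the same utility gain whether you start poor or rich:
gain_poor = log_utility(2_000) - log_utility(1_000)
gain_rich = log_utility(2_000_000) - log_utility(1_000_000)
# Both gains equal ln(2); but the same *absolute* transfer of 1,000 units
# is a full doubling for the poor party and a rounding error for the rich one.
gain_transfer_to_poor = log_utility(2_000) - log_utility(1_000)
gain_transfer_to_rich = log_utility(1_001_000) - log_utility(1_000_000)
```

This is why asking saturated individuals to extract still more from many poorer parties loses aggregate utility even when the raw sums balance.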

Of course, in a normal dath ilani economic-fiction isekai story, the entire world you end up in is not insane in every part; there may be one insane point of departure with some insane consequences lawfully extrapolated, but the author is not going to throw an entire insane world at you; it wouldn't be credible.

A normal dath ilani, thrown into another world, does not come in expecting to need enough money to make lots and lots of important investments that the natives haven't made because the natives are insane.  They're expecting to find an alien efficiency of no simple ways to make everyone collectively much wealthier, not the howling absence of that efficiency.

Keltham wasn't expecting Golarion either.

He did, however, catch on in short order to what he currently thinks is the magnitude of the problem.

It is possible he will need a lot of money to solve it.

Permalink

If you've actually got to negotiate with very humanlike aliens, you need Financial Literacy Layer 5; or at least, Keltham hopes that's what he needs, because that's what he has.  This gives him access to a spotlighted permutation-based method for determining the fair contribution of one actor to a multiagent process.  It's not spotlighted nearly as hard as, say, the Probability axioms, or Validity; but it's pretty much the only spotlighted method for that kind of fairness, and Civilization is somewhat hopeful that aliens will use it too.

The permutation-based method says to consider how much marginal added value an agent produces, by being added to a collection of other agents, when considering every possible order in which to add all agents into the evaluation.  If, for example, two people are both needed to complete a task worth 10 jellychips, and it can't be completed at all without both of them, then there's two possible permutations:

Permutation 1:
  Alis:  Alis alone receives 0 jellychips; her marginal value, added to the empty set, is 0.
  Alis+Bohob:  After adding Bohob, the payoff is 10 jellychips; Bohob's marginal product, added to Alis, is 10.

Permutation 2:
  Bohob:  0 jellychips.
  Bohob+Alis:  10 jellychips.

Averaging the marginal products together across all permutations, the method says that Alis and Bohob both receive 5 jellychips.

Yes, this is a very simple answer to be produced by all that logic, but the point is that it generalizes.
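The permutation-averaging procedure described above (known elsewhere as the Shapley value) can be sketched directly; the function names are illustrative, and the two-agent task is Keltham's own Alis/Bohob example:

```python
from itertools import permutations

def permutation_fair_shares(players, value):
    """Average each player's marginal contribution over every join order.

    `value` maps a frozenset of players to that coalition's total payoff.
    """
    totals = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = set()
        previous = value(frozenset(coalition))
        for p in order:
            coalition.add(p)
            current = value(frozenset(coalition))
            totals[p] += current - previous  # p's marginal product here
            previous = current
    return {p: total / len(orders) for p, total in totals.items()}

# Both agents are needed to complete a 10-jellychip task;
# neither produces anything alone.
def task_value(coalition):
    return 10 if {"Alis", "Bohob"} <= coalition else 0

shares = permutation_fair_shares(["Alis", "Bohob"], task_value)
# {'Alis': 5.0, 'Bohob': 5.0}
```

The even split falls out as the average over both permutations, exactly as in the hand computation; the same function handles any number of agents and any coalition-payoff rule, which is the sense in which the method "generalizes" (at the cost of summing over n! orderings).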
