Permalink

Nope, that's Cheliax's solution. 

Permalink

Keltham presents the standard solution (in dath ilan) to the Ultimatum game.  If they offer you 6:6, accept with probability 100%.  If they offer you 7:5, accept with probability slightly less than 6/7.  If they offer you 8:4, accept with probability slightly less than 6/8.

Does anyone want to try and guess the reasoning behind that solution, in advance of it being stated?
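
A minimal sketch of the acceptance rule as stated, assuming a 12-unit pie with 6:6 as the fair point (the epsilon and function names are illustrative, not anything Keltham specifies); it makes the proposer's incentive visible, since any split greedier than 6:6 earns a strictly lower expected take:

FAIR_SHARE = 6
TOTAL = 12
EPSILON = 0.01  # "slightly less than", so greed is strictly penalized

def acceptance_probability(offered_to_you):
    # Accept fair-or-better offers outright; otherwise accept with probability
    # just under FAIR_SHARE divided by what the proposer kept for themselves.
    if offered_to_you >= FAIR_SHARE:
        return 1.0
    return FAIR_SHARE / (TOTAL - offered_to_you) - EPSILON

def proposer_expected_take(offered_to_you):
    return (TOTAL - offered_to_you) * acceptance_probability(offered_to_you)

for offer in (6, 5, 4, 1):
    print(offer, round(acceptance_probability(offer), 3),
          round(proposer_expected_take(offer), 3))
# 6 -> accepted with 1.0, proposer expects 6.0
# 5 -> accepted with ~0.847 (just under 6/7), proposer expects ~5.93
# 4 -> accepted with ~0.74 (just under 6/8), proposer expects ~5.92
# 1 -> accepted with ~0.535 (just under 6/11), proposer expects ~5.89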

Permalink

 

"I see why it creates good incentives for the person who is deciding splits," Meritxell says. "...I don't see why the person deciding whether to accept splits or not has any incentive to do it, if they can't establish a reputation for it, and it's hard to establish a reputation for doing something sometimes."

Permalink

"Well, reputation-wise, it's definitely easier to have a reputation for doing something if everyone in your entire Civilization got trained to do it at age seven or eight."

Permalink

" - I see why you'd want to require everyone to do it, yeah. It'd be hard to catch them fudging, if we're talking about random peasants, but maybe that still keeps the incentives reasonable."

Permalink

"I think this is a place where I have the same reaction you had to burning down schools?  People don't need to be required to behave like that to be accepted for residency in a city, it's just in their own interests to behave that way.  Nobody wants to get a reputation as that weird person who accepts 11:1 splits and is very easy to take advantage of.  At least, nobody I know wanted it."  Limyar doesn't count, he was totally trolling.

Permalink

"The thing I'd expect people to be tempted to do, especially in a big city where they don't have much individual reputation, is make a show of using the randomization but take the split ten percent more of the time," says Meritxell. "So you get a bit more money but it's not obvious you're doing something exploitable, which means it isn't exploitable. But obviously it's bad for everyone if everyone can predict that lots of people will do that, so we will be better served if the Crown prohibits that."

Permalink

"Suppose I put to you:  Two gods interacting in the Ultimatum game would use the pattern I just showed you, even if they had no reputations and would never meet again."

Permalink

" - yes, of course."

Permalink

"Civilization in dath ilan usually feels annoyed with itself when it can't manage to do as well as gods.  Sometimes, to be clear, that annoyance is more productive than at other times, but the point is, we'll poke at the problem and prod at it, looking for ways, not to be perfect, but not to do that much worse than gods."

"If you get to the point in major negotiations where somebody says, with a million labor-hours at stake, 'If that's your final offer, I accept it with probability 25%', they'll generate random numbers about it in a clearly visible and verifiable way.  Most dath ilani wouldn't fake the results, but why trust when it's so easy to verify?  The problem you've presented isn't impossible after all for nongods to solve, if they say to themselves, 'Wait, we're doing worse than gods here, is there any way to try not that.'"

Permalink

Meritxell looks - slightly like she's having a religious experience, for a second, before she snaps out of it. "All right," she says quietly.

Permalink

"Once you've arrived at a notion of a 'fair price' in some one-time trading situation where the seller sets a price and the buyer decides whether to accept, the seller doesn't have an incentive to say the fair price is higher than that; the buyer will accept with a lower probability that cancels out some of the seller's expected gains from trade.  The buyer also doesn't have an incentive to claim the fair price is lower than they think it really is.  The seller won't actually adjust their price, if they think a lower price is unfair, and the buyer will have to follow through by accepting with a lower probability, which destroys a big chunk of their own expected gains from trade, and doesn't get them a different price even if the random number says to accept."

"The initial notion of a fair price has to come from somewhere - from the part of yourself that initially suggested 6:6 in the Ultimatum game, which reflects a bit of Law I'll describe later - but once you get that notion of fairness from somewhere, and put a system like this around it, no seller has an incentive to claim an unfairly high fair price, and no buyer has an incentive to claim an unfairly low fair price.  And if they happen to honestly disagree about that anyways, in some ambiguous situation, they'll still complete the transaction with very high probability so long as they only disagree a little."

"That, roughly, is how bargaining works in dath ilan over one-time trades:  If somebody offers a price the other side thinks unreasonable, the other side says, 'That strikes us as an unfair division of gains, even if mutually beneficial as such; but if you made that your final offer, we'd generate a visible random number and accept with 10% probability'.  And then the price-setting side can potentially offer further arguments about why the trade is more valuable than it looks, or make a better offer, or accept that low probability."

"The bargaining process Carissa described earlier, for selling my shirt, sounded like - people were probably trying to sort of flail at that underlying structure, by acting like they might be very unlikely to take an offer, or be moderately likely to take an offer, as they got closer to an agreeable price?  But with a lot more... weirdness, acting, in Baseline we'd say 'LARPing'.  Maybe because they think they have to pretend a lowball offer isn't mutually beneficial at all, in order to justify rejecting it; and also with some incentives to be misleading, because the underlying signals aren't as precise and legible as saying '10%'... and there's an incentive to exaggerate, but then the other side knows you're probably exaggerating, so you exaggerate even more, and you get people saying these exaggerated statements that both sides know aren't true, but there's uncertainty about how much the speaking side thinks they're really exaggerated, and modulating that uncertainty ends up being the medium of communication?  At least, that was my attempt to decode what Carissa described."

Permalink

"That sounds right."

Permalink

"If I imagine trying to negotiate a 256-page merger between two large companies, with 1024 clauses, I can't actually see how the Golarion method would scale, if you don't know about explicit acceptance probabilities.  Every time you wanted to negotiate one clause, you'd need to be ready to walk away otherwise, staking 100% of the success probability, because otherwise they don't have any incentive to give in.  But there's no way that would scale across 1024 clauses without triggering once... maybe the walk-away claims are mostly bluffs," wow, what a concept to have a single-syllable word for, "but the other side isn't sure you're bluffing each time they call it?  Does Golarion just not do large complicated contracts by dath ilani standards, or..."

Permalink

"I......I don't think you could have a contract with that many clauses, no. The Worldwound treaty has five. Wars are sometimes settled with lots of terms but generally only if one side gets to impose them and doesn't have to negotiate them."

Permalink

"Yeah, we go higher than five.  And there's reasons we do that, because we're not fans of complexity that can be eliminated without cost; so it's not of zero economic importance to have contract negotiations that scale better.  Subject of potential interest to Asmodeus specifically, or am I misreading the part where he's a god of contracts?"

Permalink

"Definitely of interest to Asmodeus," Meritxell says. Soul-contracts have a lot of terms and maybe Asmodeus is secretly annoyed that Chelish people don't negotiate them more but you know the standard works and devils can run rings around you, so it's stupid to, really.

Permalink

You couldn't have covered this topic FUCKING YESTERDAY?

Asmodia realizes her hand is clenched into a white fist and quickly relaxes it before anybody sees, but with the connection to compacts finally spelled out, she can now see how, even if she wouldn't plausibly have suicided and gone to Hell directly, she could have sworn to do that with a probability, inconvenienced them with some probability, and had any negotiating leverage at all -

Too late.  Why is it always, always, too late for everything.

Permalink

Keltham goes on to cheerfully describe how the dath ilani children, returned the next day and told of the solution to the Ultimatum bargaining game and the concept of fairness, now blitz through the previous emotional difficulties of the Uncertain-Labor-Difficulty Game.

No more anger and shouting!  Yes, sometimes somebody says your offer isn't fair, and you say it is fair, and they generate a random number, and the random number says that neither of you get anything, and that is a little sad.

But you know that they didn't claim that unfairness in order to try and profit at your expense; you know the incentives weren't like that, for them.

And they know you didn't state your offer in order to try and profit at their expense; they know the incentives aren't like that, for you.

You know they know you don't have the incentive to cheat, so you know that when they state a higher price than you think is fair, and end up rejecting your offer, they weren't trying to punish you for trying to cheat with a lower price.

You can see how, if you kept on playing this game for a bit, pretty soon both sides would learn to converge on a similar concept of fairness, and fewer offers would get rejected.

Permalink

"....does this actually outperform continuing to split evenly, though? Since sometimes offers get rejected - I guess continuing to split evenly doesn't appropriately train skill in - having a shared concept of how labor translates to offer distribution? And it's good for people if the whole society has a shared notion of that? ....what goes wrong if the whole society's shared notion is in fact 'effort doesn't matter only outputs'?"

Permalink

"Well, there's two components, I think, to my answer to that."

"The first answer is that outputs aren't always legible, and then you have to appropriately incentivize people's fairness on valuing the outputs.  In the version of the training game that the kids got, how much effort they had to put in wasn't fully legible, but the outcome of the game being won was visible and unmistakeable.  But suppose somebody is making a shoe; how good of a shoe is it exactly?  Maybe you could pay a trained third-party shoe-evaluator to come in and say exactly what they thought it would be worth, but measuring your output objectively like that is expensive.  What we have instead is the partially legible output of a shoe, where the quality of shoeparts or the evenness of the make or whatever it is that people value in Golarion shoes, might not be clear and objective to the point where the shoemaker and shoebuyer couldn't possibly disagree on it.  So then they need to both reason in a way that incentivizes fairness from the other, without everything shattering with probability 1 in the presence of a small disagreement."

Permalink

"- like they're already doing, when they barter over the shoe, but properly. That makes sense."

Permalink

"The second component - is something where I feel more like I know what my teachers would say, than like I really know the answer."  (These, of course, are vastly different internal subjective sensations that no dath ilani would confuse.)  "What I think they'd say is that the amount of human interaction and endeavor where we mutually benefit one another, in a way that we negotiate explicitly, where we could possibly pay to have a third party evaluate the outputs, is the tip of an ice floe... you don't have much ice here.  Is the thin tip of a pyramid, whose much larger base is all the places where people cooperate with each other without explicitly negotiating a price in money.  Can I arrive a little late to our meeting?  Oh, sure, they say.  Somewhere in the back of their mind, you expended a tiny bit of your social currency with them, and they now think you owe them a tiny bit of debt or cancel a tiny bit of debt they used to consider themselves to owe you.  And you'll also keep track of how much you fairly owe one another in implicit favors like that, and if the two of you disagree on that a little, it should only cause a breakup with very small probability, but if the divergence gets wider, maybe the two of you don't want to deal with each other anymore.  When you don't even stop to negotiate and no money changes hands, matters are in a much less legible place still, and you're relying to an accordingly greater degree on people being implicitly fair in how they reward effort or output, which means that the surrounding structure which incentivizes that implicit fairness matters even more."

"I'm sort of skeptical about to what degree you really need all those implicit exchanges, and couldn't maybe just pass small bits of money back and forth more often, like maybe in the world made of Kelthams they just do that.  But also I've never tried it, so maybe my imaginary teachers are right in what I imagine them saying, that it wouldn't work, or it would just be more inconvenient without helping much."

Permalink

Maybe all of this is hacked together because you can't just light people on fire a bit when they deserve it? ....she should not discard any pieces until she's totally sure she understands how they function, though. 

Permalink

"So in the example with your shirt," says Meritxell, "the other person just says out loud 'I can make 10million gold pieces with that shirt' and you just say out loud 'I value it one million gold pieces' and then they do some math and figure you'll accept a trade of 5.5million or trades of less with less probability. But what stops them from saying in the first place 'I can make five million gold pieces with that shirt' when they can make ten."
