This post has the following content warnings:
some dath ilani are more Chaotic than others, but
Total: 4482
Posts Per Page:
Permalink

Applied directly to the situation with Cheliax, the method says, roughly, that Keltham should receive an amount proportional to how much marginal product he adds, on average, to all possible (ordered) subsets of Cheliax.  If Cheliax had only half its current people, for example, Keltham might only add around half as much value.  For even smaller subsets of Cheliax, product might fall superlinearly; Keltham couldn't necessarily accomplish 1/20,000,000th as much with a single Chelaxian.

It adds up to 'somewhat less than half of his marginal product when added to all of Cheliax, probably'.  Yes, this is a very simple answer to be produced by all that logic; but the point is that Keltham knows why that is the fair answer and what he ought to do if he gets offered less.
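To an Earth reader, the method Keltham is describing is recognizable as the Shapley value: average a player's marginal contribution over every ordering in which the group could have assembled. A minimal sketch, with toy numbers invented purely for illustration (the value function below is an assumption of this sketch, not anything from the story):

```python
from itertools import permutations

def shapley(players, v):
    """Average each player's marginal contribution over all orderings."""
    totals = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        coalition = frozenset()
        for p in order:
            totals[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    return {p: totals[p] / len(perms) for p in totals}

# Hypothetical toy numbers: two 'workers' plus one idea-contributor
# whose ideas double output but are worthless with no one to apply them.
def v(s):
    base = len(s & {"w1", "w2"}) * 10   # what workers make on their own
    if "keltham" in s:
        base = base * 2                 # ideas multiply output
    return base

print(shapley(["w1", "w2", "keltham"], v))
```

With these toy numbers the idea-contributor's Shapley share comes out to 10, about half of the 20 he adds when joined to the full group, consistent with the 'somewhat less than half of his marginal product, probably' rule of thumb above.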


Keltham doesn't spell out this part explicitly, or say that he was willing to accept Cheliax's opening offer taken at face value, and indeed would have compromised on substantially less had it been necessary.  Cheliax mentioned difficulties in accurately measuring the gains to the country, and may intend to offer a measuring methodology expected to underestimate the real value; or Cheliax may be thinking of the split in terms of direct profits from project sales, in which case Chelish consumers would be capturing much more value than the sale price of the products: the consumer surplus.

Also Keltham might find there are weird terms or conditions in there, in which case he wants the highest initial offer on hand so he can burn percentage points as bargaining power to iron out those terms and conditions.  He can always hand back any excessively generous jellychips that are still left at the end of that.

Permalink

 

Somewhere in a place that is not this place, so far away that there is no distance and no time between here and there, a former airplane passenger named Thellim reads how Earth economists have tried to analyze the Ultimatum Game, played by splitting $10.

Earth's economists have concluded that it is 'irrational' to refuse a $9:$1 split, since it leaves you $1 worse off.  They note that human subjects seem to be 'irrational' by occasionally refusing offers below $5 with increasingly great probability as the offer drops.  Perhaps it is meta-rational to develop a reputation for acting 'irrationally', since it causes people to make you bigger offers, if they know you'll irrationally refuse smaller ones?  (For some reason they don't continue on to ask why not develop an 'irrational' reputation for refusing all offers below $9, instead of $5.)
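The structure the economists are missing can be shown with a toy responder policy (the specific policy below is an assumption of this sketch, not anything stated above): accept an offer of x out of 10 with probability min(1, x/5), so that fair-or-better offers are always accepted and worse offers are refused with increasing probability, much as the human subjects do. Against such a responder, the proposer's expected take is maximized by offering the even split:

```python
# Hypothetical responder policy: accept an offer of x (out of 10)
# with probability min(1, x / 5), i.e. always accept fair-or-better
# offers and reject unfair ones with probability rising as they worsen.
def accept_prob(x):
    return min(1.0, x / 5.0)

def proposer_expected(x):
    """Proposer keeps 10 - x if the offer is accepted, else gets 0."""
    return (10 - x) * accept_prob(x)

best = max(range(0, 11), key=proposer_expected)
print(best, proposer_expected(best))  # the fair offer of 5 maximizes it
```

So a 'reputation' for refusing unfair offers at exactly this rate makes fair offers the proposer's selfishly best move, which a reputation for refusing everything below $9 would not: a $9-demander facing another $9-demander gets nothing.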

Thellim swiftly infers that Earth's moon prevents its inhabitants from thinking clearly about negotiation.

(She's mistaken.  It's kind of a long story.)

Permalink

Sometime even later, Thellim is going to conclude that maybe it's not the moon.  She will then wonder if there's any way to explain to Earth economists how the absolute basics of negotiation work in coherent decision systems (e.g. those consistent under reflection in the presence of correlated agents and/or models of agents).  Or even, minimally, get them interested in what sort of 'irrational' behavior rational agents want to have 'reputations' for having, and whether there's any systematic structure in there that might possibly be interesting.

It turns out that Earth economists are locked into powerful incentive structures of status and shame, which prevent them from discussing the economic work of anybody who doesn't get their paper into a journal.  The journals are locked into very powerful incentive structures that prevent them from accepting papers unless they're written in a very weird Earth way that Thellim can't manage to imitate, and also, Thellim hasn't gotten tenure at a prestigious university which means they'll probably reject the paper anyways.  Thellim asks if she can just rent temporary tenure and buy somebody else's work to write the paper, and gets approximately the same reaction as if she asked for roasted children recipes.

The system expects knowledge to be contributed to it only by people who have undergone painful trials to prove themselves worthy.  If you haven't proven yourself worthy in that way, the system doesn't want your knowledge even for free, because, if the system acknowledged your contribution, it could not manage not to give you status, even if you offered to sign a form relinquishing it, and it would be bad and unfair for anyone to get that status without undergoing the pains and trials that others had to pay to get it.

She went and talked about logical decision theory online before she'd realized the full depth of this problem, and now nobody else can benefit from writing it up, because it would be her idea and she would get the status for it and she's not allowed to have that status.  Furthermore, nobody else would put in the huge effort to push forward the idea if she'll capture their pay in status.  It does have to be a huge effort; the system is set up to provide resistance to ideas, and to disincentivize people who quietly agree with those ideas from advocating them, until that resistance is overcome.  This ensures that pushing any major idea takes a huge effort that the idea-owner has to put in themselves, so that nobody will be rewarded with status unless they have dedicated several years to pushing an idea through a required initial ordeal before anyone with existing status is allowed to help, thereby proving themselves admirable enough and dedicated enough to have as much status as would come from contributing a major idea.

To suggest that the system should work in any different way is an obvious plot to steal status that is only deserved by virtuous people who work hard, play by the proper rules, and don't try to cheat by doing anything with less effort than it's supposed to take.

Thellim could maybe solve this problem if she put around five years of her life into taking the knowledge and putting it into a form where the system thinks it's allowed to ever look at it or talk about it without that being shameful.  But Earth has problems that are plausibly more important than their entire field of economics being firmly convinced that a particular set of crazy behaviors is 'rational' and that healthy, prosocial, equilibrium-solvable behaviors are 'irrational'.

She ends up writing a handful of blog posts about it, tossing mentions of it into a couple of stories she writes on the side, and otherwise leaving Earth to its fate there; Earth has rather a lot of awful fates going on simultaneously, and that one is not literally their most important problem.

This, however, is not her story.

Permalink

Keltham, in any case, now attempts to recount to Cheliax what he went through as a kid to learn about the basic concepts of negotiation.

The first part of this would be handing out assorted jellychips to children, selected to guarantee that different children will have different preferences over them but all will find the tastes and textures at least somewhat pleasant; letting the children trade among themselves, which they usually do 1-to-1 and peacefully; introducing the concept of a multi-agent-optimal solution to the kids, which gives them a social goal they could be failing at instead of just a few voluntary improvements to make among themselves; whereupon they start yelling at each other to make particular trades for the good of the class; and then the older kids come in and remind them that, by the definition of multi-agent-optimality, solutions like that should make all the kids better off, so you shouldn't have to force anyone to go along with the trades leading there.

How are the Chelaxians doing so far?

Permalink

Absolutely no yelling at each other to make particular trades for the good of the class! Say what you will about Evil, it doesn't inculcate that particular tendency. 

Permalink

Meritxell has made herself a multi-column tracking sheet - six of them, actually, ordered by different things. 

"Can everyone report to me their hypothetical reward preferences in, uh, negative wrist-slaps? Imagine we'll settle it out at the end by giving out a number of actual wrist-slaps equal to the reward so there's no incentive to overstate or understate your reward preferences."

     

Permalink

Keltham wasn't expecting them to go off and immediately start setting up games to simulate the thing he was describing despite the absence of actual jellychips; he'd sort of wanted to see if imagination would be enough.  But he's not going to stop them if they do that.

He draws the line at the wrist-slaps, though.  "The point of positive rewards in this case is that there's an incentive to play the game at all," Keltham says.  "If you tried paying dath ilani kids in negative wrist-slaps they could avoid all the wrist-slaps by not coming to class.  It's like trying to buy shoes at a shoe-shop by threatening to wreck the guy's shoe-shop unless they give you shoes.  Even if they did give shoes, the guy doesn't want to be part of the whole system then, and now they have an incentive to call the town guard... okay, 'town guard', sure.  And anybody else who sees that's how you operate has an incentive to poison your shoes and send you to the afterlife early before you come around to their shop.  When you're trading things the other person actually wants, they want the whole system to stay in place, which is what makes stable equilibria possible.  It's an important difference!"

Permalink

" - I want the whole education system to stay in place," says Meritxell, baffled. "It taught me to be a wizard!!!"

        "Wait, do dath ilani children just....not go to school if they don't feel like it? Wouldn't that get you a lot of people who never learn anything, or at least never learn anything they aren't being bribed to learn?"

        "And never learn how to do things that are unpleasant for a long-term reward -"

 

Permalink

"Civilization goes to a great deal of effort to arrange things so that kids actually do want to go to school, because dath ilani kids are smart by your standards, and the grownups do not actually want to get into a contest with us about whether or not we can rig the school's boiler system to explode if we use a cunning plot with coordinated distractions.  It doesn't matter that they would very likely win, they don't want to get into the contest with us.  Ah, with them.  I mean they do still have all sorts of security systems to make it hard to blow up schools because, you know, kids, but they're based on the assumption of fending off like three kids who want to see if they can, not three hundred kids who all have the same incentives."

Permalink

 

 

 

"Chelish kids do not....coordinatedly try to destroy our schools," Meritxell says faintly after a while. "Even wizarding kids, who are smart. It - wouldn't even be hard, with magic, you wouldn't need coordinated distractions but no one would do it, even if you'd made a very bad mistake at school and were going to be disciplined -" they did check, Taldor beats students for misbehavior too. "You don't have to..... be so nice to children they wouldn't ever occasionally wish their school was on fire, you just have to teach them enough discipline that even when they wish it was they don't do it."

Permalink

"Maybe if you're Good and.....refuse to .....use any punishments ever..." Carissa feels like she's kind of caricaturing Good here, like if she said this to a paladin they'd object that obviously they do punish people when appropriate - "then you have to bribe everyone all the time to just nondestructively participate in society because the - differential between cooperation and noncooperation still has to be just as large and you're trying to do none of it with pain."

Permalink

Ione Sala is starting to feel nervous, for the first time, about what exactly Lord Nethys might be working towards with His plans around Keltham.

Well, it's not as if she has any other options, so, moot point, she'll go along with His goal, even if it's exploding Cheliax or whatever.  It's not like she has any friends here.

Permalink

"Look, I get that Golarion is a poorer and more dangerous place and that you cannot afford to have kids occasionally successfully destroying their school.  You still - want to treat children as miniature adults, right, so that they'll grow into adults with the right shape?  When they grow up into adults, you don't want those adults sticking around places where they're being hurt, or tolerating the existence of systems that leave people worse off than if the whole system didn't exist.  So you don't put children into childhood situations where their own incentive is to destroy everything around them, and all they lack is the power to do that."

Permalink

 

 

"Chelish students are not incentivized to destroy their schools, even if they wouldn't get in trouble for it, because becoming a wizard is really valuable," says Meritxell. "Their incentives are sometimes on the scale of their lives, not on the scale of that specific day being more fun than not-fun, but that's - how being an adult is, too."

Permalink

"Do kids here already understand that when they're seven years old?  Five years old?  By the time somebody understands and has integrated subtle incentives for their future self spanning decades, they're no longer a child; they don't need adults to guardrail their decisions anymore."

"I suspect there's some weird sticking point here that - look, sufficiently young kids do get slaps on the wrist.  Civilization doesn't like it, I don't like it, but even dath ilan never figured out how to produce healthy adults while never doing that at any point.  There are elements of morality and personhood that humans just weren't designed to learn without experiencing small amounts of pain in childhood.  But every time you set up a situation where a kid gets told that they need to do something or else get slapped on the wrist, you also add some value to an investment account the kid gets access to when they're older, such that even if the kid was secretly an adult in a kid's body, they would still be calculating a net benefit on being present for the whole transaction.  To make sure the total interaction is still mutually beneficial, which means, beneficial to them too, so that ideal kids wouldn't have an ideal incentive to escape their parents or destroy the whole system.  Civilization goes on optimizing its heritage and the kids keep getting smarter and more Law-comprehending, which means that you always check all the interactions with children to make sure that the system wouldn't fall apart if the kids started being more ideal intelligences than expected one year.  And having to pay that amount to set up a potential wrist-slap situation reminds adults to check, every time, whether they really needed a wrist-slap there."

"I realize you can't afford any of that, but it is how Civilization thinks.  We don't want to build into the system a load-bearing assumption that our kids are stupid and weak, even if they are."

Permalink

It has occurred to most of the girls that ideally they should be learning Keltham's economics, not arguing with him about punishment, so they nod gravely rather than trying to explain the dozen things wrong with that.

 

The most obvious, thinks Meritxell, is that you don't actually want adults who believe themselves entitled to blow up any system that isn't serving them, because then you end up like Taldor, having a civil war every few years.

 

The most obvious, thinks Tonia, is that kids can in fact run away and get eaten by wild beasts if they want, and none of them do, so they obviously think being around their parents is better than not that, which they're right about. 

 

The most obvious, thinks Gregoria, is that adults are still children, in Keltham's ontology, and the only real adults are gods.

Permalink

(The most obvious, thinks Carissa, is that the fundamental system in which everyone is participating is existence, life and then afterlife, and that's so obviously, wildly worth it that no possible specifics could matter - it'd be like trying to sell someone a +6 Headband of all three mental statistics for the price of an afternoon scrubbing floors and assuring them that you won't yell at them for missed spots. It doesn't matter; it's all nothing next to the magnitude of the gift they've already been given, and the only reason they're even able to parse it, instead of rounding it off to the zero it is, is that their minds are broken and they're very small and stupid. If a god were somehow born into a human child's body they wouldn't care whether they got hit in class or not; human weakness isn't any particular nature of the bribes but the fact that they're required at all, and planning for more perfect agents would mean planning for agents whose thoughts were too big and vast to give this question a second's contemplation.)

Permalink

Keltham notices that he's running across a class of external and internal subjective sensations that precedes learning something is horribly wrong with Golarion, and sets it aside, because he's allowed to take longer than two days to learn about all of the problems.  At least the problems they're making no effort to conceal from him, which they don't seem to be doing here, what with having just volunteered all of that info.

Anyways, they can play a pretend version of the trading game, if they like, so long as they don't try to literally pay in negative wrist-slaps because no just no that's the literal opposite of the larger point.

Keltham checks their final result against the supposed ordinal preferences of the players.  Does it look multi-agent-optimal at a computerless glance?

Permalink

Meritxell has helpfully circled each trade and noted why it increases utility for each participant, and then written down the possible trades from the final state and why they don't increase utility for both parties. If something's wrong it's a more complicated error than that; the girls are in fact heatedly speculating, now, in whispers, about whether there are local multi-agent-optimal maxima that aren't a global multi-agent-optimal maximum.

Permalink

...right.  Not actual children here.

"If there's such a thing as a local optimum in that sense, which isn't global, you ought to be able to produce a simplified example of it.  Say, try constructing one with three players and three kinds of jellychips," Keltham suggests.

Permalink

"There's not going to be," Meritxell says. "If there were and we knew what it was then we could just switch to that arrangement."

"It could be better for someone from their starting point but not better for them from the place we just arrived at, and higher value total -"

"If it's higher value total and they get a veto we use some of the higher value to pay them."

"Oh, I see, do we have continuous jellychips now, Keltham must have forgot to mention that feature of theirs."

Permalink

"Would you care to state exactly what is a 'local optimum' and how it differs from a 'global optimum'?"

Permalink

"Take, like, water," says Gregoria. "Water flows downhill, but if it flows into a crater or something, it's not going to go up in order to get to keep going down. And water usually isn't sentient and even when it is it's not very smart but you can have a situation where everyone agrees that the current situation isn't as good as some other situation, but none of them have a step that's a clear step up for them. And Meritxell is right that if you have centralized control you can just make everyone go to the new place even though there's not a series of smaller mutually beneficial steps to get there, and also that if this involves some people losing out you can pay them, but that doesn't always work, like, for example, if you're dividing things that come in units."

Permalink

"This comes up in spell structures."

Permalink

"Comes up all over all of reality, including in the basic elements of the human body that the tiny-spiral instructions say how to make, which fold up into configurations of least local resistance in order to - have the kinds of material properties that they do.  I'd guessed that spells were the same way almost as soon as I heard about them."

"Anyways, I agree that's a good metaphor, but if you could have a very simple arrangement of three players with three jellychips of three kinds, what would you say about that situation which made it a local multi-agent optimum, and what would you say about it that made it a global multi-agent optimum?"
