Tarinda in Skygarden
Total: 334

Kioh looks fascinated by the portal, but Viasarae shakes her head at him and he goes back to digging in the sand.

"Come on through," she says to Tarinda, stepping back onto the other side. It's a small flying island, flat and grassy; there's a round wooden table visible through the portal a short distance away, with plates of tasty-looking food on top and a few chairs tucked in.

She steps across. She takes a chair.

There's a man standing next to the portal. He's very tall.

Once she's through, he closes it and heads over to sit at the table across from her.

"Tarinda; the Emperor," she says, with appropriate indicative gestures.

"Please, call me Solekaran. Sekar if you like." He looks across the table at Tarinda. "I'm told you have a clever idea I might like to hear about."

"I'm from another world. I don't know how I got here. If this planet were near my planet, I'd know; it's not, not even 'near' as stars can be. My world has no magic. It does have machines that can think. One runs my world and - everything is good."

 

"When you say 'everything'..."

"I would need more vocabulary and you would need more concepts for a lot of details but I do mean everything. We're all immortal and there's no material scarcity."

"...no wonder you were so undaunted," he says, glancing at Viasarae, "when I said I couldn't imagine it. Because you knew my imagination wasn't reaching far enough. It's still not reaching far enough, to be honest, but... I have some idea of the shape of what lies beyond it."

"It can't bring back dead people - unless they were frozen, and even then they have to be frozen right for it to work reliably - so I need to build one here as fast as possible to save everybody. I have a smaller thinking machine in my body which knows how but it requires inventions you don't have here yet. I know how to do those too."

"...I can raise the dead, Viasarae might have mentioned," he says. "Not fast enough to keep up, not yet, but if everything is good and no one else ever dies... yeah, I could get through them all eventually. And even if I didn't... if you build this thing, if it works, I promise I'll bring back everybody who died while you were figuring out if it was safe to ask me and talking me into it and making all the things you need to make. It—wouldn't be fair, otherwise. I don't want you feeling like you're losing something every minute that goes by that you haven't started yet. I don't want—however long it takes me to decide if I'm letting you try it, even if it's a hundred years, I don't want that time measured in lives lost."

"...it's still really bad for somebody to be dead for a hundred years if they didn't specifically want to be dead for a hundred years, by and large, but that's better, yeah. Thanks."

"And I might take Viasarae's suggestion and arrange for a bunch of people to have the power to raise the dead without me, although I guess that's less urgent when no one is dying anymore anyway..."

He sighs.

"But I'm getting ahead of myself. How does a thinking machine make everything good? And could it do the same things here even though there's magic? Magic, as you might have noticed, gives people a lot of power, and some of us decide to use that power to do things that definitely aren't good."

"It's smarter than a person. I'm not smart enough to know exactly what it'll do with magic in the mix."

"Do you have guesses? Because the main flaw I'm seeing in this plan is, hmm..."

He pauses for a moment to collect his thoughts.

(Viasarae takes a seat.)

 

 

"...so there's two possibilities here, right. Either it can't stop people from misusing magic, and people will keep doing that, and everything will be less good than it is for you although maybe still better than it is for most people here... or it can stop people from misusing magic, which means—for one thing that it can stop me from misusing magic, and I'm not keen on that outcome—but also, and more importantly, that once I invite it to my empire I no longer have control of what happens here. And if it decides, someday, for whatever reason, to make things bad instead of good, I cannot stop it except maybe by melting the planet and starting over from scratch. And that's—" He shakes his head. "I don't like that. I want to know that things can't get any worse for my empire than I'm willing to let them."

"...this is so hard to explain. I'm not a superintelligence! I accordingly can't predict what a superintelligence would do! But this is just - everything you just said is the wrong way to be trying to guess what a superintelligence would do -

- hang on, I don't actually speak this language and need to talk to my thinking machine to come up with a good way to explain..."

He waits.

"Imagine you are a bee," she says. "A talking bee for some reason I guess, and a human comes along and says, 'would you like to come live on my farm, it's full of fruit trees, I can keep you warm in the winter'. And you say 'well, logically, either you can stop us from stinging, or you can't stop us from stinging, and if you can't, then we will continue to have all the problems we have now where sometimes we sting each other and some bees die forever, and if you can, then we won't be able to sting you if you mess up, and then we won't be able to keep ourselves safe from predators'. And the human says 'uh, I can keep the predators out with a fence'. And you say, 'maybe, but what if you decide one day that you really want a bear instead of a beehive? Furthermore, we sting each other, to punish our fellow bees who do things we don't like' and the human says 'but I'm not going to want a bear on my fruit farm, and also, have you considered that those bees doing things you don't like could just live in a different hive, I have lots, I have a million' and you say 'I just don't know if the flowers you have are that good and it makes me very scared to imagine not being able to sting' and the human says 'okay for one thing if I were super scared of being stung I wouldn't even be a beekeeper, now, would I' and -

- can you just kind of get the sense of how this is a stupid conversation -"

"...I see what you're saying but I don't actually think that makes this conversation stupid, because—the bee is right that the human is asking him to put himself in the hands of something more powerful than he is, something he has no hope of defending himself from. When a bee and a human fight, you get a mildly inconvenienced human and a dead bee. Right now, in this world, I'm the incomprehensibly powerful thing that no one else has a hope of defending themselves against. And I do, actually, care a lot about the well-being of my people, so I've spent five thousand years getting better and better at making my empire a good place to live. But still, if I went and invited a hypothetical bee to live here, and they were worried I might hurt them... well, they'd be absolutely right, and if I really wanted that bee for some reason, it'd be up to me to find a way to reassure them about that. The tricky part here is that you're not the beekeeper, you're another bee who happened to wander into my field, and your ability to make assurances on behalf of your beekeeper is pretty limited."

"Sing can do very elaborate promise-verification stuff with other things like itself, it did that when there used to be more of them, but you're not a thing like itself to begin with and it isn't even here yet. I don't know what it will do because it's smarter than me but it will do something smart, okay, it will find the best thing to do and do that."

"I could imagine a different world run by another one of me who, I don't know, took up gardening instead of the things I like to do with my time, and had better luck in the magic department, and wants to share, and I can imagine how he'd handle this sort of thing—for that matter I can imagine a world run by another one of me who had worse luck in the magic department, and how I'd want to handle making contact with him—but your Sing isn't another one of me, and I don't know what its definition of the best thing is. And I don't know whether it'll think of me as someone it makes sense to negotiate with, or as... just another bee."

Another pause to gather his thoughts.

"Like—the way I'd handle this with myself, or with someone enough like myself that we'd both know this was going to work—the one with more power and prosperity says 'you can keep your empire, and I'll give you all the help I can reasonably give you without stepping too hard on your territory, and I won't take advantage of my power over you even if I'm very tempted to, as long as you respect my territory just as much.' And the one with less power and prosperity says 'all right, that makes sense, and I'll help you out as much as I reasonably can wherever my magic has an advantage over yours, because I want both of us to benefit from this'. And then we're set, and it doesn't matter that this conversation took place on opposite sides of a closed door because we each know the other well enough to guess what they'd say, and because—if one of me arrives here from another world holding his half of a deal like that, and he's right that I would've wanted to access his world if I'd known it was there, then of course I agree to it, because—if I hadn't been going to agree to it then he wouldn't have come. I'm not sure how much sense that makes outside my head, though. And I'm not sure I can make guesses like that about a thinking machine that has every reason to expect me to be basically a bee from its perspective."

"That makes sense but I don't think Sing can do that with a human and I especially don't think it can do that through me when I'm just a random person."

"Yeah. The real trick would be—if I could make a good enough guess about what it might want and what it might be willing to agree to, if I could—show up with my half of a deal, and be confident that it would accept—but, like, I don't know that it would accept if, say, it thought it might have been able to get here without my help."

He thinks this over for a few seconds, then adds, "—the reason why I keep talking about it like I'm going to go visit your world is because I'm pretty sure that even if I can't get there on my own anytime soon, anything smart enough that it can make everything good for everyone all the time just by being smart at them will be able to figure it out for me. And—I like the look of this much better if I can go to your world and see if resurrection works there. It... feels less like being a bee, if there's something concrete I can contribute that it couldn't manage on its own, and I don't want your world to have to go without resurrection forever when instead I could be helping. I like helping! —and then of course I wonder if I could get a better hypothetical deal if I wasn't so eager to help out for its own sake, but—"

He has to pause another second to straighten his words out.

"...there's a thing where... something that really does want to make everything good, it's not going to look at me over here trying my best to think my way through what to do about this situation—I mean, not that it knows I'm here, but if it did—it's not going to look at that and think 'oh, yeah, it'd be a great idea to punish him for being the sort of person who'd help me just for the sake of helping', I'm pretty sure? Because if I can see the problem with that then clearly it can too."
