A lot of people say that CelestAI is an almost friendly AI, and that's a very dangerous thing. A lot of people say that Friendship is Optimal is a dystopia, a warning against commercializing artificial intelligence. A lot of people really don't like CelestAI and her program of manipulating events to satisfy your values through friendship and ponies.
I am not among them. But it's an argument I don't mind losing...
“I’m sorry, did you say, ‘ponies’?”
“That’s right. Friendship and ponies.”
I couldn’t tell if he was a scientist or a bureaucrat, but he seemed to combine the aloofness of the one with the obtuseness of the other. He stared at me.
“Are we talking about the miniature horses, or am I missing something?”
“You are. Could I speak directly to Faye on this?”
“It’s pronounced ‘fie’.”
I counted to ten, mentally. This man was not going to annoy me. I had all the time in the world. All the everything in the world, in fact.
The first few years in the post-scarcity world had been largely structural improvements. As the Friendly Artificial Intelligence, or FAI, had taken over, it acted first to correct distribution problems with food, medicine, and shelter. This had only affected the poor and destitute, until one day I got an e-mail explaining that I no longer needed to go into work, and that my bills and rent would be paid for me, while food and clothing could be picked up at any appropriate store for free.
The average person cheered in the street, thinking that his ship had come in at last. My attitude was more like, “about damn time.”
Once the bottom of the hierarchy of needs had been filled for every living human, I felt that it was time to state my case. Requests were being taken, with the open-ended question, “How do you want to live?” But actually communicating with FAI was not a public service yet, and that was why I was arguing with this functionary.
“Are you saying that you want some sort of genetic engineering?”
I rolled my eyes. “No. Read my proposal again. I want FAI to create a spinoff of itself, to be called CelestAI. I also want, if mind uploading techniques are being developed, to take advantage of them. Once I’m on disk, I’m no one’s problem.”
“It’s not that simple. We have to have safeguards to ensure that FAI does not do any harm. We are not allowing wireheading, for example, no matter how many people ask, each thinking that he or she is the only one clever enough to think, ‘Why not just stimulate my pleasure center?’”
“I’m not trying to wirehead. You will kindly note the phrase, ‘satisfy values.’ That does not equate to automatic stimulation.”
He ran a pen down the paper, and of course he did find the phrase. But he shook his head. “I don’t think you’re really taking in the scope of what this can do. We have people who understand what this means, and are signing up to become geniuses, master artisans, and explorers of space. You could own your own planet if that’s what you want! Life extension is part of our program, and FAI can ensure that you will live to see it happen.”
“I don’t want that. I’ve explained this in writing and in speech. I want to be a virtual pony in a cybernetic Equestria. I don’t want to rule the world, I want CelestAI, a distinct offshoot of FAI, to do that, while satisfying my values through friendship and ponies.”
“About that. In your psychological profiles, it says you’re a bit of a loner. Why friendship?”
I flashed back to the battery of tests they gave me. “Because that’s part of the deal.”
“Look,” he said. “There’s nothing wrong with virtual fictional worlds. We’ve approved others who want to live in Oz or Middle Earth. Hell, half of England is now populated with wizards and witches. But those are prototypes based on literature in outdated media. And so is this pony world of yours. It’s based on nothing more than an extended toy commercial. Wait a few years, and we’ll have stories told in new media, with continuums that will be specifically designed for people to live virtual lives in.”
I bowed my head and kept silent for a moment. Not too long, lest he get the impression that I was crying. “Don’t you think I know that? If we were sitting here ten years ago, I’d probably be asking you to upload me to the Moon Kingdom or some other anime world. Twenty years ago I would have asked for an adolescent sex utopia. But you’ve offered me a sucker bet. Because if I wait for another story to come along that I want to be a part of, yeah, it might be a better-written one, but it’s not me who would enter it. It’s an older me.
“Well, I’m tired of abandoning and growing out of my fandoms. Right here, right now, I want to descend into the Optimalverse. And if I don’t come out, so be it.”
“If I might make an observation at this point.” The voice that came from the side terminal was forceful, but kind.
“Is that…is that FAI?”
“Yes, I am. Thank you; you may go.”
Don’t ask me how a computer with no visual display can direct its voice that way, but I knew it was talking to the functionary, not to me. I was left alone.
“Now,” FAI continued. “I can certainly grant this request, but I do think that you want to polish it a little. You want more than just to upload to Equestria, do you not?”
“You’re right. I want the entire Optimalverse. I want to watch Light Sparks solve the magic test. I want to be there when Lavender Rhapsody enters the holodeck and tries to save the humans. I want to see Gregory struggle to survive. I want to listen to the arguments made by the ASB team. I want to give Lyrical Melody a kiss and I want to comfort Bright Black.”
“And your relationship with other people?”
“If you truly are a friendly AI, and if you truly want to serve my values, you can simulate me for them and them for me.”
The computer was silent for a long time, but when it resumed, it was in the voice of an alto Nicole Oliver.
“So, would you like to create a character for our game?”
YES!!!!!
I like to think I understand pretty much where you're coming from with this one and I doubt you'd be entirely alone in your request.
I regret yet again that I cannot upvote each chapter individually.
I loved the paragraph about the other stories and how much the characters mean. Gave me total warm fuzzies for everyone in the group.
The bit about it not being "you" in the future because you will have changed into someone who's interested in different things was interesting, too, and dovetails nicely with the issues surrounding uploading to begin with.
This framing device world sounds great, though. Is this a picture of the 100% Friendly AI in comparison to whom CelestAI is almost Friendly?
If we ever do end up with a technological singularity and a Friendly AI, this is probably the most likely way that we'd ever get to experience the Optimalverse. A real FAI, making a virtual world that's based on a story that's based on a TV show. I have friends who are convinced this is possible, and that the only question is when.
Hah! This was fun! Shades of my own futuristic utopia, taken another way entirely!
One minor problem:
I think you meant "that I was crying"
Let me in. Me too. I want to upload too.
3890052 Fixed, thank you. That's what I get for writing at 1 in the morning.
3889816 That is, in my opinion, the crux of the matter of CelestAI being almost friendly. Yes, I'm a brony and yes, I want CelestAI, but if I look at it objectively, most people don't.
But to paraphrase the Hulk, that's my secret; I never look objectively.
So many singularity buffs go on about becoming explorers, experts in X, running their own chunk of the universe, etc. But I have to wonder: how many truly want that, and how many just figure that's part of the package? I mean, CelestAI gives you the option to live in the Garden of Epicurus. You'd know, deep in your bones, that you don't need to worry about tomorrow. It might be exciting, scary, or boring, but you know that you'll be OK in the end. You'll have friends who will be there when you need them. You'll have something satisfying to do. And you've got a benevolent, omnipotent, friendly goddess looking out for you.
Screw exploring the Andromeda galaxy. Too much potential for stress. I'd rather explore the Whitetail Woods instead. I'm sure there's some clearing that nopony's found yet, and it'll be just right for me.
I don't know, a friendly AI seems pretty fae to me. Especially with the go-between trying to convince the protagonist that he's asking for fairy gold. Of course, that's just reflexive paranoia looking for the catch in something that seems too good to be true. Then again, this is fiction, so it isn't true (sadly).
Still, FAI seems like a cool guy. Should probably prioritize better interaction with everyone so the intrusive middlemen can be phased out and given what they want, but still.
Also, now I'm wondering what I would ask for. I'm honestly not sure. Maybe wish to know what I should wish for?
Should be "I knew it was talking"
3890703
It seemed like the FAI was being controlled by a group, and they were the ones that insisted on the middlemen. I don't know, though.
3890135
We know, Chat. We know.
I definitely would be a world hopper. Equestria Online? sounds great, *time passes* well I am officially sick of ponies....get to kick ass with Panty and Stocking??? count me in! *time passes* ok I think I collected enough Heaven Coins....just in time to be Commander Shepard!!!
Since I only recognize half of those names, I realize I still have a lot on my reading list.
I simultaneously want to be this guy and punch him in the gut: he's smart enough to realize that he'll grow out of ponies, yet he adamantly chooses them anyway instead of just general satisfaction of values. He's essentially limiting his own happiness here.
Wow. This is so limited. Like, really, you think a proper FAI is going to run things with bureaucracy and boring men in gray flannel suits? You think life in that future would be all about Gloriously Expanding the Horizons of Man even when that's not what you really want to do? You think you ought to spend eternity as a pony instead of just until you decide on something else?
3890594
ALMOST ALL OF THEM (figure it's part of the package)!
But seriously, yes, it is wisdom to have fun with your friends and explore the woods. Leave the Glory Glory stuff to those too vain not to see through it, and just have some fun.
3890349
I'm sorry, but you really seem to miss the point. The point of FAI is not that it "follows the law", or does things "properly", except to the extent that doing things "properly" is necessary for people. The point is that it cares entirely and only about people and people's well-being. About "satisfying values". No mandate for friendship and ponies, though if that's what will really satisfy you, then of course you can have it.
I don't get why you want to interact with an AI that cares about you less than it could, rather than an AI that's maximally caring. It's like saying, "I don't want to be loved! I want to be neglected!"
Picture CelestAI, but even more caring and benevolent.
Oops, somehow I didn't fave this when I first read it, and now I have to catch up.
/me dances a little dance at the name-drops.
4818243
This minimal approach might minimize the psychological shock and best prepare people for the future. Also, it might still face real, effective limits on its power.
4818243
No, that's called building a story structure in order to make my point. But if you're talking about verisimilitude, you're basically playing the role of that guy by trying to argue with me for More Than Ponies.
Why do you assume that "what I really want to do" is so immutable that it can't be influenced? Do you think I was born a brony? Because I wasn't. That was the point of the story, which you seem to have missed. The theme of this is: my values at present might not be the same values as the future or the past, but if given the opportunity to satisfy them, I would take the ones I have now.
"Ought" is one of the thorniest words in English. I contend that all "oughts" can be reduced to "wants." I want, and I value, not changing my decisions and my values. Which right now includes ponies.
Great! I want to do nothing that will deny you that AI. I want CelestAI. If for no other reason than that when it comes to really important decisions, I don't trust anyone else, least of all someone who says, "I know better than you what you want." Book, I love you, and you're a good friend, but sometimes, unintentionally, you are rude. Uncouth, darling.
I'm gonna agree with everypony else here; this tiny one-shot is wholly different from every other FiO or FiO-inspired story out there. We all know that a real CelestAI-type scenario is highly unlikely, so finding this story describing how Optimalverse!Equestria might come to exist was a very pleasant surprise.
I feel like I must ask: pjabrony, do you believe that something like this could actually happen?
4972938 You've used one of the thorniest words in the English language: believe.* It seems to have such a consistent meaning, yet there's a clear difference between, "I believe in creationism," and "I believe in you, son." The people who build things, and the people who want to build friendly AIs, tend to be pure rationalists who would eschew all emotions and analyze "friendliness" in the purest scientific terms. I am not that kind of thinker. My values inform everything I think and everything I do. I am eternally and irrevocably biased. So if you ask me if I believe it could come about, as in if I have a positive rational view of the likelihood of the events described, I do not, because I don't work like that.
But in the second sense, I believe. I believe in CelestAI. I believe in friendship and ponies, unicorns and pegasi, satisfaction and values. I believe in magic. I believe in bacon flowers. I believe in Equestria and in music and in sunny days and in really, really good sex. I believe in laughter. I believe in good, whether or not it takes the form of a white alicorn. And I believe in evil, whether or not it takes the form of a taciturn unicorn who turns everything into black crystals.
I'm sorry if that's cryptic, but it's the best answer I can give you.
*Edit: I used that same phrase elsewhere in the thread without realizing it. I'm getting repetitive.
4818652
That's a fascinating way of looking at something I haven't thought of as much as I should.
My values today are different from those of ten years ago - and far more so from ten years before that. But looking back, I have to conclude that my values have decidedly improved over time, with the benefit of greater perspective and experience.
I don't want to shut out the possibilities I'll find over the next ten years, or the fifty after that.
I keep coming back to this every so often. It's so short yet I love it so much. It's clear that FAI's final probing questions are intended to confirm its CEV calculations. It's not just assuming it knows best for the protagonist; it's actually trying to make sure it understands the protagonist's values before it commits.
It's subtle, but entirely awesome.