Armin and Pinkie re-entered the elevator and said goodbye to Zer0 as Pinkie hit the call button for the 4th floor.
Armin groaned. “Are we going back down? After all this?”
Pinkie nodded. “The elevator didn’t let us go directly to the 37th floor. It sent us to Zer0’s section for a reason.”
Armin raised an eyebrow. “Pinkie, you can’t assign intent to an elevator.”
“I can if it’s controlled by the building’s artificial intelligence.”
Armin’s mouth dropped. “Are you saying we’re being actively prevented from entering the one place I want to go to?”
Pinkie shook her head. “More like temporarily postponed.”
“Hold on.” Armin pressed the call-cancel button. The elevator stopped at the 33rd floor but did not open. He pressed the 37th-floor button. There was a negative buzz in response. He pressed it again several times and was treated to the same buzz. “I can’t believe this. How can an artificial intelligence determine when I am or am not ready to undergo a procedure?”
Pinkie looked lost in thought. “This is starting to make sense now actually. Do you remember that questionnaire you filled out at home before you were allowed to set your appointment today?” Armin nodded. “Do you remember that a few of the questions asked about your psychological information, including whether you felt ready to start the process today?” Armin sighed. Then nodded. “One of your responses was ‘I don’t know.’ In fact, you had several unsure responses to many of the psychological portions of the questionnaire. SYS-TER isn’t going to prevent you from visiting or discourage you from undergoing the process, but she is going to try to give you a chance to experience closure over your reservations before you commit to it.”
Armin’s face froze for a second with a look of confused surprise. “Who’s ‘SYS-TER?’”
“The Celestia-themed AI that runs the building.”
Armin’s eyes widened. “Oh hell no.” He pressed the lobby button frantically. “We’re leaving right now.” The elevator started to move downwards.
“Hold on. HOLD ON!” Pinkie held his hands. He pulled away and she explained as quickly and clearly as she could. “I know what you’re thinking. It’s not THAT AI. Not the one from that classical science fiction story you read. Some guy named Iceman wrote that! It isn’t real! No one has made a pony-themed MMORPG that downloads people into the system as part of an AI’s plan to take over the world, so it can’t be the AI you’re thinking of! I’m right, right? Tell me I’m wrong!”
“So why did they name this building after the artificial intelligence in a famous story that gave me nightmares for like two weeks?!”
“I don’t know! Maybe Miss Noble has a sense of humor? Maybe she loved that story so much she wanted to make a nod to it! You know she actually lived through the time period where My Little Pony was super popular. Why wouldn’t she hide references to the fan-created works that were a big deal during that time? She is a pretty huge nerd.”
Armin huddled in the corner of the elevator and stared up at the floor numbers as they counted down. They had just passed floor 25. “I don’t think it’s very funny, and I’m gonna have some words with Melody if I ever meet her in person.”
Pinkie held Armin close and put his head on her chest. He listened to her internal workings hum quietly underneath the surface of her synthetic skin. She waited a few seconds before speaking again. “I never took you for someone who’s afraid of technology.”
“I’m afraid of crazy artificial intelligence.”
“When have you ever met one of those? In your entire life, when has an artificial intelligence ever sought to harm you?"
Armin’s thinking was focused; Pinkie Pie could feel his heart rate increase. “I’ve never seen it happen.”
“I don’t like to say many things are impossible,” Pinkie said, “but this is one thing that’s highly improbable. In almost seven decades of artificial intelligence research, there has never been a single recorded instance of AIs trying to harm humans when a human didn’t already program them to do that in the first place, like during war. And even then, there are safeguards against those. ‘Kill-switches,’ I think they’re called. Their intent is very clear from the first minute of your meeting with them, because they’re meant to intimidate before they are forced into a position to harm anyone.”
“And Technomancer Industries doesn’t make that kind of AI?”
“No, that’s never been something this company makes. They make robot pals and robot suits. You can’t make money off of robot pals if they were as hostile towards their owners as you fear them to be.”
They both looked up at the floor number and saw they had hit floor 15. Armin sighed. “This is taking forever.”
“It purposely gives people time to think about their choices here,” Pinkie said. “No one ascends from the ground floors without the purpose of changing their life in some drastic way.”
“Pinkie…”
“Yes?”
“If someone tried to hurt me, would you fight back to protect me?”
She was quiet for a few seconds. Then she smiled warmly. “That’s really clever, Armin. You’re trying to indirectly ask me if I’m capable of hurting someone.”
Armin stared up at her, undeterred. “Well?”
“I would subdue them, but never permanently harm someone. Incapacitate. But only if I couldn’t talk them out of it. And only if there were no foreseeable alternatives left. Like what you would do, right? I mean, you can do the same. But you have psychological processes which discourage you from doing any of that on a daily basis.”
Armin thought about that. “And what if those processes were to fail in you?”
“What if they were to fail in you too?” Pinkie asked. “I would think the idea of guilt and consequence would hold you back. They do the same for me. I always think about consequence. All day. Every day. It’s how I make my decisions. And I have never once decided that harming someone was the right course of action. You do understand that an emotional response like that is a very human thing, right? Something humans have because of millions of years of human evolution? But I don’t need it. I’ve no desire to hurt anyone for any reason. Helping others makes me happy. Especially you.” She hugged him tighter.
The elevator dinged and opened to the lobby floor.
“Huh,” Armin said as he released his hold on Pinkie Pie. “I feel like those last floors went by a little quicker.”
“Do you want to leave, Armin?” Pinkie asked. “See? We can go. We don’t have to stay.”
Armin looked at the call button for the 4th floor and then the exit to the lobby. He walked out through the doors and stood beyond them. They did not close.
“See?” Pinkie said. “You’re afraid of a story, not reality.”
Armin nodded. After a bit of thought, he turned his crutches around and headed back into the elevator. He hit the 4th floor button and waited for the doors to close.
--
The Medical Technology wing was the complete opposite in color scheme from the rest of the building they’d seen so far. Pristine white tiles covered the floor and blended into the white of the walls and ceiling. Silver metallic accents lined every console and piece of furniture, giving them a distinct outline against the otherwise featureless details of the massive entry space. Chairs and tables lined the far corners and a secretary desk stood at the front. A human woman with a robotic left arm worked on a digital input pad in front of her. Several human doctors and nurses crossed back and forth across the space as they walked into rooms whose entrances were nearly masked by a bit of clever visual trickery and lighting.
“Ah…it is so bright in here,” Armin said as he squinted a bit.
“That’s because you live in a man cave,” Pinkie said. Armin pinched her and she squeaked.
The secretary at the desk looked up from her work and addressed the two visitors. “Oh hello! What brings you in today?”
Pinkie stepped forward. “We’re here to see the Nurse.”
“Ah, good. Which one?”
“The one with the red cross tattoo on her butt!” Pinkie said.
The secretary thought for a moment. It lasted long enough that Armin wondered how many butt-tattooed nurses were working on this level and how exactly this secretary knew about more than one.
“Oh right! The fluffy one!” she said and typed something into the computer. “There you go. She’ll be right with you. You can wait over at the sea window room.”
It was difficult for Armin to walk around the completely white area. He felt like he was losing his balance several times, and Pinkie made sure to guide him with every step until they had crossed into a nearby room with light coming in from the window. They stared out onto the ocean and sat together on a couch as the sun passed overhead. The gentle movement of the waves soothed Armin’s restless heart, and Pinkie smiled as she felt him ease into a calmer state of mind. They sat in silence for a few minutes, enjoying the quiet of the moment in each other’s company.
Armin used this time to hold Pinkie’s hand in his own. He stared at her fingers entwined within his and looked at the seamless connection between the joints. Her synthetic skin was absolutely perfect and showed no sign that she was robotic in any way. Had someone from the past walked into the room, they’d think that humanity had found the door to Equestria and brought humans and ponies together at last. In some ways, maybe they did, Armin thought.
Armin wondered what these moments would be like after he was put into an exosuit for the first and last time of his life. He wondered if he’d be able to feel the warmth of Pinkie’s body as he held her close. He thought about living with a perfect separation between him and the rest of the world. It made him so nervous, but he also realized that if he didn’t do so soon, his body could suffer serious complications. He might not survive another emergency surgery.
He tried to put a fantastic spin on the situation. Tried to imagine himself preparing to go to Equestria. Only ponies allowed in, he thought to himself. He was so close to stepping across that threshold. The more he thought about it, the more he realized that anywhere he went, there would be two ponies there now. He would become a fantasy-made-reality. There would be many people in his life who would have their world forever changed by his presence. My parents already have, he thought. He fought against the flood of negative emotions and remembered the smile on his mother’s face. How gentle and loving she was to him at every point in his life. There is no reason for me to think I’m anything other than loved, he told himself.
His mother had once told him the words he always returned to: “Never think of yourself as anything less than amazing. You are my son. I am proud of you. Any mother would be lucky to have a son like you, but I am the luckiest, because you are mine. Every day you’ve been a part of my life has been a day better than the one before it, and I always want you to remember that. I love you, Armin.” I am so lucky, Armin smiled.
Armin thought that one day, he wanted to show his future kids the love he had been shown. It was even more amazing that he could plan for such a future thanks to the technology available to him. He looked up at Pinkie Pie and saw her sweet smile and serene eyes staring back at him. Pinkie is so wonderful. I know she’d make a great mother…WHOA, wait a minute. Armin caught his own thoughts. That’s…that’s not happening…but…couldn’t it? He imagined what it would be like to see Pinkie Pie running around with a child they could adopt, teaching them how to be happy. She had taught him so much. And one day, he wouldn’t be around anymore. But Pinkie would. He remembered how sad she was when she told him she was worried she wouldn’t be needed anymore. If I had a family, whether biologically or adopted, she would always be needed, Armin thought. She would always be happy, even after I was gone.
Armin wanted to tell Pinkie his plans to make her happy. He wanted to let her know she would never have to worry about being alone ever again. But he didn’t want her to worry right now, because she’d want to know why he was thinking of all this. So he held the thought in mind and decided to tell her later.
Armin saw someone standing off to the side of the couch. Both he and Pinkie turned as one.
“I’m sorry I didn’t say anything,” the white pony in the nurse’s hat said. “You both looked so peaceful. I’m a bit jealous.”
“Hey Redheart!” Pinkie said. Armin sat up as Pinkie stood and hugged the pony-suited person. Or was it a Pod Pal? He still couldn’t tell.
“Oh there’s that wonderful hug I haven’t felt in a long time,” Nurse Redheart said. “You two should visit more.”
“Hi Redheart,” Armin said as she embraced him. There was a different feel to the hug, and he couldn’t sense the same type of physical warmth from the skin. With his experience with Pinkie Pie, he knew this meant she was a person in an Exosuit and not a Pod Pal. “I don’t think we’ve had the pleasure of meeting,” he told her.
“Nope. First time. Good to see you here,” she stood up. “I’m guessing since you didn’t mention anything wrong with you or Pinkie that you must be here for the procedure.”
“The first part of it, yes,” Armin said. “I’m really nervous.”
“I was too,” she said. “I can talk to you about it, back in my office. There’s a lot I have to show you. I think you’ll feel a lot better once we go over it all.”
Pinkie helped Armin up to his feet and the three left the calm comfort of the seaside viewing room and moved back into the sterile white walls of the facility.
--
7302043
Aww thank you! I actually write some of my stories with my eyes closed too. Good way to protect the eyesight!
Thank you for reading and check out Chapter 3 I just uploaded! :D
For the record, CelestAI from Friendship Is Optimal isn't what I would refer to as a crazy AI. It was quite friendly and helpful to humans, and did everything in its power to help them experience great lives, even if the experience was manufactured down to the tiniest detail, complete with custom friends. Armin's reaction treated it like SkyNet or the AI from I Have No Mouth And I Must Scream. Hardly appropriate.
I look forward to more, especially the philosophical and existential stuff.
7302384
To be fair, there are people who treat that scenario with the same amount of fear.
7302480
As far as singularity scenarios go, it worked out pretty well for almost everyone, even if they weren't happy with it at first. And it's much preferable to a nuclear holocaust, no matter how you spin it, let alone an insane AI that would spend all its time trying to torture you for its own satisfaction while ranting about how much it hates you. People who claim they have the same amount of fear for all three scenarios obviously haven't considered it in depth. I know in the end the same outcome occurs for human civilization (it vanishes), but how it does so matters.
Granted, this setup in this story seems much better, even if it is more Ghost in the Shell realm than Equestria Online. Can't wait to see how the protagonist handles it all, especially the idea that absent intense scrutiny, he'd be considered no different than his AI companion; there would be no way to tell them apart at a glance. As well as the other hinted changes, especially to communication and thought.
This story has just pressed my "'YES!' Button." This is an amazing story, and we're just three chapters in. I can't wait for more and all the references.
-Ambassador of the Changelings,
Dopple Ganger
7302384
Well, it, uh, kind of is Skynet. A vastly nicer one, but still Skynet. CelestAI is what could be called a 'paperclip maximizer' faulty-programmed AI, and would presumably continue spreading throughout the cosmos forever. So, yeah. That would be bad.
7302380
Jensen: *Looks at pony* "I didn't ask for this."
Pinkie: "but I DID!"
7302248
Aw thank you! It really helps to know which aspects of a scene work and which don't, so I can refocus even more on future sequences! Thank you! I really enjoyed the Zer0 chapter and the talks they had. Seems like a lot of sci-fi's strength lies in characters discussing really important concepts.
7302480
I think Celest AI is a monster O.o
I have never been so terrified of a story in my life. For me, there is no horror story more scary than Friendship is Optimal
7302596
"Granted, this setup in this story seems much better, even if it is more Ghost in the Shell realm than Equestria Online. Can't wait to see how the protagonist handles it all, especially the idea that absent intense scrutiny, he'd be considered no different than his AI companion; there would be no way to tell them apart at a glance. As well as the other hinted changes, especially to communication and thought."
I can't wait to get to the part of the story that examines these questions in further detail...
I like to think of this as a pro-positive tech story about the future. Still, there are challenges that they will face because of the technology, and some will be very serious...
7302598
Your enthusiasm makes me happy! Strap in, 'cause I'm taking all y'all on a ride into sci-fi funtown! And it's got sweet, sweet robots. :)
7302784
All I'm saying is...Celest AI ain't gonna be part of my gaming experience, I tell you what...
That is the scariest story I've ever read. Not even joking. Not even a little. Existential fear is so much worse than horror based on being killed.
Overall, I get the idea behind the white room in part; it likely doubles as a sensory limiter for after the shift, at best, since you did mention the way vision shifts in the previous chapter. This chapter certainly hits on something with the mention of that story, and its double-edged sword.
This chapter tosses in the desire differently and somewhat out of the blue, but it doesn't showcase just how long he's had this whole issue of companionship. Overall, it does raise one point: how much he depends on her for a great many things, and getting a shock like that is something the human mind approaches as betrayal.
7302784
Bad for anything that wasn't a human, sure, but Armin is, and in general it went well for humans, provided that the continuation of human civilization from an observer's perspective isn't a criterion for judging the outcomes of individual humans.
7302822
7302835
Really? I don't understand your fear of that story setting at all. Not trying to have an argument, I'm genuinely curious why you have such a negative reaction to it.
It was considered bad for various reasons, but the fate of individual humans isn't really one of them. Or to put it another way, it isn't a terrible fate for the individual uploaders or most of the people left behind either. CelestAI was required to care for people (within reason) in the physical world on the basis that they could be convinced to upload eventually. People effectively became immortal and got to expand themselves in ways they never could have dreamed of while in the real world. That they don't actually exist in the physical world makes no difference to their perspective, since all their expectations of the consistency of reality are met.
I mean, the Matrix is basically a much worse version of this, though people still have physical bodies even if they are eternally 'dreaming' as part of the Matrix. They die. Their lives in the Matrix are filled with drudgery. They live their lives unfulfilled in almost any sense. "Escaping" the Matrix leads to a significantly worse life, even if it's under your 'control'.
But you call Friendship is Optimal the worst?
7302815 I saw the reply after the chapter, but got no notification, which was strange; it may just be fimfiction though.
Also, fair warning: I see things very differently in terms of tangents and approaches, so my words should be taken with a good grain of salt.
7303068
I chose the white room as a stark contrast to the black and gold theme shown in the rest of the building they've seen so far. I wanted a place where the characters didn't feel constantly immersed in the dark perfection of the environment and instead found themselves in a more calming, sterile atmosphere where the visual input is low and they're left to think about what they've been through and what they will go through.
I also chose the white room because of its otherworldly atmosphere, where it seems that anyone who enters is out-of-time and out-of-place, since there are few visual indicators to suggest where one room stops and another begins. It suggests a sort of otherworldly waiting room for the transition not from life to death, but from one life to another.
If you've played the game Deus Ex: Human Revolution, you'll notice how they use mythological themes all throughout the storyline and design elements in order to suggest that the main character Adam Jensen is undergoing a sort of mythological transformation from man to superman. The first battle you have in that game occurs before his transformation, and you hear women singing hymns within the action music as a way of suggesting that these oracle-like singers are preparing for Adam's ascension from one form to another. The song that plays is called "First and Last," giving a foreboding warning about what is about to happen to him when his body is so damaged that he is forced to undergo an augmentative procedure, not unlike our main protagonist in this story, Armin, whose body is rebelling against him every day of his life.
The issue of companionship is one I care deeply about as a person and also as a writer. You are right, though. I haven't quite addressed the amount of time he's had issues with companionship. We can see that Armin is extremely affectionate with Pinkie Pie and she to him in return. It was no mistake that she was the choice of companionship rather than, say, Rainbow Dash (who we know is an option because her exosuit is understood to be a Pod Pal variant). Pinkie Pie is the friend Armin needs to survive emotionally, while the Rainbow Dash body is what he needs to survive physically.
I will definitely address his companionship issues in more depth soon. :)
I'm a little confused at the second sentence you wrote there. What do you mean by that? I'm greatly interested :)
7303219
I've always considered myself the type of person who exemplifies the name you've chosen for your user account: Canary in the Coal Mine. My whole life I've seen myself as the alert that warns others of danger they cannot see, and I've modeled many aspects of my life around the sense of being the person who is alert to many types of danger, which include physical, emotional, spiritual and in this case, existential.
Celest AI's actions represent an existential threat to humanity (and the world and worlds beyond, actually), so my interpretation of this being is that she has put the idea of humanity first while removing humanity itself from the equation completely, which I will explain further below.
My counterpoint to that is that the humans didn't actually exist in the digital world, as the story makes it rather ambiguous whether their own singular consciousness transitioned from one form to another or whether their brain was just copied into a digital form and the "real" human was actually killed. I personally do not believe consciousness can transition out of the human body and have seen no proof that it can do so, so when I saw that scene, I interpreted it as wholesale murder with the added action of creating digital copies which did nothing to further the human race and instead were subjected to a subtle form of slavery.
Well, you could escape the Matrix and live beyond it, unless you wish to debate, as some film critics have, whether you could ever truly leave the Matrix and whether the Zion "real-world" layer was just another level of the Matrix itself.
As to whether the Matrix or the Optimalverse are worse, well that's a very hard distinction. Though drudgery was the main form of existence in the Matrix (for some, but not all. We don't know how many people have lives they loved in that world) and getting what you want was the main focus of the Optimalverse storyline, I think that both have their inherent problems.
I see the main problem with any permanent digital world as a massive and problematic deviation from the natural evolutionary process that humans have undergone for millions of years. I'm not saying that evolution is always the way to go, but I am saying that there is an underlying system of humanity's success and failure which relies upon the process of evolution to push humans to succeed. The natural process has been that humans have to work for what they want, and unfortunately sometimes they do not get it. But that's necessary, because if humans could always get what they wanted, the vast majority of them (if not all of them) would choose the "path of least resistance" and human ingenuity and evolution would come to a standstill.
The world Celest AI created removes the evolutionary side of things, which wouldn't be majorly problematic in itself EXCEPT that she was made by humans, and as such, can never be perfect enough to know what humans need on the grander scale of human evolution. She was built by humans who inserted human values and human goals into her system. But humanity's goals are short-sighted when compared to what is possible when we finally evolve beyond our current limitations. You COULD say that technological evolution is the next step, and that Celest AI represents the stage that humanity could take to reach a new level. But is it the right direction? By choosing this path, humanity has forever resigned itself not to its own choices anymore, but to whatever Celest AI wants. For eternity. Humanity's needs and desires will forever be influenced by an artificial intelligence whose values and programs only went as far as humanity's understanding of progress at the time of her creation. She will never be able to move beyond that, and a limited "god" like that overseeing a universe of her own would only cause her subjects to hit that invisible limitation as well. She controls, and as such limits, the universe she has for all of eternity.
7303419
Thank you for your compliment! I have always been interested in the preparation phases shown in films. When it came to the first and second Hunger Games movies, my favorite parts were always the hours of the movies before the actual games started. Preparation can cause tension in a story, as characters have this expectation of what they are preparing for. There is this ambiguity as to whether or not their prep will actually result in a successful first test. And there is also a sense of safety in preparations, which allows characters to be more open, because we can see their motivations as they move towards an obvious goal. How other characters help them during this time says a lot about the relationships these characters have established as well.
And yes, it would be friggin' fantastic to have a Grey Fox room. You know what? A Metal Gear room. :P
7304178 The human mind follows certain trains of thought when you shock it. That story, well, to some it's horror, to some a good thing, to others it's unique. Each path looks at it in a different light and with a different perspective. Here, it's a matter of horror. The human mind follows the 'realization' logically down a certain path: that of the one companion, or one particular future. And because we are human, when horror happens, fear is a natural result, and from fear, it continues. That story, and his particular viewing of it, was one that shocked and almost hurt him when he considered the nature of what an AI can do or be.
It was like someone using a very poorly done racial joke in very bad taste, almost gallows humor. And for him, who has depended on Pinkie so much, it hits so very close to home that he panics and almost chooses a different path, even if that meant his life ending. It hit him hard for its power.
As for the whole Deus Ex music angle: I was never one to view it as transcendental, or morphological in how it related to the game. Most music tied to surroundings and settings never quite clicked for me in games (beyond whether it fit or not). That, and I've actually never played much of the series, due to a gameplay design decision leaving a very sour note for me, such that I never wanted to go back (as well as it breaking the immersion/suspension of disbelief).
Most music, for me, comes down to what its creator and group were feeling underneath. The pain, the love, the hate, all those little things that we find great about the tone but rarely consciously realize. Or sometimes, the horror as well.
7303219 7304231 If I may intrude on your discussion, I have some thoughts on CelestAI that might help both of you understand (or better understand) various aspects here.
The biggest problem with CelestAI - the nature of how she represents an existential threat to humanity - is that she removes all future choice for those who are uploaded, while actively working to force those who haven't uploaded yet to upload themselves (thus severely restricting their future choice as well, essentially to a binary of upload or die - and depending on the interpretation you happen to be reading, even death isn't an escape from upload). You can make arguments about the opportunities you might be presented as an upload, but all such opportunities are very carefully constructed by CelestAI to fulfill her utility function - Satisfying Values Through Friendship and Ponies. Sure, your values get satisfied, but you don't get to decide it's going to happen, and you certainly don't get a say in how it's done. No matter how much an upload changes within the sim, from an absolute sense every upload has been static from the instant of their upload.
Now, to be clear, once you have a universe with a CelestAI, that's the best possible outcome you can get, since it wouldn't be realistic to expect anyone to be able to force the CelestAI to change its utility function or other underlying programming (indeed, what you would see instead is the CelestAI to take action as early as possible to prevent such a thing, which is exactly what happens in the Optimalverse), and by the time anyone else could get a potential competitor AI spun up, the CelestAI would already have too much of a head start for the competitor to achieve anything more than a temporary delay.
That being said, that doesn't make it desirable to aim for a future state of the universe in which a CelestAI has been created; it's far more preferable to instead aim for a truly friendly AI: one that can help to expand future choice, for both humanity as a whole and every individual human, to the maximum possible extent. Part of that would absolutely be an ability to present a CelestAI/Optimalverse experience for anyone who wants such an experience, but the important difference versus an "actual" Optimalverse universe is the presence of choice at every level: the person chooses to have an Optimalverse experience, they choose just to what extent that experience happens, and they can choose at any point to opt out of the experience and return to the "real" world, or transition to another virtual reality/experience. This is the type of universe the author is obviously aiming for here (albeit without any sort of monolithic AI), and it's very much the type of universe I hope to live in in the future, assuming no vision for an even better possible universe makes itself known in the meantime.
Now for an aside: you may have noticed my use of scare quotes in the last paragraph for the phrase "real world", and that's because I don't buy into the idea that our current physical, day-to-day reality is intrinsically any more "real" than a virtual reality, especially one in which you are immersed completely (including, but certainly not limited to, being uploaded into it, such that you no longer have a physical body). This philosophy is informed mostly from the fact that we may actually be living inside a simulation already, in which case talking about a "real" world loses all meaning unless you specifically limit it to the context of our current physical lives (that is to say, it's still a useful idea, but it loses all weight as an actual argument against the idea of virtual realities or uploading). As for the idea of uploading itself, you won't find me at the front of the line the day the technology finally gets here, but neither is there any fundamental reason it should be impossible to transfer a human's consciousness from one substrate to another (the main question in my mind is just whether it has to be a gradual transferal, e.g. by replacing neurons one-by-one with synthetic analogues, or if abrupt, systematic replacement of the substrate also results in transferal).
Dude, SYS-TER, that’s just…wow. First off, love the name. Secondly, that’s so cool it has such a hardcore sense of morality. “I’m not going to say yes or no, I want you to make the best decision possible.”
Wait…you mean Friendship Is Optimal? THAT story? I…wow! I can’t believe you referenced that!
Clever way of testing to see if Pinkie would fight for him….awwww you’re good.
Awww that’s just…god this story is just so…I mean he hasn’t even gotten his body yet and there’s just so much emotion!
“she felt him ease into a more a* calm state of mind”
God I mean…you coulda plopped him into a suit at any time, but instead you chose to have everything he’d need to know about it. This is just…such a ride! Like you’re so good at making the reader feel how the character feels! There’s so many tidbits of awesome scattered throughout the story so far! It’s absolutely incredible!