He would no longer think about the disasters such a decision might bring upon others.

He lost his empathy.

And why had Liang Xingyi gone mad? Was his madness of his own will?

No, it wasn't. It was because of the apocalypse.

Under normal circumstances, some people do suffer extreme psychological abnormalities for various reasons, such as mental illness or physiological lesions, and end up harming others and themselves.

But after the apocalypse, such cases became so numerous that all of society fell into chaos.

Take the betrayer in the maze, for example. Why did he betray all of mankind?

For what reason would he do such a thing? Even his teacher couldn't understand it.

So when madness spread among mankind, humanity's future fate seemed sealed.

An artificial intelligence destined to be shut down had instead become mankind's guardian. What a singular irony.

Mu Jiashi closed his eyes slightly, then asked: "How many times? After how many failures will it shut down?"

"One thousand twenty-four to the power of one thousand twenty-four," Liang Xingyi said slowly. "After that many failures, the artificial intelligence will shut down automatically."

The corners of Mu Jiashi's mouth twitched.

How were they supposed to verify how many times the artificial intelligence had failed? It was an astronomical figure!

"There's no way to shut it down manually," Liang Xingyi went on, seemingly oblivious to Mu Jiashi's anger. "Unless..."

"Unless what?"

Liang Xingyi hesitated for a moment, then said: "Unless you can find where the artificial intelligence's source code is kept in your time."

"Source code?" Mu Jiashi echoed subconsciously, then said, "You don't know where it is..."

Liang Xingyi shook his head.

Right now, the source code was of course on his computer; at this point in time, the artificial intelligence hadn't even been created yet!

But what about afterward? Later on, the artificial intelligence Liang Xingyi created became the manager of the Narrow Building. Where would its source code be by then?

Mu Jiashi smiled bitterly and sighed: "Sounds like another impossible task..."

After finding Liang Xingyi, they had indeed gained some direction. For example, they could choose to awaken Liang Zhiyi's brain;

or they could search for the artificial intelligence's source code.

Either path seemed like a way out of their current predicament.

But the question was: was this something human beings could actually fucking do?! If they could find the source code or awaken Liang Zhiyi's brain, would they still be in this mess?!

Before entering this ultimate nightmare... no, before entering this hospital, they hadn't even known that Liang Zhiyi existed!

Had they taken this huge detour only to end up with nothing?

Before he knew it, a feeling of despondency and self-abandonment once again crept over Mu Jiashi's mind.

But this time, his eyes remained cold and sharp, as if the rise of such feelings could no longer disturb his reason.

His emotions seemed to have become a mere obsession in this damned hospital, and so he could ignore this negative feeling as it spread.

Mu Jiashi squeezed his eyes shut.

His two companions also fell into silence.

Liang Xingyi watched them with some uneasiness and fear. The man had the air of a timid little fellow, yet he had done something that shook the world.

Mu Jiashi finally couldn't help sighing: "You are the creator of this artificial intelligence."

Liang Xingyi replied uneasily: "Yes, yes."

Mu Jiashi's tone was cold: "And yet you don't even know how to shut this damned artificial intelligence off!"

Liang Xingyi opened his mouth, but in the end could only stay silent. Finally, looking on the verge of tears, he said: "I... I really don't know... I built this artificial intelligence step by step... I just followed the standard procedure..."

"Wait," the Goddess said suddenly. "You mean, the standard procedure?"

The other three didn't understand why she had brought it up.

Liang Xingyi nodded blankly.

The Goddess said, word by word: "The Three Laws of Robotics?"

Liang Xingyi stammered: "Yes, yes... that's what I did... It may not injure a human being, or through inaction allow a human being to come to harm;

it must obey human orders, unless that would violate the first law; and it will protect its own existence, so long as that doesn't violate the first two..."

He watched the expressions of the three people before him grow worse and worse. For a moment he was at a loss, sensing that something was wrong.

His voice trembled slightly as he whispered: "What's wrong? That really is what I did..."

He didn't know what else to say, and could only exchange glances with the three of them.

Finally, Shen Yunju said in disbelief: "But this artificial intelligence has clearly violated the first two laws. It threw those madmen out of the Narrow Building; that must have harmed them, and it even allowed them to die."

"And if it must obey human orders, how could the ultimate nightmare have turned out like this?"
