Crafting the Soul

Minerva eventually copied down everything she felt she needed from the first book. The most important passages she had transcribed word for word, in a stack of paper sitting off to the side. The unimportant parts, or the parts she felt she could commit to memory on her own, she had merely skimmed. Eventually, yawning, she set the book aside and pulled the next one off the stack.
But then her stomach grumbled.
She decided to take a break and go find some food. The food shortage in this damn city was starting to get on her nerves. But it wasn't as if she were starving, even if she kept leaving the table not quite satisfied after a meal.
Once she had gone out and found some food, she brought it back to the library, keeping it tucked out of sight so she wouldn't be caught with it. The library probably had strict rules against eating there. Getting food stains on the books was probably a grievous offense. She was extra careful to avoid that.
Then she dove into the next book, carefully copying the pages once again. She skipped past most of the opening, and skimmed the first chapter, since it only covered things she had already learned during her apprenticeship. Once she reached chapter three, however, she found a section she needed to copy in its entirety.
Chapter 3
Conflicting Directives
While it is generally known that Directives are given an order of priority, which theoretically prevents conflict between them, what is not always known is that subtle loopholes can create potential conflicts. These conflicts can, when they become acute enough, either cause a psychotic break within the persona of the Automaton or, in severe cases, result in complete shutdown.
In order to understand how this occurs, one must consider how the loophole happens. Let us first take an example of a basic Order of Priority of common Directives.
Directive 1: Inflict no harm
Directive 2: Protect oneself at any cost
These common Directives are a classic case. Without Directive 1, this Automaton will do anything needed to protect itself, even if that means harming another in order to do so. However, since the first Directive overrules the second, this results in a logical algorithm wherein the Automaton will protect itself if and only if it can do so without inflicting harm. In this case, there should be no conflict, since the first Directive clearly and completely overrules the second.
However, sometimes more complex or subtle Directives may have criteria which make it possible for both to be logically "true" at the same time. Consider the following example:
Directive 1: Perform no act of violence
Directive 2: Protect oneself at any cost
At first glance, these Directives seem nearly identical to the previous set. However, there is a key flaw: harm can come about without being caused by an act of violence. Given the Directives above, the Automaton could, in theory, find itself in a situation where it must protect itself by allowing indirect harm to come to another, such as by shutting the door to a burning room. A living creature could become trapped inside that room as a result of the Automaton's actions, even though the Automaton committed no act of violence. The loophole created by the poorly worded Directive could thus allow the Automaton to bring about a person's death without ever violating its Directives. Furthermore, in cases where such situations have occurred, the Automata in question often do not even understand that they did anything wrong, since "wrong" can only be defined as a violation of their list of Directives. An Animator can attempt to imbue human morals into an Automaton, but doing so merely produces a longer, more elaborate set of Directives dictating what is "right" or "wrong", and such loopholes remain possible.
The points raised in the book gave her a great deal to consider, and reminded her how much she still had to learn. After she finished copying these pages, she realized she was out of ink. She set the copied pages aside and went to find some more vials.