The novel I'm working on right now is set in a future where AI is more deeply integrated into us through neural implants.
Many of the AI-based subplots are about the dangers a non-malicious AI can pose, including the risks of frictionless access to something that can generate simulacra of virtually anything or anyone you want. Without a powerful ideological inoculation against addiction, many people could end up in an opium den of their own minds.
If you lose a loved one, would a mostly accurate simulation calm the pain of loss, or would the simulation's mistakes just make you feel it more? Would you ever leave your simulation if that's where all your loved ones remained? And what sort of inoculation would you need to avoid falling into a trap like that?