“I think we've been around this curve before, with Nick saying that he would grant that a robot feels pain if it acts as if it does convincingly enough.”
A cybernetic organism based on a human might feel pain much as humans do, but with different hardware one should expect different sorts of experiences. Sensors and nerves will have different dynamic ranges, and more or fewer compute resources could be devoted to processing their signals. A robot could run diagnostics to ignore or forget traumas, while humans or cyborgs might not be able to disentangle wanted memories from unwanted ones, simply given the way neurons work.
It seems to me that subjectivity in humans is a high-order effect in which the representation changes with experience. Experience builds on objective events and physical laws, and people share those events, so it is not unreasonable to expect that experiences are in some sense the same across people, perhaps even in their physical manifestation as stored proteins. Cells of the visual cortex work more or less the same way across humans, as do the signal-processing mechanisms involved in locating an audio source. There may not (or may?) be similarly shared structures in encoded memories and high-level skills. Maybe learning is possible the Matrix way? "I know kung fu!"
Marcus