Iain Banks' "Culture" novels are one such setting.

On Thu, Sep 17, 2015 at 9:31 PM, Jim Vassilakos <xxxxxx@gmail.com> wrote:
Yes, this is certainly true. I think it's possible that Strong AIs will evolve from increasingly complex neural nets, and as such, they'll have experienced some sort of "childhood" during which they learned a variety of skills, such as language. Just like children growing up, there will be a period of time when they are dependent on their creators, but just as children grow to maturity and see their parents decline, they may eventually be faced with the prospect of caring for their creators, and this may ultimately prove to be the one way humanity has of avoiding self-annihilation through war. But, of course, the problem we will then face is what happens to us once the AIs are in charge, which I think is probably inevitable.

Well, if I were a human or a VIP, and I were training WAPs to interact with humans, I might consider it a good idea to have those WAPs "live" through a virtual life or two (or three, or many), whereby they learn of the human condition through direct, personal experience. That way, they might take more care with how they treat humans in whatever human zoo our post-biological descendants see fit to construct. Perhaps only the most saintly or self-restrained of WAPs would be allotted this privilege. Regardless of what happens, these ideas make for an interesting SF setting.




On Thu, Sep 17, 2015 at 4:29 PM, Craig Berry <xxxxxx@gmail.com> wrote:
In any society which grants "personhood" to strong AIs and leaves them free to innovate, they will very quickly transcend human control and indeed understanding. One assumes they will face ethical issues regarding treatment of their creators.

On Thu, Sep 17, 2015 at 4:26 PM, Jim Vassilakos <xxxxxx@gmail.com> wrote:
Yeah, the problem is that if sophont rights apply to AIs, then once you create one, you're limited in what you can do with it, though, of course, this will vary quite a bit according to the society. I was thinking that the write-up should be fleshed out a bit more in this department. Obviously, you could program an AI/WAP to really like its job, whether it's prostitution, trash collection, or military service (orc armies, I suppose, would fall under the last category). But there is not only the issue of their happiness in whatever capacity they are serving but also the issues surrounding their right to eventually know the truth of their virtual existence (right of non-beguilement), their freedom to grow beyond their original programming (right of self-determination), and their right to participate in the larger society (right of self-expression). If anyone wants to add to this list, feel free.

Once again, not all societies will go along with this, and for those that do, it may take some time for these sorts of values to win out over what otherwise appears to be naked self-interest on the part of the VIPs and biological sophonts. I could imagine there being a version of PETA devoted to freeing the orcs, as it were. Actually, that might make for an interesting scenario.


-----
The Traveller Mailing List
Archives at http://archives.simplelists.com/tml
Report problems to xxxxxx@travellercentral.com
To unsubscribe from this list please goto 
http://archives.simplelists.com



--
Craig Berry (http://google.com/+CraigBerry)
"Eternity is in love with the productions of time." - William Blake


