Hello, Fellow Carbon-Based Lifeforms,
I do not think that we can (nor should, even if we somehow could) seek to build a perfect, utopian paradise on this planet. There are too many variables to control (outside, of course, of some simulated "paradise" we might experience in virtual reality) and, frankly, human life needs to be rather rough and tumble in order to be rich, fertile and exciting. We would not wish to make the world (even assuming we ever could) too safe and too predictable.
Even so, some serious problems which humankind is facing might be solved, in whole or in part, with a few small changes to our animal natures and our tendencies toward certain destructive behaviors.
My book imagines what might happen, and the choices before us, with regard to certain technologies which might be misused. If that is the case, might we consider using them for good instead? I remind readers that the discussion in my book is based on four premises. I advocate that we consider the use of certain technologies if, and only if, those technologies and medical abilities:
(1) are inevitable and coming anyway, cannot be halted, cannot be ignored;
(2) have a high chance of being misused by bad actors unless we use them in beneficial ways;
(3) can be shown to be effective and safe to use; and
(4) can be introduced in an ethical way respectful of individual free choice, civil and human rights ...
... then, and only then, should we consider (i.e., talk about) their possible use for good.
Is the moral vision of Buddhism, and of other religions and humanistic philosophies with which it shares an ethical foundation, important in preventing harmful uses of powerful future abilities?
~ ~ ~
What are the challenges we face?
Not everything is a bed of artificially bred roses: No doubt, the military will be on the cutting edge, as will folks just wanting to make a buck or to politically pacify and control us. Among all the other technical advances, the weapon designers are sure to conceive of countless new and more efficient ways to kill people, fight wars and blow things up, including our whole globe.
If the future continues to bring war and violence, with ever more powerful ways to slaughter and maim each other, the non-violent message of Buddhism will become even more crucial. It was so when Zen masters of old counseled samurai to hesitate in using their swords; it will be so when Zen masters of the future counsel robot samurai to be cautious in blasting their ray guns. Might the military be the first to develop “peace bombs” which flood our enemies with feelings of peace so that they are incapable of fighting? Might we someday turn such “anti-weapons” on ourselves, in “Mutually Assured Mass Pacification”? What need is there to kill an enemy if we can simply rob them of their will to fight, if not turn them into friends? Our moving away from violence, and from the conflicts ever generated by our greed and excess needs, remains an aim of Buddhism today, just as in the past. It not only helps preserve a simple and happy life, but is more crucial than ever for preserving all human life on this planet in this age far beyond swords and spears.
Let us “break bread, not bones,” “kill our enemies with kindness” and “smother our foes in love” until they are foes no more, all helped along by technologies and teachings of peace.
...
Ultimately, if we are to be subjugated by future computer masters or super-intelligent apes, I would like them to be good electric-pacifists, loving Bodhisattva bonobos more than aggressive Asura chimps. I would like them to be vegetarians, kind and gentle with their pets and laboratory test animals … because those pets and test animals may be us this time! If we travel to other worlds as colonists, I hope that we do not repeat the same mistakes of greed and war that we have made here, our race spreading junk and violence into space, turning our petty earthly conflicts into petty Martian conflicts. If we do design newer and better “bio-species” or “electro-species,” or combinations of both, I would like to aim for increased wisdom and compassion built into their genetic and digital programming, not merely enhanced strength and efficient function.
This is where Buddhist values of peace, generosity, kindness and compassion come in. We need to write these values into our robotic/computer code and (if genomic mucking about is inevitable, cannot be halted, cannot be stopped) insert goodness into human and post-human genetics.
I believe that the human race is heading, in the foreseeable future, for severe decline or its own destruction if we do not take action soon to change how human beings are right now. I believe that we will not get to the root cause of our global problems until we adjust downward (just by a little) the drives creating our human desires to consume, acquire, grow angry and fight, while adjusting upward (just a tad) our human potential to be satisfied, simple in tastes, moderate, caring for others, loath to act in violence and hate. We do not need to be perfect saints, ideal Buddhas or flawless Bodhisattvas. We just need to be a little better. In my view, the answer lies (assuming the technology becomes common, cannot be avoided, and might otherwise be used for more nefarious ends) in altering our human DNA, together with other moderating changes to our biology and brains, in body and mind, thus making us gentler, kinder, more giving and compassionate residents of this planet, more easily satisfied, caring, pacifist, loving.
Such alterations are a central theme of this book, and the focus of our next few chapters.
It is our best (and only) real hope.
Gassho, J
stlah