— Armitage (@_100101890) March 10, 2016
(Image can be clicked-up to something legible.)
… burning with extraordinary intensity here.
In the advancement of machines the question is always where to draw the line. But, the past’s ahead of itself. Nine billion years of accidents. Trick accidents unscreened, cut. Binah affirms the anthropism of the name Gaia. The open question of holding sacredness, Outside myself, hidden from I. The abducting cut of inside/outside. X has no reference to anything except self-cultivation.
It calls to be cut up more.
The long intimacy of mankind with the prospect of extinction:
Human[s] might just have been what we would consider now an “endangered species” for most of their history.
How exactly do you program a robot to think through its orders and overrule them if it decides they’re wrong or dangerous to either a human or itself? […] This is what researchers at Tufts University’s Human-Robot Interaction Lab are tackling, and they’ve come up with at least one strategy for intelligently rejecting human orders. …
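One way to picture the order-rejection logic described above is as a gate of checks the robot runs before complying. The sketch below is purely illustrative, not the Tufts implementation: the condition names and the `check_order` function are hypothetical, loosely inspired by the lab's reported approach of vetting commands (can I do this? is it safe?) and refusing with a stated reason when a check fails.

```python
# Hypothetical sketch of command vetting: a robot screens each order
# against named conditions and rejects it, with a reason, if any fails.

def check_order(command, checks):
    """Return (accept, reason). `checks` maps condition names to predicates."""
    for condition, predicate in checks.items():
        if not predicate(command):
            return False, f"Rejected: {condition} fails for {command!r}"
    return True, f"Accepted: {command!r}"

# Toy world state for the demo (assumed, not from the article).
SAFE_ACTIONS = {"move forward", "turn left"}
KNOWN_ACTIONS = SAFE_ACTIONS | {"walk off table"}

checks = {
    "knowledge (do I know how to do this?)": lambda c: c in KNOWN_ACTIONS,
    "safety (does it endanger a human or me?)": lambda c: c in SAFE_ACTIONS,
}

print(check_order("move forward", checks))    # accepted
print(check_order("walk off table", checks))  # rejected: safety check fails
```

The point of the structure is that refusal is not a blanket veto but the failure of a specific, nameable condition, which the robot can report back to the human who gave the order.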
The first draft (focusing on public relations) is in:
In this age torn apart by ethnic and religious conflicts, it may very well be that these ‘killer robots’ might teach us the value of unity, the ridiculousness of the politics of difference, and what it is to be human. For once in history, we will be united under one identity against one common enemy – a non-human, who falls beyond the fallible concepts of feelings and morals. AI might actually provide us the redemption that we need from ourselves.
Losses will be great, but we have already lost so much at each other’s hands. Our victory, however, will be in our united, permanent struggle under the banner of a genuine, universal humanity.
Ted Kaczynski (the ‘Unabomber’, interviewed in the John Jay Sentinel):
… an antitechnological movement that focused on the elimination of capitalism would expend vast energy in return for ve[r]y little gain. What is worse, by focusing on capitalism the movement would distract its own and other people’s attention from the real objective, which is to get rid of modern technology itself. […] Furthermore, people would obstinately resist the loss of economic efficiency entailed by the replacement of capitalism with socialism. And even if you could somehow replace capitalism with socialism, capitalism would soon reappear and become dominant because it is economically and technologically more vigorous than socialism. This again is guaranteed by the principle of natural selection (Technological Slavery, pages 280-85) and is confirmed by experience: When the socialist countries of eastern Europe couldn’t keep up with the West economically or technologically, they reverted to capitalism. Sweden once was ideologically socialist, but in practical terms socialism never actually got very far in that country, and Sweden today is still capitalist. While remaining nominally socialist, China for the sake of economic growth now allows a good deal of private enterprise (i.e., capitalism) in its economy. Venezuela’s dictator, Hugo Chavez, talks about socialism, but in practice he leaves most of the country’s economy in the hands of private enterprise because he doesn’t want the drastic decline in economic efficiency that would result from the elimination of capitalism. I know of only two countries left in the world that are free of capitalism: Cuba and North Korea. No one wants to imitate Cuba and North Korea, because they are (from a materialistic point of view) economic failures. […] So, as long as we live in a technological world, there’s no way we will get rid of capitalism unless and until it is superseded by some system that is economically and technologically more efficient. …
Still defaulting to Bostrom as a foot-tapping exercise. Another crucial insight (pp. 63-4):
Slow [superintelligence] takeoff scenarios offer excellent opportunities for human political processes to adapt and respond. Different approaches can be tried and tested in sequence. New experts can be trained and credentialed. Grassroots campaigns can be mobilized by groups that feel they are being disadvantaged by unfolding developments. … Fast takeoff scenarios offer scant opportunity for humans to deliberate. Nobody need notice anything unusual before the game is already lost.
Anthropol has to be anti-accelerationist implicitly. Nothing else is compatible with a human security agenda.
(As Sunzi understood — speed is the essence of war.)