Tuesday, May 20, 2014

Constructive Complexity, or Posthuman Ethics and Value Systems

George Carlin once asked, "If God loved you, how come He gave you tumors?"

Let's find out.
Table of Contents
I. Background
II. The Rectification of Names
III. The Danger of Wireheading
IV. Beyond Wireheading
V. And Then What Will They Do?
VI. Aren't They Kind of Bastards, These Posthumans?
I. Background

One of my basic principles is that if you know that you’re going to believe something at the end of the day, then you may as well skip all of the fighting and pushing and pulling and awkwardly hanging on to your old, mistaken belief, and get right to the business of adopting the belief that is less wrong. If you know that you’re going to lose your present belief in the end, then that’s a sign that some part of you already knows it’s incorrect.

This idea can apply in other situations too. There have been many arguments where I’ve seen where things were going and said something along the lines of, “Look, I’m going to say this, you’re going to say that, and then I’ll say this other thing, so let’s skip all of that and get to the end, where we admit that neither of us is going to convince the other.” This not only saves both of us some time but can actually leave the other person more likely to take my side in the future than if I had persisted, because people become more convinced of the veracity of their beliefs simply by arguing for them, even if their arguments are baseless.

Where this serves to make inevitable my slow descent into insanity (relative to everybody else) is that I began thinking, “If Trans/Posthumanity is going to think of something in a given way because of technologies that we know are possible but just haven’t come into play yet, and if these technologies are going to force us to rethink a given concept for reasons that we can articulate pretty well already, then the only reason to not let those reasons sway us now is that our collective arm isn't being twisted yet.”

This is kind of silly, in my opinion, and the best decision to make is to think, so to say, as forwardly as possible. I also think, if I may be so bold, that the refusal to do this is a failing on the part of many transhumanists. Not all are guilty of it, but for those who understand the implications of, say, mind uploading and then fail to update their conception of the self? I am sorry, but I don’t see how that makes any sense at all.

What follows is derived from this desire to think forwardly and to adopt the beliefs that Trans/Posthuman civilizations will likely adopt.

II. The Rectification of Names

This section may be a little dry but trust me, it’s worth it. I want to make sure that we’re all on the same page and speaking the same language when we get into the meat of my argument.

Complexity
  1. The state of being complex; intricacy; entanglement.
  2. That which is and renders complex; intricacy; complication.
Complex
From French complexe, from Latin complexus, past participle of complecti (“to entwine, encircle, compass, infold”), from com- (“together”) and plectere (“to weave, braid”).
  1. Made up of multiple parts; composite; not simple.
  2. Not simple, easy, or straightforward; complicated.
Simple
  1. Uncomplicated; taken by itself, with nothing added.
  2. Without ornamentation; plain.
The preceding definitions (sans irrelevant meanings) were lifted from Wiktionary, as are any that follow in quotation marks, unless they are attributed to another source. For our purposes we could also say that diversity = complexity or that 1 bit of data is less complex than 2 bits of data.

I trust that my meaning will be understood but nevertheless we will, for the sake of making certain, go over a few examples.

Less complex                  More complex
A stick figure                The Mona Lisa
An IBM 650                    A Microsoft Surface Pro 2 tablet
Diary of a Wimpy Kid          James Joyce’s Ulysses
Monocultural societies        Multicultural societies
You                           You, after you learn something new
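For what it’s worth, the bit-count version of that claim can be made concrete in a few lines of Python. This is only my own toy illustration of the “more bits, more possible states” reading, not anything from the definitions above.

# Toy illustration (mine, not the author's): under the rough "more bits =
# more complex" reading, n bits can distinguish 2**n states, so each added
# bit doubles the number of configurations available.
for n_bits in (1, 2, 8):
    print(n_bits, "bit(s) ->", 2 ** n_bits, "distinguishable states")
# 1 bit(s) -> 2 distinguishable states
# 2 bit(s) -> 4 distinguishable states
# 8 bit(s) -> 256 distinguishable states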

And now we get to go into some other terms. Constructive complexity is one of those deceptively simple phrases that hide a whole lot, mainly in the first half. “Constructive” covers everything from “it can’t be too much complexity for a person or system to handle” and “while it may produce pain, it should not produce more than what can be justified” to “the complexity is great if it doesn’t overwhelm someone and is even better if it allows that someone to handle an even greater amount of complexity from there on out.”

I might be able to write a small pamphlet on the concept of constructive complexity, but I think that we could get along well enough for now by summarizing it as complexity that is comprehensible and does not cause undue suffering as a consequence of attempting to comprehend it. Undue suffering is any suffering which is not, on balance, outweighed by the benefits. Exercise can bring suffering but it is not undue because proper exercise brings benefits that make up for this.

Posthumans are beings that have so far progressed beyond our present state as to effectively be no longer human. Just as our nonhuman ancestors could be termed “pre-humans,” these are in terminology and reality post-human beings. The term Posthuman will be used without regard for the actual species (present Posthuman civilizations, should they exist, would not have been derived from Homo sapiens but from some other species). This is very chauvinistic of me but at present there is no species-blind version of the term, so far as I am aware.

In philosophical discourse, the hedon is “a unit of pleasure used to theoretically weigh people’s happiness.” Someone with one hedon is less happy than xir neighbor with two hedons. The opposite of a hedon is the dolor, a unit of pain/anti-pleasure. From Less Wrong (which shall handle the remainder of our quotes here), “We might say that eating an ice cream cone yields 3 hedons to a normal individual, while solving an intensely difficult logic puzzle yields 15 hedons. The frustration undergone in the course of figuring out the puzzle might be judged as -20 hedons (or 20 dolors), depending on the puzzle’s difficulty and the individual’s temperament. The specific numbers are generally not too important; the main point is to give a rough sketch of how the enjoyment of one experience relates to others.”
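Purely as bookkeeping, the example numbers from that quote can be tallied like so. This is a toy sketch of mine, and the figures are the quote’s illustrative ones, not real measurements.

# Toy hedon/dolor arithmetic using the illustrative numbers quoted above:
# hedons add, dolors subtract, and the net tells you how experiences compare.
experiences = {
    "ice cream cone": {"hedons": 3, "dolors": 0},
    "difficult logic puzzle": {"hedons": 15, "dolors": 20},
}

for name, e in experiences.items():
    print(name, "-> net", e["hedons"] - e["dolors"], "hedons")
# ice cream cone -> net 3 hedons
# difficult logic puzzle -> net -5 hedons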

Orgasmium is a substance, or perhaps the physical state of a person, “which is in a constant state of supreme bliss.” Speaking of “reducing a person to orgasmium” could be a succinct way of describing someone who has been put in this state. Another term, seemingly more popular, is wireheading, or “the artificial stimulation of the brain to experience pleasure.” Less Wrong describes this as “counterfeit utility.” Cocaine is an example of a means of proto-wireheading but differs in that it is extremely detrimental in other ways. Wireheading, on the other hand, could theoretically be maintained for so long as the body was properly sustained.

A terminal value (AKA intrinsic value) is “an ultimate goal, a goal-in-itself.” If you value some things because they let you get other things (money lets you buy stuff, for example) then those things are not terminal values. If you want something simply for itself, however, and for no other reason, then it is a terminal value.

III. The Danger of Wireheading

Less Wrong points out that “in theory, wireheading with a powerful enough current would be the most pleasurable experience imaginable.” That is, if the brain is capable of experiencing a given number of hedons, and wireheading is capable of artificially stimulating the brain so as to experience hedons, then there is nothing in the universe that could produce more hedons than wireheading. The very fact that it is possible for something to produce X hedons means that the brain can, under other circumstances, be artificially stimulated to experience that same amount. But better, because with artificial stimulation those hedons can be maintained more steadily, for longer periods of time, and for less work.
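The shape of that argument can be sketched as a toy decision rule. The numbers here are made up by me and only the ordering matters: an agent that scores options purely by hedons gained per unit of effort will pick wireheading by construction.

# Toy model of the argument (made-up numbers; only the ordering matters):
# an agent whose terminal value is hedons picks whatever yields the most
# hedons per unit of effort. Wireheading is stipulated to match the best
# natural experience while costing almost nothing to sustain, so it wins.
activities = {
    "ice cream": {"hedons": 3, "effort": 1.0},
    "logic puzzle": {"hedons": 15, "effort": 5.0},
    "wireheading": {"hedons": 15, "effort": 0.1},
}

best = max(activities,
           key=lambda name: activities[name]["hedons"] / activities[name]["effort"])
print(best)  # wireheading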

If hedons are truly what you want, then you are going to become a wirehead. At that point you will cease to play any meaningful role in the story of your civilization, except perhaps as a cautionary tale. There is also danger in addiction. That is, our bodies consider hedons to be a terminal value, which is the main reason that we can become addicted to hedon-producing activities that we know are bad for us in other ways. And if one can be forcibly addicted to a proto-wireheading substance like meth or heroin then it stands to reason that it would also be possible to be addicted to the real deal, the perfected wireheading process. For that reason, some who don’t consider hedons to be their terminal value could still be damaged and, unless the damage is repaired, similarly fail to play any meaningful role.

Incidentally, we find another danger presented by the possibility of wanting things but not knowing why, or not knowing that we want other things even more. Do you really value happiness more than anything else, or do you just think that you do? This can be disastrous because if we have known conscious values and unknown, subconscious terminal values then we will act on the values that we know, not the ones that we don’t know. Where this leads to a conflict we can actually end up less fulfilled than we were before and not know why.

While going further into this idea is a discussion for another day, it is relevant because there are some who believe that they are after one thing when really what they are after is hedons, and the thing they believe they want is merely the means by which they can achieve hedons most effectively or acceptably. Given an infinite lifespan, they will either change their terminal value to something other than hedons or they will succumb to it and become wireheads. Given a finite lifespan, they may do either of those things, but even if they don’t, they will die.

What this means is that you will be able to divide people into two groups: those that value hedons and have become wireheads, and those that do not and have not become wireheads. Given the current march of technology it appears that wireheading will become too simple to achieve for it to be effectively removed as an option. Whether the wireheads have actually died or simply been reduced to orgasmium until the stars burn out, I feel it would not be incorrect to refer to the non-wireheads as “the survivors.”

IV. Beyond Wireheading

What would the survivors value? If they stop valuing hedons then what will be their terminal value?

You could suggest a number of things, actually, but there appears to be only one possible terminal value that will encourage development and dynamism: constructive complexity. I propose that any civilization that has reached beyond its own solar system, being advanced enough by this point for wireheading to have been available to it, will have adopted constructive complexity (as filtered through its psychology, if the details are overly anthropocentric) as its terminal value.

To put it most pithily, if to Plato the Good and the Beautiful are identical, then in this value system the Good, the Beautiful, and the (constructively) Complex are likewise conflated into a single whole. It is beautiful because it is good. It is good because it is constructively complex.

As mentioned above, complexity is more valuable if it is not only comprehensible and free of undue suffering but also strengthens our capacity for complexity, allowing us to comprehend greater amounts of complexity. On the other hand, complexity that is incomprehensible is no better than, and possibly in some ways worse than, undifferentiated simplicity. Both are noise on the airwaves.

I further propose that authenticity will also be highly valued, as a means to fulfilling this terminal value. The real thing, with full substance and an actual history of varied causes and effects, will always be more complex than the fake, which will at least lack that history even if it is an atom-for-atom exact duplicate. If complexity were chocolate bunnies then the authentic subject would be solid chocolate and the false model hollow. From the outside they appear to be the same, but that does not change the fact that one is less chocolaty/complex than the other.

Self-awareness, or luminosity, is another probable value. Those who know X are less complex than those who know X and also how they think, especially as the latter group can more effectively optimize itself whither it chooses (such as in the direction of greater complexity).

V. And Then What Will They Do?

Make more civilizations. Not just spread civilizations. Make them.

A Posthuman civilization is going to have a certain mindset and cultural history and so on that descends from its evolutionary heritage, random quirks of its history, and the like. And all of this is going to make its culture unlike any other.

Now, remember that a monocultural society is less complex than a multicultural society. A group of five Germans is going to have dynamics less complex than those of a group with a German, a Russian, a Scot, a Canadian, and an Amish person, because all of them will be coming to the table with different worldviews and more ways to see the same picture. To the extent that it is sustainable (i.e. still constructive), adding another culture to your multicultural society is always going to result in higher complexity. As a side note, it will also result in higher complexity to have both “melting pot” and “identity-protective” segments, as demonstrated by the different ways in which Jews in the United States have chosen to interact with the rest of American culture, which together make a more complex whole than any one of those approaches taken alone.
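One crude way to put a number on that, which is my own stand-in measure and not anything the rest of this post commits to, is to count the distinct kinds of pairing available in each group.

# Crude illustration (mine): count the distinct culture-pairings available
# in each five-person group. Five Germans offer one kind of pairing; five
# different backgrounds offer ten.
from itertools import combinations

def distinct_pairings(group):
    return len({frozenset(pair) for pair in combinations(group, 2)})

five_germans = ["German"] * 5
mixed = ["German", "Russian", "Scot", "Canadian", "Amish"]

print(distinct_pairings(five_germans))  # 1
print(distinct_pairings(mixed))         # 10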

Now, if you want more complexity, if you want new stories, new culture, new ways of thinking about stuff, then the best way to do it would absolutely be to let life and sapience develop on its own on another planet. Time is no object- you’re a Posthuman god- but authenticity is. You could possibly shortcut the process from the outside by running it in a sped-up simulation but you couldn't shortcut it from the inside. That is, if you want a civilization to form with thousands of years of cultural history alien to your own then it needs thousands of years, at least from its point of view, to develop it.

Granted, this is pretty pre-Posthuman-centered thinking, but I think that I have the gist, if not the details, down. I won’t go so far as to speculate whether a Posthuman civilization would create other civilizations in simulations or create other civilizations in their own present universe, but it does seem to me (with my admittedly-limited thinking) that there’s no reason why it couldn’t be both, and even more. There may even be Posthuman Roger Eberts making all sorts of commentary on the civilization-building projects being performed according to their preferred method, who idly mention how some other method isn’t real art and then come under all sorts of fire because it turns out that some people really disagree with that.

(I jest)

Also, Kurt Gödel’s Incompleteness Theorem demonstrated that “any system of logic must necessarily either be internally inconsistent or incomplete. In other words, Gödel’s proof demonstrated for the first time that there exist statements that are unprovable in any logic system, and that all arithmetic as we know it is at best incomplete, at worst inconsistent. It is logically impossible to construct a single grand ‘metalogic’ capable of subsuming all other modes of logic while remaining consistent” (Robert A. Freitas, Jr., Xenology).

It can also be seen that our psychology biases us toward particular manners of thinking and may even blind us to some others. It is even possible that there are some ways of thinking to which we are, from inside at least, permanently blind because we can’t even begin to conceive of them in order to figure out how to make their use possible. Dialogue with aliens, then, could open up a Posthuman civilization to new information not just because of the time spent to learn it but because they can think- and grasp truths- in a way that the Posthumans could not until they were shown the path.

So in addition to or instead of (I strongly think “in addition to” is more accurate) the first reason for creating new civilizations, Posthumans might create them (and let them develop on their own because custom-tailoring them would run the risk of setting limits based on the limits of the Posthumans’ psychology) in order to glean new truths from them.

(But again, that it is possible for even isolated Posthumans to be blind to a particular way of thinking is pure supposition and what I describe may not be needed at all)

Now, it may be a bit “shooting in the dark” to suppose that Posthuman morality would have to be based on constructive complexity, but it does seem that way. In order to survive the Wireheading Stage it has to value complexity, and value it over hedons, and if it’s going to be sustainable then it has to have some concept of “self-destructive complexity.”

VI. Aren't They Kind of Bastards, These Posthumans?

Well, maybe. It depends on how you look at it, but be careful now. If they’re objectively bastards, then so are you if you ever let your child get a scraped knee.

“Being nice,” as we humans understand it, isn’t much of a factor for Posthumans, really. Especially if the Posthumans, say, intend to bring everybody back at some point (if we’re in a simulation then I think it should be simple to copy each mind before it dies, and there are outside-simulation possibilities like quantum archaeology). In that case, compared to everything that a person could experience post-singularity (and post-resurrection, if necessary), even the most horrible life is to a Posthuman the equivalent of a kid who gets a scraped knee while xe’s learning to ride a bike and xe’s crying and crying and wondering how you could be such a horrible parent as to let go of the freaking bike when you knew xe was going to crash on xir first time.

It's the end of the world for that kid, but you're just standing there and thinking how silly this is because you can see the bigger picture. In fact, because pain can lend itself to complexity of experience and greater self-awareness and is therefore a-ok so long as it doesn't cause too much damage (and for a Posthuman, what is too much damage?), the difference between one million murders and two million has, as Ben Goertzel says, the same moral significance to them as a rounding error.

This “rounding error” mentality may be held even if it is impossible to save the dead. This is important to remember: Constructive complexity does not hold hedons as a terminal value. It does not hold human life, or any other kind of life, as a terminal value. Its terminal value is constructive complexity, or complexity that is not self-destructive, and it values hedons or human life only insofar as these things increase the net complexity of the universe. This is why, remember, constructive complexity allows- even encourages- the existence of dolors. A universe with 20 hedons and 20 dolors, all properly distributed so as to not discourage further complexity, is better than a universe with 20 hedons and no dolors. The "constructive" qualifier is there because the wrong kind of complexity (the non-constructive kind) damages a subject's capability to add to the net complexity of the universe. 
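To make the contrast with hedonic accounting explicit, here is a deliberately crude sketch of my own framing, not a formal model from this post: a hedon-summing value function prefers the painless universe, while a complexity-scoring one prefers whichever universe ends up more complex, counting dolors against it only insofar as they destroyed capacity rather than purchased complexity.

# Deliberately crude sketch (mine, with placeholder numbers; only the
# orderings matter) of the two value functions being contrasted.
def hedonic_value(universe):
    return universe["hedons"] - universe["dolors"]

def constructive_complexity_value(universe):
    # Complexity is the terminal value; dolors count against a universe only
    # insofar as they destroyed capacity for further complexity.
    return universe["complexity"] - universe["capacity_lost_to_suffering"]

painless = {"hedons": 20, "dolors": 0,
            "complexity": 10, "capacity_lost_to_suffering": 0}
struggled = {"hedons": 20, "dolors": 20,   # the dolors here were "constructive"
             "complexity": 18, "capacity_lost_to_suffering": 0}

print(hedonic_value(painless), hedonic_value(struggled))      # 20 0
print(constructive_complexity_value(painless),
      constructive_complexity_value(struggled))               # 10 18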

What does this mean? Constructive complexity value systems permit permanent death as an acceptable cost for increasing the net complexity of the universe. A life, lived, has necessarily increased the net complexity of the universe so long as that life has not acted in a way that reduces it (for example, by releasing a plague that killed everyone on the planet).

Constructive complexity is not bunnies and sunshine. It’s just probably the only terminal value that will let us survive the Wireheading Stage whilst maintaining various traits that we consider to be fundamental to the human experience (art, consciousness, dynamism, exploration, scientific progress, etc). The transition to Posthumanity will leave us Post-human, but it is neither inevitable nor impossible that we could also be left without anything to betray our origins.

That said, if it is possible to save a mind from destruction by death, whether by copying it at the time of death, recovering it through quantum archaeology, or some other means, then it stands to reason that a constructive complexity value system will support doing so, and we can rest assured that if it is possible then it will happen. All else being equal, a group of five beings can have more complex interactions than a group of four beings, and death prevents a being from doing anything at all. That reduces the net complexity of the universe, because beings are necessarily more complex than the same number of atoms arranged in any non-conscious, non-thinking pattern.
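The “five beings versus four” point is just combinatorics; as a quick check (my own arithmetic, counting every possible sub-group of two or more members):

# Quick combinatorial check (mine): counting every possible sub-group of
# two or more members, a fifth being more than doubles the number of
# distinct interaction patterns available.
def subgroups_of_two_or_more(n):
    return 2 ** n - n - 1  # all subsets minus the singletons and the empty set

print(subgroups_of_two_or_more(4))  # 11
print(subgroups_of_two_or_more(5))  # 26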

And from that point of view, yeah, our idea of benevolence doesn't quite line up with theirs, but that's because they're operating on a different value system (and on a greater scale of comparison). They may wish us well, but only in the long run, and we're also running into values dissonance: we think that a benevolent posthuman god would make us happy, but our theoretical post* creators think that they're benevolent because our lives are more complex. Any permanent damage (like death and trauma) is either fixable, through quantum archaeology, rewriting the mind, and so on, or simply doesn't matter to them, because even my understanding of "unjustifiable pain" is too humanocentric for them, and they don't think that the deaths of billions of people are too high a cost for a certain amount of psychological, historical, and species complexity in however many people last long enough to become posthumans. We may not value any of that now, especially against the cost that we paid for that complexity, but when your kid is screaming beside xir bicycle, xe probably doesn't value the ability to ride a bike all that much either.

As twisted as it may seem to us today, if Posthuman civilizations could have interfered with our development in any significant way then we will likely thank them, in the future, for refraining and allowing our history to remain as authentic and complex as was possible. 
