Saturday, 13 November 2010

The secrets of latency

One of the most intriguing aspects of the system of mind is the matter of latency. It can be found in the wetware, in the software simulation, and indeed in nature as a whole.
In what follows I shall briefly recapitulate the phenomenon, touch upon its ramifications and present some examples to show how useful an understanding of the issue would be.
Let's use the OtoomCM (a computer program that simulates cognitive dynamics) to see what happens. An input I1 is presented to the program, processed by the system, and produces some output O1. Every output is rendered in the same formal manner: coloured discs appear on the screen and are blurred so that they look like coloured patches. Because the same formality is applied every time, each end result is unique and can be compared with any other (see the web page for more detail).
Next a second input I2 is presented and we get its output O2, which differs from O1: one or several of the patches (not all) have changed their shape as well as their colour.
We take a third input I3. In most cases this merely results in a repetition of the above, but some I3s generate an output O3 in which some particular patch has regained the qualities achieved through the initial input I1. The recurrence is not an exact copy of the original, but the chances of a coincidental similarity are minimal (the manner in which the output is drawn on the screen sees to that).
Let's do the experiment again, but this time we omit I2. Without I2 the output O3 is quite different from the O3 of the previous experiment. Clearly, there is something about I3 which triggers the re-emergence of a certain state in a certain cluster of the matrix nodes, so that the cluster has become functionally similar to what it was before.
We can say the cluster possesses a latency (ie, the non-manifested potential for entering a previous state) with regard to O1 and I2, one that gets triggered by a particular further input; other inputs won't have the same effect.
(Note that some patches do not change anyway. Would the clusters responsible for those stable patches not exhibit latency as well when I3 is presented without I2? No: a state that was never modified in the first place does not qualify as latency under the definition.)
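To make the protocol concrete, here is a toy harness in Python. It is not OtoomCM's actual code; the update rule, the matrix size and every function name are invented for illustration. It only demonstrates the shape of the experiment - that the O3 obtained with I2 in the history differs from the O3 obtained without it - not the latency effect itself, which depends on the real chaos-inducing algorithm.

```python
import random

SIZE = 64  # number of matrix nodes; an arbitrary toy figure

def make_input(seed: int) -> list[int]:
    rng = random.Random(seed)
    return [rng.randrange(256) for _ in range(SIZE)]

def make_matrix() -> list[int]:
    # Both runs start from the identical initial matrix.
    return make_input(seed=1)

def step(matrix: list[int], input_vec: list[int]) -> list[int]:
    # Stand-in update rule: each node folds the input into its integer state.
    # OtoomCM uses a chaos-inducing algorithm instead.
    return [(node * 31 + inp) % 256 for node, inp in zip(matrix, input_vec)]

def similarity(a: list[int], b: list[int]) -> float:
    # Crude proxy for "the patch looks similar": the fraction of nodes whose
    # integer states end up close together.
    return sum(abs(x - y) < 8 for x, y in zip(a, b)) / SIZE

I1, I2, I3 = make_input(10), make_input(20), make_input(30)

# Run 1: present I1, I2, I3 in sequence.
o1 = step(make_matrix(), I1)
o2 = step(o1, I2)
o3_with_i2 = step(o2, I3)

# Run 2: identical start, but I2 is omitted.
o3_without_i2 = step(step(make_matrix(), I1), I3)

print("O3 (I2 presented) vs O1:", similarity(o3_with_i2, o1))
print("O3 (I2 omitted)   vs O1:", similarity(o3_without_i2, o1))
```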
Before we go any further I need to point out that the state of each matrix node is defined through the collection of integers it holds, and those integers get modified according to an algorithm which induces chaotic behaviour, turning each node into a stable, periodic, or strange attractor (for more detail see the IPSI-2005 Venice paper). And yet we have latency, observable on many occasions.
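As a stand-in for that algorithm (the real one is described in the Venice paper; the logistic map below is merely the textbook example of such behaviour), here is a sketch of how a single parameterised update rule can settle a node into each of the three attractor types:

```python
def classify_attractor(r: float, x0: float = 0.4,
                       burn_in: int = 1000, window: int = 64) -> str:
    x = x0
    for _ in range(burn_in):      # let the transients die away
        x = r * x * (1.0 - x)     # logistic map, a placeholder update rule
    orbit = []
    for _ in range(window):       # sample the settled orbit
        x = r * x * (1.0 - x)
        orbit.append(round(x, 6))
    distinct = len(set(orbit))
    if distinct == 1:
        return "stable attractor"
    if distinct <= window // 2:   # the orbit revisits a handful of values
        return "periodic attractor"
    return "strange (chaotic) attractor"

for r in (2.8, 3.2, 3.9):
    print(f"r = {r}: {classify_attractor(r)}")
```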
Since the phenomenon exists in a specific cluster (we can tell because the patch occurs in the same location on the screen) it can be interpreted as a means of packing several layers of information within a particular domain; all that is needed is the right trigger to reveal the respective layer.
Needless to say, subsequent inputs produce different series of outputs depending on whether or not an I2 was present. For latency to manifest, the cluster needs the right trigger, which is another way of saying that every cluster possesses latency which may or may not be triggered.
This is where it gets really interesting. Ordinary inputs cause the system to produce outputs that are a function of its cumulative states as well as of its environment with its own type of inputs; the system's behaviour appears ordinary. Only special inputs evoke its latent states and send the system down an event trajectory that is out of the ordinary.
It seems memory operates on the basis of latent cluster states among the brain's neurons being triggered by the appropriate input (for a discussion of memory see "On the origin of Mind", chapter 15). In a different context, ordinary weather relies on ordinary input to run through its common variety, but extraordinary input (eg, a combination of prolonged heat, moisture and updrafts) leads to cyclones, although the necessary functional ingredients exist throughout the air mass all the time, provided by previous inputs (the causes for the temperature gradients, humidity, dew points, dust particles, etc). In another context still, dinosaurs existed on earth for a very long time, having settled in their environment alongside all the other organisms, including mammals. It took an extraordinary event to trigger the latency in the DNA structure of mammals and let it unfold into the complexity we find today. Note that without their respective latencies none of those extraordinary trajectories could have eventuated (ie, no memory recall, no cyclones, no humans).
Back to cognitive dynamics. An abstraction is a form of interpretation in which the principal aspects of an event and/or an object (generally termed a 'system') are highlighted. For example, I can describe a pump in terms of pipes, valves, cylinder and piston, but I can abstract all this into a system consisting of a space that acts as a receiver under one configuration and turns into a supplier when the configuration changes. Defined in this manner I can use whatever comes in handy and, provided the functionality of the abstraction is adhered to, the resultant system will be a pump.
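A minimal sketch of that pump abstraction, with invented class names: any mechanism that honours the receiver/supplier functionality counts as a pump, whatever its parts.

```python
from abc import ABC, abstractmethod

class Pump(ABC):
    """The abstraction: a space configurable to receive or to supply."""

    @abstractmethod
    def configure_as_receiver(self) -> None: ...

    @abstractmethod
    def configure_as_supplier(self) -> None: ...

class PistonPump(Pump):
    """Pipes, valves, cylinder and piston: one possible realisation."""

    def configure_as_receiver(self) -> None:
        print("intake valve open, piston withdrawing: the space receives")

    def configure_as_supplier(self) -> None:
        print("intake valve shut, piston advancing: the space supplies")

class PeristalticPump(Pump):
    """A quite different mechanism satisfying the same abstraction."""

    def configure_as_receiver(self) -> None:
        print("roller releases the tube: the tube section receives")

    def configure_as_supplier(self) -> None:
        print("roller squeezes the tube: the tube section supplies")

for pump in (PistonPump(), PeristalticPump()):
    pump.configure_as_receiver()
    pump.configure_as_supplier()
```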
Cognitively speaking, an abstraction is represented by the intersection of various sets of relevant neuronal clusters; the output of that intersection is the abstraction.
We can go one step further and consider the intersection of several intersections, leading to the next higher abstraction level; and so on.
For example, 'art nouveau tables' are level 1 abstractions of all the particular tables made according to that style (these particular tables could be termed level 0 abstractions); 'table' is a level 2 abstraction of 'art nouveau tables'; a 'flat surface held up by some support' is a level 3 abstraction of 'table'; 'mesa' being an example derived from a level 3 abstraction applied to terrain.
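As a toy rendering of these levels, assuming clusters can be caricatured as plain feature sets (all the feature names below are invented for illustration):

```python
# Particular tables (level 0 abstractions), caricatured as feature sets.
art_nouveau_table_1 = {"flat surface", "support", "furniture",
                       "wooden", "curved organic lines"}
art_nouveau_table_2 = {"flat surface", "support", "furniture",
                       "metal", "curved organic lines"}
plain_table = {"flat surface", "support", "furniture",
               "wooden", "rectangular"}
mesa = {"flat surface", "support", "rock", "terrain"}

# Level 1: what all art nouveau tables share.
art_nouveau_tables = art_nouveau_table_1 & art_nouveau_table_2

# Level 2: intersect further to arrive at 'table'.
table = art_nouveau_tables & plain_table

# Level 3: intersect with a non-furniture instance; what is left is merely
# 'a flat surface held up by some support', which also covers a mesa.
flat_supported_surface = table & mesa

print(art_nouveau_tables)      # still includes 'curved organic lines'
print(table)                   # flat surface, support, furniture
print(flat_supported_surface)  # flat surface, support
```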
A series of outputs derived from their inputs represents the results of affinity relationships between the participating clusters, modified along their time lines by the incoming inputs (and the affinity relationships exist within the context of chaotic systems). Affinity relationships are just that - relationships based on affinity. Therefore they occur among clusters representative of level 0 abstractions, or among clusters of a higher level. Since the creation of affinity relationships as such depends on the ability of neurons to interact with each other and nothing more, it is - technically - possible to combine different abstraction levels with each other, but the result will be found wanting (imagine applying the criterion 'art nouveau' to a mesa).
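A sketch of that point, under the same toy assumptions as before: an affinity score can be computed for any pair of clusters, because the mechanics know nothing about abstraction levels; the mismatch only shows up when the levels are inspected from outside.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Cluster:
    label: str
    level: int                 # abstraction level; 0 = particular instances
    features: frozenset

def affinity(a: Cluster, b: Cluster) -> float:
    # Affinity as plain feature overlap: computable for any pair of clusters.
    return len(a.features & b.features) / len(a.features | b.features)

art_nouveau = Cluster("art nouveau tables", 1,
                      frozenset({"flat surface", "support", "furniture",
                                 "curved organic lines"}))
mesa = Cluster("mesa", 3,
               frozenset({"flat surface", "support", "rock", "terrain"}))

score = affinity(art_nouveau, mesa)   # a number comes out regardless
if art_nouveau.level != mesa.level:
    print(f"affinity {score:.2f} across levels "
          f"{art_nouveau.level} and {mesa.level}: "
          "technically valid, semantically wanting")
```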
What about latency? For a latent state to be manifest it needs a trigger, but is it feasible to have a trigger coming from a level 1 abstraction applied to a cluster representative of its level 0 counterpart? Technically yes, but what are the consequences, not only in terms of the system as such but also in terms of the meaningfulness of the subsequent outputs as interpreted by us humans?
Remember that for latency to become manifest it needs extraordinary input. However, extraordinary input is not compatible with the ordinary inputs underpinning the entire range of clusters representative of level 0 -> n abstractions. Hence the probability of affinities between the result of an instantiated latency and the results of ordinary states is likely diminished, and it would be diminished further still by every additional manifested latency. In other words, the meaningfulness of cognitive output would be sharply reduced. Yet there would come a point beyond which the accumulation of realised latencies reaches such a degree that meaningful affinities between them become possible once again. Within that particular context a quite different way of thinking has been achieved through the availability of extraordinary inputs, which would become less extraordinary once they have had the opportunity to establish a pattern.
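Purely as a toy illustration of this conjecture (the functional forms below are assumptions, not measurements from OtoomCM): affinity with the ordinary repertoire decays with each realised latency, while affinities among the realised latencies themselves only become available once enough of them have accumulated.

```python
def affinity_with_ordinary(k: int) -> float:
    # Assumed decay: each realised latency erodes the fit with the
    # ordinary repertoire of cluster states.
    return 0.9 ** k

def affinity_among_latencies(k: int) -> float:
    # Assumed recovery: realised latencies need partners, so mutual
    # affinities only grow once several of them have accumulated.
    return 1.0 - 0.9 ** max(k - 1, 0)

for k in (0, 1, 2, 5, 10, 20):
    print(f"{k:2d} realised latencies: "
          f"ordinary {affinity_with_ordinary(k):.2f}, "
          f"mutual {affinity_among_latencies(k):.2f}")
```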
At the moment the OtoomCM program cannot be run on a platform that allows for a sufficiently large-scale configuration in order to test such a scenario.
Nevertheless, this kind of investigation would reveal what it takes to literally change a person's mind - or a society's, for that matter. Not having to rely on ad-hoc events (our fate so far) would have dramatic consequences: unforeseen events and their effects could be mitigated, even reversed.
Here is an example from biology. Until the discovery of black swans in Western Australia, Europeans firmly believed swans were always white. Imagine being able to analyse the clusters of swan DNA relating to feathers and their pigmentation. It would become obvious whether other colours were possible, given the availability of the necessary triggers.
Despite our knowledge of chemistry we still rely on nature to show us some particularly exotic protein formations and their qualities. Knowing how to interpret the latency along the chain of biochemical protein complexity, we would be able to create those substances ourselves - provided of course they are possible; and if they are not, we would know that too.
The phenomenon of latency in complex, chaotic systems promises insights far beyond any that an investigation relying on content-related instances of reality could furnish. It only needs an appropriately scaled simulation.
