Sunday, 28 October 2007

Teaching blindness

Is it possible to teach blindness? You bet.

Case 1. The other day I bought a piece of software which came as a number of files, several hundred megabytes in total. There were problems with the download, so after the software itself had already been paid for I asked to pay the extra charge for a CD burn and shipping. That's when the trouble started. Their website didn't allow the burn to be ordered for that version, and naturally I didn't want to order - and pay for - the whole thing again. I am still trying to explain the situation in the hope of an answer that makes sense in terms of what their system allows a customer to do.

Case 2. While working on the OtoomCM computer program I needed to save certain screen areas for later reference as the program was running. The bitmap file format suited fine, but for several reasons I had to write, and slightly modify, the bitmap-producing function myself. The way Microsoft designed this function is a case study in obscurantism, so I hunted around the internet for some hint on how the reading of pixels is actually accomplished. There were dozens upon dozens of web pages offering advice about the use of the MS function per se - essentially useless, because the parameters make that rather obvious anyway - yet not a single one explained the weird system Windows uses. It took many frustrating hours to figure out how they did it.
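For anyone facing the same hurdle, the quirks that caused most of the grief can be shown in a few lines. The following is only a minimal sketch, not the actual OtoomCM function; it assumes the captured pixels are already in memory as top-down RGB triplets. The points worth knowing are that the BMP format stores rows bottom-up, pads every row to a multiple of four bytes, and orders each pixel as blue-green-red.

    // Minimal sketch: write a 24-bit BMP from a top-down RGB buffer.
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    static void put16(std::FILE* f, uint16_t v) { std::fputc(v & 0xFF, f); std::fputc(v >> 8, f); }
    static void put32(std::FILE* f, uint32_t v) { for (int i = 0; i < 4; ++i) std::fputc((v >> (8 * i)) & 0xFF, f); }

    // pixels: width*height RGB triplets, row 0 = top of the image
    void writeBmp24(const char* path, int width, int height, const std::vector<uint8_t>& pixels)
    {
        const uint32_t rowBytes = width * 3;
        const uint32_t padded   = (rowBytes + 3) & ~3u;   // each row padded to a 4-byte boundary
        const uint32_t dataSize = padded * height;
        std::FILE* f = std::fopen(path, "wb");
        if (!f) return;
        // BITMAPFILEHEADER (14 bytes)
        put16(f, 0x4D42);                 // 'BM'
        put32(f, 14 + 40 + dataSize);     // total file size
        put32(f, 0);                      // two reserved words
        put32(f, 14 + 40);                // offset to the pixel data
        // BITMAPINFOHEADER (40 bytes)
        put32(f, 40); put32(f, width); put32(f, height);
        put16(f, 1); put16(f, 24);        // one plane, 24 bits per pixel
        put32(f, 0); put32(f, dataSize);  // no compression
        put32(f, 2835); put32(f, 2835);   // roughly 72 dpi
        put32(f, 0); put32(f, 0);
        // Pixel data: bottom row first, each pixel written as B, G, R
        for (int y = height - 1; y >= 0; --y) {
            for (int x = 0; x < width; ++x) {
                const uint8_t* p = &pixels[(y * width + x) * 3];
                std::fputc(p[2], f); std::fputc(p[1], f); std::fputc(p[0], f);
            }
            for (uint32_t i = rowBytes; i < padded; ++i) std::fputc(0, f);   // row padding
        }
        std::fclose(f);
    }

Nothing more than that, yet none of the pages I found spelled out the bottom-up order or the padding.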

Case 3. In various discussion groups I tried to explain my use of the word 'functionality', a grasp of which is essential for understanding the Otoom model itself. Although I abided strictly by the definitions offered by Webster's, the Oxford English Dictionary, and the Macquarie Dictionary, I couldn't get through to certain people because they stuck to the more specific meaning developed later by researchers in artificial intelligence and related fields. Using the general meaning common in the English language just didn't work with them.

Case 4. In an article submitted to a journal I was criticised for not dealing with a number of hypotheses that seek to explain in various ways how the mind works. None of them had proven useful, and therefore wasting precious space on things that do not apply seemed futile. Still, those conceptualisations were so ingrained in the minds of the reviewers that stepping outside their bounds was just too much to ask.

What do those cases have in common?

There is a mindset trained to process a given template and nothing more. Such a mind has never been given the opportunity to deconstruct a scenario in terms of its inherent elements, so that a reassembled version could be applied to the purpose at hand. If the template happens to be appropriate the information is processed, but anything even slightly outside that norm constitutes a challenge. Since the challenge cannot be taken up, the situation goes unanswered and a constructive outcome proves elusive.

For many decades pupils and students have been faced with a fundamentally different teaching method, one that is "outcome based" and "holistic" rather than concentrating on schematically organising one's thoughts. Information is presented in chunks, and those chunks are interpreted and reinterpreted without giving the learning mind the chance to understand the underlying bits and pieces.

The result is the mind's inability to reorganise an overall concept to suit the moment. Hence university lecturers have to teach new students basic maths, while education departments focus first and foremost on sociology and social justice rather than on literacy and numeracy, turning schools into "quasi-sociology departments". Of course sociology and social justice are important, but how can you properly assess an event if the capacity to critically evaluate its components is missing?

In programming, the current development platforms enable the quick and easy assembly of functions - just drag the icon into your form and it's done. Nothing wrong with that, except there is now a whole generation of programmers who simply don't know what stands behind those functions and, what's more, don't even see the need to understand their finer points.

Academic journals have settled into well-trodden paths of endlessly repetitive concepts, with no freedom to step beyond the rut no matter how promising such an escape could be.

And simple questions about payment methods turn into a frustrating cycle of emails that in no time escalate into ridiculous complications.

A corollary to the above would be the inability of people to appreciate the detailed mapping out of results as confirmation of one's message. How can they, if the functional detail of a concept has never been the subject of their mental processes to begin with? I strongly suspect this flaw played a considerable role in the evaluation of my thesis at Griffith University.

No wonder that in the UK synthetic phonics is being reintroduced into the classroom, and that in Australia the government has recognised the serious problem of many graduates being incompetent in basic science, maths or history.

Under Otoom the inherent pattern of a given cognitive process - whatever its representative nature - gives rise to further complexes. If those patterns are insufficient or too coarse, the applicability of the resultant complexes suffers too.

It is high time the post-modernist and feminist habits of mindless chunking are given the flick.

Sunday, 21 October 2007

Globalisation and the disappearance of skills

The migration of industries and their associated expertise from the developed world to newcomers such as China and India is not a new phenomenon and has been amply commented upon by now.

Related issues - flawed goods that had to be recalled at great cost (think of Mattel and their toys), and price dumping due to the sheer opportunity on offer - have also made themselves felt. Only recently the European Union saw fit to take measures against China's state-sponsored undercutting of steel prices. China's steel output is massive; the product is often of lesser quality and is sold below production cost, with significant consequences in the EU.

From the economic perspective there are two extreme views: one advocates completely open trade with no restrictions across borders; at the other end stand import restrictions, which in effect constitute trade barriers and lead back to the isolationism of old.

I leave economics to the economists, but it is an interesting exercise to consider the situation in terms of Otoom.

There we have systems within systems, defined by their respective complexity, connectivity, and number of functional elements. These can be circumscribed as functionalities with respect to the exhibited dynamics.

We can also identify conceptual intersections; that is to say, properties of the participating elements which, set against their respective counterparts in a neighbouring domain, allow the two degrees of complexity to be compared. If information travels from a domain of high complexity to one of a lower kind, many associations get pared away and the result arriving in the target area is poorer. Should the movement occur in the other direction, further associations (now made possible by the target's higher complexity) are added to the data but bear no direct relationship to the contingencies at the source. For example, a project designed by first-world scientists runs aground in a third-world region because it cannot be properly instantiated there. Or an utterance by a child is turned into a sophisticated concept by adults.
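To make the mechanism concrete, here is a deliberately simple toy sketch of my own - the domain names, the 'supported' sets and the transfer function are all invented for the illustration and are not part of the Otoom model's machinery. A datum is represented as a bare set of associations: crossing into a lower-complexity domain pares away whatever the target cannot support, crossing into a higher-complexity domain adds target-native associations unrelated to the source.

    #include <algorithm>
    #include <iostream>
    #include <iterator>
    #include <set>
    #include <string>

    using Associations = std::set<std::string>;

    struct Domain {
        std::string  name;
        Associations supported;    // what this domain can represent
        Associations nativeExtras; // elaborations it attaches on arrival
    };

    Associations transfer(const Associations& datum, const Domain& target)
    {
        Associations result;
        // keep only what the target can support (paring away)
        std::set_intersection(datum.begin(), datum.end(),
                              target.supported.begin(), target.supported.end(),
                              std::inserter(result, result.begin()));
        // add the target's own elaborations, unrelated to the source
        result.insert(target.nativeExtras.begin(), target.nativeExtras.end());
        return result;
    }

    int main()
    {
        Domain firstWorld{ "high-complexity",
                           { "pump", "irrigation schedule", "maintenance contract", "spare parts" },
                           { "reporting framework", "audit trail" } };
        Domain thirdWorldSite{ "low-complexity", { "pump" }, {} };

        // high -> low: the project arrives stripped of most of its associations
        Associations onSite = transfer(firstWorld.supported, thirdWorldSite);

        // low -> high: a simple datum acquires associations foreign to its source
        Associations received = transfer({ "pump" }, firstWorld);

        std::cout << "arrived on site : " << onSite.size()   << " association(s)\n"
                  << "received uphill : " << received.size() << " association(s)\n";
        return 0;
    }

Crude as it is, the toy reproduces both directions: the rich project lands with a single usable association, while the single datum returns laden with attachments its source never contained.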

In functional terms 'system' can be transposed into 'economy', 'data' become 'goods', and 'functional elements' are now 'human activities' in a society. Nevertheless, in terms of the underlying meanings the relationships still hold.

Another feature of such systems is their interdependency. Although a subsystem can be identified as a separate entity, it cannot exist in isolation from the rest of its domain. A steel mill for example may be unique, but without adequate transport, energy supply, and a suitably trained work force it won't function. Transport in turn cannot exist without roads and rail, and needs fuel or electricity, another work force, and so on.

The overall quality of a subsystem (defined under the terms mentioned above - complexity, connectivity, etc) therefore relies on comparable characteristics of its neighbours. A breakdown anywhere among them transmits its effects across the network.

Everything in life has a cost, and the maintenance of subsystems is no exception. Hence the training of a work force, the upholding of industrial standards, health and welfare, the availability of education and the quality of life in general all come at a cost met by the system overall. Goods from developing countries are cheap because in those economies the costs expended upon the population are lower too. Lower costs mean less variance and lower overall quality, and translate into an output that now competes with similar products in the importing First World. So much so that entire industries have disappeared in the West, and with them the related skills and self-sufficiency.

To let go of the manufacture of garments for instance may not have a catastrophic effect in a place like the EU or Australia, but it does mean that the stakes have now been raised in the context of international competitiveness. If the entire system (ie, Australia) is capable of up-skilling its work force such that more sophisticated output takes the place of garments, all is well and good. In terms of interdependent systems - at any scale - two potential problems arise at that point.

One, should the other domain (China, for example) experience difficulties, the availability of its products is affected, which in turn influences everyone who has come to rely on its supply. And two, should Australia's work force contain sections that cannot readily be trained upwards, we witness the emergence of niches living outside the required standard. Instituting assistance programs for those demographics may or may not work.

From a systems point of view, and considering both the costs a system carries in order to maintain its standards and the interdependency described above, the solution would be to peg tariffs on imports to the identifiable difference between those respective standards.

In other words, if the overall cost of a given product in a high-complexity region is x, and its counterpart in a low-complexity region is x minus a, where a represents the difference between the costs borne by the higher and the lower region, then the tariff at the border of the high-complexity region will be a, applied pro rata. It is not an arbitrary tariff, but a value arrived at by setting the respective societal costs side by side.
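As a hypothetical illustration - all figures are invented purely for the sake of the arithmetic - the calculation is simple:

    #include <iostream>

    int main()
    {
        const double costHighComplexity = 100.0;  // x: full societal cost of the product at home
        const double costLowComplexity  = 70.0;   // x - a: cost in the exporting, lower-cost region
        const double a = costHighComplexity - costLowComplexity;  // the difference in carried costs
        const double tariffRate = a / costLowComplexity;          // applied pro rata to the import price

        std::cout << "difference a      : " << a << '\n'
                  << "pro-rata tariff   : " << tariffRate * 100 << "% of the import price\n"
                  << "price after tariff: " << costLowComplexity * (1.0 + tariffRate) << '\n';
        return 0;
    }

With these figures the imported product ends up costing exactly what the high-complexity region would itself have to expend - no more, no less.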

As far as systems are concerned, such values are not artificial because they reflect the very real difference between standards, and they would not be destructive to either side because they are based on existing dynamics - the very dynamics according to which, after all, either system functions.

As such, pro rata tariffs are nothing new. Whether applied to the specific usage of shipping tugs, for example, or to the supply of information by State Public Sector Agencies, a basic cost is adjusted according to the conditions at the time. The above merely extends the concept to the wider system.

Given the sheer magnitude of emerging economies and the costly challenges faced by everyone in today's world (just consider climate change and political altercations), the time may not be far off when these considerations are no longer idle musings but will have become a necessity.