
2.1. Coming to base
2.2. Computing, Physics, and the Universe
2.3. The Leibniz System and Hypermedia Technology
2.4. Hypermedia as extension of the human thinking apparatus
2.5. A structural note: Outlines and the importance of folding
2.6. Problems of contemporary mouse-icon technology
2.7. The Software industry and the Law
2.8. The fundamental necessities of a standard ASCII programmable shell
2.9. The scourge of binary only configuration and control files
2.10. Technical and Organizational Infrastructure, Financing

2. Engineering, Technology and Informatics

Technically and pragmatically, this part is intended to provide an infrastructure for the creation of Hypertext- and Multimedia-based Symbolization Systems, or, to use the name I coined: the Symbolator. This is based on a further development of contemporary computer-based hypermedia systems. We may also call it the project of a technological ars memoriae, a research and development project proposal covering the next 20 years. This study is about creating an infrastructure. It is about the ars inveniendi, and not so much about one specific invention.

This part is also intended to serve as the basis for a PhD thesis in computer science or software engineering.

2.1. Coming to base

Compared with the lofty and soaring bird's eye views of the philosophical section, the mundane approach to pragmatic matters seems a bit pedestrian. Be that as it may: without the praxis, we could not feed the theoria. The ancient society which invented the theoria did not have to bother about this: the Greek society of Plato was a society of slave-holders. Only because a vast majority of rightless humans were considered chattel could a small minority of the leisure class afford to think and philosophize to their hearts' content. The immediate problem of this approach was that even though steam power was already used by Heron of Alexandria, it was of no economic concern: slave power was so ubiquitous and cheap that no one would have thought of applying steam power to do actual work [33]. So his machinery was used to drive ingenious contraptions that performed temple miracles for the wonderment of the people. The situation stayed like this until the end of the Roman Empire. (See also NEIRYNCK-ING, 138-158)
2.1.1. Forging the European technological infrastructure
Practical application of techne combined with theoretical physical models and mathematics had to wait about 2000 more years, until a whole different thought structure had arisen in Europe: by the long detour of Arab reception and refinement, a complete re-working of the socio-economic infrastructure, the introduction of a new numbering system, and finally the mass adoption of printing technology that put knowledge into the hands of the people. European technical infrastructure had slowly turned to mechanization, the seeds having been sown with Benedict of Nursia's rule of "ora et labora", requiring that those whose business was higher-order contemplation were not exempt from manual labor. This meant that at least some of the brain power of those who had learned the scriptures was applied to practical matters, and not only to idle speculation, as had unfortunately been the case in antiquity. The Cistercian order provided further infrastructure with the spreading of wind- and water-power machinery, transforming the whole European continent to the use of simple but ingenious devices until they became so ubiquitous that there were people everywhere in Europe who understood them and could construct and maintain them. This factor of technological infrastructure tends to be completely forgotten in accounts of European technical history, because it is a slow, unspectacular, bits-and-pieces, hammer-and-nail process that took many centuries, practically all through what we now call the dark Middle Ages. They were not so dark after all. That the scholastics produced not very much of technical utility during this time doesn't mean that the common man out there was idle. Far from it. Later, clocks were the base for the most intricate mechanisms, which found their climax in the calculating machines of Pascal and Leibniz. The clock was the model of rationalistic philosophy, and as such this philosophy owes its origin to mechanics (MUMFORD). The clock was also the crucial instrument for the European mastery of the seas, ships now being able to establish their position with an unprecedented measure of accuracy, independent of the weather. This long process of about 1000 years raised the conceptual level of the whole population of the continent, with many, many smiths, mechanics, and artisans able to perform all those mechanical miracles that were necessary to create a technical civilization. On this base, the steam engine could then be built.

It was the nameless work of many thousands of ingenious tinkerers and technically skilled laborers that produced the ever more refined instrumentation which finally enabled the great scientists of that age to formulate their new observations and theories. Without the art of glass grinding, there would have been no telescope, and so on. And Galileo surely had someone else grind his lenses for him when he constructed his telescope. That process was tedious, laborious, and dangerous. Spinoza was the great exception in the whole line of gentleman engineers and scientists who had their mechanics do the dirty work for them: Spinoza was a glass grinder. That cost him 30 years of his life. He died at the age of 45 of tuberculosis, which was aggravated by his constant inhaling of glass dust (see Appendix I). Science rested firmly on this base of technological expertise of the whole continent. It was technology that allowed science to make its discoveries. Further elaboration of this can be found in TOULMIN-KRIT and NEIRYNCK-ING.

2.1.2. Leonardo's contribution to European technical development
Leonardo was a key player in the development of the European technological infrastructure. His real influence cannot be determined, because it happened largely outside the books. A possible path of research would be to analyze all technical drawings and pictures in technical books (like Agricola's de re metallica [34]) that were produced before and after his time, for specific features that he used which cannot be found earlier but made it into common usage later. His inventions were so far ahead of their time that many found their realization only 200 or 300 years later, when the mechanical arts had progressed and the technical base was ripe. It is likely that many of his technical drawings inspired later inventors and that technicians and artisans used Leonardo's designs. At that time no one bothered to cite where they had their ideas from (and Leonardo bothered the least); therefore, the absence of citations doesn't mean a thing. (See LEON-HEYD, 133-134). In his mechanical drawings, we can see that Leonardo's work was concerned not only with the invention, but much more so with the infrastructure of invention. When he drafted screw mechanisms, he also designed screw-cutting machinery (LEON-HEYD, 160). His whole work was all-encompassing to a degree that was outstanding in his time, and unimaginable for us today.

One thing is certain beyond all this uncertainty: Leonardo's mechanical drawing style made an indelible mark on European technical draftsmanship. Leonardo was able to draw things such that one could intuit (recognize by direct observation) the essential mechanical principles. European artists and technical drafters unquestioningly adopted his style. Only later was a different, Cartesian-coordinate-oriented style adopted by the engineering professions. Today we have the problem that, because this style is so ubiquitous, we cannot appreciate how revolutionary it was in his time.

2.2. Computing, Physics, and the Universe

2.2.1. Fundamental questions of modern computing
@:COMPUT_FUNDAMENT
The Oberschelp question: Are our current numbering systems an optimal representation when we want to compute anything pertaining to organic nature, evolution, and intelligence?

If nothing better is available, then what is there, is the best.
But: If there are better possibilities, then we come to the old adage:
The good is the deadly enemy of the better.
This is because the good knows that it will be entirely destroyed by the better.

What could be an indication that the current numbering systems are not adequate? Let us reformulate this question a little: The question means, restated: Is our universe built up of finite fractional numbers or not? Because if it is not, then we run into a computing problem.

As far as I can see, the universe is not built up of finite fractions. All the interesting physical constants, like Planck's constant h [35], are irrational. A computer is badly equipped to deal with irrational numbers, because of the limited machine word length available to represent them. Irrational numbers are computer-unfriendly numbers. If rounding mechanisms are used, we are bound to run into problems, somewhere, sometime. And we never know when (PEITGEN92). In classical mechanics and macroscopic systems we usually don't need to bother, but it is one of the main problems when we want to simulate things that deal with quantum physics. Because here the difference counts.
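To see the rounding problem concretely, here is a minimal sketch in C. It is an illustration, not a physical simulation, and the figure in the comment assumes ordinary IEEE-754 single precision arithmetic: one-tenth has no exact binary representation, and the tiny per-step error accumulates into a visible deviation.

-----------------------------------------------------------------------------------
/* rounding.c -- minimal sketch: accumulation of rounding error */
#include <stdio.h>

int main(void) {
    float sum = 0.0f;
    for (int i = 0; i < 1000000; i++)
        sum += 0.1f;        /* 0.1 has no exact binary representation */
    /* expected 100000.0; on IEEE-754 single precision this prints
       roughly 100958.3 -- the error crept in, and nothing warned us */
    printf("sum = %f\n", sum);
    return 0;
}
-----------------------------------------------------------------------------------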

To put it in different words: Nature has the perplexing habit of basing her own inner workings on a numbering system that is entirely incompatible with present digital electronics. Remember our Oberschelp question from the introduction.
->: PERPLEXING_HABIT, p. 25

(rest omitted)

2.2.2. Towards a neutral noetic base for intelligent machinery
The problems of the philosophical base of computing.
Misplaced ontological assumption of "the thing out there".
Kant has irrevocably proven that if ever there is a "thing out there", then we will never have any possibility of directly knowing it. So we shouldn't worry about that one any more. Konrad Lorenz is entirely in accord with Kant. Many of the natural scientists seem to have read neither Kant nor Konrad Lorenz, and still seem to think that the kind of representations we have are "it". Lorenz seems to have a problem with Kant's talking about the "thing-in-itself" in the singular. But that is only logical, because it can only be one. It is The One [36]. And there is no other beside it. Its manifestations and representations are manifold, and that is also logical.

The same was stated by Locke, yet he was misunderstood by all his empiricist followers.
The assumption of a physical reality "out there" is entirely justified as a working hypothesis, but is misplaced if viewed as an ontological dogma.
And for applications like machine intelligence, it is a serious obstacle.

Gotthard Günther gave an account of the philosophical domain of intelligent machinery.
Problems of using the Hegelian approach. Deep mistrust of natural sciences against idealism.

No such thing as pure cognition.
Applying the lesson learned in biological vision research.
The case of the kittens who stay blind when restrained from running around.
Why sensory machinery must integrate the action part.
The action part is again a philosophical problem.
William of Ockham.
Not only Ockham's razor. But also the philosophy of action, and will.

2.2.3. Weinberg's Law of the Hammer
Give a child a hammer...
Weinberg's Law of the Hammer: "The child who receives a hammer for Christmas will discover that everything needs pounding" (WEINBERG85, p. 53)
An example why a sensing computer system may need a hammer.
Material testing:
The quality of cast metal pieces can be determined by the sound made when struck.
Instead of feeding a neural network in a material testing machine with samples of sound patterns of "good" and "bad" casts, let the machine itself do the hammering.
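A minimal C sketch of this active-testing idea follows. All the hardware and signal-processing hooks (strike, record_ring, dominant_frequency) are hypothetical stand-ins, not part of any existing library; the point is only the loop structure: the machine generates its own sound samples by doing the hammering itself.

-----------------------------------------------------------------------------------
/* Active material testing, sketched in C. strike(), record_ring()
   and dominant_frequency() are hypothetical hardware/DSP hooks. */
#define RING_LEN 4096

extern void   strike(int sample_id);                        /* hammer actuator  */
extern void   record_ring(int sample_id, double *b, int n); /* microphone       */
extern double dominant_frequency(const double *b, int n);   /* spectral feature */

/* The machine does the hammering itself: strike, listen, classify. */
int cast_is_good(int sample_id, double good_freq, double tolerance) {
    double ring[RING_LEN];
    strike(sample_id);
    record_ring(sample_id, ring, RING_LEN);
    double f = dominant_frequency(ring, RING_LEN);
    return f > good_freq - tolerance && f < good_freq + tolerance;
}
-----------------------------------------------------------------------------------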

2.2.4. Konrad Zuse's Computing Spaces and the representation
@:COMPUTING_SPACE
Here we take up the thread from the section: "The ontological base of Vorstellung: the infrastructure". We now go about a computational representation of the representation. Before we get to the implementation of intelligence proper in this substrate, let us look at the easier case: how can we implement matter? The pioneer of computing, Konrad Zuse, was the first to think about computational representations of the universe (ZUSE69). His work was later duplicated independently by researchers in the field of Cellular Automaton theory, namely Tommaso Toffoli and Edward Fredkin. The problem of all these models is their reliance on the ontological assumption. If we drop that and instead make the assumption of the Schopenhauer representation, then we are in a different situation. Therefore, Konrad Zuse's ideas and the theory of the universe as cellular automaton will probably have a revival. Let's see how this could be effected.
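As an illustration of the cellular-space principle (not of Zuse's actual model), here is a minimal one-dimensional cellular automaton in C: every cell computes its next state purely from its local neighborhood, in synchronous steps. The particular update rule (rule 110) is an arbitrary choice for demonstration.

-----------------------------------------------------------------------------------
/* cellspace.c -- minimal 1-D cellular automaton, illustrative only */
#include <stdio.h>
#include <string.h>

#define CELLS 64

/* One synchronous update step: each cell's next state depends only on
   its local neighborhood, as in a "computing space". */
static void step(const unsigned char *cur, unsigned char *next) {
    for (int i = 0; i < CELLS; i++) {
        int l = cur[(i + CELLS - 1) % CELLS];
        int c = cur[i];
        int r = cur[(i + 1) % CELLS];
        next[i] = (0x6E >> ((l << 2) | (c << 1) | r)) & 1;  /* rule 110 */
    }
}

int main(void) {
    unsigned char a[CELLS] = {0}, b[CELLS];
    a[CELLS / 2] = 1;                     /* single seed cell */
    for (int t = 0; t < 20; t++) {
        for (int i = 0; i < CELLS; i++) putchar(a[i] ? '#' : '.');
        putchar('\n');
        step(a, b);
        memcpy(a, b, CELLS);
    }
    return 0;
}
-----------------------------------------------------------------------------------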

2.2.5. The universe matrix
@:UNIVERSE_MATRIX
We are now talking about the universe matrix, from which all materia is derived. In the ancient terminology, they said it was born out of it (natura, from nasci). But that doesn't need to bother us. Let's just say that it is generated.

We don't need to insist that an atomic piece of matter is something preserving an identity in itself and by itself (or a Kantian thing-in-itself), because the matrix takes care of an identical appearance of the phenomenon while it supports it there. This is still an entirely materialistic theory.

The acid test of any materialistic theory is that the things the matrix produces display exactly the kind of behavior and characteristics that we perceive with the aisthaesis. This means that a stone that is thrown follows a ballistic arc, a thermodynamic machine like a gasoline engine will still drive a car, and so on. As long as the machinery still functions the way we are used to, we don't need to worry. Remember that we humans, our whole nervous system, and everything else, are just as much fabrications of this machinery as all those things we see and experience as "the world out there".

It is a different matter with the physical and mathematical formalism behind the materia. For that I give no guarantee. Probably some of it will change. I don't know how much, and I hope that it will.

How this can be done is illustrated by Lord Kelvin.

2.2.6. Kelvin's toroidal atomics revisited
Kelvin had assumed that atomic matter is similar to smoke rings. Toroidal vortices show many behaviors which serve perfectly well to illustrate many aspects of the behavior of matter. And they don't have to be material. Let us now assume that the whole universe is what Schopenhauer states: it is representation. We add to this that we can treat it as an informational phenomenon. Konrad Zuse would say it is a cellular space. All we have to take care of is that we don't affix any unjustified material or ontological assumption to what we are modelling here. The phenomena modelled of course have all the material attributes, but not the modelling system, the representation.

This is quite hard to think of without the proper instrumentation. All that is needed to get a more elaborate modelling system is a computer simulation that generates the appropriate toroidal models for us.

2.2.7. Toroidal vortices and Logics
The question of the binary logics of such a toroidal universe model is quite easy to solve. It is the rotation of the toroids around the ring that is their center. We know a similar system that served well for our first-generation computer memories: magnetic core rings. Only here in the simulation, the rings are dynamic and moving. But they would serve quite well to display the cellular automaton characteristics that Zuse and the other cellular automaton workers had intended.

2.3. The Leibniz System and Hypermedia Technology

2.3.1. List of abbreviations
ASCII 7-bit code containing the basic American Standard alphanumeric codes. This set can be displayed by the largest number of computers and printers on the globe.
CSCW Computer Supported Cooperative Work
HM Hypermedia
IP Instruction Pointer
MM Multimedia
OEM Original Equipment Manufacturer; here: one who makes local configurations and adaptations of standardized products for special user communities
OO Object Orientation
OOP Object Oriented Programming
SW Software
TLSI Token List Subroutine Interpreter

2.3.2. Abstract
The Leibniz TLSI is a simple and flexible programming interface that can be added as a macro-programmable user shell to existing C compatible applications to provide ASCII readable definition files for user interfaces, menu layouts, key bindings, and help information, as well as site-local and individual configurations. It thus serves as a secondary programming facility or End User Programming Language (EUPL). In courseware applications, it provides programming access for local adaptation by teachers and students. The Inter-Actor technology of the system provides a powerful user-interface metaphor, similar to a mechanical power tool, for the end user to experiment with available functions and build his own applications. The TLSI technology used is similar to the Java Virtual Machine (VM) principle and can be integrated into the Java environment.

2.3.3. Short Version of the Paper
2.3.4. Challenges of Multimedia Software Development
There are several challenges posed by Multimedia (MM) software development projects: 1) The production environment: the need to integrate an inhomogeneous range of skills: programmers, designers, documenters, artists, media technicians, etc. Each of these groups has its specific work and communication procedures. 2) The wide range and demands of application media. 3) The inhomogeneous user community that is to be served with the multimedia applications created. The market is in full flux, and the main driving force is the prospect of opening up ever new customer bases that have not been accessible with the existing technology so far. Systems that can from the beginning be designed to accommodate very different user groups will have a strong margin of profitability. In the education field, a special customer group is that of intermediary courseware developers, who need very flexible construction sets for their work. Pre-fabricated, frozen-structure systems, as they dominate the market today, don't serve the needs of this group. Another potential user group that is presently hard put to use the interfaces of current systems are people with disabilities who don't have the hand-eye motor control needed to operate the mouse-icon interface. This paper describes an approach to systems construction that provides a simple, robust, flexible, and adaptable base, which can be library-linked to existing C compatible systems, and provides a standard common ASCII programmable interface.

2.3.5. The Leibniz TLSI Virtual Machine
The technical principle of the Leibniz TLSI is based on the VM (Virtual Machine) principle, similar to the one used in Java® (and compatible with it). Similar to SUN's system philosophy, it is the ultimately open system: the TLSI was designed, on the base of a minimal kernel system of about 50 K bytes, to be completely open and extendable in all ways. Its power rests on the ability to create special-purpose interpreters for special applications on the fly, in the field, and even by the end user. A similar, but not as general and flexible, approach is the popular EMACS editor, which implements its functionality in a special interpreter. The minimal run-time kernel can be linked on top of any C compatible system. This allows the TLSI to be used as a re-vectorable and re-programmable interactive user interface on top of any existing software (the host system). The minimal kernel alone provides the equivalent of an interactive low-level debug monitor that allows one to test and execute any functionality of the host system that one desires interactive access to. Any routine can at any time be called separately from the interactive user shell. By way of its macro programmability, any higher assembly of a set of basic host functions can be constructed on the fly.

2.3.6. EUPL: User Programming of the Macro System
The TLSI approach offers a very easy way to achieve a secondary programming facility or End User Programming Language (EUPL). The developer of the basic MM functionality (the MM provider) can use standard compiler technology or an authoring system to provide the tool set which the user (or secondary programmer) can then extend in any direction he deems necessary. All secondary programming can be done with the macro language of the TLSI. The TLSI can provide a large functionality to the user without having to include the original authoring system or the compiler package. The user is also relieved of the need to learn the conventions of the authoring system; he needs only to concentrate on the functionality offered by the specific TLSI interface which the MM provider supplies. This approach allows a comfortable division of expertise and responsibilities between the different groups involved in the authoring process of a MM system. The software engineers need only deal with their compiler tools and SW methodology to provide a rich set of TLSI tools for secondary, or applications, developers to build their systems upon.

2.3.7. UIL: User Interface Language, Field Configuration, Integrated Help
The TLSI principle allows the construction of a very simple and effective common interface shell on top of different software systems, thus providing a generalized User Interface Language (UIL) that is adaptable to specific user profiles. It allows the creation of flexible keyboard and menu layouts. All the functionality of the system is configured in ASCII readable files that can be changed and re-configured with any text editor, at any time, even while the system is running, without any re-compile or other low-level system interaction. Ideally the whole menu structure of the system resides in a single file, also giving a transparent access path to the logical structure of the whole system. In the Leibniz Hypertext Software Development System, an integrated hypertext function connects every menu with an associated help text residing in an ASCII readable text file that can be accessed in hypertext manner.

2.3.8. Factors of the Optimal Rapid Prototyping Process
In the Leibniz TLSI environment, special focus was laid on maximum production speed in the rapid prototyping environment. These factors are essential: 1) Archive: access to existing large libraries of functions. This requires optimization and structuring of a hypertext data base. Different access hierarchies must be possible for the different user groups, who have different views of the system. Quick overview functions are available through a combination of hypertext with a special folding method implemented in the system editor. The principle is: overview combined with detail view results in perspective. 2) Constructor: must allow overloading and modification of functions without a recompile of the system. 3) Testing: a flexible and powerful testing environment. It allows generating standard input scripts that can be used on whole classes of functions. The testing scripts can also be derived from the interface declarations in the source code. All testing inputs can be logged, for reproducible error histories and to generate automated test suites for further testing sessions. All inputs that the user would perform interactively can be redirected to script files, so that auto demo functions are readily available.

2.3.9. The Inter-Actor Principle and Power Tool Metaphor
The development of the Leibniz system resulted in the Inter-Actor principle, a powerful unified representation and access metaphor for program functionality that is accessible in all its aspects from the user interface level. The structure of the Inter-Actor is similar to an object class, but it provides many extra user interface handles, like a self-display function and a standardized experimentation field with test data for a new user who wants to try out the functionality of the system. For training purposes, all functionality of the system can be provided with sample data, sample inputs, the display of the internal state of the Inter-Actor before and after each function, and a way to change all the state variables individually. The metaphor of the Power Tool provides a tangible and direct user interface for users who are not software engineers. The TLSI resembles a multi-function power tool, which can be converted in a minute into a drill, a buzz-saw, a jigsaw, a water pump, a sander, a grinder, or a screw-driver. Its add-on functionality rests on a practical functional classification that is hard to describe in a generic structural class inheritance hierarchy of OO software. It is achieved through function frames with re-vectorable executor modules. They can be re-vectored interactively, on the fly, just like in the machine shop. The functionality can be implemented on top of available OO technology, if it is possible to interface with the message structure of the OO processor. How this can be practically added to an existing authoring system is not a technical question, but mostly a problem with the closed-shop proprietary policies of authoring system vendors. SUN's Java approach seems to provide the most painless access path.

2.3.10. Long Version
2.3.11. Introduction
Multimedia (MM) software development poses challenges for production tools and applications development that are in many ways more exacting than conventional software projects:

1) The production environment: here it is the inhomogeneous range of skills that needs to be coordinated in the development of a multimedia system: programmers, designers, documenters, artists, media technicians, etc. Not all of these can be assumed to have a working knowledge of the rules of the trade of software development. Each of these groups has its respective procedures and communication interfaces. A software system that implements a workable CSCW approach for the cooperation of these heterogeneous groups would be a useful, if not essential, precondition for a successful project. At the least, it should be avoided that an authoring system is tailored only to the tastes and preferences of just one of the participating groups, as is the case with current authoring systems.

2) The wide range and demands of application media. The technology is in full flux, new media and new formats are appearing every few months, and it is not possible to establish a fixed set of standards and procedures in an authoring system and expect it to last for more than half a year. The built-in obsolescence of MM technology is a boon for the authoring system manufacturers, who are assured of a constant revenue flow because the systems developers are forced to buy the latest version of the authoring system every year, but it is a catastrophe from the viewpoint of backwards compatibility and re-usability of investments already made in existing technologies.

3) Third, and most difficult, is the particularly inhomogeneous user community that is to be served with the multimedia applications created. As much as the technology, the market is also in full flux, and the main driving force is the prospect of opening up ever new markets and user groups that have not been accessible with the existing technology so far. This means that systems that can from the beginning be designed to accommodate very different user groups will have a strong margin of profitability. This also means that the existing user interface metaphors, like the mouse-/touchscreen and icon interface, prove to be limited, and limiting. There is a need for field-exchangeable interface routines: for user groups that have no table top to position a mouse on, for others who have only a keyboard, and for still others who can't or won't use their hands at all and can do only voice input. Then there is the large, and potentially decisive, group of educational multimedia customers who want to act as MM-OEMs. These are school teachers or students who want to make their own selection of materials to present to their special clientele. They want to buy a CD with a loosely ordered and assembled set of materials, and they need programmable access to the interface to make their customized adaptations of course materials. The business potential of such half-product applications is substantially larger than that of huge, frozen-structure systems that need to be targeted at a very large customer base to be profitable. This requirement is currently being totally neglected by state-of-the-art authoring systems, which encapsulate the presentation structure and user access procedures without any possibility of modification. The last potential user group that is presently hard put to use current interfaces is people with disabilities who don't have the hand-eye motor control needed to operate the mouse-icon interface. (Fig. TLSI-1)

Current MM authoring systems are struggling to meet the various, and sometimes conflicting, demands from these diverse areas, and the technology will need considerably more time to mature and deliver a platform that can accommodate such demands more easily and flexibly than the available systems. This paper describes an approach to systems construction that provides a simple, robust, flexible, and adaptable base, which can be library-linked to existing C compatible systems, and provides a standard common ASCII programmable interface.

2.3.12. The Leibniz TLSI "Token List Subroutine Interpreter"
The value of computer systems will not be determined by how well they can be used in the applications they were designed for, but by how well they can be used in cases that were never thought of.
(Alan Kay, Scientific American/Spektrum, Oct. 1984)

When Alan Kay made this statement in 1984, no one except maybe he himself could have foreseen the rise of MM capable computers ten years later. In effect, the criterion he stated will be the hallmark of successful MM authoring systems of the future. It was also the criterion for the development of the Leibniz TLSI or Token List Subroutine Interpreter project. Its use for MM applications is but one of a range of (unforeseeable) domains that it can be adapted to.

The technical principle of the Leibniz TLSI is based on the VM (Virtual Machine) principle, similar to the one used in Java® (and compatible with it). Similar to SUN's system philosophy, it is the ultimately open system: the TLSI was designed, on the base of a minimal kernel system of about 50 K bytes, to be completely open and extendable in all ways. Its power rests on the ability to create special-purpose interpreters for special applications on the fly, in the field, and even by the end user. Similar, but not as general and flexible, approaches exist in the popular EMACS editor, which implements all its functionality in a special LISP interpreter; in the AutoCad system, which also uses a LISP interface; and in Microsoft's Word Basic, which allows the user to field-rig the editor to his tastes, if and only if he ever finds out how to operate the macro programming facility. (Fig. TLSI-2, 3)

Hypertext support with instant access: all the system's interface bindings, names, keyboard configurations, help texts, and (if supplied) source codes are accessible as plain ASCII files through an integrated hypertext system that allows any required information to be fetched within at most 100 msec for a one-level lookup, leading to maximum search times of about 1 to 10 sec even for deeply structured hypertext searches.

Optimization for speed: the optimization principle of the Leibniz TLSI is speed, speed, and again speed. The system's answering reaction is totally transparent to the user. For this purpose, a fully transparent programmable keyboard access to all functions is essential, because mouse-/icon interfaces, though convenient for beginners and casual users, are about one order of magnitude slower. For any repeated access to any set of consecutive actions, the extensive macro programming facility (see below) allows the user to field-rig his own preferred access method.

Implementation of the minimal kernel system: the TLSI is implemented in C. The minimal run-time kernel requires about 30 to 50 K bytes. It can be linked on top of any C compatible system; this makes it feasible to use the TLSI as a re-vectorable and re-programmable interactive user interface on top of any existing software (the host system). As a minimal kernel, it provides the equivalent of an interactive low-level debug monitor that allows one to test and execute any functionality of the host system that one desires interactive access to. Any routine can at any time be called separately from the interactive user shell. By way of its macro programmability, any higher assembly of a set of basic host functions can be constructed on the fly. A program for the TLSI consists of a list of code tokens to be executed. Each code token corresponds to an entry in the code data structure. The token is accessible via a unique name in the system dictionary. This corresponds to the conventional linker symbol table, but is interactively accessible to the user. The name table can be exchanged by the user for any name pattern the user requires. With the necessary pointers, each token code table entry needs on average 50 bytes, depending on the name length. A token can point either to directly executable binary code of the host system, or to another list of token codes that are executed recursively as subroutines.
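To make the principle concrete, here is a minimal sketch of such a token list subroutine interpreter in C. The structure layout and names are illustrative assumptions, not the actual Leibniz data structures: a token either points to directly executable host code, or to a token list that is run recursively as a subroutine.

-----------------------------------------------------------------------------------
/* tlsi.c -- minimal sketch of a token list subroutine interpreter */
#include <stdio.h>
#include <string.h>

typedef struct Token Token;
struct Token {
    const char   *name;        /* unique name in the system dictionary  */
    void        (*prim)(void); /* directly executable host routine, or  */
    const Token **sub;         /* a token list, executed recursively    */
};

/* The inner interpreter: call host binary code directly, or run
   another token list as a subroutine. */
static void execute(const Token *t) {
    if (t->prim)
        t->prim();
    else
        for (const Token **ip = t->sub; *ip; ip++)
            execute(*ip);
}

/* Dictionary lookup by name: the interactive access path that lets
   any routine be called separately from the user shell. */
static const Token *find(const Token **dict, const char *name) {
    for (const Token **d = dict; *d; d++)
        if (strcmp((*d)->name, name) == 0) return *d;
    return NULL;
}

/* --- a tiny demonstration dictionary --------------------------------- */
static void hello(void) { printf("hello "); }
static void world(void) { printf("world\n"); }

static const Token  t_hello = { "hello", hello, NULL };
static const Token  t_world = { "world", world, NULL };
static const Token *greet_list[] = { &t_hello, &t_world, NULL };
static const Token  t_greet = { "greet", NULL, greet_list };  /* a macro */

int main(void) {
    const Token *dict[] = { &t_hello, &t_world, &t_greet, NULL };
    const Token *t = find(dict, "greet");   /* interactive call by name */
    if (t) execute(t);
    return 0;
}
-----------------------------------------------------------------------------------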

Macro programming facility: New subroutine lists can be constructed as macros at run time and entered into the system as user programs. These programs can be saved across sessions in a user dictionary image and re-loaded again. Since each token code entry in the subroutine table requires 4 bytes, large programs that consist mostly of recursive token calls are much shorter than C programs whose function calls are usually in-line expanded for performance reasons. Each new program is identifiable by its unique name and is entered into the system dictionary together with all the existing programs. These macro routines can be removed at any time as well. Thus an application can grow and shrink during a user session or as an applications developer experiments with the available possibilities.

Conversion to compilable C code and local optimizations: since all TLSI subroutines eventually resolve into C function calls, a simple filter can convert high-level symbolic TLSI code into a compilable sequence of C function calls that can be optimized with standard compiler technology. Therefore, optimizations can be fine-tuned either toward run-time performance or toward address-space conservation. A complete conversion into C code will have the fastest run-time performance and will require the most address space. Since a large code image that has been compiled for maximum speed will slow down to a snail's pace when the machine needs to swap a lot, it is very useful to provide different kinds of optimization images for different sizes of available target system memory. There still are 4 MB RAM computers out there.
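As an illustration of what the filter's output could look like (the function names here are hypothetical, not the actual Leibniz bindings): a macro that was defined interactively as a token list flattens into a plain sequence of C calls, which the C compiler can then inline and optimize.

-----------------------------------------------------------------------------------
/* Hypothetical filter output for a TLSI macro defined as:
       :DEF report { $.user_input $.dup_top $.put }
   Each token resolves into its underlying C function call. */
extern void str_user_input(void);   /* $.user_input */
extern void str_dup_top(void);      /* $.dup_top    */
extern void str_put(void);          /* $.put        */

void report(void) {
    str_user_input();
    str_dup_top();
    str_put();
}
-----------------------------------------------------------------------------------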

2.3.13. EUPL: End user programming facility through the macro system
A pressing need for MM systems development is the facility of secondary programming (or the End User Programming Language, EUPL). This is required for all but the most simple-minded consumer applications (which are at best a very temporary market, to disappear when the customers get more sophisticated). In the case of the MM-OEMs mentioned above, the practicability and comfort of this facility will prove crucial for wide market penetration of systems. The TLSI approach offers a very easy way to achieve this secondary programming facility. The developer of the basic MM functionality (the MM provider) can use standard compiler technology or an authoring system to provide the tool set which the user (or secondary programmer) can then extend in any direction he deems necessary. All secondary programming would be done with the TLSI. Because of its simplicity and extendability, the TLSI can provide a large functionality to the user without having to include the original authoring system or the compiler package. This also relieves the user of the need to learn the conventions of the authoring system; he needs only to concentrate on the functionality offered by the specific TLSI interface which the MM provider supplies. This approach can also be chosen for in-house MM development by the MM provider. It allows a very comfortable division of expertise and responsibilities between the different groups involved in the authoring process of a MM system. The software engineers need only deal with their compiler tools and SW methodology to provide a rich set of TLSI tools for secondary, or applications, developers to build their prototypes upon. These can then be converted with the filter tools into "hard" software structures, or delivered to end users as TLSI structures, to be modified by them or not, just as the MM provider decides is within the scope of his marketing policies. It is just as easily possible to implement tertiary programming structures, which can then be even more simplified and standardized to provide a very specific programmability to end users. Because of the unlimited extendability of the TLSI, this can have any shape and form wished for.

2.3.14. UIL: User Interface Language, field configuration, specialized interpreters
The TLSI principle allows the construction of a very simple and effective common interface shell on top of different software systems, thus providing a generalized User Interface Language (UIL) that is adaptable to specific user profiles.

One of the main user interface problems of contemporary software packages in a windowing system like MS-Windows is the chaotic menu and hotkey layout of the diverse systems that are offered for this platform. The situation is worse for Windows than for the Apple System, where a certain level of standardization has been provided, and it is worse for national language adaptations than for the original US layout. Each package developer seems to express his original creativity by inventing his own special hotkey binding layout from scratch. Some manufacturers insist on using the special symbols from the national character set for their hotkeys, so that anyone not using this keyboard will be out of luck and unable to use that nationalized product. The solution taken by many recent MM systems, to provide no hotkeys at all, is also no solution at all: any repeated use of such a system, for example when the user wants to traverse just a few specific access paths that are often repeated, leads invariably to the "Pinball Wizard Syndrome", the unhappy user having to wade through an orgy of points and clicks before he finally gets to the place where he wants to do some real work. Instead of trying to create a uniform hotkey and menu pattern for all packages, which is hard to design and impossible to enforce, the much easier solution is to allow the end user (or, in larger organizations, the in-house systems configuration manager) to create exactly the kind of keyboard and menu layout that is considered most appropriate at the location.

This can be easily accomplished with the common interface level of the TLSI, where all the functionality of the system that is accessible to the user is configured in ASCII readable files that can be changed with any text editor and re-configured at any time, even while the system is running, without any re-compile or other low-level system interaction. Ideally the whole menu structure of the system would reside in a single file, also giving a transparent access path to the logical structure of the whole system, something that the handbook writers seem to consider a totally unnecessary luxury. The TLSI enables one to build very simple specialized interpreters that can build keyboard binding schemes or menu layouts. Below is an example of how a menu configuration file may look. The exact syntax is open to any design principle one may want to implement. The sample given is from a menu configuration file of the string handling processor of the Leibniz Hypertext Software Development System. (Fig. TLSI-4)
2.3.15. A sample TLSI menu configuration file
-----------------------------------------------------------------------------------
\\ start of sample

:GRP VX-IO/MOVE \\ i/o and move

LD-VX[ VD-A1
$.user_input 0 I 0 "user input string "
$.get 5 G 0 "get string with len from adr "
$.get_cnt 12 A 0 "get counted string from adr"
$.put 6 P 0 "put string to adr"
$.put_cnt 6 U 0 "put counted string to adr"
$.dup_top 0 D 0 "duplicate top string"
$.mov_2nd 0 M 0 "xchg top 2 strings"
$.copy_2nd 0 C 0 "copy 2nd string to top"
$.copy_# 0 O 0 "copy nth string to top of buffer"
$.mov_# 4 T 0 "move nth string to top of buffer"
$.mov_bot 0 B 0 "move bottom string to top"
$.push_#x 16 R 0 "move/del to nth_pos top string"
$.del_top 0 X 0 "delete top string"
$.del_#strn 9 N 0 "delete n strings"
$.len 0 L 0 "length of top string "
END;

(| AG 19:29 08/11/92 )

\\ end of sample
-----------------------------------------------------------------------------------
Explanation:

:GRP VX-IO/MOVE \\ i/o and move
The :GRP indicates a new program module or group; the text after \\ is a comment.

LD-VX[ VD-A1
This calls the special menu compiler for the following menu, VD-A1.

$.user_input 0 I 0 "user input string "
The fields of a menu line, from left to right: the called method or function (this one lets the user input a string); an optional input parameter string (here as MSG#); the Menu Hotkey Character (ALT-I); and, in quotes, the menu entry as shown in the pulldown menu.

(| AG 19:29 08/11/92 )
Programmer initials with time and date stamp; this also indicates the end of the function block.

2.3.16. Hypertext integrated help information system
Using the TLSI principle, a complete production system has been created: the Leibniz Hypertext Software Development System. Through the integrated hypertext function, every menu entry has an associated help text that resides in an ASCII readable text file, which can also be edited and modified at any time (if the systems provider opens access to this option). Below is an extract of the help file associated with the above string processor menu. For the specific national languages, the file contains keyed entries with the prefixes !ENG !GER !FRA !ITA etc. Switching to a different language is done at run time simply through the system configuration menu. The menu configuration file and the help information file can be filtered and merged easily to produce a printed manual. Thus, documentation production is centralized on a single data base for all national language editions, and for all stages of the production process: the in-house source code documentation provides the base for the help file, which then enters the printed documentation.

!" M:STRING-ACTOR String Processor Help

!" $.user_input
!ENG user input string
\LThe user enters a string at the keyboard. The key <ENTER> finishes
the string. With ESC it is possible to leave the input, and a
zero-string is produced. The string input system has an
editing capability which allows to edit the present line with
the cursor control keys. It also allows to retrieve old inputs and gives a
full-screen editor for them.
!GER Benutzer String Eingabe.
\LDer Benutzer gibt einen String von der Tastatur ein. Die <ENTER> Taste
beendet die Eingabe. Mit ESC kann man die Eingabe verlassen, und ein
Null-string wird erzeugt. Das String-Eingabesystem hat einen eingebauten
Zeileneditor, der die Eingabezeile mit den cursor control Tasten editierbar
macht. Es erlaubt auch, fruehere Eingaben aus dem Puffer zu holen,
und erlaubt diese in einem full-screen editor zu bearbeiten.

!" $.get
!ENG get string with len from location
\LA string that is at a buffer location is collected into
the String Actor buffer. The length of the string is needed
as second parameter ( buf len -P- )
!GER String Text ...

2.3.17. Sample Leibniz TLSI Source Code
This sample gives a source code extract from the program group ( :GRP ) of the string processor, here called the STRING-ACTOR.

The commands :MSG-GRP M:ACX define a message group M:ACX for the string processor, which contains the prompting and dialogue strings. These can also be edited and re-loaded at run time with different nationalized versions if so desired. This way, all language string handling is completely separated from the actual code implementation.

-----------------------------------------------------------------------------------
\\ sample Leibniz TLSI Source Code

:GRP M:STRING-ACTOR \\ string processor program group
(| AG 20:15 24/11/92 )

:MSG-GRP M:ACX \\ actor function exec and exec messages
M" STRING-INIT" ( 1 ) \\ this is the actor init function
M" STRING-DISPLAY" ( 2 ) \\ this is the actor display function
M" " ( 3 ) \\ spare
M" STRING TOOL " ( 4 ) \\ actor title II
M" LEIBNIZ ACTOR SYSTEM : " ( 5 ) \\ actor title I
M" performing Actor command: " ( 6 ) \\ actor message
M" edit input parameters: " ( 7 )
M" enter a string " ( 8 )
M" return parms: " ( 9 )
(| AG 16:22 15/11/92 )

-----------------------------------------------------------------------------------
The function v.std_input is a generic window string input function called by the
high-level routine $.user_input that appears in the menu.
-----------------------------------------------------------------------------------

:DEF v.std_input \\ standard input in W-MSG , line 0
( $msg -$- $input ) \\ parameters: prompt string in, answer string out
{ v.push_parm \\ push present window parameters
W-MSG v.default \\ switch to message window as new default window
$.len v.msg_push \\ save string length, put the prompt msg to window
0 p.m1 v.at>at \\ position cursor to top window line after prompt string
cur.set \\ show input cursor
$.get_inp $.len v.add_cur \\ get user input, increment cursor
msg? ZEQ DO_IFT v.clr_display DO_END
v.pop_parm \\ return saved parameters to last window used
TRUE -> V:MSG? \\ set state variable
}
\\ $msg prompt string displayed in msg window
\\ $input user string returned. $"" (0-string) if only ENTER
(| AG 20:16 26/12/89 )

:DEF $.user_input \\ user input string
( -$- string )
{ 8 M:ACX M>$ v.std_input
}
(| AG 19:22 08/11/92 )

:DEF STRING-DISPLAY \\ string actor display
{ v.push_parm \\ push present VDEV window parameters
FALSE -> MK?
W-BUFF v.default \\ switch to buffer window as default
0 -> VP:OFFS \\ initialize display frame offset
&. v.chr_fillbuf \\ fill buffer with "." character
$P:PTR v.mov_$buf \\ move string buffer to VDEV buffer
v.brd \\ show VDEV border
v.display \\ display VDEV contents
W-MSG v.default v.brd v.display \\ show message VDEV
v.pop_parm \\ return to prior VDEV
}
(| AG 23:00 08/11/92 )

GRP; ( M:STRING-ACTOR )
(| AG 16:11 01/07/92 )

\\ end of code sample
-----------------------------------------------------------------------------------


2.3.18. Higher functionality and information management of the Leibniz System
In the Leibniz TLSI environment, special focus was laid on maximum production speed in the rapid prototyping environment. For this aim, all the system components were carefully tuned and integrated with each other to achieve maximum efficiency. Originally intended for standalone hardware, embedded systems, and real-time industrial machinery control (the most general interpretation of multi-media), it contains its own operating system functions, a software prototyping and testing shell, a fully self-contained windowing system, a hypertext-integrated full-screen ASCII editor, and a hypertext database system. In the present implementation, the Leibniz system contains 10,000 functions in 100,000 lines of code, with 6 megabytes of source in 500 source and configuration files. A system of this size puts the built-in information management functions to a severe test, because such a huge function library is impossible to memorize, and printed paper documentation is much too cumbersome, besides being hopelessly behind the current version status. 100,000 lines would occupy a hefty 2000 pages of A4 paper. Listing only the function names fills a book of 200 pages. Even an experienced OO SW developer cannot be expected to master such a library in an afternoon. Therefore the optimal structuring and maximal speed of the retrieval environment is crucial. The available tools of the system allow access to any of the 10,000 functions, even in multi-level hypertext access, within at most 5 to 10 seconds. A special interface shell allows the very efficient GREP tool to be called from within the editor with one keystroke, searching the 6 MB of source in about 10 to 30 seconds and delivering its output in a table that is routed to the editor. The output can then be fed into the hypertext system.

2.3.19. Mnemotechnic and embedded database access keys in function names
The code examples supplied are not just there to show another kind of yacc (yet another cryptic code), but to exemplify how a very condensed and powerful knowledge organization scheme can be embedded in the function names, serving as their own database ordering principle. While these names may seem cryptic and not very intuitive in the beginning, a few important information management factors speak for them. First, the average length of a Leibniz function name is about half that of comparable systems, which have the well-known expressive full-length meaningful English names like:
WindowIncreaseApertureOnMouseClick
These names may be more expressive, but if they were used throughout the Leibniz system, the size would not be 6 Mbytes but 12 Mbytes, each file would be twice as big, it would take twice as long to load, the user dictionary would have to be twice as big, and the listing would need 4000 pages, not just 2000 pages. And what is even more problematic, testing would take more than twice as long and be less efficient, because of the tedium of typing in those long expressive names. Typing errors tend to accumulate more than proportionally after ten characters. It is clear that there are information handling limits, more on the human than on the machine side, that are imposed by the sheer size of application systems.

The message structure is modeled after the Eiffel pattern: class.action. The dot supplies a very useful visual separation between the two parts. Commands like v.brd v.display v.push_parm are messages to the window or VDEV (Virtual Device) processor. The individual object needs to be addressed only once, when its parameters are anchored:
W-MSG v.default
This declares the window W-MSG as the default window to which all subsequent v.xxx messages apply. This saves a lot of unnecessary individual name-dropping, and also saves code space. Moreover, the usual case in an application is that we return to the window we came from before we entered the special function. So we need not even name the old window any more: v.pop_parm does all this for us, returning us to that window without any additional code (a C sketch of this anchoring mechanism follows after the move-operator example below). All string processor commands are keyed with $.xxx, and for each special processor there is such a very short keying sequence: m.xxx for the memory operators, mov.xxx for moves and fills, and so on. This allows a very cheap run-time search of all the methods a class has to offer, by a string compare through the database for the first 2 or 3 matching letters. More elaborate codification schemes for names are implemented with matrix words, a kind of logical place value system wherein a certain character at a specific position encodes a specific function subclass. The move processor has the following message modifications:
mov.<[l mov.>[l mov.[lc mov.[l0 mov.[lb
mov.[]>[ mov.[]<[ mov.[]?[ mov.[]>[a mov.[]>[x mov.[]<[a
Without going into unnecessary details, it suffices to say that the universal prefix mov. specifies the operator class for memory moves. The < > [ ] ? a x codes specify what kinds of parameters are needed, whether it is a move-and-fill (with a fill pattern of byte, word, longword, or string), and so on. There are very many combination possibilities for just this one operator alone. Naming them all with long names would be quite cumbersome.
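As announced above, here is the anchoring mechanism sketched in C (the types and names are illustrative assumptions, not the actual Leibniz structures): v.default installs a current receiver, and v.push_parm / v.pop_parm save and restore it, so the individual v.xxx messages never need to name their window.

-----------------------------------------------------------------------------------
/* Sketch of the "default window" anchoring pattern. VDEV stands in
   for whatever the real window descriptor contains. */
#include <stdio.h>

typedef struct { const char *name; /* position, size, ... */ } VDEV;

static VDEV *v_current;        /* receiver of all v.xxx messages */
static VDEV *v_stack[16];      /* saved receivers                */
static int   v_sp;

void v_push_parm(void)  { v_stack[v_sp++] = v_current; }
void v_pop_parm(void)   { v_current = v_stack[--v_sp]; }
void v_default(VDEV *w) { v_current = w; }

/* every v.xxx message implicitly applies to the current window */
void v_brd(void)     { printf("border of %s\n", v_current->name); }
void v_display(void) { printf("display %s\n", v_current->name); }

int main(void) {
    VDEV w_buff = { "W-BUFF" }, w_msg = { "W-MSG" };
    v_default(&w_buff);
    v_push_parm();            /* like v.push_parm                */
    v_default(&w_msg);        /* W-MSG v.default                 */
    v_brd(); v_display();     /* messages go to W-MSG            */
    v_pop_parm();             /* back to W-BUFF, no name needed  */
    v_display();
    return 0;
}
-----------------------------------------------------------------------------------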

2.3.20. Factors of the optimal rapid prototyping process
The optimal rapid prototyping process rests on the ease and speed of the interplay of these factors:

1) Archive: access to existing libraries of functions. This requires the optimization and structuring of the hypertext data base. The categorization of functions must be possible under any suitable ordering scheme, and must not be restricted to (for example) object hierarchies that are exclusively implementation dependent. Many different access hierarchies must be possible for the different user groups, who have different views of the system. Quick overview functions are available through a combination of hypertext with a special folding method implemented in the system editor. The principle is: overview combined with detail view results in perspective (Überblick und Einblick ergibt den Durchblick). It is necessary to present the system designer with as much useful information as he can keep in short-term memory and carry over to the task he has to solve.

2) Constructor: the rapid prototyping method. It must allow overloading and modification of functions without a recompile of the system. The simple TLSI constructor method provides a very rapid turnaround. The library database provides a fast and easy update of the newly created functions.

3) Testing: a flexible and powerful testing environment. The TLSI allows generating standard input scripts that can be used on whole classes of functions. The testing scripts can also be derived from the interface declarations in the source code. All testing inputs can be logged, for reproducible error histories and to generate automated test suites for further testing sessions. All inputs that the user would perform interactively can be redirected to script files, so that auto demo functions are readily available.

4) Interface access: a transparent, most direct method. The mouse-/icon interface provides the easiest to use and most fool-proof access method. Unfortunately it is also one of the slowest. For expert users, this proves to be a serious productivity obstacle. These users need completely transparent access to all functions through key-strings, and must be able to create their own shortcuts. This is possible only when a uniform keyboard interface exists for all functions, which is ensured by the TLSI principle. Only through the keyboard interface can the complete control of the system through script files be achieved, as well as the log function.

(Fig. TLSI-5)
2.3.21. The Inter-Actor principle
The development of the Leibniz system resulted in the Inter-Actor principle, a powerful unified representation and access metaphor for program functionality that is accessible in all its aspects from the user interface level. The structure of the Inter-Actor is similar to an object class, but it provides many extra user interface handles, like a self-display function and an experimentation field with test data for a new user who wants to try out the functionality of the system. For training purposes, all functionality of the system should be provided with sample data, sample inputs, the display of the internal state of the Inter-Actor before and after each function, and a way to change all the state variables individually. The functionality also resembles the Unix filter tool approach, with the difference that in the filter approach, the CR-file is the one-and-only class of data it can work upon. Also, the Unix filters are "lean and mean" tools that are very powerful but don't offer much in terms of user friendliness. In the Inter-Actor approach, there exists a CR-file processor which has the functionality of GREP, AWK, and SED combined. Besides this, there exists a powerful string processor. With all the available string methods taken together, about 200, the Leibniz String Inter-Actor provides a functionality much exceeding that of commercially available string libraries and comes close to special string handling systems like SNOBOL. There is also a processor for memory patterns (byte patterns) and shifts, one for windows (here called VDEVs, see the code example), and one for numbers and arrays of numbers (as in APL); extensions for pictures and drawings (like an on-board Postscript), for movies, and for sound files can be added as well.

2.3.22. Interfacing with OO-Systems and the power tool metaphor
While the present implementation of the TLSI presents a solution to software structure that is in many ways equivalent to the OO-paradigm currently in vogue in the software industry, it takes different and parallel approaches in some aspects. The OO-paradigm is a solution to structuring problems that is geared to a certain way of thinking in the software industry and needs a lot of training. For inexperienced users, the ins and outs of the OO-classification rules are sometimes hard to understand. This may pose too much of a time investment for practically oriented user groups, whose time would be better spent creatively designing and implementing the artistic and aesthetic parts of the system, instead of dealing with deeply nested class hierarchy trees whose structure is not obvious. The metaphor of the Power Tool provides a tangible and direct user interface for users who are not software engineers. The TLSI resembles a multi-function power tool, which can be converted in a minute into a drill, a buzz-saw, a jigsaw, a water pump, a sander, a grinder, or a screw-driver. Its add-on functionality rests on a practical functional classification that is hard to describe in the generic structural class inheritance structure of OO software. It is achieved through function frames with re-vectorable executor modules. They can be re-vectored interactively, on the fly, just like in the machine shop. The functionality can be implemented on top of available OO technology, if it is possible to interface with the message structure of the OO processor. How this can be practically added to an existing authoring system is not a technical question but mostly a problem with the closed-shop proprietary policy of authoring system vendors. SUN's Java approach seems to provide the most painless access path.
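A function frame with a re-vectorable executor reduces, in Python terms, to something like the following sketch. The names FunctionFrame and revector are invented here, and the real Leibniz mechanism is of course richer; the point is only that the handle stays fixed while the working module is exchanged at run time, with no class hierarchy involved:

    class FunctionFrame:
        """Fixed handle, exchangeable working bit: the power-tool chuck."""

        def __init__(self, executor):
            self.executor = executor      # the currently mounted module

        def revector(self, executor):
            # convert the tool on the fly, like in the machine shop
            self.executor = executor

        def __call__(self, *args):
            return self.executor(*args)

    tool = FunctionFrame(lambda x: x * 2)     # mounted as a "drill"
    print(tool(21))                           # -> 42
    tool.revector(lambda x: x + 1)            # re-vectored to a "sander"
    print(tool(41))                           # -> 42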

2.4. Hypermedia as extension of the human thinking apparatus

2.4.1. Strategies of Overview
@:STRATEGIES_OVERVIEW
Hypertext and the text tradition.
Textual techniques used in the book, reappearing as hypertext now.

Current application of the hypertext principle as an example of the Law of the Hammer: "everything needs banging".
Hypertext is not a panacea.
Undiscriminating application of Hypertext leads to fragmentation.
See the WWW as a deterrent example. Mini files of 4 or 10 K bytes are shipped over the internet, each requiring a lot of network message passing to fetch. This is wasteful in terms of network resources and user time.
The whole idea is running into absurdity.
The contextual information is vital. Hypertext neglects this.

The complementary techniques of Hypertext and Outlining.
Outlining as forgotten art, Folding, Fish Eye.
Two examples from the stone age of personal computing: Thinktank and Maxthink.
Applying the strategies of overview and insight.
Dependence on the human conceptual apparatus.

2.5. A structural note: Outlines and the importance of folding

@:OUTLINE_FOLDING
One main reason for using WinWord® is its outline folding feature. This technique is also called hierarchic editing, or fish-eye. Once one has come to exploit the potential of this technique, one will "rather fight than switch", as a cigarette commercial expressed it some years ago. It is also hard to explain in words, one of those things that are much easier done than said. Folding turns a nuisance into a boon ("It's not a bug, it's a feature!", as they say in the computer industry). The computer screen, no matter how big a boob tube you get, is always too small. When you have a long text to work on - and this is a long one - you can cumulatively spend days just paging up or down to find the right place you want to work on. It is of no great use to split the text up into smaller files, because then you will be continually hopping between files, never knowing where the text you wanted really is at the moment. Having many small files is no solution at all, especially when you want to build an index or a table of contents.

The folding feature allows you to keep it all in one file. This makes heavy demands on the machinery. One always needs the latest technology, no matter what that is. You will want at least 8 MB of RAM and a 486 machine for anything larger than 100 KB, or you will lose time again waiting for the program to finish paging.

2.5.1. The essential law of overview
@:ESSENTIAL_OVERVIEW
Folding will allow you to compress your whole file onto one or two screenfuls. The essential law of keeping an overview of your work in the computer is: Have it all on one screen, or within one PG-UP or PG-DOWN at most. That is always available with folding level 2 or 3. Then you can go where you want in your text in a matter of about one second. Then you expand the text to normal view, and you edit as usual. The side effect is: You will write many headings. This is not primarily to get a scholarly-looking table of contents, but for your own convenience of knowing where you wrote what. It also influences you to do your best to formulate, succinctly and pointedly, in one line of at most 50 characters, exactly what you are talking about in the specific paragraph that is outlined this way. It is a very nice feature that you can move a whole paragraph or a whole chapter by folding to the specific level and moving only the resulting outline.
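The principle behind folding is simple enough to sketch in a few lines of Python. This is only an illustration of the idea, not WinWord's mechanism; the heading convention (one marker character per outline level) is invented for the example:

    def fold(lines, level):
        """Show the outline view of a document folded to `level`:
        headings up to that depth survive, body text and deeper
        headings stay collapsed."""
        for line in lines:
            depth = len(line) - len(line.lstrip("#"))
            if 0 < depth <= level:
                yield line

    doc = ["# Engineering, Technology and Informatics",
           "## A structural note",
           "body text that is folded away...",
           "### a deeply nested detail",
           "## The essential law of overview"]

    print("\n".join(fold(doc, 2)))    # the two-level overview screen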

2.5.2. The essential feature of speed
@:ESSENTIAL_SPEED
The other essential feature is speed. You have to do all this with one keypress. Usually, WinWord doesn't let you do this: to get the outline mode, you have to activate it in the "View" menu, and then set the right level. All with the mouse, interrupting your normal flow of work. Far too slow for anything useful; therefore it is not very often used. Fortunately, Bill Gates had his boys do something sensible this time around: The macro programming facility of Word lets you save a sequence of keystrokes under one special control-alt-key. So you can use one separate control-alt-key for each folding level, and one more control-alt-key to return to Normal View. All in a maximum time span of 1/10 second.

And it happens more often than not that outlines and paragraphs develop a life of their own which you didn't plan at all, by wandering into totally different regions of your text, re-assembling by their own dynamics. Of course you have to do the moving, but it is the structure of the outline which induces you to do it, not your wilful planning. Many times, you can see an idea actually develop by itself on the computer screen, giving you the eerie feeling that there is something thinking outside of you. You are no longer the master of your thoughts; they form their own patterns by their own dynamics.

2.5.3. The potential of Hypermedia
Hypermedia is in some ways a current buzzword of the computer industry. Since it is an emerging technology, it is not possible to give one universally accepted definition. It is especially difficult to define what Hypermedia might be in the foreseeable future, as opposed to what it is by already established technological standards. The future aspect is particularly important here. We will define Hypermedia by the potential it offers.

ED-MEDIA95

2.6. Problems of contemporary mouse-icon technology

@:ICON_PROBLEM
Contemporary mouse/icon driven software has some problems, of which two areas will be dealt with here: interface ergonomics and programmability.
These problems are shared by all mouse-oriented software.

2.6.1. Mouse Interface as sole user metaphor
Activation areas (hotspots) are activated by mouse click. Very little or no support for other means of activation, like the keyboard, is offered. The mouse interface implements a type of ease of use that sacrifices the needs of expert users. Although there is a definite advantage in offering the point-and-click interface as an easy access method to novice users, we can list the disadvantages:
2.6.2. Discrimination against hand-eye-control limited persons.
The activation areas on the screen are very small and require extremely accurate hand-eye coordination. While this is no problem for people up to age 50, there are definite problems for older persons and those with motor/eye disabilities. For people with serious motor disabilities, the mouse interface is impossible to use. This cuts out one market where the benefits of hypermedia would be extremely useful. The easy solution would be a standardized, exchangeable activation interface level for all systems that can be converted to any input medium on the fly. By all standards of equal access for handicapped persons, systems with mouse-only access should be banned.

2.6.3. Interruption of activity flow (the pinball wizard syndrome).
To move the mouse to a small screen area and pick or click a small object of only a few pixels in diameter necessitates that the user take his attention away from whatever activity he is just performing and concentrate on the eye/motor skill of moving the mouse to the required spot. A good example of this problem can be found in WinWord, where the TAB sliders are so small that it is almost impossible to move one and not move the other at the same time. (See the illustration.)

This problem is aggravated when one is occupied with a task that requires one specific kind of attention and concentration, like reading a longer text. The mental activity of reading is assuredly totally different in kind from hunting and pecking at things on the screen with a mouse. Thus, a mouse-controlled reading system can be extremely tedious. This adds to the already adverse condition of much lower contrast and smaller display area that a screen offers in comparison to a printed page of paper. Reading a text is an activity that should not be interrupted by frequent mouse movements and button clicks. The problem is especially serious when the system uses a text metaphor that is not one continuous stream, but is broken up into many independent pages which are each longer than can be displayed on one display screen. Then the user is forced into the uncomfortable activity pattern of reading a portion of the text, having to look and grab for the mouse, moving it to a different area on the screen, and activating the slider to display the lower half of the page. When that part has been read, one has to activate another button to let the system make a page flip, only to have to activate the slider again, because the page viewer is still adjusted to the lower half of the new page. This was found in the Hyper-G Postscript viewer of the Harmony client. On most text processing systems, like WinWord, it is possible to move through the whole text with keyboard commands (cursor controls, pg-up, pg-dn) alone, without hunting and searching for the mouse.

A rule of thumb is that any repeated mouse action that needs to be performed more than ten times in an interaction sequence of ten minutes takes unnecessary attention away from the task to be performed, and should rather be executable from the keyboard.

2.6.4. Discrimination against touch typists and expert users.
The keyboard has the great advantage that it can be activated without looking at it. This phenomenon can only be put to use by touch-typists, and this is why contemporary mouse-only interface systems discriminate against touch-typists. Since the keys are in an always defined position, the touch-typist does not need to lose time and nervous energy taking his/her eyes off the work at hand, and can perform repetitive actions about one order of magnitude faster than when using the mouse. The combinations of Alt- and Cntrl-keys allow instantaneous access to about 100 functions. This can very easily be achieved in contemporary hypermedia systems by marking each hot spot on a display with an appropriate alphanumeric key. We can give only a small indication here of the immense loss in efficiency and productivity that was incurred when the computer industry switched wholesale to mouse interfaces while completely forgetting the keyboard interface [37].

An expert user needs the possibility to write script macros and incorporate them into the system he is working with. This problem is addressed in the next paragraph.

2.6.5. Macro or shell programmability
In any repeated use which goes beyond the very basic novice level, the user will want to save sequences of regularly performed actions as a macro and have them performed automatically when he uses the system. This potential was developed to the highest extent in the Unix shell languages, where it is still cherished highly by seasoned Unix hackers. Unfortunately, the shell languages were too cryptic to appeal to common users, and therefore very little use was made of them outside that community. This also offered no great incentive to the industry to supply shell interfaces for their common products. Where they are offered, they are proprietary products of each vendor, completely incompatible with those of any other vendor. Some systems, like Autocad, have created a lively aftermarket of suppliers offering customized scripts for specific tasks. For copyright reasons, macro programming systems are protected against use by other vendors, as happens with spreadsheet and database languages. This has the beneficial effect for the vendor that customers who have invested many man-years in programs of one script language will think twice before deciding to switch to a different product. Overall, it is very detrimental for the user community in general. Just like in the beginning days of computing, when each vendor had only one programming language to offer: Assembler, and that was different for every machine.

In a keyboard driven system, it is usually no great programming problem to vectorize the keyboard interface and allow the system to be driven by a macro script consisting of a list of key sequences. This ability was lost when the industry switched to the mouse interface. This type of programmability is not very legible and can only be used in limited cases. A better solution is a LISP-like language, as in Autocad, or a BASIC, as in WinWord.
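Vectorizing the keyboard interface amounts to routing every keystroke through one re-bindable dispatch table, so that a macro is nothing but a stored key sequence. A minimal sketch in Python, with invented bindings, might look like this:

    bindings = {}                     # key -> action, re-bindable at run time

    def bind(key, action):
        bindings[key] = action

    def press(key):
        # the single vector through which all keystrokes pass
        bindings.get(key, lambda: None)()

    def run_macro(keys):
        # a macro script is just a replayable list of keystrokes
        for key in keys:
            press(key)

    bind("^s", lambda: print("cursor left"))     # Wordstar-diamond style
    bind("^d", lambda: print("cursor right"))

    press("^s")                       # interactive use
    run_macro(["^s", "^d", "^d"])     # scripted use of the same vector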

2.6.6. Keyboard layout schemes versus ergonomics
The issue of keyboard layout and ergonomics will be enlarged upon a little more. Some people will remember the discussions around the virtues and drawbacks of the cursor control schemes of an earlier era of computing, before the mouse became popular - notably the Wordstar diamond versus the emacs and vi types.

2.6.7. The Wordstar diamond
The Wordstar diamond appeared on CP/M machines about 1978 and was still usable on the early IBM PC computers, up to the advent of the AT MF-2 keyboard. This layout had the following codings: cntrl-s := back, cntrl-d := forward, cntrl-e := up, cntrl-x := down, cntrl-a := tab left, cntrl-f := tab right, cntrl-r := page-up, cntrl-c := page-dn. For daring power-users there were even the control-q combinations which further amplified these keys. (See ILL:WD-1, The Wordstar Diamond)

This key layout allowed the fortunate touch-typist to move the cursor while writing, without interrupting the writing process. The positioning of the control key in immediate proximity to the cursor control keys made the coordinated touch of two keys at once extremely easy and allowed very fast cursor movement in a text. The crucial point is that often-used keys must be positioned in close spatial proximity to allow easy touch-typing. The same argument was also made in the discussions of alternative layout schemes to the qwerty model, which never made it into common use, since the conventional scheme was too entrenched. Of course the Wordstar diamond was criticized, because for people who are not touch-typists it makes less sense, and it takes a little mental effort to memorize (and maybe one hour of practice).

Other cursor control schemes used another method: cntrl-b for back, cntrl-f for forward, cntrl-u for up, cntrl-d for down. This has the advantage of being somewhat easy to remember for English-speaking people (and pure gibberish for all others), and for people who are not touch typists it may be the easier method. But with the cursor keys distributed evenly across the keyboard, the immediate feeling of control could never be achieved. This scheme and others were more used in the Unix world. It is probably safe to say that most Unix programmers are not touch typists.

2.6.8. The MF-2 disaster
As opposed to the qwerty case, here it was not a bad scheme that stayed because it was entrenched. To the contrary: a fairly useful scheme was displaced by a vastly inferior solution. The MF-2 layout obliterated this type of cursor control because of the extremely uncomfortable new position of the control key, which made this key much harder to use. It positioned the key at the lower left end of the keyboard, way out of reach of the other, often used keys of the main field. The user has to bend the small finger of the left hand to a very uncomfortable angle to reach the key in its new position, destroying the close association needed for secure simultaneous operation of both keys. Instead, the caps-lock key was moved up. The question is: how often does one use the caps-lock key? In all my own typing practice of about 15 years, and about 10 Megabytes of text, I came to an average of maybe once a month. How often does one use the control key? In the days of the Wordstar diamond, when all cursor control was done with control keys, this was the most important and most often used key of them all.

The decision to place the rarely used caps-lock key in the central row of the keyboard adds another problem, because this key is now quite often hit accidentally. And, to add insult to injury, we cannot toggle this key: we must hit another key, the shift key, to undo the action of the caps-lock key. I don't think that all the devils in hell could conjure up a more horrible ergonomic nightmare than the MF-2 shift-lock key [38].

Here, a bad design decision wiped out a whole sector of ergonomics. The use of a separate cursor control key pad does not offset this disadvantage, because now one has to move the writing hand away from the main part of the keyboard to reach the cursor keys. Bad news for touch-typists. And, more problematic, one has to move the eyes off the screen, interrupting the flow of work. For someone trying to think with the text while s/he is writing, this is a disaster. Ten million flies can't be wrong: somehow the new scheme got adopted without so much as a question. It is nowadays impossible to find a vendor for the old type of keyboard, and even the non-PC workstation vendors have adopted the new one. Since keyboard drivers are usually not user programmable, there is no easy solution for the problem, save re-programming the keyboard with the soldering iron.

2.6.9. Marketing strategies leading to contra-ergonomic schemes
Now perhaps the MF-2 scheme was not a bad or sloppy, unergonomic design at all, but one that was carefully planned. It is not possible to find out who designed this layout - but if someone had wanted to get some competition off the market that was using the old scheme, they succeeded nicely:

The Wordstar diamond was used by that once most important CP/M text editor, which has now almost disappeared from the market, and by a few followers, most notable of those being the Borland Turbo tools. As it turned out, Borland also lost the market - and guess who won? We don't need to delve into further speculations on this subject, but it possibly serves to illustrate the point that not everything in the olden days was bad, and not everything new is therefore better.

And we can be sure we will be in for "more of the same kind" of ergonomically strangling technologies, simply because some large players in the game want to keep a productivity advantage by not supplying the best possible tools to the large public.

We can refer here to the much discussed theme of the hidden secrets of Windows. This is just the small beginning of a whole industry of hidden trapdoors and intellectual mazes in software that makes the dungeons and labyrinths of the castles of an earlier epoch seem like child's play when compared to modern possibilities. We can always turn to "Neuromancer" to get an indication of where the development is heading, and who will be the one to profit from it.

This brave new industry will be capable of giving us sensory and computing implants in a few years to come. At the Ed-Media 95 conference, we heard an invited talk about "the implantable workstation" (Gerald Q Maguire, ED-MEDIA95).

But should we trust this industry at all to let it come closer than a five-foot distance to our bodies and sensorium? I believe we should be extremely wary. We may have already let it come closer than is salutary.

2.7. The Software industry and the Law

The software industry has made a big noise about the multimillion-dollar losses it suffers because of pirated software. Obediently, the lawmakers followed suit and issued heavy criminal penalties for people copying and selling software without the consent of the authors. Rightfully so.

But another multibillion-dollar loss has been going on unnoticed all the time, causing incredible damage to world economies through faulty, inefficient, unergonomic, and otherwise unsuitable software.

As in any new industry, the law of Commonized Costs and Privatized Profits (HARDIN85) is creating glaring examples of incredible cost to literally hundreds of millions of users, just because one manufacturer, who has all the communication and control in one hand, doesn't deem it necessary or profitable to think a few things through right in the beginning.

The losses inflicted upon the world economy amount to a staggering number of billions of dollars, and just because they come in little pieces, because they can't be seen in their full weight at any one specific place, nothing is done about it.

A few software companies control standard software use over the whole globe, and controlling these software companies should be the first aim of supranational legislation. But how is this to be effected?

2.8. The fundamental necessities of a standard ASCII programmable shell


The educational Hypermedia market is not going to get off the ground if the most important needs of the user community continue to be neglected as they are in the present systems:

Modern educational Hypermedia need a common programmable interface for students and teachers to make their own amendments to the material.
This must be a programming shell that does not depend on a compiler.

It must be of the kind of WinWord Basic.
What is the most general, and the most economic, shell principle?

On the horrors of using Windows® software.

Every Windows program we use comes with a different keyboard binding schema for accessing the menu functions.
Users are forced to use the mouse, because no one can remember all the different keyboard codes for accessing submenus.
Apple did it a little better: there is a larger set of common functions accessible through a standardized interface.
But this is not the solution.

Every piece of standard computer software on the market must be macro-programmable.
And this macro programming facility must have standardized features, so that macros of one system can be run on a different system.

2.8.1. Common Interface Alpha
The introduction of a standardized interface shell for all standard software products is imperative.

There is an urgent necessity to keep all keyboard-bindings, all help-texts, and all menu-items in ASCII-readable files that have a common, uniform format all over the software industry. These must also be usable as a macro programming interface for the software.
This we call the Common Interface Alpha.
It must also be accessible as an interactive and programmable shell while the software is running. A hypothetical example of such a file is sketched below.
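What such an ASCII interface file might look like is sketched here. The concrete syntax is invented purely for illustration - the point of the Common Interface Alpha is precisely that some such uniform, human-readable format would be standardized across the whole industry:

    ; Common Interface Alpha file for a text editor (hypothetical syntax)
    [keyboard-bindings]
    ctrl-alt-2 = outline.fold level=2     ; fold the text to heading level 2
    ctrl-alt-0 = view.normal              ; return to normal view
    [menu-items]
    View/Outline = outline.fold
    [help-texts]
    outline.fold = "Collapse the document to the given heading level."

Because the keyboard bindings, the menu items, and the help texts all name the same function identifiers, the same file doubles as the macro programming interface: a macro is simply a plain-text list of such identifiers, editable with any editor.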

2.9. The scourge of binary only configuration and control files

How much blood, sweat, and tears would millions of computer users worldwide have been spared if the innumerable "closed shop" binary configuration files of our modern software systems were in plain, readable ASCII? My guess is that the time and data loss caused by those unreadable or otherwise unchangeable binary configuration files is in the billions of dollars worldwide. Just the nuisance that configurations are accessible only through the manufacturer's circuitous "same procedure as every day" mouse-click orgy, wading through mazes of menus to change the parameters, adds up to millions of man-hours lost.
2.9.1. My Microsoft Windows .GRP Odyssey
A horror tale of personal experience tells the truth of life. Who would ever think of those unobtrusive Windows .grp configuration files? Probably 90 % of all Windows users don't even know about this kind of software beast that is controlling a vital part of their daily working environment.

One of these fine days, I changed my system by putting in an EIDE hard disk to add to my SCSI hard disk. That is innocuous, you would think. Not so, my friend. Because the EIDE disk cannot, by any means, be made disk D:, since the BIOS insists that it must be disk C:. Now comes the surprise: I had the Windows system on the SCSI disk, which was C: before and was now residing on D:. Consequently all the .grp entries were invalid [39]. What should I do? Obviously the Windows designers hadn't intended that situation to arise, and there is no way to edit the .grp files (that I know of - I looked long enough in the manual, believe me). Should I copy Windows to the new hard disk C:? That would have been the best thing to do, but I didn't want to. Re-install the whole system? I thought in my naivety that I might outsmart friend Billyboy by getting out my trusted bit-editor and patching in D: wherever it said C: in the binary .grp files. You can imagine my surprise when the system came back at me with the message "corrupted .grp file" and gave up. Checksum test, of course. Billyboy had shown me again what he thinks he can do with people who try to tread on territory that his software designers have declared "closed shop". And he gets away with it. Every day, a million times.

I'll tell you how I managed anyhow. The procedure is called in German "Über den Rücken durch die Brust ins Auge" (literally: over the back, through the chest, into the eye). I don't know how to translate that properly; it just means verrrry circuitous. I have the Central Point PC Tools system. Its desktop system, which can substitute for the Microsoft desktop, has the nice menu entry "import .grp files". Of course the designers hadn't envisioned the possibility that anyone would ever want to quit their fine system, and an export function for .grp files is therefore missing. Like so many of these nice gifts of the computer industry, it is one way only [40]. Once you are hooked on the company product, you are done in, for good. Fortunately, CP had not been as tightly security-minded as Billyboy's men: The CP .grp importer ate my binary-edited .grp files with the D: patch without so much as a hiccup. And converted them into the CP desktop data base. That is, you have guessed by now, just another one of those ominous, closed shop, binary-only data base files that control the whole Central Point Desktop system. As opposed to Microsoft, all the information resides in one big file, not many small .grp files. Now I had it in there, cleanly, but for some reason, the CP desktop isn't entirely to my liking. So what did I do? I created new Windows .grp files with the menu, but dragged the converted entries of the CP desktop into the new .grp files. That cost me in total probably as much as re-installing the whole system, all in all about half a day's work. Just so much money wasted, because some lazy or over-zealous systems designer had decided that the user is not supposed to change the .grp files with an editor.

You may get to know the details of these things when you subscribe to the "Microsoft Systems Journal", or get into the special mailboxes, or have a friend at the manufacturer. But if you don't, you are stuck. And since any self-respecting software today lavishly produces control files in binary, you can never be in all the mailboxes, subscribe to all the trade journals, or have your friends at all the manufacturers - unless, of course, your name is Jerry Pournelle and you work for Byte magazine. But that doesn't hold for the rest of us.

2.9.2. Mouse Menu Madness
This would be the appropriate title for the next Hollywood horror movie that is going to be a blockbuster among computer hackers. Unfortunately this is reality. It is going on a million times daily in the world of those wonderful iconic interfaces. Have you also cursed about that strange habit of most menus: that you have to click your way about three or four levels down a menu tree, and then have to change a whole list of things on that last level, but you can't do this in one sweep, because the darn thing insists that once you have done one action, you must be finished, and teleports you right back to menu level 0? This seems to be a Microsoft special. One hundred million flies can't be wrong. And if someone up there has decided that this is to be the way people have to do it, then that's it.

A particularly nice example of Mouse Menu Madness was presented to me by WinWord. There is an insidious bug in the program which appears only in very specific cases. I haven't deemed it worth my while to track down the exact source of the error, but it goes about like this: When you put the index and table of contents of your work in a separate file, and embed this as a whole in your text, you can have Word update all these fields at once with the F9 key. Unfortunately, this also caused a miraculous multiplication of certain entries in my format definition template file (which is normal.dot or a derivative thereof). So one fine day, Word complained that the template file was full. I looked and found that there were about 200 superfluous copies of a few of my formats. Now how do you get them out again?

Quite easy, you see: You click the format template definition, click each of those 200 needless formats that you want to have deleted, answer the ubiquitous safeguard "Do you really want to delete it?", and go on to the next format. Repeat that about 200 times in a row, and you are ready to shoot Billyboy and all his men to the moon. If you could edit the .dot entries with a text editor, it would have been a matter of five minutes.

2.9.3. Damaging attitudes of software manufacturers
This kind of proprietary behavior (in German: nach Gutsherrenart, in the manner of the lord of the manor) clearly shows us that the software industry has already taken forms that would be called expropriation and highway robbery if they appeared in another sector of society. Somehow, the user community thinks that this cannot be any better and puts up with it quietly. The cumulated losses are in the billions of dollars when we add them up. The damage doesn't even fill the coffers of anyone positively profiting. It is just a general loss, due to close-minded proprietary thinking and sloppiness, sometimes deliberately created to prevent people from creating their own interfaces to hook in subsidiary products that might infringe on some part of the market. The mafia extorts maybe at most a tenth of this sum.

The computer press will not do anything about these things either, because they know that they are vitally dependent on the manufacturers' ads. If the ads are gone, the journalists' jobs are gone also. So there is another nice little tight loop of reciprocal interests, to the detriment of the overall functioning.

The industry will never change that behavior unless it is forced to do so by legislation. In the earlier years of industrialization, the industry freely continued to produce woodworking machinery that took off the fingers and hands of workers. Steam engines that exploded, scalding and cooking workers in the sweat shops. Transmission belts for machinery that dislodged and cut people in half, just like cheese. It took about 100 years of worker maiming before legislation was finally there to make machinery a little safer. The factory owners themselves wouldn't have changed a bit.

Software nuisance is a problem that doesn't take people's lives, nor does it maim them. But it takes their lifetime: a little minute here, sometimes a few hours there, sometimes a week. The professional ethos of the software engineer and the systems administrator is to put up with this - otherwise, if things were so easy, he might lose his job, because users might be smart enough to edit configuration files themselves. That fear is unjustified, with systems getting more complicated by about an order of magnitude every 3 years.

How much more usable would the Next Interface Builder (NIB) have been if it didn't create a machine-only binary file but an ASCII-readable file constructed with the Leibniz TLSI principle? (See below.)

You could use a pattern of widgets that you had developed once and add other pieces with the editor, or exchange them, without going through the mouse menus every time. Using the NIB once or even ten times is nice. But after you have done it ten or more times, you would prefer to be able to mechanize the process. What was a boon in the beginning becomes a millstone around your neck after the tenth repetition.

2.10. Technical and Organizational Infrastructure, Financing

2.10.1. Databases

2.10.1.1. Finding databases with Greek, Latin, Sanskrit etc.

The work dealing with the Onoma-Semephonic Theory can only be continued when a computerized infrastructure of materials is available.

Most important: intergloss translations
Texts and Thesauri of ancient Greek from Homer to Plato.

We must be aware that Greek dialects varied widely. See Kratylos. What is called classical Greek today is a product of Alexandrine scholarship, and is derived mainly from the Athenian dialect. The scholars standardized and organized, but they also edited away many traces and details that are extremely important for the research to be done.

If we can get these tools into the infrastructure, the linguistic and etymological researches proposed here will be a matter of days and weeks, instead of the man-years they would take if done manually. This will be crucial to the progress, since there is hardly a way to prove in the conventional manner what we are proposing: it would take an immense amount of time and life-effort. Ars longa, vita brevis. And it is an entirely senseless waste of time, when it is obvious that with this infrastructure we will get it in a fraction of the time.

2.10.1.2. Access to CD-Rom and Internet Encyclopedias

Access to large computer-based hypermedia encyclopedia databases is essential for the project, in order to include the knowledge of the separate fields.
The Encyclopedia Britannica
McGraw-Hill Encyclopedia of Science and Technology

2.10.1.3. Building up a local database/library

A library structured after the criteria of Aby Warburg
Overview of human symbol use
1000s of titles present in many bibliographies
Necessity for local full-text and hypermedia database (or fast Internet access)
2.10.2. Patenting and copyright
Here is another continuing disaster of humanity.

How much better would it be if about a hundred thousand inventions of untold thousands of inventors had found a market and a use, instead of rotting in the archives or in the safes of corporations who bought the rights just so that no one else could use them?

2.10.2.1. The principle of just stewardship instead of Roman property right

The principle of patenting is based on the right of the owner to use or abuse the property he has the legal title to. This has been derived from Roman property right [41] (->: ROMAN-RIGHT). Now, patents and copyrights extend only over a certain time (usually 20 or 50 years). But during this time the owner can deny anyone else any use that involves commercial distribution and resale. Since ideas and technical methods, as well as software, are not material, they can be copied infinitely with digital methods and distributed over modern communications networks at very low cost. Therefore, there is a great potential for imbalance when ideas are treated with the same legal principles as one treats tangible matter. It gives ample opportunity for pure greed, avarice, rapaciousness, and monopolization.

Of course, the author or the copyright holder deserves to be, and must be, paid for his work. But he must not be allowed to keep other people from using the idea of the work - if there is a societal value in using it. In music, there is a workable solution for re-users accepted in the industry. The same must be implemented for patents and hypermedia copyright. The basic issue is that ideas cannot be individual property, because ideas don't belong to anyone. But as individuals have toiled hard and invested their whole lives in developing them under extremely adverse conditions, there must be a secure way of remunerating them once an idea has come to fruition.

This principle is called the Principle of Stewardship. Just as a good land-owner who has inherited the land from his forefathers will not do anything to let it go to waste, but will improve it to pass it on to his children in good condition, so the good steward of an idea is entitled to fair use of, and profit from, its products.

But it must be possible to appeal to a court if he is not putting the idea to use, keeps it hidden, or the like. This is, of course, a knotty issue. Who guards the guardians? That is the old question. But compared to the problems created by the present system, even a certain measure of chaos under the new principle will by all means be easier to tolerate.

2.10.2.2. The chaos of hypermedia copyrights

Present copyright has a terrifically detrimental effect on the now arising market of hypermedia, which is stifled and monopolized right from the beginning, before humanity and the legislatures even become aware of what is going on. Since no regulation exists, and all the usage rights for any work must be negotiated anew, this leads to complete chaos and an unspeakable gold-rush atmosphere among a few law consortia and legal-rights hoarders, who are already literally controlling the whole industry. The internet and other global communication systems are playing wholesale into the hands of the big players, before the rest of humanity even gets a chance to influence what is going on.

2.10.2.3. The right of fair use and fair distribution

It is imperative that a global right of fair use and fair distribution be implemented, so that those who want to use reproductions of any published work in their multimedia products can do so by paying a fixed percentage of the income to the author or copyright holder.

There must be a difference in treatment between the case of a living author, who deserves to keep stronger control over his work, and the case where the original author is dead and cannot profit from his work any more. In the latter case, the reproduction rights of those who now hold rights to the work must be secondary to the right of humanity to use the work in the general interest. There must not be such a thing as hoarding reproduction rights to works of possible societal value for the sake of speculation!

Since this would cover only works that already exist, no up-front cost ensues for the author, and it is therefore just that the author or copyright holder gets paid out of the net income of the project instead of getting paid up front - up-front payment being a situation that creates an insurmountable financial barrier for educational purposes.
2.10.3. Finances

2.10.3.1. How much is needed?

While the development of a device that helps us put a few new good thoughts into our heads is much cheaper than putting a few brave men on the moon, it is no trifling business. It is an area where Brooks' law of "the mythical man-month" (BROO75) probably applies with a vengeance. The project described here covers a running time of 20 years. Since there is no use pushing the river, we don't need a Saturn-5 approach to develop special hardware for our purposes. The autonomous market and technology dynamics are going exactly where we want them to go [42]. Industry will be only too eager to sell us its newest gadgets as soon as they roll out of the factory. So, the costs to cover will be for infrastructure with off-the-shelf technology, and for personnel. If we start with a production team of ten people, we will have enough to work on honing this team to syn-ergeia for about three years. After that, we may add a few people, but it is doubtful whether more than 20 persons will ever be able to cooperate in a way that is useful for the project. We will call this team the nucleus team or the LLP-NT (team) [43].

So, by this rough rule-of-thumb calculation, we will not surpass $10,000,000 per year operating cost for the first five years. After that, the prototypes of the system should have evolved far enough that there are spin-offs to sell, which will then finance the whole venture by itself.

2.10.3.2. How to get it

Now how to go about getting it?

2.10.3.2.1. Direct financing through service

The most direct way to get financing is doing a service to someone who is willing and able to pay you for it. This is obvious in theoria but not always as straightforward in praxi. The questions are:

How do I find someone for whom I can do a service?
How can I convince him that what I am doing is of service to him?

Since the service will need an investment on the recipient's side and a change of ways and procedures, we have the question:
How can I convince him that it is in his best interest to change a few of his accustomed ways of doing things?
How can I rationalize that the beneficial effects of the service will outweigh the necessary investments?

Humanity being what it always was and forever will be, there will be the additional task of dealing with the interference of those who make their profits within the accustomed system, by exploiting its shortcomings. For many of those, any new and better method will cut into their profits and power. Therefore, they will oppose any change.

2.10.3.3. The education market

The principles developed here have immediate use and application in education. Hypermedia is a tremendous educational tool, but there is little experience yet of what it may be good for. So far, most of the application has been "old wine in new barrels". The concept of syn-aisthetic learning could give a real breakthrough in the learning environment. It may, for example, realize the promises we have heard all the time from such interesting proposals as "superlearning", which didn't quite live up to the expectations.

While the educational field could profit most from the idea, it has very little to offer in terms of ability to pay for it - at least not in the frame of existing government-administrated educational systems. Another situation appears when we take free-market education into the picture: mainly education-at-a-distance, virtual universities, and on-the-job training for large companies and multinational organizations. Here is big money, and the prospect of a big jump in efficiency for those designing the education. After some time, the do-it-yourself business might get into the game. Once a substantial free-market base of multimedia-capable PCs exists, packages can be sold to individuals. But this lies in the future, since mass-marketing involves very heavy up-front investment.

2.10.3.4. Financing the development through spin-offs.

Pay as you go. After five years of initial development, the prototypes of the system should have evolved far enough that there are spin-offs to sell, which will then finance the whole venture by itself. This of course necessitates a reworking of the infrastructure. Obviously, it might be very unwise to waste the precious energies of

2.10.3.5. Strategic Information Systems

Obviously, the most pressing need for superior representation systems, coupled with the most financial potential, is found in those places where immense amounts of information must be processed to steer huge complexes and empires of a technological and organizational nature: namely governments, multinational corporations, the military, the World Bank, the UN, the EU, and other supranational bodies. This is also the most dangerous area to tread.

First, because there is an extreme amount of vested interest solidified in those places. Most of these organizations became as important and powerful as they are by exploiting the conventional system to the hilt.

So there may be a few problems that had better be considered beforehand:

1) Someone up in their ranks may decide that the new thinking methods are too dangerous to be leaked out into the open - that it would be much better if the new methods, together with their creators, disappeared silently - and act accordingly.

If that should not be the deplorable end of the story, the next possible scenario is:

2) Yes, we can use your idea - but no, our lawyers just decided that your idea has been in the philosophy books for 2300 years, and is therefore public domain, because the patent rights expired exactly 2250 years ago. And who is to argue with the legal department of a company which is just sitting there, "a fleet in being", with constant running expenses of about $50,000,000, and just itching to put up a good fight? As countless inventors all over the globe have found out empirically, the ability and readiness of such an organization to do something of the kind is exactly proportional to the amount of money its legal department receives per annum. Those smart folks in the legal dept. must prove to their superiors that they are worth their salaries.

and so on, ad nauseam.



[33] Well, actually it was not quite like this. Every once in a while someone tried to invent something technical and put it on the market. But these people never made it into the history books, and that has a very simple reason. What do you think those great slave-owners did, who were all the while controlling the political system? A mechanical device would have possibly meant competition for their system. You guessed right: they sent a hit squad after him, and the whole thing was done for, "requiescas in pacem", for the rest of antiquity.
[34] Agricola, Georgius. From SOFT-ENCYC.
The German educator, city official, and physician Georgius Agricola (Latinized form of Georg Bauer), b. Mar. 24, 1494, d. Nov. 21, 1555, is best known as the author of De re metallica (1556), a treatise on mining and metallurgy. The treatise was translated into English in 1912 by future U.S. president Herbert Hoover and his wife, Lou Henry Hoover. Agricola studied medicine at Leipzig University. He became a devoted follower of Erasmus, who wrote a foreword to Agricola's first book on mining and metallurgy (1530). While town physician of Joachimsthal (now Jachymov, Czechoslovakia), he became intensely interested in all aspects of the mining and metallurgy industry by which the town thrived and began a 25-year study of the subject, which culminated in his posthumously published masterpiece. The 12-chapter treatise included 292 woodcut illustrations carefully executed by Blasius Weffring. Agricola also wrote a number of works on medicine, geology, mineralogy, politics, and economics.
David Hounshell
Bibliography: Agricola, Georgius, De re metallica, trans. by Herbert Clark Hoover and Lou Henry Hoover (1912; repr. 1950); Dibner, Brian, Agricola on Metals (1958).
[35] Planck's constant, from SOFT-ENCYC
Planck's constant, h, was introduced in 1900 by Max PLANCK in his equation that described BLACKBODY RADIATION. According to his theory, radiation of frequency (designated by the Greek lower-case letter nu) comes in quanta, or packages, with an energy E determined by E = h (nu). In mks units, h is about equal to 6.6261 X (10 to the power of - 34) joule-sec, an extremely small quantity. The constant appears in every description of matter and radiation. Planck's constant measures the size of the quantum effects in the system, which are large if the system is small. The equations for many physical properties can be expanded in powers of h as if it were a variable. Classical physics assumes h is zero, so that quanta can have zero energy and radiation can have any energy whatsoever.
Herbert L. Strauss
See also: ATOMIC CONSTANTS; QUANTUM MECHANICS
[36] The hen of Plato.
[37] The law of great numbers applies strictly: When one million users are forced to needlessly perform repetitive actions on the order of ten minutes a day, then this sums up to 41 hours per user per working year. In total that makes 41 million working hours lost. If we calculate the cumulative cost of each working hour at $100, we have the staggering sum of $4 billion, lost each year to a national economy because of inefficient software. Compared to efficiency effects of this kind, the over-the-counter price of a software package is negligible.
[38] This is possibly the case only with the German-encoded keyboard.
[39] Of course, all the .ini files were invalid too, but those are fortunately text files, so you can edit them with a normal text editor - or you write a shell procedure to catch them all at once, if you are a real smart hack. Of course the whole thing would be entirely unnecessary if you had Unix, because there you don't worry about C: or D:. Also, someone told me that the latest versions of DOS may allow a re-assignment of drive names. I haven't checked into all the possibilities.
[40] This kind of tactic made the books when Rockefeller (or was it someone else?), around the beginning of this century, traded the Chinese his brand-new oil lamps for their old smelly lanterns. The awakening on the side of the poor Chinese was rude when they found out that the new lamps would only burn Rockefeller oil - which was, of course, ten times more expensive than the old oil. And the old lamps were gone, traded in. The computer industry has infinitely refined and magnified that time-honored trick.
[41] Even though the absolute property right of the Romans has been somewhat limited in the development of western civilization, it still holds full sway, to extremely detrimental effect, in other sectors: the right of the land owner to use and abuse, wasting the soil and the life forms on his land for all the future. The terrible heritage of the Roman Empire is still with us - alive, screaming and kicking!
[42] Here we are eternally indebted to Jerry Weinberg for his invention of the Buffalo Bridle: The sure way to make a Buffalo go where you want it to go is that you want it to go where it wants to go! (Slightly modified, from: WEINBERG85)
[43] I swear, I didn't steal this acronym from Bill Gates!
