12. The LPL TLSI Principle: Neuronal Resonance Technology, User Interface
Language, and End User Programming Language
(Preliminary version)
Dr. Andreas Goppold
Postf. 2060, 89010 Ulm, Germany
Tel. ++49 +731 921-6931
Fax: (Goppold:) +731 501-999
http://www.noologie.de/symbol13.htm
12.1. Abstract
The accompanying paper, "Neuronal Resonance and User Interface Technology", deals with the theoretical background and history of NR Technology, with some of its design principles, and with a critical look at current GUI technology and its omissions and defects in the NRT realm. This line of thought is pursued here to explore the application of NRT to constructing "User-Tailored Information Environments".
After fifty years of ever wider penetration throughout all sectors of society, the information technology arising from the merging of computers, media, and telecommunications has reached a level of ubiquity at which it must be considered a societal infrastructure and treated like a public utility, similar to the construction and maintenance of roads, railways, air-traffic lanes, and the other public service facilities on which civilized life depends, extending to access to uncontaminated air, water, and food supplies.
When User Interface Technology is considered a systems design subject in its own right, this implies that it is more than a technical and marketing factor subservient to the capital cycling and utilization processes of the respective industry. The present UIT situation can be viewed as the outcome of a von Neumann game of player coalitions, and of societal power struggles of industry-capital complexes, much like the processes of 18th- and 19th-century industrial history, when monopolies and cartels were forged and dismantled, or became entrenched as part of the societal power structure.
The present paper outlines a strategy for creating a
technological infrastructure supporting "User-Tailored Information
Environments": the Leibniz LPL TLSI Principle. This is based on a VM (or Metacode Machine) principle similar to the Java VM, and it allows the formulation of an alternative model of technological and societal structure by which the requirements for adapting software to local needs can be met.
12.2. Abbreviations
aka also known as
CIA Common Interface ASCII
EUPL End User Programming Language
GUI Graphical User Interface
HW Hardware
LPL Leibniz Programming Language
NR Neuronal Resonance
NRT Neuronal Resonance Technology
OS Operating System
RSI Repetitive Stress / Strain Injury
SW Software
TLSI Token List Subroutine Interpreter
UIL User Interface Language
UIT User Interface Technology
VM Virtual Machine
WIMP Windows - Icons - Mouse - Pointer UIT
12.3. Keywords
Human Factors Design, Time- Memory- Interaction- Flow, Human Memory Bandwidth and Attention Span, Neuronal Resonance Technology, Hypermedia
Browser Technology, User Interface Technology, User Interface Language, End User
Programming Language, Virtual Machine Design
12.4. The political dimensions of technological infrastructure decisions:
Lessons from industrial history
Introductory literature: Anthro, Chandler, Creveld (1999),
Diamond (1976), Eisenstein (1979), Gellner (1993), Giesecke (1991), Goppold
(1984a, 1984b, 1992a-1992c, 1994b, 1995), Innis (1952-1991), Kingdon (1997),
Mumford (1934-1977), Smith (1994), Wittfogel
(1957)
[34].
The admonition of Santayana, "Those who cannot remember the past are condemned to repeat it", applies most acutely to the computer industry, whose different technological and historical phases (Mainframe, Mini, Micro, Network, ...) are characterized by a historical amnesia, with the same errors committed anew in every generation. The hard-earned lessons from former failures reported by the one-time master architect of IBM's mainframe operating systems, F. P. Brooks, in "The Mythical Man-Month" (1975), have gone completely unheeded, as the present-day monster operating systems of the MS Windows® flavor show. Therefore, a very quick commemoration of the past 5000 years of technological innovation (and its sometimes unexpected and unwanted side-effects) is advisable, even if this is not the place to delve fully into it. For this, the above literature references will give an entry point.
Ever since the emergence of the first hydraulic civilizations around 5000 years ago, the fundamental decisions concerning technological infrastructure have exerted a strong, even determining influence on the social structures of the respective societies. Even though the exact details are a matter of academic dispute, their general inter-dependency is universally recognized.
At present, computer technology has seen fifty years of continuous expansion and of ever wider penetration throughout all sectors of society. There is now a new societal technological infrastructure in place: the information technology that has arisen from the merging of computers, media, and telecommunications. Because of its general spread throughout all the industrialized societies, and because those societies are vitally dependent on this infrastructure, the technology needs to be considered a public utility, similar to the construction and maintenance of roads, railways, and air-traffic lanes. And as with all the public service facilities on which civilized life depends, the question of universal access becomes imperative, similar to the fundamental issue of access to uncontaminated air, water, and food supplies.
The issue of "User-Tailored Information Environments" is in many respects a fundamental infrastructure question which extends deeply into the societal "rules and regulations" side: the question of the legal status and the intellectual property rights of the information infrastructures vs. equality of access and fair-usage statutes. Under the present regime, the fixation of intellectual property rights as a capital asset of the SW industry has the direct consequence that many of the information access routes that would be necessary for optimally adapting Information Environments to the user are blocked by trade secrets.
As was already mentioned in the accompanying article ("Neuronal Resonance and User Interface Technology"), the present UIT situation can be viewed as the outcome of a von Neumann game of player coalitions, and of societal power struggles of industry-capital complexes against their competitors, and against the rest of society, much like the processes of 18th- and 19th-century industrial history, when monopolies and cartels were forged and dismantled, or became entrenched as part of the societal power structure. The present Microsoft monopoly lawsuit is just an indicator of the general situation; it had its precedent in the IBM lawsuit of a few years ago, and its forerunners and close parallels were the various telecommunications monopoly breakups, like that of AT&T in the U.S. or the Telecoms in Europe, and before that, the electricity, steel, and oil conglomerates.
The issue of "User-Tailored Information Environments" thus turns into a general question of access routes into the infrastructures of information systems. In order to achieve that aim, one has to solve the trade-secret questions necessarily associated with the know-how and licensing for those persons who adapt and tailor an "Information Environment" and who have to get into the "machinery" of the SW. The principal field of (con-)tension is the conflict of interest between centralization and localization. The "User-Tailoring of Information Environments" can only be done with the local know-how of the user. And a centralized vendor, who would like to direct all SW adaptations from his home office, is in no position to accommodate a multitude of local adaptations and spread his resources thin in this way. Access to the "machinery" of the SW is made difficult or easy by the choice of infrastructure technology. Open systems like LINUX (and most UNIX systems) maintain all their configuration data in text-editor-accessible, ASCII-readable files. A closed system like Windows® tends to hide the configuration files and their structure (system.dat, user.dat, regedit) as much as seems advisable and most profitable from the trade-secret point of view of an aspiring monopolistic vendor.
Another area of concern is the User Interface definitions: the layout of menus, the keyboard bindings, the help and command texts, those aspects of the Information Environment which are covered by the terms UIL and EUPL (Dertouzos (1992), De Souza (1993, 1996)). In present compiler-based SW technology, these definitions are mostly embedded in the compiled module or in the DLLs. Again, this is as much a technical issue as a societal and legal one, since changing these definitions requires access to tools and proprietary information from the vendor, who again likes to keep all this under control. On the other hand, there is no reason in principle why the vendor could not compile a single version for all the different language applications and user specifications, and design it open enough to be field-adaptable by local providers. But this would mean loss of centralized control for the vendor, who has better control over pricing, and over sealing local markets off from each other, when these adaptations cannot be performed locally.
To facilitate the maximum flexibility for "User-Tailored
Information Environments", the presently used compiler technology is not
optimal, and an alternative interface technology will be presented here, called
CIA: The Common Interface ASCII. This has been developed in the Leibniz
LPL system and its further details will be the subject of the next
sections.
12.5. Cognitive Engineering and NRT
The OODA principles of NRT design can be adapted with a few slight variations to UIT. As was pointed out above, they mainly consist in rhythmically close-coupling the interplay between the human cognitive and manipulative factors, the human memory, and the access speed, display speed, and data volume of the information devices. All these factors must be brought into an optimal balance, which is also precarious and easily lost when technological and personal factors change (as, for example, in the change from a naive user to an expert user). The work of Ishii and Veltman [35] shows theoretical and practical approaches to various aspects of this task domain.
12.5.1. The OODA Loop and application in
UIT Cognitive Engineering
12.5.2. A "Magic" Triangle of SW
productivity
12.5.3. UIT and Cognitive
Engineering as Technological Ars Memoriae
NRT supported Cognitive Engineering means enhancement of human
memory. This had its precedents in the Ars Memoriae techniques in Antiquity and
the Renaissance (Yates 1989, 1990). The Renaissance Ars Memoriae masters
(Giordano Bruno, Giulio Camillo, Robert Fludd) used elaborate poly-hierarchic
access schemata for their systems. This intermeshing of several parallel
hierarchic access systems is one of the most important and most pressing needs
for the management of humanity's knowledge bases. It primarily involves time-critical Real-Time tasks:
Building and modifying external auxiliary memory structures
"on the fly" or "as we go".
Real-Time structure compilers
Real-Time Hypertext linkers / automatic link
generation
Real-Time feature / information extraction
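The last of these tasks, automatic link generation, can be sketched roughly as follows: scan text as it is displayed, and turn every term known to an index into a hypertext link on the fly. The index entries and the bracketed link syntax below are invented for this illustration, not taken from the Leibniz system.

```python
# Sketch of on-the-fly hypertext link generation: every occurrence of a name
# known in the index is turned into a link as the text is scanned.
# The index contents and link notation are invented for this example.

import re

index = {"TLSI": "lpl.htm#tlsi", "OODA": "symbol13.htm#ooda"}

def auto_link(text):
    # replace each indexed term with a bracketed link target, as it is read
    pattern = re.compile("|".join(re.escape(k) for k in index))
    return pattern.sub(lambda m: f"[{m.group(0)}|{index[m.group(0)]}]", text)

print(auto_link("The TLSI applies the OODA loop."))
# -> The [TLSI|lpl.htm#tlsi] applies the [OODA|symbol13.htm#ooda] loop.
```

Because the index is just a dictionary, it can itself be built and modified "as we go", which is the point of the Real-Time requirement above.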
12.6. The LPL TLSI principle, technological Ars Memoriae, EUPL, UIL, and
CIA
Introductory literature: Goppold (1992c-1992d, 1993, 1994a,
1996a, 1996b).
http://www.noologie.de/lpl.htm
http://www.noologie.de/lpl11.htm
http://www.noologie.de/symbol.htm
The Leibniz LPL system was developed by A. Goppold between 1984 and 1995, originating as a stand-alone SW development system for industrial real-time control applications. The connection to present-day multimedia systems is evident when their technical nature as a subset of the real-time task-control domain is considered. In the development process, the system grew to about 10,000 functions in 100,000 lines of code, with 6 megabytes of source in 500 source and configuration files. In 1985 it was probably the first SW system to integrate the hypertext principle into the design of the programming language itself. (Other systems had the hypertext principle grafted on, which diminished its efficiency.)
The essential factor of hypertext integration is speed: the ability to access any function or documentation one needs to examine within a few keystrokes, in about 1/10 sec at minimum and 1 sec on average, a performance beyond the "sound barrier" of present-day GUI paradigms because of the slow-down effect of mouse-clicking. The LPL system was perhaps the first and only NR-optimized SW system ever conceived, even if these ideas and principles were unknown to the designer at the time of inception; they came much later, as a result of far-reaching research into the neuronal infrastructures of the human mind. In the course of research into the NR principles, it was found that the methods thus developed had led to an independent re-invention of ancient principles of the Ars Memoriae. The LPL system can be considered a modern-day mnemotechnic machine. Because effects like this are conceptually nonexistent in the technical spectrum of academically acknowledged computer science paradigms, it proved impossible in 15 years of R&D even to publish these results in academic papers, due to the screening effect of a computer science peer-review academic mindset insensitive to temporal factors.
12.6.1. The Leibniz TLSI Virtual
Machine
The basic technical principle of the LPL TLSI is a VM (Virtual Machine, or Metacode Machine) similar to the one used in Java® (and compatible with it). Like SUN's system philosophy, it is the ultimately open system: the TLSI was designed around a minimal kernel of about 50 K bytes, to be completely open and extendable in all ways. Its power rests on the ability to create special-purpose interpreters for special applications on the fly, in the field, and even by the end-user. A similar approach is the popular EMACS editor, which implements its functionality in a special interpreter. The minimal run-time kernel can be linked on top of a C-compatible system. This makes it possible to use the TLSI as a re-vectorable and re-programmable interactive user interface to any existing linkable software (the host system). The minimal kernel alone provides the equivalent of an interactive low-level debug-monitor system for testing and executing any functionality of the host system to which one desires interactive access. Any routine can at any time be called separately from the interactive user shell. By way of its macro programmability, any higher assembly of a set of basic host functions can be constructed on the fly.
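The kernel described above can be sketched in a few lines. This is an illustrative reconstruction of the TLSI idea, not the Leibniz implementation: a dictionary maps names either to native host routines or to token lists, token lists execute by recursive lookup, and the interactive shell reads whitespace-separated tokens. All names and primitives are hypothetical.

```python
# Minimal sketch of a Token List Subroutine Interpreter (TLSI).
# Illustrative only: names and primitives are invented for this example,
# not taken from the Leibniz LPL system itself.

class TLSI:
    def __init__(self):
        self.stack = []          # implicit data buffer shared by all tokens
        self.dictionary = {}     # name -> native function or token list

    def define_native(self, name, fn):
        self.dictionary[name] = fn

    def define_macro(self, name, tokens):
        # a macro is just a list of dictionary names (a token list)
        self.dictionary[name] = list(tokens)

    def execute(self, name):
        entry = self.dictionary[name]
        if callable(entry):
            entry(self.stack)            # native host routine
        else:
            for token in entry:          # token list: recursively nested calls
                self.execute(token)

    def interpret(self, line):
        # the interactive shell: whitespace-separated tokens; numbers are pushed
        for word in line.split():
            if word in self.dictionary:
                self.execute(word)
            else:
                self.stack.append(int(word))

vm = TLSI()
vm.define_native("dup", lambda s: s.append(s[-1]))
vm.define_native("+",   lambda s: s.append(s.pop() + s.pop()))
vm.define_macro("double", ["dup", "+"])        # built on the fly from primitives
vm.define_macro("quadruple", ["double", "double"])

vm.interpret("21 double")        # stack now holds [42]
```

The point of the sketch is that "double" and "quadruple" are created at the interactive shell without any compiler: new program code is just another token list in the dictionary.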
12.6.2. EUPL: User Programming of the
Macro System
The TLSI approach offers a very easy way to achieve a secondary programming facility, or End User Programming Language (EUPL). The developer of the basic SW functionality (the provider) can use standard compiler technology or an authoring system to provide the tool set, which the user (or secondary programmer) can then extend in any direction he deems necessary. All secondary programming can be done in the macro language of the TLSI. The TLSI can thus provide a large functionality to the user without including the original authoring system or the compiler package; the user is also relieved of the need to learn the conventions of the authoring system, and needs only to concentrate on the functionality offered by the specific TLSI interface which the SW provider supplies. This approach allows a comfortable division of expertise and responsibilities among the different groups involved in the authoring process of a SW system. The software engineers need only deal with their compiler tools and SW methodology to provide a rich set of TLSI tools for secondary, or applications, developers to build their systems upon.
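The division of labor described here can be sketched as follows, with the vendor side reduced to a table of native primitives and the end-user side to a one-line, Forth-style colon definition. All function names are hypothetical; the sketch only illustrates that secondary programming needs no compiler.

```python
# Sketch of the EUPL idea: the vendor ships native primitives; the end user
# extends the system in the macro language alone, with no compiler involved.
# All names here are hypothetical, not from the actual Leibniz system.

vendor_primitives = {
    "load_record":  lambda env: env.update(record="raw data"),
    "clean_record": lambda env: env.update(record=env["record"].strip()),
    "print_record": lambda env: env.setdefault("out", []).append(env["record"]),
}

user_macros = {}

def define(source):
    # Forth-style colon definition: ": name word word ... ;"
    words = source.split()
    assert words[0] == ":" and words[-1] == ";"
    user_macros[words[1]] = words[2:-1]

def run(name, env):
    if name in vendor_primitives:
        vendor_primitives[name](env)     # compiled, vendor-supplied code
    else:
        for word in user_macros[name]:   # user macro: a list of names
            run(word, env)

# the end user's whole "program" is one line of macro text:
define(": import load_record clean_record print_record ;")
env = {}
run("import", env)
```

The user never sees the vendor's tooling; the macro text can live in a plain file and be changed in the field at any time.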
12.6.3. UIL: User Interface Language,
Field Configuration, Integrated Help
The TLSI principle allows the construction of a very simple and effective common interface shell on top of different software systems, thus providing a generalized User Interface Language (UIL) that is adaptable to specific user profiles. It makes it possible to create flexible keyboard bindings and window menu layouts. All the functionality of the system is configured in ASCII-readable files that can be changed and re-configured with any text editor, at any time, even while the system is running, without any re-compile or other low-level system interaction. Ideally the whole menu structure of the system resides in a single file, which also gives a transparent access path to the logical structure of the whole system. In the Leibniz Hypertext Software Development System, an integrated hypertext function connects every menu with an associated help text residing in an ASCII-readable text file that can be accessed in hypertext manner.
12.7. CIA: The Common Interface ASCII, Macro Script Languages, forming a link between compilers and WIMP interfaces
The LPL TLSI principle can be used to provide a uniform
programmable interface layer between a set of precompiled modules provided by a
SW vendor, and the final product accessible to the end-user. This layer is also
called the CIA: The Common Interface ASCII.
The approach taken by the Leibniz TLSI is a revival of ancient
script languages: APL (NIAL), MUMPS, SNOBOL, combined with the UNIX shell script
filter and pipe principle. In the earlier days of computing, these script
systems were very popular with their user communities because they supplied very powerful processing paradigms with easy-to-use access interfaces for specific data types: APL (NIAL) for numeric and character arrays, MUMPS for string and database handling, SNOBOL for string processing. The single-character operator command structures of APL and MUMPS were in fact a direct translation of the underlying byte-code machine, glorified as a user interface, which gave these languages their distinctive and elegant mathematical-looking write-only flavor that was equally cherished by their adherents and abhorred by their detractors. On the upside of the tradeoff, these were also the most powerful programming languages ever created, and it was not only possible but very easy to write a data-processing solution in five lines that would need five pages of contemporary C or Pascal code, object-oriented or not. In
textual information applications, the powerful string capabilities of SNOBOL are
needed for the complex context-dependent string and text search, while an
approach derived from the matrix handling model of APL is to be used for the
more general data type of graph traversal and search strategies. While in matrix processing it is of no concern in which order the elements are processed, this is very much of concern in tree traversal. Only the combination of string and
graph data model approaches can yield a truly versatile and powerful toolset.
These strategies can, of course, be implemented in any suitable interpreter
model, be it LISP, Smalltalk, or a variant of BASIC. In any case, the data
processing infrastructure, the libraries and data structure machines must have
the minimum processing power, regardless of the syntactic icing applied on top.
The approach taken with the TLSI model was chosen for reasons of flexibility:
The TLSI is dynamically user-modifiable, imposing no syntactic structure on the
solution space. Since the TLSI is a general implementation of a bytecode
machine, it can be, and has been, used to implement other languages, like BASIC,
LISP, MUMPS, or APL.
The fine-tuning of UIT necessitates very powerful, field-adaptable interactive script, query, and macro languages. These requirements can be satisfied neither by current compiler-based technology nor by WIMP interfaces. Standard compiler technology like C (and compiled Java) programming is unusable for field-adapting search and evaluation script strategies. WIMP is also problematic, because of the one-shot character of a WIMP interaction sequence. A script derivation of WIMP interactions is possible, and moderately usable, as in Apple's user interface transaction-logging facility.
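Such a script derivation can be sketched as follows: each one-shot WIMP interaction is recorded as a token, so the interaction sequence becomes a replayable, editable script. The event names and handlers below are invented for this illustration.

```python
# Sketch: turning one-shot WIMP interactions into a replayable script.
# The event vocabulary and handlers are invented for this illustration.

log = []          # the derived script: a list of recorded interaction tokens

handlers = {
    "open":  lambda doc, state: state.setdefault("open", []).append(doc),
    "close": lambda doc, state: state["open"].remove(doc),
}

def interact(event, arg, state):
    # perform the interaction and record it as a script token at the same time
    handlers[event](arg, state)
    log.append((event, arg))

def replay(script, state):
    # the recorded tokens can be edited and re-run like any other script
    for event, arg in script:
        handlers[event](arg, state)

state = {}
interact("open", "report.txt", state)
interact("open", "notes.txt", state)
interact("close", "notes.txt", state)

fresh = {}
replay(log, fresh)       # the one-shot session, now repeatable at will
```

The weakness the text points to remains visible in the sketch: the script is a flat replay of clicks, with none of the composability of a real macro language.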
12.7.1. APL, MUMPS, generic data types,
and bytecode machines
APL and MUMPS are systems from the early days of computing which were developed on the old mainframes or minicomputers that typically had at most 32 K words of RAM address space but provided fast hard-disk access; on these machines they implemented extremely powerful scripting languages dealing with a wide range of data types. This model can be used to break the UNIX cr-file metaphor bottleneck. Each of these systems implemented a highly complex file metaphor, combined with an extremely powerful programming mechanism that was essentially a bytecode machine operating on this data type. Even though the programmers of these systems have by now mostly retired or moved on to newer technology, when two or three of these old-timers meet, you can still hear them talk about their old systems somewhat like old warriors talking about their old tanks and fighter planes. This is more than the pure techno-romance of a by-gone age, more than "Real Programmers don't use Pascal" mythology. There is truth to the story. These systems were the most powerful programming environments ever invented, only to be watered down by later generations of SW products. Of course, they yielded their power only in the hands of experts, and required quite some training. But where else, in heaven or on earth, could you write a fully functional program in at most one 80x25 screenful of code, as was routine in these languages?
12.7.2. A missing link in the SW
universe: powerful scripting languages
The computer industry has a natural tendency to supply those behemoth WIMP systems to the whole world, for various reasons. First, the constant, dependable need for new hardware created by the constantly increasing computing requirements of the SW; that benefits the shareholder value of the industry like nothing else (silicon snake oil, as Clifford Stoll called it in a different context). Guess who are the same financial conglomerates investing in and profiting from both sectors of the industry. Second, the "one-size-fits-all" serves perfectly to mask the prime requirement that SW should be more freely user-programmable and user-adaptable, and serves well to keep all the trade secrets and technological leverage within the folds of "The Company" (we all know which one that was twenty years ago, and which one it is now). The names have changed, the system stays the same. (In French: plus ça change, plus c'est la même chose.) Another adage from the computer industry grapevine, slightly modified from the better-known version: if transportation technology had gone in the same direction the software industry has taken us, we would all now drive battleships and armored cruisers to work, to the supermarket, and to the football stadium. What the industry has lost with the by-gone era of APL and MUMPS are the most powerful scripting languages ever invented. My guess why we don't have them any more: they were too good for the masses. But there is a recipe for getting them back. It is called "the surprise unplanned re-use of software".
12.7.3. Using the Java VM for hitherto
unintended purposes
The value of computer systems will not be
determined by how well they can be used in the applications they were designed
for, but how easily they can be fit to cases that were never thought
of.
(Alan Kay, Scientific American/Spektrum, Oct. 1984)
When Alan Kay made this statement in 1984, no one except maybe he himself could have foreseen the rise of Hypermedia computing and the WWW ten years later. In effect, the criterion he stated can be called the hallmark of successful software systems. SUN's Java development may be cited as a case in point, because it was salvaged from the limbo of an already aborted software development effort.
12.7.4. The hidden secret of the Java VM:
the TLSI
Deep down, in "the nooks of granny", in the hidden recesses of the Java VM, lies a grand secret: the most powerful and the most primitive programming language ever invented. Sounds paradoxical? Only for the present-day "bigger is better because of featuritis" techno-hype. As every LISP programmer will certify, LISP is at once the oldest, the simplest, and the most powerful programming language ever invented (at least for the LISP aficionado, and he has an argument there) [36].
The Java VM is a bytecode machine, one of the oldest virtual machine technologies known. And since Sun made Java an open specification, one can do a little tweaking here and there, still keep the same Java compatibility, and yet use the VM for entirely different, unintended purposes.
The Java VM can easily be converted into a TLSI. This is simply the capability to build token lists that make recursively nesting subroutine calls to any string of native code (e.g. Java VM code), and to treat these token lists as infinitely extendable program code. Each token list is associated with a name in a dictionary, which makes it callable from a user interface similar to the UNIX command shell, and this in all is an utterly simple, utterly extendable, macro-programmable alphanumeric UIT.
The power of the TLSI principle derives from its utter simplicity and from its implicit data types. The main one is the stack data buffer, which is transparent to the program nesting and incurs no overhead for (stack-framed) parameter passing. This stack buffer model can easily be generalized to any data type required, be it floating point, strings (then we have SNOBOL), arrays (then we have APL), or complexes of these (in which case we have MUMPS), and, if we want, BLOBs or any other data type required. With present-day vast address spaces, it is quite easy to accommodate combined APL, MUMPS, SNOBOL and other capacities in one DP model. The power of generic data types stems from the fact that they are implicitly provided by the operators; they are in a sense the complete inverse of the OO data type philosophy.
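The generalization of the stack buffer to implicit generic data types can be illustrated as follows: the operators, not the declarations, decide how the stack contents are treated. A string operator gives a SNOBOL flavor, an array operator an APL flavor, and a type-blind operator like "dup" works on anything. All names here are invented for the sketch.

```python
# Sketch of implicit generic data types in the TLSI stack model: the operators
# carry the type; the stack itself declares nothing. Names are invented.

stack = []

def s_concat():
    # string operator (SNOBOL-flavored): join the top two strings
    b, a = stack.pop(), stack.pop()
    stack.append(a + b)

def a_add():
    # array operator (APL-flavored): element-wise sum of the top two lists
    b, a = stack.pop(), stack.pop()
    stack.append([x + y for x, y in zip(a, b)])

def dup():
    # type-blind operator: works on whatever the top of the stack is
    stack.append(stack[-1])

stack.extend(["NR", "T"])
s_concat()                       # stack: ["NRT"]
stack.extend([[1, 2], [3, 4]])
a_add()                          # stack: ["NRT", [4, 6]]
dup()                            # stack: ["NRT", [4, 6], [4, 6]]
```

Note the inversion of the OO philosophy the text describes: no object declares its operations; instead each operator implicitly assumes the data type it consumes.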
The TLSI outlined here adds another twist to the many unexpected uses to which the very versatile byte-code VM principle employed by Java can be applied: uses of which its erstwhile creators would have been very hard put to think, and which, had someone suggested them, they would surely have protested most vigorously.
12.7.5. The Common Interface ASCII power
shell principle
One main drawback of the keyboard interface is that each SW vendor typically designs a completely idiosyncratic private command-line parameter or hotkey scheme for his programs [37]. This is often done purely for marketing reasons, so as to give no occasion for confusing the "look-and-feel" with that of a competing program, but in effect it leads to unlimited user obfuscation and confusion. Notorious are the powerful, cryptic, and unforgiving command-line switches of UNIX commands. On the upside, UNIX provided an almost miraculously powerful and streamlined way to build higher-level functionality from simple components (alas, only for experts who knew the ins and outs of the various shell languages). This is what I call the idea of the Common Interface ASCII: a simple, extendable, interactive, user-controlled prototyping and productivity shell. Alas, this idea, and these shell languages, were never improved upon UIT-wise, and were not taken up to the potential of more modern UIT, because not much profit was to be made from them: the old-time power users would never pay for a new shell, since they knew the good old Bourne and C shells inside out, and the new users never bothered to learn them at all, unless they were promoted to sysadmin and had to bite the bullet. So there is Perl, which combines much of the power of the older systems, but it has lost some of the flexibility and has added essentially nothing new of the flavor. Some essentials are missing.
In order to make keyboard / command line interfaces more widely useful, a standard method would have to be implemented for custom-tailoring the user interfaces. This facility should be present in all programs co-residing under the same OS in one user-space, or for an organization employing a range of SW products in a coherent work environment that should have a unified "look-and-feel". In the present state of the industry, such coherence can only sometimes (and definitely not always) be achieved by buying all SW from the same vendor. But it can be achieved through a standardized UI convention, which should be as commonplace as the OS interface conventions realized in the open-system UNIX implementations. Unfortunately, the pressure for UI conventions has never been as strong as that for the OS interface. This standardized interface is called the "Common Interface ASCII" (CIA). A CIA can easily be constructed with a TLSI, which can be linked on top of any existing SW product. It provides a uniform, transparent method to build any kind of menus and keyboard bindings, and it resides entirely outside the compiled binary code module, in its own token-code region, where it can be exchanged in the field, using plain text files for the bindings. To show what such a file can look like, an example is given from the Leibniz TLSI system. The syntax and the order of entries are entirely a matter of convention. In the present example, the order is:
1) $.user_input (function name)
2) 0 I (key bindings)
3) "user input string " (menu entry)
A sample TLSI menu configuration file:
LD-VX[ VD-A1
$.user_input    0 I 0 "user input string "
$.get           5 G 0 "get string with len from adr "
$.get_cnt      12 A 0 "get counted string from adr"
$.put           6 P 0 "put string to adr"
$.put_cnt       6 U 0 "put counted string to adr"
$.dup_top       0 D 0 "duplicate top string"
$.mov_2nd       0 M 0 "xchg top 2 strings"
$.copy_2nd      0 C 0 "copy 2nd string to top"
$.copy_#        0 O 0 "copy nth string to top of buffer"
$.mov_#         4 T 0 "move nth string to top of buffer"
$.mov_bot       0 B 0 "move bottom string to top"
$.push_#x      16 R 0 "move/del to nth_pos top string"
$.del_top       0 X 0 "delete top string"
$.del_#strn     9 N 0 "delete n strings"
$.len           0 L 0 "length of top string "
END;
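A reader for such a configuration file is correspondingly simple. The sketch below assumes, based on the entry order stated in the text, that each line carries a function name, key-binding fields (with the single letter taken as the hotkey), and a quoted menu entry; that field interpretation is an assumption, not the documented Leibniz format. The point is that the whole binding table can be rebuilt from a plain ASCII file at any time, with any text editor.

```python
# Sketch of a reader for a menu configuration file in the format of the
# sample above. The field interpretation (name, binding numbers, hotkey
# letter, quoted menu text) is an assumption for this illustration.

import re

SAMPLE = '''
$.user_input 0 I 0 "user input string "
$.dup_top 0 D 0 "duplicate top string"
$.del_top 0 X 0 "delete top string"
END;
'''

ENTRY = re.compile(r'(\$\.[\w#]+)\s+(\d+)\s+(\w)\s+(\d+)\s+"([^"]*)"')

def load_bindings(text):
    # map each hotkey letter to (function name, menu text); since the whole
    # table lives in a plain ASCII file, it can be reloaded while running
    bindings = {}
    for name, n1, key, n2, menu_text in ENTRY.findall(text):
        bindings[key] = (name, menu_text)
    return bindings

menu = load_bindings(SAMPLE)
```

Re-binding a key or renaming a menu entry is then a one-line edit in the file followed by a reload; no re-compile or vendor tooling is involved.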
12.7.6. Bridging the gap between UIL and
EUPL
In more general terms, the TLSI is the simplest solution for bridging the gap between UIL and EUPL [38], which is the main technical problem in making WIMPs user-programmable and user-adaptable. If this has not been provided for in the initial design, it is extremely difficult in practice to retro-engineer a window menu system to be user-state-sensitive, and to extract the tokens from the menu options and convert them into program steps, so that they can drive a script. Scripting languages are of no great use if they do not form a common bridge between and over many applications, as we had in the UNIX system. If they are treated as private intellectual property, capitalized, licensed, and encrypted, then there is not much power left in them for the user. The copyright lawsuits over the LOTUS and DBASE script languages were a major retrograde development for standardization. Technically, the implicit allocation of data buffers in the TLSI is a great advantage for UIL and EUPL.
12.7.7. Factor Ten, the untapped
potential of Flow State
The untapped potential of fast and high-powered interactive script languages lies in an effect that has received only sporadic attention in the history of computing, and was probably lost as often as it was rediscovered. Csikszentmihalyi has called it the "Flow state". There was also extensive research on this effect during the development of the Leibniz system, but there appears to have been little general research. So it remains one of the best-guarded secrets of the computing industry.
There exists a lot of lore about what this effect is and how it is caused, but not much hard data. From the empirical research some determinants are known: human short-term memory enjoys a variant of the "flicker-fusion effect". This effect is what makes it possible for us to perceive the quickly exchanged still frames of a movie as a motion flow [39]. It is a time factor of about 1/20 sec., and in other psychological research it has also been termed "the cogent moment". It is the minimum time span for anything to be experienced as a discernible event, but it seems that around that time span, things do not simply fuse into an indiscernible blur. What exactly happens at that frontier of human awareness is quite unexplained. Hypnotic effects have been reported with strobe lights flashing at that frequency, etc. But all that serves more to dull the creative potential than to increase it. The flow effect, by contrast, indicates something entirely different.
We all know that short-term memory holds "seven chunks plus or minus two". This is not much. But if we apply the Flow effect, we can experience an impressive expansion of short-term memory, to almost panoramic scope. In this way we could enjoy a panoramic overview of complex relationships even on the old 80x25 character screens, when we let them succeed each other fast enough. On top of that, as Csikszentmihalyi found out, it strangely makes people feel happy, putting them in a state of exuberance and lucid trance. Unfortunately, such an observation is very difficult to verify by the stringent academic standards of peer-reviewed scientific papers.
For whatever reasons, the computer industry has gone all the way in the other direction, with powerful graphics that could also give us a panoramic overview; but as everyone in the industry knows, windows and icons will always clutter the screen and lead to more confusion, not more overview. Perhaps the industry forgot some essential lessons. That is what I mean: the industry went the wrong way, wholesale, because it was more profitable to build and sell more powerful machinery than to do some hard thinking and heed the advice of some old hands.
12.8. The history of the LPL project and some personal views from fifteen years of experience with a self-contained standalone NR UIT
More literature: Goppold (www)
Anyone who has been around the APL or MUMPS communities (some old-timers have) could always hear those wondrous stories of one lone expert programmer writing a whole computing solution for a tricky DP problem in one day [40]. I will give a different example from my own practice and experience. Between 1985 and about 1995, I programmed what was probably (one of) the largest self-contained standalone SW systems ever produced, the Leibniz system. And I did not produce it as my main job, but in my spare time, besides my industrial consulting work. Originally intended for standalone hardware, embedded systems, and real-time industrial machinery control (the most general interpretation of multi-media), it contains its own operating system functions, a software prototyping and testing shell, a fully self-contained windowing system, a hypertext-integrated full-screen ASCII editor, and a hypertext database system [41]. In the last version of 1995, the Leibniz system contained about 10,000 functions in 100,000 lines of code, with 6 megabytes of source in 500 source and configuration files. The system contained, as a necessity, an embedded information management and retrieval system, which by the sheer size of the library is essential. Such a huge function library is impossible to memorize (especially concerning the interface conventions), and printed paper documentation is much too cumbersome, besides being hopelessly behind the current version status. A 100,000-line program listing would occupy a hefty 2,000 pages of A4 paper. Listing only the function names fills a book of 200 pages. A library of this size exceeds even large OO projects. Therefore the optimal structuring and maximal speed of the retrieval environment are crucial. The available tools of the system allow access to any of the 10,000 functions, even over multi-level hypertext access, in a matter of at most 5 to 10 seconds. A special interface shell allows calling the very efficient GREP tool from within the editor with one keystroke, searching the 6 MB of source in about 10 to 30 seconds and delivering its output in a table that is routed to the editor. The output can then be fed into the hypertext system.
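The editor/GREP/hypertext pipeline described above can be sketched as a search over a source tree that yields a table of (file, line number, line) records, each of which is an addressable hypertext anchor. This is an illustrative reconstruction under assumed names; the in-memory "source tree" stands in for the 500 source files, and the actual Leibniz shell invoked the external GREP tool.

```python
# Hypothetical sketch of the GREP-to-hypertext pipeline: search a source
# tree for a pattern and return hits as (file, lineno, line) records that
# an editor could route into a hypertext system as link anchors.
import re

source_tree = {
    "str.stk": '$.dup_top 0 D 0 "duplicate top string"',
    "str2.stk": '$.del_top 0 X 0 "delete top string"\n$.len 0 L 0 "length"',
}

def grep(pattern, tree):
    """Return (file, lineno, line) for every source line matching pattern."""
    rx = re.compile(pattern)
    hits = []
    for fname, text in sorted(tree.items()):
        for lineno, line in enumerate(text.splitlines(), start=1):
            if rx.search(line):
                hits.append((fname, lineno, line))
    return hits

# each hit is an addressable anchor: file name plus line number
table = grep(r"top string", source_tree)
```

The point of routing the table back into the editor is that each row stays live: following a row jumps to the source location, which is the hypertext integration described in the text.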
I can say from my experience that a system this big could never have been constructed, given the very limited resources available, with conventional compiler-based SW and UI technology, and by no means with a mouse-driven WIMP interface.
12.9. References
ACM-CHI:
http://www.acm.org/sigchi/
(URL)
http://www.acm.org/sigchi/chi95/proceedings/
(URL)
http://www.acm.org/sigchi/chi96/proceedings/
(URL)
http://www.acm.org/sigchi/chi97/proceedings/
(URL)
Amiet, P.: Les elamites inventaient l'ecriture, Archeologia 12, p. 16-23 (1966)
Anthro: http://history.evansville.net/prehist.html
(URL)
http://www.massey.ac.nz/~ALock/hbook/frontis.htm
(URL)
http://www.geocities.com/Athens/Acropolis/5579/TA2.html
(URL)
Bednarik, R.G.: On the scientific study of paleoart, in: Semiotica, p. 141-168 (1994)
Bernard, J.: The hand and the mind, Parabola, New York, p. 14-17, Aug. (1985)
Birdwhistell, R.: Kinesics and context, Univ. of Pennsylvania Press, Philadelphia (1970)
Brooks, F. P.: The Mythical Man-Month, Addison-Wesley, Reading, Mass. (1975)
Bücher, K.: Arbeit und Rhythmus, Reinicke, Leipzig (1924)
Businessweek.
http://www.businessweek.com/1998/42/b3600052.htm
(URL)
CACM, Dec. 1992: Information Filtering
CACM, Jul. 1994: Intelligent Agents
Calvin, W. H.: The cerebral symphony, Bantam, New York (1989)
http://www.WilliamCalvin.com/bk4/bk4.htm
(URL)
Calvin, W. H.: The throwing madonna, Bantam, New York
(1991)
http://www.WilliamCalvin.com/bk2/bk2.htm
(URL)
Calvin, W. H.: The cerebral code: Thinking a thought in the
mosaics of the Mind, MIT Press, Cambridge
(1996a)
http://www.WilliamCalvin.com/bk9.html
(URL)
Cassirer, E.: Philosophie der symbolischen Formen, I, II, III, Bruno Cassirer, Oxford (1954)
Cassirer, E.: Wesen und Wirkung des Symbolbegriffs, Wiss.
Buchges., Darmstadt (1959)
Cassirer, E.: Was ist der Mensch?, Kohlhammer, Stuttgart
(1960)
Cassirer, E.: Symbol, Myth, and Culture, Yale Univ. Press, New
Haven (1979)
Cassirer, E.: Symbol, Technik, Sprache, Meiner, Hamburg
(1985)
Cassirer, E.: Zur Logik der Kulturwissenschaften, Wiss.
Buchges., Darmstadt (1994)
Chandler, D. (www): Media Theory Web Site.
http://www.aber.ac.uk/~dgc/influ05.html
(URL)
http://www.aber.ac.uk/~dgc/about.html
(URL)
Chernoff, J.M.: Rhythmen der Gemeinschaft, Trickster, München (1994)
Cohen, M.: La grande invention de l'écriture et son évolution, Imprimerie nationale, Paris (1958)
Common: Common Elements in Today's Graphical User Interfaces:
INTERCHI '93, ACM, p. 470-473. (1993)
Creveld, Martin van: Aufstieg und Untergang des Staates,
Gerling Akademie Verlag, München (1999)
Csikszentmihalyi, Mihaly. Flow: The Psychology of Optimal
Experience. Harper Perennial, New York (1990)
Daniels, Wright (eds.): The world's writing systems, Oxford Univ. Press, New York (1996)
DeLanda, Manuel: War in the Age of Intelligent Machines,
Swerve, New York (1991)
Derra de Moroda, F.: The dance library, Wölfle, München (1982)
Dertouzos, M.: The user interface is the language. in Meyers,
B. (ed.) Languages for developing user interfaces. Boston, Jones and Bartlett,
pp. 21-30, (1992)
De Souza, Clarisse Sieckenius: The semiotic engineering of
user interface languages, Int'l Jrnl of Man-Machine Studies. No 39, pp. 753-773,
(1993)
De Souza, Clarisse Sieckenius: The semiotic engineering of
concreteness and abstractness: From user interface language to end user
programming languages, Dagstuhl Seminar on Informatics and semiotics, February
19-23, (1996)
Diamond, S.: Kritik der Zivilisation, Campus, Frankfurt/M (1976)
Diringer, D.: The alphabet: a key to the history of mankind, Philosophical Library, New York (1948), repr. 1953
Diringer, D.: The hand-produced book, Hutchinson's, London (1953)
Driver, G.R.: Semitic writing, Oxford Univ. Press, London (1948)
Eisenstein, Elizabeth: The Printing Press as an Agent of
Change: Communications and Cultural Transformations in Early-modern Europe, 2
Vols., Cambridge (1979)
Engelbart, D. http://www.bootstrap.org/biblio.htm
(URL)
Ferrill, A.: The origins of war: from the Stone Age to Alexander the Great, Thames and Hudson, London (1985)
Feyerabend, P.: Against Method, Humanities Press, London (1975)
Feyerabend, P.: Farewell to reason, Verso, London
(1993)
Foucault, M.: Überwachen und Strafen (1969)
Franko, M.: Dance as Text, Cambridge Univ. Press, Cambridge (1993)
Gadamer, H.G. (ed.): Um die Begriffswelt der Vorsokratiker, Wissenschaftliche Buchgesellschaft, Darmstadt (1989)
Gehlen, A.: Urmensch und Spätkultur, Frankfurt/Main (1964)
Gelb, I.J.: A Study of Writing: The Foundation of Grammatology, Univ. of Chicago Press, Chicago/London (1952)
Gellner, E.: Pflug, Schwert und Buch, DTV, München
(1993)
engl: Plough, Sword, and Book, Collins Harvill, London 1988
Giesecke, Michael: Der Buchdruck in der frühen Neuzeit,
Suhrkamp, Frankfurt/M (1991)
Goppold, A.: Memosys: WWW-Site
(www)
http://www.noologie.de/
(URL)
Goppold, A.: IBM oder PC ? Erschienen unter dem Titel: "Folgen
eines Erfolgs" in der Microcomputerwelt, April 1984, p.20-24, Mai, p.20-21,
Juni, p.13-14
(1984a)
http://www.noologie.de/lpl02.htm
(URL)
Goppold, A.: Das Paradigma der Interaktiven Programmierung,
Computerwoche, 24.8. und 31.8.(1984b)
Goppold, A.: Die Computer-Industrie und welthistorische
Entwicklungen, Projekt Leonardo-Leibniz, 10.09.1992
(1992a)
http://www.noologie.de/lpl03.htm
(URL)
Goppold, A.: Ansätze zu einer humanistisch orientierten
Computer-Technologie, Projekt Leonardo-Leibniz, 01.10.1992
(1992b)
http://www.noologie.de/lpl04.htm
(URL)
Goppold, A.: MISC - Metacoded Instruction Set Computers,
Projekt Leonardo-Leibniz, 01.11.1992
(1992c)
http://www.noologie.de/lpl05.htm
(URL)
Goppold, A.: Leibniz. Ein skalierbares objektorientiertes
Software-Entwicklungssystem für Echtzeitanwendungen, Echtzeit '92,
Stuttgart
(1992d)
http://www.noologie.de/lpl10.htm
(URL)
Goppold, A.: Die Inter-Aktor Shell ACsh, GUUG-Tagung Wiesbaden
14.-16.9. 1993, 95-100, Network GmbH, Hagenburg, (1993)
http://www.noologie.de/lpl09.htm
(URL)
Goppold, A.: The Symbolator Project: Multimedia Systems for
Envisioning Dynamic Mental Images, forthcoming
(xxxx)
http://www.noologie.de/symbol.htm
(URL)
Goppold, A.: Leibniz - Ein hypertextbasiertes Softwaresystem,
Projekt Leonardo-Leibniz, 24.10.
(1994a)
http://www.noologie.de/lpl10.htm
(URL)
Goppold, A.: Lingua Logica Leibnitiana. Ein computerbasiertes
Schriftsystem als Fortführung von Ideen der Characteristica Universalis von
Leibniz
Kongress: Leibniz und Europa, Leibniz-Gesellschaft, S. 276-283,
Hannover, 18.-23. Juli
(1994b)
http://www.noologie.de/lpl12.htm
(URL)
Goppold, A.: Schrift, Technologien, Macht und das
nach-alphabetische Zeitalter, Projekt Leonardo-Leibniz
(1995)
http://www.noologie.de/lpl13.htm
(URL)
Goppold, A.: The Leibniz TLSI. A secondary macro programming
interface and universal ASCII User Interface Shell for Hypermedia, CALISCE '96,
Donostia-San Sebastian, Spain, 29-31 July
(1996a)
http://www.noologie.de/symbol05.htm
(URL)
Goppold, A.: Data Processing Infrastructures for Knowledge
Filtering. The TLSI Approach. PAKM: First International Conference on Practical
Aspects of Knowledge Management, Basel, Oct 30-31
(1996b)
http://www.noologie.de/symbol06.htm
(URL)
Goppold, A.: Polycontexturality, Society, and the Distribution
of Subjectivity, Günther-Symposion Klagenfurt, 14.12.-15.12.
(1997)
http://www.noologie.de/poly.htm
(URL)
Goppold, A.: Projekt Multimedia-Diskurs
(1998)
http://www.noologie.de/diskur.htm
(URL)
Goppold, A.: The Ethics of Terminology and the new Academic
Feudalism, Proceedings of TKE '99, 23-27.8.1999, p. 771-780, TermNet, Vienna
(1999)
http://www.noologie.de/symbol09.htm
(URL)
Goppold, A.: Balanced Phi-Trees: The Hierarchy and Histio-logy
of Noo-logy, ISKO '99, Hamburg 23.-25.9.1999,
(1999b)
http://www.noologie.de/isko1.htm
(URL)
Goppold, A.: Hypertext as a practical method for balancing the
Hierarchy and Histio-logy of Noo-logy, ISKO '99, Hamburg 23.-25.9.1999,
(1999c)
http://www.noologie.de/isko2.htm
(URL)
Goppold, A.: Design und Zeit: Kultur im Spannungsfeld von
Entropie, Transmission, und Gestaltung, Dissertation, Univ. Wuppertal, 1999,
forthcoming,
(1999d)
http://www.bib.uni-wuppertal.de/elpub/fb05/diss1999/goppold/
(URL)
http://www.noologie.de/desn.htm
(URL)
Goppold, A.: Spatio-Temporal Perspectives: A new way for
cognitive enhancement, HCI International '99, München, August 22-27,
(1999e)
http://www.noologie.de/symbol10.htm
(URL)
Goppold, A.: Time Factors in Interface Design for Augmenting
Human Intellect,
HCI International '99, München, August 22-27,
(1999f)
http://www.noologie.de/symbol11.htm
(URL)
Goppold, A.: Neuronal Pattern Mechanisms and the Semiotic
Base, "Sign Processes in Complex Systems", 7th International Congress of the
IASS-AIS, Dresden, Oct. 3-6, w.e.b. Universitätsverlag Dresden, Bergstr.
78, 01069 Dresden, (1999g)
http://www.noologie.de/symbol16.htm
(URL)
Goppold, A.: Neuronal Resonance Fields, Aoidoi, and Sign
Processes, "Sign Processes in Complex Systems", 7th International Congress of
the IASS-AIS, Dresden, Oct. 3-6, w.e.b. Universitätsverlag Dresden,
Bergstr. 78, 01069 Dresden, (1999h)
http://www.noologie.de/symbol17.htm
(URL)
Goppold, A.: Time, Anticipation, and Pattern Processors
(2000a)
http://www.noologie.de/symbol08.htm
(URL)
Goppold, A.: Neuronal Resonance and User Interface Technology
(2000b)
http://www.noologie.de/symbol12.htm
(URL)
Goppold, A.: The LPL TLSI Principle: Neuronal Resonance
Technology, User Interface Language, and End User Programming Language
(2000c)
http://www.noologie.de/symbol13.htm
(URL)
Günther, G.: Beiträge zur Grundlegung einer
operationsfähigen Dialektik, Meiner, Hamburg (1976)
Band 1: Metakritik
der Logik - nicht-Aristotelische Logik Reflexion - Stellenwert-Theorie -
Dialektik Cybernetic Ontology - Morphogrammatik - Transklassische
Maschinentheorie
Haarmann, H.: Language in Its Cultural Embedding, de Gruyter, Berlin (1990)
Haarmann, H.: Universalgeschichte der Schrift, Campus,
(1992a)
Haarmann, H.: Geschichtsschreibung der Semiotik, in: Posner,
p. 668-709 (1997)
Halang, W.: Zum unterentwickelten Zeitbegriff der Informatik.
Physik und Informatik. Springer, 30-40, Berlin (1992)
Hanna, J.L.: To Dance is Human, Univ. of Texas Press, Austin (1979)
Hardin, Garrett: Filters against folly, Penguin, New York
(1985)
Heidegger, M.: Heraklit, Klostermann, Frankfurt/M
(1970)
Heidegger, M.: Was heisst Denken?, Niemeyer, Tübingen
(1971)
Heidegger, M.: Zur Sache des Denkens, Niemeyer, Tübingen
(1976a)
Heidegger, M.: Wegmarken, Klostermann, Frankfurt/M
(1976b)
Heidegger, M.: Sein und Zeit, Klostermann, Frankfurt/M
(1977a)
Heidegger, M.: Holzwege, Klostermann, Frankfurt/M
(1977b)
Heraklit: Fragmente, übers. Bruno Snell, Heimeran, München (1976)
Heuser, H.: Als die Götter lachen lernten, Piper, München (1992)
Innis, H. A.: Changing concepts of time, Univ. of Toronto Press, Toronto (1952)
Innis, H. A.: Empire and communications, Univ. of Toronto
press, Toronto (1972)
first printing 1950
Innis, H. A.: The bias of communication, Univ. of Toronto
press, Toronto (1991)
first printing 1951
Ishii, Hiroshi (MIT): Tangible Bits: Towards Seamless
Interfaces between People, Bits, and Atoms, Keynote Speech, HCI International
'99, (1999a)
Fraunhofer IAO, Nobelstrasse 12, D-70569 Stuttgart, Germany,
Phone: +49 711 970 2131, Fax: +49 711 970 2300, Email: HCI99@iao.fhg.de
http://hci99.iao.fhg.de
(URL)
Ishii, Hiroshi (MIT): Im Rausch der Sinne, Süddeutsche
Zeitung, 7.9. (1999b),
S. V2/10, Article by: Hilde-Josephine Post
Jeschke, C.: Tanzschriften. Ihre Geschichte und Methode, Dissertation LMU München, Comes, Bad Reichenhall (1983)
Karn, K. S.; Perry, T. J.; Krolczyk, M. J.: Testing for Power
Usability. SIGCHI Bulletin, 29 (4), Oct , p. 63-67, (1997)
Kingdon, Jonathan: Und der Mensch schuf sich selbst, Insel,
Frankfurt (1997)
Klages, L.: Der Geist als Widersacher der Seele, Bouvier, Bonn
(1981)
Kuhn, Thomas S.: The Structure of Scientific Revolutions,
Chicago (1962)
Lamb, W., Watson, E.: Body code, Routledge & Kegan, London (1979)
Lambert, M.: Pourquoi l'écriture est née en Mésopotamie, Archeologia 12, p. 24-31 (1966)
Laurel, Brenda: Computers as Theatre, Addison-Wesley, Reading
(1991)
Leroi-Gourhan, A.: Hand und Wort, Suhrkamp, Frankfurt/M
(1984)
Levi-Strauss, C.: Traurige Tropen, Suhrkamp, Frankfurt/M
(1978)
Levy, Pierre: The second flood, report on cyberculture,
Council of Europe (1996)
Lippe, Rudolf z.: Neue Betrachtung der Wirklichkeit, Europ.
Verl. Anst., Hamburg (1997)
Locke, John: Essay on human understanding, Scientia, Aalen
(1963)
Marvin, C.: Innis, McLuhan and Marx, Visible Language, 20, 3, p. 355-359 (1986)
Maurer, Kappe, Andrews, et al., various articles about
Hyperwave:
ftp://ftp.tu-graz.ac.at/pub/Hyperwave/
(URL)
Germany:
ftp://elib.zib-berlin.de/pub/InfoSystems/HyperWave/
(URL)
Moore: Beyond Words, Gordon & Breach, New York (1988)
Morris, W.: Kunst und Schönheit der Erde, Stattbuch, Berlin (1986)
Mumford, L.: Technics and Civilization, New York (1934)
Mumford, L.: Mythos der Maschine, Frankfurt/M (1977)
engl:
Myth of the Machine, Vol.1, Technics and Human Development, New York 1967,
Vol.2: The Pentagon of Power, New York 1970
Nadin, Mihai: Civilization of Illiteracy, Dresden Univ. Press,
Dresden (1997)
Noeth, W.: Handbuch der Semiotik, Metzler, Stuttgart (1985)
O'Connell, R.: Of Arms and Men, Oxford Univ. Press (1989)
Parmenides: Die Anfänge der Ontologie, Logik und Naturwissenschaft (Ernst Heitsch, ed.), Heimeran Verlag, München (1974)
Platon, Werke: Sämtliche Dialoge, Band VI: Timaios, Kritias, Sophistes, Politikos, Briefe; Meiner, Hamburg (1988)
Pleger, W.: Die Vorsokratiker, Metzler, Stuttgart (1991)
Pöppel, Ernst: Time Perception, In: Handbook of Sensory
Physiology, R. Held, H.W. Leibowitz, H.-L. Teuber, eds., pp. 713-729, Springer,
Heidelberg (1978)
Pöppel, Ernst: Grenzen des Bewußtseins: Über
Wirklichkeit und Welterfahrung. 2. Auflage, DVA, Stuttgart (1985)
Pöppel, Ernst: Mindworks: Time and Conscious Experience,
Harcourt, Boston (1988)
Pöppel, Ernst: The measurement of music and the cerebral
clock: A new theory, Leonardo 22, 83-89 (1989)
Pöppel, Ernst: Das Drei-Sekunden Bewußtsein.
Psychologie Heute, 10/93, S. 58-63 (1993)
Pöppel, Ernst: Lust und Schmerz, Goldmann, München
(1995)
Root, D.: Cannibal culture, Westview, Boulder (1996)
Schlott, A.: Schrift und Schreiber im Alten Ägypten, C.H.
Beck, München (1989)
Schmandt-Besserat, D.: Vom Ursprung der Schrift, Spektrum der
Wissenschaft, p. 5-12, Dec. (1978)
Schmandt-Besserat, D.: Before writing, Vol 1, Univ. of Texas
Press, Austin (1992)
Semiotica, Special issue on paleosemiotics, Vol. 100-2/4 (1994)
Smith, M.R.; Marx, L.: Does Technology drive history? MIT
Press, Cambridge (1994)
Spencer, P. (ed.): Society and the dance, Cambridge Univ.
Press, Cambridge (1985)
Stein, G. Talk given at Ars Electronica Infowar Symposion.
(1998). http://www.aec.at/infowar/
(URL)
Stonier, Tom: Internet and World Wide Web: The Birth of a
Global Brain? FIS '96, Wien: http://igw.tuwien.ac.at/fis96
(URL)
Straub, Dieter: Eine Geschichte des Glasperlenspiels,
Birkhäuser, Basel (1990)
Veltman, K.: Why Computers are Transforming the Meaning of
Education, ED-Media and ED-Telecomm Conference, Calgary, June 1997, ed. Tomasz
Müldner, Thomas C. Reeves, Charlottesville: Association for the Advancement
of Computing in Education, vol. II, pp. 1058-1076 (1997)
Veltman, K.: Frontiers in conceptual navigation, Knowledge
Organization 24, No. 4, p. 225-245 (1998)
http://www.sumscorp.com/articles/
(URL)
Whitehead, A. N.: Process and reality, The Free Press,
Macmillan, New York (1969)
Wittfogel, K.A.: Oriental despotism, Yale Univ. Press
(1957)
Yates, F.: Giordano Bruno in der englischen Renaissance,
Wagenbach, Berlin (1989) engl: Giordano Bruno and the Hermetic Tradition,
London, 1938, 1981
Yates, F.: Gedächtnis und Erinnern, VCH, Weinheim
(1990)
engl: The Art of Memory, Routledge&Kegan, London 1966
[34] CD version:
file:///d|/0htm/99-09/hydr-civ.htm
(URL)
[35] Ishii (1999a, 1999b),
Veltman (1997), (1998), http://www.mmi.unimaas.nl/Veltman/Sums/sumsarticles.html
(URL)
[36] The downside of LISP is
expressed in this adage: The LISP programmer knows the value of everything and
the cost of nothing.
[37] Only Apple had succeeded
in creating a modicum of UI standardization in their system.
[38] De Souza 1993, 1996,
Dertouzos 1992
[39] This is perhaps also why
Csikszentmihalyi called it so.
[40] Of course, the downside often was that no one else understood what had been programmed there. But that is another matter; the expert could have documented his optimized, compressed code with some plain text. When we look at contemporary OO code, we get the impression that the documentation is the program. That may be fine, but it also has some hidden costs, which is what the whole argument of this article is about.
[41] In 1985, when the hypertext system was built, it was probably the first hypertext-integrated SW development system, with a programming language designed to fit into the hypertext principle. The TLSI and hypertext are complementary principles.