Interfaces: Standardization or Standard-Action?

By Jorge Luis Marzo

What would aliens say if they saw us running applications with buttons that represent “magic wands”, “buckets pouring liquids”, “lassoes to catch animals”, “sponges”, “drops of water”, “erasers”, “droppers” and “ink pads”? Would they laugh? Would they think these metaphors ingenious because they are so easy to understand? Going by what science fiction writers and screenwriters have imagined, the aliens are unlikely to laugh: when conjuring up little green or metallic men, few of these writers have managed to see past the imaginary threshold of the GUIs we ourselves have created. An exception may be Stanislaw Lem and his tales of Pirx the pilot with his malfunctioning interfaces [1]. Naturally, we could quickly argue in favour of the power of domestic metaphors and the notion of consistency, which appears immovable once deployed. But what if the aliens saw those buttons as a sign of the unstoppable force of a social magma that refuses to submit to certain updates? The “magic wand” may be a silly detail, a reflection of what engineers in the seventies assumed about the stupidity of the users they were designing for, with their promises of simplicity, readability, and browsability. But it may also be one of the “acts” or “verbs” around which social groups come together in order to avoid being totally broken in.

Generally speaking, people are tremendously resistant to change. Is that good or bad? Polish-born writer Joseph Conrad, who was a sailor and thus familiar with a world subject to standardized procedures, claimed that Western man cannot abide explanations: we do not want technical or moral adventures that cast doubt on the certainties on which we have built our illusion of security. As it happens, we can turn to the world of ships and sailors for numerous genealogical sources that can help us chart the history of interfaces and of their standardization. And what this history tells us is that in the case of both tools (hardware) and systems for displaying visual information, standardization did not always mean choosing the best option, but simply following a strategy that did not require frequent changes.

Driven to despair by his sailors’ reluctance to adopt, in the late 17th century, the cartographic principles devised by Gerardus Mercator a century earlier, in 1569, English naval commander Rear-Admiral Sir John Narborough wrote: “I could wish all Seamen would give over sailing by the false plain Card, and sail by Mercator's Chart, which is according to the truth of Navigation; But it is an hard matter to convince any of the old Navigators, from their Method of sailing by the Plain Chart; shew most of them the Globe, and yet they will walk in their wonted Road.” [2] According to the historian J. H. Parry, by the fifteenth century a competent sailor could already feel his way around the world with reasonable confidence. While at sea, he could quite accurately calculate the position of his ship, or of an unfamiliar coastline or an island he had just discovered. But he could not plot these positions, and the routes that led to them, on a chart with the same level of accuracy. “This was not the result of ignorance or lack of skill among marine cartographers, but rather of technical conservatism in a distinguished and well-established craft. The entrenched excellence of the portolan chart, in the area where it had been developed, hindered the technical changes needed for charting other and vastly larger areas.” [3]

Lucas Janszoon Waghenaer, a Dutch chief officer who produced very reliable nautical charts and instructions in the late sixteenth century – the first to include water depth and tidal variations – noted that the things a man practices, seeks, and observes first-hand become embedded in the memory more quickly than the things he learns from others. As such, he included 36 pages of instructions: a diagram showing how to calculate the date of a new moon without the need for a calendar or almanac, a list of fixed stars and their uses, the declination of the sun and how to use it, instructions for calculating the tides of any coastline, and how a sailor should use his own personal chart [4]. It would seem that Waghenaer took an interest in a very modern form of exploration, but also in the paradox that goes with it: use a series of general guidelines to make your own system of exploration, record-keeping and data visualisation, but don’t forget to think about how others will understand and use it. In short... “Let’s see how we can work out this standardization thing between my thing and everybody else’s.”


Nonetheless, the standardization of mechanical user interfaces such as the astrolabe and its updated version, the sextant, the spring-loaded lever of the electrical telegraph, loudspeakers, telephone headsets, and steering wheels happened relatively quickly, sometimes even instantly, at least in the case of nineteenth-century technologies: the interfaces proposed by designers were quickly adopted by the users of those technologies, and remained practically unchanged for as long as they were in use. Why is one interface adopted by society while another encounters resistance? Is it because it is an improvement on earlier models? We know that this is not always so. Technical efficiency does not always appear to be the condition that leads some systems to prevail over others. According to what Bijker, Hughes, Pinch and Law called “heterogeneous engineering” years ago, “the social groups that constitute the social environment play a critical role in defining and solving the problems that arise during the development of an artefact.” [5] But aside from technological determinism, there is something else that can have enormous power: people. More often than we realise, users themselves decide to standardize procedures, even at the expense of “official” procedure. An analogue example of this phenomenon is the so-called “desire paths” created over time by people taking routes (often shortcuts) other than the official constructed paths in public spaces (gardens, parks, etc.). Anybody who tries to go against these actions should be prepared to fail, or at least to live in a constant state of tension. For example, a growing number of gamers get together not only to propose new interfaces for certain video games, but to pool their energies in harass-and-destroy campaigns to achieve their aims, often bending companies to their will.

One of the most paradigmatic cases in this sense is the QWERTY keyboard. We won’t go into the details of the problem – the impossibility of changing the standard keyboard in most countries since it was designed in 1868 – given that it is widely discussed in the relevant academic literature. From P. A. David’s 1985 essay on how the layout was designed to work around a mechanical fault in the typewriters of the time at the cost of typing efficiency, to more recent approaches that question David’s interpretation [6], the key point is that no attempt to change the keyboard has come to fruition. Not the Dvorak layout proposed in the 1930s, not Don Lancaster’s “TV Typewriter” in 1973, and not the introduction of PDAs in the nineties, followed by smartphones a little later. Why the devil do we still use a keyboard designed for ten fingers on devices that we operate with two thumbs? Will proposals for mobile-specific keyboard designs, such as KALQ – which apparently solves the problem and increases ease of use and typing speed by 34% – end up being successful? [7] In short, doesn’t standardization also have to do with a process by which people simply don’t want certain changes, other than the usual aesthetic ones? And doesn’t this terribly contradict the whole economic and technological construct around technological design and creativity? What should we be saying to electrical and industrial engineers, programmers, and graphic designers who are constantly under pressure to exploit the potential, chimerical opportunities to be creative in a “highly volatile” environment?


The physicist James C. Maxwell, one of the fathers of the modern notion of the interface as a field of knowledge, was amazed by the interfacial simplicity of the first telephone: “The speaker talks to the transmitter at one end of the line, and at the other end of the line the listener puts his ear to the receiver and hears what the speaker said. The process in its two extreme states is so exactly similar to the old-fashioned method of speaking and hearing that no preparatory practice is required on the part of either operator.” [8] The interface has an amazing naturalising effect! You don’t realise you are using it. The more the procedure resembles life itself, the more life resembles the instrument.

This relationship is at the heart of the narrative that developed from the eighties onwards around the interface and transparency, particularly during the early years of the commercial expansion of the Internet. There was The Knowledge Navigator, for instance, a fictional video commissioned in 1987 by Apple’s CEO at the time, John Sculley, who had written a book entitled Odyssey (based on Stanley Kubrick’s 1968 film 2001: A Space Odyssey), which ended with the idea of a universal “knowledge navigator” for individual use. Sculley wanted to illustrate the interface of the future, beyond mouse and menus, in the context of an evolutionary process towards a symbiotic relationship between user and machine, ultimately tending towards the disappearance of the interface as a result of its transparency and subsequent naturalisation. The interface would become so natural that it would no longer be there. According to the digital-life guru Nicholas Negroponte, who wrote extensively on the project, the secret of a truly perfect interface – like the one Sculley promised – was to “make it go away” [9]. That was its true standardized nature. What better standardization – these high priests imagined – than the one you do not see?

So standardization invariably involves a chain of functions conditioned by the final stage: making the user lose his fear of exploring. But of exploring what? In the nineties, when computers were being marketed as a kind of do-it-yourself household appliance and a tool for productive creativity – beyond their commercial uses – there was a proliferation of references to exploration, both as a metaphor of usability and in relation to the imaginary of biocommunication. In terms of usability, exploration was linked to the “consistency” of interfaces that would allow people to “transfer their knowledge and skills from one application to any other” (Apple, 1992). The argument in favour of the consistency of Graphical User Interfaces (GUIs) was based on the standardization of usability processes, which promised greater system explorability through a standardized function for undoing undesired actions (undo-cancel), which was crucial for disseminating GUIs among non-expert users.

The “usability + consistency” chain set an operating standard in which users who had understood the procedure for using a particular programme or system would then have no qualms about voluntarily exploring similar programmes or systems, as long as they were sure that these really worked in the same way – in other words, that they would be able to undo the orders they gave. While hardware interfaces – such as the interface for driving a car: steering wheel, pedals, gear lever – simply involved a learning model in which once you learnt to drive one car you could drive them all, the standardization of computer graphic interfaces also meant generating a unified definition of “users”: potential idiots who wished, above all else, to undo their constant errors. Being able to drive requires you to pass an official exam regulated by the authorities. But you can “drive yourself” on a digital screen without passing any exams, armed only with your self-confidence, your curiosity, and your productive inclinations, which are always protected by Ctrl+Z.
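For readers who want the mechanics of that guarantee spelled out, the following is a minimal sketch, in Python, of the “command” pattern that typically underlies Ctrl+Z-style undo; the names (InsertText, Editor) are illustrative assumptions, not any particular toolkit’s API. Every action carries its own reversal, so no exploration is ever irreversible:

    # A minimal sketch of the "command" pattern that typically sits behind
    # Ctrl+Z. All names are illustrative, not any real toolkit's API.

    class InsertText:
        """One reversible action: inserting a word into a document buffer."""
        def __init__(self, buffer, text):
            self.buffer, self.text = buffer, text

        def do(self):
            self.buffer.append(self.text)

        def undo(self):
            self.buffer.pop()

    class Editor:
        """Runs actions and keeps a history, so every order can be taken back."""
        def __init__(self):
            self.buffer, self.history = [], []

        def execute(self, action):
            action.do()
            self.history.append(action)

        def undo(self):  # the Ctrl+Z of the essay
            if self.history:
                self.history.pop().undo()

    editor = Editor()
    editor.execute(InsertText(editor.buffer, "hello"))
    editor.execute(InsertText(editor.buffer, "world"))
    editor.undo()               # fearless exploration: "world" is taken back
    print(editor.buffer)        # ['hello']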

Explorability is thus the ultimate value by which digital devices have been standardized. Nonetheless, there is still the question, as I’ve mentioned elsewhere, of the extent to which GUIs – as opposed to screens – have been the channel of standardization [10]. Of course, it is all a matter of control, of “prognosis”. The fact that the monitor – from the Latin for “he who warns” – ended up being the graphic hardware expression of computers is significant. And the fact that television took the lead is even more so. When domestic television had to compete with VHS, video game consoles and the first CPUs in the early eighties, the TV set broke free of television as a medium, attaching itself endlessly to every device that came near. The TV – known from then on as the “screen”, to differentiate it from the TV set in the living room – colonised the world and became the platform that made all GUIs exactly the same: simple cascading icons or variations on them, always presented as browsable files. Standardization was simply the awareness that communication with machines (that is, with other users) was based on files. You are a file, admit it; you are part of the archive. So the idea is to make an agreeable, easy-to-understand archive, which is essentially invisible. Screens and the ability to browse through the archive – anytime, anywhere – are the governing principles of standardization. GUIs are actually a secondary matter, an effect. The really important thing is that the cause has to appear to be “unimportant”. Explorability is thus defined in an absence of exploring causes, in favour of exploring effects.

If you go to a store to buy a new washing machine, the salesman will come over and settle your doubts about which model to choose: “This one has a whole lot of buttons and displays, but if it breaks down it will cost twice as much to fix, because it has to be reprogrammed, a specialist technician has to come, and so on. Worse still, perhaps the washing machine itself is fine and only the software for the information display is broken, but even so you can’t use it. The other one, without all those screens, works just as well.” Standardization is often simply the aesthetic standardization stemming from technophilia, in which rigour, automation, and security are measured in numbers, icons, and the display of functions; in the illusion of biotechnical control. A washing machine salesman once told me that when he tells customers that there is no direct correlation between the digital information displays and the real performance of the machine, they usually say: “The truth is that all those graphics and functions drive me crazy. I just want the one that works best.” The “interfacialization” of the world going on around us is not just about creating a better world, but perhaps a more aestheticized one. But who determines the aesthetic or functional canon? If “magic wands” do what I ask them to, that’s good enough for me. If a washing machine with two buttons works as well as one with a screen full of changing symbols, that’s fine. That suits me; let the companies go to the trouble of adapting: “let’s see how we can work out this standardization thing between my thing and everybody else’s.”

What is really behind the relationship between the transparency, illusion, simplicity, consistency, domesticity, explorability, standardization, and interconnection of interfaces? Who responds to whom, about what?


  1. Lem, S., 1979. “On Patrol”, in Tales of Pirx the Pilot, English translation by Louis Iribarne. New York: Houghton Mifflin Harcourt.
  2. Crone, G. R., 1953. Maps and their Makers. Herts: Premier Press, p. 116.
  3. Parry, J. H., 1974. The Discovery of the Sea. Spanish edition: El descubrimiento del mar, trans. J. Beltrán, 1989. Barcelona: Crítica, pp. 213-214.
  4. Parry, J. H., 1974. The Discovery of the Sea. Berkeley and Los Angeles: University of California Press, p. 153.
  5. Bijker, W. E., Hughes, T. P., Pinch T. J. (eds), 1987. The Social Construction of Technological Systems. Cambridge: MIT Press. See the chapter by John Law, “Technology and Heterogeneous Engineering: The Case of Portuguese Expansion”, pp. 105-128.
  6. See David, P. A., 1985. “Clio and the Economics of QWERTY”, in The American Economic Review, vol. 75, no. 2, May 1985, pp. 332-337. See also Lewin, P. (ed.), 2002. The Economics of QWERTY: History, Theory, Policy: Essays by Stan J. Liebowitz and Stephen E. Margolis. New York: New York University Press.
  7. The KALQ keyboard is a project for the Android operating system that has been under development since 2013 at the University of St Andrews, the Max Planck Institute for Informatics, and Montana Tech.
  8. Gleick, J., 2011. The Information: A History, a Theory, a Flood. London: Harper Collins.
  9. Negroponte, N., 1995. Being Digital. New York: Vintage, p. 93.
  10. Marzo, J. L., 2012. “La colonización de las pantallas. Sobre la exposición Pantalla Global” [“The colonization of screens: on the exhibition Pantalla Global”]. At www.soymenos.wordpress.com