The Myth of Japan's Technological Superiority - PART 1: MOT to Bust Fetishism
Contributed by: Y.Yamamoto
Left: Hollerith Tabulating Machine
Right: Wiring Panel embedded in it
I fell in love with the computer in 1963. It's been 46 years since then, but my love affair with her is still going on. Sometimes I fantasize that on the last day of my life, I will collapse onto the keyboard while my finger keeps hitting "M."
In the early days of the computer age, people were fetishizing hardware in part because the computer was actually a "precious metal mine" of gold, platinum, silver, palladium, rhodium and tantalum.
Another reason for the hardware fetishism was that although people were gradually realizing that "the computer without software is nothing but a box," they still found it difficult to distinguish software from hardware. In the tabulating machine, the precursor of the "electronic data processing" system invented by Herman Hollerith (photo on the left), the two were one and the same thing. Before the arrival of IC-powered machines, most business computing was performed by tabulators. They processed particular jobs, such as summarizing census results, using an early form of firmware then called the "wiring panel" (photo on the right).
In those days, huge machines running on thousands of vacuum tubes sweated from internally generated heat. But they were soon supplanted by far smaller and much more powerful ones, and hardware prices per byte came down dramatically. Yet the same old hardware fetishism lingered on even after the emergence of Microsoft as the leader of the IT industry. The software giant artfully hooked us all on software without ever really emancipating us from hardware fetishism.
Today there is still something that makes the vast majority of people look away from a third element which by far outweighs hardware and software. I call it human-ware.
By now both hardware and software have been commoditized. You can go out and buy state-of-the-art products anytime and anywhere. Human-ware, on the other hand, is something you can't buy at any price. It is true that computer users worldwide are gradually getting wise to this paradigm shift. But they are too slow in embracing the fetishism-free, human-centric attitude toward technologies. Needless to say, the Japanese are lagging far behind other peoples in this respect as well. I suspect it will take them an eternity to catch up with the new way of dealing with modern technologies.
Generally speaking, technological developments, unlike those in the humanities, tend to take linear paths. Most of the time, therefore, the human elements represented by socio-political systems lag far behind the other elements. One example is the United Nations. Simply put, the international body, which was founded when Chiang Kai-shek was still ruling over the Republic of China, has long been dead. Yet no one but former U.S. ambassador to the U.N. John Bolton has admitted the obvious: that "if [the 38-story building of the U.N. headquarters] lost 10 stories, it wouldn't make a bit of difference." People still keep talking sheer hogwash about reforming the 63-year-old edifice, now joined by 192 countries.
Actually, we are seeing a yawning gap between technologies and socio-political systems, and it keeps widening. I'm sure that sooner or later the gap will be closed in a dramatic way. But I'm also sure this "dramatic way" will mean the "wrong way." In other words, it's not that socio-political systems will catch up with technologies, but that technologies will die out so they can keep pace with the carcass of human-ware.
Against this backdrop, a new interdisciplinary field of study called Management of Technology has been emerging in recent years in the U.S. and some other industrialized countries. Yet, even in the U.S., only about 30 universities have launched MOT classes thus far.
By now we have noticed that both professors and students specializing in this discipline tend to marginalize its significance by confining their studies to how to utilize web-based information technologies for business purposes. But they should address a broader range of technological issues with much more focus on the human-ware perspective. Otherwise, they cannot fulfill what the new discipline is meant for.
More specifically, MOT is intended to look into these crucial questions:
● How to optimize the development of a particular technology at hand in the context of socio-political systems and the cultural climate underlying them. Needless to say, a project given misplaced funding priority often results in a waste of financial resources. More importantly, a project pursued out of socio-political context is doomed to end in a vast waste of human resources. This is especially important here, because in Japan gifted human resources are scarce.
● How to synergize the interaction between the developers of the particular technology and its users. User feedback about its usefulness, rather than its usability, is essential because without valid input from the users, the innovativeness on the part of the developers will soon dry up. This is especially true of disruptive technologies.
The second bullet needs some explanation. Grant Norris, an IBM consultant, defined the term "disruptive technologies," as opposed to "adaptive" ones, in a book he co-authored with other consultants. According to Norris, while adaptive technologies, such as those behind the cellphone, merely move "earlier technologies forward incrementally," disruptive technologies, such as the Internet, "change the way people live their lives or the way businesses operate."
Of course you can't draw a clear line between the two types of technologies. But that doesn't really matter. Any technology is meant to be an enabler of change. Karl Marx wrote essentially the same thing when he was talking about production and consumption in industrialized nations. Now we can say:
"No invention deserves to be called a useful technology until its users can leverage it to change their way of life."
When I was in business, I used to tell my people, from both the user side and the IT side, that the computer can be likened to a mirror. They didn't like my mirror analogy because this particular mirror is biased - biased against stupidity. Why is that? Simply because the inventor of the computer, John von Neumann or whoever it was, was driven by his hope for a less stupid world.
This piece will soon be followed by a couple more installments, which will discuss the prewar and wartime history of aeronautics in Japan, as well as some other issues with the ways Japanese people are "using" web-based technologies and "green" ones.
TO BE CONTINUED IN PART 2.