The death of the what?

Sunday, August 12 2012 @ 10:48 PM JST

Contributed by: Y.Yamamoto

Production is thus at the same time consumption, and consumption is at the same time production. Each is simultaneously its opposite. But an intermediary movement takes place between the two at the same time. Production leads to consumption, for which it provides the material; consumption without production would have no object. But consumption also leads to production by providing for its products the subject for whom they are products. The product only attains its final consummation in consumption. A railway on which no one travels, which is therefore not used up, not consumed, is potentially but not actually a railway. Without production there is no consumption, but without consumption there is no production either, since in that case production would be useless. Consumption produces production in two ways.
- From Critique of Political Economy by Karl Marx (1859)

IBM System/360 was announced in 1964

The legendary IBM PC hit the market in 1981
It seems the death of the PC is the talk of the web these days. The alleged cause varies from one obituary to another. Some say the death is attributable to the worldwide proliferation of smartphones, while more computer-savvy people think the PC went virtually extinct in the wake of the widespread application-hosting services collectively called "cloud computing."

I don't want to attend the deathwatch, because I am sure the corpse has been misidentified as that of my longtime friend.

The false obituaries, however, take me back to the early 1950s, when I was preparing myself for the rocky adulthood ahead of me. One day I stumbled on the following sentences in an 1843 entry in Søren Kierkegaard's diary.

It is quite true what philosophy says: that life must be understood backwards. But that makes one forget the other saying: that it must be lived forwards. (English translation by Peter Rohde.)

Later in the same year the Danish philosopher wrote a book titled Repetition. He titled the book that way because he thought repetition should be the same thing as "forward recollection." He hypothesized subliminal recollection of the past was the only thing that would guide him in the right direction. That is why Kierkegaard concluded that his dilemma would be solved with his faith in Christianity, the only source of his intuition. Fortunately or unfortunately, though, I was already under the influence of Buddha who knows no Gods and no isms, including atheism. To me denying God was another way of admitting him.

A few years later I came across the Japanese translation of Norbert Wiener's Cybernetics. Etymologically, the coined word has its origin in an Ancient Greek word meaning "the art of governing." Wiener's interdisciplinary study specifically deals with the question of how the sender of information can use the feedback from its receiver to correct himself, and then update the receiver with new output. I thought I would be able to apply his theory of the feedback mechanism to optimize the way I govern myself. As I wrote some three years ago under the title The Smart Way of Making Mistakes, it's more important, in business or in personal life, to learn a lesson from your mistakes than to make no mistakes at all. In other words, you should dare to err, because the more you err, the more you learn.
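The feedback idea Wiener described - output goes out, the error comes back, and the sender corrects its next output - can be sketched in a few lines. This is purely illustrative and not from Wiener's book; the function name, gain, and target values are my own arbitrary choices.

```python
# A toy illustration of a feedback loop: the sender adjusts its next
# output based on the error reported back by the receiver.
# (Illustrative only; the gain and target are arbitrary assumptions.)

def feedback_loop(target, initial_output, gain=0.5, steps=20):
    output = initial_output
    for _ in range(steps):
        error = target - output      # feedback from the receiver
        output += gain * error       # sender corrects itself
    return output

# Wherever the output starts, it converges toward the target.
print(round(feedback_loop(target=100.0, initial_output=0.0), 3))  # prints 100.0
```

The point of the sketch is simply that correction driven by feedback converges, whereas output sent blindly, with no return channel, never improves.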

This is not to say, however, that I've never failed to learn from my mistakes. I must also admit that even when I failed, I sometimes got back on the right track just by accident. And yet there were times when I would never have overcome a crisis facing me without leveraging lessons I'd learned before.

That's basically how I decided, in 1963, to become a small part of the computer industry. Electronic Data Processing, or EDP for short, based on the "stored-program" concept developed by John von Neumann et al., was still in its fledgling stage. But perhaps I already knew that was the surest way to grow into a mature man - one who always embraces change, or even initiates it. I may look to be second-guessing my career, but actually I am not.

I know that if you are an American, you think it's too far-fetched a notion to see a link between the Kierkegaardian dilemma and the computer. That's simply because you never think the way I do, or don't think at all for that matter. I don't want to waste your time, or mine, by telling you how two other thinkers, Max Weber and Karl Marx, served as catalysts in my becoming involved with information technology the way I did, though mainly as its user.

But before I go on, let me quickly talk about my interpretation of Marx's thoughts on the value-creating chain.

Your parents and grandparents were taught nothing about Marx except that he was a bad guy. Yet some of them must have been smart enough to understand intuitively the dialectical mechanism that governs an industrialized economy. Unfortunately, though, most of them are gone without having handed down to posterity their wisdom, work ethic and no-nonsense attitude toward life. As a result, your generation doesn't have the foggiest idea of what man's economic activity is all about, even after completing the MBA course at Harvard Business School. You just take it for granted that an economy is something in which people provide one another with clothing, food and housing, with cheap entertainment in between. Small wonder you have recently swallowed yet another stupid notion: that the economy revolves around the conflict between Wall Street and Main Street, or the 1% versus the 99%.

It's true that Marx presented little beyond the oversimplified formula "Geld-Ware-Geld," or Money-Commodity-Money. But as is evident from the quote above, he was keenly aware of a third factor, i.e. technology. Maybe he deliberately set it aside for the sake of clarity, or he simply assumed a flat, linear development of technologies after the first Industrial Revolution. Aside from the class struggle he always stressed, there has been a perpetual battle between technologists and the users of their products. And it's important to note that neither side can win it where there is a yawning gap between the two. The Luddites are a different issue here.

One year after I joined IBM as a sales trainee, Tokyo hosted the 18th Olympic Games. At the closing ceremony, the Japanese were impressed to see someone from IBM proudly hand over to Avery Brundage, then President of the International Olympic Committee, a thick record book compiled overnight on an IBM System/360. But some of us already knew this was not what the modern computer had been invented for. Actually, we had a great sense of uncertainty about what the coming computer age would look like. All we knew was that Japan wouldn't get on the high-growth track without computerization.

I still remember the touching moment in a midnight hands-on training session when the COBOL program we had written and rewritten over and over completed the task at hand as intended. My teammates cheered when the process started the right way. I, on the contrary, was moved when the computer responded to the "STOP" command at the right time and in the right way.

In the subsequent years, we grew increasingly frustrated with never-ending conflicts between hardware and software engineers and the endusers of their products and services. It was as though someone had fastened the buttons in the wrong holes. We were supposed to expect a synergy effect from the cooperation between computer-illiterate business people and business-illiterate engineers, but we always ended up seeing an anti-synergy effect.

With what I named the multiplication rule at work everywhere, 0.5 merged with another 0.5 never makes 1.0 or larger. The arithmetic that actually seems to apply in the real world is this: 0.5 multiplied by 0.5 makes 0.25. In later years I found that my empirical rule applies not only to business and technology but to any other combination of different things, such as cross-racial marriages.
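The multiplication rule is just arithmetic, but the contrast with the naive expectation is worth spelling out. A two-line sketch, with function names of my own invention:

```python
# The "multiplication rule": what happens when two half-competent
# parties are combined. (Function names are mine, for illustration.)

def synergy_additive(a, b):
    # The hoped-for outcome: competences simply add up.
    return a + b

def synergy_multiplicative(a, b):
    # The observed outcome: each party's gaps discount the other's strengths.
    return a * b

print(synergy_additive(0.5, 0.5))        # prints 1.0 -- the expected synergy
print(synergy_multiplicative(0.5, 0.5))  # prints 0.25 -- the anti-synergy
```

Under the multiplicative model, only parties that each bring close to 1.0 can combine without losing value, which is the point of the "yawning gap" remark above.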

Toward the end of the Mainframe Era, one of the fathers of the modern computer contributed an interesting article to a computer journal. (I forget whether it was von Neumann or J. Presper Eckert, Jr.) He argued, in effect, that the traditional system architecture, in which a number of "dumb" terminals were subordinated to the mainframe machine, was as obsolete as the centrally planned Soviet economy.

You won't quite understand the real implication of his statement if you are one of those people who have never committed themselves to revolutionizing the value-creating chain in the real world, where most everything comes down to the question of how to bring heterogeneous elements together. Since you always mix up ends and means, you think the computer, in itself, represents a value. It is, therefore, none of your concern how different devices with different functions interact with one another, let alone how the computer interacts with its user.

Here and there in the industry, however, a subtle change in attitudes toward the computer had already been underway. Under the circumstances, the Soviet analogy deeply resonated with some of us. It is true that the new trend still remained amphibious, but we were already preparing ourselves for what we would later call "enduser computing."

We had yet to see the arrival of the "smart terminal," but we already had some tools with which to rehearse personal computing in the conventional environment of centralized data processing. For one thing, we could avail ourselves of "A Programming Language," APL for short, the "interactive array-oriented language" developed by Kenneth E. Iverson in the early 1960s.

In 1983, two years after the legendary IBM PC hit the U.S. market, IBM's Japanese subsidiary announced its Japanese counterpart under the brand name "IBM Multistation 5550." The top page of its promotional brochure read: "IBM Multistation 5550 is a calculator and a word processor combined into one." The stupid copy unmistakably indicated that the developers of the new product and their target customers were not yet on the same page.

Seventeen years later, I had an opportunity to teach an MBA class at the International University of Japan. At that time Grant Norris, now an IBM consultant, gave me special permission to use his material for my lecture on E-Business and discussions with my foreign students. In a book he co-authored with his fellow consultants, Norris wrote: "Adaptive technologies move earlier technologies forward incrementally [while] disruptive technologies change the way people live their lives or the way businesses operate."

From my MOT (Management of Technology) point of view, when people deal with a disruptive technology as if it were adaptive, Marx's value-creating chain stops working, because there is then no compelling reason for scientists to seek a major technological breakthrough anymore. As a result, consumers become even more change-resistant, because they know life is much easier with existing technologies. Hopefully I will come back to this point in a separate piece.

When the Multistation was unveiled, I was the local CFO at an international trading house headquartered in Zurich, Switzerland. Founded in Yokohama in 1865 by two Swiss merchants, this company was yet another example of the curse of the multiplication rule I mentioned earlier. For one thing, everyone from the owners to the expat executives to the local employees took it for granted that their strength lay in the Wakon Yosai (Japanese spirit and Western learning) mindset, which dates back to the 1860s. But actually this formula, which was applied to Japan's "modernization" (actually just industrialization), had long since proved unworkable because of what I call "technology fetishism," its inevitable consequence.

The Swiss company always claimed to be a "value-adding trader," but in fact it was just adding costs, which had to be passed on to the customer every time goods changed hands. It went virtually bankrupt several years after I left, primarily because of its technophobia, the reverse side of technology fetishism. I still remember a forty-something accountant in my shop double-checking the computer output with her abacus. Believe it or not, she wasn't an exception.

As the senior manager overseeing the entire administration, I submitted to my Swiss boss, Kurt E. Sieber, a purchase proposal saying I wanted a 5550, simply because I had long had in mind a lot of essential tasks that couldn't be done effectively, or even performed at all, without a PC on my desk. Although there were very few reference books readily available at bookstores, I didn't care a bit about how to use the new technology, because my only concern was what to use it for. I was more of a businessperson than an IT engineer, but in due course I learned tools such as Multiplan (the precursor of Excel), BASIC, and later VBA (Visual Basic for Applications).

At the initial stage, the Multistation had no hard disk drive. Instead, it used 3.5" floppy disks. And its RAM capacity was a mere 256KB (not a typo), expandable only to 512KB. But to the money-worshiping Swiss executive, specifications were of no concern. The hardest part was convincing him that I wasn't out of my mind when I asked him for 1.5 million yen (US$19,000 at the current exchange rate) to buy a "small toy." It took me months to get his reluctant approval.

Another ten years were needed for Sieber to come to terms with the idea that even in his fiefdom, he could no longer get away from the peril of personal computing. I suspect, however, it would have taken an eternity had it not been for the invention of a convenient technical term: the "Client/Server Model." The fancy phrase allowed any interpretation you liked, because nobody could tell exactly how the "new" model differed from the conventional architecture for centralized computing, except that peripheral devices had now grown a little smarter and that clients and servers were often networked using the communications protocol called TCP/IP. It was quite OK if endusers sitting at their smart terminals still wanted to remain dumb. In short, the notion of client/server meant nothing more than the old Soviet system disguised in a slightly more user-friendly environment.
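Stripped of the buzzwords, the client/server model is simply two processes talking over TCP/IP: one waits and answers, the other asks. A minimal sketch (loopback address, message, and names are my own illustrative choices, not any vendor's architecture):

```python
# A minimal client/server exchange over TCP/IP: a background server
# thread echoes back whatever the client sends. Illustration only.
import socket
import threading

def run_server(server_sock):
    conn, _ = server_sock.accept()   # wait for one client
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)           # echo the request back

server_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server_sock.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server_sock.listen(1)
port = server_sock.getsockname()[1]
threading.Thread(target=run_server, args=(server_sock,), daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"ping")
    reply = client.recv(1024)
print(reply.decode())                # prints "ping"
```

Note that nothing here says whether the client is "smart" or "dumb" - which is exactly why the phrase could be stretched to cover the old centralized architecture as easily as genuine personal computing.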

Sieber was a former captain in a tank unit of the notorious Swiss Army. That meant he would never emancipate himself from the hierarchical way of thinking. No wonder he chose to settle down in this country despite his contempt for the Japanese. He found the easiest people to exploit in this classless society, where peer pressure always prevails among the locals. But at the same time he was an unblushing robber. By the time I reached the mandatory retirement age, he and his men had started to confiscate, and then alter, the intellectual property I'd accumulated in my computer, as if to defuse the time bomb I'd set to blast the "legacy" system. My repository included hand-made systems for online corporate budgeting and up-to-the-minute control of currency positions, just to mention a few. I called them "systems of the user, by the user, for the user." Despite the fact that the amount of corporate resources I'd used to develop these mini-systems was negligibly small, they didn't pay me a single Swiss franc in royalties. I didn't sue them, because I knew these systems and user manuals were nothing but pearls cast before change-disabled swine.

In 1993, three years after Japan's economic bubble belatedly burst, an epochal book was published in the U.S. The book, titled Reengineering the Corporation - A Manifesto for Business Revolution, was authored by the late Dr. Michael Hammer with the help of James Champy. Unusually for a business book, it spent more than six months on the New York Times nonfiction bestseller list. But I know very few among my predominantly American audience have read the reengineering classic, in part because most of you thought "reengineering" was yet another way of saying "restructuring" - jettisoning unprofitable business lines, cutting redundant manpower, and so on. You don't give a damn about the quality of life and the real values it calls for. That's why you never understand the positive side of Dr. Hammer's argument. He simply wanted to present a methodology for using networked computers as the enabler of "fundamental, radical and dramatic" change, in a clear departure from the principles laid down by Adam Smith more than two centuries ago.

Hammer writes: "Reengineering isn't another idea imported from Japan. It isn't another quick fix that American managers can apply to their organizations. ... Reengineering isn't about fixing anything."

When I was a contractor overseeing the "University Alliance Program" at the rotten Japanese subsidiary of SAP AG, the German software giant, I had a chance to translate Dr. Hammer's PowerPoint slides into Japanese. To me he looked more like a down-to-earth business consultant than yet another management guru. But unfortunately, for the reasons I have mentioned, his avid advocacy of a business revolution has yet to bear fruit.

In 1995, James Champy, the coauthor of Reengineering the Corporation, published a followup book, solo this time, titled Reengineering Management - The Mandate for New Leadership. Champy had a good reason to author it alone. CSC Index, the consulting firm he was then heading, had sent out an extensive questionnaire to more than 600 CEOs in North America and Europe to find out whether the intended revolution had been paying off in their organizations.

In his book, Champy wrote: "[Overall], the study shows, participants failed to attain these benchmarks [for shortening the cycle time, reducing costs, increasing revenue, etc.] by as much as 30%. ... This partial revolution is not the one I intended. If I've learned anything in the last 18 months, it is that the revolution we started has gone, at best, only halfway. I have also learned that half a revolution is not better than none. It may, in fact, be worse."

From his findings, Champy concluded that the fundamental problem lay with the corporate culture, and that it was the CEOs' responsibility to revolutionize it.

At the 1997 World Economic Forum in Davos, Andy Grove, co-founder and then Chairman of Intel Corp., said, in effect, that change in the corporate culture is the key to success in BPR (business process reengineering). Grove was absolutely right. But he shouldn't have added that the cultural revolution should be driven from the top, just as Champy shouldn't have written Reengineering Management. Success in a corporate revolution, or any other revolution for that matter, hinges solely on unfettered spontaneity and creativity on the part of ordinary people. And a corporate culture is just a reflection of the nation's culture. It's ridiculous to expect one of those egomaniacs in the executive office to act as a change agent.

At the height of the economic boom, a variety of "participatory" programs such as kaizen (company-wide efforts for reform), kanban (the just-in-time inventory management system), and TQC (programs for total quality control) were widely practiced across Japan Inc. Japan experts in other industrialized countries, especially in the U.S., have always touted these "bottom-up" approaches as the recipe for Japan's phenomenal success. But as always, they are wrong. If these programs had really been bottom-up, we wouldn't have seen the economic bubble form and burst so easily, and the Japanese would long ago have shown the vigor and resilience needed to recover from the economic doldrums and political impasse.

Here's one little question for you: Did you know your personal computer mirrors what you really are? I don't know if you did, but in fact she mirrors you even more than she mirrors her developer or manufacturer. Number-crunching or word-processing, let alone apple-polishing, is not her job; always giving you undistorted feedback is.

Those obituaries are all wrong, after all. If your PC looks dead, it's you, not she, that has actually died. As quoted at the top of this post, Marx observed that "a railway on which no one travels is potentially but not actually a railway." In another paragraph of the essay, he paraphrased the same idea more succinctly: "Consumption gives the product the finishing touch." Now, at the sight of your underused PC, Marx would say:

"A PC you don't want to use real creatively is potentially but not actually a PC."

He would also say the same thing with respect to Web-based technologies.

Now seven years into my retirement, I'm overwhelmed by the explosion of Sumaho, as the Japanese call smartphones, and other types of hand-held devices.

According to the World Bank, the population of cellphone users has been growing exponentially in the last couple of years and will soon top the 6 billion mark. That would mean everyone except children starving to death in Africa is fondling his handset all the time, even though he has nothing in particular to communicate to others, electronically or otherwise. Maybe his stomach is not empty, but for sure his brain is. I can't find any words other than "mass addiction" to describe this trend.

Apple's iPhone, for one, is a typical example of adaptive technology. Once again, an adaptive technology is something you can live without. In other words, it is, at best, a nice-to-have. Believe me, I have absolutely nothing against your desire to own such a fancy product. All I want to say is that I have great difficulty living shoulder-to-shoulder with people who think these gadgets are must-haves, just as junkies think they can't live without a drug whose power, in fact, lies not so much in the substance itself. And especially in a conformist society like Japan, this addiction is highly infectious.

To make the plague of addiction even worse, the IT industry is single-mindedly building the infrastructure for "ubiquitous computing" and "cloud computing." And this is a global trend.

Needless to say, however, the situation in Japan is even more disastrous because of the legacy of Wakon Yosai and the technology fetishism resulting from it. With the entire population drowned in the Great Flood of mobile devices, the nation's value-creating chain now seems to have gone to pieces, totally and perhaps irreversibly.

Day in, day out, and around the clock, people young and old pass me by with their fingertips glued to their Keitai (mobile phones), or vice versa. These days not a few Japanese go to the bathroom, or even to bed, without parting with their beloved cellphones. If you take into account the fact that Japan's population density is 10.2 times that of the United States, you may understand what it is like to live among 100 million Keitai users.

I can't tell for sure, but if time allows, I will post a separate piece to elaborate further on my observation of this intellectual crisis, now in its final stage.
