Before we get started, I wanted to mention a few changes regarding OS/2 CONNECT. The first, obviously, is the new frames-based format. We had originally maintained a single web page format to accommodate IBM's WebExplorer for OS/2. As much as I liked the old WebExplorer, I found very few people were using it anymore, and most preferred a frames-based version to more quickly navigate the newsletter. I hope you like it.
As a part of this new version, we have implemented a "Discussion Group" area to allow users to exchange ideas, develop polls, etc. The forum is sponsored by eGroups and you can sign up for it either on our cover or right here.
The final change has to do with my e-mail address. Due to IBM's sale of its Global Network, effective October 1st my e-mail address changed to: timb001@attglobal.net
Please update your records accordingly. My old address (timb001@ibm.net) will remain in effect for one year before being phased out. For those of you wondering how I'm communicating with the AT&T Global Network: quite simply, I'm using my OS/2-based IBM Global Network Dialer, which works just fine.
One last note before we proceed: I understand that I was the answer to a Warped Jeopardy question at this past "Warpstock" held in Atlanta: "In 1996, he connected the world with Merlin." "Who was Tim Bryce?" Thanks for thinking of me in Atlanta.
APPROACHING THE MILLENNIUM
At the end of each decade, MBA has reviewed the past and made some predictions about the next decade. This began with my father, Milt Bryce, who in 1979 foresaw what was to become the "file server." Although he didn't call it that back then (I believe he used the term "back-end data base machine"), he accurately predicted that the computer room would disappear and computers would be spread throughout a company like the nervous system of the human body, with file servers handling data used on an enterprise-wide basis.
In 1989, I foresaw the merger of the CPA firms in order to effectively compete in the burgeoning service industry; gave advance warning of the Year 2000 problem; and predicted that "COBOL will be with us well into the 21st century."
Where my predictions flopped was with IBM, which failed to put together a homogeneous computing environment based on SAA standards. IBM essentially surrendered control of the industry to Microsoft, which came to dictate policy and standards. Two key decisions were responsible for this: making the Micro Channel Architecture proprietary, and backing off from marketing OS/2. These two botched decisions essentially conceded defeat and marked the point where IBM handed the industry's reins, along with its own destiny, over to outsiders. IBM still does well in the mainframe and mid-range arenas, but it forfeited the critical desktop to others. In 1989, I predicted IBM was on the verge of offering companies a "soup to nuts" computing strategy. Boy, did I blow this one.
System Development
The 1990's marked the demise of true Systems Analysis. Although Systems Analysts have been in decline since the 1980's, it was the 90's that saw them disappear completely. The proliferation of PC's and their glamorous visual programming tools sounded the final death knell for Systems Analysis. By "Systems Analyst," I don't mean someone who is charged with implementing operating systems or software. I'm talking about a person with strong communications, business analysis, and financial skills; someone who understands work simplification, work measurement, business processes, return on investment, ergonomics, etc.; someone who can design enterprise-wide information systems with an integrated data base. Unfortunately, these people were supplanted by armies of programmers who hack away at code, one program at a time. As a result, professional associations like the Association for Systems Management (ASM) and periodicals like "Infosystems" were forced to close their doors while PC and programming publications and associations flourished.
Another indicator is the decline of the Chief Information Officer (CIO), who was envisioned as an integral part of a company's management team. The concept of the CIO never really got off the ground in the 1990's. Today, the CIO is nothing more than a highly paid baby-sitter for the programming staff, very similar in stature to the DP Manager of the 1960's. During the 1990's the CIO was considered expendable and faced a revolving door policy at most companies; yes, they were paid well, but they were expected to be "turn around" artists for their IT departments. If they didn't radically change things for the better, they were out the door in less than a year. I think the "CIO" title has been overused, and a new, more glamorous one will emerge in the next decade; perhaps "GMC" (Grand Master of Computing), "PCP" (PC Potentate), or "BMOC" (Big Man On Computing). Regardless, the title will remain meaningless as long as the top dog in systems is prohibited from joining the inner circle of a company's management team. This will not change in the coming decade.
CASE tools (Computer Aided Software Engineering) were the rage at the start of the 1990's (anyone remember KnowledgeWare or Bachman Information Systems?). These all passed on with the advent of visual programming tools, which took the market by storm. "Methodologies" and "Repositories" (aka Dictionaries, Encyclopedias, etc.) also perished from the face of the earth for the same reason, even though they represented important tools and considerable work went into developing standards for them. The go-go mentality of the 1990's prohibited any long-term planning and investment in the future and, instead, promoted quick-and-dirty solutions like the visual programming tools.
Today, companies are calling for "Enterprise Resource Planning" (ERP) systems, which are essentially no different from the "Management Information Systems" (MIS) of the 1960's and 70's ("a rose by any other name is still a rose"). Unfortunately, they will meet the same fate as their MIS predecessors: lack of adequate documentation, lack of integration, lack of user participation and satisfaction, project overruns (costs and schedules), etc. As long as management continues to abdicate control over the development environment (enforcing discipline, organization, and accountability), companies will continue to botch major systems projects.
When the 1990's began, developers were still trying to "engineer" systems and software, a carryover from the structured design movement of the 1970's and 80's. Whereas people once believed systems and software could be engineered, today they simply do not care, which is another reason why Systems Analysis perished. I do not foresee this trend changing in the next decade. The developers being produced today do not grasp basic engineering concepts (nor are they taught them by the universities) and still see things from a myopic programming perspective. Further, today's generation resists discipline, organization, and accountability more than their predecessors did. Implementing a structured development environment is simply not feasible anymore; the mindset is not there.
In terms of development innovations, nothing significant was introduced during this past decade. Object Oriented Programming (OOP), which was introduced in the 1980's, was refined and more widely used. As mentioned, CASE tools were replaced by visual programming tools, which were also introduced in the 1980's. As a matter of fact, development practices regressed during the 1990's, as evidenced by the decline of such tools as program generators, 4GL's, report writers, repositories, and methodologies. Project Management (PM) tools are still selling well, but they only address the symptoms of deeper, inherent development problems; as a measuring device, PM tools do not address the root problems of development, they only indicate that problems exist.
Perhaps the biggest innovation during the past decade in this area was Sun Microsystems' Java programming language, with its noble concept of "write once, run anywhere." Although a slew of vendors embraced this concept, Microsoft fought it because Java essentially represents a threat to the company's domination of operating systems and associated office applications. If you want to own all the marbles in a game, you have to control the rules of the game, which Microsoft has successfully done by resisting industry standards. Will Java succeed? Possibly, depending on how the Justice Department breaks up Microsoft (more on this later).
Don't get me wrong; I think Java is the right thing to do, I just happen to know we live in an industry that strongly resists any form of standardization, particularly when the government gets involved. Consider how successful the industry was in standardizing on COBOL, Unix, or even the Repository model. I believe standards should be controlled by anyone but the government. Although government supporters would argue they represent a fair and impartial body, I contend they set up a nightmarish bureaucracy that seriously impedes progress.
One of the main reasons development practices degraded during the 1990's was the preoccupation with the Year 2000 problem (Y2K). Companies have been waiting to clear this hurdle, which will take a couple more years to clean up, before going forward with new applications. Corporate developers have essentially been fighting fires in preparation for Y2K; once it is behind them, they will be free to take on some meaningful enhancements to their systems. The problem, though, will be one of people. After companies survive Y2K, they will begin to unwisely force older workers into early retirement or pink slip them altogether, with the thinking that they have served their purpose and must now be pushed aside. When the old guard leaves, a lot of valuable systems knowledge will go out the door with them, and their younger replacements will be at a total loss as to how to develop major systems. You simply can't turn a carpenter into an architect overnight. Yet this is what many companies will be forced to do. Consequently, companies will face a tough choice: either re-train and re-engineer the staff, or outsource the entire operation to a third party vendor like IBM, EDS, or one of the CPA firms. Either way, outside vendors will thrive in the next decade. For example, training companies will be put to work, consultants will flourish again (primarily staffed by the departed workers), and the big outsourcing firms will make more money than they ever dreamed possible.
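For readers who have not lived through the remediation work, the heart of Y2K is simple arithmetic on two-digit years. The sketch below (in Java purely for illustration; the class and method names are hypothetical) shows the classic bug and the common "windowing" repair, where a pivot year decides which century a two-digit year belongs to:

```java
// A minimal sketch of the Y2K two-digit-year problem and one common fix.
public class Y2KDemo {
    // Many legacy records stored only the last two digits of the year
    // to save storage, so date arithmetic breaks at the century rollover.
    static int yearsBetween(int twoDigitStart, int twoDigitEnd) {
        return twoDigitEnd - twoDigitStart; // 99 -> 00 yields -99, not 1
    }

    // "Windowing" remediation: a pivot (here 50, an arbitrary choice)
    // maps two-digit years into a fixed 100-year window, 1950-2049.
    static int expandYear(int twoDigit) {
        return (twoDigit >= 50) ? 1900 + twoDigit : 2000 + twoDigit;
    }

    public static void main(String[] args) {
        System.out.println(yearsBetween(99, 0));            // broken arithmetic
        System.out.println(expandYear(0) - expandYear(99)); // windowed: correct
    }
}
```

Windowing buys time without re-keying the stored data, which is why it was so widely used; it merely defers the problem to the end of the chosen window.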
Data Base
As I predicted ten years ago, IBM and Oracle were the big winners in the data base derby, and will remain there throughout the coming decade. Both have stood the test of time and I do not believe anyone can unseat them unless they do something incredibly foolish.
During the 1990's, "Data Mining" became the rage due to the absence of any true Data Resource Management, another area in sore decline. Instead of managing data as a reusable resource, companies permitted data bases to get out of control and allowed data redundancy to flourish, with the result that users still question the cleanliness of their data. To compensate, they have tried many new Data Mining tools and techniques that can best be described as "hokey." Like systems development, data base is another area whose problems can be solved with common-sense management techniques.
Computing
Mainframes and mid-range computers will not undergo any major changes during the next decade. PC's, however, are another story. 64-bit chips are just around the corner, with 128-bit processing not too far behind, probably by the end of the next decade. Further, disk storage is reaching new heights. Between powerful chips, advanced disk storage, and enhanced memory, PC's will be far more powerful than most people ever imagined, and the cost of computing will fall radically. When this happens, mainframes will indeed begin to disappear from the workplace, replaced, as Milt predicted, by a true nervous system of computers.
The only problem with expanding PC capabilities is having operating systems keep up with the pace. As you may recall, the 32-bit power of the 386 wasn't fully realized until OS/2 began to catch on in the early 1990's. Both Microsoft and Apple were slow to react. Even if a 64-bit chip were introduced today, it would take the industry about five years to figure out how to properly implement it. IBM would have had the best shot at implementing it based on their research with OS/2, but since they dismembered the Personal Software Products (PSP) division and turned the reins over to Microsoft, that opportunity has disappeared.
Like it or not, Microsoft will still be the key to 64-bit computing, but a lot depends on how the government carves up the company. As this publication goes to press, U.S. District Judge Thomas Penfield Jackson's findings of fact have just been released, stating basically that Microsoft is, in fact, a monopoly (anyone surprised?). Microsoft is left with two options: either make some serious concessions and settle, or appeal the judge's decision. Knowing the arrogance of Gates & Company, I'm betting that they'll appeal the decision all the way to the Supreme Court. This will probably take until 2004 before the case is concluded and Gates has had time to double his net worth. The government will break up Microsoft into smaller pieces, thereby infuriating Gates, who will simply walk away from the company to pursue other interests; e.g., Czar of Washington and British Columbia, or real-life James Bond villain. Seriously, Gates has been expanding his interests and portfolio for some time now; e.g., satellite communications, artwork, golf, etc. Whether Microsoft sinks or swims by the middle of the next decade will be inconsequential to him.
While Microsoft goes through this turmoil, development of 64-bit computing will stagnate. Eventually, Microsoft (or whatever it will then be called) will introduce a 64-bit operating system, which will take until the end of the decade to gain widespread acceptance. If ever there was a time for a company to enter the field and wrest control of desktop operating systems away from Microsoft, it is now, as Microsoft begins its downward spiral. Unfortunately, I don't believe IBM is smart enough to seize the moment. Sun Microsystems, on the other hand, may be up to the challenge and provide a 64-bit Java based solution.
Linux will continue to make inroads in the file-server area but will not take off until someone contributes a viable graphical user interface like the OS/2 Workplace Shell. Perhaps someone at IBM could find it in their heart to donate it to the Linux effort, since IBM is rapidly embracing Linux in its products (but I wouldn't hold my breath waiting for them). Linux will also struggle with the migration to 64-bit computing. Look for Linux and OS/2 to remain only in niche markets for the coming decade.
Personal Devices
A few years ago, IBM's Lou Gerstner talked about putting a computer in a shoe, thereby providing a convenient means to capture personal data and communicate. I don't know about the shoe, but Lou was basically on the mark. Personal hand-held devices will become the norm during the next decade. We've had plenty of experiments with palm computing over the last decade, but now we're going to see a rash of hand-held devices that merge communications (cell phone, fax, and Internet) and PIM tools (calendars, notepads, etc.). As features are enhanced (including ease of use) and costs lowered, everyone from executives and professionals to the average worker and soccer moms will be using them to communicate and record their notes. Although Windows CE will initially be used in these devices, look for Java to make some significant inroads in this area.
This brings up a point: communications will remain the hot item for the next decade. In a way, it kind of reminds me of the movie "The President's Analyst," starring James Coburn and Godfrey Cambridge (late 1960's). In the movie, James Coburn inadvertently uncovers a conspiracy by the phone company to implant communication devices in the heads of all the citizens, thereby simplifying communications between everyone. Although the movie was a satire, the concept may very well come to fruition with the sleek personal devices being planned. There is only one problem with the proliferation of such communication devices: bandwidth. If the personal devices I described become popular, we will definitely run into roadblocks on the Information Superhighway. Already, area codes are multiplying to accommodate increased cell phone traffic. Further, the Internet is coughing up blood as it struggles to meet current demand. Now imagine quadrupling communication demands over the next five years, and you start to see that trouble is surely brewing. We will hear the usual baloney on the campaign trail that government must get involved to straighten out the mess, but this will never happen. Enter Bill Gates, who will be ready and waiting to start his second career. After he walks away from Microsoft, he will start a totally new business to offer an alternative to today's communications infrastructure, one which will become even bigger than Microsoft as we know it today.
Epilogue
So here we are at the end of the 20th century, just one year away from 2001: A Space Odyssey. According to the movie, we were to have commercial flights to the moon, space stations, picture-phones, and a lot of other technology that simply hasn't materialized yet. Like most science-fiction producers, Stanley Kubrick and Arthur C. Clarke overestimated the timetable of new technology by overlooking the impact of social, political, and economic changes. We have only been actively involved with computing for about 45 years and, as far as I'm concerned, we still have a long way to go before we are truly productive as a result of its use. Sure, we have introduced several useful innovations that have made our lives better, but until computing is made intuitive to the human being, we will never realize its full potential. The next ten years will bring some exciting new innovations, primarily in the area of communications, but we are still a long way from commercial flights to the moon, space stations, and picture-phones.
Keep the Faith!
Copyright © M&JB 1999