From PCs to Artificial Intelligence and the Future
A review of IT history in simple language, intended for both IT and non-IT readers interested in the course of events, the impact that the third and fourth industrial revolutions have had on society, and a glimpse of what is coming in the future.
A personal computer (PC) is a multi-purpose microcomputer whose size, capabilities, and price make it feasible for individual use. Personal computers are intended to be operated directly by an end user, rather than by a computer expert or technician. Unlike large minicomputers and mainframes, personal computers are not time-shared by many people at the same time.
Since the introduction of personal computers, the shape of the world has changed in many ways. The main driver of this change was that computer technology came within reach of every individual: people in any profession, with no special expertise or skills, became acquainted with computers and information technology and could appreciate the contribution this technology could offer to their own work. This widespread familiarity became very useful as communications technology enabled computers of various sizes to interwork in local and wide area networks, making computer power, data, and programs available in many flexible ways. Personal computers could work as standalone systems, or as intelligent workstations or terminals linked to mainframes or complex networks.
Hence an increasingly important set of uses for personal computers relied on the ability of the computer to communicate with other computer systems, allowing interchange of information.
Experimental public access to a shared mainframe computer system was demonstrated as early as 1973, in the USA, in what was called the “Community Memory project”, the first public computerized bulletin board system operating online. Individuals could now place messages in the computer and then look through the memory for a specific notice.
Commercial network (Internet) service providers emerged in the late 1980s, giving the public access to the rapidly growing network.
In 1991, the World Wide Web was made available for public use. The combination of powerful personal computers with high-resolution graphics and sound, the infrastructure provided by the Internet, and the standardization of access methods through Web browsers established the foundation for a significant fraction of modern life, from airline timetables through unlimited distribution of free videos to online user-edited encyclopedias. This began to show the direction the PC market was taking.
As for the history of personal computer manufacture, it was made possible by major advances in semiconductor technology since 1959: the silicon integrated circuit (IC) chip and the metal-oxide-semiconductor (MOS) transistor.
Major players and milestones
The MOS integrated circuit was commercialized by the Radio Corporation of America (RCA) in 1964; this technology later made possible the first single-chip microprocessor, the Intel 4004, developed by Intel in 1971.
The first microcomputers, based on microprocessors, were developed in the early 1970s. Widespread commercial availability of microprocessors from the mid-1970s onwards made computers cheap enough for small businesses and individuals to own, which made the production of personal computers possible.
1974 saw the introduction of what many consider to be the first true «personal computer», the Altair 8800. Based on an Intel microprocessor, the Altair was widely recognized as the spark that ignited the microcomputer revolution and the first commercially successful personal computer; the first programming language for the machine was Microsoft’s founding product, Altair BASIC.
In 1976, Steve Jobs began selling the Apple I computer and received his first purchase order, for 50 Apple I computers.
The first successfully mass-marketed personal computer, announced in January 1977, was the Commodore PET.
Yet the real impact on the PC market was made by IBM, which responded to the success of the Apple II with the IBM PC, released in August 1981.
How IBM clones dominated the PC market
Because the IBM PC was based on relatively standard integrated circuits and its basic design was not patented, the key proprietary portion of the hardware, the BIOS software, was quickly reverse engineered, and that opened the floodgates to a market of IBM PC imitators. IBM had decided to enter the personal computer market in response to Apple’s early success; as the giant of the computer industry, IBM was expected to crush Apple’s market share. But because of the shortcuts IBM took to enter the market quickly, it ended up releasing a product that was easily copied by other manufacturers using off-the-shelf, non-proprietary parts. So, in the long run, IBM’s biggest role in the evolution of the personal computer was to establish a de facto standard for hardware architecture among a wide range of manufacturers. IBM’s pricing was undercut to the point where IBM was no longer a significant force in development, leaving only the PC standard it had established. The player that emerged dominant from this battle among hardware manufacturers fighting for market share was the software company Microsoft, which provided the operating system and utilities to all PCs across the board, whether authentic IBM machines or PC clones.
This became the golden opportunity for Microsoft, the company created by Gates and Allen in 1975.
In 2004, IBM announced the proposed sale of its PC business to the Chinese computer maker Lenovo Group, which is partially owned by the Chinese government. As a result of the purchase, Lenovo inherited a product line that featured the ThinkPad, a line of laptops that had been one of IBM’s most successful products.
Many more IBM PC clones were manufactured in China and Taiwan; the reason was lower cost. I had the personal experience of traveling to Taipei to see the factory of an IBM clone manufactured by ‘Copam SA’, which was imported and marketed in Greece by the ICL distributor. I was really shocked by the difficult conditions under which Copam engineers and technicians were working there: four qualified senior computer designers shared a single desk with no air-conditioning, while the manufacturing equipment was fully protected, operating in a fully air-conditioned and humidity-controlled environment.
No question why production of integrated circuits and PCs moved from the US and Europe to China and Taiwan.
By 2011, China had surpassed the US in PC shipments, shipping 18.5 million units in a single quarter. This trend reflects the rise of emerging markets as well as the relative stagnation of mature regions.
In 1994, Apple introduced the Power Macintosh series of high-end professional desktop computers for desktop publishing and graphic designers. During the 1990s, the Macintosh retained a low market share but remained the primary choice for creative professionals, particularly those in the graphics and publishing industries.
In 2002, HP purchased Compaq. With this strategy, HP became a major player in desktops, laptops, and servers for many different markets. The buyout made HP the world’s largest manufacturer of personal computers, until Dell later surpassed it.
As of June 2008, the number of personal computers in use worldwide hit one billion, and a second billion was reached by 2014. Mature markets like the United States, Western Europe, and Japan accounted for 58% of the worldwide installed PCs.
Independently of the companies involved in their manufacture and distribution, personal computers generated a technological revolution which, together with significant improvements in communications and microcircuits, affected global economies and social behavior and opened new avenues in science.
Has this revolution of PCs, microcomputers, and PC networks managed to replace or diminish the previous mainframe dominance?
In spite of the technological evolution achieved with the wide use of PCs and computer networks, which brought the general public and small to medium-sized businesses closer to IT, mainframes retained their importance due to open system standards, the evolution of telecommunications, and modern technological tools which we will analyze later, mainly around the concepts of Big Data, artificial intelligence, cloud computing, etc.
Over the past seven decades, compute power, storage, and networking have seen alternating waves of centralization and decentralization, where each wave forced the adoption of disruptive technologies.
With each wave, analysts and industry observers have forecast the death of the mainframe. Yet the venerable mainframe has prevailed. In a Q4 2020 update on mainframe usage, IBM shared the following statistics on mainframe adoption worldwide:
- 67 of the Fortune 100;
- 45 of the top 50 Banks;
- 8 of the top 10 Insurers;
- 8 of the top 10 Telcos;
- 7 of the top 10 Retailers;
- 4 of the top 5 Airlines;
use the mainframe.
PC Operating systems
In 1980, IBM asked Microsoft to produce the operating system for its first personal computer, the IBM PC. Microsoft purchased an operating system from another company, modified it, and renamed it MS-DOS (Microsoft Disk Operating System). MS-DOS was released with the IBM PC in 1981.
Thereafter, most manufacturers of personal computers licensed MS-DOS as their operating system, generating vast revenues for Microsoft; by the early 1990s it had sold more than 100 million copies of the program and had defeated rival operating systems.
Common contemporary desktop operating systems include Microsoft Windows, macOS, Linux, Solaris, and FreeBSD. With the exception of Microsoft Windows, the design of each of these operating systems was inspired by, or directly inherited from, the UNIX operating system. Microsoft managed to establish a de facto standard of its own.
The introduction of the “cloud computing” environment obliged all operating system vendors to produce cloud-adapted versions for both PCs and PC networks as well as for mainframes.
Many of these organizations work with public cloud vendors such as Amazon Web Services, Microsoft Azure, Google Cloud, and Alibaba Cloud while keeping their intense workloads on the mainframe for both cost and security reasons.
In fact, many industry leaders value such a hybrid approach, combining cloud and mainframe, as a more trusted, efficient architecture.
Communication evolution and its effect on IT applications now and in the future
As mentioned above, applied IT has reached revolutionary levels due to the parallel evolution of multiple vertical areas, among which wireless telecommunications have played a critical and fundamental role.
Increasing communication speeds have allowed computer networks to evolve from 1G, the first-generation analogue data transmission technology at 56.64 kbps, to 5G, the fifth-generation digital data transmission technology at a 5 Gbps transmission rate.
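To give a feel for what this jump in transmission rates means in practice, here is a minimal back-of-the-envelope sketch in Python (the 1 GB file size and the idealized, overhead-free rates are assumptions for illustration, using the figures quoted above):

```python
# Rough comparison of transfer times at the line rates quoted above.
# Assumes ideal conditions: no protocol overhead, no contention.

def transfer_time_seconds(size_bytes: float, rate_bits_per_second: float) -> float:
    """Time to move size_bytes across a link at the given bit rate."""
    return (size_bytes * 8) / rate_bits_per_second

ONE_GB = 1_000_000_000  # 1 GB in bytes (decimal)

rates = {
    "1G-era data link (56.64 kbps)": 56_640,
    "4G (100 Mbps, mobile user)": 100_000_000,
    "5G (5 Gbps)": 5_000_000_000,
}

for label, rate in rates.items():
    t = transfer_time_seconds(ONE_GB, rate)
    print(f"{label}: {t:,.1f} s (~{t / 3600:.2f} h)")

# Roughly: ~39 hours at 56.64 kbps, ~80 seconds at 100 Mbps, ~1.6 seconds at 5 Gbps.
```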
The digital revolution really started with second-generation wireless telephony technology (2G), which refers to the telecom network technologies launched on the Global System for Mobile Communications (GSM) standard in 1991.
The improved data rates of 3G systems opened the door to applications like mobile TV, video on demand, video conferencing, telemedicine, and location-based services. High data rates also allowed users to browse the Web using their cell phones and consequently gave birth to the term mobile broadband.
Global adoption of 3G only really gained traction around 2007. It is no coincidence that the introduction of the iPhone in 2007 came at a time when 3G was gaining wide acceptance.
It is also no surprise that mainframe-based video-on-demand systems failed to be marketed successfully, although they were successfully demonstrated in cable TV implementations. I can refer to the very ambitious launch of ICL’s “Goldrush” system, demonstrated to the Greek Antenna TV company but never delivered due to the lack of telecommunication lines with adequate bandwidth. It would have been a great success at the time, but a total financial failure later, as video on demand was implemented very soon after using different, lower-cost equipment running over the Internet. A speed above 50 Mbps is considered excellent for video downloading, even for multiple users.
The 4G generation provided high data rates: 1 Gbps for stationary users and 100 Mbps for high-mobility users. 4G networks were designed to accommodate a far larger volume of cellular devices and more data-heavy Internet activities like streaming high-definition video. Work on 4G had started years earlier, but 4G networks did not become widely available to the public until 2009.
The 5G generation, at 5 Gbps, is considered the real revolution in mobile communications and wireless telecommunications, providing benefits to five key industries: manufacturing, healthcare, retail, transportation, and agriculture. With this generation we have reached the stage of real-time communication between machines and computers, bringing automation to the next level: we can effectively control and further evolve robotics that can perform medical operations, instant responses for automated driving, fuller exploitation of artificial intelligence, holographic presentation of images, instant uploading and downloading of Big Data for immediate statistical analysis, real-time simulation and prediction modeling for commercial and other purposes, machine self-learning, and more.
All these examples can only act as an indication of the level of human imagination and potential to which combinations of such new products and technologies can lead in the future.
We are already experiencing dramatic changes: cloud computing in the provision of commercial applications and services, changes in traffic systems through driverless vehicles, boats, and airplanes, the spreading of knowledge all around the world, the elimination of distances, even the optimization of energy consumption, and the introduction of virtual-reality entertainment.
Computer networks appeared in the IT market very early, driven by the need to interconnect groups of computers, servers, mainframes, network devices, peripherals, or other devices to allow data sharing, either locally (LAN) or over telecommunication lines (WAN). The largest WAN is the Internet, which connects millions of people all over the world. The Internet has changed the image of the world more than any other technology, as it brought people closer to information sharing and closer to each other.
The Internet was initially developed to aid the progress of computing technology by linking academic computer centers. The Internet we use today started being developed in the late 1960s with the ARPANET (Advanced Research Projects Agency Network), which transmitted its first message in 1969. ARPANET was a wide area network linking many universities and research centers; it was among the first networks to use packet switching to speed up data transmission, and it was the beginning of what we consider the Internet today. In 1993 the Internet experienced one of its largest periods of growth, and today it is accessible by people all over the world, providing an endless supply of knowledge and entertainment.
For local area networking we can refer to Novell NetWare.
Novell Inc. was a global software leader that had been managing and securing work environments and making people more productive since 1979. Novell NetWare, first introduced in 1983, was the first local area network operating system. NetWare was based on file-server technology, running on both Ethernet and IBM Token Ring networks, and could be used with various desktop operating systems such as Microsoft Windows, DOS, IBM OS/2, and UNIX.
Cloud computing and associated solutions provide access, through the web, to computing resources and products, including development tools, business applications, compute services, vast data storage, and networking solutions. These cloud services are hosted at a software vendor’s data center and managed by the cloud services provider, or onsite at a customer’s data center.
Cloud computing is freeing most IT users, whether end-user customers or IT service providers, from cumbersome administration and unnecessary resource expenditure, letting them concentrate on the real issues of their activity.
Software as a service (SaaS) is a cloud-based software delivery model in which the cloud provider hosts the customer’s applications at the cloud provider’s location. The customer accesses those applications over the internet. Rather than paying for and maintaining their own computing infrastructure, SaaS customers subscribe to the service on a pay-as-you-go basis.
There are three types of clouds: public, private, and hybrid. Each type requires a different level of management from the customer and provides a different level of security.
New areas in information processing, currently evolving, that indicate the way to the future
We have touched above on some of the products and technologies which have already been applied to a certain extent and are going through an intermediate stage; this gives us confidence that they will expand dramatically in the near future.
- Big Data
The development of computers and the internet has made it possible to collect and store information. Data has always existed; it was not discovered by computer science. Even before computers and databases were created, data was stored on paper. But what has increased enormously is the volume of information.
Today we store as much data every two days as humanity produced from its appearance on earth until 2000 AD, and the volume of information continues to grow. By 2020, the volume of our digital data was estimated to have reached about 50 zettabytes.
«Big Data» is based on the premise that the more information we have about something, the more accurate the predictions we can make about the future.
The term Big Data describes the large amounts of data, structured or unstructured, that flood a business on a daily basis; data so large or complex that it cannot be processed or interrelated using traditional methods. Thus, what matters is not so much the sheer size of the data, but the actions organizations take to handle it. «Big Data» can be analyzed to draw conclusions that lead to better decisions and strategic business moves.
Cutting-edge technologies such as artificial intelligence and machine learning are often used so that the vast amount of «Big Data» can be «arranged» and utilized. In other words, we teach computers how to categorize all kinds of information based on its nature: text, image, sound, etc. Machines can then detect «key» relations among various pieces of information faster, more efficiently, and more accurately than humans. This is the additional value that technologies such as Enterprise Content Management (ECM) have brought to information processing, since the form of data ECM preserves and manages goes beyond the form of data ERPs maintain and manage.
Big Data can be of great use if it is processed by tools that identify patterns by correlating different pieces of information to generate models of how systems function in various areas.
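As a toy illustration of what “correlating different pieces of information to identify patterns” can mean in code, the following minimal Python sketch (using the pandas library; the column names and figures are invented for the example) computes a correlation matrix and turns one detected relation into a naive prediction. Real Big Data pipelines do the same at vastly larger scale with distributed tools:

```python
# Toy pattern finding by correlation; all data invented for illustration.
import pandas as pd

data = pd.DataFrame({
    "temperature":     [18, 22, 25, 29, 31, 34],        # daily high, in Celsius
    "ice_cream_sales": [120, 150, 180, 240, 260, 300],  # units sold
    "umbrella_sales":  [40, 35, 30, 20, 18, 15],        # units sold
})

# Pairwise correlations: values near +1 or -1 suggest a strong relation.
print(data.corr().round(2))

# A simple linear fit turns the detected pattern into a prediction.
slope = data["ice_cream_sales"].cov(data["temperature"]) / data["temperature"].var()
intercept = data["ice_cream_sales"].mean() - slope * data["temperature"].mean()
print(f"Expected ice cream sales at 27 degrees: {slope * 27 + intercept:.0f} units")
```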
In 2013, IBM recognized data as the next great natural resource. Shortly after that, IBM management noted that companies that make billion-dollar decisions based on gut instinct instead of deriving insights from predictive modeling of data are essentially setting themselves up for failure.
Examples of some areas where big data is being used today or can be used in the near future
• Improvement in health.
The collection and analysis of information regarding the treatment of patients with «Big Data» methods helps us rapidly improve medical preparations, discover new ones, and find treatments in a much shorter time compared to traditional methods.
• Prediction of natural or non-natural disasters.
Predictions can be made with increasing accuracy about where and when a strong earthquake will occur, for example, or about exactly where survivors should be sought after a natural disaster or in war zones.
• Fight against crime.
Police forces are increasingly turning to «Big Data» methods to fight crime, thus minimizing physical and human losses. Comparisons with the history of crimes around the world help to identify similarities.
• Investigation of the history of legal cases
Lawyers spend thousands of hours investigating court cases so that they can either predict results or argue their cases on the basis of previous court decisions.
• Statistical analysis to predict retail behavior
Statistical analysis can be used to predict market penetration for products and product categories.
• Business decision making, using “predictive analysis” and “modeling”
Business decision making can be improved if it is based on statistical analysis of similar previous events; this requires new techniques and methods such as ‘predictive modeling’.
Predictive modeling is a mathematical process used to predict future events or outcomes by analyzing patterns in a given set of input data extracted from previous events held in databases.
Data analytics, which uses current and historical data to forecast activity, behavior, and trends, is a crucial component of “predictive analytics”.
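As a minimal sketch of what predictive modeling looks like in code, the example below (assuming Python with the scikit-learn library; the monthly sales figures are invented) fits a simple linear model to past observations and forecasts the next period. Real predictive models are far richer, but the principle of learning a pattern from previous events and projecting it forward is the same:

```python
# Minimal predictive-modeling sketch: learn a trend from past events,
# then forecast the next period. All figures invented for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

months = np.arange(1, 13).reshape(-1, 1)          # months 1..12
sales = np.array([100, 104, 110, 108, 115, 121,   # units sold per month
                  125, 131, 129, 138, 142, 147])

model = LinearRegression().fit(months, sales)     # learn the historical pattern
forecast = model.predict([[13]])                  # project it onto month 13
print(f"Forecast for month 13: {forecast[0]:.0f} units")
```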
- Artificial intelligence (AI)
Unless the reader of this article is already aware of, or even involved with, this technology, the basic concept of AI can be quite difficult to understand, so I will try to simplify it as much as possible.
It is easy to understand that real expertise in a human being comes from knowledge and experience. For example, if you have to travel on a Sunday to a beach you have already visited, you know that the traffic will be very heavy, so you can decide whether you want to travel or not, and you can also predict the time you will arrive at your destination.
This is a good example to start understanding one of the fundamentals of computer intelligence.
Knowledge and patterns are the basis on which a computer absorbs experience so that it can help take a decision. The more data and the more modeling capability the computer system acquires, the more accurate its predictions will be.
Artificial Intelligence for IT Operations (AIOps) combines big data and machine learning to automate IT operations processes, including event correlation, anomaly detection and causality determination. Hence a new terminology is developing to describe the tools used and processes adopted to implement Artificial Intelligence projects.
Such projects require data collection, usually requiring access to “Big Data”. Data analytics is the first step towards turning data into knowledge.
First, analytics is used to identify trends and patterns. Second, “predictive analytics” is applied to generate computational models that represent the knowledge behind the relevant data and address a particular problem under investigation within the industry being analyzed. Once we have achieved that, we can create decision-support systems that help improve the process by providing the end user with realistic recommendations.
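To make the AIOps notion of anomaly detection concrete, here is a deliberately simple sketch in plain Python (no particular AIOps product is implied, and the response-time figures are invented): it flags measurements that deviate strongly from a recent baseline. Production systems use far more sophisticated, self-tuning models, but the underlying idea of learning what “normal” looks like and alerting on deviations is the same:

```python
# Toy anomaly detector: flag samples far outside the recent baseline.
from statistics import mean, stdev

response_times_ms = [102, 98, 105, 99, 101, 97, 103, 100, 350, 104]  # invented metrics

baseline = response_times_ms[:8]          # history considered "normal"
mu, sigma = mean(baseline), stdev(baseline)

for i, value in enumerate(response_times_ms):
    z = (value - mu) / sigma              # distance from normal, in standard deviations
    if abs(z) > 3:
        print(f"sample {i}: {value} ms looks anomalous (z = {z:.1f})")
```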
Each of the tools used, such as analytics, modeling, and predictive analytics, would need further description to fully understand the way AI is used to predict future events and trends so that correct business decisions can be taken. In addition, we need to understand how self-learning machines operate before we can appreciate AI’s evolution and its limits for the future.
It is important, at this stage, to stress the role of computing power, data availability, network bandwidth, the contribution of cloud computing, the speed of data transfer over the Internet, the storage capacity available for Big Data, and the logic behind modeling and predictive analytics, as they all contribute to the implementation of a successful AI project.
Gartner said some time ago that “AI technologies will be in almost every new software product by 2020”, and this has already proved true. It also predicted that, by 2023, at least 85% of governments without a total experience strategy for applying AI will fail to successfully transform government services.
There is no question that the socio-economic impact on our society has been enormous, maybe bigger than originally anticipated. Some of these changes have not only been gradually affecting social behavior but have even been changing human characteristics. For example, new generations have changed the way they use their brains and their imagination.
When people read a book or listen to music, they process information in a way that generates pictures and colors in the mind; when people are accustomed to seeing ready-made pictures and colors, they absorb information but use their imagination to a lesser degree. This is a fundamental difference in the way the human brain works.
A brain accustomed to reading books and listening to music with no visual help acquires the creative power to generate images and scenes through imagination. On the other hand, people accustomed to using computers have lost some of that power to create images in their imagination, but have adapted to respond to images, pictures, or videos with more flexibility and speed, and hence absorb higher volumes of information. It seems like a race with computer speeds. It is surprising how fast the human brain is trained to absorb information at a higher rate when it interacts with the computer, losing contact with the real environment; the computer process becomes something like a virtual reality. This affects social behavior, reducing direct human communication and replacing direct interaction with social contacts via the Internet.
Apart from that, the higher rate of absorbing information and the increased availability of information have increased human knowledge and competence.
The economic impact of the digital revolution has been wide-ranging. Without the World Wide Web, for example, globalization and outsourcing would not be nearly as feasible as they are today. The digital revolution radically changed the way individuals and companies interact. Small regional companies were suddenly given access to much larger markets. Concepts such as on-demand software services and new manufacturing processes have rapidly driven down costs and made it possible to introduce innovations in all aspects of industry and everyday life.
Negative effects were also created, such as information overload, Internet-driven forms of social isolation, and media saturation. Journalists say that the Internet is hurting journalism more than it is helping, by allowing anyone to be involved, no matter how amateur and unskilled, causing information to become confusing and unreliable; hence the rise of conspiracy theories in a way that did not exist in the past.
Social networking became a double-edged sword; it allowed social groups sharing common interests to be formed, but at the same time it gave birth to new forms of terrorism, authoritarian political power, and fake news to mislead the masses. Systems such as Facebook and Twitter influenced social trends in politics and even in business. We have only to note the recent purchase of Twitter by Elon Musk at the unthinkable price of 44 billion dollars, an event that raises new political questions as to the control that an individual may have over a public social network.
Productivity has certainly increased and unemployment has been reduced, in spite of the well-known “productivity paradox”, which identified reduced productivity figures for certain periods when computer penetration and investment were increasing at very high rates in the USA and the rest of the developed world.
This effect could not be fully explained except with simplified statements attributing it to the latency involved in learning and adapting to the new processes introduced in society.
In any case, this paradox has been overcome, and productivity is now improving at a very high rate.
The first impression of improved productivity comes from automation generated by robotics and construction with 3D printers, yet the real surprise comes from the areas in which artificial intelligence is being applied.
“Prediction” of future events and market behavior based on “Big Data” and “pattern identification” is already helping top management and governments take strategic business and political decisions, to such a degree that any decision not based on such practices is bound to fail from now on.
This is the real improvement that will affect future productivity in the new world that is rising. It will impress more than any glamorous new development achieved by automation, such as self-driving cars, self-driving drones, trains, and airplanes, a trend which is already happening.
We can only hope that artificial intelligence will not supersede human intelligence and will remain a complementary competence, supportive of humanity.