A journey into the history of data processing, and the impact it had on Greece Part B

From PC’s to Artificial Intelligence and the future

A review of IT history in simple language, to be understood by both IT and non-IT specialists interested in the course of events and the impact that the third and fourth industrial revolutions had on society, with a glimpse of what is coming in the future.

PERSONAL COMPUTERS

A personal computer (PC) is a multi-purpose microcomputer whose size, capabilities, and price make it feasible for individual use. Personal computers are intended to be operated directly by an end user, rather than by a computer expert or technician. Unlike large minicomputers and mainframes, personal computers are not time-shared by many people at the same time.

Major impact

Since the introduction of personal computers, the shape of the world has changed in many ways. The main driver of this change was that computer technology came closer to every individual, so that people in any profession, with no special expertise or skills, became acquainted with computers and information technology and could appreciate the contribution this technology could offer to their own profession. This widespread familiarity became very useful as technology expanded into areas where communication enabled computers of various sizes to interwork in local and wide area networks, utilizing computer power, data, and programs in many flexible ways. Personal computers could work as standalone systems or as intelligent workstations or terminals linked to mainframes or complex networks.

Hence an increasingly important set of uses for personal computers relied on the ability of the computer to communicate with other computer systems, allowing interchange of information.

Experimental public access to a shared mainframe computer system was demonstrated as early as 1973, in the USA, in what was called the "Community Memory project", the first public computerized bulletin board system operating online. Individuals could now place messages in the computer and then look through the memory for a specific notice.

Commercial Internet service providers emerged in the late 1980s, giving the public access to the rapidly growing network.

In 1991, the World Wide Web was made available for public use. The combination of powerful personal computers with high-resolution graphics and sound, the infrastructure provided by the Internet, and the standardization of access methods in Web browsers established the foundation for a significant fraction of modern life, from airline timetables through unlimited distribution of free videos to online user-edited encyclopedias. This started showing the direction the PC market was taking.

As for the history of personal computer manufacture, it was made possible by major advances in semiconductor technology since 1959: the silicon integrated circuit (IC) chip and the metal-oxide-semiconductor (MOS) transistor.

Major players and milestones

The MOS integrated circuit was commercialized by the Radio Corporation of America (RCA) in 1964; MOS technology later enabled Intel to develop the first single-chip microprocessor, the Intel 4004, in 1971.

The first microcomputers based on microprocessors were developed by the early 1970s. Widespread commercial availability of microprocessors, from the mid-1970s onwards, made computers cheap enough for small businesses and individuals to own. This made the production of personal computers possible.

1974 saw the introduction of what is considered, by many, to be the first true «personal computer», the Altair. Based on an Intel microprocessor, the Altair was widely recognized as the spark that ignited the microcomputer revolution and was the first commercially successful personal computer. The first programming language for the machine was Microsoft's founding product, Altair BASIC.

In 1976, Steve Jobs began selling the Apple I computer and received his first purchase order, for 50 Apple I machines.

The first successfully mass-marketed personal computer, announced in January 1977, was the Commodore PET.

Yet the real impact on the PC market was made by IBM, which responded to the success of the Apple II with the IBM PC, released in August 1981.

How IBM clones dominated the PC market

Because the IBM PC was based on relatively standard integrated circuits and its basic design was not patented, the key proprietary portion of the hardware was actually the BIOS software, which got reverse engineered, and that opened the floodgates to the market for IBM PC imitators. IBM had decided to enter the personal computer market in response to Apple's early success; IBM was the giant of the computer industry and was expected to crush Apple's market share. But because of the shortcuts IBM took to enter the market quickly, it ended up releasing a product that was easily copied by other manufacturers using off-the-shelf, non-proprietary parts. So in the long run, IBM's biggest role in the evolution of the personal computer was to establish the de facto standard for hardware architecture among a wide range of manufacturers. IBM's pricing was undercut to the point where IBM was no longer a significant force in development, leaving only the PC standard it had established. Emerging as the dominant force from this battle among hardware manufacturers fighting for market share was the software company Microsoft, which provided the operating system and utilities to all PCs across the board, whether authentic IBM machines or PC clones.

This became the golden opportunity for Microsoft, a company founded by Gates and Allen in 1975.

In 2004, IBM announced the proposed sale of its PC business to the Chinese computer maker Lenovo Group, which is partially owned by the Chinese government. As a result of the purchase, Lenovo inherited a product line that featured the ThinkPad, a line of laptops that had been one of IBM's most successful products.

Many more IBM PC clones were manufactured in China and Taiwan; the reason was lower cost. I had the personal experience of traveling to Taipei to see the factory of an IBM clone manufactured by 'Copam SA', which was imported and marketed in Greece by the ICL distributor. I was really shocked by the difficult conditions under which Copam engineers and technicians were working there. Four qualified senior computer designers shared a single desk with no air-conditioning, while the manufacturing equipment was fully protected, operating in a fully air-conditioned and humidity-controlled environment.

No wonder production of integrated circuits and PCs moved from the US and Europe to China and Taiwan.

By 2011, China had surpassed the US in PC shipments by 18.5 million units. This trend reflects the rise of emerging markets as well as the relative stagnation of mature regions.

In 1994, Apple introduced the Power Macintosh series of high-end professional desktop computers for desktop publishing and graphic designers. During the 1990s, the Macintosh kept a low market share but remained the primary choice for creative professionals, particularly those in the graphics and publishing industries.

In 2002, HP purchased Compaq. Following this strategy, HP became a major player in desktops, laptops, and servers for many different markets. The buyout made HP the world's largest manufacturer of personal computers, until Dell later surpassed it.

As of June 2008, the number of personal computers in use worldwide hit one billion, while a second billion was reached by 2014. Mature markets like the United States, Western Europe, and Japan accounted for 58% of the worldwide installed PCs.

Independently of the companies involved in manufacturing and distribution, personal computers generated a technological revolution which, together with significant improvements in communications and microcircuits, affected global economies and social behavior and opened new avenues in science.

Has this revolution of PCs, microcomputers, and PC networks managed to replace or diminish the previous mainframe dominance?

In spite of the technological evolution achieved with the wide use of PCs and computer networks, which brought the general public and small to medium-sized businesses closer to IT, mainframes retained their importance due to OPEN system standards, telecommunication evolution, and modern technological tools which we will analyze later, mainly around the concepts of Big Data, artificial intelligence, cloud computing, etc.

Over the past seven decades, compute power, storage, and networking have seen successive waves of centralization and decentralization, where each wave enforced the adoption of disruptive technologies.

As with each wave, analysts and industry observers have forecast the death of the mainframe. Yet the venerable mainframe has prevailed. In a Q4 2020 update on mainframe usage, IBM shared the following statistics on mainframe adoption worldwide:

  • 67 of the Fortune 100;
  • 45 of the top 50 Banks;
  • 8 of the top 10 Insurers;
  • 8 of the top 10 Telcos;
  • 7 of the top 10 Retailers;
  • 4 of the top 5 Airlines;

use the mainframe.

PC Operating systems

In 1980, IBM asked Microsoft to produce the essential operating system for its first personal computer, the IBM PC. Microsoft purchased an operating system from another company, modified it, and renamed it MS-DOS (Microsoft Disk Operating System). MS-DOS was released with the IBM PC in 1981.

Thereafter, most manufacturers of personal computers licensed MS-DOS as their operating system, generating vast revenues for Microsoft; by the early 1990s it had sold more than 100 million copies of the program and defeated rival operating systems.

Common contemporary desktop operating systems include Microsoft Windows, macOS, Linux, Solaris, and FreeBSD. With the exception of Microsoft Windows, the design of each of these operating systems was inspired by, or directly inherited from, the UNIX operating system. Microsoft managed to establish a de facto standard of its own.

The introduction of the "cloud computing" environment obliged all operating systems to produce cloud computing versions adapted for both PCs and PC networks as well as for mainframes.

Many of these organizations work with public cloud vendors such as Amazon Web Services, Microsoft Azure, Google Cloud, and Alibaba Cloud, while keeping their intense workloads on the mainframe for both cost and security reasons.

In fact, many industry leaders value such a hybrid approach of cloud and mainframe as a more trusted, efficient architecture.

Communication evolution and its effect on IT applications now and in the future

As mentioned above, IT "applied technology" has reached revolutionary levels due to the parallel evolution of multiple vertical areas, among which wireless telecommunications have played a critical and fundamental role.

The speed of communications has allowed computer networks to evolve from 1G, the first-generation analogue data transmission technology at 56.6 kbps, to 5G, the fifth-generation digital data transmission technology at a 5 Gbps transmission rate.
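To get a feel for what that jump means, here is a rough back-of-the-envelope sketch of how long a 1 GB file takes to move at each generation's nominal peak rate. The 3G figure is an illustrative assumption, and real-world throughput is always lower than the line rate, so treat the results as order-of-magnitude only.

```python
# Rough download times for a 1 GB file at nominal peak rates.
# Real-world throughput is lower, so this is only an order-of-magnitude sketch.

def transfer_seconds(size_bytes: int, rate_bits_per_sec: float) -> float:
    """Time to move size_bytes at a given line rate, ignoring protocol overhead."""
    return (size_bytes * 8) / rate_bits_per_sec

GIGABYTE = 10**9

rates = {
    "1G analogue modem (56.6 kbps)": 56_600,
    "3G mobile broadband (2 Mbps, illustrative)": 2_000_000,
    "4G stationary peak (1 Gbps)": 1_000_000_000,
    "5G peak (5 Gbps)": 5_000_000_000,
}

for name, rate in rates.items():
    print(f"{name}: {transfer_seconds(GIGABYTE, rate):,.1f} s")
```

At 56.6 kbps the transfer takes over a day and a half; at 5 Gbps it takes under two seconds.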

The digital revolution really started with second-generation wireless telephony technology (2G), which refers to telecom network technologies launched on the Global System for Mobile Communications (GSM) standard in 1991.

The improved data rates of third-generation (3G) systems opened the doors for applications like mobile TV, video-on-demand, video conferencing, tele-medicine, and location-based services. High data rates also allowed users to browse the Web using their cell phones, and consequently gave birth to the term mobile broadband.

Global adoption of 3G only started to really gain traction sometime in 2007. It is no coincidence that the introduction of the iPhone in 2007 came at a time when 3G was gaining wide acceptance.

It is also no surprise why video-on-demand mainframe computer systems failed to be successfully marketed, although they were successfully demonstrated in cable TV implementations. I can refer to the very ambitious launch of the "Gold rush" ICL system, demonstrated to the Greek Antena TV company but never delivered due to the lack of telecommunication lines with adequate bandwidth. It would have been a great success at the time, but a total financial failure later, as video-on-demand was implemented very soon after, using different equipment at lower cost, running over the Internet. A speed over 50 Mbps is considered excellent for video downloading, even for multiple users.

The 4G generation supported high data rates: 1 Gbps for stationary users and 100 Mbps for high-mobility users. 4G networks were designed to accommodate a far larger volume of cellular devices and more data-heavy Internet activities, like streaming high-definition video. Although the technical groundwork for 4G began in the late 1990s, commercial networks didn't become widely available to the public until 2009.

The 5G generation, at 5 Gbps, is considered the real revolution in mobile communications and wireless telecommunications, providing benefits to five key industries: manufacturing, healthcare, retail, transportation, and agriculture. With this generation we have reached the stage of real-time communication between machines and computers, bringing automation to the next level. We can now effectively control and further evolve robotics that can perform medical operations, instant responses for automatic car driving, fuller exploitation of artificial intelligence, holographic presentation of images, instant uploading and downloading of Big Data for instant statistical analysis, real-time simulation and prediction modeling for commercial and other purposes, machine self-learning, etc.

All these examples can only act as an indication of the level of human imagination and potential to which combinations of such new products and technologies can lead in the future.

We are already experiencing dramatic changes brought about by implementing cloud computing in the provision of commercial applications and services, changes in traffic systems through driverless vehicles, boats, and airplanes, the spreading of knowledge all around the world, the elimination of distances, even the optimization of energy consumption, the introduction of virtual reality entertainment, etc.

COMPUTER NETWORKS

Computer networks appeared in the IT market very early, out of the need to interconnect groups of computers, servers, mainframes, network devices, and peripherals for data sharing, either locally (LAN) or over telecommunication lines (WAN). The largest WAN is the Internet, which connects millions of people all over the world. The Internet changed the image of the world more than any other technology, as it brought people closer to information sharing as well as closer to each other.

The Internet was initially developed to aid the progress of computing technology by linking academic computer centers. The Internet we use today started being developed in the late 1960s with the start of ARPANET (Advanced Research Projects Agency Network), which transmitted its first message in 1969. ARPANET was a wide area network linking many universities and research centers; it was the first to use packet switching to speed up data transmission and was the beginning of what we consider the Internet today. In 1993 the Internet experienced one of its largest periods of growth, and today it is accessible by people all over the world, providing an endless supply of knowledge and entertainment.

For local area networking we can refer to Novell NetWare.

Novell Inc. was a global software leader that, from 1979, managed and secured work environments and made people more productive. Novell NetWare, first introduced in 1983, was the first local area network software operating system. NetWare was based on file-server technology, running on both Ethernet and IBM Token Ring networks, and could be used with various desktop operating systems, such as Microsoft Windows, DOS, IBM OS/2, and UNIX.

CLOUD COMPUTING

Cloud computing and associated solutions provide access, through the web, to computing resources and products, including development tools, business applications, computer services, unlimited data storage, and networking solutions. These cloud services are hosted either at a software vendor's data center, managed by the cloud services provider, or onsite at a customer's data center.

Cloud computing is releasing most IT users, whether end-user customers or IT service providers, from cumbersome administration and unnecessary resource expenditure, letting them concentrate on the real issues of their activity.

Software as a service (SaaS) is a cloud-based software delivery model in which the cloud provider hosts the customer's applications at the cloud provider's location. The customer accesses those applications over the internet. Rather than paying for and maintaining their own computing infrastructure, SaaS customers subscribe to the service on a pay-as-you-go basis.
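As a minimal sketch of that pay-as-you-go trade-off, the following toy calculation compares a monthly SaaS subscription against buying and running one's own infrastructure. All the figures are invented purely for illustration.

```python
# Hypothetical break-even comparison between a pay-as-you-go SaaS
# subscription and self-hosted infrastructure (all figures invented).

def months_to_break_even(upfront_cost: float,
                         monthly_self_hosted: float,
                         monthly_saas: float) -> float:
    """Months after which owning infrastructure becomes cheaper than SaaS."""
    if monthly_saas <= monthly_self_hosted:
        return float("inf")  # SaaS never costs more per month: no break-even
    return upfront_cost / (monthly_saas - monthly_self_hosted)

# Example: 60,000 upfront for servers plus 1,000/month to run them,
# versus a 3,500/month SaaS subscription.
print(months_to_break_even(60_000, 1_000, 3_500))  # 24.0 months
```

The point of the sketch is that SaaS trades a large capital outlay for a predictable operating expense, which is exactly why it suits smaller organizations.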

There are three types of clouds: public, private, and hybrid. Each type requires a different level of management from the customer and provides a different level of security.

New areas in information processing, currently evolving, that indicate the way to the future

We have touched above on some of the products and technologies which have already been applied, to a certain extent, and are going through an intermediate stage that can give us confidence that they will expand dramatically in the near future.

  • Big Data

The development of computers and the internet has made it possible to collect and store information. Data has always existed; it was not discovered by computer science. Even before computers and databases were created, data was stored on paper. But what has increased enormously is the volume of information.

Today, every two days we store as much data as humanity did from its appearance on Earth until 2000 AD, and the volume of information continues to grow. By 2020, the volume of our digital data was estimated to have increased to 50 zettabytes.

«Big Data» is based on the doctrine that the more information we have about something, the more accurate predictions we can make about the future.

The term Big Data describes the large amounts of data, either structured or unstructured, that flood a business on a daily basis. This data is so large, or so complex, that it cannot be processed or interrelated using traditional methods. Thus, what matters is not the sheer size of the data, but the actions organizations take to handle it. «Big data» can be analyzed to draw conclusions that lead to better decisions and strategic business moves.

Cutting-edge technologies such as artificial intelligence and machine learning are often used to «arrange» the vast amount of «Big Data» so that it can be utilized. In other words, we teach computers how to categorize all kinds of information based on its nature, image, sound, etc. Machines can then detect «key» relations among various pieces of data faster, more efficiently, and more accurately than humans. This is the additional value that technologies such as Enterprise Content Management (ECM) have brought to information processing, since the form of data ECM preserves and manages supersedes the form of data ERPs maintain and manage.

Big data can be of great use if it is processed by tools that can identify patterns by correlating different pieces of information to generate models of system functioning in various areas.
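One of the simplest such pattern-identifying tools is a correlation coefficient. As an illustrative sketch (the data values are invented), the following computes the Pearson correlation between two series in plain Python:

```python
# A minimal sketch of "correlating different pieces of information":
# the Pearson correlation coefficient between two series.
import math

def pearson(xs, ys):
    """Pearson correlation: +1 = perfect positive, -1 = perfect negative."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented example: daily ad spend vs. daily sales
ad_spend = [100, 200, 300, 400, 500]
sales    = [12, 25, 31, 48, 55]
print(round(pearson(ad_spend, sales), 3))
```

A value close to +1 suggests the two series move together, which is the kind of «key» relation the text describes; real big-data tools apply the same idea at vastly larger scale.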

In 2013, IBM recognized data as the next greatest natural resource.    Shortly after that, IBM management noted that companies that make billion-dollar-decisions based on gut instincts instead of deriving insights from predictive modeling of data are essentially setting themselves up for failure.

Examples of some areas where big data is being used today, or can be used in the near future:

• Improvement in health.

The collection and analysis of information regarding the treatment of patients with «Big Data» methods helps us to rapidly improve medical preparations, to discover new ones, and to find treatments in a much shorter period of time compared to traditional methods.

• Prediction of natural or non-natural disasters.

Predictions can be made about where and when a strong earthquake will occur, for example, with increasing accuracy. Also, about exactly where survivors should be sought after a natural disaster or in war zones.

• Fight against crime.

Police forces are increasingly turning to «Big Data» methods to fight crime, thus minimizing physical and human losses. Crime histories from around the world can be compared to identify similarities.

• Investigation of history of legal cases

Lawyers spend thousands of hours investigating court cases so that they can either predict results or argue their cases on the basis of previous court decisions.

• Statistical analysis to predict retail behavior

Statistical analysis to predict market penetration on products and categories

• Business decision making, using “predictive analysis” and “modeling”

Business decision making can be improved if it is based on statistical analysis of similar previous events; this will require new techniques and methods such as 'predictive modeling'.

Predictive modeling is a mathematical process used to predict future events or outcomes by analyzing patterns in a given set of input data extracted from previous events held in databases.

Data analytics, which uses current and historical data to forecast activity, behavior, and trends, is a crucial component of "predictive analytics".
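A predictive model need not be complicated to illustrate the idea. The following toy sketch fits a least-squares straight line to invented historical sales figures and uses it to forecast the next month:

```python
# A toy "predictive model": fit a straight line to historical
# observations by least squares, then forecast the next value.
# All data points are invented for illustration.

def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line through the points."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Historical monthly sales (month number -> units sold)
months = [1, 2, 3, 4, 5]
units  = [110, 125, 138, 152, 165]

slope, intercept = fit_line(months, units)
forecast_month_6 = slope * 6 + intercept
print(round(forecast_month_6, 1))  # 179.1
```

Real predictive analytics uses far richer models, but the principle is the same: patterns extracted from past events are projected forward.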

  • Artificial intelligence (AI)

Unless the reader of this article is already aware of, or even involved with, this technology, the basic concept of AI is quite difficult to understand, so I will try to simplify it as much as possible.

It is easy to understand that real expertise in a human being comes from knowledge and experience. For example, if you have to travel on a Sunday to a beach you have already visited, you know that the traffic will be very heavy, so you can take a decision on whether you want to travel or not, and you can also predict the time you will arrive at your destination.

This is a good example to start understanding one of the fundamentals of computer intelligence.

Knowledge and patterns are the basis on which a computer absorbs experience so that it can help take a decision. The more data and modeling capability the computer system acquires, the more accurate its predictions will be.
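The beach-trip example above can be turned into a toy program: the system's "experience" is just a table of past trips (all figures invented), and its estimate improves as more relevant history accumulates.

```python
# Experience-based prediction, in miniature: estimate trip duration
# from past trips at the same departure hour (invented data).
from statistics import mean

# Past Sunday trips to the beach: (hour of departure, minutes taken)
history = [(9, 55), (11, 80), (11, 95), (13, 120), (13, 110)]

def estimate_minutes(departure_hour: int) -> float:
    """Average duration of past trips that left at the same hour."""
    same_hour = [m for h, m in history if h == departure_hour]
    if not same_hour:  # no relevant experience yet: fall back to overall mean
        return mean(m for _, m in history)
    return mean(same_hour)

print(estimate_minutes(11))  # 87.5 — based on two past 11:00 departures
print(estimate_minutes(13))  # 115 — based on two past 13:00 departures
```

This is, of course, only a caricature of machine learning, but it shows the core loop: recorded data stands in for experience, and the prediction is only as good as the data behind it.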

Artificial Intelligence for IT Operations (AIOps) combines big data and machine learning to automate IT operations processes, including event correlation, anomaly detection and causality determination. Hence a new terminology is developing to describe the tools used and processes adopted to implement Artificial Intelligence projects.

Such projects would require data collection, usually requiring access to "Big Data". Data analytics is the first step towards turning data into knowledge.

Analytics is utilized to identify trends and patterns. Secondly, "predictive analytics" is applied to generate computational models that represent the knowledge behind the relevant data, addressing a particular problem under investigation within the industry being analyzed. Once we have achieved that, we can create decision support systems that help improve the process by providing the final user with realistic recommendations.

Each of the tools used, such as analytics, modeling, predictive analytics, and others, would need further description to fully understand the way AI is used to predict future events and trends so that correct business decisions can be taken. In addition, we need to understand how self-learning machines operate to arrive at the stage where we can appreciate AI's evolution and its limits for the future.

It is important, at this stage, to stress the importance of the power of computing, data availability, the bandwidth of networks, the contribution of cloud computing, the speeds of data transfer over the Internet, the storage capacity available for Big Data, the logic behind modeling and predictive analytics, etc., as they all contribute to implementing a successful AI project.

Gartner said, some time ago, that "AI technologies will be in almost every new software product by 2020", and this has already proved true. It has also predicted that by 2023 at least 85% of governments without a total experience strategy in applying AI will fail to successfully transform government services.

Socio-economic impact

There is no question that the socio-economic impact on our society has been enormous, maybe bigger than originally anticipated. Some of these changes have not only been gradually affecting social human behavior but have been changing even human characteristics. For example, new generations have changed the way they use their brains and their imagination.

When people read a book or listen to music, they process information in a way that generates pictures and colors in the mind; when people are accustomed to seeing pictures and colors, they absorb information directly but use their imagination to a lesser degree. This is a fundamental difference in the way the human brain works.

The human brain accustomed to reading books and listening to music with no visual help has acquired the creative power to generate images and videos by imagination. On the other hand, people accustomed to the use of computers have lost some of the power to create images in their imagination but have adapted to respond to images, pictures, or videos with more flexibility and speed, and hence absorb higher volumes of information. It seems like a race with computer speeds. It is surprising how fast the human brain is trained to absorb information at a higher rate when it interacts with the computer, losing contact with the real environment. The computer process becomes something like a virtual reality. This has an effect on social behavior, reducing direct human communication and replacing direct interaction with social contacts via the internet.

Apart from that, the rate of absorbing information and the increased availability of information have increased human knowledge and competence.

The economic impact of the digital revolution has been wide-ranging. Without the world wide web for example, globalization and outsourcing would not be nearly as feasible as they are today. The digital revolution radically changed the way individuals and companies interact. Small regional companies were suddenly given access to much larger markets. Concepts such as on-demand software services and manufacturing processes have rapidly allowed costs to be dropped and made possible innovations to be introduced in all aspects of industry and everyday life.

Negative effects were also created, such as information overload, Internet-driven forms of social isolation, media saturation, etc. Journalists are saying that the Internet is hurting journalism more than it is helping, by allowing anyone to be involved, no matter how amateur and unskilled they may be, causing information to be confusing and unreliable; hence the rise of conspiracy theories in a way that didn't exist in the past.

Social networking became a double-edged sword: it allowed social groups sharing common interests to be formed, but at the same time it gave birth to new forms of terrorism, authoritarian political power, and fake news to mislead the masses. Systems such as Facebook and Twitter influenced social trends in politics, and even in business. We have only to note the recent purchase of Twitter by Elon Musk at the unthinkable price of 44 billion dollars, an event that raises new political questions as to the control an individual may have over a public social network.

Productivity was certainly increased and unemployment reduced, in spite of the well-known "productivity paradox", which identified reduced productivity numbers, for certain periods, when computer penetration and investment were increasing at very high rates in the USA and the rest of the developed world.

This effect could not be fully explained, except by simplified statements attributing it to the latency of learning and adapting to the new processes introduced into human society.

In any case, this paradox was overcome, and productivity is now improving at a very high rate.

The first impression of improved productivity comes from the automation generated by robotics and construction with 3D printers, yet the real surprise comes from the areas where artificial intelligence is being applied.

“Prediction” of future events and market behavior based on “Big Data” and “pattern identification” is already helping top management and governments take strategic business and political decisions, to such a degree that any decision not based on such practices is bound to fail from now on.

This is the real improvement that will affect future productivity in the new world that is rising. It will impress more than any glamorous new developments achieved by automation, such as self-driven cars, self-driven drones, trains, and airplanes, a trend which is already happening.

We can only hope that artificial intelligence will not supersede human intelligence and will stay a complementary competence, supportive to humanity.

The future

A journey into the history of data processing, and the impact it had on Greece Part A

From Card punch processing to Artificial Intelligence

A review of IT history in simple language, to be understood by both IT and non-IT specialists interested in the course of events and the impact that the third and fourth industrial revolutions had on society, with a glimpse of what is coming in the future.

Introduction

Information technology has been transforming economies around the world since the end of the 2nd World War. The effect that information technology would have on society was not so obvious at the beginning. People initially thought that computers would have an impact only on science, defense, or automation, and in general would enhance or replace human competences in the speed of executing accounting activities and improve performance in numeric calculations. In fact, humanity was relating the computer evolution that took place over the last 80 years to what had happened during the industrial revolution of the 19th century.

Nobody could imagine how far the dramatic improvements of the third and fourth industrial revolutions would lead.

It is already difficult to contemplate where we have reached. It is so much more difficult to predict where we will have reached in the next 80 years, following this rate of exponential evolution in technological advances.

We have already reached and we are experiencing technologies, such as Artificial intelligence (AI), machine learning (ML), Big Data analytics, predictive systems, predictive models, robotics, cloud computing etc.

Integrating these technologies with 5G communication speeds will open new technological horizons that one could previously encounter only in science fiction.

Looking back and navigating through various stages of this evolution will help us visualize our future more accurately.

The journey has been so wide that I can only cover part of it, focusing on my personal experience, as I was lucky and privileged enough to experience, first hand, some of these dazzling speeds of technological evolution.

From card punch and Tabulator equipment to commercial Mainframe computers (1950s to 1970s)

The IT industry evolved in the USA and Europe after the end of the Second World War and expanded exponentially in the following decades, starting from the 1950s.

Greece was one of the very last countries in Europe to implement and benefit from computer technology, for various reasons. These reasons were no different from those which had held Greece back from early economic development.

To start with, it is important to remind ourselves that personal computers did not exist at the beginning, nor did the internet or data entry screens. Operational instructions were keyed in on mainframe consoles, which were used to interface with complex operating systems; all data and programs were keyed in on card punch or paper tape equipment, while cards or tapes were read on card and paper tape peripheral readers, creating files in computer memory and on magnetic peripherals.

At the beginning, programming was done in pure machine code, very close to binary instructions. Higher-level languages were created later, initially for scientific purposes in universities, such as ALGOL and FORTRAN. Other languages, such as COBOL and RPG, were produced later for commercial purposes.

Most computer companies in Greece were subsidiaries of foreign mainframe computer manufacturers, except for a very few, such as International Computers Co (ICL), initially represented by the Greek company "Pan Solomos", a pioneer in office automation and business organization equipment, and UNIVAC, represented by the Doxiadis Associates company.

Before the mid-1970s, commercial mainframe computers were very expensive and used only for simple batch processing applications that could run only a single «job» at a time, one after another.

Mainframe technology was expensive not only because of the cost of the hardware and its operating system but also because of the special computer-room environment required: special air conditioning, temperature and humidity control, and dust and smoke control. The first computer prototypes of the 1950s even used special water-cooling systems, due to the high temperatures generated by the vacuum valves used before transistors were invented.

Another very important aspect of the cost of mainframe installations was the human cost of developing and maintaining applications on proprietary operating software. It took years for the market to standardize on open operating systems and computer languages, which would eventually help reduce operating costs.

These factors restricted the use of computers to large government and banking organizations and some very large private corporations.

The marketing strategy applied by mainframe manufacturers focused mainly on revenues from sales of hardware and operating systems, as well as from maintenance services; no attention was given to revenues that could be generated from software application development or other services. As a consequence, customers invested in hiring and training large numbers of IT personnel to develop software applications in house, on proprietary operating system software and languages. Outsourcing was nonexistent, especially in Greece.

The lack of collaboration between the computer industry and universities in Greece did not help generate the number of IT specialists needed. This was exactly the opposite of what was taking place in Europe and the USA, where collaboration between the computer industry and universities or research laboratories was precisely the reason computer technology emerged.

Therefore, such personnel became scarce and indispensable for each computer-using company. This, unfortunately, created an unexpected side effect: a peculiar resistance to change from some individuals, who delayed the introduction of new technologies for purely personal reasons, in order to retain their jobs as highly trained specialized personnel.

The main computer companies operating at the time were IBM, monopolizing the market (90%), with DIGITAL, UNIVAC, and HP following closely, all with products produced in the United States. The only companies from Europe were Bull, a major French company; International Computers (ICL), a British company which came later to the Greek market; and Siemens, the German company, which came even later.

One should not forget Control Data, which established the first computer training institute in Greece, complementing the universities: a very important contribution for the Greek market, which was in such need of computer programmers and consultants.

Let us have a look at some of these pioneering computer companies, which played a significant role in the international as well as the Greek IT market.

IBM

There is no better way to understand early IT history than to list the milestones of IBM's evolution, which show why IBM played such a leading role in this industry.

1911: IBM foundation

IBM was founded in June 1911 as the Computing-Tabulating-Recording Co. (C-T-R) and adopted the name International Business Machines (IBM) in 1924. The IBM punch card was at the origin of IBM's business machines and became the industry standard all over the world from 1928, holding nearly all of the world's known information for the next 50 years and enabling large-scale projects like the US Census. This was so even though punched-card processing had been developed by Herman Hollerith as far back as 1884. IBM grew, indisputably, into a real pioneer in data processing and information technology; even today IBM ranks around 10th among companies worldwide.

1936: Social Security, made possible by IBM

IBM works with the government on the US Social Security Act of 1935, tabulating employment records for 26 million Americans — the largest accounting project of its time.

1952: The inception of digital storage

IBM introduces the world to digital storage via magnetic tape data, marking the transition from punched-card calculators to electronic computers.

IBM 701: the first large-scale computer based on vacuum valves

1956: AI before AI

The IBM 704 is programmed to play checkers and learn from its experience. It is considered one of the first demonstrations of artificial intelligence.

1957: 60+ years of FORTRAN (Formula Translation)

Possibly the most influential software product in history, FORTRAN liberates computers from the exclusivity of programmers and opens them to users worldwide: a computer language based on the rules of algebra, grammar, and syntax.

1956: Storing data on a disc

The IBM 305 RAMAC (Random Access Method of Accounting and Control) was the first system to store data on a disc.

1959: Transistor technology

The IBM 7090 was one of the first mainframes based on transistor technology; it could perform 229,000 calculations per second.

1960s: Flight reservations

IBM continued to focus on the development of solutions that would help bring companies into a new age. IBM technology was applied in a series of commercial applications, from managing flight reservations in real time to supporting the space program that landed the first man on the moon.

1964: System/360, the first family of large-scale mainframe computers

System/360 was the first family of large-scale mainframe computers whose models could use alternative software products and peripherals across the range. This approach was the starting point for a new era that departed from the philosophy of «one system for all».

1981: Introducing the IBM PC

The PC revolution begins with the IBM Personal Computer: computing goes 'mainstream', beyond hobbyists and into the market as a common household commodity.

1997: Artificial Intelligence (AI) defeats a reigning chess champion

IBM's Deep Blue supercomputer defeats the best chess player in the world, Garry Kasparov. Thinking computers take a giant leap forward toward the kind of AI that we know and use today.

2011: First AI to understand natural language

In an unprecedented demonstration of natural language processing and cognitive computing, IBM Watson defeats the champions of the TV quiz show Jeopardy!.

IBM's history in Greece started with a major mainframe installation in the banking sector, at the National Bank of Greece, which served as the first training center producing the country's first programmers, computer specialists, and consultants. Even before this initial stage, IBM had implemented a project at the Greek statistical department in the Ministry of Finance using card punch equipment.

BULL

Bull SAS is a French computer company based in Paris.

The company has also been known at various times as Compagnie des Machines Bull, Bull General Electric, Honeywell Bull, CII Honeywell Bull, and Bull HN.

Bull was founded in 1931 to capitalize on punched card technology.

Compagnie des Machines Bull was an OEM supplier to BTM, a predecessor of ICT, the future core of ICL. Bull's first patents, related to the pre-selection of data inside punched-card readers, date from 1919 (35 years after Hollerith).

Bull has a worldwide presence in more than 100 countries and was particularly active in the defense, finance, health care, manufacturing, public, and telecommunications sectors. Bull was nationalized in 1982 and merged with most of the rest of the French computer industry. The company was re-privatized in 1994.

The origins of Bull lie in the need of insurance companies to process their voluminous data quickly. By 1935 Bull had launched its first significant line, the 150 series of tabulators, and thus found itself locked in a struggle with the French subsidiary of IBM.

For the remainder of the 1930s Bull grew at a modest rate. Most of its customers were banks, nearly all of them French. The company’s real expansion came only after World War II had spurred the French scientific community to develop a computing machine.

In Greece, Bull started with a major mainframe installation at the Greek power corporation DEH, which was later replaced by IBM. The major success came when Bull won the international bid for TAXIS, the system of the Greek Ministry of Finance, during the 1990s.

ICL

International Computers Ltd (ICL) was established in the UK in 1968, absorbing most of the existing British computer industry, including ICT, English Electric Computers, Elliott Automation, English Electric Leo Marconi, etc.

An ICL installation of historical interest is LEO I (Lyons Electronic Office I), the first computer designed with vacuum valves to be used for commercial business applications, in 1951. This claim conflicts with similar claims that both Univac and IBM have made. LEO I was, surprisingly, the creation of J. Lyons & Co., a British restaurant chain, food manufacturing and hotel management conglomerate founded in 1884. LEO I ran its first business application in 1951. In 1954 Lyons formed LEO Computers Ltd to market LEO I and its successors. LEO Computers eventually became part of English Electric, which then became part of ICL. The LEO computer room, which took up around 2,500 square feet of floor space, was relocated and kept as a museum at ICL's Kidsgrove factory, where the latest ICL mainframe 2900 series was built.

Lyons used LEO I initially for valuation jobs, but its role was extended to include payroll and inventory. One of its early tasks was the elaboration of daily orders, which were phoned in every afternoon by the shops and used to calculate the overnight production requirements, assembly instructions, delivery schedules, invoices, costings, and management reports. This was the first instance of an integrated management information system, an impressive achievement for the time.

I was privileged to work as a computer design engineer at ICL's Kidsgrove factory, where I experienced many of the technological advances that took place during the design of the ICL 2900 mainframe series, and contributed to the design of the 2900 series File Peripheral Controller (FPC2) and the error-correction system applied to magnetic discs to correct bursts of errors caused by smoke and dust in the atmosphere, to which magnetic disks were extremely vulnerable at the time.
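The burst-error idea can be illustrated with a toy sketch (my own simplified example, not the actual FPC2 design): if several codewords are interleaved across the disc track, a burst of consecutive errors touches each codeword at most once, leaving each within the correcting power of a simple code. Here a 3x repetition code with majority voting stands in for a real disc code.

```python
# Toy illustration of burst-error correction by interleaving (assumed example,
# not ICL's actual scheme). Each data bit becomes a 3-symbol repetition
# codeword; interleaving spreads a burst so each codeword sees at most one error.

def encode(bits):
    # One 3x repetition codeword per data bit.
    return [[b, b, b] for b in bits]

def interleave(codewords):
    # Store symbol i of every codeword together, so consecutive stored
    # symbols belong to different codewords.
    return [cw[i] for i in range(3) for cw in codewords]

def deinterleave(stream, n):
    # Inverse of interleave() for n codewords of length 3.
    return [[stream[i * n + j] for i in range(3)] for j in range(n)]

def decode(codewords):
    # Majority vote corrects one corrupted symbol per codeword.
    return [1 if sum(cw) >= 2 else 0 for cw in codewords]

data = [1, 0, 1, 1]
stored = interleave(encode(data))

# A burst flips four consecutive stored symbols...
corrupted = stored[:]
for i in range(2, 6):
    corrupted[i] ^= 1

# ...but after de-interleaving, each codeword holds at most one error.
recovered = decode(deinterleave(corrupted, len(data)))
assert recovered == data
```

Real disc controllers used far more efficient cyclic codes, but the principle of spreading a physical burst across many independent codewords is the same.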

I also experienced the problems that ICL, and most computer manufacturers, had to overcome with the previous computer technology. At the time, the complexity of the circuitry was increasing, and the speed of data transmission over transmission lines, due to long cabling, was causing serious "ringing" problems. Many companies failed to reach the higher technological level needed for digital computer design and abandoned their efforts. This problem was eventually solved by applying four-layer board technology.

Another area worth mentioning was an advanced ICL technological innovation, CAFS (Content Addressable File Store), a high-speed hardware search engine that accessed information by matching part of its content, at extremely high speed.
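To give a feel for the idea (a hypothetical software sketch; CAFS did this in dedicated hardware while streaming records off the disc), content-addressable search means selecting records by what they contain rather than by a stored address or index:

```python
# Illustrative sketch only: the record fields and names below are invented.
# CAFS evaluated such selection conditions in hardware at disc transfer speed;
# in software the same idea reduces to scanning records and matching content.

records = [
    {"name": "Papadopoulos", "city": "Athens",       "balance": 1200},
    {"name": "Smith",        "city": "Thessaloniki", "balance": 300},
    {"name": "Papadakis",    "city": "Athens",       "balance": 950},
]

def content_search(records, predicate):
    # Select records by their content, not by where they are stored.
    return [r for r in records if predicate(r)]

# "Find Athens customers whose name starts with 'Papa'"
hits = content_search(
    records,
    lambda r: r["city"] == "Athens" and r["name"].startswith("Papa"),
)
assert [r["name"] for r in hits] == ["Papadopoulos", "Papadakis"]
```

The hardware advantage was that no index had to be built or maintained: every record flowed past the matching logic as the disc rotated.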

ICL failed to enter the Greek mainframe market with the 2900 series but made a strong presence with its medium-range computer series 2903 and ME29 and the DRS distributed computing range, a few years before its purchase by the Japanese computer company Fujitsu.

I cannot forget a speech made by the ICL European director, addressing a European gathering of country managers at the end of the 1970s, warning that the real threat to the European computer industry was coming from their chip suppliers in Japan.

UNIVAC

The UNIVAC I (UNIVersal Automatic Computer I) was the first general-purpose electronic digital computer designed for business applications. The first contracts were with government agencies such as the Census Bureau (March 31, 1951), the U.S. Air Force, and the U.S. Army Map Service.

To understand the progress made since these first-generation computers, in comparison with today's technology, it is worth noting that UNIVAC I used about 5,000 vacuum tubes, weighed 8.3 short tons (7.6 t), consumed 125 kW, and could perform about 1,905 operations per second, running on a 2.25 MHz clock. The central complex alone (i.e. the processor and memory unit) was 4.3 m by 2.4 m by 2.6 m high. The complete system occupied more than 35.5 m2 (382 ft²) of floor space.

Univac was represented in Greece by a technical company, Doxiadis Associates, which implemented some of the first computerized statistical calculations for the Greek Customs Organization.

The first data processing applications in Greece

It is not surprising that the first data processing applications, in Greece and elsewhere, involved tabulating and sorting data collected on cards.

It is of historical interest that the first attempts to apply data processing technology, even before IBM's work with the National Bank of Greece, were made by Univac, in applications for scientific calculations at the Greek customs, and by Pan Solomos Ltd for the Ministry of the Interior during the Greek elections. The latter used International Computers and Tabulators (ICT) card punch equipment and tabulators, which produced the election results automatically for the first time.

This is how major mainframe manufacturers including IBM, UNIVAC, BULL, ICT started their operations in USA and Europe.

Mainframe computers were mostly used for commercial applications in government, public utility companies, banks (for loan applications), and some very large private companies, mainly in insurance. The applications were mostly invoice preparation, printing thousands of bills to be posted to customers. Payroll, as well as banking applications, was considered among the most difficult. Some reservation systems were developed later, as mainframe computers acquired online and real-time interactive capabilities.

The newest mainframes include the latest Fujitsu GS21 series, announced in April 2018, along with Fujitsu Software GSS21sx V20, which supports connections with open systems. In September 2019, IBM launched the IBM z15, showcasing key capabilities such as encryption everywhere, cloud-native development, and instant recovery. As a follow-up, the IBM z16 delivers breakthrough technologies for AI and cyber resiliency to accelerate decision velocity, protect against threats across the business, and modernize for hybrid cloud, meaning common use of mainframe and cloud technologies.

Magnetic ledger accounting computers in the 1960s and 1970s

Most medium to large companies in the private sector were still using accounting machines with multiple registers, printing on hard-copy ledger cards and mainly covering accounting applications.

These accounting machines evolved into the first magnetic ledger computers, which used ledger cards with magnetic stripes, and peripherals with paper tapes and magnetic discs and tapes.

These magnetic ledger or hard-copy ledger accounting machines were small computers, with processors and memory, that used proprietary computer languages. This limited the development of widespread software, restricting applications to commercial uses focused mainly on general ledger, invoicing, and payroll.

This is the time when a new group of companies appeared in the Greek market, such as NIXDORF, KIENZLE, Burroughs, OLIVETTI, REMINGTON, Logabax, etc., which dominated the private-sector market during the 1960s and 1970s. Gradually, some vanished along with their magnetic ledger products during the 1980s, while others evolved into suppliers of intelligent computer terminals as communications became more accessible.

These first accounting computer machines, equipped with simple paper tape peripherals, were used to the extreme of their technical capabilities to satisfy advanced accounting requirements.

As an example, I can refer to an inventory control application implemented for the Greek shipping company AEGIS, owned by the well-known shipowner Papalios. The solution was implemented on a Remington paper tape computer, programmed in binary machine code, as early as the 1970s. Nobody aware of today's facilities could even contemplate that such solutions could be implemented with paper tapes used as files to maintain a huge inventory control system.

Similar applications were developed on similar Remington computers in other areas, such as a hotel software package for hotel reservations, even including "main courante" (guest ledger) software. The system was installed in major hotels in Greece, including the Porto Carras group of hotels, major hotels in Rhodes, Skiathos Palace, and a number of hotels in Crete; the system was even exported to six hotels in Portorož, close to Ljubljana. This hotel application was later converted to run on ICL DRS distributed processing small computers under the name WELCOME, which was installed on international cruise lines, enhanced with point-of-sale cash registers, operating around the world.

This type of computer helped the wider market become educated in the concept of computing and the benefits it could produce. It helped accountants get rid of those long black sleeves and communicate at a higher level.

This type of application was an indication that the Greek IT market was maturing, ready to step up to the next stage of technological evolution, which came with the introduction of minicomputers and improvements in communication networks.

Minicomputers, microcomputers and medium-size mainframe systems in the 1970s and 1980s

Before the mid-1960s, mainframe computers were still very expensive and used only for special-purpose tasks, such as military or scientific research. Computers were also too slow for interactive or multitasking operations; a simple batch processing arrangement ran only a single «job» at a time, one after another. But at the end of the 1960s, faster and more affordable minicomputers became available, and as prices decreased, such computers supported time-sharing, allowing multiple users to share the same CPU and memory.
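The time-sharing idea can be sketched in a few lines (an illustrative example, not any vendor's actual scheduler): the CPU hands each job a short time slice, or quantum, in turn, so every user sees steady progress instead of waiting for the whole batch queue to drain.

```python
# Toy round-robin time-sharing sketch (assumed example). Jobs take turns on
# one CPU; an unfinished job goes to the back of the queue after its quantum.
from collections import deque

def round_robin(jobs, quantum):
    """jobs: dict of name -> remaining work units. Returns the run order."""
    queue = deque(jobs.items())
    timeline = []
    while queue:
        name, remaining = queue.popleft()
        slice_ = min(quantum, remaining)
        timeline.append((name, slice_))        # job runs for its time slice
        if remaining > slice_:                 # not finished: requeue it
            queue.append((name, remaining - slice_))
    return timeline

order = round_robin({"userA": 5, "userB": 2, "userC": 4}, quantum=2)
# Every user gets CPU time in each round, rather than one job at a time:
assert order == [("userA", 2), ("userB", 2), ("userC", 2),
                 ("userA", 2), ("userC", 2), ("userA", 1)]
```

Contrast this with the batch arrangement described above, where userB would have had to wait for all of userA's work to finish before running at all.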

Mainframe manufacturers did not jump at the opportunity to pursue this new minicomputer technology or move to the concept of open operating systems, as it would deprive them of their 'protected' proprietary environment, which locked customers' investments to a single manufacturer.

Digital Equipment Corporation (DEC), the first implementation of the UNIX operating system

A breakthrough was initially made by Digital Equipment Corporation (DEC), a major American company founded in 1957. DEC became «the nation's second-largest computer company after IBM». Its initial major impact was in minicomputers.

The technology introduced by DEC computers was based on much earlier work that started with projects for the US Air Force, which demanded much more interactivity than even the faster mainframe computers of later years could provide.

A real-time environment was needed for projects like the flight simulators used by the US Navy and Air Force since 1944. Such technology was needed to allow operators to interact with radar data stored in the computer.

The real breakthrough came in 1956, when the old vacuum valve technology was replaced by the first transistors.

This gave DEC a big advantage in the development of computer systems, which were promoted in universities and later in commercial applications, in competition with IBM and other mainframe manufacturers.

One of the most famous minicomputers produced by DEC was the PDP-7, introduced in December 1964. The PDP-7 is most famous as the original machine for the UNIX operating system. Unix ran only on DEC systems for a certain period, but later became the fundamental operating system on which the whole OPEN SYSTEMS philosophy was based.

Most computer companies followed DEC's technological example and started promoting minicomputers or mid-range computers, capturing significant sections of the market thanks to a lower cost-to-performance ratio.

DEC was acquired by Compaq in June 1998, in what was at that time the largest merger in the history of the computer industry. Not long thereafter, Hewlett-Packard bought Compaq, «creating a technology company second in revenue only to IBM».

Some of the new names included:

Sun Microsystems, Inc. (Sun for short) was founded on February 24, 1982. Sun was an American technology company that pioneered IT technology after the mainframe era, introducing computers, components, and information technology services. It created the Java programming language, the Network File System (NFS), a distributed file system protocol, and SPARC (Scalable Processor Architecture) microprocessors, a reduced instruction set computing (RISC) architecture.

Sun contributed significantly to the evolution of several key computing technologies, among them UNIX, RISC processors, and thin-client computing.

In general, Sun was a proponent of open systems, particularly UNIX. It was also a major contributor to open-source software, later purchasing MySQL, an open-source relational database management system. Eventually the company was acquired by Oracle in January 2010.

Data General was one of the first minicomputer companies, founded by former employees of DEC in the late 1960s. Their first product, intended to both outperform and cost less than the equivalent from DEC, the PDP-8, eventually proved the better system and sold thousands during the 1970s.

Data General, although successful and very innovative in hardware and software, could not withstand the introduction of the IBM PC in 1981, an event that marked the beginning of the end for minicomputers; by the end of the decade, the entire minicomputer market had largely disappeared.

ICL's entry into the Greek market (the story of ICL and the Pan Solomos Co)

With the successful entry of mini and medium-range computers into the Greek market, the Greek company Pan Solomos Ltd invested in bringing in the first ICL medium-range computer of the 2903 series.

The 2903 range was a rapid development to produce a small business computer to replace the 1901A. As far as possible, it was developed from existing hardware and software, but configured for an office environment without under-floor cabling. It was urgently needed to generate cash flow that would support continuing 2900 development. The hardware was based on the 2900 FPC2 (Disk File Controller); the 2903/4 system cabinet housed the BASIC engine, disk storage, and a punched card reader. The operator's console was a visual display unit, for 1900 users who were accustomed to teletypes as consoles, and the 2903 used microcode emulating 1900 mainframe hardware. The operating system was George 1*. In consequence, the 1900 compilers and utilities ran on the 290x range without any changes or recompilation. For some sites, a microcode floppy was available that would make the system work as an IBM 360 running the IBM operating system.

A new feature provided on this range was "Direct Data Entry", a system with up to eight dedicated VDU data entry stations with which card-image files could be created; these could be assigned to a program's card reader and processed accordingly.
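Conceptually (a hypothetical sketch, not ICL's actual record format), Direct Data Entry replaced physical punched cards with files of fixed 80-column card images that a program could then read as though from a card reader:

```python
# Assumed illustration: each keyed record becomes one "card image", i.e. a
# line truncated or space-padded to exactly 80 columns, so existing programs
# written for a card reader can consume the file unchanged.
CARD_WIDTH = 80

def to_card_image(line):
    # Fixed-width card: cut off anything beyond column 80, pad short lines.
    return line[:CARD_WIDTH].ljust(CARD_WIDTH)

# Hypothetical keyed-in records (field layout invented for the example):
keyed = ["CUST0001  ATHENS      1200",
         "CUST0002  PATRAS       300"]
card_file = [to_card_image(line) for line in keyed]

assert all(len(card) == CARD_WIDTH for card in card_file)
assert card_file[0].startswith("CUST0001")
```

The point of the design was compatibility: programs saw the familiar 80-column card interface while the physical punching step disappeared.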

The 290x computers could run in an office environment, still quite an innovation for this class of machine, and the range very soon became a runaway success. Roughly 3,000 systems were sold, ten times as many as ICL had anticipated.

The smaller machines in the 290x family were replaced in 1980 by the ME29 system.

Pan Solomos had already established one of the first data processing service bureaus, on the ICT card punch equipment that had been used in the Greek elections. The company used the first 2903 computer to re-establish its service bureau, replacing the card punching equipment with the first "Direct Data Entry" system on the 2903 computer, later renamed the ME29.

Among the first service bureau customers to use this new data entry technology were "Chalivourgiki", the Greek steel company owned by Angelopoulos; the Lebanese contractor company CCC; the Greek social security organization IKA; "Springfield", the Onassis shipping company; and the Greek radio and television company ERT. Two more service bureau companies were impressed by the replacement of card punch machines and invested in this system as well.

More companies from the private sector, impressed by the success of this service bureau, invested in ICL 2903 computers, including Lambropoulos Bros, the biggest department store; the KAROLOS FIX brewery; the Adriatiki insurance co.; "ION", the biggest chocolate producer; the "Minion" department stores; the Meimaredis group of department stores (Akron Ilion Krystal); "EVZ", the Greek sugar industry, with an installation of six 2903 systems, the biggest 2903 installation in Europe; "Mitera", the biggest private maternity hospital; and others.

A historical event worth mentioning is the terrorist attack that burned down the entire "Minion" department store, including the 2903 computer installation, which maintained the accounts of all suppliers and wholesale customers, including debtor balances. By sheer coincidence, the EDP manager of Minion had taken home a backup disc with all programs and data. The system was reinstated at the Solomos service bureau overnight, which helped Minion reopen the department store in a new location close by within a few weeks. This was a unique event in EDP history.

Some of these installations replaced IBM medium-size computer installations; hence the impact made by ICL's entry into the market, and the competence of the Greek company Solomos, were very impressive.

Meanwhile, ICL internationally proceeded to absorb Singer Business Machines, the US computer company.

Singer Business Machines

Early in 1976, ICL acquired the business machine division of Singer, which Singer had built by purchasing "Friden", a computer company whose flagship product was the "System Ten", a small business minicomputer. The acquisition shifted the geographical balance of ICL's sales away from the UK and also gave it a presence in industry markets such as retail and manufacturing. ICL subsequently developed the System Ten into the System 25 and used the product to spearhead the growth of its retail systems business during the 1980s.

Datachecker

In 1988, ICL's major shareholder STC acquired the US retail systems specialist Datachecker Systems from National Semiconductor Corporation. At the time, this was the second-largest supplier in the US retail market, and the acquisition greatly expanded ICL's US presence.

Any merging activity in Europe and the USA was considered a natural part of an evolutionary process. Integrating skills and management was always difficult, but management practices and culture were advanced, so solutions were always found to the benefit of the new company created by a merger or takeover.

Unfortunately, the situation in Greece developed into a tragedy which led to the bankruptcy of several companies, including the "Pan Solomos" company, a real pioneer of the Greek office organization market, and the newly formed company "Prooptiki SA", created by merging "Eurodata", the Singer Business Machines agent in Greece, with "Pan Solomos", representing ICL. This dramatic development happened because the shareholders of both merging companies lacked a management culture, ultimately to the benefit of ICL.

ICL eventually founded ICL Hellas SA, with ICL holding the controlling majority, at the end of the 1980s.

ICL's presence kept expanding in Greece, especially in the retail industry, having retained all personnel and the customer base. ICL point-of-sale products enriched the existing customer base, including the Minion department stores, Duty Free Shops, and Royal Cruise Line, operating in the Aegean Sea, the Bahamas, North America, and Northern Europe. Similar ICL point-of-sale installations were implemented at major coastal shipping companies, among them ANEK LINES of Crete, ANEL LINES of Lesbos, and DANE of Rhodes (Dodecanese lines), which were expanding their ticket reservation application software, as well as at OTE, the Greek PTT.

ICL also managed to promote one of the first office automation systems running on mid-range computers, with packages such as "Office Power" installed on a 2903 ICL computer for the Ministry of Foreign Affairs. Other systems included the installation of a 2903 at "Arios Pagos", the supreme court of Greece, and elsewhere.

Fujitsu relationship with ICL and eventual acquisition

ICL's relationship with Fujitsu started in 1981, when ICL needed a cheaper source of technology to develop lower-end machines in the 2900 range to compete with the IBM 4300 series. At this stage, ICL was developing its own large-scale integration (LSI) technology, with tens of thousands of transistors per chip, for use in the higher-end machines designed as successors to the highly successful 2900 processors. ICL had visited a number of companies during 1980, including Fujitsu and Hitachi, to identify potential suppliers.

As part of the 1981 restructuring, ICL cancelled in-house LSI technology development and negotiated an agreement that gave access to Fujitsu's LSI and packaging technologies, which, combined with ICL's in-house CAD capability, enabled ICL to design and manufacture the Series 39 levels 30 and 80.

Initially, the collaboration with Fujitsu was presented as an arm's-length one, to avoid diluting ICL's credentials as a European and British company. However, Fujitsu's involvement with ICL at both the financial and the technical level steadily increased over the subsequent two decades: in 1990 Fujitsu acquired 80% of ICL plc from its parent STC plc, in 1998 Fujitsu became ICL's sole shareholder, and the ICL brand was dropped in 2002.

This fulfilled the prediction that the European director had made earlier, warning ICL and the rest of the British industry of the future threat coming from Japan.

Fujitsu managed to absorb even more of the European computer industry through the creation of Fujitsu Siemens Computers in 1999, which also acquired Nokia Data's personal computers and servers (marketed under the ICL brand) and all the business of Siemens Nixdorf with the exception of its banking and retail systems. Fujitsu Siemens was merged back into Fujitsu in 2009.

European Union funding provided to improve IT infrastructure in Greece: Mediterranean programs during the 1980s and 1990s

During the same period, public organizations in need of immediate computerization still struggled with manual processes and bureaucracy, partly due to incompetence and partly due to the lack of properly established public procurement procedures.

Organizations such as the Inland Revenue and tax authorities, social security (IKA), pensions, public hospital administration, various ministries, public television companies, public utility companies, legal administration, central government and many more, which could have benefited from large mainframe installations, either did not use any or failed to make good use of them. It was only after Greece’s entry into the European Common Market that the problem was seriously addressed, in terms of both the funds made available and the methodology followed.

Two areas considered to be of great priority were the Inland Revenue (TAXIS) in the Ministry of Finance and public hospital administration within the Ministry of Health, especially the maintenance of historical medical-record files. Efforts had already been made: for the various tax offices, Novell network distributed systems had been installed by the Greek companies “ABC” and “Informer SA”, but this solution had not proved adequate. IBM installed a central mainframe to complement the initial implementation at the tax offices, but still the problem was not solved. Eventually the project was funded by a gigantic Mediterranean program that was won by Bull.

The medical-record problem has always been a serious unresolved issue; one previous attempt took place in 1976 at the “Genimatas” general hospital. This attempt was carried out by Pan Solomos Ltd based on Remington microfiche equipment, with no computers involved.

Yet, in spite of the significant funding provided for both areas, the public administration did not manage to take advantage of this huge opportunity.

In both cases the projects proved to be failures. In the first case, TAXIS, the two front-runners were two European companies, Bull and ICL. Both companies submitted proposals for the same system purchased from Siemens. The bid requested the execution of a benchmark, which both companies completed at great cost; the result was that ICL delivered double the performance at a significantly lower price. ICL was nevertheless disqualified because its original proposal did not include a software simulator product of minor cost, which it submitted subsequently with the benchmark results. As a result the government lost a significant amount of money. One might say that this was not so bad; the bigger failure was the second project, covering 15 hospitals and the patient medical-record history file.

The companies competing for the hardware system were DEC, HP and ICL, while the Greek company INTRASOFT got the contract for developing the patient medical-record history file.

The bid committee had an advanced idea: instead of declaring one company the winner, it decided to allocate five hospitals to each of the participating companies, provided they could all agree to run the software under the same open-system environment with the UNIX operating system. They all agreed, including INTRASOFT, the software producer.

The project was successfully completed, for both the hardware supplied and the medical-record software, yet it was never fully paid for. One reason was that the Greek Ministry of Health had failed to budget for the maintenance costs not covered by European funding; many billions of drachmas were lost. The ministry is still struggling for a standardized medical history file. This was an enormous failure that discredited the Greek public administration for years.

The only activity that survived from the hospital effort was the implementation of basic accounting applications, such as general ledger, payroll and stock control, completed by Pan Solomos on an ICL 2903 computer at the largest Greek hospital, ‘Evagelismos’.

The train of technology in Greece       

Eventually, everybody was talking about the risk of ‘losing the train of technology’. The Greek public administration had to run faster, and it was after the 1980s that the train started moving to regain some of the wasted time.

The multiplicity of installations in various commercial companies gave birth to the need to standardize on common operating systems and programming languages; hence the concept of ‘open systems’ was introduced during the 1980s.

All this technological evolution had a direct impact on the international IT market as well as on the Greek market.

The European IT industry was struggling to survive in competition with the US, and later with Japan, in both the mainframe and minicomputer manufacturing industries, until the first personal computers appeared during the late 1980s and 1990s. That was a game changer for both open systems and the rest of the software industry, due to the rise of Microsoft, which developed into a near-monopoly.

This gave a big boost to hardware manufacturing in China and Taiwan, but an even stronger impact on the spread of IT came from the rise of open-systems applications and faster communication networks, with the internet reaching ever deeper into human societies.

OPEN SYSTEMS, UNIX AND LINUX

As computer hardware became smaller, faster, easier to install and operate, and considerably cheaper, more companies purchased systems and began outsourcing software development services. As a consequence, a number of third-party software houses started developing commercial applications, marketing standard software packages that ran on different hardware platforms.

Open systems are computer systems that offer interoperability, portability and open software standards. In practice, ‘open systems’ meant the Unix world, because Unix ran on more types of computer hardware than any other operating system.

It was a natural trend for software producers to seek a common environment in which to transfer their products and implement packaged solutions on different hardware platforms.

That is how ‘open systems’ became the surviving trend against the legacy systems prevailing at the time. However, the trend was resisted for decades by IBM and other firms, which protected their legacy platforms as much as possible.

The situation started changing in the first part of the 21st century, when many of the same legacy-system vendors, such as IBM and Hewlett-Packard, began to adopt Linux as part of their overall sales strategy.

The history of UNIX dates back to the mid-1960s, when MIT, AT&T Bell Labs and General Electric were jointly developing an experimental time-sharing operating system called ‘Multics’ for the GE mainframe. Multics introduced many innovations but also had many problems. Bell Labs, frustrated by the size and complexity of Multics but not by its aims, slowly pulled out of the project. Its last researchers to leave Multics decided to redo the work on a much smaller scale. The resulting system, much smaller and simpler than Multics, was to become Unix, which first ran on a DEC PDP-7 in 1969. In 1970 the UNIX operating system was officially named for the first time, running on the PDP-11/20.

During the 1980s, the multiplication of minicomputer installations gave birth to a plethora of software houses that started developing commercial applications on UNIX, as well as service-bureau companies; both acted as a precursor to what was later called outsourcing of IT services.

One of the companies that carried out most of the original UNIX ports was UniSoft Corporation, an American software developer established in 1981, originally focused on the development of UNIX ports for various computer architectures.

UniSoft Corporation

By 1989, UniSoft Corporation had completed over 225 UNIX implementations on various hardware platforms, estimated to have been about 65% of all such ports. This included porting UNIX Version 7 as the first operating system for Sun Microsystems machines. UniSoft also developed Apple’s UNIX variant for the Apple Macintosh II.

The first UNIX release was written entirely in assembly language in 1971, as was common practice at the time. UNIX was rewritten in the C programming language in 1973. The availability of a high-level language implementation made porting UNIX to different computer platforms far easier.

In spite of its contribution to open systems, UNIX began as a proprietary operating system, portable across platforms. However, under an earlier antitrust settlement forbidding AT&T from entering the commercial computer business, AT&T was required to license the operating system’s source code to anyone who asked.

As a result, UNIX grew quickly and became widely adopted by academic institutions and businesses. In 1984, freed of the legal obligation requiring free licensing, AT&T began selling UNIX as a proprietary product, under which users were not legally allowed to modify it.

Linux operating system

The GNU Project was a free-software, mass-collaboration project announced on September 27, 1983. Its goal was to give computer users freedom and control over their computers and computing devices by collaboratively developing and publishing software that gives everyone the right to freely run, copy, distribute, study and modify it, terms later codified in the GNU General Public License (GNU GPL, or simply GPL).

Development was initiated in January 1984. In 1991 the Linux kernel appeared, developed outside the GNU Project, and in December 1992 it was made available under version 2 of the GNU General Public License. Combined with the GNU software, it made possible the first operating system that was free software, commonly known as Linux.

Linux was originally developed for personal computers based on the Intel architecture, but has since been ported to more platforms than any other operating system. Because of the dominance of the Linux-based Android on smartphones, Linux also has the largest installed base of all general-purpose operating systems. Linux is the leading operating system on servers (over 96.4% of the top one million web servers run Linux), leads on other large systems such as mainframe computers, and is the only OS used on most supercomputers (since November 2017, having gradually eliminated all competitors).

Linux is one of the most prominent examples of free and open-source software collaboration. The source code may be used, modified and distributed commercially or non-commercially by anyone under the terms of its respective licenses, such as the GNU General Public License. 

PART B FOLLOWS SHORTLY

The usefulness of ECM (Enterprise Content Management) software for every business


Regardless of the sector a business operates in, the big picture in terms of benefits when it comes to using ECM information-management software is quite clear.

A single storage system is created for files in the form of digitized documents or other kinds of information (computer data, correspondence, audio, images, drawings, etc.) in a collaborative but controlled environment for managing and processing documents and information of multiple categories.

Once this base is complete and connected to all the software running on the company’s existing infrastructure, it can automatically maintain these files, documents and information, providing a central (full-stack) repository for all this data, regardless of its source and format.

Finally, the security, reliability and increased efficiency achieved with this software mean that its cost is amortized in minimal time.

Those who ignore the importance of ECM software are certain to face some disaster, or high operating costs, over the lifetime of an IT infrastructure.

Difficult communication or incompatibility will result in departments failing to share information properly.

Documents and information may be lost, deleted or wrongly altered through human error, or through the equipment failures and malware/ransomware that are certain to strike at some point.

Without ECM software to mitigate these problems, some or all of them can lead to serious damage and to much time and money lost in repairing the resulting anomalies.

These are common problems that have been, and continue to be, addressed by businesses using this now widely available technology.

Even more important, however, is the direct, controlled access to the data of the single repository by those taking part in the company’s various processes.

Enriching ECM systems with process-control tools completes the system’s effectiveness.

Document management is one of those cutting-edge software technologies often misunderstood by those not deeply involved in the software industry.

As a result, a simple summary of what it does can lead many businesses to wrongly assume that they can operate satisfactorily without it, relying on simple databases to track updates, organize files and keep regular backups.

Certainly, a business can survive even with these basic practices, but it will not be well served in the long run.

Document and information management (ECM) systems are critical to the success of a larger business with more complex processes.

Information management and communications are key, something that applies to every business.

ECM is designed to digitize and organize documents and information in a database that allows direct access.

For a growing business, every piece of this information needs to be kept intact. One never knows when a receipt, an email, a request, a proposal or a report may unexpectedly be needed.

Digitizing documents helps keep things organized.

Cloud infrastructure has come a long way toward the constant availability and security of this data. However, this is not enough, as the majority of cloud storage services offer at most a database and a file manager.

Let us take a look at the seven biggest advantages of using DMS and ECM software in any company.

We will quickly realize how vital such a software tool is.

In summary, the advantages can be grouped into the following seven benefits:

Benefit #1: Reliable backups.

Benefit #2: Access to data files and documents, structured and unstructured, based on the role of employees in the business, or even of external partners.

Benefit #3: Full audit trail and history of actions and data.

Benefit #4: Collaboration and interoperability with multiple application platforms.

Benefit #5: Payment consistency, useful for avoiding the errors that lurk when multiple systems take part in payment systems and processes.

Benefit #6: Independence of files and databases from the IT infrastructure in place at any given time.

Benefit #7: Significant help in mapping, monitoring and improving the processes of the business’s operation.

skymarkrelate.com

 

 

THE SELF-EVIDENT TRUTHS OF INFORMATION TECHNOLOGY


Sometimes the self-evident things are the ones that are neither understood nor applied, in the everyday conduct of business as well as in the wider management of the national economy, in both the public and private sectors.

This weakness has resulted in the chaos revealed today, first of all in the pensioners’ files lying scattered in the corridors of EFKA.

But the problems of file management, and of information management more generally, do not stop at EFKA or other public organizations; they extend to the private sector in everything concerning the operation of e-commerce, the circulation of electronic money, and the exchange of information between banks, the state and businesses.

Added to all this is the need to adapt to European legislation on the protection of personal data and to the international enforcement of the familiar ISO quality-control standards, etc.

Proper use of technology has become essential for remedying these problems, while at the same time it facilitates adaptation to the new requirements and conditions of the international market.

At the level of Greek public- and private-sector enterprises, certain attempts are being made; that is why we increasingly hear words such as digitization, electronic invoicing and electronic archiving.

At the same time, there is a recognized need for more cooperation among the different public, banking and private-sector organizations and bodies that take part in the business environment and require the exchange of information.

These efforts were, and remain, fragmentary and insufficient. The self-evident point is that digitizing information is not enough unless the processes are automated as well.

It is self-evident that the flow of information among the participants becomes much easier when the information is digitized; it is not self-evident, however, that the processes and responsibilities of the parties involved become part of the organizational structure of the bodies that make up the quality-control system.

In the end, it does not seem to be entirely self-evident that consolidating all the information and defining the processes contributes significantly to improving the competitiveness of businesses and of the economy in general.

Let us not blame the lack of competitiveness, then, on the cost of energy, raw materials and human resources; an enormous share of potential benefit lies hidden in properly managing the everyday information found in documents, electronic or otherwise, combined with automating the processes that ensure control and correct execution of the work.

These things are not self-evident, but they require only a little more attention and study in applying the right technology.

Νίκος Κούζος (Nick Kouzos)

President of Skymark Technologies

Tel: +30 22910 78964 | Mob: +30 697 66 96 568

Url: skymarkrelate.com

Implementing a Corporate Information Management Strategy

 

Capture, information processing, interconnection.

The implementation of a corporate information management strategy (Enterprise Content Management) is the most recent development in applied business IT, arriving like an avalanche, following and complementing the explosion of ERP installations that took place a few years ago in the international market.

Implementing an ECM solution in any sector is certain to bring significant and immediate benefits to most organizations and businesses.

Business managements have begun to realize many of the benefits and advantages of this strategy.

Despite this positive development, certain problems arise, mainly when different priorities are imposed for various reasons, whether by market conditions, by changes in technology, or by the internal needs of each company.

For this reason, management must understand the use of ECM technology fully and in detail, in order to decide how, and in which areas, it can benefit most. I will therefore take this opportunity to outline some of the considerations that arise concerning this strategy.

ECM technology essentially consists of three main parts:

  1. Methods of capturing and collecting information
  2. Processing and routing of information
  3. Interconnection with the existing IT systems infrastructure.

 


Methods of capturing and collecting information

This is the part that mainly replaces the data-entry effort, which is usually associated with entering data into the ERP.

It is an area easily understood by management, but it is also the process that accounts for the smallest share of the company’s operating cost, and one whose cost can easily be estimated.

It is a fact that computers have evolved significantly and continue to evolve, developing ‘self-learning’ characteristics that have created expectations of major future improvements in the automatic reading of documents, automatic digitization, or even the automatic reading of handwritten notes. We already see a good level of quality even in automatic translation, etc.

But we are not there yet.

Many implementations exploiting the technological achievements above are still carried out today, but they apply only to very specific cases, and most of them prove particularly costly.

On the other hand, the production of digital electronic information is at the same time very quickly replacing printed documents with electronic ones, which reduces the requirement for automatic reading of such documents.

For example, we are experiencing a significant rise in electronic invoicing and in electronic data interchange (EDI) in general, which completely eliminates the need for automatic, intelligent reading of those documents.

It is therefore doubtful whether money should be invested in costly information-capture technologies that will not be exploited in the years to come.

 


Information processing

This is the area in which investment in an ECM strategy proves most beneficial in the long term.

This is the area where a paperless environment, the paperless office, is created.

At this stage, automatic workflow processes come into play, allowing management and employees to communicate with each other and to access information of any form faster, in order to achieve better control, faster decision-making, continuous monitoring of processes, automatic archiving, effective data protection, elimination of errors and, finally, independence from the existing IT systems infrastructure.

It should be noted that the cost of data entry performed with human involvement is fully visible and can be compared directly with the investment required to automate it.

But that automation comes at a disproportionately high cost, while the cost of manual data entry amounts to about 6% of the company’s total operating cost.

So where should the effort be focused? Where does the real operating cost lie?

How important is it to estimate the savings and benefits achieved through faster and better decision-making, continuous real-time monitoring of business activities, data integrity and security, and direct access to information?

Finally, how can one evaluate the simplification of processes and the importance of flawless management of business workflows, as well as the identification of business tasks that consume surplus human resources and other means? And how important is the identification of the bottlenecks that must be addressed?

It is at these points that significant improvements can be achieved, disproportionately greater than any savings from applying automation at the data-entry stage, however spectacular or impressive the latter may appear.

So, in conclusion, any decision to implement an ECM strategy must be the result of a careful study that identifies the areas in which the investments should be made.

Systems interconnection

The last part of an ECM implementation project to be considered is the ability of the ECM platform to communicate with any existing IT systems infrastructure: to avoid overlap of processes and data, to allow data exchange and collaboration, and above all to ensure the independence of the ECM central information base (data and workflows), which is preserved whenever any part of the remaining IT infrastructure has to be replaced.

A great advantage of an ECM system is that its central database is retained for a much longer period, even after changes to the rest of the IT infrastructure.

Enterprise Content Management as part of the 4th technological revolution


Let us also reflect productively for a moment. There is room for leaps in the productivity of the private and public sectors. But as with the bailout memoranda, so with productivity: we will move forward through the mandatory procedures imposed by international organizations, first among them the European institutions. Read on!

http://skymarkrelate.com , http://skymarkrelate.com/blog/

 

IMPROVING COMPETITIVENESS (Awakening business community)


 (An article about the awakening of the business and political community)

An issue that should concern most of us

Greece is experiencing one of its most difficult moments in recent history, and certainly the most difficult since the fall of the junta regime.


We are at the bottom of the European list in terms of competitiveness and in the last four positions in important parameters that define the standard of living and rates of development.

Most recently, Greece figured fourth from the top of the list of the lowest-ranked economies, just after Venezuela, Cuba and Bolivia.

“Greece slides to 86th place among 138 countries, compared to 81st in the previous evaluation. Political instability, taxation and bureaucracy affect competitiveness, according to the World Economic Forum.”

We continue to focus our efforts on restructuring the huge debt created in the last few years, rather than on improving competitiveness in a globalized world environment, which inevitably defines the framework of our possibilities.

This should be the first and most important goal and motivation for both the public and private sector. Unfortunately, the concept of increased competitiveness is misunderstood.

Improving competitiveness is not just the reduction of human resources in public or private organizations; it implies increased efficiency, savings in media and materials, and improved processes within companies that reduce or even eliminate errors, close time gaps and, finally, achieve better operational control of the business units.

Let us emphasize that improving competitiveness cannot replace the importance of a company’s strategy, but it can help strengthen a comparative advantage in a fierce global competitive environment.

Many investments are made in the field of business organization, both in introducing methods and means of automation, such as the implementation of quality-control applications and compliance with international standards, and in renewing computer equipment. All such investments aim to improve efficiency, covering so much of the operating procedures of the business that we could not even imagine a company operating without the instruments and methods mentioned above.

The improvements that can be achieved are directly proportional to the complexity of the processes and operational workflows, which usually involve the participation of many people serving different levels and functions, such as executive, corporate, strategic, control, auditing and management roles.

Proper management of, flow of and easy access to information, together with proper control of the operational processes in a business, leaves almost unlimited room for great improvements.

The technology that complements the existing technological environment and makes better use of any existing infrastructure, while achieving the maximum benefit, is called ECM (Enterprise Content Management) technology.

By adopting ECM technology horizontally, in almost any industry sector and across numerous departments within a company, one can raise competitiveness in minimal time and at minor cost compared with other investments made in IT infrastructure and organizational tools.

The promotion of this technology can bring immediate results at an exceptionally high return on investment (ROI), in both the public and private sectors.

The awakening of the business community about the possibilities of Enterprise Content Management systems (ECM) can enhance the country’s competitiveness.

For more information please visit http://www.modus.gr/en

Or contact me at nkou@skymarkrelate.com or visit http://skymarkrelate.com/

 

 

 

“PRODUCTIVITY”: THE CRITICAL FACTOR IN TACKLING THE CRISIS


(An article with a professional approach)

Great disagreement seems to persist between politicians and other players, economic, business and so on.

The need to improve the competitiveness of the Greek economy remains both a goal and a subject of debate between all Greek governments and the lenders.

Everyone agrees, regardless of political ideology, that ‘productivity’ is the common factor in tackling the crisis.

Although everyone agrees, in practice we continue to face rising operating costs, falling production, and growing bureaucracy in the public, banking and private sectors alike.

The government announces great savings in the public sector from the abolition of photocopies.

We recently read the proud announcement from the public sector that abolishing photocopies from its procedures yielded savings of €500 million. But this is a negligible amount compared with the total losses from the bureaucratic procedures of the state that we have endured for decades.

But it is wrong to focus our criticism only on the public sector; the impact of bureaucracy on productivity is a very serious problem in the private sector as well, regardless of the particular specialty.

The delay in realizing the contribution of the first IT applications

It took us more than forty years (the 1960s, ’70s, ’80s and ’90s) to realize the contribution of IT to the production process, which led to the very wide spread and adoption of ERPs. I hope it will not take us more than two further decades to realize the contribution that Enterprise Content Management (corporate information management) technologies, now spreading worldwide at avalanche rates, can make to reducing costs and correspondingly raising productivity.

The need for faster awareness of the second phase of applications: Enterprise Content Management technology

For the non-specialist, these technologies concern automating, as far as possible, the management, distribution, storage and retrieval of information, as well as automating the processes of any private or public enterprise or sector, combining data and information originating from electronic or other media such as printed forms, manuscripts, faxes, e-mails, drawings, photographs, video, etc.

The payback period.

Implementing this technology typically pays for itself in under twelve months, with a minimal implementation time that bears no comparison to the corresponding installation and implementation times of an ERP system.

The dramatic change taking place today.

Within a few years, we predict, there will be no public- or private-sector organization anywhere in the world operating without ECM technologies, while in Greece we will still be hunting for little expense receipts, lost or misfiled documents, contracts, folders mouldy with age, lost insurance stamps from the various professional funds, convoluted quality-control procedures, expired bank powers of attorney and authorizations, and so many other records and documents.

Mere familiarity with ECM technologies is already a gain.

At the same time, lawyers, accountants, auditors, judges, tax consultants, and engineers will be drowning in thousands of amendments, provisions, case law, minutes of boards and courts, and so on.

In both the collective and the individual setting, in the public and private sectors alike, even mere familiarity with ECM technologies will be a gain.

For the reasons above, I am also professionally engaged in promoting and implementing Enterprise Content Management applications of the Greek company DataTeam Solutions, www.datateamsolutions.com.

DataTeam Solutions provides free advice and a budget study of the ROI achievable by applying these technologies in specific sectors such as distribution/logistics, manufacturing, services, and shipping, among the many other kinds of businesses operating in Greece and abroad.

Nikos Kouzos

NEW TRENDS FOR IMPROVING COMPETITIVENESS


An easy and painless way to reduce a business's operating costs.

The evolution of information technology, and above all the use of financial-management systems (ERPs), has so improved the way businesses operate that one could no longer imagine a business functioning without them.

The tools of new technology, now part of everyday life for people who have grown familiar with them, are likewise a genuine extension of human capability: personal computers, mobile phones, laptops, points of sale (POS), contactless credit cards, the internet, e-pass, e-ticketing, e-banking, voice ordering, SMS, electronic invoicing, photographic data, video, websites, and so much more.


Nevertheless, paper still rules over us, both as recipients and as senders of information circulated in printed form: paper, ink, reams of computer printouts, photographs, price lists, binders, folders, letters, faxes, photocopies, tickets, stamps, signatures, receipts, applications, and so on.

Has anyone ever wondered what professional life would be like without paper? What does it actually cost each business, the public sector, society, and the natural environment, both as a material and in its handling and storage?

ERP systems essentially provide processed information and data, what is termed "structured" information, in contrast to printed material, images, video, microfilm, sound, and many other electronic and non-electronic media, that is, "unstructured" information, which also demands its own particular management.

Has anyone wondered whether, in the electronic age we live in, the ratio of structured to unstructured information is changing, and in which direction?

To make it easier to grasp the avalanche of changes that is predicted, I will touch below on a few aspects of a business's everyday life.

How would it seem to you if all documents and data, structured and unstructured information, images, manuals, contracts, outgoing and incoming documents, originating from different sources (ERP, e-mail, fax, electronic documents, correspondence, drawings), even from different systems old and new, and regardless of their type or form, resided in a single, unified repository, instantly accessible to everyone, with the data encrypted, secured, properly filed, and cross-referenced?

How would it seem to you if the binders, cabinets, endless shelves, and drawers suddenly vanished from the offices and corridors where they occupy a business's or organization's vital space?

How would it seem to you if the general manager or the financial director, who are almost never at their desks because they are travelling or in meetings, could approve purchases, leave, or expenses straight from their mobile phone, with a full view of the documents and data, without having to go through their secretary out of necessity, with the inevitable ambiguities and potential errors that entails?

How would it seem to you if the endless arguments triggered by discrepancies or disputes between supplier or customer invoices and the corresponding orders, price lists, purchase requisitions, or any contracts and technical specifications ended, instead of in interminable telephone quarrels, in a self-evident e-mail with all the relevant documents attached?

How would it seem to you if filing physical documents disappeared from the daily routine, while retrieving a physical document, should that prove necessary once or twice, could be done easily, taking only a few minutes a year?

But let me stop talking incessantly, possibly firing the imagination of some, or perhaps tiring others.

I will simply close by stressing that this technology, in use for many years now, delivers the capabilities described above. The resulting benefits, both from measurable savings and from qualitative, non-measurable ones, are very significant, while the required investment is roughly equal to a single year's measurable savings alone, in other words, "pay as you earn".

Under today's conditions of crisis and competition, this is the most painless way to achieve dramatic reductions in business operating costs while immediately improving efficiency and upgrading the quality of the services offered, literally achieving a triple-digit Return on Investment (ROI).
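The "pay as you earn" claim above reduces to simple arithmetic: if the one-off investment equals roughly one year's measurable savings, payback lands at about twelve months and the multi-year ROI runs to three digits. A minimal sketch, in which every figure is an illustrative assumption rather than data from this article:

```python
# Hypothetical payback/ROI arithmetic for an ECM rollout.
# All monetary figures below are assumed for illustration only.

def payback_months(investment: float, monthly_savings: float) -> float:
    """Months needed for cumulative measurable savings to cover the investment."""
    return investment / monthly_savings

def roi_percent(investment: float, total_savings: float) -> float:
    """Simple ROI over a period, as a percentage of the initial investment."""
    return (total_savings - investment) / investment * 100

# Assumed: investment equals one year's measurable savings ("pay as you earn").
investment = 60_000.0       # one-off licence + implementation cost (assumed)
monthly_savings = 5_000.0   # measurable monthly savings (assumed)

print(payback_months(investment, monthly_savings))    # 12.0 months to break even
print(roi_percent(investment, monthly_savings * 36))  # 200.0 (% ROI over 3 years)
```

With these assumed numbers the investment is recovered in year one, and the savings of years two and three become the triple-digit return the text refers to.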

Finally, beyond the strictly financial effects, it is time we also took seriously the environmental effects, which undoubtedly carry their own social cost or benefit; in this case they amount to hundreds of acres of forest, with hundreds or even thousands of trees.

I hope that everything I have touched on has given some of you food for thought, or even prompted you at least to begin an investigation into evaluating this technology in your own business environment, whether a private or a public activity; I assure you that you will have made a significant contribution to the sector you serve.

 

Nikos Kouzos

President, Skymark Technologies

Authorized Modus Partner   http://www.modus.gr/

Tel: +30 210 89 57 741

Mob: +30 6976696568

Email: nkou@skymarkrelate.com

Addr: 36 Nikitara Str. 166 73, Voula, Greece

Lower your environmental footprint using DataTeam Solutions technologies…