Tuesday, July 12, 2011

AUTOMATION

MEANING OF AUTOMATION

1.      The automatic operation or control of equipment, a process, or a system.
2.      The techniques and equipment used to achieve automatic operation or control.
3.      The condition of being automatically controlled or operated.
4.      [noun] The act of implementing the control of equipment with advanced technology; usually involving electronic hardware; "automation replaces human workers by machines"
Synonyms: mechanization, mechanisation
5.      [noun] The condition of being automatically operated or controlled; "automation increases productivity"
6.      [noun] Equipment used to achieve automatic control or operation; "this factory floor is a showcase for automation and robotic equipment"



DATA PROCESSING

(1) Refers to a class of programs that organize and manipulate data, usually large amounts of numeric data. Accounting programs are the prototypical examples of data processing applications. In contrast, word processors, which manipulate text rather than numbers, are not usually referred to as data processing applications.
(2) Same as Information Technology (IT), refers to all computing functions within an enterprise.

DATA
Data are pieces of information that represent the qualitative or quantitative attributes of a variable or set of variables. Data (plural of "datum", which is seldom used) are typically the results of measurements and can be the basis of graphs, images, or observations of a set of variables. Data are often viewed as the lowest level of abstraction from which information and knowledge are derived.
data processing
n.
1. Conversion of data into a form that can be processed by computer.
2. The storing or processing of data by a computer.

data processing: The systematic performance of operations upon data such as handling, merging, sorting, and computing.  Note: The semantic content of the original data should not be changed. The semantic content of the processed data may be changed.
Electronic Data Processing (EDP) can refer to the use of automated methods to process commercial data. Typically, this uses relatively simple, repetitive activities to process large volumes of similar information. For example: stock updates applied to an inventory, banking transactions applied to account and customer master files, booking and ticketing transactions to an airline's reservation system, billing for utility services.
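The stock-update example above can be sketched in a few lines. This is a minimal, hypothetical illustration of batch-style EDP (the item codes and quantities are invented): transactions are sorted and then applied in bulk against a master record set.

```python
# Hypothetical inventory master file: item code -> quantity on hand
inventory = {"A100": 50, "B200": 12, "C300": 7}

# A batch of stock-update transactions (negative = shipped, positive = received)
transactions = [
    ("B200", -5),
    ("A100", +20),
    ("B200", -3),
]

# Classic batch runs sorted transactions by key before applying them,
# so serial media could be read in a single pass.
for item, change in sorted(transactions):
    if item not in inventory:
        raise ValueError(f"unmatched transaction for item {item}")
    inventory[item] += change

print(inventory)  # {'A100': 70, 'B200': 4, 'C300': 7}
```

The same pattern (sort, merge against a master file, compute) underlies the handling, merging, sorting, and computing operations listed in the definition above.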

The first commercial business computer was developed in the United Kingdom in 1951, by the Joe Lyons catering organization. This was known as the 'Lyons Electronic Office' - or LEO for short. It was developed further and used widely during the 1960s and early 1970s. (Joe Lyons formed a separate company to develop the LEO computers and this subsequently merged to form English Electric Leo Marconi and then International Computers Ltd.)[1]

Early commercial systems were installed exclusively by large organizations. These could afford to invest the time and capital necessary to purchase hardware, hire specialist staff to develop bespoke software and work through the consequent (and often unexpected) organizational and cultural changes.
At first, individual organizations developed their own software, including data management utilities. Different products might also have 'one-off' bespoke software. This fragmented approach led to duplicated effort, and producing management information required manual effort.
High hardware costs and relatively slow processing speeds forced developers to use resources 'efficiently'. Data storage formats were heavily compacted, for example; a common case was the removal of the century from dates, which eventually led to the 'millennium bug'.
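The date-compaction problem can be made concrete with a short sketch. Storing only two digits of the year makes values like '00' ambiguous; a common workaround was a "pivot" window, shown here with hypothetical values (the pivot choice of 50 is arbitrary, not from any specific system):

```python
def expand_year(two_digit: int, pivot: int = 50) -> int:
    """Expand a two-digit year using a pivot window:
    years below the pivot are treated as 20xx, the rest as 19xx."""
    return 2000 + two_digit if two_digit < pivot else 1900 + two_digit

print(expand_year(99))  # 1999
print(expand_year(1))   # 2001
```

Systems that simply prefixed '19' to every two-digit year, rather than windowing, were the ones that broke at the turn of the century.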
Data input required intermediate processing via punched paper tape or cards, a repetitive, labor-intensive task that was removed from user control and error-prone. Invalid or incorrect data needed correction and resubmission, with consequences for data and account reconciliation.
Data storage was strictly serial, first on paper tape and later on magnetic tape: holding data in readily accessible memory was not cost-effective.

Today

As with other industrial processes, commercial IT has moved from a bespoke, craft-based industry, where the product was tailored to fit the customer, to multi-use components taken off the shelf to find the best fit in any situation. Mass production has greatly reduced costs, and IT is available to even the smallest company.
LEO was hardware tailored for a single client. Today, Intel Pentium and compatible chips are standard and become parts of larger components that are combined as needed. One notable change was the freeing of computers and removable storage from protected, air-filtered environments. Microsoft and IBM have at various times been influential enough to impose order on IT, and the resulting standardization allowed specialist software to flourish.
Software is available off the shelf: apart from packages such as Microsoft Office or Lotus, there are also specialist packages for payroll and personnel management, account maintenance and customer management, to name a few. These are highly specialized and intricate components of larger environments, but they rely upon common conventions and interfaces.
Data storage has also standardized. Relational databases are developed by different suppliers to common formats and conventions. Common file formats can be shared by large mainframes and desktop personal computers, allowing online, real-time input and validation.
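A small sketch can show what this standardization looks like in practice: the same SQL conventions work across suppliers' databases. Here sqlite3 from Python's standard library stands in purely as an illustration, with an invented accounts table; the point is the online, real-time update that replaced the old batch cycle.

```python
import sqlite3

# An in-memory database with a hypothetical accounts table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
conn.execute("INSERT INTO accounts (id, balance) VALUES (1, 100.0)")

# Online, real-time update and validation, applied immediately
conn.execute("UPDATE accounts SET balance = balance - 25.0 WHERE id = 1")
(balance,) = conn.execute("SELECT balance FROM accounts WHERE id = 1").fetchone()
print(balance)  # 75.0
```

The same statements would run largely unchanged against relational databases from other suppliers, which is the point of the common conventions described above.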
In parallel, software development has fragmented. There are still specialist technicians, but these increasingly use standardized methodologies where outcomes are predictable and accessible. At the other end of the scale, any office manager can dabble in spreadsheets or databases and obtain acceptable results (but there are risks).

