Powerful Informatics Revolutionizing Digital Data Security 2024

Informatics, or computer science, is the science that studies how information is processed and the devices that process it, such as the computer.

Informatics is the science that deals with information processing through automated procedures. Its goal is, in particular, to study the theoretical foundations of information and computation at the logical level, and the practical techniques for applying them in automated electronic systems, the so-called computer systems.

As such, it is a discipline strongly associated with mathematical logic, automation, electronics, and electromechanics. It accompanies, integrates, or supports all scientific disciplines, and as a technology it permeates almost every medium and tool of common, everyday use, to the point that almost all of us are users of IT services in some way.

The social and economic value of IT has skyrocketed in just a few years, moving from an operational role, replacing or supporting simple repetitive tasks, to a tactical role supporting short-term planning and management, and finally to a strategic one.

In this context, information technology has become so strategic for the economic and social development of populations that the inability to exploit it, a situation that has come to be known as the digital divide, has become a problem of global concern.

Together with electronics and telecommunications, grouped under the name ICT, it represents the discipline and, at the same time, the economic sector that generated and developed the third industrial revolution through what is generally known as the digital revolution.

Information technology continues to develop, especially in the field of telecommunications.

History of informatics

The history of informatics, or rather of computer science, is the history of the science of the same name. It has very ancient origins: mechanisms for processing data and performing calculations were already known to the Babylonians around the 10th century BC, and in India and China, perhaps even earlier.

However, in the modern sense it arises above all from the work of a few precursors, the creators of the first far-reaching computational projects.

Informatics in ancient times

The oldest known calculating tool was the abacus, which among the ancient Babylonians, Chinese, Greeks, and Romans was a board with numbered grooves containing movable stones arranged so as to allow calculations to be made. From these ancient and medieval types derives the abacus made of beads strung on wooden rods or metal wires, still used for elementary counting.

In the first millennium BC, the first differential gears were invented in China; they are found in chariots dating from this period. In the fifth century BC, in ancient India, the grammarian Pāṇini formulated the grammar of Sanskrit in 3,959 rules, in a highly systematic and technical work that used metarules, transformations, and recursion.

The Antikythera mechanism is the oldest known mechanical calculator, dating from between 150 and 100 BC or, according to more recent hypotheses, to around 250 BC. It was found in a shipwreck off the coast of Greece. It was an advanced planetarium, driven by cogwheels, used to calculate the sunrise, the phases of the moon, the movements of the five planets known at the time, the equinoxes, the months, and the days of the week.

Informatics in the Middle Ages

Analog mechanical calculating devices appeared again a thousand years later, in the medieval Islamic world, thanks to Arab astronomers, such as the mechanical astrolabe of Abu Rayhan al-Biruni and the astronomical instruments of Jabir ibn Aflah.

Arab mathematicians also made important contributions to cryptography; just think of Al-Kindi's development of cryptanalysis and frequency analysis. Arab engineers also invented programmable machines, such as the automatic flute of the Banu Musa brothers, or the automata and the candle clock of Ibn al-Razzaz al-Jazari, which tracked time through the loss of weight of a burning candle.

Technological artifacts of similar complexity appeared in 14th-century Europe, such as mechanical astronomical clocks.

The seventeenth, eighteenth, and nineteenth centuries

Since the introduction of logarithms at the beginning of the seventeenth century, when the Scot John Napier published the first tables of logarithms, there has followed a period of great progress in automatic calculating devices, thanks to many inventors and scientists.

In 1623, the German scientist Wilhelm Schickard designed a calculating machine but abandoned the project when the prototype he had begun building was destroyed in a fire in 1624.

Around 1640, the French mathematician and philosopher Blaise Pascal built the Pascaline, a mechanical calculator whose design drew on that of the Greek mathematician Hero of Alexandria.

In 1672, the German mathematician Gottfried Wilhelm Leibniz invented a calculating machine, known in English as the Stepped Reckoner, which he completed in 1694.

In 1702, Leibniz developed logic as a formal mathematical system with his writings on the binary number system, in which one and zero represent the values true and false. But it took more than a century before George Boole published his Boolean algebra in 1854, creating a system in which any logical relationship can be manipulated through algebraic formulas. Operations such as addition, subtraction, and multiplication are replaced by the logical operations of conjunction, disjunction, and negation, while the only values used, 1 and 0, take the meanings of true and false respectively.
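A minimal illustrative sketch of this idea (the function names are mine; Boole of course worked with algebraic notation, not code): with 1 and 0 standing for true and false, the three logical operations replace the usual arithmetic, and laws of logic become algebraic identities.

```python
# Minimal sketch: Boolean algebra with 1/0 standing for true/false.
# Conjunction, disjunction, and negation take the place of arithmetic.

def conjunction(a, b):   # logical AND
    return a & b

def disjunction(a, b):   # logical OR
    return a | b

def negation(a):         # logical NOT
    return 1 - a

# Any logical relationship can be manipulated as an algebraic formula,
# e.g. De Morgan's law: NOT(a AND b) == (NOT a) OR (NOT b)
for a in (0, 1):
    for b in (0, 1):
        assert negation(conjunction(a, b)) == disjunction(negation(a), negation(b))
```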

From this time on, the first mechanical devices driven by a binary system were invented, and the Industrial Revolution prompted the mechanization of many activities, including weaving.

Charles Babbage is often recognized as one of the first pioneers of automatic computing. He designed a very complex automatic calculating machine, the Difference Engine, which he managed to build only with great difficulty, partly because of the limits of the mechanical engineering of his time.

Thanks to the method of differences, which lends itself particularly well to a mechanical implementation, he created a system for automatically executing the calculations needed to compile mathematical tables. He then designed, starting from the punched cards of the French Jacquard loom, a new machine, the Analytical Engine, for which he defined a digital computing unit (we could say the processor), an execution control unit, a memory for storing intermediate results, and an output device for displaying the result of the calculation.

His assistant, Ada Lovelace, daughter of the English poet George Byron, devised a way to program the machine, at least at a theoretical level, and is therefore considered the first programmer in history. In the 1980s, a programming language named Ada was created in her honor.

Babbage's Analytical Engine, extremely large and expensive to build, was never completed due to lack of funds. A path had nevertheless been opened, even if the automatic computing revolution, which had begun some 2,300 years earlier, would become a planetary phenomenon only with the advent of electronics.

Informatics in the Twentieth Century

Alan Turing is best known for his crucial contribution, during World War II, to the project of deciphering encrypted messages used by the Germans with their Enigma machine.

But this activity of his ended up overshadowing his essential role as the father of informatics at a time when the discipline did not yet have a name, and computers performed tasks barely superior to those of a desktop calculator.

By focusing his research on computation, that is, on evaluating the possibility of performing certain operations by a machine, he defined, when just over twenty years old, the theoretical frontiers of current and future informatics.

His subsequent research could not fail to touch on the field of what would later be called artificial intelligence. The famous test that bears his name is still at the heart of the very open debate about the ability of machines to compete with the human mind.

However, the mathematical foundations of modern informatics were laid by Kurt Gödel with his incompleteness theorems of 1931.

The first stresses the impossibility of proving, from within the system itself, the consistency of any mathematical system rich enough to contain the natural numbers, that is, of showing that its principles or axioms do not conflict with each other. Together with the completeness theorem of 1930, which asserts the semantic completeness of predicate logic by showing that if a formula is valid it can be proven in a finite number of steps, these results represent a cornerstone of historical importance in mathematical logic, with important philosophical implications as well.

These results also led to the definition and description of formal systems, including concepts such as recursive functions, the lambda calculus, the universal Turing machine, and Post systems.

In 1936, Alan Turing and Alonzo Church presented an algorithmic formulation, with limits on what can be computed, as well as a purely mechanical model of computation.

This became the Church-Turing thesis, a hypothesis about the nature of mechanical computing devices such as electronic calculators. The thesis states that any calculation that can be performed by an algorithm can be carried out by a computer, provided sufficient time and storage space are available. In the same year, Turing also published his seminal paper on the Turing machine, an abstract digital computing machine now simply called the universal Turing machine.

This machine enshrined the principle of the modern computer and represented the origin of the concept of the stored-program computer, which is now used by practically every modern computer.

These abstract machines are designed to determine, formally and mathematically, what can be computed, taking into account constraints on computing power. If a task can be completed by a Turing machine, it is said to be Turing computable; a system that can simulate any Turing machine is, in turn, said to be Turing complete.
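To make the idea concrete, here is a minimal, hypothetical sketch of a one-tape Turing machine simulator (the function name and the sample transition table are invented for illustration). The example machine simply inverts every bit written on the tape and halts when it reaches the blank symbol.

```python
# A minimal one-tape Turing machine simulator (illustrative sketch only).
# The transition table maps (state, symbol) -> (new_symbol, move, new_state).

def run_turing_machine(tape, transitions, state="start", blank="_"):
    tape = list(tape)
    head = 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else blank
        if (state, symbol) not in transitions:
            break                                # no rule: the machine stops
        new_symbol, move, state = transitions[(state, symbol)]
        if head < len(tape):
            tape[head] = new_symbol
        else:
            tape.append(new_symbol)
        head += 1 if move == "R" else -1
    return "".join(tape)

# Example machine: invert every bit on the tape, then halt on the blank.
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("10110_", flip_bits))   # -> "01001_"
```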

Starting in the 1930s, electrical engineers were able to build electronic circuits to solve logical and mathematical problems, but many did so on an ad hoc basis, ignoring any theoretical rigor.

This changed with the switching circuit theory of the engineer Akira Nakashima, published in those years. From 1934 to 1936, Nakashima published a series of papers showing that two-valued Boolean algebra, which he had discovered independently, could describe the operation of switching circuits.

This concept of using the properties of electrical switches to obtain logical results is the basic principle upon which all electronic digital computers are based. Switching circuit theory has provided the foundations and mathematical tools for digital system design in almost every field of modern technology.

Nakashima's work was later cited and taken up in Claude Elwood Shannon's 1937 master's thesis, entitled A Symbolic Analysis of Relay and Switching Circuits.

While attending a philosophy class, Shannon realized that Boolean algebra could be used to arrange electromechanical relays so as to solve logic problems. His thesis became the foundation of digital circuit design once it became widely known to the electrical engineering community during and after World War II.

In 1941, Konrad Zuse developed the first working program-controlled computer, the Z3. In 1998 it was shown to be Turing complete. Zuse also developed the S2, considered the first industrial control machine.

He founded one of the first computer companies in 1941, producing the Z4, which became the world's first commercial computer. In 1946 he designed the first high-level programming language, Plankalkül.

In 1948, construction of the Manchester Baby was completed; it was the first general-purpose digital electronic computer to run stored programs, like most modern computers. The influence on Max Newman of Turing's 1936 paper on Turing machines, and Turing's logical and mathematical contributions to the project, were decisive for the subsequent development of the Manchester machine.

In 1950, the British National Physical Laboratory completed the Pilot ACE, a small programmable computer based on Turing's philosophy, with an operating speed of 1 MHz. For a time it was the fastest computer in the world.

Turing's design for the ACE had much in common with today's RISC architectures, and if it had been built exactly as intended it would have been in a different league from the other early computers.

In 1948, Claude Shannon published an article entitled A Mathematical Theory of Communication, one of the pillars of modern information theory and computer science. In it the term bit appeared for the first time, coined by Shannon to designate the elementary unit of information, together with the concept of information entropy and the correspondence between the truth values (true and false) of symbolic logic and the binary values 1 and 0 of electronic circuits.
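As a brief sketch of the idea behind the bit and information entropy (the function name and the example distributions are mine, not Shannon's): the entropy of a source whose symbols appear with probabilities p_i is H = -Σ p_i log2 p_i, the average number of bits of information carried by each symbol.

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries exactly one bit of information per toss...
print(entropy_bits([0.5, 0.5]))    # 1.0
# ...while a heavily biased coin carries much less.
print(entropy_bits([0.9, 0.1]))    # about 0.47
```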

Through his works devoted to information theory, circuit reliability, and the problems of communications security and cryptography, Shannon profoundly changed the theory and practice of communication.

After receiving his doctorate at the age of 18 from Harvard University with a thesis on mathematical logic, Norbert Wiener studied in Europe with Bertrand Russell and David Hilbert and, beginning in 1919, taught at the Massachusetts Institute of Technology in Cambridge.

He made fundamental contributions to the field of the mathematical theory of random processes, prediction, and probability calculation, and, starting with his works in statistics, developed, with his student Claude Shannon, the modern theory of information.

During World War II he dealt with the problems of the automatic control of military weapons. Stimulated by this research, he developed the project of a general science of control and organization, which he named cybernetics and presented in a highly successful book entitled Cybernetics, or Control and Communication in the Animal and the Machine.

From then on, while continuing to work on general mathematics, he devoted himself mainly to the development and dissemination of the new discipline.

In informatics, the von Neumann architecture is a type of hardware architecture for stored-program digital computers in which program data and program instructions share the same memory space.

For this property, the von Neumann architecture contrasts with the Harvard architecture where program data and program instructions are stored in distinct memory spaces.

Von Neumann proposed a very simple architecture which is what we find mirrored in principle in our computers.

According to John von Neumann, the basic elements of a programmable computer are the following (a minimal sketch of how they cooperate is given after the list):

  • The control unit, which governs the sequence of operations and ensures that they are carried out correctly;
  • The arithmetic logic unit (ALU), which performs arithmetic and logical operations;
  • The accumulator, a memory unit located inside the ALU, able to receive information from the outside (input data), pass it to the system, and return the results of the calculations to the outside world (output data);
  • The memory, which must be accessible in a very short time in order to retrieve the data and the programs it contains.
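As announced above, here is a minimal sketch of how these elements cooperate, assuming an invented toy instruction set (the opcodes and the memory layout below are purely illustrative): instructions and data share the same memory, the control unit fetches and decodes each instruction, and the ALU updates the accumulator.

```python
# Toy von Neumann machine: program and data live in the same memory.
# Each instruction is an (opcode, address) pair; the opcodes are invented.

def run(memory):
    accumulator = 0          # accumulator inside the ALU
    pc = 0                   # program counter, driven by the control unit
    while True:
        opcode, address = memory[pc]   # fetch
        pc += 1
        if opcode == "LOAD":           # decode + execute
            accumulator = memory[address]
        elif opcode == "ADD":
            accumulator += memory[address]
        elif opcode == "STORE":
            memory[address] = accumulator
        elif opcode == "HALT":
            return memory

# Program occupies cells 0-3, data cells 4-6: compute memory[4] + memory[5].
memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 7, 35, 0]
print(run(memory)[6])   # -> 42
```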

What is informatics?

Informatics in Common Language

Computer science is an ambiguous term in common parlance. In fact, we use it to refer to three distinct, albeit related, things.

1. Informatics as a set of applications and artifacts (computers)

It's the layman's perception: “Informatics literacy” means “knowing how to use applications” (such as using a personal computer to write texts, send emails, and browse the Internet).

2. Computer science as a technology that creates applications

It is the perception of the technician, the expert, and the hobbyist (each at his or her level of competence): “Informatics knowledge” means “knowing how to build applications”.

3. Computer science as the scientific discipline that establishes and makes this technology possible

This is the perception of a graduate in informatics: “Knowing computer science” means “knowing how to see the computational content of reality and being able to describe it in an appropriate (even formal) language.”

The operational aspect (point 1) and the technological aspect (point 2) are present to some extent in some non-computer-based courses. On the other hand, the scientific aspect (point 3) is neglected in non-informatics courses, which often leads to serious consequences.

Not only is informatics (or computer science) not reducible to the use of its tools, but using the tools without exposure to the scientific principles causes, on the one hand, the obsolescence of skills (change the tool and the skills change) and, on the other, prevents us from seeing its potential for innovation.

Informatics: technical definition

Informatics studies the actual processes of information processing (storage, transmission, and so on). It contributes to science with concepts of its own, such as efficiency, computational complexity, and the hierarchy of abstractions.

Like other sciences, it is involved in the study of techniques for solving problems. Problem-solving means breaking a problem down into sub-problems, restructuring them, solving them, and then reassembling their solutions.

Informatics, and this is one of its important original contributions, provides linguistic tools designed to make this possible and as simple as possible. Furthermore, it examines the similarities between problems and their solutions, providing the tools needed to build effective and robust solutions.
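As a small illustrative sketch of this decompose-solve-recombine strategy (the example problem, summing a list by splitting it in half, is a hypothetical choice of mine, not from the text):

```python
# Problem-solving as decomposition: split the problem into sub-problems,
# solve them (recursively), then reassemble the partial solutions.

def recursive_sum(values):
    if len(values) <= 1:                 # trivial sub-problem: solve directly
        return values[0] if values else 0
    mid = len(values) // 2
    left = recursive_sum(values[:mid])   # solve the first half
    right = recursive_sum(values[mid:])  # solve the second half
    return left + right                  # reassemble the two partial results

print(recursive_sum([3, 1, 4, 1, 5, 9, 2, 6]))   # -> 31
```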

Informatics is:

  1. The science that studies computational procedures for solving problems;
  2. The science that studies programming languages for describing algorithms;
  3. The science that studies computer architectures for running programs;
  4. The science that studies automated reasoning.

Informatics is not…

  1. Absurdity;
  2. Assembling and disassembling computers;
  3. Knowing software packages and how to install them;
  4. Surfing the Internet;
  5. Knowing as many programming languages as possible;
  6. Just programming.

Basics of informatics


With the term “informatics” (computer science) we can define the discipline concerned with the design of automatic information-processing systems.

The term “informatics” originated as a contraction of the words “information” and “automatic”, which clearly evokes the meaning above.

The means by which this processing is carried out is usually a computer.

We can also think of a computer as a mathematical function f(x): the function accepts an argument x as input, for example a memory state (which can be represented by a number), and produces a new memory state (another number) after processing.
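A minimal sketch of this view (the transition chosen, doubling the stored value and adding one, is purely illustrative): the whole memory is a single number, and one processing step is just a function from that number to the next.

```python
# The computer seen as a function f(x): the argument x is the current
# memory state (here, a single integer) and the result is the next state.

def f(state):
    # Illustrative transition: the machine "computes" by doubling the
    # value held in memory and adding one.
    return 2 * state + 1

state = 5
for _ in range(3):           # three processing steps
    state = f(state)
print(state)                 # 5 -> 11 -> 23 -> 47
```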

Objectives of the Computer Science Fundamentals course

The Computer Science Fundamentals course aims to provide the student with the cognitive foundations needed for a first course in the informatics sciences; the concepts taught in the course will allow access to higher-level subjects: operating systems, programming languages, computer architecture, and so on.

In particular, the following will be examined: the history of computer science, hints about computer architecture, the binary numbering system, the representation of information through binary coding, the concept of the algorithm, and some practical examples.
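As a brief illustrative sketch of binary coding (the helper function is mine, not part of the course material), any piece of information, here a small integer and a character, can be represented as a sequence of 0s and 1s:

```python
# Representing information in binary: numbers and characters alike
# end up as sequences of 0s and 1s.

def to_binary(n, width=8):
    """Return the unsigned binary representation of n on `width` bits."""
    return format(n, f"0{width}b")

print(to_binary(42))           # -> "00101010"
print(to_binary(ord("A")))     # the character 'A' is stored as 01000001
print(int("00101010", 2))      # decoding: back to 42
```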

It is advisable to follow some introductory programming courses in parallel with this course; to this end, a basic understanding of C or C++ is recommended, and Java is also welcome. The course is aimed above all at technical programs of study, such as engineering, mathematics, physics, and the natural sciences, and more generally at computer science studies.

Informatics in Vocational Training

Here informatics is limited to what the professional requirements provide in terms of procedures and operating methods.

The qualification in the application and use of basic methodologies, tools, and information enables the holder to carry out skilled activities related to the operational support of systems, networks, and data-management solutions, specifically in the areas of: installation, configuration, and use of hardware and software supporting office automation and digital communications; routine and extraordinary maintenance of systems, networks, devices, and user terminals; and the processing, maintenance, and transfer of data managed by digital archives.

Skills recognized in vocational training:

  • Identify and plan the phases of the operations to be carried out, in accordance with safety regulations, on the basis of the instructions received, the supporting documentation (diagrams, drawings, procedures, bills of materials, etc.), and the reporting system;
  • Prepare, monitor, and take care of the routine maintenance of the tools, equipment, and machinery needed in the various stages of processing, on the basis of the type of materials to be used, the indications and procedures envisaged, and the expected result;
  • Work safely and in compliance with hygiene and environmental-protection rules, identifying and preventing situations of danger to oneself, to others, and to the environment;
  • Install, configure, and use computer hardware and software that typically support office automation and digital communications, based on the client's specific needs;
  • Perform routine and extraordinary maintenance of systems, networks, devices, and user terminals, identifying any anomalies and operating problems;
  • Process, maintain, and transfer data managed by digital archives.

This is relatively good, but the study of informatics is based on the English language first, then programming and mathematics. There are many paid online training courses.

Finally

Two special features of today's computer systems are their size and complexity, which probably exceed those of any other man-made system.

For example, the operating systems in use today consist of more than 50 million lines of program code. It is impossible to understand such systems and control their behavior without a rigorous scientific and engineering approach rooted in mathematics and programming.

The loss of the European Ariane 5 rocket on its maiden flight in 1996, due to a programming error, was resounding evidence of this.
