Section "Problems of Teaching Computer Science". Programming Paradigms: Basics of the Structured Programming Paradigm

The general programming paradigms that emerged at the very beginning of the era of computer programming, including the paradigms of applied, theoretical and functional programming, are the most stable.

Applied programming follows a problem orientation, reflecting the computerization of the information and computational processes of numerical processing that were studied long before the advent of computers. It was here that a clear practical payoff emerged quickly. Naturally, in such areas programming differs little from coding; as a rule, the operator (statement-based) style of representing actions suffices. In the practice of applied programming it is customary to trust proven templates and libraries of procedures and to avoid risky experiments. The accuracy and stability of scientific calculations are valued. Fortran, the veteran of applied programming, has gradually begun to yield ground in this area to Pascal and C, and on supercomputers to parallel programming languages such as Sisal.

Theoretical programming adheres to a publication orientation aimed at the comparability of the results of scientific experiments in programming and computer science. It tries to express its formal models and to show their significance and fundamental nature. These models inherited the main features of related mathematical concepts and established themselves as the algorithmic approach in computer science. The striving for provable constructions and for assessing their effectiveness, plausibility, correctness and other formalized properties of program diagrams and texts served as the basis for structured programming and other methods of making program development reliable, such as literate programming. The standard subsets of Algol and Pascal, which served as working material for programming theory, have been replaced by applicative languages more convenient for experimentation, such as ML, Miranda, Scheme and Haskell. They have now been joined by innovations in C and Java.

Functional programming formed as a tribute to the mathematical orientation in artificial intelligence research and development and in the opening of new horizons in computer science. An abstract approach to representing information; a laconic, universal style of constructing functions; clarity about the execution environment for different categories of functions; freedom of recursive constructions; trust in the intuition of the mathematician and researcher; avoidance of the burden of prematurely solving unprincipled memory-allocation problems; rejection of unreasonable restrictions on the scope of definitions: all of this was linked by John McCarthy to the idea of the Lisp language. The thoughtfulness and methodological soundness of the first Lisp implementations made it possible to accumulate experience in solving new problems quickly and to prepare them for applied and theoretical programming. Currently there are hundreds of functional programming languages oriented toward different classes of tasks and types of hardware.

The main programming paradigms have evolved as the complexity of the problems being solved has increased. There has been a stratification of programming tools and methods depending on the depth and generality of the elaboration of the technical details of the organization of computer information processing processes. Different styles of programming have emerged, the most mature of which are machine-oriented, system, logical, transformational, and high-performance parallel programming.

Machine-oriented programming is characterized by a hardware-level approach to organizing the operation of a computer, aimed at accessing any hardware capability. The focus is on hardware configuration, memory state, instructions, control transfers, the sequencing of events, exceptions and surprises, device response times and response success. Assembly language was for a time overshadowed as the notation of choice by Pascal and C, even in microprogramming, but improvements in user interfaces may restore its position.

System programming has long developed under the pressure of service and custom work. The manufacturing approach inherent in such work relies on a preference for reproducible processes and stable programs designed for repeated use. For such programs, a compilation-based processing scheme, static analysis of properties, and automated optimization and checking are justified. This area is dominated by the imperative-procedural style of programming, a direct generalization of the operator style of applied programming. It allows for some standardization and modular programming, but it acquires rather complex structures, specifications, testing methods, program integration tools, and so on. The stringent requirements for efficiency and reliability are met by the development of professional tools that use complex associative semantic heuristics along with methods of syntax-driven design and program generation. The undeniable potential of such tools is limited in practice by the complexity of their development; a qualification requirement arises.

High-performance programming is aimed at achieving the maximum possible performance when solving particularly important problems. The natural performance reserve of a computer is parallel processes. Organizing them requires detailed consideration of timing relations and a non-imperative style of managing actions. Supercomputers supporting high-performance computing required special systems programming techniques. The graph-network approach to representing systems and processes for parallel architectures has been expressed in specialized parallel programming languages and supercompilers, adapted to map the abstract hierarchy of task-level processes onto the specific spatial structure of the processors of real equipment.

Logic programming arose as a simplification of functional programming for mathematicians and linguists solving symbolic processing problems. Particularly attractive is the possibility of using nondeterminism as a conceptual basis, which frees us from premature ordering when programming the processing of formulas. The production style of generating processes with returns is sufficiently natural for a linguistic approach to clarifying formalized knowledge by experts and reduces the starting barrier.

Transformational programming methodologically combined the techniques of program optimization, macrogeneration and partial computation. The central concept in this area is information equivalence. It manifests itself in defining transformations of programs and processes, in searching for criteria for the applicability of transformations, in choosing a strategy for their use. Mixed calculations, deferred actions, lazy programming, delayed processes, etc. are used as methods for increasing the efficiency of information processing under certain additionally identified conditions.

Extensive programming approaches are a natural response to radical improvements in the performance of hardware and computer networks. There is a transition of computing tools from the class of technical tools to the class of household appliances. The ground has emerged for updating approaches to programming, as well as the possibility of rehabilitating old ideas that were poorly developed due to the low technology and performance of computers. It is of interest to form research, evolutionary, cognitive and adaptation approaches to programming that create the prospect of rational development of real information resources and computer potential.

A research approach, with an educational-game style of professional, educational and amateur programming, can give impetus to the inventiveness needed to improve programming technology, which could not cope with crisis phenomena on the previous hardware base. The evolutionary approach, with a mobile style of program refinement, is quite clearly visible in the concept of object-oriented programming, which is gradually developing into subject-oriented programming. Reuse of definitions and inheritance of object properties can lengthen the life cycle of debugged information environments and increase the reliability and ease of their use.

A cognitive approach with an interoperable style of visual-interface development of open systems and the use of new audio-video tools and non-standard devices open up ways to enhance the perception of complex information and simplify its adequate processing.

An adaptive approach, with an ergonomic style of individualized design of personalized information systems, gives computer scientists the opportunity to competently program, organize and support real-time technological processes that are sensitive to the human factor.

The direction in which programming paradigms develop reflects a change in the circle of people interested in the development and application of information systems. Many concepts important for programming practice, such as events, exceptions and errors, potential, hierarchy and orthogonality of constructions, extrapolation and program growth points, and quality measurement, have not reached a sufficient level of abstraction and formalization. This makes it possible to predict the development of programming paradigms and to select educational material for the future of component programming. Whereas traditional means and methods of selecting reusable components were governed by the criterion of modularity, understood as the optimal choice of minimal coupling with maximal functionality, the modern hardware base also permits multi-contact units that perform simple operations. One can become acquainted with all these styles and paradigms even through Wikipedia; programming is currently developing in a very wide range of directions.

RECURSIVE COMPUTING IN VARIOUS PROGRAMMING PARADIGMS.

G.V. Vanykina, A.V. Yakushin

Tula State Pedagogical University named after L. N. Tolstoy

[email protected]

1. Paradigms of modern programming.

Historically, the first technological techniques in programming included decomposing the overall structure of the problem being solved into components that are elementary in the given context. Subsequently, a serious theoretical basis and a number of recommended technological techniques were developed for this approach, which together constitute modern programming. Depending on the choice of the "element base", one can obtain different sets of methodological techniques, rules, relationships and dependencies for solving a given problem, which together form a programming paradigm.

It should be noted that the word "paradigm" came into programming from the influential book "The Structure of Scientific Revolutions" by the historian of science Thomas Kuhn (first published in 1962). Kuhn used the term to describe a set of theories, standards, and methods that together constitute a way of organizing scientific knowledge: in other words, a way of seeing the world. Kuhn's main point is that revolutions in science occur when an old paradigm is revised, rejected, and replaced by a new one.

In a similar sense (as a model or example, and more generally as an organizing approach) the word is used in the lecture "The Paradigms of Programming" by Robert Floyd, winner of the 1978 Turing Award.

The programming paradigm is a very high-level fundamental concept in programming theory and therefore cannot be strictly defined. By a paradigm in programming we mean an internally consistent set of program elements that have common fundamental features, both logical and algorithmic, and the basic concepts associated with these elements. Similarly, a paradigm in programming can be said to be a way of conceptualizing how to carry out calculations and how the work performed by a computer should be structured and organized.

Programming paradigms occupy an important place in software development technology. It is around them that methodological concepts are built and developed. This role stems from the fact that new ideas for creating programs are initially implemented in simple tools, most often programming languages, that support research and experimental verification of the proposed style. After the initial experience is generalized, an understanding of the advantages and disadvantages emerges, which allows moving on to methodologies that support the use of the paradigm in the development of large software systems. If a paradigm proves incapable of serving as the basis of an industrial methodology, it is rejected or applied on a limited scale. We can say that a programming paradigm is realized through programming methodologies, which consist of a set of conventions about the basic language tools and the combinations of them that are acceptable or unacceptable for the given paradigm.

Depending on the decomposition method, the following main programming paradigms can be distinguished, presented in Table 1.

Table 1. Basic programming paradigms

Paradigm | Decomposition method | Example languages
Imperative (synonyms: directive, procedural) | Subroutines, abstract data types | Fortran, C, Pascal, Basic
Declarative (components: logic and functional) | Goals expressed in terms of the predicate calculus; "if-then" rules | Lisp, Scheme, Prolog, ML, Haskell
Object-oriented | Classes and objects | Java, C++, Ruby
Constraint programming | Invariant relations, systems of constraints | CLP(X) languages, SETL, Prolog III
Scripting | Elementary processing or control scenarios | Perl, PHP, Python, ASP

It should be noted that following a certain paradigm significantly influences the technological approach to solving problems and the use of characteristic heuristic structures in programming practice. In general, choosing and using one paradigm for a long time leaves a certain imprint on the thinking of a programmer (especially a beginner), and it can be very difficult to break out of the circle of established cliches when a non-standard problem must be solved. For a user of any level, the choice of a paradigm, once made, essentially determines both his attitude toward computer technologies and the effectiveness of their use.

Currently, professional knowledge of computer technology and a high level of information culture are impossible without a clear understanding of not only the principles of computer operation, but, even to a greater extent, its potential capabilities. This trend cannot but influence the structure of the study of computer science both at a university and, especially, at school, since it is school education that lays the fundamental basis that determines and supports the further development of a future specialist.

The problem of developing the algorithmic culture of schoolchildren as a fundamental component of information culture is approached from the standpoint of various programming paradigms, depending on a number of factors: the computer science curriculum, the available hardware and software, and the personal and professional qualities of the teacher. The "Programming" section at the general education level involves studying the basics of the imperative (Basic, Pascal, C, the school algorithmic language) or object-oriented (Delphi, C++, Java) paradigms. Profile training of high school students in the natural sciences is, as a rule, also implemented within these paradigms. It should be noted, however, that teaching programming in a specific language is not the main task of algorithmic training for schoolchildren. A much higher priority is teaching the algorithmic approach to problem solving, the ability to evaluate the effectiveness of a developed algorithm with respect to the problem being solved, and the choice of technology for its implementation; that is, familiarization with methods of developing algorithmic models whose implementation is possible within any paradigm.

The various programming paradigms developed in parallel with one another. For a long time the imperative approach was dominant; in the 1970s and 1980s the emphasis shifted toward the study of non-classical paradigms, and the 1990s were marked by the rapid development of the object-oriented paradigm and the introduction of its elements into others. The development of programming languages from the point of view of the various paradigms is presented in Fig. 1.

Fig. 1. Development of programming languages and paradigms. Directive: Fortran; Algol, C; Pascal; Modula; Oberon; Basic. Declarative (functional): LISP; ML, Scheme; Haskell. Declarative (logic): Prolog; CProlog. Object-oriented: Smalltalk; VB, C++, Object Pascal; Java, C#; Ruby. Constraint programming: SETL; Prolog III; CLP(X). Scripting: Perl; Python; PHP, ASP.

As a way to study an alternative method in programming, we can propose the technology of using recursively formulated problems in the study of computer science, since recursion, being a fundamental mathematical concept, is implemented very similarly across different programming paradigms. Recursive constructions are, in general, not natural for most programming languages, and implementing them requires a certain level of abstraction from the chosen program-construction methodology. At the same time, attention should be paid to the possible inefficiency and significant complexity of recursive algorithms in some cases; it is therefore advisable to develop recursive algorithms and then assess their complexity.
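As an illustration of such a complexity assessment, consider the classic Fibonacci numbers, sketched here in Python (the language is chosen purely for illustration): the naive recursive definition performs an exponential number of calls, while adding a memoization cache makes the same recursion linear.

```python
# Naive recursion: the number of calls grows exponentially with n
def fib_naive(n):
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

# The same recursive definition with memoization: O(n) distinct calls
def fib_memo(n, cache={}):
    if n < 2:
        return n
    if n not in cache:
        cache[n] = fib_memo(n - 1) + fib_memo(n - 2)
    return cache[n]
```

Both functions compute the same values, but the naive variant already becomes impractical around n = 35, which is exactly the kind of observation a subsequent complexity assessment should capture.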

Let us trace the conceptual capabilities of various paradigms used to implement recursion.

2. Imperative paradigm.

Imperative programming is one of the most natural approaches to writing programs for the widespread von Neumann architecture. A program here consists of assignment statements and clauses that control the sequence of their execution. Directive programming is based on an automaton model of the computer that separates the abstractions of state and behavior; the program is viewed as a process of changing state by executing individual commands. Imperativeness here means telling the computer how to solve the problem. Recursion in directive languages is implemented by creating a special kind of subroutine that is allowed to call itself, together with a special technique for translating recursive programs into machine code. In directive languages there is a choice between iteration and recursion, but it is the use of the latter that allows one to build algorithms that are not only efficient but also easy to read.
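A minimal sketch of this alternative in Python (used here as a stand-in for any directive language): the same factorial function written as a self-calling subroutine and as an equivalent loop over an assignable variable.

```python
def factorial_rec(n):
    # Recursive variant: a subroutine that calls itself
    if n <= 1:
        return 1
    return n * factorial_rec(n - 1)

def factorial_iter(n):
    # Iterative variant: assignment statements plus a loop,
    # the classic imperative style
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result
```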

3. Declarative paradigm.

3.1. Logic programming appeared as a result of research by a group of French scientists led by Alain Colmerauer in the field of natural language analysis. It was subsequently discovered that logic programming is just as effective for other artificial intelligence tasks, for which it is now mainly used, and it also turns out to be convenient for implementing other complex tasks.

Logic programming is based on predicate logic. Predicate logic is a branch of formal logic that developed in the 20th century. In logic programming, the focus is on describing the structure of an application problem, rather than on instructing the computer what it should do. In logic programming, a program represents some theory (described in a fairly limited language) and a statement that needs to be proven. The proof of this statement will consist of the execution of the program.

A program in a logic language works by searching for a proof of the proposed statement in the existing knowledge base, which is a set of facts and rules. Recursion in the languages of this paradigm appears both in the proof-search process itself and in the specification of the search mechanism. Establishing the truth of a statement is recursive in nature, so recursion is implemented here quite naturally.
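The proof-search recursion can be sketched in Python with a hypothetical two-rule knowledge base (the facts and the ancestor relation are invented for illustration; a real Prolog system performs unification and backtracking far more generally):

```python
# Hypothetical knowledge base: facts of the form ("parent", X, Y)
facts = {("parent", "tom", "bob"), ("parent", "bob", "ann")}

def ancestor(x, y):
    # Base rule: a direct parent is an ancestor
    if ("parent", x, y) in facts:
        return True
    # Recursive rule: x is an ancestor of y if x is the parent of
    # some z who is in turn an ancestor of y; the proof search
    # itself is a recursive call
    return any(ancestor(z, y)
               for (rel, a, z) in facts
               if rel == "parent" and a == x)
```

Here ancestor("tom", "ann") succeeds through one recursive step, mirroring how a Prolog interpreter would chain the two clauses of the relation.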

3.2. Functional programming relies on the theory of recursive functions and Church's lambda calculus. The emphasis is on the application of functions to data. A functional program consists of a collection of function definitions, which in turn are calls to other functions and expressions that control the sequence of calls. In functional languages recursion is implemented naturally, since it is the fundamental basis of the language's semantics. In general, a functional program contains no assignment operator; evaluating any function produces no side effects other than the actual computation of its value. Branching of computation is based on the mechanism of evaluating the arguments of a conditional expression, and cyclic computation is implemented through recursion.
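A small Python sketch of this style (Python allows assignment, but none is used here): branching is a conditional expression, and every loop is a recursive call.

```python
# Sum of a list: no assignment, branching via a conditional
# expression, iteration via recursion
def total(xs):
    return 0 if not xs else xs[0] + total(xs[1:])

# A map defined in the same purely recursive manner
def apply_to_all(f, xs):
    return [] if not xs else [f(xs[0])] + apply_to_all(f, xs[1:])
```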

4. Object-oriented paradigm.

Object-oriented programming (OOP) is a natural evolution of earlier programming methodologies. The result of object decomposition is a set of objects, which are then implemented as variables of specially developed types (classes) that combine data fields with the methods that operate on those fields. One can say that OOP is the modeling of objects through hierarchically related classes. Insignificant details of an object are hidden from us, and if we give a command to an object, it "knows" how to carry it out. A fundamental concept in OOP is the notion of responsibility for performing an action.

All objects are instances of classes. The method that an object invokes in response to a message is determined by the class to which the message recipient belongs; all objects of the same class use the same methods in response to the same messages. Classes form a hierarchical tree structure, in which classes with more general features are located at the root and specialized classes, and ultimately individual instances, are located in the branches. Object decomposition itself already contains elements of recursive construction, since the resulting hierarchy of objects has elements of self-similarity. As is well known, object-oriented programming rests on three main concepts: encapsulation (hiding data in a class or method), inheritance, and polymorphism. Encapsulation can be thought of as a protective shell around code and the data that code operates on; the shell defines behavior and protects the code from arbitrary access from outside. Inheritance is the process by which one type acquires the properties of another type. Polymorphism is a concept that allows different implementations of the same method, chosen depending on the type of the object on which the method is called.
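The self-similarity of an object hierarchy can be shown with a hypothetical composite of files and directories in Python (the class names are invented for illustration): a directory contains entries that may themselves be directories, and computing the total size is a polymorphic recursive traversal.

```python
class File:
    def __init__(self, size_bytes):
        self.size_bytes = size_bytes

    def size(self):
        return self.size_bytes

class Directory:
    def __init__(self, *entries):
        # Entries may be File objects or nested Directory objects,
        # so the data structure itself is recursive
        self.entries = entries

    def size(self):
        # Polymorphic recursion: each entry "knows" how to report
        # its own size, whether it is a file or a whole subtree
        return sum(entry.size() for entry in self.entries)
```

The caller never distinguishes files from subdirectories, which is exactly the encapsulation and polymorphism described above.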

The object-oriented paradigm occupies a special position among all existing ones, since most modern languages ​​of various styles are object-oriented, but in general, the object-oriented implementation of the language differs significantly from the standard one.

Recursion in this paradigm is implemented both in the process of object decomposition and directly. Developing object-oriented programs is, in general, considerably harder than developing traditional ones, but building a recursive hierarchy of objects is not only quite feasible for schoolchildren but also instructive.

5. Programming in constraints.

Constraint programming is a fairly new direction in declarative programming. It appeared in the 1980s as a result of the development of symbolic computation systems, artificial intelligence and operations research. The basic idea of constraint programming is to define a set of variables and the constraints they must satisfy; the system then finds values that satisfy them.

Constraint programming is closely related to traditional logic programming. Most constraint programming systems are Prolog interpreters with a built-in mechanism for solving a certain class of constraint satisfaction problems. Programming in such systems is called constraint logic programming (CLP), and most languages or libraries are called CLP(X), where X indicates the class of problems being solved.

For example, CLP(B) means the ability to solve equations over Boolean variables, CLP(Q) over the rational numbers, and CLP(R) over the real numbers. The most popular solvers, for problems over finite sets of integers, are CLP(FD).

The problem statement is a finite set of variables X = {x1, ..., xn}, corresponding finite (enumerable) domains DX = {dx1, ..., dxn}, and a set of constraints C = {c1, ..., cm}. The system of constraints can include equations, inequalities, logical functions, and any other admissible formal constructions relating the variables in X. A solution to the problem is an assignment of values to the variables that satisfies every constraint ci, i = 1, ..., m.

Semantically, constraint programming differs from traditional logic programming primarily in that program execution is viewed not as proving a statement but as finding values for the variables. The internal order in which individual constraints are handled does not matter, and a constraint programming system, as a rule, seeks to optimize the order of proof steps so as to minimize backtracking on failure. Recursion thus takes the form of a backtracking scheme, and recursive constructions are implemented much as in logic programming.
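This backtracking recursion can be sketched as a toy solver in Python (a deliberately simplified stand-in for a real CLP system; the representations of variables and constraints are invented for illustration):

```python
# Toy constraint solver: variables, finite domains, and constraints
# given as predicates that must hold on any (partial) assignment
def solve(variables, domains, constraints, assignment=None):
    assignment = assignment or {}
    if len(assignment) == len(variables):
        return assignment  # every variable assigned, all constraints hold
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        candidate = dict(assignment, **{var: value})
        if all(c(candidate) for c in constraints):
            result = solve(variables, domains, constraints, candidate)
            if result is not None:
                return result
    return None  # dead end: recursion backtracks to the previous variable
```

For variables x and y with domains 0..3 and the constraints x + y = 4 and x < y, the solver first exhausts the failed branches for x = 0 and then backtracks to find x = 1, y = 3.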

6. Scripting paradigm.

Scripting languages have made huge strides in recent years. Ten years ago they were assigned the role of auxiliary tools; now skepticism toward them has been replaced by interest and recognition.

Scripting languages have a fairly long history. The concept of script programming appeared as a natural development of the LISP language. The first scripting languages include the built-in command shells of operating systems. A batch file in an operating system's command language is a control script that performs a given sequence of actions; one can say that the script "glues together" various parts of the operating system and mediates their interaction.

Currently, the popularity of scripting languages is associated with the development of Internet technologies. Scripting languages are used to create dynamic, interactive web pages whose content changes depending on user actions and on the state of other pages and data.

A distinctive feature of scripting languages is that executing a script often produces a program in some external language. A scripting language relies only to a small extent on building the final product from scratch and, to a much greater extent, on using the capabilities of the operating system, the graphical environment, application service engines and similar components, whose interaction is orchestrated by scripts.

The scenario paradigm involves dividing a task into separate parts, each of which is solved by specialized software; the scenario acts as a “dispatcher” responsible for organizing their interaction.

Scripting languages for web development were created mainly in the 1990s and include elements of various programming paradigms, from imperative to object-oriented. Among the most powerful and popular scripting systems are Perl, Python, PHP and ASP. The syntax and semantics of the different scripting languages are quite similar, owing to the significant influence of C and C++ on the programming community. Support for recursion in scripting languages follows the imperative and object-oriented paradigms.
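A typical scripting-style use of recursion in Python: flattening arbitrarily nested lists, the kind of irregular data a script receives when gluing components together (the function is a generic illustration, not tied to any particular web framework).

```python
def flatten(item):
    # Recursively flatten nested lists such as those produced by
    # parsing external data; non-list items pass through unchanged
    if isinstance(item, list):
        result = []
        for sub in item:
            result.extend(flatten(sub))
        return result
    return [item]
```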

Modern methodological and technical literature quite often provides examples of the implementation of recursive programs in various programming paradigms. The great attention paid to recursion is a confirmation of the fact that the recursive methodology allows you to focus on the very logic of solving a problem, and not on the details of its implementation. Thus, we can conclude that recursion is not only one of the alternative methods in programming, but also a certain style of abstract algorithmic thinking.

    And it seemed that no one disputed the need for design and programming in the OOP style. But over time I have still encountered misunderstanding. This will be a purely historical and theoretical article, without even trying to cover the full breadth of the topic. It is a message, so to speak, to the young developer who reads from the top and cannot choose which principles and rules to adhere to, what is primary and what is secondary.

    The title of this topic may now seem very controversial to many (and rather intentionally provocative, but for the sake of the matter :)). But still, we will try to substantiate this here and understand what properties a programming paradigm must have in order to have the right to be called a paradigm.

    The only thing I ask: if you merely skim it, please comment with restraint.

    What does Floyd tell us about paradigms?

    The term "programming paradigm" was introduced by Robert Floyd (R. W. Floyd, "The Paradigms of Programming," Communications of the ACM, 22(8):455-460, 1979; for the Russian translation see: Lectures of the Turing Award Laureates for the First Twenty Years (1966-1985), Moscow: MIR, 1993). In his 1979 lecture he says:

    A familiar example of a programming paradigm is structured programming, which seems to be the dominant paradigm in programming methodology. It is divided into two phases. In the first phase, top-down design, the problem is divided into a small number of simpler sub-problems. This gradual hierarchical decomposition continues until there are identified sub-problems that are simple enough to deal with directly. The second phase of the structured programming paradigm entails working upward from concrete objects and functions to more abstract objects and functions used throughout the modules produced by top-down design. But the structured programming paradigm is not universal. Even its most ardent defenders would admit that it alone is not enough to make all difficult problems easy. Other high-level paradigms of a more specialized type continue to be important. (This is not an exact translation, but an author’s compilation based on R. Floyd’s lecture, but adhering to his words as much as possible. The wording has been changed and arranged only to highlight the main idea of ​​R. Floyd and his clear presentation.)

    He goes on to mention dynamic programming and logic programming, also calling them paradigms. But their peculiarity is that they were developed from a specialized subject area, some successful algorithms were found and corresponding software systems were built. He goes on to say that programming languages ​​must support programming paradigms. And at the same time he points out that the structured programming paradigm is a higher-level paradigm:

    A paradigm at an even higher level of abstraction than the structured programming paradigm is the construction of a hierarchy of languages, where programs in the highest-level language operate on abstract objects and are translated into programs in the language of the next lower level.

    Features of higher level paradigms

    As we see, R. Floyd also distinguished higher-level paradigms from more specialized ones. What features of a paradigm allow us to call it higher-level? Above all, the possibility of applying it to a variety of subject problems. But what makes a paradigm applicable to different domain problems? Certainly not the specifics of any particular subject problem that one approach or another happens to solve. Approaches that merely propose building algorithms in some specialized way are not paradigms in this sense at all; they are just special approaches within the framework of some higher-level paradigm.

    And there are only two high-level paradigms: structured programming and, at an even higher level, object-oriented programming. These two paradigms contradict each other at the high level, but at the low level, the level of constructing algorithms, they coincide. Approaches (low-level paradigms) such as logic, dynamic, and functional programming can well be used within the framework of the structured programming paradigm, and some emerging specializations (aspect-oriented, agent-oriented, event-oriented) are used within the framework of the object-oriented paradigm.

    This does not mean that programmers need to know only one or two high-level paradigms; knowledge of other approaches is useful when solving more specialized, low-level problems. But when you have to design software, you should start with the higher-level paradigms and, if necessary, move on to lower-level ones. And if the question arises of which principles to prefer, the principles of lower-level paradigms should never dominate the principles of higher-level ones. For example, the principles of structured programming should not be observed to the detriment of the principles of object-oriented programming, and the principles of functional or logic programming should not violate the principles of structured programming. The only exception is the performance of algorithms, which is really a problem of code optimization by compilers. But since perfect compilers cannot always be built, and interpreting higher-level paradigms is naturally more complex than interpreting low-level ones, you sometimes have to go against the principles of the high-level paradigms.

    But let's return to our question: what makes paradigms applicable to various subject problems? To answer it, we need a short historical excursion.

    Basics of the Structured Programming Paradigm

    We know that ideas about structured programming arose after E. Dijkstra's 1965 report, in which he argued for abandoning the GOTO operator. It was this operator that turned programs into unstructured ones (spaghetti code), and Dijkstra showed that programs could be written without it, and that as a result they would become structured.

    But theory is one thing and practice another. In this sense, it is interesting to consider what the situation was by 1975. This can be seen clearly in the book by E. Yourdon (). It is important to consider this because now, more than 30 years later, principles that were already well known then are being rediscovered and elevated to a new rank. But in the process the historical context is lost, along with the hierarchy of importance of these principles: what is primary and what is secondary. This amorphousness characterizes the current state of programming very well.

    But what happened then? As Yourdon describes it, everything starts with answering the question: “What does it mean to write a good program?” This is the first criterion for what questions a high-level programming paradigm must answer. If it does not answer that question directly, but instead tells you how to obtain certain interesting characteristics of your program, then you are dealing with a low-level programming paradigm.

    At the dawn of programming, programmers were commonly evaluated by the speed at which they wrote programs. Does that mean they wrote good programs? Did they enjoy special favor and respect from management? If the answer to the last question is yes, then all questions of improving programming are of rather academic interest. But management may also notice that some super-programmers can produce programs very quickly, or write very efficient programs, and yet those programs sometimes remain unstructured and impossible to understand, maintain, or modify. And the latter also takes a lot of time.

    A rather characteristic dispute between programmers is noteworthy:
    * Programmer A: “My program is ten times faster than yours, and it takes up three times less memory!”
    * Programmer B: “Yes, but your program doesn’t work, but mine does!”

    But programs are constantly becoming more complex, and so it is not enough for us that the program simply works. Certain methods are needed to verify the correct operation of both the program and the programmer. Moreover, this is not testing, but a systematic procedure for checking precisely the correctness of the program in the sense of its internal organization. That is, even then, in modern terms, they were talking about code review.

    In addition, even then they talked about the flexibility of a program: the ease of changing, extending, and modifying it. To achieve it, you need to keep answering questions of a certain type. “What happens if we want to extend this table?”, “What happens if one day we want to define a new change program?”, “What if we have to change the format of such-and-such output?”, “What if someone decides to enter data into the program in a different way?”

    They also talked about the importance of interface specifications, i.e. a formalized approach to the specification of inputs, functions and outputs that must be implemented by each module.
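As a rough, hypothetical sketch of such an interface specification in modern terms, a module's function can fix its inputs, function, and outputs in one place, so that callers depend only on this contract and not on the implementation (the names and the interest formula here are invented for illustration):

```python
# Hypothetical interface specification, expressed as a Python
# signature plus docstring: inputs, function, and output are fixed.
def monthly_interest(balance: float, annual_rate: float) -> float:
    """Input:  balance >= 0; annual_rate as a fraction (0.05 = 5%).
    Function: accrue simple interest for one month.
    Output:   the interest amount for that month."""
    return balance * annual_rate / 12
```

A caller can now rely on `monthly_interest(1200.0, 0.12)` returning `12.0` without knowing anything about how the module computes it.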

    In addition, the size and changeability of modules were a central focus. As for changeability, a module was not treated as a whole; individual factors were identified:
    1. The logical structure of the program, i.e., its algorithm. If the entire program depends on some special approach, how many modules will have to be modified when the algorithm changes?
    2. The arguments, or parameters, of a module. That is, a change in its interface specification.
    3. Internal table variables and constants. Many modules depend on common tables; if the structure of such tables changes, we can expect the modules to change as well.
    4. Database structure and format. This dependence is largely similar to the dependence on common variables and tables above, except that in practice it is more convenient to consider the database independent of the program.
    5. The modular control structure of the program. Some people write a module without really thinking about how it will be used. But if the requirements change, how much of the module's logical structure will we have to change?

    These and many other aspects (which we have not considered here) generally formulate the idea of ​​structured programming. Taking care of these aspects is what makes structured programming a high-level paradigm.

    Fundamentals of the object-oriented programming paradigm

    As we can see, all the principles of organizing good programs are covered by structured programming. Could the emergence of one more principle, or a group of previously unknown principles, for writing good programs change the paradigm? No. That would merely expand the methods and ideology of writing structured programs, i.e., the structured programming paradigm.

    But if high-level paradigms are meant to answer the question of how to write a good program, and the emergence of a new technique or the consideration of new factors does not take us beyond the boundaries of structured programming (a program remains structured regardless of the number of techniques and factors), then what would let us go beyond this paradigm? Indeed, as we know from science, paradigms do not change quickly. Scientific revolutions happen only when the prevailing paradigm, in practice, simply cannot explain the observed phenomena from its existing theoretical views. We had a similar situation with the change from the structured to the object-oriented paradigm.

    It is already recognized that the reason for the emergence of the object-oriented paradigm was the need to write ever more complex programs, while the structured programming paradigm has a certain limit beyond which developing a program becomes unbearably difficult. Here, for example, is what H. Schildt writes:

    At each stage of programming development, methods and tools appeared to “harness” the growing complexity of programs. And at each such stage, the new approach absorbed all the best from the previous ones, marking progress in programming. The same can be said about OOP. Before OOP, many projects reached (and sometimes exceeded) a limit beyond which a structured approach to programming would no longer work. Therefore, to overcome the difficulties associated with the increasing complexity of programs, the need for OOP arose. ()

    To understand why object-oriented programming made it possible to write more complex programs and practically eliminated the problem of a complexity limit, let us turn to one of the founders of OOP, Grady Booch (). He begins his explanation of OOP with what complexity means and which systems can be considered complex. That is, he deliberately approaches the question of writing complex programs. He then moves on to the connection between complexity and the human capacity to grasp that complexity:

    There is another main problem: the physical limitations of a person working with complex systems. When we begin to analyze a complex software system, we find many components that interact with each other in various ways, and neither the parts of the system themselves nor the ways in which they interact reveal any similarities. This is an example of disorganized complexity. When we begin to organize a system during its design, there are many things to think about at once. Unfortunately, one person cannot keep track of all of this at the same time. Experiments by psychologists such as Miller show that the maximum number of structural units of information the human brain can keep track of simultaneously is approximately seven, plus or minus two. Thus we are faced with a serious dilemma: “The complexity of software systems is increasing, but our brain's ability to cope with this complexity is limited. How can we get out of this predicament?”

    Then he talks about decomposition:

    Decomposition: algorithmic or object-oriented? Which decomposition of a complex system is more correct - by algorithms or by objects? There is a catch to this question, and the correct answer to it is that both aspects are important. The algorithmic division focuses attention on the order of events, while the object division emphasizes agents, who are either objects or subjects of action. However, we cannot design a complex system in two ways at the same time. We must begin to partition the system either by algorithm or by object, and then, using the resulting structure, try to look at the problem from a different point of view. Experience shows that it is more useful to start with object decomposition. This start will help us do a better job of bringing organization to the complexity of software systems.

    Thus, he also favors object-oriented principles over structural principles, but emphasizes the importance of both. In other words, structural principles must obey object-oriented principles in order for the human brain to cope with the complexity of the problems encountered. He further emphasizes the importance of the model:

    The importance of building a model. Modeling is widespread across all engineering disciplines, in large part because it implements the principles of decomposition, abstraction, and hierarchy. Each model describes a certain part of the system under consideration, and we, in turn, build new models based on old ones, in which we are more or less confident. Models allow us to control our failures. We evaluate the behavior of each model in normal and unusual situations, and then make appropriate adjustments if we are not satisfied with something. It is most useful to create models that focus on the objects found in the domain itself, forming what we have called an object-oriented decomposition.

    Now, if you look more closely, it turns out that the object-oriented paradigm is nothing other than modeling in general, whose most important aspect was most clearly expressed by S. Lem:

    Modeling is an imitation of Nature that takes into account only a few of its properties. Why only a few? Because of our inability? No. First of all, because we must protect ourselves from an excess of information. Such an excess, however, may also mean its inaccessibility. The artist paints pictures, but although we could talk to him, we will not learn how he creates his works. He himself does not know what happens in his brain when he paints a picture. The information about it is in his head, but it is not available to us. When modeling, we must simplify: a machine that can paint a very modest picture would tell us more about the material, that is, cerebral, foundations of painting than such a perfect “model” of the artist as his twin brother. The practice of modeling involves taking some variables into account and discarding others. The model and the original would be identical if the processes occurring in them coincided. This does not happen. The results of the model's development differ from the actual development. Three factors can influence this difference: the simplification of the model compared to the original, properties of the model that are alien to the original, and, finally, the indeterminacy of the original itself. (a fragment of Summa Technologiae, Stanislav Lem, 1967)

    Thus, S. Lem talks about abstraction as the basis of modeling. At the same time, abstraction is the main feature of the object-oriented paradigm. G. Booch writes about this:

    Reasonable classification is undoubtedly a part of any science. Michalski and Stepp state: “An integral task of science is to construct a meaningful classification of observed objects or situations. This classification greatly facilitates the understanding of the main problem and the further development of scientific theory.” Why is classification so difficult? We attribute this to the lack of a “perfect” classification, although, of course, some classifications are better than others. Coombs, Raiffa, and Thrall argue that “there are as many ways of dividing the world into object systems as there are scientists who undertake the task.” Any classification depends on the subject's point of view. Flood and Carson give an example: “The United Kingdom... may be viewed by economists as an economic institution, by sociologists as a society, by environmentalists as a dying corner of nature, by American tourists as a tourist attraction, by Soviet leaders as a military threat, and, finally, by the most romantic of us British, as the green meadows of home.”
    “Search for and select key abstractions.” A key abstraction is a class or object that is included in the vocabulary of the problem domain. The most important value of key abstractions is that they define the boundaries of our problem: they highlight what is included in our system and is therefore important to us, and exclude what is unnecessary. The task of identifying such abstractions is specific to the problem domain. As Goldberg states, “The correct choice of objects depends on the purpose of the application and the level of detail of the information being processed.”

    As we have already noted, identifying key abstractions involves two processes: discovery and invention. We discover abstractions by listening to domain experts: if an expert talks about it, then that abstraction is usually really important. By inventing, we create new classes and objects that are not necessarily part of the domain, but are useful in designing or implementing a system. For example, an ATM user says “account, withdraw, deposit”; these terms are part of the domain vocabulary. The system developer uses them, but adds his own, such as a database, screen manager, list, queue, and so on. These key abstractions are no longer created by the domain, but by design.

    The most powerful way to isolate key abstractions is to reduce the problem to already known classes and objects.
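A minimal, hypothetical sketch of this distinction, using the ATM vocabulary mentioned above (all class names and behavior here are invented for illustration):

```python
# Discovered abstraction: "Account", with "deposit" and "withdraw",
# comes straight from the domain expert's vocabulary.
class Account:
    def __init__(self, balance=0):
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount

# Invented abstraction: "TransactionQueue" is not part of the domain
# vocabulary; the designer adds it to implement the system.
class TransactionQueue:
    def __init__(self):
        self._items = []

    def put(self, op):
        self._items.append(op)

    def run(self):
        while self._items:
            self._items.pop(0)()

acct = Account(100)
queue = TransactionQueue()
queue.put(lambda: acct.deposit(50))
queue.put(lambda: acct.withdraw(30))
queue.run()
print(acct.balance)  # 120
```

A domain expert can read and check the `Account` class; the queue exists only for the design.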

    So the object-oriented paradigm becomes a high-level paradigm and dominates the principles of the structured programming paradigm, because it is engaged in modeling reality: building models of subject areas in the language of the specialists in those areas. If you neglect this in favor of writing a good program that is easy to modify and extend, with clear interfaces and independent modules, you return to the level of the structured programming paradigm. Your program will be good in every respect, but it will not be understandable, because it will not correspond to reality; it will be explained in terms known only to you, and a specialist who knows the subject area will not be able to understand the program without your help. The difficulty will decrease only within a very narrow range, even though you have organized a good program. It is a program, not a model.

    The absence of a model, or only a superficial one, will “explode” your good program from the inside and prevent you from developing and maintaining it in the future. When you introduce classes for which no abstractions exist, when those classes are purely system-level and have nothing to do with the subject area, when they are introduced only to simplify the flow of interaction between other classes, your software grows stale, and if such areas are not refactored, at some point the development of your software will stop and become impossible: you will have reached the limit of structured programming (and you thought that using classes and objects meant this could not threaten you?).

    upd. On reflection, this is a sensitive topic, so I will not comment on it. I presented the facts in the article, and I do not want to sink to the level of a flame war. If this did not help you think, well, no luck this time. It would indeed be constructive if you wrote counter-arguments in a separate article. I do not undertake to destroy mass stereotypes.

    Also, to make it clear: I decided to publish this after the discussion in “Let's program the Rosenblatt perceptron?”, where it became obvious that functional programming works much worse when a bad model has been built in OOP. And the boasts of super speed are a fiction; in fact, the correct model is what matters. For some tasks (comparatively few) functional programming can be successful, but it should not be used everywhere, especially where it provides nothing good. Or, put differently: can you write the piece discussed there ONLY in a functional style, so that it works faster than with OOP events?


    It turns out that the paradigms that once fought their way into the light with sweat and blood, through hordes of adherents of traditional methods, are gradually being forgotten. These paradigms arose at the dawn of programming, and it is still useful for any developer to know why they arose, what advantages they provided, and why they are still used.

    OK. The introduction was a lot of fun, but nobody reads it anyway, so if anyone is interested, read on!

    Imperative programming



    Historically, the vast majority of the computing hardware we program has state and is programmed by instructions, so the first programming languages were mainly purely imperative, i.e., they supported no paradigms other than the imperative one.

    These included machine codes, assembly languages, and early high-level languages ​​like Fortran.

    Key points:

    In this paradigm, computation is described in the form of instructions that change the state of the program step by step.

    In low-level languages (such as assembly language), the state is memory, registers, and flags, and the instructions are those the target processor supports.

    In higher-level languages (such as C), the state is just memory; instructions can be more complex and can allocate and free memory as they operate.

    In very high-level languages (such as Python, if you program in it imperatively), the state is limited to variables, and the commands can be complex operations that would take hundreds of lines in assembly language.
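For instance, a minimal imperative fragment in Python, where every line is an instruction mutating the state held in two variables:

```python
# Imperative style: computation described as step-by-step
# instructions that change the program's state.
total = 0           # state: accumulator
i = 1               # state: counter
while i <= 5:
    total = total + i   # instruction: update the accumulator
    i = i + 1           # instruction: advance the counter
print(total)  # 15
```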

    Basic concepts:

    - Instructions
    - State

    Derived concepts:

    - Assignment
    - Jump
    - Memory
    - Index

    Languages supporting this paradigm:

    As main:
    - Assembly languages
    - Fortran
    - Algol
    - Cobol
    - Pascal
    - C
    - C++
    - Ada
    As an auxiliary:
    - Python
    - Ruby
    - Java
    - C#
    - PHP
    - Haskell (via monads)

    It is worth noting that most modern languages support imperative programming to one degree or another. Even in the pure functional language Haskell you can write imperatively.

    Structured programming



    Structured programming is a programming paradigm (also commonly used as a development methodology), which was the first big step in the development of programming.

    The founders of structured programming were such famous people as E. Dijkstra and N. Wirth.

    The pioneer languages ​​in this paradigm were Fortran, Algol and B, later succeeded by Pascal and C.

    Key points:

    This paradigm introduces new concepts that combine commonly used patterns for writing imperative code.

    In structured programming, we still operate with state and instructions, but the concept of a compound instruction (block), branch and loop instructions is introduced.

    With these simple changes, it's possible to eliminate the goto statement in most cases, simplifying your code.

    Sometimes goto does make the code more readable, which is why it is still widely used, despite all the claims of its opponents.
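The three basic constructs can be sketched in Python, which has no goto at all (the example itself is invented for illustration):

```python
# The structured building blocks: a block (sequence of statements),
# a branch, and a loop are enough to express control flow that
# jump-based code would otherwise scatter across labels.
def classify(numbers):
    evens, odds = [], []        # block: plain sequence of statements
    for n in numbers:           # loop
        if n % 2 == 0:          # branch
            evens.append(n)
        else:
            odds.append(n)
    return evens, odds

print(classify([1, 2, 3, 4]))  # ([2, 4], [1, 3])
```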

    Basic concepts:

    - Block
    - Loop
    - Branching

    Languages ​​supporting this paradigm:

    As main:
    - C
    - Pascal
    - Basic
    As an auxiliary:
    - C#
    - Java
    - Python
    - Ruby
    - JavaScript

    Partially supported:
    - Some macro assemblers (via macros)

    Again, most modern languages ​​support the structural paradigm.

    Procedural programming



    Again, the increasing complexity of software forced programmers to look for other ways to describe calculations.

    Actually, additional concepts were once again introduced that allowed us to take a fresh look at programming.

    This time, that concept was the procedure.

    As a result, a new methodology for writing programs arose, one that is still encouraged today: the original problem is broken down into smaller ones (using procedures), and this continues until the solutions to all the individual procedures turn out to be trivial.

    Key points:

    A procedure is an independent piece of code that can be executed as a single instruction.

    In modern programming, a procedure can have multiple exit points (return in C-like languages), multiple entry points (using yield in Python, or static local variables in C++), have arguments, return a value as the result of its execution, be overloaded by the number or types of its parameters, and much more.
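A small, hypothetical sketch of two of these features in Python:

```python
# Multiple exit points: the procedure returns from two places.
def sign(x):
    if x < 0:
        return -1
    return 1 if x > 0 else 0

# Multiple entry points: a generator suspends at yield, and each
# next() re-enters the procedure exactly where it left off.
def counter():
    n = 0
    while True:
        yield n
        n += 1

c = counter()
print(next(c), next(c), next(c))  # 0 1 2
print(sign(-7), sign(0), sign(3))  # -1 0 1
```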

    Basic concepts:

    - Procedure

    Derived concepts:

    - Call
    - Arguments
    - Return
    - Recursion
    - Overloading

    Languages ​​supporting this paradigm:

    As main:
    - C
    - C++
    - Pascal
    - Object Pascal
    As an auxiliary:
    - C#
    - Java
    - Ruby
    - Python
    - JavaScript

    Partially supported:
    - Early Basic

    It is worth noting that, of all these languages, multiple entry points are supported only in Python.

    Modular programming



    Once again, the increasing complexity of programs forced developers to divide up their code. This time procedures were not enough, and a new concept was introduced: the module.

    Looking ahead, I will say that modules, too, turned out to be unable to contain the complexity of software, which was growing at an incredible speed; later came packages (still modular programming), classes (this is already OOP), and templates (generic programming).

    A program described in the modular programming style is a set of modules. What's inside, classes, imperative code or pure functions, doesn't matter.

    Thanks to modules, serious encapsulation appeared in programming for the first time: it became possible to use any entities inside a module without showing them to the outside world.

    Key points:

    A module is a separate named entity of a program that combines other program units that are similar in functionality.

    For example, the file List.mod, containing the List class and functions for working with it, is a module.

    The Geometry folder containing the Shape, Rectangle and Triangle modules is also a module, although some languages ​​separate the concept of a module and a package (in such languages ​​a package is a set of modules and/or a set of other packages).

    Modules can be imported (connected) in order to use the entities declared in them.
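A minimal, hypothetical sketch of a module in Python (the file name geometry.py and all entities in it are invented for illustration):

```python
# geometry.py -- one named entity grouping related program units.
PI = 3.141592653589793

def _validate(r):
    # Leading underscore: by convention this helper is internal to the
    # module and not part of its interface to the outside world.
    if r < 0:
        raise ValueError("negative radius")

def circle_area(r):
    # Exported entity: clients import the module to use this.
    _validate(r)
    return PI * r * r

# A client would then import the module to use its declared entities:
#   import geometry
#   print(geometry.circle_area(2.0))
```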

    Basic concepts:

    - Module
    - Import

    Derived concepts:

    - Package
    - Encapsulation

    Languages ​​supporting this paradigm:

    As main:
    - Haskell
    - Pascal
    - Python
    As an auxiliary:
    - Java
    - C#
    - ActionScript 3

    Partially supported:
    - C/C++

    Some languages introduce separate abstractions for modules, while others implement modules using header files (in C/C++), namespaces, static classes, and/or dynamic link libraries.

    Instead of a conclusion

    In this article I did not describe the now-popular object-oriented, generic, and functional programming, simply because I have my own rather radical opinion on the matter and did not want to start a flame war. At least not yet. If the topic proves useful to the community, I plan to write several articles outlining the basics of each of these paradigms in detail.

    I also did not write anything about exotic paradigms such as automata-based, applicative, or aspect-, agent-, and component-oriented programming. I did not want to make the article too long; again, if the topic is in demand, I will write about these paradigms, perhaps in more detail and with code examples.