
Computer Programming and Its Rich History


If the history of programming is to be retold, it is safe to begin with Charles Babbage's difference engine of 1822. Even in an era when computers were simple machines, they still needed instructions in order to perform tasks. Such a set of instructions is what we now know as computer programming.


During the era of the difference engine, gears had to be changed by hand in order to perform calculations. That changed when the US Government's ENIAC machine, completed in 1945, replaced physical motion with electrical signals. ENIAC also introduced the idea of a machine that accepts programming.


In 1945, John von Neumann, who was with the Institute for Advanced Study at the time, developed two vital concepts that directly influenced programming languages. The first, the stored-program technique, held that the hardware should be simple and not hand-wired for each program; instead, complex instructions held in memory would control the simple hardware, making reprogramming much quicker.


The second concept, "conditional control transfer," gave birth to subroutines: blocks of code that could be invoked in any order rather than in one fixed sequence. It also introduced logical branching, in which the path a program takes depends on a test, allowing those blocks of code to be reused.
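Both ideas survive in every modern language. As a loose illustration, here is a sketch in C++ (a far later language, used here only to make the concepts concrete, not as the historical mechanism): the function plays the role of a reusable subroutine, and the if statement is a logical branch.

```cpp
#include <iostream>

// A reusable block of code: the modern descendant of the subroutine.
// It can be invoked from anywhere, in any order, with different inputs.
int absoluteValue(int x) {
    // Logical branching: which path runs depends on the data.
    if (x < 0) {
        return -x;
    }
    return x;
}

int main() {
    // The same block is reused rather than rewired for each use.
    std::cout << absoluteValue(-7) << '\n';  // prints 7
    std::cout << absoluteValue(3) << '\n';   // prints 3
    return 0;
}
```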


In 1949, the Short Code language appeared, becoming the first computer language for electronic devices. Its statements had to be translated into 0s and 1s by hand. In 1951, Grace Hopper developed the A-0 compiler, which performed that translation into 0s and 1s for the computer automatically, resulting in faster programming.


FORTRAN (FORmula TRANslating System) was introduced in 1957 as the first major language. Designed at IBM for scientific computing, it included GOTO, DO, and IF statements. However, FORTRAN was not well suited to business computing.


In 1959, COBOL was developed as the language for business. COBOL programs were structured like essays, with four or five major sections building into a well-organized whole, which made them easier to read.


The LISP language, developed in 1958 by John McCarthy for artificial intelligence research, is also known as Cambridge Polish after its distinctive notation. This highly abstract, highly specialized language can store and modify lists, and it is still in use today.


In the same year, the ALGOL language was produced, becoming the mother of Pascal, C, C++, and Java. ALGOL also introduced Backus-Naur Form (BNF), the first formal notation for describing a language's grammar. ALGOL 68, the next version, was more difficult to use, leading to the development of Pascal.


Niklaus Wirth introduced the Pascal language in 1968, chiefly as a teaching tool. It combined features of ALGOL, FORTRAN, and COBOL and improved the pointer data type. However, Pascal's downfall was its lack of variable groups. Its successor, Modula-2, was introduced later, but by then C had already become popular among users.


C, developed by Dennis Ritchie in 1972 and used to write Unix, was comparable to Pascal, with B and BCPL as its precursors. It is still used today in Windows, Linux, and macOS.


Object-Oriented Programming (OOP) was developed through the 1970s and 1980s and evolved into the C++ language in 1983 (see the sketch below). C++ can handle many tasks at once and was for years the chosen language of AP Computer Science courses. In 1987, Perl (Practical Extraction and Report Language) was developed.
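To make the object-oriented idea concrete, here is a minimal sketch in C++ (the Counter class is a hypothetical example, not drawn from any historical program): data and the operations permitted on it are bundled into a single class, so each object manages its own state.

```cpp
#include <iostream>

// Object-oriented programming bundles data (count_) with the
// operations allowed on it (increment, value).
class Counter {
public:
    void increment() { ++count_; }        // the only way to change the state
    int value() const { return count_; }  // read-only access

private:
    int count_ = 0;  // hidden state: outside code cannot touch it directly
};

int main() {
    Counter clicks;
    clicks.increment();
    clicks.increment();
    std::cout << clicks.value() << '\n';  // prints 2
    return 0;
}
```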


Java followed in the mid-1990s, with many goals still to reach, especially the speed of its programs, but hopes for the language's future ran high. Microsoft also developed Visual Basic (VB), which builds its interfaces from widgets and is widely used.


The future holds many more developments for computer programming. From crude beginnings, the languages in use today have undergone numerous refinements, leaving us to wonder which of today's "impossibilities" may become possible in the near future.

