About Programming
What is computer programming?
Computer programming is a way of giving computers instructions about what they should do next. These instructions are known as source code, and computer programmers write code to solve problems or perform a task.
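For example, the following few lines of source code (written in Python purely as an illustration; the names are made up) tell the computer to perform a calculation and display the result:

    # A few instructions (source code) telling the computer what to do next.
    price = 3                    # store a number
    quantity = 4                 # store another number
    total = price * quantity     # multiply the two stored numbers
    print("Total:", total)       # display the result: Total: 12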
The end goal is to create something: that might be a web page, a piece of software, or even just a pretty picture. That’s why computer programming is often described as a mix of art and science; it is technical and analytical, yet creative at the same time.
Programming Languages
The source code of a program is written in one or more programming languages. A programming language is a vocabulary and set of grammatical rules for instructing a computer or computing device. The term programming language usually refers to high-level languages such as BASIC, C, C++, COBOL, Java, FORTRAN, Ada, and Pascal, though many others exist.
Each programming language has a unique set of keywords (words that it understands) and a special syntax (spelling and grammar) for organizing the program instructions.
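As a rough illustration (Python is used here, though any language would serve), you can ask the language for the keywords it understands, and its syntax dictates the punctuation and layout a statement must follow:

    import keyword

    # The reserved keywords this particular language understands.
    print(keyword.kwlist)        # ['False', 'None', 'True', 'and', 'as', ...]

    # Syntax in action: the colon and the indentation are required grammar.
    if 2 + 2 == 4:
        print("This statement is syntactically correct.")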
High-Level Programming Languages
Computer languages exist at different levels of abstraction.
High-level programming languages, while simple compared to human languages, are more complex than the languages the computer actually understands, called machine languages. Each different type of computer CPU (Central Processing Unit) chip has its own unique machine language.
Lying between machine languages and high-level languages are languages called assembly languages. Assembly languages are similar to machine languages, but they are much easier to program in because they allow a programmer to substitute names for numbers. Machine languages consist of numbers only.
Lying above high-level languages are languages called fourth-generation languages (usually abbreviated 4GL). 4GLs are far removed from machine languages and represent the class of computer languages closest to human languages.
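To get a feel for the gap between these levels, the sketch below (Python, used only as an illustration) disassembles one high-level line into the lower-level instructions of Python's own virtual machine. Bytecode is not true machine language, but it shows how a single human-readable statement expands into several much simpler steps:

    import dis

    def area(width, height):
        # One high-level line of source code...
        return width * height

    # ...expands into several lower-level instructions
    # (LOAD_FAST, a multiply instruction, RETURN_VALUE, ...).
    dis.dis(area)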
Converting to Machine Language
Machine language is the lowest-level programming language (except for computers that utilize programmable microcode). Machine languages are the only languages understood by computers.
Regardless of what language is used, the program eventually needs to be converted into machine language so that the computer can understand it. There are two ways to do this:
Compile the program - transform the source code, written in a high-level language, into object code all at once; the resulting program can then be executed.
Interpret the program - use another program (an interpreter) to translate the high-level instructions into an intermediate form and execute them instruction by instruction in real time (a short sketch of both steps follows this list).
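Here, using Python purely for illustration, the built-in compile() function translates source code into an intermediate form all at once, and exec() then runs that form:

    # Source code held as plain text.
    source = 'print("Hello from translated code")'

    # Translate the whole program into an intermediate form (bytecode) at once...
    code_object = compile(source, "<example>", "exec")

    # ...then execute that intermediate form.
    exec(code_object)            # prints: Hello from translated code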
Some Basic Concepts of Any Programming Language
Variables - a variable is simply a way to store a piece of information for later use; it can be retrieved by referring to a name (a “word”) that describes that information. Variables can have different types, such as character strings, dates, or numbers (see the short example after this list).
Control Structures - the statements, such as conditionals and loops, that let the computer make decisions about which code to run next and in what order.
Data Structures - a way of organizing many related values under one name, such as a list or a table, so you don't have to create a separate variable for each piece of data.
Syntax - the set of rules that define the layout or combinations of symbols that are considered to be correctly structured programs in that language.
Tools - software, such as editors and debuggers, that you use while you code to help you write, correct, and complete your program faster.
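The short example below (Python, with made-up names, purely as an illustration) combines several of these concepts: a variable, a data structure, and two control structures:

    # Variable: a named place to store a piece of information for later use.
    greeting = "Hello"

    # Data structure: one name for a whole collection of related values.
    names = ["Ada", "Grace", "Dennis"]

    # Control structures: a loop and a decision that determine what runs next.
    for name in names:
        if name.startswith("A"):
            print(greeting, name + ", your name starts with A!")
        else:
            print(greeting, name)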
First Computer Programmer
Ada Lovelace was an English mathematician and writer chiefly known for her work on Charles Babbage's early mechanical general-purpose computer, the Analytical Engine. Her notes on the engine include what is recognized as the first algorithm intended to be carried out by a machine. Because of this, she is often described as the world's first computer programmer.
Debugging
No overview of programming would be complete without mentioning debugging. The term refers to the discovery and correction of mistakes in computer programs. Mistakes happen because the computer does exactly what you instructed it to do, not necessarily what you meant it to do. If you enjoy puzzles, there's a good chance you will find the process of debugging an interesting challenge.
The term traces back to an actual bug (a moth) found in a relay of a computer in 1947 by Admiral Grace Murray Hopper; it explained why her program was not working.
Debugging a program follows steps that mirror the scientific method: (1) observe, (2) form a hypothesis, (3) make a prediction, and (4) test it.
To test the prediction, you modify the program (for example, by adding instructions or changing its logic) or use a debugging feature of the programming environment, then repeat the cycle until you find the mistake.
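As a tiny, invented illustration of that loop (Python, with a made-up average() function), suppose a program computes the wrong average; observing the output, hypothesizing where the error lies, and adding a print statement to test the prediction quickly exposes the mistake:

    def average(numbers):
        # Observation: average([2, 4, 6]) should be 4.0, but the program prints 6.0.
        # Hypothesis: the total is being divided by the wrong count.
        total = sum(numbers)
        count = len(numbers) - 1          # <-- the bug: should be len(numbers)
        # Prediction: count will print as 2 instead of 3. Test it:
        print("total =", total, "count =", count)
        return total / count

    print(average([2, 4, 6]))             # prints "total = 12 count = 2", then 6.0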
Want to learn programming? Then contact us.