Computer science

Computer science is the scientific and practical approach to computation and its applications. It is the systematic study of the feasibility, structure, expression, and mechanization of the methodical procedures (or algorithms) that underlie the acquisition, representation, processing, storage, communication of, and access to information. An alternate, more succinct definition of computer science is the study of automating algorithmic processes that scale. A computer scientist specializes in the theory of computation and the design of computational systems.[1]

Its fields can be divided into a variety of theoretical and practical disciplines. Some fields, such as computational complexity theory (which explores the fundamental properties of computational problems, including which of them are intractable), are highly abstract, while fields such as computer graphics emphasize real-world visual applications. Still other fields focus on challenges in implementing computation. For example, programming language theory considers various approaches to the description of computation, while the study of computer programming itself investigates various aspects of the use of programming languages and complex systems. Human–computer interaction considers the challenges in making computers and computations useful, usable, and universally accessible to humans.

Contents

1 History
  1.1 Contributions
2 Philosophy
  2.1 Name of the field
3 Areas of computer science
  3.1 Theoretical computer science
    3.1.1 Theory of computation
    3.1.2 Information and coding theory
    3.1.3 Algorithms and data structures
    3.1.4 Programming language theory
    3.1.5 Formal methods
  3.2 Applied computer science
    3.2.1 Artificial intelligence
    3.2.2 Computer architecture and engineering
    3.2.3 Computer performance analysis
    3.2.4 Computer graphics and visualization
    3.2.5 Computer security and cryptography
    3.2.6 Computational science
    3.2.7 Computer networks
    3.2.8 Concurrent, parallel and distributed systems
    3.2.9 Databases
    3.2.10 Software engineering
4 The great insights of computer science
5 Academia
6 Education
7 See also
8 Notes
9 References
10 Further reading
11 External links
History
Main article: History of computer science
 
Charles Babbage is credited with inventing the first mechanical computer.
 
Ada Lovelace is credited with writing the first algorithm intended for processing on a computer.
The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks, such as the abacus, have existed since antiquity, aiding in computations such as multiplication and division. Further, algorithms for performing computations have existed since antiquity, even before the development of sophisticated computing equipment. The ancient Sanskrit treatise Shulba Sutras, or "Rules of the Chord", is a book of algorithms dated to around 800 BC for constructing geometric objects like altars using a peg and chord, an early precursor of the modern field of computational geometry.

Blaise Pascal designed and constructed the first working mechanical calculator, Pascal's calculator, in 1642.[2] In 1673, Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped Reckoner.[3] He may be considered the first computer scientist and information theorist, for, among other reasons, documenting the binary number system. In 1820, Thomas de Colmar launched the mechanical calculator industry[note 1] when he released his simplified arithmometer, which was the first calculating machine strong enough and reliable enough to be used daily in an office environment. Charles Babbage started the design of the first automatic mechanical calculator, his Difference Engine, in 1822, which eventually gave him the idea of the first programmable mechanical calculator, his Analytical Engine.[4] He started developing this machine in 1834, and "in less than two years he had sketched out many of the salient features of the modern computer".[5] "A crucial step was the adoption of a punched card system derived from the Jacquard loom",[5] making it infinitely programmable.[note 2] In 1843, during the translation of a French article on the Analytical Engine, Ada Lovelace wrote, in one of the many notes she included, an algorithm to compute the Bernoulli numbers, which is considered to be the first computer program.[6] Around 1885, Herman Hollerith invented the tabulator, which used punched cards to process statistical information; eventually his company became part of IBM. In 1937, one hundred years after Babbage's impossible dream, Howard Aiken convinced IBM, which was making all kinds of punched card equipment and was also in the calculator business,[7] to develop his giant programmable calculator, the ASCC/Harvard Mark I, based on Babbage's Analytical Engine, which itself used cards and a central computing unit. When the machine was finished, some hailed it as "Babbage's dream come true".[8]
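As a point of reference for what Lovelace's Note G computed, the short sketch below generates the first Bernoulli numbers using a standard recurrence. It is a hedged, modern reconstruction in Python rather than a transcription of her actual table of operations for the Analytical Engine; the function name and the particular recurrence are illustrative choices.

    # Illustrative sketch only: computes Bernoulli numbers B_0..B_n exactly,
    # the same quantities Lovelace's 1843 note targeted, via the recurrence
    #   sum_{k=0}^{m} C(m+1, k) * B_k = 0  for m >= 1  (with B_0 = 1),
    # not via her original sequence of Analytical Engine operations.
    from fractions import Fraction
    from math import comb

    def bernoulli_numbers(n):
        """Return [B_0, B_1, ..., B_n] as exact fractions (convention B_1 = -1/2)."""
        B = [Fraction(1)]                       # B_0 = 1
        for m in range(1, n + 1):
            s = sum(comb(m + 1, k) * B[k] for k in range(m))
            B.append(-s / (m + 1))              # solve the recurrence for B_m
        return B

    # Example: the first few values are 1, -1/2, 1/6, 0, -1/30, ...
    print(bernoulli_numbers(8))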

During the 1940s, as new and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors.[9] As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s.[10][11] The world's first computer science degree program, the Cambridge Diploma in Computer Science, began at the University of Cambridge Computer Laboratory in 1953. The first computer science degree program in the United States was formed at Purdue University in 1962.[12] Since practical computers became available, many applications of computing have become distinct areas of study in their own right.

Although many initially believed it was impossible that computers themselves could actually be a scientific field of study, in the late fifties the idea gradually gained acceptance among the greater academic population.[13][14] The now well-known IBM brand formed part of the computer science revolution during this time. IBM (short for International Business Machines) released the IBM 704[15] and later the IBM 709[16] computers, which were widely used during the exploration period of such devices. "Still, working with the IBM [computer] was frustrating […] if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again".[13] During the late 1950s, the computer science discipline was very much in its developmental stages, and such issues were commonplace.[14]

Time has seen significant improvements in the usability and effectiveness of computing technology.[17] Modern society has seen a significant shift in the users of computer technology, from usage only by experts and professionals, to a near-ubiquitous user base. Initially, computers were quite costly, and some degree of human aid was needed for efficient use—in part from professional computer operators. As computer adoption became more widespread and affordable, less human assistance was needed for common usage.

Contributions
 
The German military used the Enigma machine (shown here) during World War II for communications they wanted kept secret. The large-scale decryption of Enigma traffic at Bletchley Park was an important factor that contributed to Allied victory in WWII.[18]
Despite its short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society. In fact, along with electronics, it is a founding science of the current epoch of human history called the Information Age and a driver of the Information Revolution, seen as the third major leap in human technological progress after the Industrial Revolution (1750–1850 CE) and the Agricultural Revolution (8000–5000 BC).

These contributions include:

The start of the "digital revolution", which includes the current Information Age and the Internet.[19]
A formal definition of computation and computability, and proof that there are computationally unsolvable and intractable problems[20] (a sketch of the classic unsolvability argument follows this list).
The concept of a programming language, a tool for the precise expression of methodological information at various levels of abstraction.[21]
In cryptography, breaking the Enigma code was an important factor contributing to the Allied victory in World War II.[18]
Scientific computing enabled practical evaluation of processes and situations of great complexity, as well as experimentation entirely by software. It also enabled advanced study of the mind, and mapping of the human genome became possible with the Human Genome Project.[19] Distributed computing projects such as Folding@home explore protein folding.
Algorithmic trading has increased the efficiency and liquidity of financial markets by using artificial intelligence, machine learning, and other statistical and numerical techniques on a large scale.[22] High-frequency algorithmic trading can also exacerbate volatility.[23]
Computer graphics and computer-generated imagery have become ubiquitous in modern entertainment, particularly in television, cinema, advertising, animation and video games. Even films that feature no explicit CGI are usually "filmed" now on digital cameras, or edited or post-processed using a digital video editor.[24][25]
Simulation of various processes, including computational fluid dynamics, physical, electrical, and electronic systems and circuits, as well as societies and social situations (notably war games) along with their habitats, among many others. Modern computers enable optimization of such designs as complete aircraft. Notable in electrical and electronic circuit design is SPICE, as well as software for the physical realization of new (or modified) designs. The latter includes essential design software for integrated circuits.[citation needed]
Artificial intelligence is becoming increasingly important as it gets more efficient and complex. There are many applications of AI, some of which can be seen at home, such as robotic vacuum cleaners. It is also present in video games and on the modern battlefield in drones, anti-missile systems, and squad support robots.
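To make the computability contribution listed above more concrete, the following sketch restates the classical diagonal argument behind the unsolvability of the halting problem. The names halts and contrarian are hypothetical illustrations; no correct, always-terminating halts procedure can exist, which is exactly what the argument shows.

    # Hedged sketch of the diagonal argument, not a real decision procedure:
    # `halts` stands for a hypothetical total decider, which cannot exist.

    def halts(program, data):
        """Hypothetically return True iff program(data) eventually halts."""
        raise NotImplementedError("no correct, total halting decider exists")

    def contrarian(program):
        # Do the opposite of whatever the decider predicts about running
        # `program` on its own source.
        if halts(program, program):
            while True:          # predicted to halt -> loop forever
                pass
        return "halted"          # predicted to loop -> halt immediately

    # Asking whether contrarian(contrarian) halts contradicts the decider in
    # either case, so no such `halts` can be implemented.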
Philosophy
Main article: Philosophy of computer science
A number of computer scientists have argued for the distinction of three separate paradigms in computer science. Peter Wegner argued that those paradigms are science, technology, and mathematics.[26] Peter Denning's working group argued that they are theory, abstraction (modeling), and design.[27] Amnon H. Eden described them as the "rationalist paradigm" (which treats computer science as a branch of mathematics, which is prevalent in theoretical computer science, and mainly employs deductive reasoning), the "technocratic paradigm" (which might be found in engineering approaches, most prominently in software engineering), and the "scientific paradigm" (which approaches computer-related artifacts from the empirical perspective of natural sciences, identifiable in some branches of artificial intelligence).[28]

Name of the field
Although first proposed in 1956,[14] the term "computer science" appears in a 1959 article in Communications of the ACM,[29] in which Louis Fein argues for the creation of a Graduate School in Computer Sciences analogous to the creation of Harvard Business School in 1921,[30] justifying the name by arguing that, like management science, the subject is applied and interdisciplinary in nature, while having the characteristics typical of an academic discipline.[29] His efforts, and those of others such as numerical analyst George Forsythe, were rewarded: universities went on to create such programs, starting with Purdue in 1962.[31] Despite its name, a significant amount of computer science does not involve the study of computers themselves. Because of this, several alternative names have been proposed.[32] Certain departments of major universities prefer the term computing science, to emphasize precisely that difference. Danish scientist Peter Naur suggested the term datalogy,[33] to reflect the fact that the scientific discipline revolves around data and data treatment, while not necessarily involving computers. The first scientific institution to use the term was the Department of Datalogy at the University of Copenhagen, founded in 1969, with Peter Naur being the first professor in datalogy. The term is used mainly in the Scandinavian countries. An alternative term, also proposed by Naur, is data science; this is now used for a distinct field of data analysis, including statistics and databases.

Also, in the early days of computing, a number of terms for the practitioners of the field of computing were suggested in the Communications of the ACM: turingineer, turologist, flow-charts-man, applied meta-mathematician, and applied epistemologist.[34] Three months later in the same journal, comptologist was suggested, followed next year by hypologist.[35] The term computics has also been suggested.[36] In Europe, terms derived from contracted translations of the expression "automatic information" (e.g. "informazione automatica" in Italian) or "information and mathematics" are often used, e.g. informatique (French), Informatik (German), informatica (Italian, Dutch), informática (Spanish, Portuguese), informatika (Slavic languages and Hungarian) or pliroforiki (πληροφορική, which means informatics) in Greek. Similar words have also been adopted in the UK (as in the School of Informatics of the University of Edinburgh).[37]

A folkloric quotation, often attributed to Edsger Dijkstra but almost certainly not first formulated by him, states that "computer science is no more about computers than astronomy is about telescopes."[note 3] The design and deployment of computers and computer systems is generally considered the province of disciplines other than computer science. For example, the study of computer hardware is usually considered part of computer engineering, while the study of commercial computer systems and their deployment is often called information technology or information systems. However, there has been much cross-fertilization of ideas between the various computer-related disciplines. Computer science research also often intersects other disciplines, such as philosophy, cognitive science, linguistics, mathematics, physics, biology, statistics, and logic.

Computer science is considered by some to have a much closer relationship with mathematics than many scientific disciplines, with some observers saying that computing is a mathematical science.[10] Early computer science was strongly influenced by the work of mathematicians such as Kurt Gödel and Alan Turing, and there continues to be a useful interchange of ideas between the two fields in areas such as mathematical logic, category theory, domain theory, and algebra.[14]

The relationship between computer science and software engineering is a contentious issue, which is further muddied by disputes over what the term "software engineering" means, and how computer science is defined.[38] David Parnas, taking a cue from the relationship between other engineering and science disciplines, has claimed that the principal focus of computer science is studying the properties of computation in general, while the principal focus of software engineering is the design of specific computations to achieve practical goals, making the two separate but complementary disciplines.[39]

The academic, political, and funding aspects of computer science tend to depend on whether a department is formed with a mathematical emphasis or with an engineering emphasis. Computer science departments with a mathematics emphasis and a numerical orientation consider alignment with computational science. Both types of departments tend to make efforts to bridge the field educationally if not across all research.


