Introduction - Computational Thinking
To frame the big picture for this course and get a first look at some important concepts, we will consider a number of questions. An answer to one question often leads to other questions, and we will see that the ideas are related: answering what one thing is will involve other things we have already considered or will consider later.
What is computational thinking?
"Computational Thinking is a new problem solving method, named for its extensive use of computer science techniques."
"Computational thinking can be used to algorithmically solve complicated problems of scale, and is often used to realize large improvements in efficiency."
"Computational Thinking is a problem-solving process that includes the following characteristics:
- Analyzing and logically organizing data
- Data modeling, data abstractions, and simulations
- Formulating problems such that computers may assist
- Identifying, testing, and implementing possible solutions
- Automating solutions via algorithmic thinking
- Generalizing and applying this process to other problems"
The above is from Wikipedia.
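As a small illustration (not from the source), the characteristics above can be walked through on a toy problem: finding the most frequent word in a text. The problem is formulated so a computer can assist, the text is abstracted as a bag of words, and the solution is automated with an algorithm. The function name and example text are invented for this sketch.

```python
# A tiny illustration of the computational-thinking steps:
# formulate the problem, abstract the data, automate with an algorithm.
from collections import Counter

def most_frequent_word(text):
    # Data abstraction: model the text as a multiset of lowercase words.
    words = text.lower().split()
    # Algorithmic automation: count occurrences and take the most common.
    counts = Counter(words)
    return counts.most_common(1)[0][0]

print(most_frequent_word("the cat saw the dog"))  # prints "the"
```

The same process generalizes to other counting problems (characters, lines, events) by changing only the abstraction step.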
We see a number of items that could use further investigation:
- Computational, computation, compute
- Computer Science
What is computing/computation?
Compute - To determine by mathematics, especially by numerical methods [Free Dictionary].
Computation is any type of calculation or use of computing technology in information processing [Wikipedia].
Computation is a process following a well-defined model that can be understood and expressed as, for example, an algorithm or a protocol.
The study of computation is paramount to the discipline of computer science.
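To make "a process following a well-defined model" concrete, here is a sketch (my example, not from the source) of one classical numerical computation: approximating a square root by Newton's method. Each step is determined precisely by the model, which is what makes it a computation in the sense above.

```python
def newton_sqrt(x, tolerance=1e-10):
    """Approximate the square root of x by Newton's method:
    a well-defined, repeatable numerical process."""
    if x == 0:
        return 0.0
    guess = x
    # Repeat one well-defined step until the answer is close enough.
    while abs(guess * guess - x) > tolerance:
        guess = (guess + x / guess) / 2  # refine the current estimate
    return guess

print(newton_sqrt(2))  # approximately 1.41421356...
```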
What is an algorithm?
algorithm - An algorithm is a procedure for solving a mathematical problem in a finite number of steps that frequently involves the repetition of an operation, or more broadly, a step-by-step method for accomplishing some task.
General algorithm examples:
- A cooking recipe
- Travel instructions
- Assembly instructions
- Shampoo instructions (e.g., lather, rinse, repeat)
Think of some concrete examples.
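A concrete programmed example (mine, not from the source) that fits the definition above exactly is Euclid's algorithm for the greatest common divisor: a finite, step-by-step procedure built on repeating a single operation.

```python
def gcd(a, b):
    # Euclid's algorithm: a procedure that solves a mathematical problem
    # in a finite number of steps by repeating one operation.
    while b != 0:
        a, b = b, a % b  # repeated step: replace (a, b) with (b, a mod b)
    return a

print(gcd(48, 18))  # prints 6
```

Because b strictly decreases on every repetition, the procedure is guaranteed to finish in a finite number of steps.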
What is a computer?
"A computer is a general purpose device that can be programmed to carry out a finite set of arithmetic or logical operations." [Wikipedia]
"Since a sequence of operations can be readily changed, the computer can solve more than one kind of problem." [Wikipedia]
A computer is a machine, usually an electronic device.
The word "computer" is a good description of what a computer does, computers compute.
This leads to the question: "What can be computed?", or "What problems can be solved computationally?"
What is a computer program?
- A computer program is a group of instructions that a computer can carry out.
- Usually the instructions are given for the purpose of accomplishing some task.
- The instructions must be given in a form the computer is able to "understand".
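Here is a minimal sketch (my example) of what those three points mean in practice: a group of instructions, given to accomplish a task, written in a form, here the Python language, that the machine can be made to "understand".

```python
# Task: convert a temperature from Fahrenheit to Celsius and report it.
fahrenheit = 98.6
celsius = (fahrenheit - 32) * 5 / 9   # arithmetic operations
if celsius > 37.5:                    # a logical operation
    print("above normal body temperature")
else:
    print("normal or below")
```

The same instructions, written in English prose instead, would accomplish nothing: the form matters as much as the content.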
What is Computer Science?
The Gibbs and Tucker definition of computer science is:
Computer science - Computer science is the study of algorithms, including their
- Formal and mathematical properties
- Hardware realizations
- Linguistic realizations
Note: the central concept of the definition of computer science is algorithms.
This study includes
- Devising algorithms to solve problems, and determining whether the algorithms are correct and efficient.
- Designing and building hardware to execute the algorithms.
- Creating programming languages in which to specify the algorithms, so the hardware can execute them.
- Writing correct and efficient software using the algorithms to solve problems.
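The first point, that two correct algorithms can differ greatly in efficiency, can be sketched with a simple comparison (my example): searching a list linearly versus by binary search. Both return the right answer; binary search halves the search range at every step, so it needs far fewer comparisons on a sorted list.

```python
def linear_search(items, target):
    # Correct for any list; time grows in proportion to the list's length.
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

def binary_search(items, target):
    # Correct only for sorted lists, but halves the range at each step,
    # so the number of comparisons grows only logarithmically.
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1

data = list(range(0, 100, 2))  # a sorted list of even numbers
print(linear_search(data, 42), binary_search(data, 42))  # both print 21
```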
What is abstraction?
Abstraction - Abstraction is taking a conceptual view of an object or task that ignores the details of construction or implementation.
Note that objects and tasks can be viewed abstractly.
A "more abstract" view is often called a "higher-level view".
The details that are ignored are presumably not important for the purpose of the abstraction.
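A small programmed sketch (my example) of this idea: the caller of Python's built-in sorted() works with a conceptual, higher-level view, "give me the items in order", and ignores the details of how the sorting is actually implemented.

```python
# Abstraction: median() relies on the conceptual view of sorted()
# and ignores the details of the underlying sorting implementation.
def median(values):
    ordered = sorted(values)  # implementation details hidden behind sorted()
    n = len(ordered)
    mid = n // 2
    if n % 2 == 1:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

print(median([7, 1, 5, 3]))  # prints 4.0
```

Those hidden details are exactly the ones that are not important for the purpose of computing a median, which is what makes this an abstraction.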