What is "parallelism"?

Parallel computing is a form of computation in which multiple calculations are performed simultaneously, operating on the principle that large problems can usually be divided into smaller problems, which are then solved concurrently (in parallel).
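
To make this concrete, here is a minimal sketch in Python (the names and chunk count are illustrative assumptions, using the standard multiprocessing module) that divides a large summation into independent chunks and solves them concurrently:

    from multiprocessing import Pool

    def partial_sum(chunk):
        # Each worker solves one smaller subproblem: summing a slice.
        return sum(chunk)

    if __name__ == "__main__":
        data = list(range(1_000_000))
        n_workers = 4
        # Divide the large problem into smaller, independent chunks.
        size = len(data) // n_workers
        chunks = [data[i * size:(i + 1) * size] for i in range(n_workers)]
        with Pool(n_workers) as pool:
            # Solve the chunks concurrently, then combine the partial results.
            total = sum(pool.map(partial_sum, chunks))
        print(total == sum(data))  # True: same answer, computed in parallel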

There are several forms of parallel computing: bit-level, instruction-level, data, and task parallelism. Parallelism has been employed for many years, mainly in high-performance computing, but interest in it has grown recently because of the physical limits that prevent further increases in clock frequency. As the power consumption (and consequently the heat generation) of computers has become a concern, parallel computing has become the dominant paradigm in computer architecture, mainly in the form of multi-core processors.
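
As an illustration of the difference between the data and task forms, the sketch below (my own example, not from the source, using Python's concurrent.futures with hypothetical subtask names) runs two different computations concurrently (task parallelism) and applies one operation across many items (data parallelism):

    from concurrent.futures import ThreadPoolExecutor

    def load_settings():
        # Hypothetical subtask A, independent of subtask B.
        return {"mode": "fast"}

    def build_index():
        # Hypothetical subtask B.
        return ["a", "b", "c"]

    with ThreadPoolExecutor() as pool:
        # Task parallelism: two *different* computations run concurrently.
        settings = pool.submit(load_settings)
        index = pool.submit(build_index)
        print(settings.result(), index.result())

        # Data parallelism: the *same* operation applied across a data set.
        print(list(pool.map(lambda x: x * x, range(8))))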

Parallel programs are more difficult to write than sequential ones, because concurrency introduces several new classes of potential defects, of which race conditions are the most common. Communication and synchronization between the different subtasks are typically among the greatest obstacles to achieving good performance in parallel programs. The maximum possible speed-up of a program as a result of parallelization is given by Amdahl's law.
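
The race condition mentioned above can be sketched in Python (the names here are mine, for illustration): an unsynchronized read-modify-write on a shared counter can lose updates, and adding a lock, a form of synchronization, restores correctness:

    import threading

    counter = 0
    lock = threading.Lock()

    def unsafe_increment(n):
        global counter
        for _ in range(n):
            counter += 1      # read-modify-write is not atomic: a race condition

    def safe_increment(n):
        global counter
        for _ in range(n):
            with lock:        # synchronization: one thread updates at a time
                counter += 1

    def run(worker):
        global counter
        counter = 0
        threads = [threading.Thread(target=worker, args=(100_000,))
                   for _ in range(4)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return counter

    print(run(unsafe_increment))  # may be less than 400000 when updates are lost
    print(run(safe_increment))    # always 400000

(In CPython the global interpreter lock can mask the race on some runs; the defect is still real and surfaces under other interpreters or timings.)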
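Amdahl's law states that speed-up as S(N) = 1 / ((1 - P) + P / N), where P is the fraction of the program that can be parallelized and N is the number of processors. A small worked sketch:

    def amdahl_speedup(p, n):
        # Maximum speed-up for parallel fraction p on n processors.
        return 1.0 / ((1.0 - p) + p / n)

    # A program that is 95% parallel can never exceed 1 / (1 - 0.95) = 20x,
    # no matter how many processors are added.
    print(round(amdahl_speedup(0.95, 8), 2))     # 5.93
    print(round(amdahl_speedup(0.95, 1024), 2))  # 19.64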

Source: Parallel Programming - Wikipedia