Parallelism and Simultaneity

Viewed 918 times

8

Reading a few things on the subject, I realized they are not the same thing, so I would like to know:

  • What’s the difference between parallelism and simultaneity in processing?

3 answers

7


Parallelism and Simultaneity.

Simultaneity is often called concurrency, since the tasks compete with each other for an execution unit: each task can only run for a small slice of time. But some people define concurrency differently.

You only have parallelism when at least two tasks are executing at the same time. Really at the same time.

Note that the computer may only give the impression of simultaneity. In fact, the term simultaneity is used for perception, as the Wikipedia article on the subject describes, even though the term is not specific to computing. The difference is between something being real or merely perceived.

The perception comes from switching between tasks. Each task executes for a short time and then control passes to another task. Because these switches happen so often (every few microseconds), to a human it looks like the tasks are happening at the same time, even though only one of them executes at any instant. They are simultaneous but not parallel. Simultaneity helps decrease latency.
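The switching described above can be sketched with a toy round-robin scheduler over Python generators (a simplified model of time-slicing, not how an OS scheduler actually works — the task names and step counts are made up for illustration):

```python
def task(name, steps):
    # Each yield is a point where the "scheduler" may switch tasks.
    for i in range(steps):
        yield f"{name}:{i}"

def round_robin(tasks):
    """Run tasks one small slice at a time, switching after each step."""
    log = []
    while tasks:
        current = tasks.pop(0)
        try:
            log.append(next(current))  # run one slice of this task
            tasks.append(current)      # not finished: back to the queue
        except StopIteration:
            pass                       # task finished, drop it
    return log

log = round_robin([task("A", 2), task("B", 2)])
print(log)  # ['A:0', 'B:0', 'A:1', 'B:1'] — interleaved, never truly at once
```

Only one task makes progress at any instant, yet both advance "at the same time" from the outside: simultaneity without parallelism.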

The computer can only get parallelism if it has some mechanism that actually runs more than one thing at the same time.

The most common mechanism is having more than one processor, physical or logical. Even when it is logical, there is physical support that clearly separates the two execution units. This is how high throughput is achieved.
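As a rough sketch in Python (the `square` helper and the pool size are just for this example), a process pool can spread independent work across those execution units, sidestepping CPython's single-interpreter limitation by using separate processes:

```python
import multiprocessing as mp
import os

def square(n):
    # CPU-bound work that could run on a separate core.
    return n * n

if __name__ == "__main__":
    # Each logical processor is a separate execution unit the OS can use.
    print("logical processors:", os.cpu_count())

    # A process pool distributes independent tasks across those units.
    with mp.Pool(processes=2) as pool:
        results = pool.map(square, range(5))
    print(results)  # [0, 1, 4, 9, 16]
```

Whether the tasks actually run in parallel depends on how many execution units the machine really has; with a single core, the pool degrades to mere simultaneity.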

The processor does not need to be the main one; some tasks are parallelized on specialized processors.

It is also possible to achieve distributed parallelism using more than one computer. Obviously two computers can perform tasks truly at the same time.

Any application running on a computer with a modern, complete operating system (excluding some niche systems) can run in parallel, or at least concurrently. In fact it is very common to have several applications running at the same time.

Two tasks can be parallel without competing with each other: they can be totally independent. At least there is a definition that says so.

This can be seen in the diagram on the Wikipedia page about OpenMP.

Parallel vs. Concurrent

Pure parallelism is not difficult to achieve. If it solves your problem, it is the best of all worlds. But the pure form can only be used in very specific situations. It is difficult for an entire application to be fully parallel. If the tasks have to communicate, things start to get complicated; and if the communication happens while they are running, we have probably already started to have contention for the same resources.

Simultaneity without parallelism is only useful for perception, or in cases where tasks wait for resources external to the execution unit. Because switching has an administration cost, making something simultaneous but not parallel where there is no waiting only slows everything down. When human perception is involved the benefit still exists, since the user does not have to wait for a sequence of tasks to finish one after another, but the whole job takes longer overall.

Simultaneity is very useful for GUIs, disk access, network access, etc.; if those can also run in parallel, even better. When there is no interaction with users (none at all), it can be much less useful unless there is parallelism.
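A minimal sketch of why waiting on external resources benefits from simultaneity even without parallelism: three simulated waits (the names `gui`, `disk`, `network` and the 0.2 s delay are invented for the example) overlap instead of stacking up:

```python
import threading
import time

def fetch(name, delay, results):
    # time.sleep stands in for waiting on an external resource
    # (network, disk, user input) — the CPU is idle during it.
    time.sleep(delay)
    results[name] = f"{name} done"

results = {}
start = time.perf_counter()
threads = [threading.Thread(target=fetch, args=(n, 0.2, results))
           for n in ("gui", "disk", "network")]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.perf_counter() - start

# The three 0.2 s waits overlap, so the total is ~0.2 s, not 0.6 s.
print(sorted(results), round(elapsed, 1))
```

Even though CPython runs only one thread at a time, all three waits are "in flight" simultaneously, which is exactly the case the paragraph describes.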

Applications that aim to be real-time work best if they are serial, or at least purely parallel. It is still possible to get close to real time if concurrency can be well controlled, but it is not 100% guaranteed.

Concurrency

Some say that concurrency occurs only when the tasks depend on each other. There is concurrency when they compete for a resource that both need to access in some way, beyond the processor itself. It can be memory, disk, etc. In this case we are talking about simultaneous tasks disputing who gets to access what.

Concurrency in this sense may occur between parallel tasks or between merely simultaneous ones.

Concurrency, in this sense, is a computational problem that is difficult to solve in most situations. There are ways to make it easier, but if you do not know what you are doing, the consequences can be catastrophic. And finding out what causes the problem when something goes wrong is extremely difficult: it is almost always a non-reproducible problem without advanced diagnostic tools and a lot of experience. When programming concurrently, the maxim that code has to be right, not just appear to work, applies even more strongly. The execution order is usually non-deterministic.

It is very difficult to maintain shared state in a way that does not incur deadlocks or livelocks while preserving consistency and atomicity.
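The classic example of that shared-state difficulty is a lost update on a counter: `counter += 1` is a read-modify-write, not an atomic operation, so two threads can read the same value and one increment vanishes. A minimal sketch of the standard fix with a lock (the counts here are arbitrary):

```python
import threading

counter = 0
lock = threading.Lock()

def add(n):
    global counter
    for _ in range(n):
        # Without the lock, two threads can both read the old value
        # and one of the two increments is silently lost.
        with lock:
            counter += 1

threads = [threading.Thread(target=add, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 400000 — deterministic only because of the lock
```

Remove the `with lock:` line and the final count may come out short on some runs and correct on others, which is precisely the non-reproducibility the answer warns about.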

Serial

To round things out: if everything runs on a single line of execution, it is called serial. This is very rare today, at least on general-purpose computing devices, though not so rare on highly specialized ones.

Conclusion

There is so much ambiguous, misguided and even plain wrong information on this subject that I hope I am not the one who wrote something wrong. Surely some readers will judge it right or wrong according to what they believe is correct.

I have seen widely accepted definitions that oppose these two lines of thought.

Useful reading.

1

You may have been left in doubt because simultaneity is more often presented in books as concurrency.

Simultaneity can be seen as an operating state in which the computer "apparently" does many things at once. Parallelism is the ability of the computer to actually do two or more things at once.

The main difference between parallelism and concurrency is execution speed. Concurrent programs can run hundreds of separate paths of execution, but they do not get any faster for it; only parallelism yields a real speedup.

In the book Effective Python: 59 Specific Ways to Write Better Python, by Brett Slatkin, there is a discussion of this.

  • 1

A direct answer straight to the point, without 500 lines of explanation like the winding answers from the others. The difference was asked for, not the meaning of the two terms.

0

Multiprocessing

Since their inception, computers have been seen as sequential machines, where the CPU executes the instructions of a program one at a time. In reality, this view is not entirely accurate, because at the hardware level multiple signals are active simultaneously, which can be understood as a form of parallelism.

With the implementation of multi-processor systems, the concept of simultaneity or parallelism can be expanded to a broader level, called multiprocessing, where a task can be divided and executed at the same time by more than one processor.

Processor-level parallelism: the idea is to design computers with more than one processor, which can be organized as an array, as a vector machine, or sharing a bus, with or without shared memory.

Considering that the concepts of parallelism and simultaneity are treated as practically equal, the conclusion is that there is no difference between them, as in the short passage where the author writes "parallelism or simultaneity".
