When one function terminates, make the other end too (multiprocessing, asynchronous functions)

I’m using the multiprocessing library to run two functions at the same time. When one of the two functions finishes, I want the other one, which is still running, to be terminated as well, even if it is in the middle of its work.

Example:

from multiprocessing import Process

def a():
    while True:
        print('a')
        break
        
def b():
    while True:
        print('b')

if __name__ == '__main__':
    pA = Process(target=a)
    pB = Process(target=b)
    pA.start()
    pB.start()
    

So, in the code above, what would you do so that when function a hits its break, function b (which loops forever) is terminated as well?

Thank you in advance.

  • Maybe call quit() at the end of both functions?

  • I will try storing the created processes in a list, passing that list as a parameter, and then calling terminate() on each one. I will test it.

2 answers


In this case I would use a class and use an internal class attribute as a flag; give this a try:

from multiprocessing import Process


class Classe:
    processar = True

    def a(self):
        while True:
            print('a')
            break

        self.processar = False

    def b(self):
        while self.processar is True:
            print('b')


if __name__ == '__main__':
    classe = Classe()
    pA = Process(target=classe.a)
    pB = Process(target=classe.b)

Below is a model of what I personally use:

import threading
import time


def run_thread(job_function):
    job_thread = threading.Thread(target=job_function)
    job_thread.start()


class Classe:
    processar = True

    def processar_a(self):
        for x in range(1, 100):
            print("processar_a: {}".format(str(x)))

        self.processar = False

    def processar_b(self):
        while self.processar is True:
            print("processar_b")


if __name__ == "__main__":
    classe = Classe()

    run_thread(job_function=classe.processar_a)
    run_thread(job_function=classe.processar_b)

    while classe.processar is True:
        time.sleep(3)
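
As the comments below point out, a plain class attribute is not shared between separate processes. For actual multiprocessing, the same stop-flag pattern can be built on multiprocessing.Event, which is shared across processes. This is only a sketch, not code from the answer; the worker names and the stop_event parameter are mine:

```python
from multiprocessing import Event, Process


def finite_worker(stop_event):
    # Do a finite amount of work, then signal the other process to stop.
    for x in range(1, 100):
        print("finite_worker: {}".format(x))
    stop_event.set()  # the flag is visible in the other process


def endless_worker(stop_event):
    # Loop until the finite worker signals completion.
    while not stop_event.is_set():
        print("endless_worker")


if __name__ == "__main__":
    stop_event = Event()  # process-safe shared flag
    pA = Process(target=finite_worker, args=(stop_event,))
    pB = Process(target=endless_worker, args=(stop_event,))
    pA.start()
    pB.start()
    pA.join()
    pB.join()
```

Unlike a class attribute, the Event is backed by an OS-level semaphore, so setting it in one process is seen immediately by the other.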
  • I had settled on multiprocessing’s Event(), but your reply is the best solution I’ve seen so far. Thank you.


    Note that this solution is for multithreading, not multiprocessing. In separate processes, a worker has no way of "seeing" the class attribute of the other worker - they are in separate processes.


    The first listing here should not work: as written, it does nothing at all, since the ".start" method of each "Process" is never called - the processes are prepared, ready to run, and then the program ends. Even if that were fixed, and the example changed to one where the signal actually makes a difference (not a single print in each process), it still would not work, because the object ends up in different processes: each process gets its own copy of the instance, so each one sees its own, separate self.processar.

  • I understand your point of view, but here I never had problems with variables being in separate instances, because the object is created only once, as classe = Classe(). But it may be that the last example by Augusto Vasques is the best multiprocessing solution.


If there is no need for pB to end immediately, it is possible, with the terminate() method, to signal from one process to another that it has finished, by passing the reference of the process to be terminated through the args or kwargs parameters of the Process constructor:

from multiprocessing import Process

# Parameter p is the process that will be terminated when this function ends.
def a(p):
    while True:
        print('a')            
        break
    p.terminate()  # Terminates process p.
        
def b():
    while True:
        print('b')

# Swap the creation order of the processes so that the pB reference is available to pA at the moment pA is created.
pB = Process(target=b)                 
pA = Process(target=a,kwargs={"p":pB})
pB.start()  # pB is started first; otherwise pA could finish before pB even starts.
pA.start()
pA.join()
pB.join()

Unfortunately, reversing the reasoning (testing inside pB whether process pA is still alive with is_alive()) does not work, because only the parent process can check on its children:

# This code will raise an error...
from multiprocessing import Process

def a():
    while True:
        print('a')            
        break
      
def b(p):
    while p.is_alive():
        print('b')

          
pA = Process(target=a)
pB = Process(target=b, kwargs={"p": pA})       
pA.start()
pB.start()
pA.join()
pB.join()

Process Process-2:
Traceback (most recent call last):
  File "/usr/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
    self.run()
  File "/usr/lib/python3.8/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "main.py", line 9, in b
    while p.is_alive():
  File "/usr/lib/python3.8/multiprocessing/process.py", line 160, in is_alive
    assert self._parent_pid == os.getpid(), 'can only test a child process'
AssertionError: can only test a child process

But the check can be done in the main process, where both pA and pB are child processes:

from multiprocessing import Process

def a():
    i = 0
    while i < 100:
        print('a')
        i+=1
        
def b():
    while True:
        print('b')

pA = Process(target=a)
pB = Process(target=b)
pB.start()
pA.start()
while pA.is_alive():
  pass
pB.terminate()
pA.join()
pB.join()
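
A related alternative, not part of this answer, is to mark pB as a daemon process: daemonic children are terminated automatically when the main process exits, so the main process only has to wait for the finite worker. A minimal sketch, reusing the same a/b workers:

```python
from multiprocessing import Process


def a():
    # Finite worker: prints a fixed number of times and returns.
    for i in range(100):
        print('a')


def b():
    # Endless worker: runs until it is terminated from outside.
    while True:
        print('b')


if __name__ == '__main__':
    pA = Process(target=a)
    pB = Process(target=b, daemon=True)  # killed when the main process exits
    pB.start()
    pA.start()
    pA.join()  # wait only for the finite worker
    # When the main process ends here, the daemonic pB is terminated.
```

This avoids both the busy-wait loop and the explicit terminate() call, at the cost of pB being killed abruptly, with no chance to clean up.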
