Program to convert mp3 using multiprocessing module is looping


import subprocess

from multiprocessing import Process, Lock

def ConverteMusica(a,lock):

    input_file_fmt = '{}.mp3'
    output_file_fmt = a

    for x in range(1, 5):
        subprocess.call(['ffmpeg',
                         '-i',
                         input_file_fmt.format(x),
                         '-acodec',
                         'libmp3lame',
                         '-ac',
                         '2',
                         '-ab',
                         '16k',
                         '-ar',
                         '44100',
                         output_file_fmt.format(x)])


lock = Lock()

MeuConversor1 = Process(target=ConverteMusica,args=("Personalidades_Famosas_Elogiam_Cultura_Vedica.mp3",lock))


MeuConversor1.start()
MeuConversor1.join()

I tried to adapt the program I already had by switching from threading to multiprocessing. I don’t know why the program is looping! What’s wrong?

  • Loop? Like, the program restarts every time?

  • @Anderson Carlos Woss: It stays stuck!

  • You want to execute the ffmpeg command in 4 different processes at the same time? Is that it?

  • @Anderson Carlos Woss: I would like to convert several files "at the same time" so that I don't have to wait for one to finish before starting to convert the next! (See the sketch after these comments.)
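For reference, the goal described in the question and in the last comment, several independent conversions running at once, could be sketched with multiprocessing as below. This is a minimal sketch, not the question's original code: it assumes input files named 1.mp3 to 4.mp3, as in the question, and the output names saida_{}.mp3 are purely illustrative.

import subprocess
from multiprocessing import Process

def converte(entrada, saida):
    # One ffmpeg call per file; each call runs in its own process
    subprocess.call(['ffmpeg', '-i', entrada,
                     '-acodec', 'libmp3lame', '-ac', '2',
                     '-ab', '16k', '-ar', '44100', saida])

if __name__ == '__main__':
    # The __main__ guard prevents the module's top-level code from being
    # re-executed when child interpreters import it (relevant on platforms
    # that spawn a fresh interpreter per child, such as Windows).
    processos = [Process(target=converte,
                         args=('{}.mp3'.format(x), 'saida_{}.mp3'.format(x)))
                 for x in range(1, 5)]
    for p in processos:
        p.start()
    for p in processos:
        p.join()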

1 answer



For now I'll owe you the error in your code, as I'll need more time to analyze it, but an alternative for the problem is to use the asyncio module. Since the actual processing of your program happens outside Python (in ffmpeg), you won't have problems with the GIL; basically, you will only be using Python to start the processes.

With asyncio, you could do something similar to:

import asyncio

async def command(*args):
    # Pipe both stdout and stderr so they can be read after the process ends
    process = await asyncio.create_subprocess_exec(
        *args,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE)
    stdout, stderr = await process.communicate()
    return stdout.decode() if process.returncode == 0 else stderr.decode()

As I can't reproduce here the exact ffmpeg command you want, I will demonstrate with code that runs ping on multiple servers. Like this:

URLS = [
  'woss.eng.br',
  'pt.stackoverflow.com',
  'chat.stackexchange.com',
  '127.0.0.1'
]

commands = [command('ping', '-c', '10', url) for url in URLS]

The commands list will hold all the commands I want to execute in parallel, that is, in this case, the ping command to the four given servers. Then, just get the asyncio event loop and wait for the results:

loop = asyncio.get_event_loop()
processes = asyncio.gather(*commands)
result = loop.run_until_complete(processes)
loop.close()

print(result)

The complete code, with a few extra prints to monitor the execution, would be:

import asyncio, time

async def command(*args):
    # Pipe both stdout and stderr so they can be read after the process ends
    process = await asyncio.create_subprocess_exec(
        *args,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE)
    print('Started process', process.pid)
    stdout, stderr = await process.communicate()
    return stdout.decode() if process.returncode == 0 else stderr.decode()

URLS = [
  'woss.eng.br',
  'pt.stackoverflow.com',
  'chat.stackexchange.com',
  '127.0.0.1'
]

START = time.time()

commands = [command('ping', '-c', '10', url) for url in URLS]

loop = asyncio.get_event_loop()
processes = asyncio.gather(*commands)
result = loop.run_until_complete(processes)
loop.close()

END = time.time()

print(result)
print('Time', END - START)

See it working on Repl.it.

Running it, you will see that it finishes in approximately 9 seconds, which is roughly the time of one ping command to a single domain sending 10 packets, even though four commands were executed. This is quite different from running the same commands in sequence, which would take almost 40 seconds in total.
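As a follow-up, here is a hedged sketch (not from the original answer) of how the same pattern could be applied to the conversion from the question; the input names 1.mp3 to 4.mp3 follow the question, and the output names convertido_{}.mp3 are illustrative.

import asyncio

async def command(*args):
    process = await asyncio.create_subprocess_exec(
        *args,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE)
    stdout, stderr = await process.communicate()
    return stdout.decode() if process.returncode == 0 else stderr.decode()

# One ffmpeg conversion per input file, all awaited concurrently
conversions = [
    command('ffmpeg', '-i', '{}.mp3'.format(x),
            '-acodec', 'libmp3lame', '-ac', '2',
            '-ab', '16k', '-ar', '44100',
            'convertido_{}.mp3'.format(x))
    for x in range(1, 5)
]

loop = asyncio.get_event_loop()
results = loop.run_until_complete(asyncio.gather(*conversions))
loop.close()

Note that ffmpeg writes its progress to stderr, so on a successful conversion the returned stdout may well be empty.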
