Run multiple processes on Linux with Python


2

I receive some URLs as a parameter via MQTT. After extracting them, I can run the ffmpeg command to record them with os.system, but that works for only one process at a time, and I need to run N simultaneously.

I come from Java and I can't figure out how to do this in Python...

import paho.mqtt.client as paho
import json
import os


def on_message(client, userdata, message):
    content = str(message.payload.decode("utf-8"))
    conversor(content)


def on_connect(client, userdata, flags, rc):
    client.subscribe("cameras/gravacao")


def on_disconnect(client, userdata, rc):
    connect_to_mqtt()


def connect_to_mqtt():
    client = paho.Client("id")
    client.username_pw_set("", "")
    client.on_connect = on_connect
    client.on_disconnect = on_disconnect
    client.on_message = on_message
    client.connect("localhost", 1883, 60)
    client.loop_forever()


def conversor(content):
    data = json.loads(content)
    for n in range(data.get("videos")):
        os.system("ffmpeg -i " + data.get("remote_urls")[n]['url'] +
                  " -acodec copy -vcodec copy /home/user/Vídeos/output.mp4")


connect_to_mqtt()
  • Does this help you? https://answall.com/q/290167/5878

  • I'll study it. I didn't understand how to pass the URLs (after all, I won't always receive a fixed number of them), but it's a start. Thanks :)

  • The biggest difference from my answer to that other suggested question is that subprocess's call method waits for the called subprocess to finish. Popen without extra parameters starts the process as a parallel OS process and keeps executing the Python code. In the suggested question, the author tries to parallelize things with "multiprocessing" and threads on the Python side, but that really isn't necessary.
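The difference described in the comment above can be sketched as follows; `sleep` stands in for any long-running command such as ffmpeg:

```python
import subprocess

# subprocess.call (or subprocess.run) blocks until the command finishes:
subprocess.call(["sleep", "2"])          # Python waits ~2 seconds here

# subprocess.Popen returns immediately; the OS runs the child in parallel:
proc = subprocess.Popen(["sleep", "2"])  # Python continues right away
print(proc.poll())                       # None -> still running
proc.wait()                              # block only when you choose to
print(proc.poll())                       # 0 -> finished successfully
```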

2 answers

5

os.system gets the job done in an interactive environment, or in very small scripts that replace an equally small shell script.

For better control over processes launched from a Python program, the relevant features are grouped in the subprocess module.

In this case, you don't even need multiple threads in the Python program or any other complication - you simply fire off a subprocess, and it runs in parallel with your program, coordinated by the operating system. Later you can query the process to find out whether it finished or failed - but depending on the degree of sophistication you want, you don't even need that.

In short, to call an external process and keep executing Python code, replace your call to os.system with:

import subprocess

...
    proc = subprocess.Popen(["ffmpeg", "-i", data.get("remote_urls")[n]['url'], "-acodec", "copy", "-vcodec", "copy", "/home/user/Vídeos/output.mp4"])

That is - the biggest difference is that instead of passing the command as a single string, as it would be typed in the terminal, each parameter must be an element of a list, which is passed to Popen. The return value is a Popen object, documented in the link above; it can be used to query whether the process is still running and, with other parameters passed to Popen, to inspect its output on stdout or stderr.
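For instance, a minimal sketch of querying a fired process and capturing its output; `echo` stands in here for the real ffmpeg invocation:

```python
import subprocess

# stand-in for the real ffmpeg command line
proc = subprocess.Popen(
    ["echo", "hello"],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)

# poll() returns None while the process runs, its exit code once it ends
print(proc.poll())

# communicate() waits for the process and collects its output
stdout, stderr = proc.communicate()
print(proc.returncode)          # 0 on success
print(stdout.decode().strip())  # "hello"
```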

If the list of options gets too unwieldy, nothing prevents you from writing it as a string yourself and using the split method to turn it into a list:

executavel = 'ffmpeg'
url = data.get("remote_urls")[n]['url']
parametros = "-acodec copy -vcodec copy /home/user/Vídeos/output.mp4".split()
proc = subprocess.Popen([executavel, "-i", url] + parametros)
  • 1

    if you want to use a string, a hint is to use shlex.split, which follows the shell's rules for splitting the string, handling things like quotes.

  • Thanks for the reply. However, I would need to execute that command many times. In a comment above, a link to another post was shared that served me very well, but I'll keep your answer for life!
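The shlex.split hint from the comment above can be sketched like this; the command line is illustrative:

```python
import shlex

# str.split breaks on every space, ruining quoted arguments:
cmd = 'ffmpeg -i input.mp4 -metadata title="my video" out.mp4'
print(cmd.split())       # ['ffmpeg', ..., 'title="my', 'video"', 'out.mp4']

# shlex.split follows shell quoting rules and keeps the quoted value whole:
print(shlex.split(cmd))  # ['ffmpeg', ..., 'title=my video', 'out.mp4']
```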

1


You’d have to do something like this:

import asyncio


async def command(*args):
    # pipe stderr as well, otherwise it is None and stderr.decode() fails below
    process = await asyncio.create_subprocess_exec(*args,
                                                   stdout=asyncio.subprocess.PIPE,
                                                   stderr=asyncio.subprocess.PIPE)
    cria_arquivo_com_pid(process.pid)  # helper defined elsewhere: records the child's PID
    stdout, stderr = await process.communicate()
    return stdout.decode() if process.returncode == 0 else stderr.decode()


def gravacao():
    comandos_gravacao = []
    for url in URLS:  # URLS and dirs are defined elsewhere
        comandos_gravacao.append(command('ffmpeg', '-i', url, '-acodec', 'copy', '-vcodec', 'copy',
                                         dirs + 'video.mp4'))

    loop = asyncio.get_event_loop()
    processes = asyncio.gather(*comandos_gravacao)
    loop.run_until_complete(processes)
    loop.close()


gravacao()

This really is more advanced Python, though.
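On Python 3.7+ the same pattern can be written with asyncio.run, which manages the event loop for you. A minimal runnable sketch, with `echo` standing in for the real ffmpeg command and the URL list as a placeholder:

```python
import asyncio


async def command(*args):
    # launch one subprocess and wait for it, capturing its output
    process = await asyncio.create_subprocess_exec(
        *args,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )
    stdout, stderr = await process.communicate()
    return stdout.decode() if process.returncode == 0 else stderr.decode()


async def gravacao(urls):
    # gather() runs all the subprocess coroutines concurrently
    return await asyncio.gather(*(command("echo", url) for url in urls))


# "echo" stands in for the real ffmpeg invocation
resultados = asyncio.run(gravacao(["url1", "url2"]))
print(resultados)  # ['url1\n', 'url2\n']
```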
