Communication between two programs in Python

Viewed 1,550 times

Guys, I’m developing two programs that need to communicate with each other. I tried importing one program from the other, but when I run them they behave like a single program, and that’s not what I want: they need to remain separate codes.

I want to be able to use the values of the variables in the other code. One code communicates with the Arduino via serial and receives values from it; the other takes photos with a camera and saves them according to each value read from the serial port.

  • What kind of communication do you need?

    I want to be able to use the values of the variables in the other code. One code communicates with the Arduino via serial and receives values from it; the other takes photos with a camera and saves them according to each value read from the serial port.

  • To whoever voted to "close": I believe the question is now answerable, and the answer can be useful to quite a few people.

  • Thank you, I’m waiting for some help; if you need more information, I’ll provide it.

  • It just wasn’t very clear to me why this needs to be two separate programs. Couldn’t you take the photo the moment you receive the data via serial, in a thread, for example? By the way, it might be worth adding those codes to the question so the answers can be more precise.

  • I tried threads, but since I’m using a Raspberry Pi 3 it overloads the processor, pushing it to 100%. In the program that communicates with the Arduino I also use pygame for a joystick. When I ran the codes separately they worked smoothly: one just taking photos, the other using the joystick and communicating with the Arduino, but without the programs interacting with each other.


1 answer

So - this can range from "easy" to "complicated", especially if you’re going to put it into production and want some guarantees about who calls what.

I will try to list some options and point the way for some of them:

Maybe you don’t need a second process at all:

It depends on how much you want to "complicate" your life: if one process stays in a loop reading values from the serial port, and you just want the camera-controller code to see those values, nothing prevents everything from living in a single program; it is only a matter of refactoring your code.

A good way to do this is to put both your code that reads serial values and, optionally, the camera code in co-routines, using yield.

So, assuming your code is now something like:

import time

def principal():
    # code to configure the serial port
    while True:
        valor = funcao_pra_ler_da_serial()
        # I want the camera code to receive "valor"
        #
        # In a continuous loop it is important to force a pause
        # so you don't use more CPU than necessary:
        time.sleep(0.05)

principal()

You can then rewrite this excerpt to be something like this:

import time

def le_valores():
    # code to configure the serial port
    while True:
        valor = funcao_pra_ler_da_serial()
        yield valor

def principal():
    gerador_de_valores = le_valores()
    while True:
        valor = next(gerador_de_valores)
        if <expressao_usando "valor">:
            funcao_que_tira_foto()
        time.sleep(0.05)

principal()

Multiprocessing - Single Code Base

If one program can import the other, or you can write a program that imports both, you can use Python’s multiprocessing module. You create a function for each of the desired operations, run each one in a different process, and use a multiprocessing.Queue to pass data between the two parts of your code.

This will be the simplest method.
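As a rough sketch of that idea (the function and variable names here are illustrative, not taken from your programs), one process plays the serial reader and the other the camera controller, exchanging values through a `multiprocessing.Queue`:

```python
import multiprocessing
import time

def leitor_serial(fila):
    # Stands in for the process that reads the serial port;
    # here we just simulate three readings.
    for valor in [10, 20, 30]:
        fila.put(valor)
        time.sleep(0.01)
    fila.put(None)  # sentinel: tells the consumer to stop

def controlador_camera(fila, resultados):
    # Stands in for the camera-controller process: it consumes each
    # value and would trigger the camera; here we just transform it.
    while True:
        valor = fila.get()
        if valor is None:
            break
        resultados.put(valor * 2)  # pretend "photo taken for valor"

if __name__ == "__main__":
    fila = multiprocessing.Queue()
    resultados = multiprocessing.Queue()
    p1 = multiprocessing.Process(target=leitor_serial, args=(fila,))
    p2 = multiprocessing.Process(target=controlador_camera, args=(fila, resultados))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
    print([resultados.get(timeout=2) for _ in range(3)])  # → [20, 40, 60]
```

In your case, `leitor_serial` would put the real serial readings on the queue, and `controlador_camera` would call the camera for each one; neither process ever touches the other’s variables directly.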

Celery

Celery is the "standard" way of calling out-of-process functions in production environments today. Not least because, unlike multiprocessing, you can set up configurations where the processes are on different machines, different operating systems, etc.

The downside is that it needs an intermediate system that acts as a "broker" to pass messages from one process to another. This is a service installed separately, such as redis or rabbitmq. To run on a local machine this is very simple, and the default values are enough. For production, you have to dig into the documentation and make your broker bulletproof, with authentication and so on.

It also requires that one process can import the other’s code - although it only needs it to "know" the names of the functions, which are registered as Celery tasks. The first example in the tutorial is, I think, quite clear.

http://docs.celeryproject.org/en/latest/getting-started/first-steps-with-celery.html

xmlrpc or jsonrpc

This is an approach that was once easier to set up. Today it is still possible with the existing examples, but it can take some extra work, both because of fixes to avoid security holes discovered over time, and because the method did not evolve to use Unicode transparently with Python 3.

But basically it’s the simplest way that doesn’t require one process to import the other’s code, and it still allows each process to run a different version of Python (Python 2 and Python 3, or you can mix PyPy, Jython, IronPython, etc.)

As I mentioned earlier, one of your processes will then have to play the role of "server": it exposes methods that can be called from other processes: https://docs.python.org/3/library/xmlrpc.server.html#module-xmlrpc.server
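A minimal sketch of that idea, with both sides in one script for demonstration (the method name and the value are illustrative; in your case the serial-reading program would be the server and the camera program the client):

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

ultimo_valor = 42  # stand-in for the last value read from the serial port

def le_ultimo_valor():
    # Method the "server" process exposes to the other program.
    return ultimo_valor

# Port 0 asks the OS for any free port; a real setup would fix one.
server = SimpleXMLRPCServer(("localhost", 0), logRequests=False)
server.register_function(le_ultimo_valor)
porta = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The other program connects as a client and calls the method remotely:
proxy = ServerProxy(f"http://localhost:{porta}")
print(proxy.le_ultimo_valor())  # → 42
```

Note that the client never reads `ultimo_valor` directly; it always goes through the exposed method call.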

Final considerations:

Note that none of these methods lets you simply "read a variable" from another process in real time, the way you read a variable from an imported module: most of them require you to call a function remotely and use its return value. That means you will probably have to refactor your code anyway. The exception is the multiprocessing method, which can use a Queue object to share the data.

    Thanks man, I’ll test it here. Great answer!
