I am creating a bot that, at a given time, needs to make multiple HTTP requests simultaneously and return the results in a list. The code so far looks like this:
    import requests
    import simplejson
    from multiprocessing import Process

    class Bot:
        def __init__(self):
            self.base_url = 'http://www.example.com'
            self.tracked_items = ['foo', 'bar', 'bin']
            self.headers = {
                'accept': 'application/json, text/plain, */*',
                'cache-control': 'no-cache',
                'content-type': 'application/json;charset=UTF-8',
                'user-agent': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/'
                              '537.36 (KHTML, like Gecko) Chrome/80.0.3987.163 Safari/537.36'
            }

        def make_search(self):
            body = {'key': 'example'}
            resp = requests.post(self.base_url, json=body, headers=self.headers)
            return simplejson.loads(resp.text)

        def make_search_item(self, id, item):
            resp = requests.get(f'{self.base_url}/{id}/{item}', headers=self.headers)
            return simplejson.loads(resp.text)

        def concurrent_search(self, id):
            processes = [Process(target=self.search_item, args=(id, item))
                         for item in self.tracked_items]
            [p.start() for p in processes]
            results = [p.join() for p in processes]
            return {k: v for d in results for k, v in d.items()}

        def search_item(self, id, item):
            result = self.make_search_item(id, item)
            return {item: result}

        def search(self):
            result = self.make_search()
            if result['id']:
                results = self.concurrent_search(result['id'])
                return {'example_bot': results}
My problem is in the method concurrent_search: I can't run all the processes in parallel and get their results back. Process.join() only waits for the process to finish; it always returns None, so results ends up as a list of None values. What do I need to do to return the results of all processes in a list?
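One common way around this is multiprocessing.Pool, whose map method does collect the workers' return values (unlike Process.join()). Below is a minimal sketch of that pattern; fetch_item is a hypothetical stand-in for make_search_item so the example runs without a network:

```python
from multiprocessing import Pool

def fetch_item(args):
    # Hypothetical stand-in for make_search_item: returns {item: result}.
    # Pool workers need a picklable, module-level callable.
    id, item = args
    return {item: f'result-for-{id}-{item}'}

def concurrent_search(id, tracked_items):
    # Pool.map blocks until every worker finishes and gathers the
    # return values in order, which Process.join() cannot do.
    with Pool(processes=len(tracked_items)) as pool:
        results = pool.map(fetch_item, [(id, item) for item in tracked_items])
    # Merge the per-item dicts into one, as concurrent_search does.
    return {k: v for d in results for k, v in d.items()}

if __name__ == '__main__':
    print(concurrent_search(42, ['foo', 'bar', 'bin']))
```

In the real class, the same idea would mean passing self.search_item (or a module-level wrapper around it) to pool.map instead of spawning bare Process objects.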
Have you thought about using the threading module?
– x8ss
Yes, but the intention of using multiprocessing is to extract as much as possible from the multiple cores of the CPU. threading creates multiple threads, but Python does not use multiple CPUs when using that module.
– LeandroLuk
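It may be worth noting that HTTP requests are I/O-bound: the GIL is released while a thread waits on the network, so threads do run these calls concurrently even on one core, and concurrent.futures makes collecting results straightforward. A minimal sketch of that alternative, again using a hypothetical fetch_item in place of the real HTTP call:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_item(id, item):
    # Hypothetical stand-in for the requests.get call; a real request
    # would release the GIL while blocked on network I/O.
    return {item: f'result-for-{id}-{item}'}

def concurrent_search(id, tracked_items):
    # submit() returns a Future; future.result() hands back the
    # worker's return value, so no shared state is needed.
    with ThreadPoolExecutor(max_workers=len(tracked_items)) as executor:
        futures = [executor.submit(fetch_item, id, item)
                   for item in tracked_items]
        results = [f.result() for f in futures]
    return {k: v for d in results for k, v in d.items()}
```

ThreadPoolExecutor also avoids the pickling constraints of multiprocessing, so bound methods such as self.make_search_item can be submitted directly.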