Python Multiprocessing Using Queue To Write To Same File
Solution 1:
I think you should slim your example down to the basics. For example:
from multiprocessing import Process, Queue

def f(q):
    q.put('Hello')
    q.put('Bye')
    q.put(None)

if __name__ == '__main__':
    q = Queue()
    p = Process(target=f, args=(q,))
    p.start()
    with open('file.txt', 'w') as fp:
        while True:
            item = q.get()
            print(item)
            if item is None:
                break
            fp.write(item)
    p.join()
Here there are two processes: the main process and p. p puts strings on a queue, and the main process retrieves them. When the main process finds None (a sentinel I am using to signal "I am done"), it breaks out of the loop.
Extending this to many processes (or threads) is trivial, as sketched below.
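For instance, here is a minimal sketch of that extension (the worker count, message format, and filename are my own choices): several producers share one queue, and the consumer counts one None sentinel per producer instead of stopping at the first:

from multiprocessing import Process, Queue

def f(q, worker_id):
    # each worker puts a few lines on the shared queue
    for i in range(3):
        q.put('worker %d: line %d\n' % (worker_id, i))
    q.put(None)  # one sentinel per worker

if __name__ == '__main__':
    n_workers = 4
    q = Queue()
    workers = [Process(target=f, args=(q, w)) for w in range(n_workers)]
    for p in workers:
        p.start()
    finished = 0
    with open('file.txt', 'w') as fp:
        while finished < n_workers:
            item = q.get()
            if item is None:
                finished += 1  # count sentinels; stop once every worker is done
                continue
            fp.write(item)
    for p in workers:
        p.join()

Only the main process touches the file, so no locking is needed; the queue serializes everything for us.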
Solution 2:
I achieved writing results from multiprocessing to a single file by using the 'map_async' function in Python 3. Here is the function I wrote:
from multiprocessing import Pool

def PPResults(module, alist):  # parallel processing
    npool = Pool(int(nproc))  # nproc: number of worker processes, defined elsewhere
    res = npool.map_async(module, alist)
    results = res.get()  # results are returned as a list
    return results
So I provide this function with a list of parameters in 'alist', and 'module' is a function that does the processing and returns a result. The function above keeps collecting the results into a list and returns it once all the parameters from 'alist' have been processed. The results might not be in the correct order, but as order was not important for me this worked well. The 'results' list can then be iterated over and the individual results written to a file like:
fh_out = open('./TestResults', 'w')
for i in results:  # write results from the list to the file
    fh_out.write(i)
fh_out.close()
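As a quick end-to-end illustration of this pattern (the worker function square is hypothetical, and the nproc parameter with its default and the pool close/join are my own additions), everything fits together like this:

from multiprocessing import Pool

def square(x):
    # hypothetical worker: returns one line of text per input parameter
    return '%d squared is %d\n' % (x, x * x)

def PPResults(module, alist, nproc=4):
    npool = Pool(int(nproc))
    res = npool.map_async(module, alist)
    results = res.get()  # blocks until all parameters are processed
    npool.close()
    npool.join()
    return results

if __name__ == '__main__':
    results = PPResults(square, range(10))
    with open('./TestResults', 'w') as fh_out:
        for i in results:
            fh_out.write(i)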
To keep the order of the results we might need to use queues, similar to what I mentioned in my question above. Though I was able to fix that code, I don't believe it needs to be included here.
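(Worth noting: map_async's get() actually returns results in the same order as the input list, so the reordering concern applies mainly to the plain-queue approach.) One queue-based way to preserve order, sketched below under my own naming, is to tag each item with the index of its input so the consumer can sort before writing:

from multiprocessing import Process, Queue

def worker(q, idx, param):
    # hypothetical worker: tag the result with its input index
    q.put((idx, 'result for %s\n' % param))

if __name__ == '__main__':
    params = ['a', 'b', 'c', 'd']
    q = Queue()
    procs = [Process(target=worker, args=(q, i, p)) for i, p in enumerate(params)]
    for p in procs:
        p.start()
    tagged = [q.get() for _ in params]  # exactly one item per worker
    for p in procs:
        p.join()
    with open('./TestResults', 'w') as fh_out:
        for _, line in sorted(tagged):  # restore input order via the index tag
            fh_out.write(line)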
Thanks
AK