Python Multiprocessing And Database Access With Pyodbc "is Not Safe"?
Solution 1:
Multiprocessing relies on pickling to communicate objects between processes. The pyodbc connection and cursor objects cannot be pickled.
>>> cPickle.dumps(aCursor)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib64/python2.5/copy_reg.py", line 69, in _reduce_ex
    raise TypeError, "can't pickle %s objects" % base.__name__
TypeError: can't pickle Cursor objects
>>> cPickle.dumps(dbHandle)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib64/python2.5/copy_reg.py", line 69, in _reduce_ex
    raise TypeError, "can't pickle %s objects" % base.__name__
TypeError: can't pickle Connection objects
"It puts items in the work_queue", what items? Is it possible the cursor object is getting passed as well?
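The usual fix is to send only plain, picklable data (query parameters, row values) through the queue and open a fresh connection inside each worker process. A minimal sketch of that pattern, using sqlite3 as a stand-in since pyodbc needs a live ODBC data source (with pyodbc you would call pyodbc.connect(conn_string) in the initializer instead):

```python
import multiprocessing
import sqlite3  # stand-in for pyodbc; the per-process pattern is identical

_conn = None  # one connection per worker process, never pickled


def init_worker(conn_string):
    # Runs once in each worker process: the connection is created here,
    # so no Connection/Cursor object ever crosses the process boundary.
    global _conn
    _conn = sqlite3.connect(conn_string)


def run_query(n):
    # Only picklable data (here: an int in, an int out) travels
    # through the pool's internal queues.
    cur = _conn.cursor()
    cur.execute("SELECT ? + ?", (n, n))
    return cur.fetchone()[0]


if __name__ == "__main__":
    with multiprocessing.Pool(2, initializer=init_worker,
                              initargs=(":memory:",)) as pool:
        print(pool.map(run_query, [1, 2, 3]))  # [2, 4, 6]
```

Each worker talks to the database through its own private connection, which also sidesteps the thread-safety restrictions discussed below.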
Solution 2:
The error is raised within the pickle module, so somewhere your DB cursor object gets pickled and unpickled (serialized to storage and deserialized back into a Python object). I guess that pyodbc.Cursor does not support pickling. Why would you try to persist the cursor object anyway? Check whether you use pickle somewhere in your work chain, or whether it is used implicitly (multiprocessing, for instance, pickles everything it sends between processes).
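You can reproduce the failure without an ODBC data source: sqlite3's connection and cursor objects refuse to pickle for the same reason pyodbc's do. A quick check (the exact error wording is Python-version dependent):

```python
import pickle
import sqlite3  # stands in for pyodbc here; neither module's handles pickle

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Any attempt to push these objects through pickle (and therefore
# through a multiprocessing queue) raises TypeError.
for obj in (conn, cur):
    try:
        pickle.dumps(obj)
    except TypeError as exc:
        print(exc)
```

If you see this TypeError from multiprocessing, some connection or cursor object is sitting inside the arguments or results you are sending between processes.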
Solution 3:
pyodbc has Python DB-API threadsafety level 1. This means threads may share the pyodbc module itself, but not connections (and therefore not cursors).
I don't think a thread-safe underlying ODBC driver makes a difference here: the failure happens in the Python layer, as the pickling error shows.