
Dask Memory Error When Running Df.to_csv()

I am trying to index and save large CSVs that cannot be loaded into memory. My code to load the CSV, perform a computation, and index by the new values works without issue. A simplified version of my code follows.

Solution 1:

I encourage you to try smaller chunks of data. You should control this in the `read_csv` step of your computation rather than in the `to_csv` step. With Dask, that means passing a smaller `blocksize` to `dask.dataframe.read_csv` so that each partition fits comfortably in memory; the `to_csv` call then only ever materializes one partition at a time.
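The idea above can be sketched with only the standard library: read a bounded number of rows, process and write them out, then move on, so memory use stays flat regardless of file size. This is the same principle that Dask's `blocksize` argument controls; the function name, chunk size, and the "doubled" column here are illustrative assumptions, not part of the original question.

```python
# Sketch of "control chunking at read time": process a CSV in
# fixed-size row chunks so only one chunk is in memory at once.
# (In Dask itself the equivalent knob is the `blocksize` argument
# of dask.dataframe.read_csv, e.g. blocksize="16MB".)
import csv
import io

def process_in_chunks(reader, writer, chunk_rows=2):
    """Read rows in bounded chunks, add a computed column, and
    write each chunk out before reading the next."""
    header = next(reader)
    writer.writerow(header + ["doubled"])  # new computed column (hypothetical)
    chunk = []
    for row in reader:
        chunk.append(row)
        if len(chunk) >= chunk_rows:       # chunk full: write and release it
            for r in chunk:
                writer.writerow(r + [str(int(r[1]) * 2)])
            chunk = []
    for r in chunk:                        # flush the final partial chunk
        writer.writerow(r + [str(int(r[1]) * 2)])

# Tiny in-memory stand-ins for the large on-disk files.
src = io.StringIO("id,value\n1,10\n2,20\n3,30\n")
dst = io.StringIO()
process_in_chunks(csv.reader(src), csv.writer(dst))
result = dst.getvalue()
```

In the real Dask pipeline the same effect comes from shrinking `blocksize` at `dd.read_csv` time rather than from anything in `to_csv`, since partition size is fixed when the data is read.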

