
Python: When Writing To A Large File, Keep The File Open Or Open It And Append As Needed?

I am wondering how best to handle writing to a big file in Python. My Python code loops many times, running an external program (ancient Fortran with a weird input file format), reading its output, and appending the results to a file. Should I keep that file open for the whole run, or open it and append to it as needed?
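The pattern looks roughly like this, with the file kept open across iterations (the external program's name and arguments below are placeholders):

import subprocess

with open("results.txt", "a") as out:  # opened once for the whole run
    for case in range(1000):
        # "./legacy_fortran" stands in for the real external program
        proc = subprocess.run(
            ["./legacy_fortran", str(case)],
            capture_output=True,
            text=True,
        )
        out.write(proc.stdout)  # append this run's output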

Solution 1:

Opening and closing the file repeatedly definitely has a cost. However, if your legacy program takes a second or more per run, you probably won't notice it.

def func1():
    for x in range(1000):
        x = str(x)
        with open("test1.txt", "a") as k:  # reopen the file on every iteration
            k.write(x)

1 loops, best of 3: 2.47 s per loop

def func2():
    with open("test2.txt", "a") as k:  # open the file once, outside the loop
        for x in range(1000):
            x = str(x)
            k.write(x)

100 loops, best of 3: 6.66 ms per loop
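The timings above look like IPython's %timeit output; a rough standard-library equivalent, assuming func1 and func2 are already defined, would be:

import timeit

# run the function once per loop, take the best of 3 repeats
best = min(timeit.repeat("func1()", globals=globals(), number=1, repeat=3))
print(f"best of 3: {best:.2f} s per loop")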

However, if your file gets really big (800+ MB), reopening it for every append becomes slower:

def func3(path):
    for x in range(10):
        x = str(x)
        with open(path, "a") as k:  # reopening a large file each time is the slow part
            k.write(x)

12 KB file:

10 loops, best of 3: 33.4 ms per loop

800+ MB file:

1 loops, best of 3: 24.5 s per loop

Keeping the file open will mainly cost you memory: written data sits in the write buffer until it is flushed to disk, and the file handle stays allocated for the whole run.
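If that buffering is a concern, you can bound it explicitly; a minimal sketch (the file name is a placeholder):

with open("results.txt", "a", buffering=1) as k:  # line-buffered text mode
    for x in range(1000):
        k.write(f"{x}\n")  # flushed at each newline because of buffering=1
        # alternatively, call k.flush() every N writes instead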

I would suggest using SQLite to store your data instead.
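A minimal sketch of that approach; the database file name and table schema here are made up for illustration:

import sqlite3

conn = sqlite3.connect("results.db")
conn.execute("CREATE TABLE IF NOT EXISTS results (run INTEGER, output TEXT)")

for x in range(1000):
    conn.execute("INSERT INTO results VALUES (?, ?)", (x, str(x)))

conn.commit()  # one commit at the end keeps the inserts fast
conn.close()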
