Your chunk size could be too large; have you tried dropping it to, say, 1024 bytes at a time? (Also, you could use a with statement to tidy up the syntax.)
import requests

def DownloadFile(url):
    local_filename = url.split('/')[-1]
    # stream=True keeps requests from loading the whole body into memory up front
    r = requests.get(url, stream=True)
    with open(local_filename, 'wb') as f:
        for chunk in r.iter_content(chunk_size=1024):
            if chunk:  # filter out keep-alive new chunks
                f.write(chunk)
    return
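For what it's worth, requests also lets you use the response itself as a context manager, which guarantees the connection is released even if a write fails. A minimal sketch along the same lines (the raise_for_status() call is my addition, to surface HTTP errors before anything is written to disk):

import requests

def DownloadFile(url):
    local_filename = url.split('/')[-1]
    # the response context manager releases the connection when the block exits
    with requests.get(url, stream=True) as r:
        r.raise_for_status()  # fail fast on 4xx/5xx responses
        with open(local_filename, 'wb') as f:
            for chunk in r.iter_content(chunk_size=1024):
                if chunk:  # filter out keep-alive new chunks
                    f.write(chunk)
    return local_filename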
Incidentally, how are you deducing that the response has been loaded into memory?
It sounds as if Python isn't flushing the data to the file; based on other SO questions, you could try f.flush() and os.fsync() to force the file write and free memory:
import os

with open(local_filename, 'wb') as f:
    for chunk in r.iter_content(chunk_size=1024):
        if chunk:  # filter out keep-alive new chunks
            f.write(chunk)
            f.flush()
            os.fsync(f.fileno())  # ask the OS to push the buffered bytes to disk
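If you'd rather not manage the chunk loop at all, another option is to hand the raw urllib3 file object straight to shutil and let it do the buffered copy. A sketch, with the caveat that r.raw bypasses requests' content decoding (gzip and the like), so it only suits responses the server isn't compressing (the URL here is hypothetical, just for illustration):

import shutil
import requests

url = 'http://example.com/bigfile.bin'  # hypothetical URL for illustration
local_filename = url.split('/')[-1]

r = requests.get(url, stream=True)
with open(local_filename, 'wb') as f:
    # copyfileobj reads from the raw socket in buffered chunks and writes them out
    shutil.copyfileobj(r.raw, f)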