mirror of https://github.com/python/cpython
bpo-43650: Fix MemoryError on zip.read in shutil._unpack_zipfile for large files (GH-25058)
`shutil.unpack_archive()` read each zip member into memory in a single call, with no smaller intermediate buffer. For really large files (e.g. a ~1.7 GB archive that unpacks to ~10 GB) the process crashes, and on smaller systems it can consume all available RAM before doing so. The fix pulls the streaming code from `zipfile.ZipFile.extractall()`.

Automerge-Triggered-By: GH:gpshead
This commit is contained in:
parent 83f0f8d62f
commit f32c7950e0
Lib/shutil.py

@@ -1163,20 +1163,16 @@ def _unpack_zipfile(filename, extract_dir):
             if name.startswith('/') or '..' in name:
                 continue
 
-            target = os.path.join(extract_dir, *name.split('/'))
-            if not target:
+            targetpath = os.path.join(extract_dir, *name.split('/'))
+            if not targetpath:
                 continue
 
-            _ensure_directory(target)
+            _ensure_directory(targetpath)
             if not name.endswith('/'):
                 # file
-                data = zip.read(info.filename)
-                f = open(target, 'wb')
-                try:
-                    f.write(data)
-                finally:
-                    f.close()
-                del data
+                with zip.open(name, 'r') as source, \
+                        open(targetpath, 'wb') as target:
+                    copyfileobj(source, target)
     finally:
         zip.close()
 
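For context, the replacement streams each member the way `zipfile.ZipFile.extractall()` does: open the member as a file object and copy it in fixed-size chunks, so peak memory is bounded by the copy buffer rather than by the member's uncompressed size. A minimal standalone sketch of the same pattern follows; the function name, archive name, and paths are illustrative only, not part of the patch:

```python
import os
import shutil
import zipfile

def extract_member_streamed(archive_path, member_name, extract_dir):
    """Extract one zip member without reading it fully into memory."""
    targetpath = os.path.join(extract_dir, *member_name.split('/'))
    os.makedirs(os.path.dirname(targetpath), exist_ok=True)
    with zipfile.ZipFile(archive_path) as zf:
        # zf.open() returns a lazily-read file object; copyfileobj()
        # moves the data in fixed-size chunks instead of one giant read().
        with zf.open(member_name, 'r') as source, \
                open(targetpath, 'wb') as target:
            shutil.copyfileobj(source, target)

# Illustrative usage (hypothetical archive and member names):
# extract_member_streamed('large.zip', 'data/big.bin', 'out')
```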
@@ -0,0 +1,2 @@
+Fix :exc:`MemoryError` in :func:`shutil.unpack_archive` which fails inside
+:func:`shutil._unpack_zipfile` on large files. Patch by Igor Bolshakov.