There was a subtle bug in save_newobj(): it used self.save_global(t)
on the type instead of self.save(t).  This defeated the purpose of
NEWOBJ, because it didn't generate a BINGET opcode when t was already
memoized; worse, it generated multiple BINPUT opcodes for the same
type!  pickletools.dis() doesn't like this.
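
Not part of the commit, but an easy way to see what the fix buys, using
today's pickle and pickletools; the class C below is just a throwaway
example:

    import pickle, pickletools

    class C:
        pass

    # Two instances of the same class at protocol 2: with the fix, C is
    # pickled (GLOBAL + BINPUT) only once, and the second NEWOBJ refers
    # back to it with BINGET instead of another GLOBAL/BINPUT pair.
    data = pickle.dumps([C(), C()], protocol=2)
    pickletools.dis(data)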

How did I find this?  I was playing with picklesize.py in the datetime
sandbox, and noticed that protocol 2 pickles for multiple objects were
in fact larger than protocol 1 pickles!  That was suspicious, so I
decided to disassemble one of the pickles.

This really needs a unit test, but I'm exhausted.  I'll be late for
work as it is. :-(
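
A sketch of what such a test could look like (written against the current
unittest and pickletools APIs, not the test that was actually added later);
it leans on pickletools.dis() refusing a pickle that defines the same memo
key twice:

    import io
    import pickle
    import pickletools
    import unittest

    class Simple:
        pass

    class NewobjMemoTest(unittest.TestCase):
        def test_class_memoized_only_once(self):
            # With the old save_global(t) call the class was re-memoized,
            # a later BINPUT reused its memo index, and pickletools.dis()
            # raised ValueError; with the fix the disassembly runs cleanly.
            data = pickle.dumps([Simple(), Simple()], protocol=2)
            pickletools.dis(data, out=io.StringIO())

    if __name__ == "__main__":
        unittest.main()
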
Guido van Rossum 2003-01-30 06:37:41 +00:00
parent 4fba220f4a
commit 9b40e804c7
1 changed file with 2 additions and 1 deletion


@@ -233,6 +233,7 @@ class Pickler:
         # growable) array, indexed by memo key.
         if self.fast:
             return
+        assert id(obj) not in self.memo
         memo_len = len(self.memo)
         self.write(self.put(memo_len))
         self.memo[id(obj)] = memo_len, obj
@@ -386,7 +387,7 @@ class Pickler:
         save = self.save
         write = self.write
 
-        self.save_global(t)
+        self.save(t)
         save(args)
         write(NEWOBJ)
         self.memoize(obj)
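
Aside (not part of the diff): a toy illustration, with made-up names, of the
memo discipline the new assert enforces (memoize an object at most once;
refer back to it with gets afterwards):

    class TinyMemo:
        """Stand-in for the Pickler memo, for illustration only."""

        def __init__(self):
            self.memo = {}   # id(obj) -> memo index
            self.ops = []    # recorded pseudo-opcodes

        def memoize(self, obj):
            assert id(obj) not in self.memo   # mirrors the added assert
            idx = len(self.memo)
            self.memo[id(obj)] = idx
            self.ops.append(("PUT", idx))

        def reference(self, obj):
            # What save() does when it finds obj in the memo: emit a get.
            self.ops.append(("GET", self.memo[id(obj)]))

    m = TinyMemo()
    m.memoize(int)      # first time a class is pickled: PUT
    m.reference(int)    # every later use: GET, never a second PUT
    print(m.ops)        # [('PUT', 0), ('GET', 0)]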