Bug #17951

Collisions in Proc#hash values for blocks defined at the same line

Added by decuplet (Nikita Shilnikov) over 3 years ago. Updated over 3 years ago.

Status: Closed
Assignee: -
Target version: -
ruby -v: ruby 3.0.1p64 (2021-04-05 revision 0fb782ee38) [x86_64-darwin20]
[ruby-core:104248]

Description

require 'set'

def capture(&block)
  block
end

# creates 1,000 blocks, all defined at the same source line
blocks = Array.new(1000) { capture { :foo } }

hashes = blocks.map(&:hash).uniq
ids = blocks.map(&:object_id).uniq
equality = blocks.map { blocks[0].eql?(_1) }.tally
hash = blocks.to_h { [_1, nil] }
set = blocks.to_set

puts(hashes.size)      # => 11
puts(ids.size)         # => 1000
puts(equality.inspect) # => {true=>1, false=>999}
puts(hash.size)        # => 1000
puts(set.size)         # => 1000

The script builds one thousand blocks and then compares them in various ways. I would expect proc objects to be completely opaque and thus treated as separate objects, i.e. not equal to one another. All checks but the first confirm this expectation. However, Proc#hash doesn't return 1000 different results; the number of distinct hash values varies between 3 and 20 on my machine.
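For illustration (my addition, reusing the blocks array from the script above), grouping the procs by their hash value shows how heavily they collide:

collision_counts = blocks.group_by(&:hash).transform_values(&:size)
puts collision_counts.values.max # size of the largest group of procs sharing a single hash value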

As I understand it, the current behavior doesn't violate Ruby's guarantees, but I would expect Proc#hash results to be as unique as Proc#object_id, or at least far more unique than they currently are.
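As a workaround sketch (my addition, not part of the report): the collisions hurt bucket distribution rather than correctness, and when procs are used as Hash keys, Hash#compare_by_identity sidesteps Proc#hash entirely by comparing keys by object identity:

by_identity = {}.compare_by_identity
blocks.each { |block| by_identity[block] = nil }
puts by_identity.size # => 1000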

The problem likely occurs only for blocks defined on the same line.
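A quick check of that claim (my addition, using the capture helper from the script): two blocks with identical source text but defined on different lines typically hash differently:

a = capture { :foo }
b = capture { :foo } # same body, different line
puts(a.hash == b.hash) # typically false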

Ref. to a similar/related issue: https://bugs.ruby-lang.org/issues/6048
