Is the Java Hashtable#hashCode() implementation broken?
I wonder whether the default implementation of Java's Hashtable#hashCode() is broken when the Hashtable only contains entries whose key and value are equal.
See the following application:
import java.util.Hashtable;

public class HashtableHash {
    public static void main(final String[] args) {
        final Hashtable<String, String> ht = new Hashtable<String, String>();

        final int h1 = ht.hashCode();
        System.out.println(h1); // output is 0

        ht.put("Test", "Test");

        final int h2 = ht.hashCode();
        System.out.println(h2); // output is 0 ?!?
        // Hashtable#hashCode() uses this algorithm to calculate the hash code
        // of every entry:
        //
        //     h += e.key.hashCode() ^ e.value.hashCode()
        //
        // The result of XOR on identical hash codes is always 0
        // (because all bits are equal)

        ht.put("Test2", "Hello world");

        final int h3 = ht.hashCode();
        System.out.println(h3); // output is some hash code
    }
}
The hash code of the empty Hashtable is 0. After the key "Test" with the value "Test" has been added to the Hashtable, the hash code is still 0.
The problem is that Hashtable#hashCode() computes the hash code of each entry and adds it to the total like this:
h += e.key.hashCode() ^ e.value.hashCode()
However, the XOR of two identical hash codes (as for two equal strings) is always 0. Entries whose key and value are equal therefore contribute nothing to the Hashtable's hash code.
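To see the cancellation in isolation, here is a minimal sketch (the class name HashCodeXorDemo is just for illustration):

import static java.lang.System.out;

public class HashCodeXorDemo {
    public static void main(final String[] args) {
        // XOR of two identical hash codes is always 0, so an entry whose
        // key equals its value contributes nothing to the sum.
        final int k = "Test".hashCode();
        final int v = "Test".hashCode();
        out.println(k ^ v); // prints 0
    }
}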
This implementation is questionable, because the Hashtable has actually changed; it should not matter whether an entry's key and value happen to be equal.
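To illustrate the point, a minimal sketch (the class name SameHashDemo is an invented example name): two Hashtables with clearly different contents both report a hash code of 0, because every entry has a key equal to its value.

import java.util.Hashtable;

public class SameHashDemo {
    public static void main(final String[] args) {
        final Hashtable<String, String> a = new Hashtable<String, String>();
        a.put("A", "A");

        final Hashtable<String, String> b = new Hashtable<String, String>();
        b.put("B", "B");
        b.put("C", "C");

        // Every entry's key.hashCode() ^ value.hashCode() is 0, so both
        // hash codes stay 0 even though the maps differ.
        System.out.println(a.hashCode()); // 0
        System.out.println(b.hashCode()); // 0
        System.out.println(a.equals(b));  // false
    }
}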
Solution
From the documentation of Hashtable#hashCode(): it returns the hash code value for this Map as per the definition in the Map interface, i.e. the sum of the hash codes of each entry in the map's entrySet() view, where an entry's hash code is key.hashCode() ^ value.hashCode() (with 0 substituted for a null key or value).
In other words: it is a poor implementation, arguably a broken one, but it does not violate the specification.
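To check that this behaviour matches the Map contract, here is a minimal sketch (the class name MapHashContract is just for illustration) that recomputes the hash exactly as the specification describes, i.e. the sum over all entries of key.hashCode() ^ value.hashCode(), and compares it with what Hashtable reports:

import java.util.Hashtable;
import java.util.Map;

public class MapHashContract {
    public static void main(final String[] args) {
        final Hashtable<String, String> ht = new Hashtable<String, String>();
        ht.put("Test", "Test");
        ht.put("Test2", "Hello world");

        // Map#hashCode() is specified as the sum of the hash codes of the
        // entries in entrySet(), and Map.Entry#hashCode() is specified as
        // key.hashCode() ^ value.hashCode() (using 0 for null).
        int expected = 0;
        for (final Map.Entry<String, String> e : ht.entrySet()) {
            expected += e.getKey().hashCode() ^ e.getValue().hashCode();
        }

        System.out.println(expected == ht.hashCode()); // true
    }
}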