2. HashMap: It takes the key's hashCode(), spreads the high bits, and then uses a masking (modulo-style) operation to locate the bucket in which the entry is stored. Java's HashMap resolves collisions with separate chaining (linked lists). In most cases a key can be located directly, so access is very fast. A HashMap allows at most one entry with a null key and any number of entries with null values. It is not thread-safe; if thread safety is required, wrap it with Collections.synchronizedMap or use ConcurrentHashMap instead, as sketched below.
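A minimal sketch of the two thread-safe options mentioned above (the class and variable names are illustrative, not from the source):
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ThreadSafeMapDemo {
    public static void main(String[] args) {
        // Option 1: wrap an existing HashMap; every call goes through a single mutex.
        Map<String, Integer> syncMap = Collections.synchronizedMap(new HashMap<>());
        syncMap.put("a", 1);

        // Option 2: ConcurrentHashMap, designed for concurrent access and
        // usually scales much better under contention.
        Map<String, Integer> concurrentMap = new ConcurrentHashMap<>();
        concurrentMap.put("a", 1);
    }
}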
3. Hashtable: Hashtable is a legacy class. Much of its mapping functionality is similar to HashMap, but it extends Dictionary. It is thread-safe, yet its concurrency is worse than ConcurrentHashMap, because ConcurrentHashMap introduces lock segmentation.
4. LinkedHashMap: LinkedHashMap is a subclass of HashMap that preserves insertion order: when you iterate over it, entries come back in the order in which they were inserted. It can also be constructed with a flag that orders entries by access order instead.
5. TreeMap: TreeMap implements the SortedMap interface and keeps its entries sorted by key, in ascending key order by default; a custom Comparator can also be supplied. Iterating over a TreeMap therefore yields entries in sorted order. The keys must either implement Comparable or a Comparator must be passed to the TreeMap constructor; otherwise a java.lang.ClassCastException is thrown at run time, as illustrated by the sketch below.
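A small sketch of both options (the demo class and the int[]-keyed map are illustrative examples, not JDK code):
import java.util.Comparator;
import java.util.TreeMap;

public class TreeMapDemo {
    public static void main(String[] args) {
        // Keys that implement Comparable (String does) can rely on natural ordering.
        TreeMap<String, Integer> byNaturalOrder = new TreeMap<>();
        byNaturalOrder.put("banana", 2);
        byNaturalOrder.put("apple", 1);
        System.out.println(byNaturalOrder.firstKey()); // apple

        // Keys that are not Comparable (e.g. int[]) need an explicit Comparator,
        // otherwise the first put throws ClassCastException.
        Comparator<int[]> byLen = Comparator.comparingInt(a -> a.length);
        TreeMap<int[], String> byLength = new TreeMap<>(byLen);
        byLength.put(new int[]{1, 2}, "two elements");
    }
}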
6. The hash bucket array is a trade-off between space cost and time cost. The hash algorithm and the resizing mechanism are designed so that the probability of hash collisions stays low while the bucket array (Node[] table) occupies little space.
7. Resizing is a particularly expensive operation, so when using a HashMap you should estimate how many entries it will hold and pass a rough initial capacity to the constructor, to avoid frequent resizing. The load factor can be changed and may even be greater than 1, but it should not be modified lightly unless the situation is truly special.
The load factor expresses how full the hash table is allowed to become: initialCapacity * loadFactor = the threshold at which the HashMap resizes. A larger load factor means the table is filled more densely and can hold more elements before resizing, but with more elements the chains grow longer and lookup efficiency drops. Conversely, a smaller load factor keeps the data sparser: some space is wasted, but lookups are faster.
8. It is recommended that initialCapacity be a power of two. When the array length is a power of two, different keys are less likely to compute the same index, so the data spreads more evenly across the array; collisions are rarer and lookups rarely need to walk the chain at a slot, which keeps query efficiency high. Choose loadFactor according to your needs: if iteration performance is not important, it can be set a bit higher. A pre-sizing sketch follows.
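A minimal pre-sizing sketch based on this advice; newMapExpecting and expectedSize are illustrative names, not JDK API:
import java.util.HashMap;
import java.util.Map;

public class PresizedMapDemo {
    // Choose an initial capacity so that expectedSize entries fit without resizing:
    // capacity * loadFactor must be >= expectedSize, hence expectedSize / 0.75 (+1 to be safe).
    static <K, V> Map<K, V> newMapExpecting(int expectedSize) {
        int initialCapacity = (int) (expectedSize / 0.75f) + 1;
        return new HashMap<>(initialCapacity); // HashMap rounds this up to a power of two
    }

    public static void main(String[] args) {
        Map<String, Integer> m = newMapExpecting(1000); // table becomes 2048, so filling it never triggers a resize
        for (int i = 0; i < 1000; i++) {
            m.put("key" + i, i);
        }
        System.out.println(m.size());
    }
}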
9. The hash algorithm is essentially three steps: take the key's hashCode value, spread the high bits, then mask off the index (the modulo-style step).
10. Core constants:
static final int DEFAULT_INITIAL_CAPACITY = 1 << 4; // default table size (16)
static final int MAXIMUM_CAPACITY = 1 << 30;        // maximum table size
static final float DEFAULT_LOAD_FACTOR = 0.75f;     // default load factor
static final int TREEIFY_THRESHOLD = 8;    // chain-to-tree threshold: if the hash function is poor, resizing alone cannot shorten the chains, so when a bin's chain grows beyond 8 nodes it is converted into a red-black tree
static final int UNTREEIFY_THRESHOLD = 6;  // tree-to-chain threshold: during resize, a tree bin that shrinks below 6 nodes is converted back into a linked list
static final int MIN_TREEIFY_CAPACITY = 64; // smallest table capacity at which bins may be treeified; below this (table capacity < MIN_TREEIFY_CAPACITY) an overfull bin triggers a resize instead. It should be at least 4 * TREEIFY_THRESHOLD. This extra check means treeification only happens once the table has at least 64 slots, avoiding unnecessary conversions early on, when several entries may happen to land in the same bin.
Ideally, the number of elements in each bin of the hash table follows a Poisson distribution, P(k) = (λ^k / k!) · e^(−λ).
With a load factor of 0.75, the parameter λ in this formula is about 0.5, so the relationship between the number of elements in a bin and its probability is as follows:
Count | Probability
0     | 0.60653066
1     | 0.30326533
2     | 0.07581633
3     | 0.01263606
4     | 0.00157952
5     | 0.00015795
6     | 0.00001316
7     | 0.00000094
8     | 0.00000006
This is why a bin's chain is converted into a red-black tree only after its length exceeds 8: under normal circumstances the probability of that happening is negligible, and when it does occur it almost certainly means the hash function is badly designed.
static class Node<K,V> implements Map.Entry<K,V> {
    final int hash;
    final K key;
    V value;
    Node<K,V> next;

    Node(int hash, K key, V value, Node<K,V> next) {
        this.hash = hash;
        this.key = key;
        this.value = value;
        this.next = next;
    }

    public final K getKey()        { return key; }
    public final V getValue()      { return value; }
    public final String toString() { return key + "=" + value; }

    public final int hashCode() {
        return Objects.hashCode(key) ^ Objects.hashCode(value);
    }

    public final V setValue(V newValue) {
        V oldValue = value;
        value = newValue;
        return oldValue;
    }

    public final boolean equals(Object o) {
        if (o == this)
            return true;
        if (o instanceof Map.Entry) {
            Map.Entry<?,?> e = (Map.Entry<?,?>) o;
            if (Objects.equals(key, e.getKey()) &&
                Objects.equals(value, e.getValue()))
                return true;
        }
        return false;
    }
}
———————————————— Two important static methods ————————————————
static final int hash(Object key) { // XOR the key's hashCode with the same value shifted right by 16 bits
    int h;
    return (key == null) ? 0 : (h = key.hashCode()) ^ (h >>> 16);
}
Why do this? It has to do with how HashMap computes the table index. Because the table length is always a power of two, the index depends only on the low n bits of the hash; the AND with (length - 1) masks the high bits out. For example, with table.length = 2^4 = 16 only the low 4 bits of the hash take part in the computation, which makes collisions easy to produce. Weighing speed, utility and quality, the designers XOR the high 16 bits into the low 16 bits to reduce this effect. They also took into account that hashCode values are already reasonably well distributed, and that heavy collisions are handled by the tree bins anyway. A single XOR keeps the overhead tiny while still letting the high bits influence the index when the table is small, reducing collisions. A worked sketch follows.
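A worked sketch of the index computation for a 16-slot table; the spread helper simply mirrors the hash() method above and the key "hello" is an arbitrary example:
public class HashIndexDemo {
    // Same spreading step as HashMap.hash(Object): XOR the high 16 bits into the low 16 bits.
    static int spread(int h) {
        return h ^ (h >>> 16);
    }

    public static void main(String[] args) {
        int n = 16;                         // table.length = 2^4
        int h = "hello".hashCode();         // 99162322 = 0x05E918D2
        int index1 = (n - 1) & h;           // only the low 4 bits are used: 0x2 = 2
        int index2 = (n - 1) & spread(h);   // the high bits now influence the result as well
        System.out.println(index1 + " " + index2);
    }
}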
The reason cap - 1 is assigned to n first is to ensure the result is greater than or equal to the original cap even when cap is already a power of two. If cap were already a power of two and the -1 step were skipped, the chain of unsigned right shifts below would return twice cap. Since an int capacity is a positive value, after the final n |= n >>> 16 all bits below the highest set bit are 1; if the result would exceed MAXIMUM_CAPACITY (2^30), it is clamped to MAXIMUM_CAPACITY.
static final int tableSizeFor(int cap) { // returns a power-of-two size for the given target capacity (the smallest power of two >= cap)
    int n = cap - 1;
    n |= n >>> 1;
    n |= n >>> 2;
    n |= n >>> 4;
    n |= n >>> 8;
    n |= n >>> 16;
    return (n < 0) ? 1 : (n >= MAXIMUM_CAPACITY) ? MAXIMUM_CAPACITY : n + 1;
}
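A worked example of tableSizeFor; the method body is copied from above so the demo is self-contained:
public class TableSizeForDemo {
    static final int MAXIMUM_CAPACITY = 1 << 30;

    // Copy of HashMap.tableSizeFor, for illustration only.
    static int tableSizeFor(int cap) {
        int n = cap - 1;
        n |= n >>> 1;
        n |= n >>> 2;
        n |= n >>> 4;
        n |= n >>> 8;
        n |= n >>> 16;
        return (n < 0) ? 1 : (n >= MAXIMUM_CAPACITY) ? MAXIMUM_CAPACITY : n + 1;
    }

    public static void main(String[] args) {
        // cap = 10: n = 9 (1001b); the shifts smear the top bit down -> 1111b = 15 -> 15 + 1 = 16
        System.out.println(tableSizeFor(10)); // 16
        // cap = 16 is already a power of two; the initial -1 keeps it from doubling to 32
        System.out.println(tableSizeFor(16)); // 16
    }
}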
———————————————————————Fields——————————————————————————————————————————————————————————————————————————————————————————————
transient Node<K,V>[] table;            // the bucket array, allocated on first use and resized as necessary; its length is always a power of two
transient Set<Map.Entry<K,V>> entrySet; // cached entrySet() view
transient int size;                     // number of key-value mappings in the map
transient int modCount;                 // number of structural modifications (changes to the number of mappings, or internal reorganizations such as rehashing); used by the collection views for fail-fast iteration
int threshold;                          // the next size value at which resize() is triggered (capacity * load factor)
final float loadFactor;                 // load factor of the hash table
-------------------- Public methods -------------------------------------------------------------------------------------
public HashMap(int initialCapacity, float loadFactor) { // constructs an empty HashMap with the given initial capacity and load factor
    if (initialCapacity < 0)
        throw new IllegalArgumentException("Illegal initial capacity: " +
                                           initialCapacity);
    if (initialCapacity > MAXIMUM_CAPACITY)
        initialCapacity = MAXIMUM_CAPACITY;
    if (loadFactor <= 0 || Float.isNaN(loadFactor))
        throw new IllegalArgumentException("Illegal load factor: " +
                                           loadFactor);
    this.loadFactor = loadFactor;
    this.threshold = tableSizeFor(initialCapacity);
}
public HashMap(int initialCapacity) { // constructs an empty HashMap with the given initial capacity and the default load factor (0.75)
    this(initialCapacity, DEFAULT_LOAD_FACTOR);
}
public HashMap() { // constructs an empty HashMap with the default capacity and default load factor
    this.loadFactor = DEFAULT_LOAD_FACTOR; // all other fields defaulted
}
public HashMap(Map<? extends K, ? extends V> m) { // constructs a HashMap with the same mappings as the given map, the default load factor 0.75 and an initial capacity large enough to hold those mappings
    this.loadFactor = DEFAULT_LOAD_FACTOR;
    putMapEntries(m, false);
}
final void putMapEntries(Map<? extends K, ? extends V> m, boolean evict) { // implements Map.putAll and the Map constructor
    int s = m.size();
    if (s > 0) {
        if (table == null) { // pre-size
            float ft = ((float)s / loadFactor) + 1.0F;
            int t = ((ft < (float)MAXIMUM_CAPACITY) ?
                     (int)ft : MAXIMUM_CAPACITY);
            if (t > threshold)
                threshold = tableSizeFor(t);
        }
        else if (s > threshold)
            resize();
        for (Map.Entry<? extends K, ? extends V> e : m.entrySet()) {
            K key = e.getKey();
            V value = e.getValue();
            putVal(hash(key), key, value, false, evict);
        }
    }
}
public int size() { // returns the number of key-value mappings in the map
    return size;
}
public boolean isEmpty() { // returns true if the map contains no key-value mappings
    return size == 0;
}
public V get(Object key) { // returns the value mapped to the given key, or null if there is no mapping
    Node<K,V> e;
    return (e = getNode(hash(key), key)) == null ? null : e.value;
}
final Node<K,V> getNode(int hash, Object key) {
    Node<K,V>[] tab; Node<K,V> first, e; int n; K k;
    if ((tab = table) != null && (n = tab.length) > 0 &&
        (first = tab[(n - 1) & hash]) != null) {
        if (first.hash == hash && // always check first node
            ((k = first.key) == key || (key != null && key.equals(k))))
            return first;
        if ((e = first.next) != null) {
            if (first instanceof TreeNode)
                return ((TreeNode<K,V>)first).getTreeNode(hash, key);
            do {
                if (e.hash == hash &&
                    ((k = e.key) == key || (key != null && key.equals(k))))
                    return e;
            } while ((e = e.next) != null);
        }
    }
    return null;
}
public boolean containsKey(Object key) { // returns true if the map contains a mapping for the given key
    return getNode(hash(key), key) != null;
}
public V put(K key, V value) { // associates the given value with the given key; if the map already contained a mapping for the key, the old value is replaced
    return putVal(hash(key), key, value, false, true);
}
final V putVal(int hash, K key, V value, boolean onlyIfAbsent, boolean evict) { // implements Map.put and related methods
    Node<K,V>[] tab; Node<K,V> p; int n, i;
    if ((tab = table) == null || (n = tab.length) == 0)
        n = (tab = resize()).length;
    if ((p = tab[i = (n - 1) & hash]) == null)
        tab[i] = newNode(hash, key, value, null);
    else {
        Node<K,V> e; K k;
        if (p.hash == hash &&
            ((k = p.key) == key || (key != null && key.equals(k))))
            e = p;
        else if (p instanceof TreeNode)
            e = ((TreeNode<K,V>)p).putTreeVal(this, tab, hash, key, value);
        else {
            for (int binCount = 0; ; ++binCount) {
                if ((e = p.next) == null) {
                    p.next = newNode(hash, key, value, null);
                    if (binCount >= TREEIFY_THRESHOLD - 1) // -1 for 1st
                        treeifyBin(tab, hash);
                    break;
                }
                if (e.hash == hash &&
                    ((k = e.key) == key || (key != null && key.equals(k))))
                    break;
                p = e;
            }
        }
        if (e != null) { // existing mapping for key
            V oldValue = e.value;
            if (!onlyIfAbsent || oldValue == null)
                e.value = value;
            afterNodeAccess(e);
            return oldValue;
        }
    }
    ++modCount;
    if (++size > threshold)
        resize();
    afterNodeInsertion(evict);
    return null;
}
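A short sketch of the resulting put semantics: put returns the previous value (or null), and putIfAbsent, which calls putVal with onlyIfAbsent = true, leaves an existing non-null value untouched:
import java.util.HashMap;
import java.util.Map;

public class PutSemanticsDemo {
    public static void main(String[] args) {
        Map<String, Integer> m = new HashMap<>();
        System.out.println(m.put("k", 1));         // null (no previous mapping)
        System.out.println(m.put("k", 2));         // 1    (old value returned, value replaced)
        System.out.println(m.putIfAbsent("k", 3)); // 2    (onlyIfAbsent: existing value kept)
        System.out.println(m.get("k"));            // 2
    }
}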
final Node<K,V>[] resize() {
    Node<K,V>[] oldTab = table;
    int oldCap = (oldTab == null) ? 0 : oldTab.length;
    int oldThr = threshold;
    int newCap, newThr = 0;
    if (oldCap > 0) {
        if (oldCap >= MAXIMUM_CAPACITY) {
            threshold = Integer.MAX_VALUE;
            return oldTab;
        }
        else if ((newCap = oldCap << 1) < MAXIMUM_CAPACITY &&
                 oldCap >= DEFAULT_INITIAL_CAPACITY)
            newThr = oldThr << 1; // double threshold
    }
    else if (oldThr > 0) // initial capacity was placed in threshold
        newCap = oldThr;
    else {               // zero initial threshold signifies using defaults
        newCap = DEFAULT_INITIAL_CAPACITY;
        newThr = (int)(DEFAULT_LOAD_FACTOR * DEFAULT_INITIAL_CAPACITY);
    }
    if (newThr == 0) {
        float ft = (float)newCap * loadFactor;
        newThr = (newCap < MAXIMUM_CAPACITY && ft < (float)MAXIMUM_CAPACITY ?
                  (int)ft : Integer.MAX_VALUE);
    }
    threshold = newThr;
    @SuppressWarnings({"rawtypes","unchecked"})
    Node<K,V>[] newTab = (Node<K,V>[])new Node[newCap];
    table = newTab;
    if (oldTab != null) {
        for (int j = 0; j < oldCap; ++j) {
            Node<K,V> e;
            if ((e = oldTab[j]) != null) {
                oldTab[j] = null;
                if (e.next == null)
                    newTab[e.hash & (newCap - 1)] = e;
                else if (e instanceof TreeNode)
                    ((TreeNode<K,V>)e).split(this, newTab, j, oldCap);
                else { // preserve order
                    Node<K,V> loHead = null, loTail = null;
                    Node<K,V> hiHead = null, hiTail = null;
                    Node<K,V> next;
                    do {
                        next = e.next;
                        if ((e.hash & oldCap) == 0) {
                            if (loTail == null)
                                loHead = e;
                            else
                                loTail.next = e;
                            loTail = e;
                        }
                        else {
                            if (hiTail == null)
                                hiHead = e;
                            else
                                hiTail.next = e;
                            hiTail = e;
                        }
                    } while ((e = next) != null);
                    if (loTail != null) {
                        loTail.next = null;
                        newTab[j] = loHead;
                    }
                    if (hiTail != null) {
                        hiTail.next = null;
                        newTab[j + oldCap] = hiHead;
                    }
                }
            }
        }
    }
    return newTab;
}
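The interesting part of resize is that, because the capacity doubles, an entry either stays at its old index j or moves to j + oldCap, and the single bit (e.hash & oldCap) decides which; no per-entry rehash is needed. A small sketch with hand-picked hash values:
public class ResizeSplitDemo {
    public static void main(String[] args) {
        int oldCap = 16, newCap = 32;
        int h1 = 0b0_0101;   //  5: the bit corresponding to oldCap (16) is 0
        int h2 = 0b1_0101;   // 21: the bit corresponding to oldCap (16) is 1

        // Before the resize both hashes land in bucket 5 of the 16-slot table.
        System.out.println((oldCap - 1) & h1); // 5
        System.out.println((oldCap - 1) & h2); // 5

        // After doubling, the extra bit (hash & oldCap) decides the new bucket:
        // 0 -> same index j (lo list), 1 -> index j + oldCap (hi list).
        System.out.println((newCap - 1) & h1); // 5
        System.out.println((newCap - 1) & h2); // 21 = 5 + 16
    }
}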
final void treeifyBin(Node<K,V>[] tab, int hash) { // replaces the chain in the bin for the given hash with a red-black tree, unless the table is still smaller than MIN_TREEIFY_CAPACITY, in which case it resizes instead
    int n, index; Node<K,V> e;
    if (tab == null || (n = tab.length) < MIN_TREEIFY_CAPACITY)
        resize();
    else if ((e = tab[index = (n - 1) & hash]) != null) {
        TreeNode<K,V> hd = null, tl = null;
        do {
            TreeNode<K,V> p = replacementTreeNode(e, null);
            if (tl == null)
                hd = p;
            else {
                p.prev = tl;
                tl.next = p;
            }
            tl = p;
        } while ((e = e.next) != null);
        if ((tab[index] = hd) != null)
            hd.treeify(tab);
    }
}
public void putAll(Map<? extends K, ? extends V> m) { // copies all mappings from the given map into this one, replacing the values of any duplicate keys
    putMapEntries(m, true);
}
public V remove(Object key) { // removes the mapping for the given key if it exists
    Node<K,V> e;
    return (e = removeNode(hash(key), key, null, false, true)) == null ?
        null : e.value;
}
final Node<K,V> removeNode(int hash, Object key, Object value,
                           boolean matchValue, boolean movable) {
    Node<K,V>[] tab; Node<K,V> p; int n, index;
    if ((tab = table) != null && (n = tab.length) > 0 &&
        (p = tab[index = (n - 1) & hash]) != null) {
        Node<K,V> node = null, e; K k; V v;
        if (p.hash == hash &&
            ((k = p.key) == key || (key != null && key.equals(k))))
            node = p;
        else if ((e = p.next) != null) {
            if (p instanceof TreeNode)
                node = ((TreeNode<K,V>)p).getTreeNode(hash, key);
            else {
                do {
                    if (e.hash == hash &&
                        ((k = e.key) == key ||
                         (key != null && key.equals(k)))) {
                        node = e;
                        break;
                    }
                    p = e;
                } while ((e = e.next) != null);
            }
        }
        if (node != null && (!matchValue || (v = node.value) == value ||
                             (value != null && value.equals(v)))) {
            if (node instanceof TreeNode)
                ((TreeNode<K,V>)node).removeTreeNode(this, tab, movable);
            else if (node == p)
                tab[index] = node.next;
            else
                p.next = node.next;
            ++modCount;
            --size;
            afterNodeRemoval(node);
            return node;
        }
    }
    return null;
}
public void clear() { // removes all mappings; the map is empty afterwards
    Node<K,V>[] tab;
    modCount++;
    if ((tab = table) != null && size > 0) {
        size = 0;
        for (int i = 0; i < tab.length; ++i)
            tab[i] = null;
    }
}
public boolean containsValue(Object value) { // returns true if one or more keys map to the given value
    Node<K,V>[] tab; V v;
    if ((tab = table) != null && size > 0) {
        for (int i = 0; i < tab.length; ++i) {
            for (Node<K,V> e = tab[i]; e != null; e = e.next) {
                if ((v = e.value) == value ||
                    (value != null && value.equals(v)))
                    return true;
            }
        }
    }
    return false;
}
public Set<K> keySet() {
    Set<K> ks = keySet;
    if (ks == null) {
        ks = new KeySet();
        keySet = ks;
    }
    return ks;
}
final class KeySet extends AbstractSet<K> {
    public final int size()                 { return size; }
    public final void clear()               { HashMap.this.clear(); }
    public final Iterator<K> iterator()     { return new KeyIterator(); }
    public final boolean contains(Object o) { return containsKey(o); }
    public final boolean remove(Object key) {
        return removeNode(hash(key), key, null, false, true) != null;
    }
    public final Spliterator<K> spliterator() {
        return new KeySpliterator<>(HashMap.this, 0, -1, 0, 0);
    }
    public final void forEach(Consumer<? super K> action) {
        Node<K,V>[] tab;
        if (action == null)
            throw new NullPointerException();
        if (size > 0 && (tab = table) != null) {
            int mc = modCount;
            for (int i = 0; i < tab.length; ++i) {
                for (Node<K,V> e = tab[i]; e != null; e = e.next)
                    action.accept(e.key);
            }
            if (modCount != mc)
                throw new ConcurrentModificationException();
        }
    }
}
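The modCount check above is what makes the collection views fail fast. A minimal sketch that deliberately triggers it (the map contents are arbitrary):
import java.util.ConcurrentModificationException;
import java.util.HashMap;
import java.util.Map;

public class FailFastDemo {
    public static void main(String[] args) {
        Map<String, Integer> m = new HashMap<>();
        m.put("a", 1);
        m.put("b", 2);
        try {
            // Structurally modifying the map while iterating one of its views
            // changes modCount and trips the fail-fast check.
            m.keySet().forEach(k -> m.remove("a"));
        } catch (ConcurrentModificationException expected) {
            System.out.println("fail-fast: " + expected);
        }
    }
}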
public Collection<V> values() {
    Collection<V> vs = values;
    if (vs == null) {
        vs = new Values();
        values = vs;
    }
    return vs;
}
final class Values extends AbstractCollection<V> {
    public final int size()                 { return size; }
    public final void clear()               { HashMap.this.clear(); }
    public final Iterator<V> iterator()     { return new ValueIterator(); }
    public final boolean contains(Object o) { return containsValue(o); }
    public final Spliterator<V> spliterator() {
        return new ValueSpliterator<>(HashMap.this, 0, -1, 0, 0);
    }
    public final void forEach(Consumer<? super V> action) {
        Node<K,V>[] tab;
        if (action == null)
            throw new NullPointerException();
        if (size > 0 && (tab = table) != null) {
            int mc = modCount;
            for (int i = 0; i < tab.length; ++i) {
                for (Node<K,V> e = tab[i]; e != null; e = e.next)
                    action.accept(e.value);
            }
            if (modCount != mc)
                throw new ConcurrentModificationException();
        }
    }
}
public Set<Map.Entry<K,V>> entrySet() {
    Set<Map.Entry<K,V>> es;
    return (es = entrySet) == null ? (entrySet = new EntrySet()) : es;
}
final class EntrySet extends AbstractSet<Map.Entry<K,V>> {
    public final int size()                 { return size; }
    public final void clear()               { HashMap.this.clear(); }
    public final Iterator<Map.Entry<K,V>> iterator() {
        return new EntryIterator();
    }
    public final boolean contains(Object o) {
        if (!(o instanceof Map.Entry))
            return false;
        Map.Entry<?,?> e = (Map.Entry<?,?>) o;
        Object key = e.getKey();
        Node<K,V> candidate = getNode(hash(key), key);
        return candidate != null && candidate.equals(e);
    }
    public final boolean remove(Object o) {
        if (o instanceof Map.Entry) {
            Map.Entry<?,?> e = (Map.Entry<?,?>) o;
            Object key = e.getKey();
            Object value = e.getValue();
            return removeNode(hash(key), key, value, true, true) != null;
        }
        return false;
    }
    public final Spliterator<Map.Entry<K,V>> spliterator() {
        return new EntrySpliterator<>(HashMap.this, 0, -1, 0, 0);
    }
    public final void forEach(Consumer<? super Map.Entry<K,V>> action) {
        Node<K,V>[] tab;
        if (action == null)
            throw new NullPointerException();
        if (size > 0 && (tab = table) != null) {
            int mc = modCount;
            for (int i = 0; i < tab.length; ++i) {
                for (Node<K,V> e = tab[i]; e != null; e = e.next)
                    action.accept(e);
            }
            if (modCount != mc)
                throw new ConcurrentModificationException();
        }
    }
}
// Overrides of JDK8 Map extension methods
@Override
public V getOrDefault(Object key, V defaultValue) {
    Node<K,V> e;
    return (e = getNode(hash(key), key)) == null ? defaultValue : e.value;
}
@Override
public V putIfAbsent(K key, V value) {
    return putVal(hash(key), key, value, true, true);
}
@Override
public boolean remove(Object key, Object value) {
    return removeNode(hash(key), key, value, true, true) != null;
}
@Override
public boolean replace(K key, V oldValue, V newValue) {
    Node<K,V> e; V v;
    if ((e = getNode(hash(key), key)) != null &&
        ((v = e.value) == oldValue || (v != null && v.equals(oldValue)))) {
        e.value = newValue;
        afterNodeAccess(e);
        return true;
    }
    return false;
}
@Override
public V replace(K key, V value) {
    Node<K,V> e;
    if ((e = getNode(hash(key), key)) != null) {
        V oldValue = e.value;
        e.value = value;
        afterNodeAccess(e);
        return oldValue;
    }
    return null;
}
@Override
public V computeIfAbsent(K key,
                         Function<? super K, ? extends V> mappingFunction) {
    if (mappingFunction == null)
        throw new NullPointerException();
    int hash = hash(key);
    Node<K,V>[] tab; Node<K,V> first; int n, i;
    int binCount = 0;
    TreeNode<K,V> t = null;
    Node<K,V> old = null;
    if (size > threshold || (tab = table) == null ||
        (n = tab.length) == 0)
        n = (tab = resize()).length;
    if ((first = tab[i = (n - 1) & hash]) != null) {
        if (first instanceof TreeNode)
            old = (t = (TreeNode<K,V>)first).getTreeNode(hash, key);
        else {
            Node<K,V> e = first; K k;
            do {
                if (e.hash == hash &&
                    ((k = e.key) == key || (key != null && key.equals(k)))) {
                    old = e;
                    break;
                }
                ++binCount;
            } while ((e = e.next) != null);
        }
        V oldValue;
        if (old != null && (oldValue = old.value) != null) {
            afterNodeAccess(old);
            return oldValue;
        }
    }
    V v = mappingFunction.apply(key);
    if (v == null) {
        return null;
    } else if (old != null) {
        old.value = v;
        afterNodeAccess(old);
        return v;
    }
    else if (t != null)
        t.putTreeVal(this, tab, hash, key, v);
    else {
        tab[i] = newNode(hash, key, v, first);
        if (binCount >= TREEIFY_THRESHOLD - 1)
            treeifyBin(tab, hash);
    }
    ++modCount;
    ++size;
    afterNodeInsertion(true);
    return v;
}
@Override
public V computeIfPresent(K key,
                          BiFunction<? super K, ? super V, ? extends V> remappingFunction) {
    if (remappingFunction == null)
        throw new NullPointerException();
    Node<K,V> e; V oldValue;
    int hash = hash(key);
    if ((e = getNode(hash, key)) != null &&
        (oldValue = e.value) != null) {
        V v = remappingFunction.apply(key, oldValue);
        if (v != null) {
            e.value = v;
            afterNodeAccess(e);
            return v;
        }
        else
            removeNode(hash, key, null, false, true);
    }
    return null;
}
@Override
public V compute(K key,
                 BiFunction<? super K, ? super V, ? extends V> remappingFunction) {
    if (remappingFunction == null)
        throw new NullPointerException();
    int hash = hash(key);
    Node<K,V>[] tab; Node<K,V> first; int n, i;
    int binCount = 0;
    TreeNode<K,V> t = null;
    Node<K,V> old = null;
    if (size > threshold || (tab = table) == null ||
        (n = tab.length) == 0)
        n = (tab = resize()).length;
    if ((first = tab[i = (n - 1) & hash]) != null) {
        if (first instanceof TreeNode)
            old = (t = (TreeNode<K,V>)first).getTreeNode(hash, key);
        else {
            Node<K,V> e = first; K k;
            do {
                if (e.hash == hash &&
                    ((k = e.key) == key || (key != null && key.equals(k)))) {
                    old = e;
                    break;
                }
                ++binCount;
            } while ((e = e.next) != null);
        }
    }
    V oldValue = (old == null) ? null : old.value;
    V v = remappingFunction.apply(key, oldValue);
    if (old != null) {
        if (v != null) {
            old.value = v;
            afterNodeAccess(old);
        }
        else
            removeNode(hash, key, null, false, true);
    }
    else if (v != null) {
        if (t != null)
            t.putTreeVal(this, tab, hash, key, v);
        else {
            tab[i] = newNode(hash, key, v, first);
            if (binCount >= TREEIFY_THRESHOLD - 1)
                treeifyBin(tab, hash);
        }
        ++modCount;
        ++size;
        afterNodeInsertion(true);
    }
    return v;
}
@Override
public V merge(K key, V value,
               BiFunction<? super V, ? super V, ? extends V> remappingFunction) {
    if (value == null)
        throw new NullPointerException();
    if (remappingFunction == null)
        throw new NullPointerException();
    int hash = hash(key);
    Node<K,V>[] tab; Node<K,V> first; int n, i;
    int binCount = 0;
    TreeNode<K,V> t = null;
    Node<K,V> old = null;
    if (size > threshold || (tab = table) == null ||
        (n = tab.length) == 0)
        n = (tab = resize()).length;
    if ((first = tab[i = (n - 1) & hash]) != null) {
        if (first instanceof TreeNode)
            old = (t = (TreeNode<K,V>)first).getTreeNode(hash, key);
        else {
            Node<K,V> e = first; K k;
            do {
                if (e.hash == hash &&
                    ((k = e.key) == key || (key != null && key.equals(k)))) {
                    old = e;
                    break;
                }
                ++binCount;
            } while ((e = e.next) != null);
        }
    }
    if (old != null) {
        V v;
        if (old.value != null)
            v = remappingFunction.apply(old.value, value);
        else
            v = value;
        if (v != null) {
            old.value = v;
            afterNodeAccess(old);
        }
        else
            removeNode(hash, key, null, false, true);
        return v;
    }
    if (value != null) {
        if (t != null)
            t.putTreeVal(this, tab, hash, key, value);
        else {
            tab[i] = newNode(hash, key, value, first);
            if (binCount >= TREEIFY_THRESHOLD - 1)
                treeifyBin(tab, hash);
        }
        ++modCount;
        ++size;
        afterNodeInsertion(true);
    }
    return value;
}
@Override
public void forEach(BiConsumer<? super K, ? super V> action) {
    Node<K,V>[] tab;
    if (action == null)
        throw new NullPointerException();
    if (size > 0 && (tab = table) != null) {
        int mc = modCount;
        for (int i = 0; i < tab.length; ++i) {
            for (Node<K,V> e = tab[i]; e != null; e = e.next)
                action.accept(e.key, e.value);
        }
        if (modCount != mc)
            throw new ConcurrentModificationException();
    }
}
@Override
public void replaceAll(BiFunction<? super K, ? super V, ? extends V> function) {
    Node<K,V>[] tab;
    if (function == null)
        throw new NullPointerException();
    if (size > 0 && (tab = table) != null) {
        int mc = modCount;
        for (int i = 0; i < tab.length; ++i) {
            for (Node<K,V> e = tab[i]; e != null; e = e.next) {
                e.value = function.apply(e.key, e.value);
            }
        }
        if (modCount != mc)
            throw new ConcurrentModificationException();
    }
}