1> MapMaker for creating ConcurrentMap instances.
2> CacheBuilder for creating LoadingCache and Cache instances.
3> CacheBuilderSpec for creating a CacheBuilder instance from a formatted string
4> CacheLoader that is used by a LoadingCache instance to retrieve a single value for a given key
5> CacheStats that provides statistics of the performance of the cache
6> RemovalListener that receives notifications when an entry has been removed from the cache
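CacheStats is only populated when statistics recording is enabled on the builder. A minimal sketch of how the pieces fit together (the class name CacheStatsDemo is made up for illustration):

```java
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheStats;

public class CacheStatsDemo {
    static CacheStats collectStats() {
        // recordStats() must be called on the builder, otherwise stats() returns all zeros
        Cache<String, String> cache = CacheBuilder.newBuilder()
                .recordStats()
                .build();
        cache.put("KEY_1", "VALUE_1");
        cache.getIfPresent("KEY_1"); // hit
        cache.getIfPresent("KEY_2"); // miss
        return cache.stats();
    }

    public static void main(String[] args) {
        CacheStats stats = collectStats();
        System.out.println("hitCount: " + stats.hitCount());   // 1
        System.out.println("missCount: " + stats.missCount()); // 1
        System.out.println("hitRate: " + stats.hitRate());     // 0.5
    }
}
```

stats() returns a snapshot, so it can be called repeatedly without resetting the counters.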
1> MapMaker
@Test
public void makeMapTest() {
    Map<Object, Object> map = new MapMaker()
            .concurrencyLevel(2)
            .weakValues()
            .weakKeys()
            .makeMap();
    map.put(new Object(), new Object());
}
Q: What's the meaning of "concurrencyLevel"?
Q: What's the benefit of using "weakValues" and "weakKeys"?
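Briefly: concurrencyLevel is a hint for how many threads are expected to update the map concurrently (it guides the internal segmentation, as in ConcurrentHashMap), and weakKeys/weakValues allow the garbage collector to reclaim entries once their keys or values are no longer strongly referenced elsewhere. One side effect worth knowing: weakKeys() switches key comparison to identity (==). A sketch of that behavior:

```java
import java.util.concurrent.ConcurrentMap;
import com.google.common.collect.MapMaker;

public class WeakKeysDemo {
    public static void main(String[] args) {
        ConcurrentMap<Object, Object> map = new MapMaker().weakKeys().makeMap();

        Object key = new Object();
        map.put(key, new Object());
        // While a strong reference to the key exists, the entry behaves normally.
        System.out.println(map.containsKey(key)); // true

        // weakKeys() compares keys by identity (==), not equals(),
        // so an equal-but-distinct key does NOT match.
        map.put("a", "1");
        System.out.println(map.get(new String("a"))); // null: different identity

        key = null;
        // Once the last strong reference to a key is gone, the entry becomes
        // eligible for garbage collection (exact timing is up to the GC).
    }
}
```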
2> Cache & LoadingCache
1. Cache
public interface Cache<K, V> {
    void put(K key, V value);

    @Nullable
    V getIfPresent(Object key);

    V get(K key, Callable<? extends V> valueLoader) throws ExecutionException;

    /**
     * Returns a map of the values associated with {@code keys} in this cache. The returned map
     * will only contain entries which are already present in the cache.
     */
    ImmutableMap<K, V> getAllPresent(Iterable<?> keys);

    void invalidate(Object key);

    void invalidateAll(Iterable<?> keys);

    void invalidateAll();

    ConcurrentMap<K, V> asMap();
}
1) get & getIfPresent
@Test(expected = InvalidCacheLoadException.class)
public void getTest() throws ExecutionException {
    Cache<String, String> cache = CacheBuilder.newBuilder().build();
    cache.put("KEY_1", "VALUE_1");

    String value = cache.getIfPresent("KEY_2");
    assertNull(value);

    value = cache.get("KEY_2", new Callable<String>() {
        public String call() throws Exception {
            return "VALUE_2";
        }
    });
    assertEquals("VALUE_2", value);

    value = cache.getIfPresent("KEY_2");
    assertEquals("VALUE_2", value);

    value = cache.get("KEY_2", new Callable<String>() {
        public String call() throws Exception {
            return null; // never invoked: the value is already cached
        }
    });
    assertEquals("VALUE_2", value);

    cache.invalidate("KEY_2");
    value = cache.get("KEY_2", new Callable<String>() {
        public String call() throws Exception {
            return null; // InvalidCacheLoadException would be thrown
        }
    });
}
The logic of cache.get(key, valueLoader) is:
1> Look up the corresponding value in the cache for the provided key.
2> If the value is found, it is returned; valueLoader will NEVER be invoked.
3> If the value is not found, valueLoader is invoked to compute it.
1> If valueLoader returns null, then CacheLoader$InvalidCacheLoadException will be thrown.
2> If valueLoader returns a non-null value, that value is returned and the key/value pair is stored in the cache at the same time.
Thus the rule of thumb is: DO NOT RETURN NULL FROM THE VALUELOADER.
If we want null to be returned when no corresponding value can be found, use getIfPresent(key) instead.
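When an absent value is a legitimate, cacheable outcome, one common pattern is to cache Optional<V> so the loader itself never returns null. A sketch (findInDatabase is a hypothetical lookup that may legitimately return null):

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import com.google.common.base.Optional;
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

public class OptionalValueDemo {
    static Cache<String, Optional<String>> cache = CacheBuilder.newBuilder().build();

    // Hypothetical lookup that may legitimately return null
    static String findInDatabase(String key) {
        return null;
    }

    static Optional<String> lookup(final String key) throws ExecutionException {
        return cache.get(key, new Callable<Optional<String>>() {
            @Override
            public Optional<String> call() {
                // fromNullable never returns null, so InvalidCacheLoadException
                // cannot occur; the "absent" result itself gets cached
                return Optional.fromNullable(findInDatabase(key));
            }
        });
    }

    public static void main(String[] args) throws ExecutionException {
        System.out.println(lookup("KEY_1").isPresent()); // false
        System.out.println(lookup("KEY_1").orNull());    // null
    }
}
```

Unlike getIfPresent, this also caches the fact that the key has no value, which avoids hitting the backing store again for repeated misses.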
But the Callable parameter seems to imply that valueLoader is executed asynchronously. Is that actually the case, and what do we do if we don't need or want an asynchronous task?
@Test
public void loaderThreadTest() throws ExecutionException, InterruptedException {
    Cache<String, String> cache = CacheBuilder.newBuilder().build();
    Callable<String> callable = new Callable<String>() {
        @Override
        public String call() throws Exception {
            System.out.println("Thread: " + Thread.currentThread());
            Thread.sleep(1000);
            return "VALUE_" + System.currentTimeMillis();
        }
    };
    System.out.println(System.currentTimeMillis());
    String value = cache.get("KEY_1", callable);
    System.out.println(System.currentTimeMillis());
    System.out.println(value);

    value = cache.getIfPresent("KEY_1");
    System.out.println(System.currentTimeMillis());
    System.out.println(value);
}
// output:
// 1409031531671
// Thread: Thread[main,5,main]
// 1409031532699
// VALUE_1409031532684
// 1409031532699
// VALUE_1409031532684
Q: It seems the callable is still executed in the main thread. How can we run valueLoader asynchronously?
@Test
public void syncGetTest() throws ExecutionException {
    Cache<String, String> cache = CacheBuilder.newBuilder().build();
    System.out.println(System.currentTimeMillis());
    String value = cache.get("KEY_1",
            Callables.returning("VALUE_" + System.currentTimeMillis()));
    // What if this were Callables.returning(timeConsumingService.get("KEY_1"))?
    // The main thread would still have to wait for the service to return.
    System.out.println(System.currentTimeMillis());
    System.out.println(value);
}
// output:
// 1409031825841
// 1409031825842
// 1409031825869
// VALUE_1409031825842
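To answer the question above: cache.get(key, valueLoader) always runs the loader on the calling thread. One workaround, sketched below under the assumption that blocking is only acceptable at the point of use, is to submit the whole get call to an ExecutorService and block on the Future only when the value is actually needed (the 1000ms sleep stands in for a slow service):

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

public class AsyncLoadDemo {
    public static void main(String[] args) throws Exception {
        final Cache<String, String> cache = CacheBuilder.newBuilder().build();
        ExecutorService executor = Executors.newSingleThreadExecutor();

        Future<String> future = executor.submit(new Callable<String>() {
            @Override
            public String call() throws Exception {
                return cache.get("KEY_1", new Callable<String>() {
                    @Override
                    public String call() throws Exception {
                        Thread.sleep(1000); // simulated slow service
                        return "VALUE_1";
                    }
                });
            }
        });

        // The main thread is free to do other work here while loading proceeds.
        System.out.println(future.get()); // blocks only when the value is needed
        executor.shutdown();
    }
}
```

Because Cache is thread-safe, other threads that call get for the same key while the load is in flight will simply block until it completes, then see the cached value.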
2) invalidate & invalidateAll
@Test
public void invalidateTest() {
    Cache<String, String> cache = CacheBuilder.newBuilder().build();
    cache.put("KEY_1", "VALUE_1");
    cache.put("KEY_2", "VALUE_2");
    cache.put("KEY_3", "VALUE_3");
    cache.put("KEY_4", "VALUE_4");

    String value = cache.getIfPresent("KEY_1");
    assertEquals("VALUE_1", value);
    cache.invalidate("KEY_1");
    value = cache.getIfPresent("KEY_1");
    assertNull(value);

    cache.invalidateAll(Lists.newArrayList("KEY_2", "KEY_3"));
    value = cache.getIfPresent("KEY_2");
    assertNull(value);
    value = cache.getIfPresent("KEY_3");
    assertNull(value);
    value = cache.getIfPresent("KEY_4");
    assertEquals("VALUE_4", value);

    cache.invalidateAll();
    value = cache.getIfPresent("KEY_4");
    assertNull(value);

    cache.invalidate("KEY_N"); // invalidating an absent key is a no-op
}
2. LoadingCache
public interface LoadingCache<K, V> extends Cache<K, V>, Function<K, V> {
    V get(K key) throws ExecutionException;

    V getUnchecked(K key);

    ImmutableMap<K, V> getAll(Iterable<? extends K> keys) throws ExecutionException;

    void refresh(K key);

    ConcurrentMap<K, V> asMap();
}
The LoadingCache interface extends the Cache interface with the self-loading functionality.
Consider the following code:
Book book = loadingCache.get(id);
If the book object was not available when the get call was executed, LoadingCache will know how to retrieve the object, store it in the cache, and return the value.
As implementations of LoadingCache are expected to be thread-safe, a call to get() with the same key while the cache is loading will block. Once the value has been loaded, the call returns the value that was loaded by the original call to get().
However, multiple calls to get with distinct keys would load concurrently.
static LoadingCache<String, String> cache = CacheBuilder.newBuilder()
        .build(new CacheLoader<String, String>() {
            // note: i is incremented without synchronization, so concurrent
            // loads may observe the same value (see the output below)
            private int i = 1;

            @Override
            public String load(String key) throws Exception {
                Thread.sleep(1000);
                return "DUMMY_VALUE" + (++i);
            }
        });

@Test
public void syncLoadingTest() throws ExecutionException {
    System.out.println(System.currentTimeMillis());
    String value = cache.get("DUMMY_KEY"); // blocks 1000ms for loading
    System.out.println(System.currentTimeMillis());
    System.out.println("Finished syncLoadingTest, value: " + value);
}
// output: We can see the loading process cost 1000ms.
// 1409046809839
// 1409046810850
// Finished syncLoadingTest, value: DUMMY_VALUE2

@SuppressWarnings("unchecked")
@Test
public void asyncReadingTest() throws ExecutionException, InterruptedException {
    Callable<String> readThread1 = new Callable<String>() {
        @Override
        public String call() throws Exception {
            return cache.get("DUMMY_KEY_1");
        }
    };
    Callable<String> readThread2 = new Callable<String>() {
        @Override
        public String call() throws Exception {
            return cache.get("DUMMY_KEY_2");
        }
    };
    Callable<String> readThread3 = new Callable<String>() {
        @Override
        public String call() throws Exception {
            return cache.get("DUMMY_KEY_3");
        }
    };
    System.out.println("Before invokeAll: " + System.currentTimeMillis());
    Executors.newFixedThreadPool(3).invokeAll(
            Lists.newArrayList(readThread1, readThread2, readThread3));
    System.out.println("After invokeAll: " + System.currentTimeMillis());

    System.out.println("Before get: " + System.currentTimeMillis());
    String value1 = cache.get("DUMMY_KEY_1");
    String value2 = cache.get("DUMMY_KEY_2");
    String value3 = cache.get("DUMMY_KEY_3");
    System.out.println("After get: " + System.currentTimeMillis() + ".\nvalue1: "
            + value1 + ", value2: " + value2 + ", value3: " + value3);
}
// output: We can see that loading all 3 values only cost 1000ms in total.
// Before invokeAll: 1409046901047
// After invokeAll: 1409046902116
// Before get: 1409046902116
// After get: 1409046902116.
// value1: DUMMY_VALUE2, value2: DUMMY_VALUE2, value3: DUMMY_VALUE3
If we have a collection of keys and would like to retrieve the values for each key, we will make the following call:
ImmutableMap<K, V> map = cache.getAll(Iterable<? extends K> keys);
The map returned from getAll could either be all cached values, all newly retrieved values, or a mix of already cached and newly retrieved values.
Q: Is the loading of uncached values synchronous or asynchronous?
A: Synchronous:
static LoadingCache<String, String> cache = CacheBuilder.newBuilder()
        .build(new CacheLoader<String, String>() {
            private int i = 1;

            @Override
            public String load(String key) throws Exception {
                Thread.sleep(1000);
                return "DUMMY_VALUE" + (++i);
            }
        });

@Test
public void getAllTest() throws ExecutionException {
    System.out.println("Before getAllTest: " + System.currentTimeMillis());
    cache.getAll(Lists.newArrayList("KEY_1", "KEY_2", "KEY_3"));
    System.out.println("After getAllTest: " + System.currentTimeMillis());
}
// output: We can see the total time consumption is about 3000ms (one load per key).
// Before getAllTest: 1409049717214
// After getAllTest: 1409049720343
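By default getAll falls back to one load(key) call per missing key, which is why the test above takes about 3000ms for three keys. CacheLoader also lets us override loadAll to fetch a whole batch of keys in one round trip; a sketch (the sleeps stand in for remote calls):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ExecutionException;
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.common.collect.Lists;

public class LoadAllDemo {
    static LoadingCache<String, String> cache = CacheBuilder.newBuilder()
            .build(new CacheLoader<String, String>() {
                @Override
                public String load(String key) throws Exception {
                    Thread.sleep(1000); // one round trip per key
                    return "VALUE_" + key;
                }

                @Override
                public Map<String, String> loadAll(Iterable<? extends String> keys)
                        throws Exception {
                    Thread.sleep(1000); // one round trip for the whole batch
                    Map<String, String> result = new HashMap<String, String>();
                    for (String key : keys) {
                        result.put(key, "VALUE_" + key);
                    }
                    return result;
                }
            });

    public static void main(String[] args) throws ExecutionException {
        long start = System.currentTimeMillis();
        cache.getAll(Lists.newArrayList("KEY_1", "KEY_2", "KEY_3"));
        // With loadAll overridden, three misses cost roughly 1000ms instead of 3000ms.
        System.out.println("Elapsed: " + (System.currentTimeMillis() - start) + "ms");
    }
}
```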
LoadingCache also provides a mechanism for refreshing values in the cache:
void refresh(K key);
By making a call to refresh, LoadingCache will retrieve a new value for the key. The current value will not be discarded until the new value has been returned; this means that calls to get during the loading process will return the current value in the cache. If an exception is thrown during the refresh call, the original value is kept in the cache. Keep in mind that if the value is retrieved asynchronously, the method could return before the value is actually refreshed.
static LoadingCache<String, String> cache = CacheBuilder.newBuilder()
        .build(new CacheLoader<String, String>() {
            private int i = 1;

            @Override
            public String load(String key) throws Exception {
                Thread.sleep(1000);
                return "DUMMY_VALUE" + (++i);
            }
        });

/**
 * Test for refresh() in a loading cache.<br/>
 * Calls to get() during the loading process will return the current value
 * in the cache.
 */
public static void main(String[] args) {
    cache.put("DUMMY_KEY1", "DUMMY_VALUE1");
    Thread refreshThread = new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                while (true) {
                    Thread.sleep(1000);
                    System.out.println("Start refresh KEY: DUMMY_KEY1");
                    cache.refresh("DUMMY_KEY1");
                    System.out.println("Finished refresh KEY: DUMMY_KEY1");
                }
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    });
    Thread getThread = new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                while (true) {
                    Thread.sleep(500);
                    System.out.println("Start get KEY: DUMMY_KEY1");
                    String value = cache.get("DUMMY_KEY1");
                    System.out.println("Finished get KEY: DUMMY_KEY1, VALUE: " + value);
                }
            } catch (ExecutionException e) {
                e.printStackTrace();
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    });
    refreshThread.start();
    getThread.start();
}
// output: We can see that during the course of a refresh, get() still returns the legacy value.
// Start get KEY: DUMMY_KEY1
// Finished get KEY: DUMMY_KEY1, VALUE: DUMMY_VALUE1
// Start get KEY: DUMMY_KEY1
// Start refresh KEY: DUMMY_KEY1
// Finished get KEY: DUMMY_KEY1, VALUE: DUMMY_VALUE1
// Start get KEY: DUMMY_KEY1
// Finished get KEY: DUMMY_KEY1, VALUE: DUMMY_VALUE1
// Start get KEY: DUMMY_KEY1
// Finished get KEY: DUMMY_KEY1, VALUE: DUMMY_VALUE1
// Finished refresh KEY: DUMMY_KEY1
// Start get KEY: DUMMY_KEY1
// Finished get KEY: DUMMY_KEY1, VALUE: DUMMY_VALUE2
// Start get KEY: DUMMY_KEY1
// Finished get KEY: DUMMY_KEY1, VALUE: DUMMY_VALUE2
// Start refresh KEY: DUMMY_KEY1
// Start get KEY: DUMMY_KEY1
// Finished get KEY: DUMMY_KEY1, VALUE: DUMMY_VALUE2
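In the example above, refresh itself blocks the refreshing thread, because the default CacheLoader.reload simply calls load synchronously. To make the refresh truly asynchronous, reload can be overridden to hand the work to an executor and return a ListenableFuture; callers keep seeing the old value until the future completes. A sketch using Guava's ListeningExecutorService:

```java
import java.util.concurrent.Callable;
import java.util.concurrent.Executors;
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.common.util.concurrent.ListenableFuture;
import com.google.common.util.concurrent.ListeningExecutorService;
import com.google.common.util.concurrent.MoreExecutors;

public class AsyncReloadDemo {
    static ListeningExecutorService executor =
            MoreExecutors.listeningDecorator(Executors.newFixedThreadPool(2));

    static LoadingCache<String, String> cache = CacheBuilder.newBuilder()
            .build(new CacheLoader<String, String>() {
                @Override
                public String load(String key) throws Exception {
                    Thread.sleep(1000); // simulated slow source
                    return "VALUE_" + System.currentTimeMillis();
                }

                @Override
                public ListenableFuture<String> reload(final String key, String oldValue) {
                    // Refresh runs on the executor; callers keep seeing oldValue meanwhile.
                    return executor.submit(new Callable<String>() {
                        @Override
                        public String call() throws Exception {
                            return load(key);
                        }
                    });
                }
            });

    public static void main(String[] args) throws Exception {
        String first = cache.get("KEY_1");   // the initial load still blocks
        cache.refresh("KEY_1");              // now returns immediately
        String during = cache.get("KEY_1");  // old value while reload is in flight
        System.out.println(first.equals(during)); // true
        executor.shutdown();
    }
}
```

The same reload override is what makes refreshAfterWrite on the builder non-blocking for readers.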
3> CacheBuilder
The CacheBuilder class provides a way to obtain Cache and LoadingCache instances via the Builder pattern. There are many options that can be specified on the cache being built; rather than listing all of them, we will walk through a few examples.
Eg1:
package edu.xmu.guava.cache;

import java.util.concurrent.TimeUnit;

import org.junit.Test;

import com.google.common.base.Ticker;
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.common.cache.RemovalListener;
import com.google.common.cache.RemovalNotification;

public class CacheBuilderTest {
    @Test
    public void buildCacheTest() throws Exception {
        LoadingCache<String, String> cache = CacheBuilder.newBuilder()
                .expireAfterWrite(2, TimeUnit.SECONDS)
                .ticker(Ticker.systemTicker())
                .removalListener(new RemovalListener<String, String>() {
                    @Override
                    public void onRemoval(RemovalNotification<String, String> notification) {
                        System.out.println(String.format("[%s] is removed from cache", notification));
                    }
                }).build(new CacheLoader<String, String>() {
                    @Override
                    public String load(String key) throws Exception {
                        return key + System.currentTimeMillis();
                    }
                });
        String value = cache.get("Hello");
        System.out.println(value);
        value = cache.get("Hello");
        System.out.println(value);
        Thread.sleep(2100);
        System.out.println(cache.size());
        Thread.sleep(1100);
        value = cache.get("Hello");
        System.out.println(value);
    }
}
// output: The new value is created after 3100ms instead of 2000ms, and when we read the
// size at 2100ms it is 1 instead of 0.
// Hello1409057402516
// Hello1409057402516
// 1
// [Hello=Hello1409057402516] is removed from cache
// Hello1409057405725
expireAfterWrite: When the duration is zero, this method hands off to maximumSize(0), ignoring any otherwise-specified maximum size or weight. This can be useful in testing, or to disable caching temporarily without a code change. Note that expireAfterWrite does not remove an entry automatically the moment its duration expires; the entry is removed when it is touched again and (currentTime - lastWriteTime > duration).
ticker: Specifies a nanosecond-precision time source for use in determining when entries should be expired. By default, System.nanoTime is used. The primary intent of this method is to facilitate testing of caches which have been configured with expireAfterWrite or expireAfterAccess.
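To illustrate the testing use case, here is a sketch of a hand-rolled ticker we can advance manually (the class name ManualTicker is made up; guava-testlib ships a similar FakeTicker). With it, expiration can be tested without any real waiting:

```java
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;
import com.google.common.base.Ticker;
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

public class ManualTickerDemo {
    // A ticker whose clock only moves when we tell it to
    static class ManualTicker extends Ticker {
        private final AtomicLong nanos = new AtomicLong();

        void advance(long duration, TimeUnit unit) {
            nanos.addAndGet(unit.toNanos(duration));
        }

        @Override
        public long read() {
            return nanos.get();
        }
    }

    public static void main(String[] args) {
        ManualTicker ticker = new ManualTicker();
        Cache<String, String> cache = CacheBuilder.newBuilder()
                .expireAfterWrite(2, TimeUnit.SECONDS)
                .ticker(ticker)
                .build();
        cache.put("KEY_1", "VALUE_1");
        System.out.println(cache.getIfPresent("KEY_1")); // VALUE_1

        ticker.advance(3, TimeUnit.SECONDS); // no Thread.sleep needed
        System.out.println(cache.getIfPresent("KEY_1")); // null: entry expired
    }
}
```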
Eg2:
@Test
public void maxSizeTest() throws ExecutionException {
    LoadingCache<String, String> cache = CacheBuilder.newBuilder()
            .maximumSize(3L)
            .removalListener(new RemovalListener<String, String>() {
                @Override
                public void onRemoval(RemovalNotification<String, String> notification) {
                    System.out.println(String.format("[%s] is removed from cache", notification));
                }
            }).build(new CacheLoader<String, String>() {
                @Override
                public String load(String key) throws Exception {
                    return key + "_" + System.currentTimeMillis();
                }
            });
    for (int i = 0; i < 12; i++) {
        System.out.println(cache.get(String.valueOf(i % 5)));
    }
}
// output:
// 0_1409058469045
// 1_1409058469046
// 2_1409058469046
// [0=0_1409058469045] is removed from cache
// 3_1409058469046
// [1=1_1409058469046] is removed from cache
// 4_1409058469055
// [2=2_1409058469046] is removed from cache
// 0_1409058469055
// [3=3_1409058469046] is removed from cache
// 1_1409058469055
// [4=4_1409058469055] is removed from cache
// 2_1409058469055
// [0=0_1409058469055] is removed from cache
// 3_1409058469055
// [1=1_1409058469055] is removed from cache
// 4_1409058469055
// [2=2_1409058469055] is removed from cache
// 0_1409058469056
// [3=3_1409058469055] is removed from cache
// 1_1409058469056
maximumSize: Least-recently-used (LRU) entries are subject to removal as the size of the cache approaches the maximum, not necessarily only when the maximum is met or exceeded. That means if we set maximumSize = 100, some entries might be removed when the size of the cache is 98 or even smaller. When the size is zero, elements will be evicted immediately after being loaded into the cache; this can be useful in testing, or to disable caching temporarily without a code change. This feature cannot be used in conjunction with maximumWeight.