Online Coded Caching

Mohammad Ali Maddah-Ali, Bell Labs, Alcatel-Lucent, USA
Joint work with Ramtin Pedarsani (UC Berkeley) and Urs Niesen (Bell Labs)

Video on Demand
• Video-on-demand services: Netflix, Amazon, Hulu, Verizon/Comcast, …
• They place significant stress on service providers' networks.
• Caching can be used to mitigate this stress.

Least Recently Used (LRU)
• LRU caching gain: deliver content locally (local gain).
• LRU is approximately optimal for a single cache [Sleator, Tarjan, '85].
• LRU is widely used in industry.
• Policy: cache every uncached requested file; when the cache is full, evict the least recently used file.

Beyond Local Gain [Maddah-Ali and Niesen, "Fundamental Limits of Caching", 2012]
• Example: two users and two files A and B, each split into halves A1, A2 and B1, B2. User 1 caches A1 and B1; user 2 caches A2 and B2. When user 1 requests A and user 2 requests B, the server multicasts the single coded packet A2⊕B1, which serves both users at once.
• Local gain = 0.5; global (coding) gain = 0.5.
• As the number of caches increases, the local gain stays constant, but the global gain scales linearly.
• Efficient online caching must capture the GLOBAL GAIN.

In This Talk: Coded Least Recently Sent (LRS)
• We propose coded LRS to exploit the global gain.
• Cache any uncached requested file: partially, randomly, uniformly, and no matter who requested it.
• When the cache is full: evict the least recently sent file.

Optimality of Coded LRS
• Setting: a set of N equi-popular files, each requested with probability p, and K users.
• Theorem (Coded LRS): coded LRS achieves the local gain, plus a global (coding) gain that scales with K.

Sketch of Proof
• Users' demands split into partially cached (popular) files and uncached files.
• Uncached demands see no caching gain; partially cached demands enjoy the coded caching gain.
• Challenges:
• The load for uncached demands is bounded by a constant.
• The number of uncached demands is governed by a complicated Markov chain.
• The partially cached demands yield a big gain.

Performance Evaluation
• Real-life demand time series extracted from the Netflix Prize data (10 million demands over a one-year period).
• Captures the dynamic variation of the users' demands.
• (Plot: performance vs. size of each isolated cache.)
• Significant gain due to the coded global gain.

Conclusion
• For cache networks, LRU is NOT optimal.
• Introduced online coded caching (coded LRS).
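The single-cache LRU policy from the earlier slide (cache every uncached requested file; when full, evict the least recently used) can be sketched in a few lines; the class name and capacity are illustrative, not from the talk, and the comments note where coded LRS departs from it.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal sketch of classic LRU for a single cache."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.files = OrderedDict()  # insertion order tracks recency of use

    def request(self, name):
        """Serve a request; return True on a local cache hit."""
        if name in self.files:
            self.files.move_to_end(name)      # mark as most recently used
            return True
        # Coded LRS would instead cache a uniformly random *fraction* of
        # the file, and would do so at every cache the server sends the
        # file past, not only at the requesting user's cache.
        if len(self.files) >= self.capacity:
            self.files.popitem(last=False)    # evict least recently used
            # Coded LRS evicts the least recently *sent* file instead.
        self.files[name] = True               # cache the whole file
        return False
```

For example, with capacity 2, the request sequence A, B, A, C, B yields hits only on the third request: C evicts B (least recently used), and the repeat request for B then misses.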
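The two-cache example from the "Beyond Local Gain" slide can also be sketched directly; the byte strings standing in for file halves are illustrative. One coded transmission, A2⊕B1, lets both users complete their requested file from their caches.

```python
# File halves (illustrative contents): file A = A1 + A2, file B = B1 + B2.
A1, A2 = b"aaaa", b"AAAA"
B1, B2 = b"bbbb", b"BBBB"

def xor(x, y):
    """Bitwise XOR of two equal-length byte strings."""
    return bytes(a ^ b for a, b in zip(x, y))

# User 1 caches the first half of every file, user 2 the second half.
cache1 = {"A": A1, "B": B1}
cache2 = {"A": A2, "B": B2}

# User 1 requests A, user 2 requests B.
# The server multicasts the single coded packet A2 xor B1 to both users.
coded = xor(A2, B1)

# Each user cancels the half it already holds in its cache.
user1_A = cache1["A"] + xor(coded, cache1["B"])  # (A2^B1)^B1 = A2
user2_B = xor(coded, cache2["A"]) + cache2["B"]  # (A2^B1)^A2 = B1

assert user1_A == A1 + A2
assert user2_B == B1 + B2
```

Without coding, the server would have to send A2 and B1 separately; the single XOR packet halves that load, which is the global (coding) gain the talk is after.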
• Significant gain over LRU.
• Proved that coded LRS is approximately optimal under some conditions.
• Validated the results on a real-life request time series extracted from the Netflix Prize data.

Further Reading
• Maddah-Ali and Niesen, "Fundamental Limits of Caching", Sept. 2012 (IEEE Trans. on Information Theory, March 2014).
• Maddah-Ali and Niesen, "Distributed Caching Attains Order-Optimal Memory-Rate Trade-offs", Jan. 2013 (to appear in IEEE/ACM Trans. on Networking, 2014).
• Niesen and Maddah-Ali, "Coded Caching with Non-Uniform Demands", Jun. 2013 (submitted to IEEE Trans. on Information Theory).
• Pedarsani, Maddah-Ali, and Niesen, "Online Coded Caching", Nov. 2013 (submitted to IEEE/ACM Trans. on Networking).
• Karamchandani, Niesen, Maddah-Ali, and Diggavi, "Hierarchical Coded Caching", Jan. 2014 (submitted to IEEE Trans. on Information Theory).