In Information-Centric Networking (ICN), various routing and caching schemes have been proposed to efficiently utilize in-network caches and reduce network traffic. Most of them assume that the popularity distribution of user-requested content is homogeneous. However, the popularity distribution actually measured on the Internet is reported to exhibit spatial and temporal locality, which can heavily affect caching performance in ICN. Breadcrumbs (BC) routing is a key solution for mitigating the performance degradation caused by spatial locality because it can flexibly discover cached contents located off the path. In this paper, we investigate the spatial effects of BC in depth, revealing where the utilized cached contents are located, how BC discovers these contents, what kinds of contents are found, and how BC fills the locality gap of content popularity. We also focus on the time dimension, i.e., the temporal locality of content popularity, and conduct a comprehensive study of how BC routing can be adapted to the spatiotemporal locality of content popularity in ICN.
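To make the BC mechanism concrete, the following is a minimal sketch of breadcrumb-based forwarding at a single router, assuming the common trail-based design in which a router records, per content, the neighbor the content was last forwarded to and deflects later requests along that trail toward a possibly off-path cached copy. The class names, the TTL value, and the fallback route are illustrative assumptions, not the configuration evaluated in the paper.

```python
# Minimal sketch of Breadcrumbs (BC) forwarding at one router.
# Assumption: each time a content item is delivered downstream, the
# router leaves a "breadcrumb" pointing to that neighbor; a later
# request with a fresh breadcrumb is deflected along the trail instead
# of toward the origin server. All identifiers are illustrative.

import time
from dataclasses import dataclass

BREADCRUMB_TTL = 60.0  # seconds a breadcrumb stays valid (assumed value)

@dataclass
class Breadcrumb:
    downstream: str    # neighbor the content was forwarded to (trail direction)
    created_at: float  # when the breadcrumb was left

class Router:
    def __init__(self, name: str, next_hop_to_server: str):
        self.name = name
        self.next_hop_to_server = next_hop_to_server
        self.breadcrumbs: dict[str, Breadcrumb] = {}  # content id -> breadcrumb

    def on_content_delivered(self, content_id: str, downstream: str) -> None:
        """Leave a breadcrumb each time a content item is forwarded downstream."""
        self.breadcrumbs[content_id] = Breadcrumb(downstream, time.time())

    def route_request(self, content_id: str) -> str:
        """Return the neighbor to forward a request to.

        A fresh breadcrumb deflects the request along the trail (possibly
        off the shortest path to the server); otherwise the request falls
        back to the default route toward the origin server.
        """
        bc = self.breadcrumbs.get(content_id)
        if bc and time.time() - bc.created_at < BREADCRUMB_TTL:
            return bc.downstream          # follow the trail toward a cached copy
        return self.next_hop_to_server    # no fresh trail: go toward the origin
```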
Takuma NAKAJIMA Masato YOSHIMI Celimuge WU Tsutomu YOSHINAGA
Cooperative caching is a key technique for reducing the rapidly growing video-on-demand traffic by aggregating multiple cache storages. Existing strategies periodically calculate a sub-optimal allocation of content caches in the network. Although such techniques can reduce the traffic generated between servers, they come at the cost of a large computational overhead, which prevents the caches from following rapid changes in the access pattern. In this paper, we propose a lightweight scheme for cooperative caching that groups contents and servers with color tags. In our proposal, servers and contents are associated through color tags, with the aim of increasing the effective cache capacity by storing different contents on different servers. In addition to the color tags, we propose a novel hybrid caching scheme that divides the storage area into a colored LFU (Least Frequently Used) area and a no-color LRU (Least Recently Used) area. The colored LFU area stores color-matching contents to increase the cache hit rate, while the no-color LRU area follows rapid changes in access patterns by storing popular contents regardless of their tags. On top of the proposed architecture, we also present a new routing algorithm that exploits the color-tag information to reduce traffic by fetching cached contents from the nearest server. Evaluation results using a backbone network topology show that our color-tag-based caching scheme achieves performance close to the sub-optimal allocation obtained with a genetic algorithm, with only a few seconds of computational overhead. Furthermore, the proposed hybrid caching limits the degradation of the hit rate to only 2.3%, compared with 13.9% for conventional non-colored LFU, which demonstrates the capability of our scheme to follow rapid insertions of new popular contents. Finally, the color-based routing scheme reduces traffic by up to 31.9% compared with shortest-path routing.
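As an illustration of the hybrid structure described above, the sketch below models a single server's cache split into a colored LFU area (accepting only contents whose tag matches the server's color) and a no-color LRU area (accepting any content). The class and parameter names and the simple insertion rule are assumptions made for illustration, not the authors' implementation.

```python
# Minimal sketch of a color-tag hybrid cache: a colored LFU area for
# contents matching the server's color and a no-color LRU area for
# everything else. Identifiers (HybridColorCache, lfu_capacity, ...)
# are illustrative assumptions.

from collections import OrderedDict, defaultdict

class HybridColorCache:
    def __init__(self, server_color: str, lfu_capacity: int, lru_capacity: int):
        self.server_color = server_color
        self.lfu_capacity = lfu_capacity
        self.lru_capacity = lru_capacity
        self.lfu: dict[str, int] = {}                        # content id -> request count
        self.lru: "OrderedDict[str, None]" = OrderedDict()   # recency-ordered ids
        self.freq = defaultdict(int)                         # observed request frequency

    def get(self, content_id: str) -> bool:
        """Return True on a cache hit in either area."""
        self.freq[content_id] += 1
        if content_id in self.lfu:
            self.lfu[content_id] = self.freq[content_id]     # keep LFU count current
            return True
        if content_id in self.lru:
            self.lru.move_to_end(content_id)                 # refresh recency
            return True
        return False

    def insert(self, content_id: str, content_color: str) -> None:
        """Place a fetched content: colored LFU area if tags match, else LRU area."""
        if content_color == self.server_color:
            if len(self.lfu) >= self.lfu_capacity:
                victim = min(self.lfu, key=self.lfu.get)     # evict least frequently used
                del self.lfu[victim]
            self.lfu[content_id] = self.freq[content_id]
        else:
            if len(self.lru) >= self.lru_capacity:
                self.lru.popitem(last=False)                 # evict least recently used
            self.lru[content_id] = None
```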
Qian HU Muqing WU Song GUO Hailong HAN Chaoyi ZHANG
Information-centric networking (ICN) is a promising architecture that has attracted much attention in the area of future Internet architectures. As one of the key technologies in ICN, in-network caching can enhance content retrieval on a global scale without requiring any special infrastructure. In this paper, we propose a workload-aware caching policy, LRU-GT, which protects newly cached contents from replacement for a period of time called the guard time. LRU-GT can exploit temporal locality and distinguish contents of different popularity, both of which are characteristics of the workload. Cache replacement is modeled as a semi-Markov process under the Independent Reference Model (IRM) assumption, and a theoretical analysis proves that, in LRU-GT, popular contents have a longer sojourn time in the cache than unpopular ones and that the value of the guard time affects the cache hit ratio. We also propose a dynamic guard-time adjustment algorithm to optimize performance. Simulation results show that LRU-GT can reduce the average number of hops required to retrieve contents and improve the cache hit ratio.
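The following is a minimal sketch of the guard-time idea, assuming that the guard time is measured from the moment a content is inserted and that, if every cached content is still protected, the new content simply bypasses the cache; these details and all identifiers are illustrative assumptions rather than the exact LRU-GT specification.

```python
# Minimal sketch of an LRU cache with a guard time: a newly inserted
# content cannot be evicted until its guard time expires. The bypass
# fallback when all items are still protected is an assumption, not
# taken from the paper. Names (LRUGTCache, guard_time, ...) are
# illustrative.

import time
from collections import OrderedDict

class LRUGTCache:
    def __init__(self, capacity: int, guard_time: float):
        self.capacity = capacity
        self.guard_time = guard_time
        self.items: "OrderedDict[str, float]" = OrderedDict()  # id -> insertion time

    def get(self, content_id: str) -> bool:
        """Return True on a hit and refresh the item's recency."""
        if content_id in self.items:
            self.items.move_to_end(content_id)
            return True
        return False

    def insert(self, content_id: str) -> None:
        """Cache a content, evicting only items whose guard time has expired."""
        now = time.time()
        if content_id in self.items:
            self.items.move_to_end(content_id)
            return
        if len(self.items) >= self.capacity:
            # Least recently used item that is no longer protected, if any.
            victim = next((cid for cid, t in self.items.items()
                           if now - t >= self.guard_time), None)
            if victim is None:
                return              # all items still protected: bypass (assumed policy)
            del self.items[victim]
        self.items[content_id] = now
```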