Information-centric networking (ICN) is a promising architecture that has attracted much attention in research on future Internet architectures. As one of the key technologies in ICN, in-network caching can enhance content retrieval at a global scale without requiring any special infrastructure. In this paper, we propose a workload-aware caching policy, LRU-GT, which allows cache nodes to protect newly cached contents for a period of time (the guard time) during which they cannot be replaced. LRU-GT exploits temporal locality and distinguishes contents of different popularity, both of which are characteristics of the workload. Cache replacement is modeled as a semi-Markov process under the Independent Reference Model (IRM) assumption, and a theoretical analysis proves that under LRU-GT popular contents have longer sojourn times in the cache than unpopular ones, and that the value of the guard time affects the cache hit ratio. We also propose a dynamic guard time adjustment algorithm to optimize performance. Simulation results show that LRU-GT reduces the average hop count to retrieve contents and improves the cache hit ratio.
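The core idea of the abstract can be illustrated with a small sketch: an LRU cache in which entries younger than a guard time are exempt from eviction, so a newly cached content gets a chance to accumulate hits before it can be replaced. This is a minimal illustration based only on the abstract, not the paper's actual algorithm; the class name, the injectable clock, and the fallback of evicting the plain-LRU victim when every entry is still protected are all assumptions made here for the sake of a runnable example.

```python
import time
from collections import OrderedDict


class LRUGTCache:
    """Sketch of an LRU cache with a guard time (hypothetical interface):
    entries cached less than `guard_time` seconds ago cannot be evicted."""

    def __init__(self, capacity, guard_time, clock=time.monotonic):
        self.capacity = capacity
        self.guard_time = guard_time
        self.clock = clock  # injectable clock, so the behavior is testable
        # name -> (value, insert_time); OrderedDict keeps LRU order,
        # with the least recently used entry at the front.
        self.entries = OrderedDict()

    def get(self, name):
        if name not in self.entries:
            return None  # cache miss
        self.entries.move_to_end(name)  # refresh LRU position on a hit
        return self.entries[name][0]

    def put(self, name, value):
        now = self.clock()
        if name in self.entries:
            # Update in place; keep the original insertion time,
            # so the guard protects only genuinely new contents.
            self.entries[name] = (value, self.entries[name][1])
            self.entries.move_to_end(name)
            return
        if len(self.entries) >= self.capacity:
            self._evict(now)
        self.entries[name] = (value, now)

    def _evict(self, now):
        # Evict the least recently used entry whose guard time has expired.
        for name, (_, inserted) in self.entries.items():
            if now - inserted >= self.guard_time:
                del self.entries[name]
                return
        # Assumed fallback (not specified in the abstract): if every entry
        # is still protected, fall back to plain LRU and evict the LRU entry.
        self.entries.popitem(last=False)
```

With a guard time of zero this degenerates to plain LRU, which is one way to see why the abstract treats the guard time value as a tuning knob for the hit ratio.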
Qian HU
Beijing University of Posts and Telecommunications
Muqing WU
Beijing University of Posts and Telecommunications
Song GUO
Beijing University of Posts and Telecommunications
Hailong HAN
Beijing University of Posts and Telecommunications
Chaoyi ZHANG
Beijing Forestry University
Qian HU, Muqing WU, Song GUO, Hailong HAN, Chaoyi ZHANG, "Workload-Aware Caching Policy for Information-Centric Networking" in IEICE TRANSACTIONS on Communications,
vol. E97-B, no. 10, pp. 2157-2166, October 2014, doi: 10.1587/transcom.E97.B.2157.
URL: https://globals.ieice.org/en_transactions/communications/10.1587/transcom.E97.B.2157/_p
@ARTICLE{e97-b_10_2157,
author={Qian HU and Muqing WU and Song GUO and Hailong HAN and Chaoyi ZHANG},
journal={IEICE TRANSACTIONS on Communications},
title={Workload-Aware Caching Policy for Information-Centric Networking},
year={2014},
volume={E97-B},
number={10},
pages={2157-2166},
keywords={},
doi={10.1587/transcom.E97.B.2157},
ISSN={1745-1345},
month={October},}
TY - JOUR
TI - Workload-Aware Caching Policy for Information-Centric Networking
T2 - IEICE TRANSACTIONS on Communications
SP - 2157
EP - 2166
AU - Qian HU
AU - Muqing WU
AU - Song GUO
AU - Hailong HAN
AU - Chaoyi ZHANG
PY - 2014
DO - 10.1587/transcom.E97.B.2157
JO - IEICE TRANSACTIONS on Communications
SN - 1745-1345
VL - E97-B
IS - 10
JA - IEICE TRANSACTIONS on Communications
Y1 - October 2014
ER -