Network Bandwidth Optimization: Development Of A Cache Management Model


ABSTRACT

The surge in the use of multimedia content for entertainment, business and education has led to increased demand for Internet services across the globe. An infrastructural deficit in the provision of these services has resulted in poor service quality and slow Internet speeds. The proxy web cache is one piece of Internet infrastructure which, when efficiently managed, improves the service delivered to users, yet existing cache models do not provide enough capability to deliver fast Internet speeds. This research proposed a model that combines three traditional approaches, Least Recently Used (LRU), Least Frequently Used (LFU) and a weight factor, to enhance Internet service. Raw web log data was obtained from the proxy server of the Information Technology and Communication Unit (INTECU), Obafemi Awolowo University, Ile-Ife, covering users' accesses made between 8th and 14th April, 2013. This seven-day (one-week) record, taken both night and day at the peak of the semester when campus activity was highest, captured every day of the week. The data was formatted using Web Log Analyzer and the simulation was performed in MATLAB 2009. A time-complexity comparison of the proposed model with the existing LRU and LFU models showed that the existing models performed better, with the proposed model recording performance reductions of 59.3% against LFU and 190% against LRU when the cache size was 300 GB; the performance varied with cache size. On Byte Hit Ratio, LRU also showed an improvement over the proposed model, which depreciated by 500% relative to LRU at a cache size of 50 GB, while against LFU at the same cache size the depreciation was 80%, and the depreciation grew as the cache size increased from 50 GB to 300 GB. On Hit Ratio, however, the proposed model outperformed both LRU and LFU: with a cache size of 100 GB it performed better by 50% over LRU and by 8.3% over LFU, and the improvement grew with increasing cache size. Hit Ratio, the ratio of the number of web objects found in the cache to the total number of requests made, is the most important of these performance metrics. These results establish that the proposed model would improve Internet services better than the existing LRU and LFU models.
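To illustrate the idea described above, the sketch below shows a proxy-cache replacement policy that scores each cached object by combining recency (LRU), access frequency (LFU) and a weight factor, and that reports the Hit Ratio and Byte Hit Ratio metrics used in the evaluation. This is not the thesis implementation (the simulation was done in MATLAB 2009): the class name HybridCache, the form of the combined score and the choice of object size as the weight factor are assumptions made purely for illustration.

```python
# Minimal sketch of an LRU + LFU + weight-factor replacement policy.
# Assumptions (not from the thesis): the scoring formula and the use of
# object size as the weight factor are illustrative only.
import time


class HybridCache:
    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.used = 0
        self.entries = {}                 # url -> {"size", "freq", "last_access"}
        self.hits = self.requests = 0
        self.hit_bytes = self.request_bytes = 0

    def _score(self, e, now):
        # Lower score = better eviction candidate.
        recency = 1.0 / (1.0 + (now - e["last_access"]))   # LRU component
        frequency = e["freq"]                               # LFU component
        weight = 1.0 / e["size"]                            # assumed size-based weight
        return recency * frequency * weight

    def request(self, url, size):
        now = time.time()
        self.requests += 1
        self.request_bytes += size
        if url in self.entries:                             # cache hit
            self.hits += 1
            self.hit_bytes += size
            entry = self.entries[url]
            entry["freq"] += 1
            entry["last_access"] = now
            return True
        # Cache miss: evict the lowest-scoring objects until the new one fits.
        while self.used + size > self.capacity and self.entries:
            victim = min(self.entries,
                         key=lambda u: self._score(self.entries[u], now))
            self.used -= self.entries.pop(victim)["size"]
        if size <= self.capacity:
            self.entries[url] = {"size": size, "freq": 1, "last_access": now}
            self.used += size
        return False

    def hit_ratio(self):
        # Number of web objects found in the cache over the total requests made.
        return self.hits / self.requests if self.requests else 0.0

    def byte_hit_ratio(self):
        # Bytes served from the cache over the total bytes requested.
        return self.hit_bytes / self.request_bytes if self.request_bytes else 0.0
```

In use, each line of a formatted web log (URL and object size) would be replayed through request(), after which hit_ratio() and byte_hit_ratio() give the two metrics compared in the abstract for different cache sizes.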

KEYWORDS: bandwidth, cache management, networks, optimization, Internet, time complexity
