Evaluating the Efficiency of Caching Strategies in Reducing Application Latency

Authors

  • Mikita Piastou, Full-Stack Developer, Emplifi, Calgary, AB, Canada

DOI:

https://doi.org/10.55662/JST.2023.4606

Keywords:

Application latency, caching mechanisms, in-memory caching, file-based caching, database caching, performance optimization, response time, latency reduction

Abstract

This paper evaluates the efficiency of various caching strategies in reducing application latency. A test application was developed to measure latency under various conditions using logging and profiling tools. The test scenarios simulated high traffic loads, large data sets, and frequent access patterns. The test application was implemented in Java, and t-tests and ANOVA were conducted to assess the statistical significance of the results. The findings showed that in-memory caching achieved the largest latency reduction, improving response time by up to 62.6% compared to non-cached scenarios. File-based caching decreased request processing latency by about 36.6%, while database caching provided an improvement of 55.1%. These results underscore the substantial benefits of applying appropriate caching mechanisms. In-memory caching proved most efficient for applications requiring high-speed data access, while file-based and database caching were more useful in certain content-heavy scenarios. The study offers developers guidance on selecting and implementing suitable caching mechanisms to improve application responsiveness and efficiency. Recommendations for further improvement include hybrid caching strategies, refined eviction policies, and integration of caching with edge computing for even better performance.
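To make the comparison concrete, the following is a minimal Java sketch of the kind of in-memory caching layer the study evaluates against non-cached access. The class and key names (InMemoryCache, "user:42") and the 50 ms simulated backing-store delay are illustrative assumptions, not code or parameters from the paper; the sketch only shows how a cache hit avoids the backing-store cost, which is the effect the reported latency reductions quantify.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Minimal in-memory caching layer of the kind compared against non-cached
// access in the paper. Names and timings below are illustrative only.
public class InMemoryCache<K, V> {
    private final Map<K, V> store = new ConcurrentHashMap<>();
    private final Function<K, V> loader; // backing source, e.g. a database call

    public InMemoryCache(Function<K, V> loader) {
        this.loader = loader;
    }

    // Returns the cached value, loading and caching it on a miss.
    public V get(K key) {
        return store.computeIfAbsent(key, loader);
    }

    public static void main(String[] args) {
        // Simulated slow backing store: each uncached read costs ~50 ms.
        InMemoryCache<String, String> cache = new InMemoryCache<>(key -> {
            try {
                Thread.sleep(50);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            return "value-for-" + key;
        });

        // Crude latency comparison in the spirit of the paper's profiling:
        // the first access misses and pays the backing-store cost,
        // the second access hits the in-memory store.
        long t0 = System.nanoTime();
        cache.get("user:42");
        long missNanos = System.nanoTime() - t0;

        long t1 = System.nanoTime();
        cache.get("user:42");
        long hitNanos = System.nanoTime() - t1;

        System.out.printf("miss: %.2f ms, hit: %.3f ms%n",
                missNanos / 1e6, hitNanos / 1e6);
    }
}

In the study's measurements, the t-test and ANOVA comparisons mentioned in the abstract would be computed over many such timed requests per scenario, rather than over a single miss/hit pair as in this sketch.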

Published

06-11-2023

How to Cite

Piastou, M. “Evaluating the Efficiency of Caching Strategies in Reducing Application Latency”. Journal of Science & Technology, vol. 4, no. 6, Nov. 2023, pp. 83-98, doi:10.55662/JST.2023.4606.
License Terms

Ownership and Licensing:

Authors of research papers submitted to the Journal of Science & Technology retain the copyright of their work while granting the journal certain rights. Authors maintain ownership of the copyright and grant the journal the right of first publication. Simultaneously, authors agree to license their research papers under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) License.

License Permissions:

Under the CC BY-NC-SA 4.0 License, others are permitted to share and adapt the work, as long as proper attribution is given to the authors and acknowledgement is made of the initial publication in the Journal of Science & Technology. This license allows for the broad dissemination and utilization of research papers.

Additional Distribution Arrangements:

Authors are free to enter into separate contractual arrangements for the non-exclusive distribution of the journal's published version of the work. This may include posting the work to institutional repositories, publishing it in journals or books, or other forms of dissemination. In such cases, authors are requested to acknowledge the initial publication of the work in the Journal of Science & Technology.

Online Posting:

Authors are encouraged to share their work online, including in institutional repositories, disciplinary repositories, or on their personal websites. This permission applies both prior to and during the submission process to the Journal of Science & Technology. Online sharing enhances the visibility and accessibility of the research papers.

Responsibility and Liability:

Authors are responsible for ensuring that their research papers do not infringe upon the copyright, privacy, or other rights of any third party. The Journal of Science & Technology and The Science Brigade Publishers disclaim any liability or responsibility for any copyright infringement or violation of third-party rights in the research papers.
