Biblio

Filters: Author is Chen, Feifei
2022-01-31
Liu, Ying, Han, Yuzheng, Zhang, Ao, Xia, Xiaoyu, Chen, Feifei, Zhang, Mingwei, He, Qiang.  2021.  QoE-aware Data Caching Optimization with Budget in Edge Computing. 2021 IEEE International Conference on Web Services (ICWS). :324–334.
Edge data caching has attracted tremendous attention in recent years. Service providers can cache data at nearby locations to serve their app users with relatively low latency. The key to enhancing the user experience is to cache data on suitable edge servers so that the service provider's objective is achieved, e.g., minimizing data retrieval latency or minimizing data caching cost. However, Quality of Experience (QoE), which significantly impacts a service provider's caching benefit, has not been adequately considered in existing studies of edge data caching. This is not a trivial issue because QoE and Quality of Service (QoS) are not correlated linearly, which significantly complicates the formulation of cost-effective edge data caching strategies under a caching budget that limits the number of cache spaces that can be hired on edge servers. In this paper, we consider this problem of QoE-aware edge data caching, aiming to optimize users' overall QoE under the caching budget. We first build the optimization model and prove the NP-completeness of the problem. To solve the problem efficiently in large-scale scenarios, we propose a heuristic approach and prove its approximation ratio theoretically. Extensive experiments demonstrate that our MPSG algorithm outperforms state-of-the-art approaches by at least 68.77%.
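To make the setting concrete, the following is a minimal Python sketch of budget-constrained, QoE-aware cache placement. It is not the paper's MPSG algorithm, which is not described here; the Placement fields, the logistic qoe() latency mapping, and the greedy gain-per-cost rule are illustrative assumptions only.

# Illustrative greedy selection of edge caching placements under a budget.
# NOT the paper's MPSG algorithm; it only sketches the kind of QoE-aware,
# budget-constrained optimization the abstract describes. The logistic qoe()
# mapping is a hypothetical stand-in for the non-linear QoE-QoS relationship.

import math
from dataclasses import dataclass

@dataclass
class Placement:
    data_item: str      # which data item to cache
    edge_server: str    # where to cache it
    cost: float         # price of hiring the cache space
    latency_ms: float   # resulting access latency for covered users
    users: int          # number of users served by this placement

def qoe(latency_ms: float) -> float:
    """Hypothetical non-linear QoE model: QoE drops sharply past ~100 ms."""
    return 5.0 / (1.0 + math.exp((latency_ms - 100.0) / 25.0))

def greedy_cache(placements: list[Placement], budget: float) -> list[Placement]:
    """Pick placements with the best QoE gain per unit cost until the budget runs out."""
    ranked = sorted(placements,
                    key=lambda p: p.users * qoe(p.latency_ms) / p.cost,
                    reverse=True)
    chosen, spent = [], 0.0
    for p in ranked:
        if spent + p.cost <= budget:
            chosen.append(p)
            spent += p.cost
    return chosen

if __name__ == "__main__":
    candidates = [
        Placement("video-A", "edge-1", cost=4.0, latency_ms=40.0, users=120),
        Placement("video-A", "edge-2", cost=3.0, latency_ms=90.0, users=80),
        Placement("map-B",   "edge-1", cost=2.0, latency_ms=60.0, users=50),
    ]
    for p in greedy_cache(candidates, budget=6.0):
        print(f"cache {p.data_item} on {p.edge_server} (cost {p.cost})")

A greedy ratio rule of this kind is only a common baseline for budgeted selection; the paper's approximation guarantee applies to its own model and algorithm, not to this sketch.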
2020-02-18
Liu, Ying, He, Qiang, Zheng, Dequan, Zhang, Mingwei, Chen, Feifei, Zhang, Bin.  2019.  Data Caching Optimization in the Edge Computing Environment. 2019 IEEE International Conference on Web Services (ICWS). :99–106.

With the rapid increase in the use of mobile devices in people's daily lives, mobile data traffic has been exploding in recent years. In the edge computing environment, where edge servers are deployed around mobile users, caching popular data on edge servers can ensure mobile users' fast access to that data and reduce the data traffic between mobile users and the centralized cloud. Existing studies consider the data caching problem with a focus on reducing network delay and improving mobile devices' energy efficiency. In this paper, we attack the data caching problem in the edge computing environment from the perspective of service providers, who would like to maximize their revenue from caching their data. This problem is complicated because data caching produces benefits at a cost, and there is usually a trade-off between the two. We formulate the data caching problem as an integer programming problem that maximizes the revenue of the service provider while satisfying a constraint on data access latency. Extensive experiments are conducted on a real-world dataset that contains the locations of edge servers and mobile users, and the results reveal that our approach significantly outperforms the baseline approaches.
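The integer-programming formulation mentioned in the abstract could be sketched along the following lines, here written with the PuLP solver library. The revenue, latency, and capacity numbers are hypothetical placeholders, and the paper's actual model may include additional variables and constraints.

# A plausible integer-programming sketch of revenue-maximizing edge caching.
# Decision variable x[i][j] = 1 means data item i is cached on edge server j.
# All numeric parameters below are illustrative assumptions, not values from
# the paper.

import pulp

data_items = ["d1", "d2"]
edge_servers = ["s1", "s2", "s3"]

revenue = {("d1", "s1"): 8, ("d1", "s2"): 6, ("d1", "s3"): 4,
           ("d2", "s1"): 5, ("d2", "s2"): 7, ("d2", "s3"): 3}
latency = {("d1", "s1"): 20, ("d1", "s2"): 35, ("d1", "s3"): 60,
           ("d2", "s1"): 25, ("d2", "s2"): 30, ("d2", "s3"): 70}
capacity = {"s1": 1, "s2": 1, "s3": 2}   # cache slots per edge server
max_latency = 50                          # latency bound for a cached copy

prob = pulp.LpProblem("edge_data_caching", pulp.LpMaximize)
x = pulp.LpVariable.dicts("cache", (data_items, edge_servers), cat=pulp.LpBinary)

# Objective: maximize the service provider's revenue from cached copies.
prob += pulp.lpSum(revenue[i, j] * x[i][j] for i in data_items for j in edge_servers)

# Each edge server can hold only a limited number of data items.
for j in edge_servers:
    prob += pulp.lpSum(x[i][j] for i in data_items) <= capacity[j]

# Disallow placements that would violate the access-latency constraint.
for i in data_items:
    for j in edge_servers:
        if latency[i, j] > max_latency:
            prob += x[i][j] == 0

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for i in data_items:
    for j in edge_servers:
        if x[i][j].value() and x[i][j].value() > 0.5:
            print(f"cache {i} on {j}")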