Optimizing Content Caching in Vehicular Networks: A Gale–Shapley Approach

In an article published in the journal Scientific Reports, researchers proposed a content caching scheme for fog nodes in 5G-enabled vehicular networks. They used a modified Gale–Shapley algorithm to optimize content placement on fog nodes based on the preferences of content providers and vehicles, aiming to reduce content downloading time and improve the quality of service for vehicular applications.

Study: Optimizing Content Caching in Vehicular Networks: A Gale–Shapley Approach. Image credit: metamorworks/Shutterstock

Background

Vehicular networks are emerging as a key technology for smart cities, enabling applications like traffic management, safety, and infotainment. Vehicular networks consist of vehicles equipped with wireless devices that can communicate with each other (V2V), with roadside units (RSUs) (V2I), or with other entities (V2X). RSUs are devices that provide wireless connectivity and caching capacity for vehicles. Fog computing is a paradigm that extends cloud computing to the network edge, bringing computation and storage closer to the end users. Fog nodes are RSUs with enhanced caching and processing capabilities that can serve the content requests of vehicles more efficiently than remote servers.

One of the main challenges in vehicular networks is providing efficient content delivery to vehicles, which have high and dynamic demands for various types of content, such as traffic information, weather forecasts, music, and video streaming. Fetching these contents from remote servers can cause high delay and heavy bandwidth and energy consumption. Caching popular content on fog nodes can therefore significantly improve the performance and user experience of vehicular networks. However, caching all requested contents on fog nodes is not feasible due to their limited capacity and the large number and size of contents. Hence, an optimal content caching scheme is needed to meet the requirements of both content providers and vehicles.

About the Research

In the present paper, the authors designed an efficient content caching technique to cache the most popular and relevant contents on fog nodes and thereby meet the demands of most vehicles. Their approach considered the cost and revenue of content providers, who are willing to place their content on fog nodes for a certain price. It used a modified version of Gale–Shapley, a well-known stable matching algorithm, to assign contents to fog nodes based on their preference lists. The preference lists are determined by a utility function that considers the following factors:

  • Content popularity: The ratio of content requests to the total number of requests in a fog node’s coverage area.
  • Vehicle connectivity: The remaining time that a vehicle stays in the fog node’s coverage area.
  • Channel quality: The signal-to-noise ratio between the vehicle and the fog node.

The utility function of a content provider for placing its content on a fog node is based on the cost of caching and the expected revenue from content retrieval. The utility function of a fog node for caching a content is the inverse of the product of the three factors above; the lower the utility value, the higher the preference.
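The paper's exact formulas are not reproduced above; as a rough illustration, the preference utilities might be computed as follows (all names, numbers, and the specific functional forms here are assumptions for illustration only):

```python
# Illustrative sketch of the preference utilities described above.
# The exact functional forms used in the paper are not given in this
# summary; these formulas are assumptions for illustration only.

def content_popularity(requests_for_content: int, total_requests: int) -> float:
    """Ratio of requests for this content to all requests in the fog node's coverage area."""
    return requests_for_content / total_requests

def fog_node_utility(popularity: float, connectivity_s: float, snr: float) -> float:
    """Inverse of the product of the three factors; lower utility => higher preference."""
    return 1.0 / (popularity * connectivity_s * snr)

def provider_utility(caching_cost: float, expected_revenue: float) -> float:
    """Providers weigh the cost of caching against the expected retrieval revenue
    (modeled here as net cost, so lower is better)."""
    return caching_cost - expected_revenue

# A fog node ranks contents in ascending utility order (most preferred first).
contents = {
    "traffic_info": fog_node_utility(0.4, 120.0, 15.0),  # popular, long dwell, good SNR
    "music":        fog_node_utility(0.1, 60.0, 10.0),   # less popular, weaker link
}
preference_list = sorted(contents, key=contents.get)     # ["traffic_info", "music"]
```

With these toy numbers, the popular, well-connected content gets the lower utility value and therefore tops the fog node's preference list, matching the "lower utility, higher preference" convention described above.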

The proposed scheme works as follows:

  • The vehicles send their content requests to the traffic control center (TCC), a central entity that manages the vehicular network.
  • The TCC calculates the utility function for each content provider and prepares their preference list for fog nodes.
  • The TCC also computes the utility function for each fog node and prepares its preference list for content.
  • The TCC applies the modified Gale–Shapley algorithm to match contents to fog nodes based on their preference lists. The algorithm can assign multiple contents to a fog node, continuing until the node's capacity is full or its preference list is exhausted.
  • The TCC informs the fog nodes and the content providers about the content placement and the caching price.
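The matching step above can be sketched as a many-to-one deferred-acceptance loop in the Gale–Shapley style, in which contents (via their providers) propose to fog nodes and each node tentatively keeps its most-preferred matches up to its caching capacity. All names and data below are illustrative assumptions; the paper's modified algorithm may differ in its details:

```python
# Illustrative many-to-one deferred-acceptance (Gale-Shapley style) matching:
# contents propose to fog nodes in preference order; a node tentatively accepts
# up to its capacity, bumping its least-preferred tentative match when full.
# Names and data are illustrative assumptions, not the paper's exact algorithm.

def match_contents_to_fog_nodes(content_prefs, node_prefs, capacity):
    # content_prefs: content -> ordered list of fog nodes (most preferred first)
    # node_prefs:    node -> ordered list of contents (most preferred first)
    # capacity:      node -> max number of contents it can cache
    rank = {n: {c: i for i, c in enumerate(p)} for n, p in node_prefs.items()}
    matched = {n: [] for n in node_prefs}        # node -> tentative matches
    next_choice = {c: 0 for c in content_prefs}  # next node index to propose to
    free = list(content_prefs)
    while free:
        c = free.pop()
        if next_choice[c] >= len(content_prefs[c]):
            continue                             # preference list exhausted: unplaced
        n = content_prefs[c][next_choice[c]]
        next_choice[c] += 1
        if c not in rank[n]:
            free.append(c)                       # node does not want this content
            continue
        matched[n].append(c)
        if len(matched[n]) > capacity[n]:        # over capacity: bump the worst match
            worst = max(matched[n], key=lambda x: rank[n][x])
            matched[n].remove(worst)
            free.append(worst)                   # bumped content proposes elsewhere
    return matched

content_prefs = {"A": ["f1", "f2"], "B": ["f1", "f2"], "C": ["f1"]}
node_prefs = {"f1": ["A", "C", "B"], "f2": ["B", "A"]}
placement = match_contents_to_fog_nodes(content_prefs, node_prefs, {"f1": 2, "f2": 1})
# f1 keeps its two most-preferred contents {A, C}; bumped content B lands on f2.
```

On this toy data, fog node f1 ends up caching its two most-preferred contents while the bumped content falls back to its next choice, which is the stability property that Gale–Shapley-style matching provides.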

Furthermore, the authors evaluated the performance of the devised method through simulations and compared it with two existing schemes: a one-to-one stable matching scheme and a first-come-first-served (FCFS) scheme. They utilized the following metrics to measure the performance:

  • Percentage of contents cached: The ratio of the number of contents cached to the number of requested contents by vehicles.
  • Downloading time: The time required for a vehicle to download content from a fog node or a remote server.
  • Downloaded data: The amount of data downloaded by vehicles from fog nodes or remote servers.
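As a toy illustration of how the first two metrics could be computed (all numbers, rates, and names below are made up for the example):

```python
# Toy illustration of the evaluation metrics above; all values are assumptions.

# Percentage of contents cached: cached contents over requested contents.
requested = {"A", "B", "C", "D"}
cached = {"A", "C"}
percent_cached = 100.0 * len(cached & requested) / len(requested)  # 50.0 %

# Downloading time for one content: fog-node hit vs. remote-server fetch.
content_size_mb = 40.0
fog_rate_mbps, server_rate_mbps = 20.0, 4.0        # illustrative link rates
t_fog = content_size_mb * 8 / fog_rate_mbps        # 16.0 s from a fog node
t_server = content_size_mb * 8 / server_rate_mbps  # 80.0 s from a remote server
```

Even in this crude sketch, a cache hit at the fog node cuts the downloading time by the ratio of the two link rates, which is the effect the evaluation measures at scale.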

Research Findings

The outcomes demonstrated that the developed scheme outperformed the existing schemes on all metrics. It cached more popular content on fog nodes, which reduced the downloading time and increased the downloaded data. Additionally, it considered the preferences of both content providers and vehicles, ensuring fair and efficient content placement. Furthermore, the new scheme proved robust to variations in the number of fog nodes, the caching capacity of fog nodes, and the number of content requests.

The newly presented technique could be applied to scenarios such as smart transportation, infotainment, and mobile edge computing, providing timely and accurate traffic information, navigation guidance, diverse content, and offloading computation-intensive tasks to fog nodes.

Conclusion

In summary, the novel scheme demonstrated its capability for content caching on fog nodes in 5G-enabled vehicular networks. It effectively improved the quality of service and reduced delays, facilitating better content delivery to vehicles and increased revenue for content providers in smart cities. The simulations highlighted the superiority of the new scheme in content caching, downloading time, and downloaded data compared to the existing techniques.

The researchers acknowledged limitations and suggested that future research could focus on enhancing data rates and channel quality for vehicular communication, particularly in high-mobility scenarios and urban environments. Additionally, they recommended exploring alternative matching algorithms to optimize content caching and accommodate dynamic changes in content popularity and vehicle connectivity.


Written by

Muhammad Osama

Muhammad Osama is a full-time data analytics consultant and freelance technical writer based in Delhi, India. He specializes in transforming complex technical concepts into accessible content. He has a Bachelor of Technology in Mechanical Engineering with specialization in AI & Robotics from Galgotias University, India, and he has extensive experience in technical content writing, data science and analytics, and artificial intelligence.

Citations

Osama, Muhammad. (2024, February 21). Optimizing Content Caching in Vehicular Networks: A Gale–Shapley Approach. AZoAi. Retrieved on November 25, 2024 from https://www.azoai.com/news/20240221/Optimizing-Content-Caching-in-Vehicular-Networks-A-Galee28093Shapley-Approach.aspx.
