INTRODUCTION: The Internet of Things (IoT) has transformed daily life by interconnecting digital devices through integrated sensors, software, and connectivity. Although IoT devices excel at real-time data collection and decision-making, their performance on complex tasks is constrained by limited power, computational resources, and time. To address this, IoT is often combined with cloud computing (CC) to meet time-sensitive demands; however, the distance between IoT devices and cloud servers introduces latency.

OBJECTIVES: To mitigate these latency challenges, Mobile Edge Computing (MEC) is integrated with IoT. MEC offers cloud-like services through servers located at the network edge, close to IoT devices, improving responsiveness by reducing transmission and processing latency. This study aims to develop a solution that optimizes task offloading in IoT-MEC environments, addressing latency, uneven workloads, and network congestion.

METHODS: This research introduces the Game Theory-Based Task Latency (GTBTL-IoT) algorithm, a two-stage task offloading approach that combines Game Matching Theory and Data Partitioning Theory. First, the algorithm matches each IoT device with its nearest MEC server using game matching. Second, it splits each task into two parts and assigns them to the local device and the matched MEC server for parallel computation, optimizing resource usage and balancing the workload.

RESULTS: GTBTL-IoT outperforms existing algorithms, including the Delay-Aware Online Workload Allocation (DAOWA) algorithm, the Fuzzy Algorithm (FA), and Dynamic Task Scheduling (DTS), by an average of 143.75 ms under a 5.5 s system deadline, and reduces task transmission latency, computation latency, and overall job offloading time by 59%. Evaluated in an ENIGMA-based simulation environment, GTBTL-IoT computes requests in real time with optimal resource usage, ensuring efficient and balanced task execution in the IoT-MEC paradigm.

CONCLUSION: The GTBTL-IoT algorithm presents a novel approach to optimizing task offloading in IoT-MEC environments. By leveraging Game Matching Theory and Data Partitioning Theory, it reduces latency, balances workloads, and optimizes resource usage. Its superior performance over existing methods underscores its potential to enhance the responsiveness and efficiency of IoT devices in real-world applications, ensuring seamless task execution in IoT-MEC systems.
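The abstract names two stages: game-matching of devices to servers, then task partitioning. As a rough illustration of the first stage, the sketch below matches each device to its nearest server, using Euclidean distance as a proxy for expected transmission latency. All class names, fields, and the distance-based preference are illustrative assumptions for this sketch, not the paper's actual matching formulation.

```python
import math
from dataclasses import dataclass

# Hypothetical models of the two parties in the matching game; names,
# fields, and coordinates are assumptions made for illustration only.
@dataclass
class MECServer:
    name: str
    x: float
    y: float

@dataclass
class IoTDevice:
    name: str
    x: float
    y: float

def match_to_nearest(devices, servers):
    """Stage 1 (sketch): each device 'proposes' to the server it ranks
    highest; here the ranking is simply Euclidean distance, standing in
    for expected transmission latency."""
    matching = {}
    for dev in devices:
        nearest = min(servers, key=lambda s: math.hypot(dev.x - s.x, dev.y - s.y))
        matching[dev.name] = nearest.name
    return matching

# Example: two devices, each matched to the closer of two edge servers.
servers = [MECServer("mec-a", 0.0, 0.0), MECServer("mec-b", 100.0, 0.0)]
devices = [IoTDevice("iot-1", 10.0, 5.0), IoTDevice("iot-2", 90.0, -5.0)]
print(match_to_nearest(devices, servers))  # {'iot-1': 'mec-a', 'iot-2': 'mec-b'}
```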
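For the second stage, the abstract describes splitting each task between local and MEC execution so the two parts run in parallel. The sketch below shows one common way such a split can be chosen: picking the locally executed fraction so that the local and offloaded parts finish at the same time, which reduces to an exact 50/50 split when the two sides are symmetric. The closed-form expression and all parameter names are assumptions for illustration, not GTBTL-IoT's published partitioning rule.

```python
def partition_task(cycles, local_rate, mec_rate, tx_latency):
    """Stage 2 (sketch): split a task of `cycles` CPU cycles between the
    device (`local_rate`, cycles/s) and its matched MEC server
    (`mec_rate`), with `tx_latency` seconds to ship the offloaded part.
    With parallel execution, completion time is minimized when both
    parts finish together:

        alpha*cycles/local_rate = tx_latency + (1 - alpha)*cycles/mec_rate

    Solving for alpha gives the locally executed fraction below.
    """
    alpha = local_rate * (tx_latency * mec_rate + cycles) / (cycles * (local_rate + mec_rate))
    alpha = min(max(alpha, 0.0), 1.0)  # clamp: shares must stay in [0, 1]
    local_time = alpha * cycles / local_rate
    mec_time = tx_latency + (1.0 - alpha) * cycles / mec_rate
    return alpha, max(local_time, mec_time)

# A 2-gigacycle task, 1 GHz device, 4 GHz edge server, 50 ms uplink:
alpha, t = partition_task(cycles=2e9, local_rate=1e9, mec_rate=4e9, tx_latency=0.05)
print(f"keep {alpha:.0%} locally, finish in {t:.2f} s")  # keep 22% locally, finish in 0.44 s
```

With a zero-latency link and equal compute rates, the formula returns alpha = 0.5, i.e. the literal "two halves" the abstract mentions; asymmetric rates or a slower link shift work toward the faster side.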