With rapid advances in artificial intelligence (AI), 5G communications, and robotics, multi-sensor fusion has become a key enabler of high-precision localization for mobile robots operating in dynamic, unstructured environments. This study proposes a novel hybrid fusion framework that combines an Extended Kalman Filter (EKF) with a Recurrent Neural Network (RNN) to address sensor frequency asynchrony, drift accumulation, and measurement noise. The EKF performs real-time statistical estimation for the initial data fusion, while the RNN models temporal dependencies in the fused data, further reducing error and improving accuracy. A complementary fusion mechanism incorporating LiDAR (Light Detection and Ranging) data provides robustness against noise and disturbances. The algorithm is validated through comprehensive simulations on the Gazebo platform, maintaining a localization error within 8 cm across a range of noise levels and dynamic disturbances. The method also outperforms state-of-the-art algorithms, including the Particle Filter (PF) and Graph SLAM, in both accuracy and computational efficiency, with an average runtime of 30.1 ms per frame, fast enough for real-time applications. These results demonstrate that the proposed EKF-RNN framework balances accuracy, robustness, and computational efficiency, contributing to autonomous robotic navigation.