In today's data-driven world, where information is one of the most valuable resources, forecasting the behavior of time series collected by modern sensor networks and IoT systems is crucial across various fields, including finance, climatology, and engineering. However, existing neural network models often struggle to forecast time series collected by different sensors due to challenges such as large data volumes, long-term dependencies, noise, and anomalies, all of which can degrade predictive accuracy. This paper aims to improve the accuracy of time series forecasting by proposing an adapted transformer architecture combined with a novel data preprocessing method. The proposed preprocessing technique employs the fast Fourier transform (FFT) to transition from the time domain to the frequency domain, enriching the data with additional frequency-domain features. These features are represented as complex numbers, which increase the informational content available for subsequent analysis and thereby improve forecasting performance. Furthermore, the paper introduces a modified transformer model specifically designed to address the identified challenges in time series prediction. The performance of the proposed model was evaluated on three diverse datasets collected by different sensors, each with its own measurement frequency, data type, and application domain, providing a comprehensive comparison with state-of-the-art models such as LSTM, FFT-LSTM, DeepAR, Transformer, and FFT-Transformer. An extensive evaluation using five distinct performance metrics demonstrates that the proposed model consistently outperforms existing methods, achieving the highest accuracy across all datasets.
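To make the preprocessing idea concrete, the following is a minimal sketch of how frequency-domain features could be derived with the FFT and attached to windowed sensor readings. It is illustrative only: the function name `enrich_with_fft_features`, the windowing convention, and the choice to represent the complex coefficients as separate real and imaginary feature blocks are assumptions for this sketch, not the paper's exact pipeline.

```python
import numpy as np

def enrich_with_fft_features(windows: np.ndarray) -> np.ndarray:
    """Append frequency-domain features to time-domain windows (sketch).

    `windows` has shape (n_windows, window_length). For each window, the
    real FFT is computed and its complex coefficients are split into real
    and imaginary parts, which are concatenated to the original samples.
    """
    # Complex spectrum of each window: shape (n_windows, window_length // 2 + 1)
    spectrum = np.fft.rfft(windows, axis=-1)
    # Represent the complex coefficients as two real-valued feature blocks
    freq_features = np.concatenate([spectrum.real, spectrum.imag], axis=-1)
    # Final feature matrix: original samples followed by frequency features
    return np.concatenate([windows, freq_features], axis=-1)

# Example: 8 sensor windows of 64 samples each
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 64))
enriched = enrich_with_fft_features(x)
print(x.shape, "->", enriched.shape)  # (8, 64) -> (8, 64 + 2 * 33)
```

The enriched windows would then be fed to the forecasting model in place of the raw time-domain windows, so the network sees both the original samples and their spectral representation.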