This paper introduces ESTransformer, a hybrid forecasting model that combines Holt-Winters exponential smoothing with a transformer neural network for time series prediction. Exponential smoothing first extracts the predictable components of each series, and a transformer then models the remaining complexity. This design addresses the limitations of earlier hybrid methods built on recurrent neural networks, which can be slow to train and struggle to capture long-range patterns because of their sequential processing. Evaluated on the M4 competition dataset of 100,000 time series spanning six sampling frequencies (hourly to yearly), ESTransformer achieves forecasting accuracy comparable to or better than the state-of-the-art ESRNN model, with particular gains on the daily and quarterly subsets. The model captures complex trend and seasonal patterns, marking a modest but foundational step toward transformer-based hybrid forecasting.
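As a sketch of the decomposition described above, one plausible formulation (the exact parameterization here is our assumption, following the multiplicative Holt-Winters form popularized by ESRNN) maintains a per-series level $l_t$ and seasonal factors $s_t$ with period $m$, and fits the transformer to a normalized residual series $x_t$:
\begin{align}
  l_t &= \alpha \,\frac{y_t}{s_{t-m}} + (1-\alpha)\, l_{t-1}, \\
  s_t &= \gamma \,\frac{y_t}{l_t} + (1-\gamma)\, s_{t-m}, \\
  x_t &= \log\!\left(\frac{y_t}{l_t\, s_t}\right),
\end{align}
where $\alpha$ and $\gamma$ are per-series smoothing parameters. A point forecast at horizon $h$ would then be recovered by reversing the normalization, e.g. $\hat{y}_{t+h} = \exp\!\left(\hat{x}_{t+h}\right)\, l_t\, s_{t+h-m}$, with $\hat{x}_{t+h}$ produced by the transformer.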