As an important economic crop, tomato is highly susceptible to diseases that, if not promptly managed, can severely reduce yield and quality, causing significant economic losses. Traditional diagnosis relies on expert visual inspection, which is laborious and prone to subjective bias. In recent years, object detection algorithms have been widely applied to tomato disease detection owing to their efficiency and accuracy, providing reliable technical support for crop disease identification. In this paper, we propose BED-YOLO, an improved tomato leaf disease detection method based on the YOLOv10n algorithm. We constructed an image dataset covering four common tomato diseases (early blight, late blight, leaf mold, and septoria leaf spot); 65% of the images were collected in the field under natural conditions, and the remainder were drawn from the publicly available PlantVillage dataset. All images were annotated with bounding boxes, and the class distribution was kept relatively balanced to ensure stable training and fair evaluation. First, we replaced the conventional convolutions in the YOLOv10n backbone with Deformable Convolutional Networks (DCN), improving the model's adaptability to overlapping leaves, occlusions, and blurred lesion edges. Second, we added a Bidirectional Feature Pyramid Network (BiFPN) on top of the FPN + PAN structure to optimize feature fusion and improve detection accuracy for small lesion targets. Finally, we integrated the Efficient Multi-Scale Attention (EMA) mechanism into the C2f module, focusing the network on disease regions while suppressing background noise and preserving the integrity of disease features during multi-scale fusion.
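The BiFPN component mentioned above fuses features of the same resolution with learnable, normalized weights rather than a plain sum. As a minimal sketch of that fusion rule (the "fast normalized fusion" introduced with BiFPN, simplified here to scalar features; the function name `fuse_features` and the scalar setting are illustrative assumptions, not the paper's implementation):

```python
def fuse_features(features, weights, eps=1e-4):
    """BiFPN-style fast normalized fusion (illustrative sketch).

    out = sum_i(w_i * f_i) / (sum_i w_i + eps), with each learnable
    weight w_i clamped to be non-negative (ReLU), so every input's
    relative contribution is normalized to lie between 0 and 1.
    """
    w = [max(wi, 0.0) for wi in weights]   # ReLU keeps weights non-negative
    total = sum(w) + eps                   # eps guards against division by zero
    return sum(wi * fi for wi, fi in zip(w, features)) / total

# Example: fuse a top-down pathway feature with a same-level input feature.
fused = fuse_features([2.0, 4.0], [1.0, 3.0])   # close to (1*2 + 3*4) / 4
```

In the full detector these weights are learned per fusion node and the features are whole tensors; the normalization keeps training stable without the cost of a softmax over the weights.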
The experimental results showed that BED-YOLO achieved significant performance improvements over the original model: precision increased from 85.1% to 87.2%, recall from 86.3% to 89.1%, and mean average precision (mAP) from 87.4% to 91.3%. The improved model thus offers higher detection accuracy, better recall, and greater overall robustness. Notably, it performed well on images captured under natural field conditions, making it well suited to intelligent disease monitoring in large-scale agricultural scenarios.