The current study examines the neural mechanisms underlying facial recognition, focusing on how emotional expression and mouth display modulate event-related potential (ERP) waveforms. Forty-two participants categorized faces by gender in one of two experimental setups: one featuring full-face images and another with cropped faces presented against neutral gray backgrounds. The stimulus set comprised 288 images balanced across gender, race/ethnicity, emotional expression (fearful, happy, neutral), and mouth display (closed mouth vs. open mouth with exposed teeth). Results revealed that N170 amplitude was significantly greater for open-mouth (exposed teeth) conditions.
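Since the key dependent measure here is N170 amplitude, the sketch below illustrates one conventional way such a measure is quantified from condition-averaged ERPs: taking the most negative deflection in a 130–200 ms post-stimulus window over occipitotemporal channels. This is a minimal sketch under stated assumptions, not the study's actual pipeline; the sampling rate, epoch timing, channel indices, and window bounds are all illustrative values not specified in this excerpt.

```python
import numpy as np

# A minimal sketch of N170 peak-amplitude extraction, assuming ERPs are
# already averaged per condition into arrays of shape (n_channels, n_samples).
# The sampling rate, epoch start, channel picks, and 130-200 ms search
# window are illustrative assumptions, not values reported in the study.

SFREQ = 500.0                # assumed sampling rate in Hz
TMIN = -0.2                  # assumed epoch start relative to stimulus onset (s)
N170_WINDOW = (0.13, 0.20)   # typical N170 search window (s)

def n170_amplitude(evoked, channel_idx):
    """Return the most negative value in the N170 window, averaged
    over the given (assumed occipitotemporal) channels.

    evoked : ndarray, shape (n_channels, n_samples), in microvolts
    """
    times = TMIN + np.arange(evoked.shape[1]) / SFREQ
    mask = (times >= N170_WINDOW[0]) & (times <= N170_WINDOW[1])
    roi = evoked[channel_idx][:, mask].mean(axis=0)  # average over ROI channels
    return roi.min()  # N170 is a negative deflection, so take the minimum

# Hypothetical usage: compare closed-mouth vs. open-mouth condition averages
# (random placeholder data stands in for real evoked responses).
rng = np.random.default_rng(0)
closed_mouth = rng.normal(size=(64, 400))   # fake 64-channel evoked data
open_mouth = rng.normal(size=(64, 400))
ot_channels = [28, 29, 60, 61]              # assumed P7/P8/PO7/PO8 indices

print(n170_amplitude(closed_mouth, ot_channels))
print(n170_amplitude(open_mouth, ot_channels))
```

In a real analysis, per-participant amplitudes extracted this way would then feed a statistical comparison (e.g., a repeated-measures test of mouth display) to support a claim like the one reported above.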