People learn associations between attentional demands (e.g., conflict likelihood) and predictive cues, then reactively retrieve relatively relaxed or focused control settings upon recurrence of a predictive cue, a process referred to as learning-guided control. A key theoretical question is whether learning-guided control transfers to novel situations in which the cues are no longer predictive of attentional demands. Using a picture-word Stroop task, we examined whether learning-guided control settings acquired while responding with one response modality (i.e., manual keypress) were evident when subsequent transfer items required a different response modality (i.e., vocal response). For training items, an item-specific proportion congruence manipulation was employed such that some predictive cues (pictures) were mostly congruent and others were mostly incongruent. Transfer items were visually identical to training items, but critically all cues were 50% congruent. In Experiment 1, in which training occurred prior to a separate transfer phase, we did not observe transfer. In Experiment 2, we intermixed manual-modality training items and vocal-modality transfer items within blocks throughout the experiment; in this case, transfer was observed. These novel findings demonstrate that learning-guided control settings can generalize from one response modality to another under select conditions. We discuss the roles of modality-specific processes and of boundaries between training and transfer items in modulating transfer, as well as implications for automaticity.