The type of response options selected for survey items, how many options are offered, and whether a neutral midpoint is allowed all affect the data obtained from surveys and the interpretations drawn from the results. Further, if subgroups within a population (e.g., racial/ethnic, gender, age) interpret response options differently, this variability can create apparent group differences where none truly exist or mask true differences between groups. In this study, we apply two recursive partitioning procedures to investigate differential item functioning (DIF) in an experiment evaluating seven item response formats (five levels of an agree–disagree [AD] format and two levels of an item-specific [IS] format). Partial credit tree procedures allow multiple covariates to be evaluated without prespecifying the subgroups to be compared. We applied the procedures to items measuring adults’ attitudes toward legal abortion; under global DIF screening approaches, all response formats functioned without DIF across age, gender, race, education, and religion. Item-focused analyses indicated that formats with an odd number of response options were less susceptible to content-based DIF. Taken together, these psychometric properties suggest that, based on the screening procedures conducted in this study, five-point AD and IS formats may be preferable for measuring abortion attitudes.
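To make the partial credit tree idea concrete, the following is a minimal sketch, not the authors' implementation, of the core logic behind this kind of DIF screen: fit a partial credit model (PCM) to polytomous responses, then test whether the item parameters are stable across a candidate covariate split. Partial credit trees as typically implemented (e.g., in R's psychotree package) use score-based parameter-instability tests and recursive splits over many covariates; the sketch below simplifies this to a single binary covariate screened with a likelihood-ratio test, with the PCM fit by joint maximum likelihood. All data are simulated and all variable names are hypothetical.

```python
# Illustrative sketch of a PCM-based DIF screen across one binary covariate.
# Simplifications vs. a real partial credit tree are noted in the lead-in.
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp
from scipy.stats import chi2

rng = np.random.default_rng(0)
N, J, K = 200, 4, 3  # persons, items, thresholds per item (categories 0..K)

def pcm_nll(params, X):
    """Negative joint log-likelihood of the PCM for responses X (n x J)."""
    n = X.shape[0]
    theta = params[:n]                # person abilities
    delta = params[n:].reshape(J, K)  # item step (threshold) parameters
    nll = 0.0
    for j in range(J):
        steps = theta[:, None] - delta[j][None, :]                     # (n, K)
        psi = np.hstack([np.zeros((n, 1)), np.cumsum(steps, axis=1)])  # (n, K+1)
        nll -= (psi[np.arange(n), X[:, j]] - logsumexp(psi, axis=1)).sum()
    return nll + 100.0 * theta.mean() ** 2  # soft anchor: mean ability = 0

def fit_pcm(X):
    """Return the maximized log-likelihood for a (sub)sample."""
    x0 = np.zeros(X.shape[0] + J * K)
    res = minimize(pcm_nll, x0, args=(X,), method="L-BFGS-B")
    return -res.fun

def simulate(theta, delta, group, dif_item=0, dif_shift=0.8):
    """Simulate PCM responses, shifting item `dif_item` for group == 1 (DIF)."""
    X = np.empty((N, J), dtype=int)
    for i in range(N):
        for j in range(J):
            d = delta[j] + (dif_shift if (j == dif_item and group[i] == 1) else 0.0)
            psi = np.concatenate([[0.0], np.cumsum(theta[i] - d)])
            p = np.exp(psi - psi.max())
            X[i, j] = rng.choice(K + 1, p=p / p.sum())
    return X

group = rng.integers(0, 2, size=N)  # hypothetical binary covariate (e.g., gender)
X = simulate(rng.normal(size=N), rng.normal(size=(J, K)) * 0.8, group)

# Global screen: do the item parameters differ across the candidate split?
ll_pooled = fit_pcm(X)
ll_split = fit_pcm(X[group == 0]) + fit_pcm(X[group == 1])
lr = 2.0 * (ll_split - ll_pooled)  # extra item parameters: one set per subgroup
print(f"LR = {lr:.1f}, df = {J * K}, p = {chi2.sf(lr, J * K):.4f}")
```

In an actual partial credit tree, this single comparison is replaced by score-based fluctuation tests evaluated for every covariate (including ordered and continuous ones), the sample is split at the covariate value showing the strongest parameter instability, and the process recurses until no significant instability remains; conditional or marginal maximum likelihood estimation would also be preferred over the joint approach sketched here.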