Does going to college make people more liberal? Probably yes, but it’s complicated… For decades, US adults with degrees have held more left-leaning views on social issues, but not on economic ones. And, until the 2010s, grads did not *identify* as more liberal than non-grads.