Do election victories really give presidents a ‘mandate’?

Presidents often claim their election victories give them a mandate. How true is that claim in this hyperpolarized era, when President-elect Trump didn't win 50 percent of the vote?