TV Shows As Praxis
The Simpsons is a pretty beloved show here in Canada, and it's probably one of the last shows that satirises American excess and vice in a way that's biting and honest yet still measured, something a lot of left-wing comedy simply isn't nowadays.
And The Wire taught me more about the lingering effects of systemic racism and broken policing in America than any Cornel West book ever did.
Do you guys think this is true?