No, they didn't. Every single thing that was done just made Hollywood look worse. I don't know how Chris Rock sold them on that, but he let Hollywood smear excrement all over itself. Nobody anywhere thinks Hollywood looks good right now.
The Oscars went out of their way to show they're not racist. Most of the presenters were black. They look bad because they're obviously pandering and giving in to the bullshit accusations. Yeah, I agree they look bad. It's fucking embarrassing for America. You get called racist, and you give in.
freak this country.