Who the fuck cares about Oscars? Nobody watches the damn ceremonies anymore (Will Smith bitch-slapping Chris Rock was the most entertaining thing to come out of it in years), general audiences have grown savvy that Oscar-winning movies equate to boring and pretentious, and movie stars are becoming less relevant every day. I see winning an Oscar as a detriment.
The Chris Rock slap was staged to drive up ratings.