"Are you a feminist?"
For better or worse, it's a question an increasing number of Hollywood starlets are asked these days. And there is, indeed, a "worse" side — oftentimes, it's sadly just low-hanging clickbait from journalists who pose it exclusively to female celebrities, without preamble or explanation.
But there are positives born from the question, too. For starters, the fact that a celebrity's stance on feminism is considered clickbait at all is, in a weird way, encouraging. The writer's intentions may be commercial, but it also means discussions of feminism are now mainstream enough to be promoted that way. And that's something society flat-out hasn't seen before. Additionally, celebrities' answers create opportunities to clarify and circulate the term's truest meaning.
Sure, it'd be just nifty if the bulk of female celebrities proudly self-identified as feminist, making the ideology that much more accessible in our celeb-obsessed society. But if they aren't feminist, fine — everyone is entitled to their own dogmas (Kendall Jenner, coincidentally, is an ace example of how to articulate that). What tends to be noteworthy, though, is what these women say AFTER denying they're feminist. Generally, they either list their *actual* beliefs, which happen to be concordant with the definition of feminism, or cite a range of well-worn stereotypes.
They shouldn't be harshly criticized for sharing these views — again, most were prodded by agenda-ridden journalists. Instead, their words can serve as a reminder that popular misconceptions about feminism persist, and as an educational moment, as we — calmly, thoughtfully — disprove them.
Below are 13 female celebrities whose definitions of feminism are a wee bit off.