The word "empowerment" has become the rallying cry of mainstream feminism, with virtually any act performed enthusiastically by a woman—from washing her hair to posting her bikini photos—now designated as "empowering." But while everyone from Unilever to the Republican Party has embraced the background noise of "empowerment," this frenzy has done almost nothing to change our society's structures or understanding of authority.
Women are still drastically underrepresented anywhere that genuine power resides in the U.S., especially in business and politics . . . By advising women to fight this sexist norm through empowerment—the feeling of inner potency, not the material gain in status—the feminist movement has started to sound like a branch of the self-help industry. Lean in! Adopt power positions! Negotiate a raise! Walk tall! Stop apologizing! Think positive! Be assertive! The message is clear: If you want to feel empowered, you need to be improved.
If we buy into this story—in which feminism is a feel-good anthem and women are to blame for their own oppression—the genuinely powerful woman will remain an exception . . .
—Ruth Whippman, TIME magazine (Whippman is the author of America the Anxious: How Our Pursuit of Happiness is Creating a Nation of Nervous Wrecks, out Oct. 4)
As a non-confrontationalist, I found this nice to read.