We’ve got good news and bad news… The good news, according to a study by the Center for the Study of Women in Television and Film, is that women are getting more work in the film industry, whether they’re acting or directing. More women directing means, statistically, more female protagonists; by comparison, a mere 13 percent of films written and directed by men have a female lead. Overall, women are taking on more leading roles and positions of authority in Hollywood, and we love it.
The bad news is that, while progress is being made, inequality persists. The majority of women being cast in these films are white. There has been no increase in Latina or Asian actresses landing roles, and the percentage of black women earning leading roles barely saw an increase at all (a ridiculous 2 percent). This isn’t just a feminist issue; it’s another in a string of wake-up calls pointing to a lack of diversity in popular culture, and it needs to be resolved.
The numbers show that casts become more diverse when more diverse people are hired as writers and directors. Maybe, if that happens more often, Hollywood will improve on these depressing statistics. It isn’t fair, or right, to celebrate one group of women without celebrating the rest of the female population in television and film along with them.