Japan, many people believe, is a country of rigid, traditional gender norms, where men work and women are homemakers. But this stereotype is passé. Despite the continued prevalence of traditional gender roles in television shows, Japanese culture has undergone a sea change — most Japanese women now have jobs.
Japan isn’t alone. In recent years, female labor force participation has been rising across almost all industrialized countries. There’s at least one big exception, though — the U.S.