Answer
Women began wearing pants in the workplace more commonly during the 1960s and 1970s, influenced by the women's liberation movement and changing fashion norms. However, it wasn't until the 1990s that pants became widely accepted for women in most professional settings, with some workplaces relaxing their dress codes earlier than others.