I have been putting off this post for a while because of its sensitive nature, and I want to open the discussion respectfully. The mainstream boom in Asian beauty trends has exposed us to different worldviews on the beauty industry. We were once sold an ideal created largely in Europe, but now we find ourselves immersed in standards of beauty from around the world. Unfortunately, those standards are still very limited (as standards usually are). Black and brown skin still needs pride movements and representation; tanned skin is accepted and desired; but in some parts of the world, light skin is still the winner.
Where does skin whitening stem from?