Racism is not American. It's global. For ages it has been believed that white is desirable, beautiful and pure, while black is ugly, impure and undesirable. Racism and colorism are corollaries of this thinking. It is the 21st century, and brands and people are still promoting, believing in, and using fairness creams, which are supposed to lighten the melanin one is born with. People consider extreme options like chemically bleaching their skin and even plastic surgery, all for what? ACCEPTANCE? Is it really acceptance from SOCIETY that we crave? Or is it acceptance of our OWN TRAITS that we lack?