We all know that injustices have been committed on the continent of Africa, the Transatlantic Slave Trade chief among them, but the media generally likes to portray Westerners as having washed the African blood off their hands after that point. The uncomfortable truth is that the colonial era came next, and it arguably did greater damage to the cultures of this beautiful continent than the slave trade itself.