Africa has meant many different things to many different people. The word “Africa” may have come from a Greek word meaning “without cold” or from a Latin reference to the “land of the Afri,” probably a Berber tribe. There is also a similar Latin word meaning “warm.” Whatever the origin of the word itself, “Africa” has certain meanings for African Americans and other meanings for white Americans. Within each of these groups, of course, there are many subdivisions, ranging along the entire spectrum of political and cultural opinions.
For some time, it was common for Europeans and white Americans to refer to Africa as the Dark Continent, with a derogatory connotation. The word “Africa” carried with it the meaning of a lack of civilization, intellect, and sophistication. As Dorothy Hammond and Alta Jablow observe in The Myth of Africa, the West defined Africa as that which the West was not ...