Gambia: Change of Diet

[Daily Observer] Colonisation is the establishment of political and economic control over a foreign territory. Most African countries were, in one way or another, colonised by a Western power whose chief interests were enriching itself and imposing its culture and lifestyle on us.