The Dark Ages is a term for the perceived period of cultural decline that took place in Western Europe after the decline of the Roman Empire.
