History tends to be written by people who, however much they may wish otherwise, are influenced by their own biases, interests, and the limits of their knowledge. In the case of European historians of the 18th and 19th centuries (indeed, well into the 20th), the concept of the “rise of the West” is ubiquitous. What explains this phenomenon, particularly given that Europe was not a unified entity but a collection of independent states that viewed one another with suspicion, if we want to be kind, or with hostility, if we want to be more honest? What would give them the impression that “European civilization” was superior? Before anyone mentions the Industrial Revolution: it began only in the late 1700s and took about a century to become widespread. Likewise, if anyone wants to invoke the theory of evolution (“survival of the fittest” and so on), remember that Darwin’s On the Origin of Species was published only in 1859 and was not widely accepted until much, much later.