Apparently the British Empire at its peak covered around a quarter of the world's land (I've seen 80% claimed, but that seems far too high). I can imagine the rest of the world being angry at the Brits for forcing themselves on everyone, and for claiming it was they who 'discovered' America. Columbus set sail (for Spain, mind you) and reached the Americas, and the English later conquered and colonised much of North America. Academics then drew up the laws for the country with the best of their 'knowledge'. Do other people agree or disagree with this rough history of the seafaring nation?