Most Americans know a little something about European history because that's what they're taught in school. The origins of America were in Europe. We get a little about Africa and Asia when we speak of slavery and the Vietnam War, etc., but the majority of Americans can rattle off quite a few countries from which the original settlers of the east coast came. We learn about WW1 and WW2, with the emphasis usually on Europe when covering WW2. We hear about them on the news, about how they're helping in the war on terror. France, England, Ireland, Spain, Portugal, Germany, Poland, Sweden, Italy, Greece...
They're the important ones we hear about and learn about. Britain is our closest ally, France has a bunch of fancy wusses, Germany had Nazis and has the autobahn, Greece invented the Olympics, Spain and Portugal fought over the world, Ireland has leprechauns, potatoes, and angry drunks, Sweden has really hot women, Italy had Rome, one of the greatest civilizations ever, and spaghetti.
Egypt is in Africa and has pyramids. There's a big desert and a jungle that lions live in. Poor, starving people abound.
Which paragraph was longer? I can't believe people actually think lions live in the jungle.
Of course we all know about the Middle East now, and we could rattle off a few countries over there, along with some things they're known for. America has dealt with them in the past, so we learn about it now. Sure, we had a conflict in Somalia, but everyone has forgotten about that little mishap.