So, what do you guys think about cultural imperialism? I'm talking about the West, and specifically English and American culture, sweeping in and pulverizing native cultures. All around the world, I see cultures wanting to throw away centuries of history and tradition to become part of the "West" and enter our wonderful era of globalization. And it's sadly true.

I hope I'm not offending anyone here, but can I just say how much I detest Japan? It is a PERFECT case of American cultural imperialism. They are SO desperate to be like the West that it's just sickening. As an American, it really bothers me to see another culture, with its own wonderful traditions and traits, throw them all away and then implement a bastardized version of my culture.

The other big way I see this is through linguistic imperialism, and this is the kind of thing that really bothers me. I was reading a book called When Languages Die by K. David Harrison, and it describes a society in northern Russia that had a complex system for classifying reindeer. These animals were so vital to their daily life that they developed a much more specialized vocabulary for them, which makes total sense. What's happening now is that Russian is displacing the native language, so that only the very oldest members of the tribe still speak it. And there is an irreplaceable loss of knowledge in the shift to Russian: Russian has a much more limited vocabulary for reindeer, so whole phrases are needed where the old language needed just one word (something like "a castrated male that has reached adulthood and is good for riding" would have been a single word in the old language).

This is practically genocide! Killing a language and replacing it with the more "useful" one, a language that makes describing simple parts of daily life much more complex and tedious... it's ridiculous. What are your thoughts on this? Should we be stopping it? Or is it not a real problem?