What we have now is doctors who are actually technically better at what they're doing in their specialty than 30 or 40 years ago, but we've lost the relationship where the doctor would look people in the eye and say, 'I care about you. We can do this together.'
We should never denigrate any other culture but rather help people to understand the relationship between their own culture and the dominant culture. When you understand another culture or language, it does not mean that you have to lose your own culture.
Film's hard when you don't have any relationship with the director at all and you just show up. Then you really are just a gun for hire.