A lot of anonymous Americans I interact with time and time again still harbor deep-rooted animosity toward Japan and its people, even though WW2 has long been over, calling them "deceiving". I don't understand this, though, because publicly America now seems to be on good terms with Japan, although I'm not sure whether that's genuine or just out of fear. America did show significant sympathy for Japan's plight during the tsunami disaster years ago.
Do you still have some lingering animosity over whatever mangled past we have had? What's your opinion on this matter, and what do you make of it? I'd appreciate it if you gave a constructive response.