We need MORE Western influence in anime

Why do people bemoan "Western influence" in Japanese media so much? Japan thrives on taking inspiration from the West. Unless it's something like Shigurui or Sword of the Stranger, your favourite anime is probably full of Western influences.
Everything mecha and sci-fi originated in the West. Japan didn't suddenly come up with that in the 60s and 70s.
Everything fantasy, Western origin.
Take something like Gunsmith Cats: are you going to tell me that's not influenced by American action movies?
The influence runs from character and setting designs to music and story.

Attached: haibane-renmei-2-by-potassium-big.jpg (1024x768, 383.59K)

I feel like as long as somebody is there to constantly remind the studios that the far left does not represent their audience, the worst that will happen from Western pandering is an increase in pointless sex scenes
and for certain series that might be a good thing

I don't understand the point of your post. You claim anime needs Western influence, but then go on to say most anime is already full of Western influence. It's as if you're just trying to bait shounentards into complaining about SJW nutjobs or something.

Because taking inspiration from the West is not at all the same thing as Western influence.

You're an idiot. Globalism is for worthless people who have no character of their own. I don't need to read a manga to learn about Christian mythology, just as I'm not going to watch some Hollywood movie to learn about samurai. Western civilization is in its last gasps, thrashing about to take down as many people as possible with it; there's no reason for Japan to throw itself in just yet. It can wait a couple of decades and pick out the good parts of an extinct civilization, just as the West did with every civilization it borrowed from.

tl;dr

I just think it's weird how averse people are to Japan taking Western influences when the thing they already love is so full of them.

Semantics.

Shut up, globalism is cool as fuck.

Also, for the record, most of what you think of as "the West" is just watered-down material lifted from some actual culture to pacify the masses. Imagine thinking "action movies" symbolize the West.

Not reading any of that.

Terrible take.