I think many people just don't understand war at all. How could they? Our history is no longer taught as it happened, if it's taught at all, and certainly not with the complete picture.

War is not normal living; it's a dirty, miserable event full of inhumane experiences and demands. I suppose it's a "necessary evil" when battling a greater evil: having our freedom and human rights taken away.