As Donny (re-)delivers his version of the Team America refrain, I can’t help (re-)wondering whether the man has any grip on reality. But if you always rely on your own perceptions and ignore everything apart from Fox News, this, it seems, is the result.
There are many films glorifying America and its dream, and many where a more complex story is told. Much of American history has been illustrated, and invented, in films that have given us such a feel for the country (and I’m using Donny’s definition, the USA) that we kinda know the place, even if we’ve never been there.
I’ve actually been to the location featured in my pick: The Florida Project. It’s set in a cheap motel block near the Disney paradise in Orlando; the place I stayed for a couple of nights wasn’t dissimilar. But I couldn’t live there with kids, as people find themselves having to do in the not-so-great parts of America.
What films about America (the USA) would you recommend?