When in American history has the government told us things were all right when they weren't? For example, a war being fought that the public didn't know about.
I was thinking of a case where the US was losing a war but it was advertised as winning... help.