One of the mantras I hear regularly from folks is the need to “take back our country.” In fact, at a meeting I attended this week, I heard two Republican candidates for office cite that claim as the reason they are qualified to hold office.
I confess that I am clueless about many things, so it shouldn’t surprise anyone that I find myself wondering: from what, exactly, do we need to take our country back? Of course, I know the rhetoric spouted on the news, but in all seriousness, I am having trouble discerning the specifics of what we want to return to. Are folks wanting to dial back to the 1940s, when the New Deal reigned supreme but we were as united as a nation as we’ve ever been in supporting our troops at war? Are we looking to return to the Roaring ’20s, when there was a lot of fun but the divide between rich and poor was as great as at any time in our history? Some would suggest we need to return to the Reagan era, but Reagan ran deficits at great levels, putting real pressure on the budget. Honestly, I do not understand what we are trying to return to when we say we need to take our country back.
How would you define what it means to “take our country back”?