Did liberalism in the USA degenerate into something more totalitarian than liberal?

Anixx's comment is quite right: in America "liberal" means "center-left".

According to the Encyclopedia Britannica, "In the United States liberalism is associated with the welfare-state policies of the New Deal program of the Democratic administration of Pres. Franklin D. Roosevelt, whereas in Europe it is more commonly associated with a commitment to limited government and laissez-faire economic policies."

But in much of the rest of the world, a "liberal" (in the classical sense) is what Americans would call a "right-wing libertarian", or more commonly just a "libertarian".

Recent right-wing rhetoric has castigated virtually any form of government action, along with left-wing ideas generally, as "totalitarian" and "socialistic". I won't comment on the validity of those claims, but I suspect they cause confusion for... well, everyone.
