A mildly derogatory term for America's west coast, used by Republicans to refer to the predominantly Democratic states of California, Oregon, and Washington.
Arnie must feel very alone on the left coast.