Over the years you may have heard about the trend among hard-line conservatives in the United States of distrusting universities and colleges. Whether it is someone seen as “too” highly educated, or a professor seen as “too book smart” and assumed to lean politically to the “left”, a blanket skepticism has taken hold in some segments of American culture. This phenomenon was documented in a recent Pew Research Center survey asking Americans from different walks of life how they view higher education institutions.
When asked what impact universities have had on the country, self-described Democrats answered 67% positive versus 20% negative. Among independents, it was 61% versus 26%. Among Republicans overall, 51% said positive, while 36% said colleges had a negative effect on the country. Within the Republican category, among those who identified as conservative Republicans, 46% saw colleges as positive, while 39% said negative. An eye-opening statistic for anyone who thinks education is a universal value.
Perhaps less surprising: among those surveyed who identified with the Tea Party movement, only 38% viewed colleges as having a positive impact on the country.
On the question of what college should be for, personal growth or job training, only one surveyed group chose growth over training: liberal Democrats. Every other group on the American political spectrum responded more favorably to the idea that college is for career or job training.
I think I’d better cancel my American lecture series entitled “How to Grow and Become a Well-Adjusted Adult During College.”
Source: Pew Research
Photo: Roel Wijnants / flickr