
View Full Version : Interested in a Free Market-Friendly Economics Program?


SF-TX
11-23-2009, 22:24
Interested in a Free Market-Friendly Economics Program?

Posted By Lawrence W. Reed

“I want my son or daughter to be exposed to free market economics after high school. What colleges or universities do you recommend?”

If I’ve been asked that question once, I’ll bet I’ve been asked it a thousand times. Parents who cherish the values of freedom, limited government and private enterprise have good reason to be concerned about where they send their offspring for higher “education.” Academia is full of statist bias, and statists usually aren’t comfortable when the first non-statist is accidentally hired for a teaching position (they think it’s a takeover). So when you find an economics program in which free market ideas are treated with respect and given a prominent forum, it’s news to celebrate.

Keep in mind that I am talking here about economics, period. If a college or university has a good econ program, that doesn’t mean it also has a good offering or even a decent balance in its other social science programs.

Certain superb schools I am familiar with roll off the tongue quickly and easily: Very good economics programs and faculty are in place at Grove City College (my alma mater) in Pennsylvania, Hillsdale College and Northwood University in Michigan, George Mason University in Virginia, Houston Baptist and the University of Dallas in Texas, The Kings College in New York, San Jose State University in California, Auburn University in Alabama, Clemson University in South Carolina, Beloit College in Wisconsin, Florida State University and Webber International University in Florida, and the University of Arizona.

There are others as well, and I’ll write about them too in future weeks and months. The one I want to acquaint readers with on this occasion is one I’ve become familiar with in the last couple of years, Florida Gulf Coast University in Ft. Myers, Florida. It’s a place where students can get a good education in free market economics despite the distractions of a warm, sunny climate near the beach and great restaurants.

FGCU, as the locals call it, is one of the newest universities in the country, having opened its doors in 1997. Some of its dorms are better described as waterfront luxury apartments. They share a lake with a golf-course community with multi-million dollar homes. Students have easy access to all sorts of watersports activities. More importantly, I know and have met four of the six members of the economics faculty and can vouch that they have strong free-market views. A brief biography of each is provided below.

BRADLEY K. HOBBS, BB&T Distinguished Professor of Free Enterprise, earned his undergraduate degree in history and his Ph.D. in economics from Florida State University in 1991. His research interests are wide in range, encompassing economic history, the moral and philosophical foundations of free markets, property rights, economic freedom, and teaching methodologies. He has published in Entrepreneurship Theory and Practice, the Journal of Accounting and Finance Research, Journal of Real Estate Research, Laissez-Faire, Journal of Executive Education, Financial Practice and Education, and Research in Finance, among others. He has also written for the Foundation for Economic Education, the Institute for Humane Studies, and the Florida Council on Economic Education.

Professor Hobbs has been active in leading undergraduate research and was recently recruited as the founding Faculty Advisor for a new undergraduate research journal, the Journal of Liberty and Society. He has been a member of the National Teaching Faculty for the Foundation for Teaching Economics since 2001, has taught for the Institute for Humane Studies and the Koch Associates Program, and is active in the Liberty Fund, having served in its programs as a participant, discussion leader, and director. He serves on the Executive Board of the Association of Private Enterprise Education. Professor Hobbs has received the FGCU Senior Faculty Teaching Award and was a recipient of a 2008 Excellence in Teaching award from the Acton Foundation for Entrepreneurial Excellence. He has taught a wide variety of courses, though he currently teaches primarily intermediate microeconomics and the moral foundations of capitalism.

CARRIE KEREKES is an Assistant Professor of Economics. She earned her Ph.D. in economics from West Virginia University in 2008. She teaches Principles of Microeconomics, Principles of Macroeconomics, and Economic Development. Her research interests are in the areas of public choice and economic development, with an emphasis on institutions and property rights. Professor Kerekes regularly attends the meetings of the Association of Private Enterprise Education and has also participated in seminars sponsored by the Foundation for Economic Education and the Institute for Humane Studies.

DEAN STANSEL, Associate Professor of Economics, earned his Ph.D. from George Mason University in 2002. Prior to entering academia, Professor Stansel earned an undergraduate degree in economics and political science from Wake Forest University in 1991. He then worked at the Cato Institute through 1999, where he produced over 60 publications on fiscal policy issues. Stansel attended several Institute for Humane Studies (IHS) seminars (as both student and lecturer) and received fellowships from IHS, as well as the Bradley Foundation and the Center for the Study of Market Processes. His current research interests involve the impact of competition between local governments on fiscal and economic outcomes, the relationship between the size of government and economic growth, state fiscal crises, and a variety of other issues in the areas of public economics and urban economics. His research has been published in a variety of journals including the Journal of Urban Economics, Public Finance Review, and the Cato Journal. He teaches mostly microeconomics, public sector economics, and urban economics. Professor Stansel regularly attends the meetings of the Association of Private Enterprise Education and the Southern Economic Association. He and his wife (Robin Hulsey, who also worked at Cato in the late 1990s) have two young children.

CAROL SWEENEY earned her Master of Science in Development Studies from University College Dublin, Ireland in the fall of 1993. Ms. Sweeney earned her undergraduate degree in economics from George Mason University, where she studied under Professor Peter Boettke and the late Don Lavoie. While at George Mason, she was president of the economics club, which hosted a lecture by Nobel Laureate James Buchanan, and attended an Institute for Humane Studies seminar. Prior to working in academia, Ms. Sweeney worked in the communications and education industry. She teaches both principles of microeconomics and macroeconomics. Her research interests are in sports economics, development economics, and public choice economics. Ms. Sweeney recently attended her first Association of Private Enterprise Education meeting in Guatemala.

Give the economics program at FGCU a look. I think you will be hearing more good news from their offices and classrooms in the years to come.

URL to article: http://fee.org/schools/interested-free-marketfriendly-economics-program/

jatx
11-24-2009, 10:00
IMHO, the authors above are way off-base. Not a single one is the product of a top tier economics program - they failed to gain entrance to these and seem to have languished in second rate schools ever since. Why should someone listen to them??

Sending your kid to a third-tier school (like those named) in search of "free-market economics" is a fool's errand. Basic economics courses are taught pretty much the same way everywhere. They are meant to introduce the kids to the "basics" of macro and micro, augmented by statistics and econometrics courses. What they are describing above is an ideology-driven political science program that will produce pitifully educated wannabe economists.

Want your kid to be an economist who can think for themselves? Send them to any top tier university where good work and steady effort will be required just to get by. Have them join the departmental honors program in economics, take two semesters each of macro and micro, stats and business stats, at least two semesters of econometrics, public and corporate finance, any course offerings in political economy and round things out with political science, history and cultural anthropology courses (if they have an interest in international development). Make them work summer jobs but ensure they have the opportunity to study abroad.

Then send them to a top-tier public policy or public administration program, which are mostly applied econ programs. If they can't get into one of the top five or so (Harvard, Chicago, Princeton, Johns Hopkins, Georgetown, plus a very few others in specialty areas), they should consider business school instead.

Sorry if this sounds harsh, but I am tired of running into and interviewing well-intentioned College Republicans without the math to make their own points stick. The program I've outlined will produce someone who can give you the rough results of a multiple regression in their head, who can tear holes in faulty logic and flawed reasoning, and who has had the benefit of eating their liberal classmates for lunch every day for six years.

Economics is a discipline for students with the quantitative aptitude and intellectual curiosity needed for exploration, plus the discipline needed to grind through the prerequisites and build the basic skillset. It is not a place for indoctrination, left, right, or otherwise.

Richard
11-24-2009, 10:09
Some might consider going with the opinions of the end-users:

http://bwnt.businessweek.com/bschools/undergraduate/09rankings/specialty.asp

http://www.usnewsuniversitydirectory.com/graduate-schools/social-sciences-humanities/economics.aspx

Richard's $.02 :munchin

jatx
11-24-2009, 10:17
Good links, Richard. My only caveat - the best graduate programs to look at are the MPP/MPA programs, not graduate programs in economics. Their focus is on application, and all will offer the opportunity to cross-register for upper-level courses in the schools' economics and MBA programs, if the student so desires. The most important thing, though, is that they will keep the students focused on the real world.

SF-TX
11-24-2009, 10:35
Some might consider going with the opinions of the end-users:

http://bwnt.businessweek.com/bschools/undergraduate/09rankings/specialty.asp

http://www.usnewsuniversitydirectory.com/graduate-schools/social-sciences-humanities/economics.aspx

Richard's $.02 :munchin

Apparently, not all agree that the rankings you cite are all that meaningful.

The following is a 1996 (and still relevant) letter from the Stanford University Office of the President criticizing said rankings:

STANFORD UNIVERSITY
OFFICE OF THE PRESIDENT


GERHARD CASPER


September 23, 1996

Mr. James Fallows
Editor
U.S. News & World Report
2400 N Street NW
Washington, DC 20037

Dear Mr. Fallows:

I appreciate that, as the new editor of U.S. News & World Report, you have much to do at this moment. However, it is precisely because you are the new editor that I write to you, personally.

I emphasize you, because of your demonstrated willingness to examine journalism in the same way that journalism examines all other facets of society. And I say personally because my letter is for your consideration, and not a letter to the editor for publication.

My timing also is related to the recent appearance of the annual U.S. News "America's Best Colleges" rankings. As the president of a university that is among the top-ranked universities, I hope I have the standing to persuade you that much about these rankings - particularly their specious formulas and spurious precision - is utterly misleading. I wish I could forego this letter since, after all, the rankings are only another newspaper story. Alas, alumni, foreign newspapers, and many others do not bring a sense of perspective to the matter.

I am extremely skeptical that the quality of a university - any more than the quality of a magazine - can be measured statistically. However, even if it can, the producers of the U.S. News rankings remain far from discovering the method. Let me offer as prima facie evidence two great public universities: the University of Michigan-Ann Arbor and the University of California-Berkeley. These clearly are among the very best universities in America - one could make a strong argument for either in the top half-dozen. Yet, in the last three years, the U.S. News formula has assigned them ranks that lead many readers to infer that they are second rate: Michigan 21-24-24, and Berkeley 23-26-27.

Such movement itself - while perhaps good for generating attention and sales - corrodes the credibility of these rankings and your magazine itself. Universities change very slowly - in many ways more slowly than even I would like. Yet, the people behind the U.S. News rankings lead readers to believe either that university quality pops up and down like politicians in polls, or that last year's rankings were wrong but this year's are right (until, of course, next year's prove them wrong). What else is one to make of Harvard's being #1 one year and #3 the next, or Northwestern's leaping in a single bound from #13 to #9? And it is not just this year. Could Johns Hopkins be the 22nd best national university two years ago, the 10th best last year, and the 15th best this year? Which is correct, that Columbia is #9 (two years ago), #15 (last year) or #11 (this year)?

Knowing that universities - and, in most cases, the statistics they submit - change little from one year to the next, I can only conclude that what are changing are the formulas the magazine's number massagers employ. And, indeed, there is marked evidence of that this year.

In the category "Faculty resources," even though few of us had significant changes in our faculty or student numbers, our class sizes, or our finances, the rankings' producers created a mad scramble in rank order, for example:

Down            Last year   This year
Harvard         #1          #11
Stanford        #3          #15
Brown           #12         #22
Johns Hopkins   #15         #19
Dartmouth       #18         #24

Up              Last year   This year
MIT             #6          #2
Duke            #13         #4
Yale            #10         #6

One component of this category, "Student/faculty ratio," changed equally sharply, and not just in rank order but in what the magazine has presented as absolute numbers. Again, this is with very little change in our student or faculty counts:

Worse           Last year   This year
Johns Hopkins   7/1         14/1
Harvard         11/1        12/1
Stanford        12/1        13/1
Duke            12/1        14/1

Better          Last year   This year
Chicago         13/1        7/1
Penn            11/1        6/1
Yale            11/1        9/1

Then there is "Financial resources," where Stanford dropped from #6 to #9, Harvard from #5 to #7. Our resources did not fall; did other institutions' rise so sharply?

I infer that, in each case, the formulas were simply changed, with notification to no one, not even your readers, who are left to assume that some schools have suddenly soared, others precipitously plummeted.

One place where a change was made openly was, perhaps, the most openly absurd. This is the new category "Value added." I quote the magazine:

Researchers have long sought ways to measure the educational value added by individual colleges. We believe that we have created such an indicator. Developed in consultation with academic experts, it focuses on the difference between a school's predicted graduation rate - based upon the median or average SAT or ACT scores of its students and its educational expenditures per student - and its actual graduation rate.

This passage is correct that such a measure has long been sought. However, like the Holy Grail, no one has found it, certainly not the "we" of this passage. The method employed here is, indeed, the apotheosis of the errors of the creators of these ratings: valid questions are answered with invalid formulas and numbers.

Let me examine an example in "Value added": The California Institute of Technology offers a rigorous and demanding curriculum that undeniably adds great value to its students. Yet, Caltech is crucified for having a "predicted" graduation rate of 99% and an actual graduation rate of 85%. Did it ever occur to the people who created this "measure" that many students do not graduate from Caltech precisely because they find Caltech too rigorous and demanding - that is, adding too much value - for them? Caltech could easily meet the "predicted" graduation rate of 99% by offering a cream-puff curriculum and automatic A's. Would that be adding value? How can the people who came up with this formula defend graduation rate as a measure of value added? And even if they could, precisely how do they manage to combine test scores and "education expenditures" - itself a suspect statistic - to predict a graduation rate?

Were U.S. News, under your leadership, to walk away from these misleading rankings, it would be a powerful display of common sense. I fear, however, that these rankings and their byproducts have become too attention-catching for that to happen.

Could there not, though, at least be a move toward greater honesty with, and service to, your readers by moving away from the false precision? Could you not do away with rank ordering and overall scores, thus admitting that the method is not nearly that precise and that the difference between #1 and #2 - indeed, between #1 and #10 - may be statistically insignificant? Could you not, instead of tinkering to "perfect" the weightings and formulas, question the basic premise? Could you not admit that quality may not be truly quantifiable, and that some of the data you use are not even truly available (e.g., many high schools do not report whether their graduates are in the top 10% of their class)?

Parents are confused and looking for guidance on the best choice for their particular child and the best investment of their hard-earned money. Your demonstrated record gives me hope that you can begin to lead the way away from football-ranking mentality and toward helping to inform, rather than mislead, your readers.

Sincerely,

Gerhard Casper




BUILDING 10 * STANFORD, CALIFORNIA 94305-2060 * (415) 723-2481 * FAX (415) 725-6847

http://www.stanford.edu/dept/pres-provost/president/speeches/961206gcfallow.html

SF-TX
11-24-2009, 10:42
One more:

ERIC Identifier: ED468728
Publication Date: 2002-00-00
Author: Holub, Tamara
Source: ERIC Clearinghouse on Higher Education Washington DC.
College Rankings. ERIC Digest.

The popularity of college ranking surveys published by U.S. News and World Report, Money magazine, Barron's, and many others is indisputable. However, the methodologies used in these reports to measure the quality of higher education institutions have come under fire by scholars and college officials. Also contentious is some college and university officials' practice of altering or manipulating institutional data in response to unfavorable portrayals of their schools in rankings publications.

INTRODUCTION

In college rankings publications, as opposed to college guides which offer descriptive information, a judgment or value is placed on an institution or academic department based upon a publisher's criteria and methodology (Stuart, 1995, p. 13). In the United States, academic rankings first appeared in the 1870s, and their audience was limited to groups such as scholars, higher education professionals, and government officials (Stuart, 1995, pp. 16-17). College rankings garnered mass appeal in 1983, when U.S. News and World Report's college issue, based on a survey of college presidents, was the first to judge or rank colleges (McDonough, Antonio, Walpole, and Perez, 1998, p. 514). In today's market, the appeal of college ranking publications has increased dramatically. Time magazine estimates that prospective college students and their parents spend about $400 million per year on college-prep products, which include ranking publications (McDonough et al., 1998, p. 514).

POPULARITY OF COLLEGE RANKINGS

Hunter (1995) believes that the popularity of rankings publications can be attributed to several factors: growing public awareness of college admissions policies during the 1970s and 1980s; the public's loss of faith in higher education institutions due to political demonstrations on college campuses; and major changes on campus in the 1960s and 1970s such as coeducation, integration, and diversification of the student body, which forced the public to reevaluate higher education institutions (p. 8). Parents of college-bound students may also use reputational rankings that measure the quality of colleges as a way to justify their sizable investment in their children's college education (McDonough et al., 1998, pp. 515-516).

COLLEGE RELIANCE ON RANKINGS AND GENERAL CRITICISMS OF THE RANKINGS PUBLICATIONS

College administrators have increasingly relied on rankings publications as marketing tools, since rising college costs and decreasing state and federal funding have forced colleges to compete fiercely with one another for students (See Hossler, 2000; Hunter, 1995; McDonough et al., 1998). According to Machung (1998), colleges use rankings to attract students, to bring in alumni donations, to recruit faculty and administrators, and to attract potential donors (p. 13). Machung asserts that a high rank causes college administrators to rejoice, while a drop in the rankings often has to be explained to alumni, trustees, parents, incoming students, and the local press (1998, p. 13).

Criticisms of rankings publications have proliferated as scholars, college administrators, and higher education researchers address what they perceive as methodological flaws in the rankings. After reviewing research on rankings publications, Stuart (1995) identified a number of general methodological problems: 1) Rankings compare institutions or departments without taking into consideration differences in purpose and mission; 2) Reputation is used too often as a measure of academic quality; 3) Survey respondents may be biased or uninformed about all the departments or colleges they are rating; 4) Rankings editors may tend to view colleges with selective admissions policies as prestigious; and 5) One department's reputation may indiscriminately influence the ratings of other departments on the same campus (pp. 17-19).

U.S. NEWS AND WORLD REPORT'S "AMERICA'S BEST COLLEGES"

The most specific criticism has been directed against U.S. News and World Report's "America's Best Colleges," published since 1990 and the most popular rankings guide. Monks and Ehrenberg (1999) investigated how U.S. News determines an institution's rank, basing their study on statistics from U.S. News' 1997 publication. They found that U.S. News takes a weighted average of an institution's scores in seven categories of academic input and outcome measures as follows: academic reputation (25%); retention rate (20%); faculty resources (20%); student selectivity (15%); financial resources (10%); alumni giving (5%); and graduation rate performance (5%) (Monks and Ehrenberg, 1999, p. 45). These categories were further divided, and 16 variables were used as measurements. McGuire (1995) asserts that the variables U.S. News uses to measure quality are usually far removed from the educational experiences of students (p. 47). For example, U.S. News measures the average compensation of full professors, a subfactor of the faculty resources variable mentioned above. McGuire argues that this variable implies that well-paid professors are somehow better teachers than lower-paid professors, an implication unsupported by direct evidence. He says that "In the absence of good measures, poor measures will have to suffice because the consumer demand for some type of measurement is strong and the business of supplying that demand is lucrative" (McGuire, 1995, p. 47). Along the same lines, Hossler (2000) believes that better indicators of institutional quality are outcomes and assessment data that focus on what students do after they enroll, their academic and college experiences, and the quality of their effort (p. 23).
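For concreteness, the weighted average Monks and Ehrenberg describe can be sketched in a few lines of Python. The seven weights are the ones quoted above; the category keys and the example school's scores are hypothetical placeholders, not anything U.S. News publishes.

```python
# Sketch of the U.S. News-style weighted-average score described above.
# Weights are the seven category weights reported by Monks and Ehrenberg (1999);
# the per-category scores for the example school are hypothetical.

WEIGHTS = {
    "academic_reputation": 0.25,
    "retention_rate": 0.20,
    "faculty_resources": 0.20,
    "student_selectivity": 0.15,
    "financial_resources": 0.10,
    "alumni_giving": 0.05,
    "graduation_rate_performance": 0.05,
}

def overall_score(category_scores):
    """Combine per-category scores (0-100) into one weighted overall score."""
    if set(category_scores) != set(WEIGHTS):
        raise ValueError("scores must cover exactly the seven ranked categories")
    return sum(WEIGHTS[c] * category_scores[c] for c in WEIGHTS)

# A hypothetical school scoring 80 in every category gets an overall score of 80.
example = {c: 80.0 for c in WEIGHTS}
print(round(overall_score(example), 6))  # 80.0
```

Because the final rank is just this weighted sum, even a small change to the weights reshuffles schools whose underlying statistics have not changed at all, which is exactly the instability the critics above point to.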

Monks and Ehrenberg (1999) found that U.S. News periodically alters its rankings methodology, so that "changes in an institution's rank do not necessarily indicate true changes in the underlying 'quality' of the institution" (p. 45). They note, for example, that the California Institute of Technology jumped from 9th place in 1998 to 1st place in 1999 in U.S. News, largely due to changes in the magazine's methodology (Monks and Ehrenberg, 1999, p. 44). Ehrenberg (2000) details how a seemingly minor change in methodology on the part of U.S. News can have a dramatic effect on an institution's ranking (p. 60). Machung (1998) states that "The U.S. News model itself is predicated upon a certain amount of credible instability" (p. 15). The number one college in "America's Best Colleges" changes from year to year, with the highest ranking fluctuating among 20 of the 25 national universities that continually vie for the highest positions in the U.S. News rankings (Machung, 1998, p. 15). Machung asserts that "new" rankings are a marketing ploy by U.S. News to sell its publication (1998, p. 15).

Although eighty percent of American college students enroll in public colleges and universities, these schools are consistently ranked poorly by U.S. News (Machung, 1998, p. 13). Machung (1998) argues that the U.S. News model works against public colleges by valuing continuous undergraduate enrollment, high graduation rates, high spending per student, and high alumni giving rates (p. 13). She also contends that the overall low ranking of public colleges by U.S. News is a disservice to the large concentration of nontraditional students (over 25, employed, and with families to support) enrolled in state schools (Machung, 1998, p. 14).

COLLEGE AND UNIVERSITY RESPONSES TO RANKINGS

College and university officials have responded to the unfavorable or undesirable rankings placement of their institutions in a variety of ways. Some ignore the rankings, others refuse to participate in the surveys, and many respond by altering or misrepresenting institutional data presented to rankings publications (See Stecklow, 1995; Machung, 1998; Monks and Ehrenberg, 1999). By examining the inconsistencies between the information colleges presented to guidebooks and the information they submitted to debt-rating agencies in accordance with federal securities laws, Stecklow (1995) has documented how numerous colleges and universities have manipulated SAT scores and graduation rates in order to achieve a higher score in the rankings publications (p. A1). He noted that many colleges have inflated the SAT scores of entering freshmen by deleting the scores from one or more of the following groups: international students, remedial students, the lowest-scoring group, or learning-disabled students. Although many college officials admit that this practice raises ethical concerns, they continue these manipulations because there are no legal obstacles preventing such action. Stecklow says that many surveyors such as Money magazine, Barron's, and U.S. News do not always check the validity of the data submitted to them by colleges (1995, p. A1).

BALANCED APPROACH

Since many published rankings have been perceived as biased, uninformative, or flawed, a number of higher education practitioners encourage parents and prospective students to do their own research on colleges, to view alternative college prep publications, and to view the rankings publications with a critical eye.


http://www.ericdigests.org/2003-3/rankings.htm

Richard
11-24-2009, 11:24
Apparently, not all agree that the rankings you cite are all that meaningful.

I have been directly involved in this ranking system of schools (both high schools and colleges), which has been an ongoing debate amongst educators, administrators, and the publishers of the various rankings for quite a while. FWIW - the primary dissension toward its usefulness in determining college programs and placement is almost wholly directed at the undergrad rankings, as that system most adversely affects both the public perception of the schools and their finances in today's ongoing battle over enrollment and tuition at that level - and is why I only posted the links to what are considered to be the best grad schools in the specific areas noted.

Yawn. ;)

Richard

PedOncoDoc
11-24-2009, 13:38
The Cleveland Clinic - an institution I am glad to no longer be employed by - was rated #6 overall in the nation in Obstetrics and Gynecology when they do not even have a Labor and Delivery department on their main campus. The rating is based on research dollars and "national reputation." People rate hospitals with prestigious names high because of the name - not because of the medicine practiced there. I give you my justification below:

In the first part of this decade, the Cleveland Clinic realized that if they decentralized L&D to the suburban hospitals, they could take all of the poor inner-city women who show up in the ED and transport them two blocks over to University Hospitals or to the community hospital 15 minutes away, which can eat the bill for delivering an uninsured woman. The practice was justified by saying, "we don't have a labor and delivery unit in this hospital." It was a rarity that a woman delivered at the main hospital - and it typically happened in the ED - you cannot transfer a woman who is actively pushing a baby out. On rare occasions, high-risk pregnancies with known fetal defects would be brought to the main hospital for delivery in an operating room near the NICU.

After seeing this I discredited anything I saw in their reports - regardless of what they were rating.

I'll stop my rant now and apologize for the tangent - I have strong feelings on this topic.

SF-TX
11-24-2009, 15:40
...and is why I only posted the links to what are considered to be the best grad schools in the specific areas noted.

Richard

Considering the author was offering his opinion on the best undergraduate programs in economics, what is the relevance of what you posted? And, if the methodology itself is flawed, it seems rankings of graduate schools based on this methodology would be equally specious.

Richard
11-24-2009, 16:15
Considering the author was offering his opinion on the best undergraduate programs in economics, what is the relevance of what you posted? And, if the methodology itself is flawed, it seems rankings of graduate schools based on this methodology would be equally specious.

The exact same methodology isn't used for the grad school rankings, and the consensus amongst those involved in that endeavor is that for a field of study such as economics (among a number of others), any decent undergrad program (and there are many) is about as good as the next - it is looked at primarily as a way to get into a 'named' (top-ranked) grad school, which is pretty much mandatory to be 'marketable' in such a field.

As for relevance - just offering some additional information to be considered in weighing the opinion piece by Mr Reed and when selecting any school. For example - this political cartoon and excerpt from an FEE article are components of the FEE school of thinking regarding economic freedom and immigration policy (from their Freeman newszine).

An Unconstitutional Line in the Sand

Whether they’re between states or countries, borders soon cease to be noticed by most people living along them. They marry one another, establish businesses, visit, laugh, cry, agree, disagree, and dream together. So it is along the U.S.-Mexican boundary. The wall will sunder these families and friends as mercilessly as Berlin’s barricade did Germans.

The Founding Fathers understood government’s essence, its cruelty and callousness, far better than do modern Americans. That’s why their Constitution never empowers politicians to regulate anyone’s movement into or out of the country (except for slaves, fittingly enough: What else are we when we beg a bureaucrat, “Please, may I enter?”). Article 1, Section 9 bars Congress from “prohibit[ing]” the “Migration or Importation” of “such Persons as any of the States now existing shall think proper to admit” until 1808. If we dismiss the doctrine of enumerated powers, this implies that Congress may prohibit all the migrating and importing it likes thereafter. And if we also dismiss the literary and historical context that limits Article 1, Section 9 to slaves, it appears the feds may indeed control anyone’s immigration after 1808—but only in those states existing at the Constitution’s adoption. None of those border Mexico, and mighty few do Canada. DHS needs to relocate its wall down the Atlantic coast.

Nor does the Constitution deputize the central government to “protect” the country’s borders, much less build walls “funneling” migrants through deadly desert where cops lurk to kidnap them. Immigration ought never to have been federalized in the first place; government had no business arrogating an “interest” in it during the 1870s, then tightening its vise each decade since. Immigration is an issue of property rights—not the DHS’s infernal abrogation of them, but a decision by the folks Michael Chertoff so despises, “each individual landowner,” as to whether migrants may cross his property.

Despite its utter lack of constitutional authority, DHS will probably continue militarizing our borders.

http://www.thefreemanonline.org/archive/issues/?volume=59&issue=5&Type=Issue


Richard

SF-TX
09-13-2010, 18:53
Gig 'em, Aggies!


In today’s jobs market, new college grads can use all the help they can get when trying to crack into the professional arena. While some economists are warning that unemployment numbers may remain high for a while, the Wall Street Journal has interviewed job recruiters from across the country to find out which colleges are best-preparing their grads for the working world:

Recruiters say graduates of top public universities are often among the most prepared and well-rounded academically, and companies have found they fit well into their corporate cultures and over time have the best track record in their firms.

Employers also like schools where they can form partnerships that allow them to work with professors and their students, giving them an inside track when it comes time to make offers for internships and jobs.

Here are WSJ’s top 25 picks (and the schools’ 2011 U.S. News ranking):

1. Penn State (#47 in U.S. News)
2. Texas A&M (#63)
3. Illinois (#47)
4. Purdue (#56)
5. Arizona State (#143)
6. Michigan (#29)
7. Georgia Tech (#35)
8. Maryland (#56)
9. Florida (#53)
10. Carnegie Mellon (#23)
11. BYU (#75)
12. Ohio State (#56)
13. Virginia Tech (#69)
14. Cornell (#15)
15. UC-Berkeley (#22)
16. Wisconsin (#45)
17. UCLA (#25)
18. Texas Tech (#159)
19. North Carolina State (#111)
19. (tie) Virginia (#25)
21. Rutgers (#64)
22. Notre Dame (#19)
23. MIT (#7)
24. USC (#23)
25. North Carolina (#30)
25. (tie) Washington State (#111)

http://www.theblaze.com/stories/which-colleges-give-grads-the-best-job-prospects/

Wall Street Journal article:

http://online.wsj.com/article/SB10001424052748704358904575477643369663352.html?mod=WSJ_hps_LEFTTopStories

Richard
09-13-2010, 19:19
As with the USNWR rankings, one needs to read the entire article to understand the context of the WSJ's comments and rankings.

Here is just one of many important points made in the article:

Under pressure to cut costs and streamline their hiring efforts, recruiting managers find it's more efficient to focus on fewer large schools and forge deeper relationships with them, according to a Wall Street Journal survey of top corporate recruiters whose companies last year hired 43,000 new graduates.

Corporate budget constraints also play a role. Recruiter salaries, travel expenses, advertising and relocation costs run upwards of $500,000 to recruit 100 college grads, according to the National Association of Colleges and Employers. "We're all accountable to the bottom line," said Diane Borhani, campus recruiting leader at Deloitte LLP, who said she recently narrowed her roster to about 400 schools from 500.

The impact on students is significant. Steve Canale, head of General Electric Co.'s recruiting efforts, said it is critical for prospective students to ask which companies recruit on campus before deciding where to matriculate. GE, for example, focuses on about 40 key schools—many of them state schools—to hire 2,200 summer interns; upwards of 80% of its new-graduate hires come from its internship pool, said Mr. Canale.

Many recruiters say they are closely eyeing schools in their own backyard. Aside from the obvious convenience of proximity, companies are drawn to nearby schools for year-round access to interns and a greater chance that new-graduate hires reside locally, which eliminates relocation expenses.

Richard :munchin

SF-TX
09-13-2010, 19:25
As with the USNWR rankings, one needs to read the entire article to understand the context of the WSJ's comments and rankings.



It doesn't take a college graduate to understand that if recruiters are focusing on the schools listed, for whatever reason, new graduates from one of these schools will have better job prospects.

Richard
09-13-2010, 19:31
It doesn't take a college graduate to understand that if recruiters are focusing on the schools listed, for whatever reason, new graduates from one of these schools will have better job prospects.

Exactly - that's one point the article made - it's in the third paragraph cited in my post and one of the reasons I said people should read the entire article to understand it.

Richard

SF-TX
09-13-2010, 19:35
That's one point the article made - it's in the third paragraph cited in my post and one of the reasons I said people should read the entire article to understand it.

Richard

Thanks Richard. I would have only read the first paragraph, if it weren't for your advice.

Richard
09-13-2010, 19:38
Always willing to help. ;)

Thanks for posting the WSJ article - the educational marketplace is changing to align itself with the realities of the world's economic engine, and those seeking to take an active part in it should heed the sound advice of those who contributed to this excellent article.

Richard