Saturday, January 11, 2014

From Beijing to Jerusalem: The Creation of a Mega-Zone of Conflict. By Robert D. Kaplan

From Beijing to Jerusalem: The creation of a mega-zone of conflict. By Robert D. Kaplan. Politico, January 8, 2014.

On Forecasting. By Robert Kaplan. Real Clear World, January 9, 2014.

Israel and the Death of Pan-Arabism. By Caroline Glick.

Israel and the death of pan-Arabism. By Caroline Glick. Jerusalem Post, January 10, 2014. Also at Real Clear World.


The so-called Arab Spring unleashed forces that have been dormant for a century. Like their counterparts throughout the region, Israel’s Arabic-speaking minorities are changing in profound ways. But our leaders fail to grasp the implications of what is happening.
Consider the Christian community.
Father Gabriel Nadaf, a Greek Orthodox priest from Nazareth, has become the symbol of this new period. Nadaf is the spiritual leader of an Israeli Christian movement calling for Israeli Christian youth to serve in the IDF. He is responsible for the 300 percent rise in Christian Arab enlistment in the IDF in the past year.
Nadaf does not hide his goal or his motivation.
He seeks the full integration of Israel’s 130,000 Christians into Israeli society. He views military service as the key to that integration.
Nadaf is motivated to act by the massive persecution of Christians throughout the Arab world since the onset of the Arab revolutionary wave in December 2010.
As he explained in a recent interview with Channel 1, it is “in light of what we see happening to Christians in Arab countries, how they are slaughtered and persecuted on a daily basis, killed and raped just because they are Christians.
Does this happen in the State of Israel? No, it doesn’t.”
Shahdi Halul, a reserve captain in the Paratroopers who works with Nadaf, declared, “Every Christian in the State of Israel should join the army and defend this country so it will exist forever. Because if, God forbid, the government is overthrown here, as it was in other places, we will be the first to suffer.”
These men, and their supporters, are the natural result of the most significant revolutionary development of the so-called Arab Spring: the demise of Arab nationalism.
As Ofir Haivry, vice president of the Herzl Institute, explained in an important article in the Mosaic online magazine, Arab nationalism was born in pan-Arabism – an invention of European powers during World War I that sought to endow the post-Ottoman Middle East with a new identity.
The core of the new identity was the Arabic language. The religious, tribal, ethnic and nationalist aspirations of the peoples of the Arabic- speaking region were to be smothered and replaced by a new pan-Arab identity.
For the Christians of the former Ottoman Empire, pan-Arabism was a welcome means of getting out from under the jackboot of the Islamic Laws of Omar, which reduce non-Muslims living under Muslim rule to the status of powerless dhimmis, who survive at the pleasure of their Islamic rulers.
But now pan-Arabism lies in ruins from North Africa to the Arabian Peninsula. The people of the region have gone back to identifying themselves by tribe, religion, ethnicity, and in the case of the Kurds and the Berbers, non-Arab national identity. In this new era, Christians find themselves imperiled, with few if any protectors or allies to be found.
As Haivry notes, Israel’s central strategic challenge has always been contending with pan-Arabism, which was invented at the same time that the nations of the world embraced modern Zionism.
From its inception, pan-Arabism’s leaders saw Israel as the scapegoat on which to pin their failure to deliver on pan-Arabism’s promise of global Arab power and influence.
Israel changed its position on pan-Arabism drastically over the years. Once, Israel could see the dangers in pan-Arabism and Arab nationalism.
But since 1993, says Haivry, Israel’s national strategy has been based on appeasing the secular authoritarian pan-Arab leaders by offering land for peace to Syria and the PLO.
Haivry notes that Shimon Peres is the political godfather of Israel’s accommodationist strategy, which is rooted in a mix of perceived powerlessness on the one hand, and utopianism on the other.
The sense of powerlessness owes to the conviction that Israel cannot influence its environment: that the Arabs will never change, that Israel’s neighbors will always see themselves primarily as Arabs, and that they will always want, more than anything else, Arab states.
At the same time, the accommodationists hold the utopian belief that Israeli appeasement of Palestinian Arab nationalism will break through the wall of pan-Arab rejection, end hatred for the Jewish state, and even lead the Arabs to invite Israel to join the Arab League.
The so-called Arab Spring has put paid to every one of the accommodationists’ beliefs. From Egypt to Tunisia to Iraq to Syria, Israel’s neighbors are fighting each other as Sunnis, Shi’ites and Salafists, or as members of clans and tribes, without a thought for the alleged primacy of their Arab identity. What Israel’s Palestinian-state-obsessed Left has failed to realize is that many of Israel’s neighbors do not share the pan-Arab scapegoating of the Jewish state. So bribing the now largely irrelevant Arab nationalists with another Arab state may do little more than create the newest victim of the Arab revolutions.
It is because they see what is happening to their co-religionists in the post-pan-Arab Middle East that more and more Israeli Christians realize they will lead safer, more prosperous and more fulfilling lives as Christian citizens in the Middle East’s only democracy than as pan-Arabs battling the Zionist menace.
But old habits die hard. Most of Israel’s elected Arab leaders owe their positions to their embrace of pan-Arabism. This embrace has brought them the support of the PLO and Europe, and since 1993, of the Israeli Left.
And so, since he first appeared on the scene, Father Nadaf’s life has been constantly threatened.
Everyone from Arab members of Knesset to the Communist head of the Greek Orthodox Council has incited against him, calling him and his followers traitors to the Palestinian Arab nation.
He also threatens the Israeli Left. For its view of Israel’s strategic powerlessness, and of Israel’s consequent need to appease its neighbors, to remain relevant, the pan-Arab forces in the Arab world must be perceived as still dominant, even invincible.
And so, the Israeli Left refuses to consider the larger strategic implications of the regional upheaval from which Nadaf’s initiative emerged.
Even worse, the official policy of the Netanyahu government appears based on this irrelevant Leftist view of the region. This is the implication of Foreign Minister Avigdor Liberman’s defeatist speech at the Foreign Ministry’s annual conference of ambassadors on Sunday.
Liberman’s speech has been rightly viewed as the supposedly right-wing politician’s formal break with his ideological camp and his embrace of the Left. In his remarks Liberman let it be known that, like the Left, he now bases his positions on a complete denial or avoidance of reality.
For this, he was congratulated for his “maturity” by Peres who was sitting on the stage with him.
In his speech, Liberman acknowledged that the Obama administration’s peace plan for Israel and the Palestinians is horrible for Israel. But, he said, it is better than the European Union’s peace plan.
Never considering the possibility of saying no to both, Liberman said he thinks we should accept the bad American deal. His only condition is that the PLO accept towns in the Galilee along with their 300,000 Israeli Arab residents.
Liberman’s surrender of the Galilee is a key component of his population swap plan. Under his plan, Israel would retain control over the fraction of Judea and Samaria in which large numbers of Israeli Jews live, in exchange for the area of the Galilee that is home to 300,000 Israeli Arabs. This plan has reportedly been presented to US Secretary of State John Kerry as an official Israeli position.
In other words, the Netanyahu government has failed to recognize the implications of the death of pan-Arabism. In maintaining their slavish devotion to the two-state formula, and viewing the Arabs in the Galilee, Judea, Samaria, Jerusalem and surrounding states as an impenetrable bloc, they are placing Israel’s future in the hands of actors who have already disappeared or will soon disappear. Instead of building alliances with non-Jewish citizens of Israel, such as Druse and Christians, who are more than happy to defend Israel against Islamists and other regional fanatics, the Netanyahu government insists on placing the state’s future in the hands of pan-Arabs whose grip on power is slipping and who would never willingly coexist with Israel anyway.
Nadaf and his followers respond to the allegation – uttered by MKs like Haneen Zoabi and Basel Ghattas, among others – that they are traitors to the Palestinian Arab nation, with contempt.
“When someone tells me, ‘We’re all Arabs,’ I tell him, ‘No, we’re not all Arabs. You’re an Arab. I’m not,’” Halul told Channel 1.
Samer Jozin, whose daughter Jennifer opted for IDF service instead of medical school, agrees.
“Telling me I’m a Palestinian is a curse. I’m, thank God, an Israeli Christian and proud of it. And I thank God I was born in the Land of Israel,” he said.
The message couldn’t be clearer. We are basing our national strategy on a world that no longer exists.
Today our longtime allies the Kurds have carved out virtually independent states for themselves in Iraq and Syria.
Christians throughout the region are on the run. The Druse of Syria and Lebanon are exposed, without protection, and looking for help.
As for the Muslims, as Haivry notes, they are fragmented along sectarian and political lines, and at war with one another in battlefields throughout the region. While so engaged, they have little time to devote to blaming Israel for their failures.
This state of affairs has implications for Israel’s Arab Muslim minority. None of the regional warring Muslim camps are natural homes for Israel’s Muslim community. A community that has lived in an open, free society for 65 years does not naturally turn to Salafism. Israel is a much easier fit for most Israeli Muslims.
At a minimum, no one is better off if Israel forces its Muslim citizens to cast their lot with any of the warring factions in Syria or Lebanon, or with the increasingly irrelevant forces in the Palestinian Authority.
There may very well be hundreds of Muslim versions of Father Nadaf just waiting for a signal from our government that we want them to lead their community into our society.
The post-pan-Arab Middle East exposes the truth that has been obscured for a century. The Jews and their Jewish state are a natural component of our diverse neighborhood, just like the Kurds, the Christians, the Druse, the various Muslim sects, and the Arabs. The demise of pan-Arabism is our great opportunity, at home and regionally, to build the alliances we need to survive and prosper. But so long as our leaders insist on clinging to the now irrelevant dream of appeasing the defunct pan-Arabists, we will lose these opportunities and convince our allies that we are treacherous, disloyal and temporary.

Echoes of 1914. By Gordon Adams.

Echoes of 1914. By Gordon Adams. Foreign Policy, January 10, 2014. Also here.

1914 all over again? By Susanne Spröer. DW, January 9, 2014.

2014 and 1914: Two Ships. By Adam Gopnik. The New Yorker, January 6, 2014.


We think we can predict the future – though as physicist Niels Bohr noted years ago, prediction is very difficult, especially about the future. But in the first days of 2014 – a year that happens to mark the 100th anniversary of the start of World War I – some of the coming conflicts and challenges are pretty clear. We will hear a lot about the Syrian civil war, the fate of the Iranian nuclear program, conflict in Iraq, the departure of U.S. forces from Afghanistan – not to speak of what applecart Vladimir Putin plans to upset next, whether the North Korean regime will implode, and whether China and its neighbors intensify their conflict over the rocky outcroppings they all want to own.
As we reflect on this anniversary year, however, there are deeper rumblings afoot, rumblings that will color and shape many of these conflicts. The same was true 100 years ago. The Edwardian era that preceded the Great War was marked by a pervasive view that war might be obsolete, and by a blithe lack of concern among the wealthy about the rising tide of unhappiness over the gap in resources and power between rich and poor.
At the start of that new century, however, the shape of world politics was about to transform, while class conflict rose and shook the very foundations of the monarchies of continental Europe. Together, these two forces would wipe out the Austro-Hungarian Empire, remove royalty from power in Germany, bring revolutionary turmoil to Russia, undermine the colonial systems established by France and Germany, and bring a new power – the United States – to the center of the world stage.
For all the differences in the current historical moment (and there are many), there are two eerily similar challenges that lie beneath the surface of these predictable conflicts today. Both will be hard for policymakers to manage, and both could usher in dramatic change to the international system over the next decade.
The first of these is the clear decline in the ability of the world’s most powerful country – the United States – to act as the indispensable nation, particularly as the influence of other countries rises and the global system rebalances. The second is the yawning economic gap between rich and poor, both in the United States and internationally. Systemic geopolitical rebalancing and the wealth gap are already substantially reshaping the international system in ways that are hard to predict, just as the statesmen and politicians of the last century could barely see the conflict that would break out in 1914.
The power shift and rebalancing of the international system is even harder for many to adjust to. The United States appears to remain the most powerful country in the world. But it is a power measured today largely in one dimension – the possession of the world’s only truly global hard security capabilities: military force and intelligence. That’s the surface reality. But something is clearly going on underneath the reality of that military power that is weakening the hold the United States had over the international system.
The decline in the role of the United States as system integrator, manager, and, for some, global hegemon (a trend I have already noted) continues to manifest itself at an accelerating pace. It is reflected most recently in widening disregard for expressed American desires and goals – such as whether Japan should increasingly arm itself and extend its military reach beyond its own shoreline. It is found in the growing distance between Washington and its long-time ally in Ankara, as the struggling Turkish government blames the United States for its internal corruption problems and struggles to assert an independent regional role. Meanwhile, India attacks the United States for allegedly mishandling an Indian diplomat in New York, and reduces the privileges it had provided to American diplomats in New Delhi. Likewise, another traditional ally, Saudi Arabia, grows increasingly unhappy with U.S. policy in the Gulf region and becomes querulous and critical.
Each incident, taken on its own, might be explained away as diplomatic feather-ruffling, simply business as usual. But together they are becoming a trend, forcing Secretary of State John Kerry to flit from country to country, trying to dampen the fires. There seems to be some recognition that things are fundamentally changing – just look at the apparent reluctance of the Obama administration to use its power to intrude into the myriad of conflicts that beset the Middle East and Africa. Leave peace enforcement in Africa to the African Union, the United Nations, or the French. Don’t send the Marines. Stay at the edges of the Syrian conflict, not at the center. Encourage a peaceful solution to the disputes over the seas off the Chinese coast, but do not promise to send U.S. warships steaming into the middle of tense waters. And so on.
I don’t think Washington has yet come fully to grips with the reality of systemic change. There is not yet a clear strategy to deal with a world in flux. But some of this reality seems to have penetrated, nonetheless. First, there seems to be a realization that, despite the global superiority of the U.S. armed forces, military intervention has lost what international popularity it ever had, partly as a result of the failed use of force in Iraq and Afghanistan. In neither country has stability been created, democracy implemented, or economic development established – while regional security around these countries is less stable today than before the U.S. intervened with force. Other countries, and their populations, cast a more jaundiced eye today than they once did about American leadership, intentions, and capabilities.
This global skepticism about Washington’s use of its hard power has been exacerbated by the exposure of the reach of U.S. “silent power” in the intelligence arena. The Snowden flakes documenting the extent of America’s global intrusion into private, public, and governmental communications keep falling at an accelerating pace. And the fallout is real: First, Brazilian President Dilma Rousseff cancels a state visit to the White House because the NSA was eavesdropping on her personal communications. Traditional allies in Germany and France are equally upset. Other governments are searching for ways to protect their information and communications systems from U.S. intrusion.
Second, not unlike the negative international reaction to America’s power projection abroad, there has been a significant shift in domestic opinion about the nature of U.S. foreign policy and public willingness to countenance more hard power deployments into foreign conflicts. The latest Pew poll on these issues is dispositive. More than half of those polled think the United States “should mind its own business internationally and let other countries get along the best they can on their own.” This exceeds the previous peak on this question, recorded in 1976, right smack in the post-Vietnam era. And 70 percent recognize that the United States has lost respect internationally, virtually as high as the Bush-era numbers in 2008.
What we see here is an opinion shift well within the historic American view of its global role – a reassertion, not of isolationism, but of a more realistic engagement of a different sort. Rather than send the Marines overseas “in search of monsters to destroy” (about which John Quincy Adams warned back in 1821), the American people seem to be saying that we should engage through the power of example, diplomacy, and, especially, through economic means. In other words: Keep our noses out of other people’s business, solve our problems here at home, and keep our military powder dry. Even the soldiers who fought to free Fallujah from terrorists just a few years ago agree – the United States needs to stay out, today.
Inevitably, the realization that rebalancing requires rethinking U.S. policy and the nature of U.S. engagement is not universally popular here at home. To advocates of “muscularity” like Sens. John McCain and Lindsey Graham, the hesitation in the Obama administration’s practice is a political fault to be criticized, not a global reality. Time to talk tough, send arms to Syria, assert some leadership here, tell others what they should do, for goodness sake.
The problem with muscularity as an answer is that we are in a period of system change, not business as usual. President George W. Bush tried the muscular thing and not only failed to reach his goals, but, in trying, simply accelerated the trend toward rebalancing. Trying to restore the “ancien régime” would not only be self-defeating, but dangerous, exacerbating global concern about the wisdom and intentions of the U.S. role. The world has changed. And being the muscle-bound bully will only lead to getting global sand kicked back in America’s face, to military conflicts that it did not anticipate getting into. The “historical rhyme” (perhaps apocryphally attributed to Mark Twain) here of 100 years ago is pretty clear: Look to the presumed security European nations thought they would obtain by arming up in the years before 1914. The system was rebalancing, but the old order could not be preserved by arms.
The other eerie similarity to the era of 100 years ago is the revival of sharp distinctions in the national and global economies. The Pulitzer Prize-winning author Hedrick Smith is not alone in documenting the major shift in income and wealth between the very rich and everyone else in the United States in his 2013 book, Who Stole the American Dream? The disappearance of the American middle class is an economic, sociological, and, in the end, political phenomenon of enormous significance, one that upends the dominant American mythology of the 1950s and 1960s. It has increasingly divorced the very rich from the rest – the so-called 99 percent. The gap has driven the United States down the global list of countries ranked by income disparity. Today, income inequality in America, measured by Palma ratios (the gap between the richest 10 percent and the poorest 40 percent), ranks the United States 44th out of 86 countries, well below most industrialized countries – even below Nigeria, India, Iran, and Egypt.
A similar trend in inequality is found in other countries around the world, which many analysts believe will lead to a rising tide of global unrest. Slowing economic growth in India could risk destabilization. Fissures over the unequal distribution of income and wealth in China are linked to provincial instability. And anger across the Middle East can be linked not only to religious and political tensions, but to the stubborn resistance in the region’s economies to allow for the kind of growth that could create opportunities for millions of educated, but unemployed, youth.
In the Edwardian era, rising wealth was seen as a positive trend, one that would usher in an era of broad economic well-being. Instead, it reflected what Princeton professor Samuel Hynes described as a world of “estrangement and anxious uncertainty,” in which a British upper class of “irresponsible rich, living in a new vulgarity and a strange new poor, living in new ugliness, were replacing the old class division of gentry and peasantry.” Social conflict was the inevitable outcome.
We are clearly entering a time of global political and economic transition, where the shoreline of apparent stability is receding in the distance. We are on the waves of change, with the new shoreline – the emerging international balance and the global economy – not yet clear. It is going to make for hard sailing for U.S. foreign policy in 2014 and well beyond. Some will want to hang on to the apparent stability provided by U.S. military power; in a world of “uncertainty,” they will say, military dominance is the best instrument of power.
But holding on to that instrument could well lead directly to destabilizing conflict. And the failure at the same time to deal, nationally and internationally, with the economic gap could exacerbate that conflict in unpredictable and dangerous ways. Welcome to the new year.

Compassionate Conservatives Are Confusing a Slogan With an Agenda. By Ben Domenech.

Compassionate Conservatives Are Confusing a Slogan With an Agenda. By Ben Domenech. The Federalist, January 10, 2014.

A Conservative Vision of Government. By Michael Gerson and Peter Wehner. NJBR, January 2, 2014.

A Yellow Light for Government. By Michael Gerson. NJBR, January 7, 2014.

Movement on the Right. By David Brooks. NJBR, January 9, 2014.


Good government inevitably becomes big government.

In the year since President Barack Obama’s re-election, a handful of advocates for compassionate conservatism have re-emerged to push back against limited government conservatives with the same agenda they’ve been peddling for nearly 15 years. Built around a message of governance in favor of the public good, they have chided the Tea Party and its limited government allies for ignoring the plight of the poor, heartlessly pursuing libertarian ends, and adopting a view of government’s proper role which is unrealistic and ahistorical.
The problem is that their own views are based on assumptions undermined by the failings of the George W. Bush presidency and by the organic growth in distrust in government among all Americans – and they fail to recognize the inherent weakness of their message, which confuses a political slogan with a coherent philosophy of governance and would allow for sweeping expansions of the state.
Former Bush speechwriters Michael Gerson and Pete Wehner have a long essay in National Affairs about conservative governance which has been getting some attention over the past few weeks. If it’s too much for you to read, you can read a shorter summary in Gerson’s Washington Post column here, which critiques “the identification of constitutionalism with an anti-government ideology” as “not only politically toxic; it is historically and philosophically mistaken.” Gerson continued on that theme in his subsequent column:
One of the main problems with an unremittingly hostile view of government — held by many associated with the tea party, libertarianism and “constitutionalism” — is that it obscures and undermines the social contributions of a truly conservative vision of government. Politics requires a guiding principle of public action.
For popular liberalism, it is often the rule of good intentions: If it sounds good, do it. Social problems can be solved by compassionate, efficient regulation and bureaucratic management — which is seldom efficient and invites unintended consequences in complex, unmanageable systems (say, the one-sixth of the U.S. economy devoted to health care). The signal light for government intervention is stuck on green. For libertarians and their ideological relatives, the guiding principle is the maximization of individual liberty. It is a theory of government consisting mainly of limits and boundaries. The light is almost always red.
Conservatism (as Peter Wehner and I explain in our recent National Affairs essay, “A Conservative Vision of Government”) offers a different principle of public action — though one a bit more difficult to explain than “go” or “stop.” In the traditional conservative view, individual liberty is ennobled and ordered within social institutions — families, religious communities, neighborhoods, voluntary associations, local governments and nations. The success of individuals is tied to the health of these institutions, which prepare people for the responsible exercise of freedom and the duties of citizenship. This is a limiting principle: Higher levels of government should show deference to private associations and local institutions. But this is also a guide to appropriate governmental action — needed when local and private institutions are enervated or insufficient in scale to achieve the public good.
The problem with Gerson’s framing here is obvious: in what way is appropriate governmental action to achieve a public good determined? If we are in an era when social institutions are in decline – partially due to government, but due as much to culture – what limits if any should expansionists recognize on the size and scope of government? This is the equivalent of the general welfare clause: If there is any limit to what can be defined as a public good, which of Michael Bloomberg’s policies would Gerson describe as unconservative? Isn’t it good for people to be healthier, even if the state is being a bit of a nanny? Were local and private institutions really dealing with those problems of too much soda and salt?
Philip Klein has more:
Throughout the piece, Gerson and Wehner make arguments that are very difficult to distinguish philosophically from liberalism. “The founders, then, provided us with a strong governing system – strong precisely because it could adapt to changing circumstances,” they write, echoing the liberal idea of a “living Constitution.” The authors also argue for a federal government “strong enough to shape global events and to guarantee a minimal provision for the poor, ill, and elderly.” Though Gerson and Wehner insist they believe in limited government, it’s hard to see what limiting principle they have in mind, as the definition of “minimal provision” could vary widely. Evidently, what philosophically separates them from liberals is a belief that the welfare state should be less centralized and technocratic.
Gerson and Wehner are not politicians, of course. But there are those who appear to be adopting their brand of reform. Senator Marco Rubio’s proposal this week for an anti-poverty reform agenda is a useful example of the problem these compassionate conservative assumptions run into when you attempt to put them into practice. While consolidation and block-granting are all well and good, Rubio doesn’t stop there:
Mr. Rubio will also propose Wednesday to replace the Earned Income Tax Credit, which was used by 28 million tax payers in 2011, with a new “wage enhancement” system that directs federal money towards supplementing the income of people who work in “qualifying low-income jobs.”
Rubio’s motivations here are noble, and almost certainly pass Gerson’s “public good” test: wage stagnation is indeed a problem, and the EITC is a warped system which has racked up a roughly 25% fraud rate over the past decade. But think for a moment about what he’s proposing here: a future of long fights over what a “qualifying low-income job” is, a definition ripe for unions to exploit under future Democratic administrations. And let’s not even get started on the audits and oversight. I thought that limited government advocates would want to get government out of businesses, not integrate it further into them. Conn Carroll explains:
All conservatives should ask themselves: Do I want to empower President Obama to decide which are the “qualifying low-wage jobs” and which are not? Is there any doubt Obama, or future liberal presidents, would use this new government program to play favorites in the market place? Would Obama or President Hillary Clinton ever give wage subsidies to coal miners? Or Americans working at an oil refinery? Of course not. How would the federal government prevent fraud and abuse without making the new program a burden on participating employers? Instead of creating a brand new government program to subsidize low paying jobs, why not just cut the payroll tax for everyone? No favoritism. No fraud. No abuse. Just make it easier for employers to hire and let Americans take home more of their money every paycheck. Why not keep it simple?
Robert Rector has some criticism of Rubio’s plan here. But the bigger issue is that Rubio is focusing on the wrong problem, as Scott Winship indicates here in a piece on another topic. Wage subsidies accept the left’s proposition that the problem is a monetary one, where just giving poor people more money to be more comfortable in their poverty is the solution. That’s the opposite of a safety net, which – if properly designed – offers peace of mind to the most vulnerable in the event of total disaster. And Rubio’s answer ignores the fact that the real problem faced by the working and middle class isn’t wage stagnation so much as government actions that have caused things like health care, education, gas and groceries to eat up a larger portion of their pocketbooks. Addressing those distortions would be far more consistent with a limited view of government’s role.
The best critique of Gerson and Wehner’s views may be this 2008 review of the former’s book, Heroic Conservatism, by John Podhoretz. In an eloquent passage, Podhoretz reveals the real failing ignored by the compassionate conservative advocates: they’re trying to turn a limited marketing slogan into a comprehensive governing philosophy.
But it is precisely the gap between the lofty principles expressed in speeches and the often compromised policies enacted by officialdom that has helped create public skepticism about the efficacy of government action to cure social ills. This skepticism vexes Gerson, but he does not offer a reasoned argument against it. He simply cautions conservatives not to be excessively fearful of the so-called “law of unintended consequences”—i.e., the possibility that government action intended to do good can have the opposite result. . .
 “Like all true conservatives,” Gerson writes, “I believe in limited government.” But there is very little in this book about limiting government’s reach and a great deal about expanding it. Gerson’s call to idealism is inspiring, especially in his chapters dealing with Bush’s campaign to combat AIDS in Africa—surely the most underappreciated initiative of this presidency and perhaps of any presidency in modern times. And his account of the thinking behind the magisterial series of addresses through which George W. Bush transformed the foreign policy of the United States after September 11 is essential reading for any student of American politics.
But it seems Gerson never really grasped the truth about compassionate conservatism. This is that it was not a party program, let alone a developed political philosophy, but a marketing gimmick. It is thus little wonder that eight years of exploring the depths and reaches of this topic have led to a very singular brand of politics. Michael Gerson’s party of heroic conservatism is, I fear, a party of one.
The right’s muddled grappling with its ongoing philosophical disagreement will continue to create tensions between a faction that believes conservatism means doing the business of compassion more efficiently in pursuit of a vaguely defined public good, and one that believes it’s more important to restrain the warping effects of government and return it to the role it occupied for most of American history, before LBJ set us on the path toward an unsustainable entitlement state . . . which was, if you think about it, entirely justified at the time if you adopted Gerson’s approach.
Here’s a hint: If your approach to conservative governance would justify the Great Society, it’s usually a sign you took a wrong turn somewhere. Maybe because the lights were all green.

The Realist Prism: As Mideast Unravels, Time to Reconsider “Soft Partitions.” By Nikolas Gvosdev.

The Realist Prism: As Mideast Unravels, Time to Reconsider “Soft Partitions.” By Nikolas Gvosdev. World Politics Review, January 10, 2014. Also here.


Depressing headlines from the Middle East have thrown cold water on any lingering optimism that U.S. policy objectives in the region were on track. In Iraq, Fallujah and Ramadi have been lost, at least for now, to al-Qaida-linked insurgents. The Syrian conflict has apparently transformed into a multi-sided war, increasing the likelihood that Bashar al-Assad’s regime will survive. And progress remains elusive in Afghanistan as the countdown to withdrawal continues. Not long ago there was reason for hope in all these countries. The surges in Iraq and Afghanistan were supposed to have worked, and the Arab Spring, it was hoped, would not simply topple authoritarian regimes but lay the groundwork for the emergence of secular democracies throughout the region.
I concur with my colleague Steven Metz’s observation earlier this week that the Middle East is not President Barack Obama’s to “lose.” Certainly, we should not accept any narrative that denies agency to Iraqi, Syrian or Afghan leaders and absolves them of responsibility for the errors, mistakes and blunders of the past several years. At the same time, however, American policymakers’ own preferences and beliefs shaped the policies Washington pursued.
A case in point is Vice President Joe Biden’s unheeded endorsement, as a senator in 2006, of a “soft partition” for Iraq. Such a move might have laid the basis for a more stable country over the long term, and might have then provided a more workable path forward for dealing with the current unravelings in both Syria and Afghanistan. At the time, Biden’s approach clashed with the majority view, in both Democratic and Republican circles, that the way forward was to continue to push for a strong central government empowered by an electoral system defined by “one person, one vote.”
It is unfortunate that the bitterly learned lessons of the Yugoslav wars of the 1990s were set aside because their conclusions were at odds with Americans’ preferred outcomes for the Middle East, particularly after a decade of sacrifice and loss. Had there been a greater willingness to grapple with an honest after-action report of what happened in the former Yugoslavia—and how U.S. and allied policy preferences and choices contributed to those outcomes—some of the setbacks we are now experiencing could have been avoided.
What were some of the lessons that were ignored?
The first and most important one is that you cannot have “normal” politics in a country as long as the dominant political identities are ethno-sectarian. If the prime motivator for most voters is a shared ethnic affiliation, rather than social, political and economic positions, then political life will be dominated by ethnic considerations and by intergroup polarization. In Bosnia’s elections, for instance, voters have for the most part cast ballots for parties defined by their affiliation with fellow Serbs, Muslims or Croats; in the definitive elections before the outbreak of war in the early 1990s, voters tended to cast their ballots for politicians who shared their identity rather than their political outlook. Before the war—and after the end of fighting in 1995—this voting pattern produced unstable coalitions where a great deal of effort was expended to divide up government offices and positions on an ethnically proportional basis.
A second, related lesson is that when democracy is defined in majoritarian terms and politics are run on ethnic lines, then the ethnic and religious minorities will consistently be outvoted. This will either produce the impetus for separatism—reinforcing the belief that “the system” has no place for minorities—or, depending on the demographics, an effort to forge an “alliance of the minorities” to counterbalance or even dominate the titular majority. Either way, it also tends to produce politics defined by a zero-sum mentality. The inexorable logic of the Kosovo conflict demonstrates this. Once Kosovo’s autonomous status had been revoked by Slobodan Milosevic in 1989, the Kosovar Albanians who were a majority in the province found themselves in an ostensibly centralized republic in which they would be the perpetual minority. Having fought to separate, however, the government in Pristina must now cope with the unwillingness of the Serb-majority areas of Northern Kosovo to, in turn, accept perpetual minority status in a Kosovo independent of Serbia.
The third lesson is that under such conditions it can be very difficult to generate neutral state institutions that inspire trust and confidence. Bosnia functions, in essence, on life support provided by the European Union, which may, in conditions of economic austerity, be scaled back. Macedonia, which itself experienced an abortive civil war more than a decade ago between its Slav-majority and Albanian-minority populations, must still grapple with this problem.
The U.S. initially wanted to hold Yugoslavia together, as U.S. policymakers were uncomfortable with accommodating the nationalism that was an important reason for Yugoslavia’s demise as a multiethnic federation. But Washington came to believe that successor states could be neatly devised from the wreckage. After strongly opposing a “soft partition” plan for Bosnia in 1992, the U.S. brokered a settlement—devised by Richard Holbrooke at Dayton three years later, after thousands of lives had been lost—that codified an internal partition of Bosnia between a Serb republic and a federation composed of Muslim and Croat units. Washington turned a blind eye to population exchanges, some forced, some consensual, that changed the demographics within Bosnia and Croatia; accepted that borders could be changed by shifting position on Kosovo to support complete independence; and helped to broker power-sharing agreements in Macedonia. These arrangements, imperfect as they are, have at least endured peacefully.
Today, the U.S. continues to maintain there are Syrian, Iraqi and Afghan identities that transcend and trump religious, sectarian, tribal and linguistic affiliations in those countries. The prevailing approach is that national elections, as already held in Iraq and Afghanistan, should produce governments perceived as legitimate by all sectors of the population. But as long as democratic majoritarianism is understood locally either as the dominance of the majority ethnic group or an opportunity to create an alliance of minorities, then stability is not possible through voting. Events in Iraq this past week are the latest sign of this reality.
Biden’s proposal on Iraq involved “decentralizing it, giving each ethno-religious group . . . room to run its own affairs, while leaving the central government in charge of common interests.” That plan, itself based on lessons learned from the Bosnia example, might gain new momentum as a way to hold an increasingly fragmented Iraq together. It might also serve as the basis for a political settlement in Syria and a blueprint for holding Afghanistan together as Western forces depart and the aid spigot that has funded the central government is turned off.
If Washington is serious about its postponed pivot to the Pacific, it can no longer devote as much time and attention to the Middle East. It might be time for Biden to make the case for soft partition one last time. As he noted seven years ago, “We’re going to get there either by our action or by our inaction; what we need to do is to manage this.”

Intelligent Populism vs. Mindless Progressivism. By Victor Davis Hanson.

Intelligent Populism vs. Mindless Progressivism. By Victor Davis Hanson. PJ Media, January 6, 2014. Also at VDH’s Private Papers.


New Deal Liberals Transform into the Faux Populist Radical Left
With elections looming in 2014, it is about time for Barack Obama to gear up another progressive “war” against the rich, the limb loppers, the fat cats, the tonsil pullers, the “enemies” of Latinos, the jet junketers, the women haters, and those who knew neither when to stop profiting nor how the government had really built their businesses. We shall shortly witness some of the wealthiest and most privileged of capitalist America decrying inequality and unfairness from the 18th hole in Hawaii, the Malibu gated estate, and the Beacon Hill mansion. And the faux populism will probably work, at least if 2008 and 2012 are any indications.
It is easy to chart the evolution of the wealthy progressive elite from the occasional limousine liberal of the 1950s and 1960s to the now dominant hierarchy of the Democratic Party.
The traditional Democratic boilerplate that I grew up with (as much as a ten-year-old can notice much of anything in 1963) — minimum wage, 40-hour work week, overtime pay, disability insurance, fair housing, civil rights, assistance for the needy — was mostly achieved by 1970. Equality of opportunity, however, did not translate into equality of result — given differences and imperfections in human nature.
Six instead of two children, three packs of cigarettes a day, four beers after work, two DUIs, a messy divorce, a freak accident on the job — the possibilities of either unsustainable responsibility or mishap are endless — can send one from the middle class into poverty, well beyond the powers of the most enlightened government to prevent. What is the liberal to do in those cases to ensure that we all end up the same?
Moreover, by 1995, the huge expansion of the U.S. economy, globalization, and sweeping breakthroughs in technology had radically transformed the prior idea of “poverty,” as I had remembered it in 1960 (we of the rural middle class a half-century ago all used the farm privy when outdoors around the house, and shared a party phone line with eight other families). Today’s poor struggle with drugs, crime, shattered families, and malaise, but not outdoor privies, the lack of air conditioning and heating, dusty dirt roads, or a denial of access to a phone or TV. Deprivation now is almost defined as the absence of a free electronic tablet at school.
Urban riots do not break out over bread, but more likely about the nth model of Air Jordan sneakers. When I go to a local Quest lab for a blood draw, the waiting room is full of poor who suffer terribly from diabetes and kidney failure brought about by carbohydrate- and sugar-driven obesity, not malnourishment. Too many calories are the scourge of America. There are no stormings of the local Wal-Mart to spread beans and rice around; occasional flash mobbing of electronics stores nationwide is prompted by desire for smart phones and pads.
I have seen holiday shoppers in my environs shout and push over big-screen TV sales, not rant over who gets the last ham hock at the meat counter. The knockout game is not driven by poverty, but by boredom, a poverty of the mind, and the assumption that there will be little government downside (e.g., getting caught, convicted, and sentenced to a long prison term won’t necessarily happen) or private consequence (i.e., the frail-looking metrosexual target might well pull out a .45 semi-automatic).
Once the liberal vision of legal equality of opportunity was mostly achieved, the melodrama of ensuring an equality of result ensued. Wealthy liberals, however, were not quite up to their own rhetoric, in the sense of living the life of egalitarianism, diversity, and conspicuously reduced consumption. I don’t remember any Silicon Valley grandees offering space for a few non-running Winnebagos to be parked out behind their six-car garages. (I can offer blueprints of how it is done by sending a few pictures from six or seven of my neighbors.) There are few Kias on Malibu streets. Or less dramatically, Google execs do not put their kids in Redwood City elementary schools to learn of hard knocks from the Other. Kanye West’s house has unused room for lots of homeless people. MSNBC radicals do not take the subway home to inner Harlem. Tenured Stanford faculty do not live in East Palo Alto.
The Modern Psychological Disorder of Elite Liberalism
The result of cosmic disappointment in the ability of progressive politics to correct human disparities has given birth to the modern psychological disorder of elite liberalism, which is mostly about squaring the circle of maintaining privilege while deploring inequality. Say America is unfair ten times a day, and the BMW in the garage and the new putter are no longer sins.
Barack Obama cannot finish a sentence without lamenting unfairness; but he proves to be no Jimmy Carter in scouting out the most exclusive of golf courses, and the richest of fat cats to putt with. Elizabeth Warren talks of oppressed minorities, but then invents a pseudo-Native-American identity to get a leg up on the elite competition in order to land at Harvard. The fact is that the elite who champion the poor and the poor themselves are not the players of the 1930s; the former usually make about the same amount of money and enjoy the same privileges as those they damn, while the latter have access to appurtenances and privileges denied the royalty of old.
The wealthier and more secluded an Oprah, the more desperately she searches for evidence of bias and inequality, finally reduced to the caricature of whining about racially driven poor service over a $38,000 crocodile handbag. If most in California don’t care what people do in their bedrooms, or if gays have on average higher incomes than non-gays, or if gay marriage is now de rigueur, the search for cosmic equality continues at an even brisker pace, resulting in transgendered bathrooms in the public schools (crede mihi: the ten-year-old daughters of the Yahoo elite will not encounter transgendered fifteen-year-old boys in the female Menlo School restrooms).
It is not perverse, but logical, that Obamacare architects don’t want Obamacare coverage. It is understandable that Washington young-gun liberals know exactly where DuPont Circle or Georgetown gets iffy. Modern liberalism provides the necessary mental mechanisms to ensure the enjoyment of privilege. Al Gore was the classic liberal of the age, crafting an entire green empire predicated on opposing the very values that he later embraced to become, and stay, very rich.
Where Does This All Lead?
I don’t know, but the Republicans have not been able to explain to the country either the illiberal nature of liberalism or its hypocrisies.
To win the presidency after eight years of liberal acculturation, the Republicans are going to have to nominate a man of the people, in the Reagan fashion of the wood-chopper who talks incessantly from first-hand knowledge about the common man.
An entire array of issues is going to have to be reformulated. Take illegal immigration. It is a gift for wealthy employers and La Raza elites, but anathema to entry-level laborers of all races whose wages are destroyed by off-the-books illegals. Strapped taxpayers, not big business, pay for the impact on schools, infrastructure, and the legal system when eleven million cross the border illegally.
More gun control is an elite musing: it does nothing to rid us of illegal weaponry, but much to aggravate the middle-class hunter and homeowner, while exempting the well-armed security details of the elite class.
Zero interest rates have made banks flush with cash that they pay no interest on, while hardly reducing credit card interest rates and ensuring that play-by-the-rules middle class passbook savers lose money.
Quantitative easing and the Federal Reserve have ensured a rush to the stock market that booms while unemployment remains high. The world of Larry Summers, Jack Lew, or Peter Orszag benefits — not the retired teacher with his life savings earning nothing. The farm bill is still a giveaway to wealthy agribusiness at a time of record farm prices, predicated on the quid-pro-quo notion that 70% of the legislation’s funds will go to food stamps. But again, why let the progressive mind feed on the carcass of the old, easily caricatured Republican wealthy?
It is past time to forge a new populist approach without the theatrics of shutting down the government or playing on the same keyboards as Pajama Boy Obamacare spivs. The liberal elite runs the culture, from universities and entertainment to government bureaucracies and the media, but it is nonetheless predicated on loudly condemning in the abstract the very creed that it embraces in the concrete.
In response, we need to expose their hypocrisies and start worrying about a shrinking middle class that has been damaged by Obama-era elites. Almost every issue from fracking, gun control, illegal immigration, and quantitative easing to Obamacare and cap and trade invites a populist critique. Yet so far the Republican establishment seems uninterested in making the case that the Democratic hierarchy is of, by, and for the elite. Liberals are funded and represented by the privileged in Wall Street, universities, entertainment, the media, politics, foundations, the arts, and government, and by the inherited wealthy. They have all set the agenda for the nation, called it progressive, and then sought exemption by seeking more taxpayer money for entitlements to ensure the fealty of the poor.
Liberalism is not progressivism, but instead pull-up-the-ladder-after-me regressivism — and someone with some imagination and worldly experience needs to say that. When multimillionaire Trimalchios like Bill and Hillary Clinton, the inherited rich like John Kerry, or insider hucksters like Al Gore are called progressives, we all are in sad shape.

The Outlaw Campus. By Victor Davis Hanson.

The Outlaw Campus. By Victor Davis Hanson. National Review Online, January 7, 2014. Also at VDH’s Private Papers.


Two factors have so far shielded the American university from the sort of criticism that it so freely levels against almost every other institution in American life. (1) For decades a college education has been considered the key to an ascendant middle-class existence. (2) Until recently a college degree was not tantamount to lifelong debt. In other words, American society put up with a lot of arcane things from academia, given that it offered something — a BA or BS degree — that almost everyone agreed was a ticket to personal security and an educated populace.
Not now. Colleges have gone rogue and become virtual outlaw institutions. Graduates owe an aggregate of $1 trillion in student debt, borrowed at interest rates far above home-mortgage rates — all on the principle that universities could charge as much as they liked, given that students could borrow as much as they needed in federally guaranteed loans.
Few graduates have the ability to pay back the principal; they are simply paying the compounded interest. More importantly, a college degree is no longer a sure pathway to a good job, nor does it guarantee that its holder is better educated than those without it. If the best sinecure in America is a tenured full professorship, the worst fate may be that of a recent graduate in anthropology with a $100,000 loan. That the two are co-dependent is a national scandal.
In short, the university has abjectly defaulted on its side of the social contract by no longer providing an affordable and valuable degree. Accordingly, society can no longer grant it an exemption from scrutiny.
Here are ten areas that need radical reform.
1. Tenure. Few if any other professions — not law, medicine, finance, engineering, etc. — offer guaranteed lifetime employment after a six-year apprenticeship. Tenure was predicated on a simple premise: The protection of faculty free speech and instruction was worth the possible downside of complacency and an absence of serious ongoing faculty audit. Whatever may once have been the case, in our time tenure does not ensure free expression, but instead a banal orthodoxy, in which 90 percent of the faculty in the humanities share the same progressive outlook. Tenure also created a caste system far more rigid than anything found in private enterprise, while a huge permanent faculty class ensured inflexibility in scheduling and budgeting. The associate or full professor enjoyed a lifelong right of selection of his classes without too much worry over whether they were either needed or taught well. Worse, the nontenured faculty member, in the fashion of the Middle Ages, was admitted to the guild only if his tenured peers believed that he was agreeable in politics and attitude. He was usually judged by teaching and publication criteria that did not necessarily apply to his board of overseers, many of whom had achieved tenure 20 years earlier under entirely different criteria.
2. Faculty exploitation. The abuse of lecturers, part-timers, and graduate students is institutionalized. In a word, the university is the most exploitative institution operating at present in the United States, protected by the notion that it is progressive and that its protocols cannot possibly be understood by the ordinary public. Temporary and adjunct faculty members often have degrees as good as those of their tenured betters. Often their teaching records and publications are comparable, if not superior. They may teach the same classes as permanent faculty do, and yet often receive about half the compensation. Were Wal-Mart or a coal mine to operate under such protocols, it would earn Labor Department sanctions. At some public universities, nearly half of the curriculum is taught by part-time faculty — in effect a subsidy that allows the tenured caste to teach smaller and less-in-demand classes, where less time is needed for preparation and grading. Worse still, universities knowingly turn out too many PhDs in the humanities, which ensures a glut of job applicants, which, again, ensures a continued supply of cheap temps to sustain tenured privilege.
3. Curriculum. Tenure and abuse of part-timers are partly a result of a faculty governance that determines the curriculum, especially the general-education core, on the basis of politics and ease of teaching. Somewhere around 1980, a new generation of faculty created a whole new curriculum with the suffix “studies.” The result was advocacy, not disinterested empiricism. Nationwide, thousands of traditional classes in history, philosophy, literature, and the social sciences gave way to ethnic studies, women’s studies, leisure studies, gender studies, peace studies, environmental studies, etc. Students did not receive the same degree of writing and reasoning preparation as in the old classes, much less the factual foundations of a liberal education. It was also nearly impossible to do well in these courses for a student who disagreed with the political assumptions of the advocate faculty. “Studies” contributed in no small part to the unfortunate emergence of the arrogant and ignorant graduate, who left the campus zealous for social change but sadly without the skills to even articulate his goals.
4. Admissions. University lawyers and sociologists are quick to issue papers deploring the hiring policies of private enterprise and government; yet, oddly, no one really knows the criteria by which students are selected for admission. No university publishes the percentage of students who are admitted not on merit but on the basis of athletics, legacies, cash benefactions, race, and gender. “Diversity” became the successor to affirmative action, once the latter’s rules and guidelines became impossible to define, much less to defend. Worse still, even within these rubrics there is no transparency: What size of gift leverages a B+ student into Harvard? Does someone from the Punjab qualify for diversity consideration in the same way a third-generation, one-quarter Latino might? Only the university could have allowed an Elizabeth Warren to invent an entirely fictitious minority pedigree and parlay it without audit all the way to Harvard. If there were not a Ward Churchill, he would have to be invented. That no one will ever know exactly on what criteria the president of the United States was admitted to Columbia College or Harvard Law School is a testament to the secrecy and mystery of the university guild that has such intrusive interest in the less-than-transparent workings of other institutions.
5. Administration. Much of the recent explosion in annual costs is due to administrative bloat — special assistants to this and deputy associates of that. Left unspoken is that many of these trumped-up six-figure positions are to promote “diversity” and “technology” that have little to do with mastery of reasoning, prose, and scientific knowledge. Most administrative jobs require less formal expertise than does a faculty position, and it is generally recognized that full professors who take on administrative positions are sometimes welcomed out of the classroom given their poor teaching and research records. But why should those who dreamed up exploitative part-time teaching positions be exempt from their own logic? Private enterprise could supply all sorts of part-time administrative clerks to the university at a fraction of the present in-house costs. If a PhD in French can be hired as a lecturer for $800 a month, surely the Associate Provost for Diversity Affairs can be part-timed and outsourced for $600?
6. The credential. The schools of education have grown enormously on the strength of their monopoly over credentialing, the requisite two-year supplemental training in “education” that supposedly teaches graduates how to teach in the public schools. But credentialing programs have grown less academic and far more partisan in focusing on how to look at the world through race, class, and gender lenses. The solution would be to give every postgraduate the choice of either obtaining the teaching credential or receiving an MA degree in an academic subject. Most recent graduates would rather have two years of extended historical or mathematical study than the therapeutics of the credentialing degree — and their future students would be far better off as well. If the schools of education did not have a monopoly over credentialing, they would quickly dissolve, given that their product has made the public schools far less credible.
7. National competency testing. Lawyers and doctors have to pass state or national exams to practice their craft. So do veterinarians, electricians, and general contractors. Society’s assumption apparently is that one’s professional training alone is not sufficient proof of competency. Prospective faculty members should also be required to take a general test in their field to ensure competency. Sadly, a PhD in history is no proof these days that the recipient can distinguish the Battle of Shiloh from the Battle of Waterloo, the Enlightenment from the Renaissance, or a Doric from an Ionic column. In addition, to receive the bachelor’s degree, graduating seniors should be required to take a national competency test in general education — something open as well to non-college students who wish to earn the BA or BS degree by examination. This idea of a national audit remains anathema to universities, because there is no proof that the graduates of our most prestigious schools would do any better than those of state colleges — or than autodidacts or the homeschooled.
8. Budget. Since university costs have gone up over 7 percent annually on average for the last two decades, it is past time for transparency, especially given the infusion of state and federal subsidies. How strange that universities will publish statistical data on almost every facet of American life — from racial matters to the environment — but not provide the public with a detailed breakdown of their own expenditures to allow students and their parents to understand why their tuition is priced as it is. Students should have the choice of deciding whether they wish to attend a college that budgets for rock-climbing walls, an Assistant Dean of Internet Technology, or visits by a Michael Moore or John Edwards, at thousands of dollars per campus rant.
9. Publication. Expensive university presses arose to ensure that quality research would be disseminated without regard to its market value. The costs of a university-press book were absorbed to ensure that first-rate research did not have to depend on Book-of-the-Month Club sales potential to see the light of day. Not now. The Internet allows such information to be cheaply accessible. Faculty publications could easily be downloaded without an expensive hard-copy version. Moreover, digitizing would allow transparency about the degree to which such publications are read by peers in the field. The old notion that a peer-reviewed article in a particular journal or a university-press monograph is the key to tenure has become antiquated in the age of the World Wide Web and the ubiquitous electronic audit of just about everything we do. Faculty are terrified of a future where one’s life’s work can be instantly accessed, and where its usefulness can be assessed by the number of scholars who consult it, footnote it, or buy it.
10. Legal exemption. Entering a campus should not mean sacrificing constitutional protections. Yet the rights of the accused are often subordinate to campus speech codes and protocols dealing with supposed sexual and racial insensitivities. The same exemption is often extended to campus violence, especially disruptions of inconvenient speech. Universities should not be allowed to construct their own bill of rights that supersedes that of the federal government, any more than private enterprises can concoct their own laws and regulations that trump those outside their plant or office. State and federal funding to colleges should be predicated on full compliance with current state and federal laws.
In sum, we have allowed the university to become a rogue institution, whose protocols are often at odds with normal practice off campus and secretive to a degree unknown elsewhere.
The common theme of all university reform should be transparency. Faculties are superb self-appointed auditors of others; it is time we extended the same audit to them.