Saturday, January 18, 2014

Colbie Caillat: Hold On.

Colbie Caillat: Hold On. Video. ColbieCaillatVEVO, January 15, 2014. YouTube.



Israel/Palestine: Who’s Indigenous? By Ryan Bellerose.

Israel/Palestine: Who’s Indigenous? By Ryan Bellerose. Israellycool, January 9, 2014.

Bellerose:

I am a Métis from the Paddle Prairie Metis Settlement. My father, Mervin Bellerose, co-authored the Métis Settlements Act of 1989, which was passed by the Alberta legislature in 1990 and cemented our land rights. I founded Canadians For Accountability, a native rights advocacy group, and I am an organizer and participant in the Idle No More movement in Calgary. And I am a Zionist.
 
Indigenous status
 
To begin, let us acknowledge that there is no rule that a land can have only one indigenous people; it is not a zero-sum game in which one group must be considered indigenous so that another is not. However, there is a very clear guideline for being an indigenous people. It is somewhat complex but can be boiled down to the checklist below, developed by the anthropologist José R. Martínez-Cobo (former Special Rapporteur of the United Nations Sub-Commission on Prevention of Discrimination and Protection of Minorities).
 
This list was developed because indigenous rights are beginning to be respected across the planet. This recognition is incredibly important, so we as indigenous people cannot allow non-indigenous people to make false claims, which ultimately would harm our own rights. Israel is the world’s first modern indigenous state: the creation and declaration of the sovereign nation of Israel marks the first time in history that an indigenous people has managed to regain control of its ancestral lands and build a nation state. As such, this is incredibly important for indigenous people both to recognise and to support as a great example for our peoples to emulate.
 
The working definition of “indigenous people” used in this essay (not the Wikipedia version, nor Merriam-Webster’s, both more suited to plants and animals) is the one developed by the aforementioned José R. Martínez-Cobo. With this as my foundation, I will detail why Jews are indigenous to Israel, and why Palestinians are not.
 
Martínez-Cobo’s research suggests that indigenous communities, peoples and nations are those which, having a historical continuity with pre-invasion and pre-colonial societies that developed on their territories, consider themselves distinct from other sectors of the societies now prevailing on those territories, or parts of them. They form at present non-dominant sectors of society and are determined to preserve, develop and transmit to future generations their ancestral territories, and their ethnic identity, as the basis of their continued existence as peoples, in accordance with their own cultural patterns, social institutions and legal system.
 
This historical continuity may consist of the continuation, for an extended period reaching into the present, of one or more of the following factors:
Occupation of ancestral lands, or at least of part of them.
 
Common ancestry with the original occupants of these lands.
 
Culture in general, or in specific manifestations (such as religion, living under a tribal system, membership of an indigenous community, dress, means of livelihood, lifestyle, etc.).
 
Language (whether used as the only language, as mother-tongue, as the habitual means of communication at home or in the family, or as the main, preferred, habitual, general or normal language).
 
Residence in certain parts of the country, or in certain regions of the world.
 
Religion that places importance on spiritual ties to the ancestral lands.
 
Blood quantum – that is, the amount of blood you carry of a specific people to identify as that people. The concept was developed by colonialists in order to eventually breed out native peoples.

Let us now look quickly at the Jews. How do they fit this definition?

Their lands were occupied, first by the Romans, then by the Arabs in the seventh century.
 
They share common ancestry with previous occupants as determined by several genetic studies.
 
Their culture can be traced directly to the Levant, where it developed into what is now known as “Jewish culture.” While different Jewish communities have slightly different traditions, they all share the same root culture, and it remains unchanged. They have resurrected their traditional language, and while many still speak Yiddish and Ladino, Hebrew has become the primary language again.
 
They have spiritual ties to the land, which plays a large role in their traditions as a people.

Despite all the arguments about “European” Jews, they in fact meet all the criteria set forth by Martínez-Cobo. Even though Israel is the first modern indigenous state, it still has lands in Judea and Samaria that are occupied by foreigners. Those are ancestral lands, and many feel that they should be returned to the indigenous peoples for self-determination.
 
Now, for the flip side.
 
Palestinians have what are called “rights of longstanding presence,” and although these are legitimate rights, they do not trump indigenous rights. The very nature of “longstanding presence” means that although they have lived somewhere a long time, they do not thereby gain the right to occupy and control indigenous peoples.
 
The argument that Palestinians are indigenous is incorrect for several reasons.

Only about 50 percent of Palestinian Arabs can track their ancestors back farther than their great-grandparents. Many are descended from Arabs brought to the Levant by the British to build infrastructure after World War I.
 
The vast majority of Palestinians are Arabic-speaking Muslims; the Arabic language is indigenous to the Arabian Peninsula, as is the Muslim religion. The Muslim religion’s holiest places are not in the Levant but in the city of Mecca, on the Arabian Peninsula. There is no specifically Palestinian culture dating from before the 1960s; in fact, prior to that, the majority identified as “Greater Syrians.”
 
Some Palestinians share common ancestry with indigenous peoples, but they neither follow indigenous traditions nor do they self-identify as those indigenous peoples. They share neither religion nor language with them. Blood quantum alone is insufficient to transmit indigenous status.
 
The Arabs of the Middle East subsumed several indigenous populations, but no group becomes indigenous by subsuming indigenous peoples. Rather, they conquered the entire region and spread their own language, customs, and religion. This is historical fact.

Now you might ask, why is this important? It is important to indigenous people because we cannot allow the argument that conquerors can become indigenous. If we, as indigenous people, allow that argument to be made, then we are delegitimising our own rights.
 
If conquerors can become indigenous, then the white Europeans who came to my indigenous lands in North America could now claim to be indigenous. The white Europeans who went to Australia and New Zealand could now claim to be indigenous. If we, even once, allow that argument to be made, indigenous rights are suddenly devalued and meaningless. This is somewhat peculiar, as those who are arguing for Palestinian “indigenous rights” are usually those who have little grasp of the history, and no understanding of the truth behind indigenous rights.
 
If you should encounter the argument that conquerors may themselves become indigenous to a region by virtue of conquering it, direct those who assert it to this article, and help them understand that the argument is not only wrong, it is dangerous to indigenous people everywhere.

The Death of Expertise. By Tom Nichols.

The Death of Expertise. By Tom Nichols. The Federalist, January 17, 2014.

When Ignorance Begets Confidence: The Classic Dunning-Kruger Effect. By Daniel R. Hawes. Psychology Today, June 6, 2010.

Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments. By Justin Kruger and David Dunning. Journal of Personality and Social Psychology, Vol. 77, No. 6 (December 1999).

Why People Fail to Recognize Their Own Incompetence. By David Dunning, Kerri Johnson, Joyce Ehrlinger, and Justin Kruger. Current Directions in Psychological Science, Vol. 12, No. 3 (June 2003).


Nichols:

I am (or at least think I am) an expert. Not on everything, but in a particular area of human knowledge, specifically social science and public policy. When I say something on those subjects, I expect that my opinion holds more weight than that of most other people.
 
I never thought those were particularly controversial statements. As it turns out, they’re plenty controversial. Today, any assertion of expertise produces an explosion of anger from certain quarters of the American public, who immediately complain that such claims are nothing more than fallacious “appeals to authority,” sure signs of dreadful “elitism,” and an obvious effort to use credentials to stifle the dialogue required by a “real” democracy.
 
But democracy, as I wrote in an essay about C.S. Lewis and the Snowden affair, denotes a system of government, not an actual state of equality. It means that we enjoy equal rights versus the government, and in relation to each other. Having equal rights does not mean having equal talents, equal abilities, or equal knowledge. It assuredly does not mean that “everyone’s opinion about anything is as good as anyone else’s.” And yet, this is now enshrined as the credo of a fair number of people despite being obvious nonsense.
 
What’s going on here?
 
I fear we are witnessing the “death of expertise”: a Google-fueled, Wikipedia-based, blog-sodden collapse of any division between professionals and laymen, students and teachers, knowers and wonderers – in other words, between those of any achievement in an area and those with none at all. By this, I do not mean the death of actual expertise, the knowledge of specific things that sets some people apart from others in various areas. There will always be doctors, lawyers, engineers, and other specialists in various fields. Rather, what I fear has died is any acknowledgement of expertise as anything that should alter our thoughts or change the way we live.
 
This is a very bad thing. Yes, it’s true that experts can make mistakes, as disasters from thalidomide to the Challenger explosion tragically remind us. But mostly, experts have a pretty good batting average compared to laymen: doctors, whatever their errors, seem to do better with most illnesses than faith healers or your Aunt Ginny and her special chicken gut poultice. To reject the notion of expertise, and to replace it with a sanctimonious insistence that every person has a right to his or her own opinion, is silly.
 
Worse, it’s dangerous. The death of expertise is a rejection not only of knowledge, but of the ways in which we gain knowledge and learn about things. Fundamentally, it’s a rejection of science and rationality, which are the foundations of Western civilization itself. Yes, I said “Western civilization”: that paternalistic, racist, ethnocentric approach to knowledge that created the nuclear bomb, the Edsel, and New Coke, but which also keeps diabetics alive, lands mammoth airliners in the dark, and writes documents like the Charter of the United Nations.
 
This isn’t just about politics, which would be bad enough. No, it’s worse than that: the perverse effect of the death of expertise is that without real experts, everyone is an expert on everything. To take but one horrifying example, we live today in an advanced post-industrial country that is now fighting a resurgence of whooping cough — a scourge nearly eliminated a century ago — merely because otherwise intelligent people have been second-guessing their doctors and refusing to vaccinate their kids after reading stuff written by people who know exactly zip about medicine. (Yes, I mean people like Jenny McCarthy.)
 
In politics, too, the problem has reached ridiculous proportions. People in political debates no longer distinguish the phrase “you’re wrong” from the phrase “you’re stupid.” To disagree is to insult. To correct another is to be a hater. And to refuse to acknowledge alternative views, no matter how fantastic or inane, is to be closed-minded.
 
How conversation became exhausting
 
Critics might dismiss all this by saying that everyone has a right to participate in the public sphere. That’s true. But every discussion must take place within limits and above a certain baseline of competence. And competence is sorely lacking in the public arena. People with strong views on going to war in other countries can barely find their own nation on a map; people who want to punish Congress for this or that law can’t name their own member of the House.
 
None of this ignorance stops people from arguing as though they are research scientists. Tackle a complex policy issue with a layman today, and you will get snippy and sophistic demands to show ever-increasing amounts of “proof” or “evidence” for your case, even though the ordinary interlocutor in such debates isn’t really equipped to decide what constitutes “evidence” or to know it when it’s presented. The use of evidence is a specialized form of knowledge that takes a long time to learn, which is why articles and books are subjected to “peer review” and not to “everyone review,” but don’t tell that to someone hectoring you about how things really work in Moscow or Beijing or Washington.
 
This subverts any real hope of a conversation, because it is simply exhausting — at least speaking from my perspective as the policy expert in most of these discussions — to have to start from the very beginning of every argument and establish the merest baseline of knowledge, and then constantly to have to negotiate the rules of logical argument. (Most people I encounter, for example, have no idea what a non-sequitur is, or when they’re using one; nor do they understand the difference between generalizations and stereotypes.) Most people are already huffy and offended before ever encountering the substance of the issue at hand.
 
Once upon a time — way back in the Dark Ages before the 2000s — people seemed to understand, in a general way, the difference between experts and laymen. There was a clear demarcation in political food fights, as objections and dissent among experts came from their peers — that is, from people equipped with similar knowledge. The public, largely, were spectators.
 
This was both good and bad. While it strained out the kook factor in discussions (editors controlled their letters pages, which today would be called “moderating”), it also meant that sometimes public policy debate was too esoteric, conducted less for public enlightenment and more as just so much dueling jargon between experts.
 
No one — not me, anyway — wants to return to those days. I like the 21st century, and I like the democratization of knowledge and the wider circle of public participation. That greater participation, however, is endangered by the utterly illogical insistence that every opinion should have equal weight, because people like me, sooner or later, are forced to tune out people who insist that we’re all starting from intellectual scratch. (Spoiler: We’re not.) And if that happens, experts will go back to only talking to each other. And that’s bad for democracy.
 
The downside of no gatekeepers
 
How did this peevishness about expertise come about, and how can it have gotten so immensely foolish?
 
Some of it is purely due to the globalization of communication. There are no longer any gatekeepers: the journals and op-ed pages that were once strictly edited have been drowned under the weight of self-publishable blogs. There was once a time when participation in public debate, even in the pages of the local newspaper, required submission of a letter or an article, and that submission had to be written intelligently, pass editorial review, and stand with the author’s name attached. Even then, it was a big deal to get a letter in a major newspaper.
 
Now, anyone can bum rush the comments section of any major publication. Sometimes, that results in a free-for-all that spurs better thinking. Most of the time, however, it means that anyone can post anything they want, under any anonymous cover, and never have to defend their views or get called out for being wrong.
 
Another reason for the collapse of expertise lies not with the global commons but with the increasingly partisan nature of U.S. political campaigns. There was once a time when presidents would win elections and then scour universities and think-tanks for a brain trust; that’s how Henry Kissinger, Samuel Huntington, Zbigniew Brzezinski and others ended up in government service while moving between places like Harvard and Columbia.
 
Those days are gone. To be sure, some of the blame rests with the increasing irrelevance of overly narrow research in the social sciences. But it is also because the primary requisite of seniority in the policy world is too often an answer to the question: “What did you do during the campaign?” This is the code of the samurai, not the intellectual, and it privileges the campaign loyalist over the expert.
 
I have a hard time, for example, imagining that I would be called to Washington today in the way I was back in 1990, when the senior Senator from Pennsylvania asked a former U.S. Ambassador to the UN whom she might recommend to advise him on foreign affairs, and she gave him my name. Despite the fact that I had no connection to Pennsylvania and had never worked on his campaigns, he called me at the campus where I was teaching, and later invited me to join his personal staff.
 
Universities, without doubt, have to own some of this mess. The idea of telling students that professors run the show and know better than they do strikes many students as something like uppity lip from the help, and so many profs don’t do it. (One of the greatest teachers I ever had, James Schall, wrote many years ago that “students have obligations to teachers,” including “trust, docility, effort, and thinking,” an assertion that would produce howls of outrage from the entitled generations roaming campuses today.) As a result, many academic departments are boutiques, in which the professors are expected to be something like intellectual valets. This produces nothing but a delusion of intellectual adequacy in children who should be instructed, not catered to.
 
The confidence of the dumb
 
There’s also that immutable problem known as “human nature.” It has a name now: it’s called the Dunning-Kruger effect, which says, in sum, that the dumber you are, the more confident you are that you’re not actually dumb. And when you get invested in being aggressively dumb . . . well, the last thing you want to encounter is an expert who disagrees with you, and so you dismiss them in order to maintain your unreasonably high opinion of yourself. (There’s a lot of that loose on social media, especially.)
 
All of these are symptoms of the same disease: a manic reinterpretation of “democracy” in which everyone must have their say, and no one must be “disrespected.” (The verb to disrespect is one of the most obnoxious and insidious innovations in our language in years, because it really means “to fail to pay me the impossibly high requirement of respect I demand.”) This yearning for respect and equality, even—perhaps especially—if unearned, is so intense that it brooks no disagreement. It represents the full flowering of a therapeutic culture where self-esteem, not achievement, is the ultimate human value, and it’s making us all dumber by the day.
 
Thus, at least some of the people who reject expertise are not really, as they often claim, showing their independence of thought. They are instead rejecting anything that might stir a gnawing insecurity that their own opinion might not be worth all that much.
 
Experts: the servants, not masters, of a democracy
 
So what can we do? Not much, sadly, since this is a cultural and generational issue that will take a long time to come right, if it ever does. Personally, I don’t think technocrats and intellectuals should rule the world: we had quite enough of that in the late 20th century, thank you, and it should be clear now that intellectualism makes for lousy policy without some sort of political common sense. Indeed, in an ideal world, experts are the servants, not the masters, of a democracy.
 
But when citizens forgo their basic obligation to learn enough to actually govern themselves, and instead remain stubbornly imprisoned by their fragile egos and caged by their own sense of entitlement, experts will end up running things by default. That’s a terrible outcome for everyone.
 
Expertise is necessary, and it’s not going away. Unless we return it to a healthy role in public policy, we’re going to have stupider and less productive arguments every day. So here, presented without modesty or political sensitivity, are some things to think about when engaging with experts in their area of specialization.
 
1. We can all stipulate: the expert isn’t always right.
 
2. But an expert is far more likely to be right than you are. On a question of factual interpretation or evaluation, it shouldn’t engender insecurity or anxiety to think that an expert’s view is likely to be better-informed than yours. (Because, likely, it is.)
 
3. Experts come in many flavors. Education is one path to expertise, but practitioners in a field also acquire it through experience; usually the combination of the two is the mark of a true expert. But if you have neither education nor experience, you might want to consider exactly what it is you’re bringing to the argument.
 
4. In any discussion, you have a positive obligation to learn at least enough to make the conversation possible. The University of Google doesn’t count. Remember: having a strong opinion about something isn’t the same as knowing something.
 
5. And yes, your political opinions have value. Of course they do: you’re a member of a democracy and what you want is as important as what any other voter wants. As a layman, however, your political analysis has far less value, and probably isn’t — indeed, almost certainly isn’t — as good as you think it is.
 
And how do I know all this? Just who do I think I am?
 
Well, of course: I’m an expert.


The Impossible Standard. By Steve Apfel.

The Impossible Standard. By Steve Apfel. Jerusalem Post, January 8, 2014.