Authored by Roosevelt Montás, a senior lecturer in American Studies at Columbia University, this work answers the question “what is the value of a traditional liberal education?” The full title is Rescuing Socrates: How the Great Books Changed My Life and Why They Matter for a New Generation. I am not a fan of putting the full thesis of a work in its subtitle, but then again, no one can complain they were misled about what they were about to read!
Montás is a capable writer, and this work is part autobiography and part argument for a traditional liberal education. By that he means an undergraduate curriculum focused on intense study of the classics of Western philosophy and literature. This is the program he went through as an undergraduate, and it deeply affected him, so much so he dedicated his teaching career to carrying on the tradition.
Thus his arguments do not stem from a neutral perspective, nor does he try to marshal an imposing array of data to support his points. This is no slight against his work, which remains powerful on the strength of his personal experience. Through a variety of helpful government, charitable, and fraternal organizations, he moved from a tiny village in the Dominican Republic to New York City, where he eventually landed a scholarship to Columbia and engaged its Core Curriculum, one of the few such programs in the country.
Montás begins by describing the original intent of a university education, an intent long lost in modern America: not to earn a good living, but to lead a good life. Working skills were taught by the trades, and were (and still are) an effective means to earn a good living. But what constitutes a life well-lived? Such a challenge requires serious engagement with “what is good?” and “how would we know?” While the classics of Western thought may be of little value in programming the next immersive digital experience, they are critical to those looking for answers to those nagging “why?” questions. Montás believes everyone–even students looking for a solid STEM education–would benefit from an engagement with the classics. A point I would add in his favor: if a person like Facebook founder Mark Zuckerberg had spent more time engaging the classics at Harvard, he might not have made the naive mistake of thinking his social media project would only be used for good purposes!
Montás eschews the typical partisan battles about “dead white males” or “what constitutes the Western Canon” by stating that of course the material can be enlarged, but whatever is chosen must stand the test of quality and the test of time. For this work, he highlights Plato, St. Augustine, Freud, and Gandhi, winding together a short synopsis of what makes these thinkers “great” with how they affected him. He stress-tests his own argument by including Freud and Gandhi: the former is largely discredited yet retains a profound influence on society; the latter is modern and non-“Western.” Of the four parts, I found the one on Freud the least interesting, since it leads into Montás’s personal experience with psychoanalysis, which to me is as useful as astrology (Hey, I’m a Libra!).
I came to this book as a true believer, already convinced of the value of such an education. University curricula today have become à la carte offerings that leave dedicated students with marketable skills but little balance, little depth, and little ability to argue persuasively. One only need look at social media to see the paucity of serious engagement. I hoped for a juggernaut of an argument, but Montás presents more of a personal plea, citing his own rise. This is effective, even if I wanted more.
The book is an easy read, and provides nice summaries of the four great thinkers for those who may have forgotten–or never had the chance to engage with–them. I sincerely hope more parents encourage students to demand a rigorous core curriculum in place of the thin gruel offered today. Some will benefit greatly, and all will benefit some. The rest of us will benefit from a more intelligent discourse!
Forget the Alamo is a new work by Bryan Burrough, a onetime Wall Street Journal reporter who now writes at times for Vanity Fair. He has written several well-received books, including Barbarians at the Gate (about the takeover of RJR Nabisco) and Public Enemies (about the birth of the FBI). This work takes on the legendary siege at the Misión San Antonio de Valero, a staple of history in Texas, where he grew up. His co-authors are Chris Tomlinson and Jason Stanford. I will refer to them collectively as Burrough.
Burrough admits early on, and repeats in summary, that this is not strictly speaking a work of history. He calls it historiography: a history of the history of how various groups have used the events of the siege to their own ends. Needless to say (based on the title), he believes most of those who cherish the legend are wrong, either in what they believe (the defenders were heroic, they were fighting for freedom, they died to a man while fighting) or what they believe in (Anglo superiority, anti-Mexican hate, general hatred of immigrants, macho sexism, or simple racism).
Burrough writes with a clean style and has ample footnotes to work by historians. I found his attempts at humor (introducing today’s jargon, for example) off-putting, but some may enjoy them. His storytelling skills remain strong, and while you never doubt where his sympathies lie, he keeps the narration interesting.
The main challenge this book presents is Burrough’s contention that we should reject the dominant narrative–and he readily admits this narrative is dominant–of heroic Alamo fighters dying in defense of freedom. He bases this claim on a number of supporting points. First, the Anglo settlers in Mexican Tejas were mainly interested in cotton-growing and slavery, and that was the purpose of the secession movement from Mexico. Second, the defense of the Alamo was a colossal military blunder by Colonel Bowie and Sam Houston, so the siege never should have happened. Third, it is uncertain whether the defenders died heroically: some may have been captured, others may have been cut down running away outside the walls. Fourth, General Santa Anna and the Mexican Army have become the bogeymen of the drama. Finally, various groups (e.g., the KKK, Confederates, bigots and racists) have used the Alamo legend to their own unfortunate ends. Let’s deal with each in turn, from last to first.
Obviously, the fact that some malign groups misuse the Alamo legend is no reason to forget what happened there. The most powerful piece of evidence Burrough presents is testimony by many Texas Latinos who grew up feeling as American as apple pie until they took the school-sponsored, mandatory, seventh-grade field trip to the Alamo, where they saw themselves portrayed as the enemy. No one should doubt their stories, and there is ample reason to tell the entire Alamo story, not just the part with heroic white guys dying at the hands of evil brown guys. Yet Burrough must know–in fact, he even explains–why this happened. The two groups of settlers in Mexican Tejas were long-standing Tejanos (of Spanish/Mexican descent) and Anglo newcomers called either Texians or Texicans. The Tejanos wanted Tejas to provoke a change in the Mexican federal government toward greater federalism, while the Anglos wanted independence from Mexico. The Texians ended up winning, and the Tejanos were relegated to a side-story in the history of what became American Texas. Their story deserves to be told; some Tejanos died at the Alamo. As Churchill reportedly said, however, “History is written by the victors.”
As to General Antonio López de Santa Anna, little needs to be done to confirm his role as a very bad person. He was a slippery leader who moved seamlessly from side to side in the Mexican War of Independence and had a preternatural ability to recover from battlefield losses. He fought against the American government, sought its assistance, sold territory to it, and ceded all of California, New Mexico, and Arizona after losing the Mexican-American War in 1848. His behavior on the battlefield in the Tejas campaign was both pitiful and dreadful (more on this later). As a braggart and a vicious bully, he is practically typecast. Even Mexicans consider him “one who failed the nation.”
Another argument Burrough presents is based on more recent evidence that the Anglo defenders did not all die where they stood: some were captured, others apparently sortied (left the fort) and were run down by Mexican cavalry. The new accounts come from Mexican military sources, which could present a bias, but let’s accept them at face value. Burrough’s analysis of these sources evinces a lack of military experience. He cites evidence of Davy Crockett (and a few others) being summarily executed after the battle (at the command of Santa Anna) as proof they did not die heroically. But a soldier may be captured without surrendering. Ammunition spent, surrounded by a mass of enemy soldiers, you can swing your rifle and still be subdued. It has no bearing upon your heroism. Likewise, Burrough assumes the men who sortied were running away. With the fort overrun and again out of ammunition, it is natural to strike out away from the enemy. People jumped from the top of the World Trade Center when presented with no other choice; it was not a matter of dying without honor.
The argument over whether the Alamo battle was a blunder is also misguided. True, Sam Houston sent Colonel Jim Bowie to the site to reconnoiter and, if necessary, destroy the fort. But Bowie recognized it was perhaps the only defensible site between where Santa Anna was marshaling his forces and the Brazos River, where Houston planned to marshal his. The Alamo was not much of a fort; Travis had only two hundred troops, and he would have needed more like one thousand to adequately defend it. However, having made the decision to stay, he counted on being reinforced by the 400 men in Goliad and more from Sam Houston. Neither arrived. Sam Houston thought Bowie was being hysterical, and Colonel James Fannin at Goliad dithered, only to later surrender to Santa Anna and have all his men massacred. In the end, Travis got only thirty volunteers from nearby Gonzales. They arrived after the siege began, demonstrating that the Alamo defenders could still have withdrawn after Santa Anna arrived and laid siege; that they didn’t stands as testament in their favor. Travis probably did not count on causing 600 Mexican casualties and slowing Santa Anna’s advance by two weeks; those were accidents of battle. But there is no way to consider the choice of battle site a colossal mistake.
Finally, Burrough makes much of the seedy personal stories of men like Crockett and Travis and Bowie, and the reliance of the Anglo settlers on slavery. Crockett was a slimy politician, Bowie a drunk and a slave trader, Travis a deadbeat. Little is made of this in histories of the battle, since it has little to do with what happened inside the walls. These men were emblematic of Texicans in general: men looking for a second chance after failing elsewhere.
Burrough believes his most damning accusation is the Texicans’ embrace of slavery. He treats this as a revelation, as if anyone even minimally aware of the history of Texas hadn’t heard it was a slave state that later fought with the Confederacy. Burrough admits the Anglos were invited to settle in Tejas, and slavery permitted, in a series of political compromises with the Mexican state and federal governments. The peace-loving Comanches had nearly depopulated the northern part of Tejas, and the government in Mexico City felt a wave of Anglo settlers might forestall more attacks and perhaps even rile the American government to eliminate the Comanches. It was only after Santa Anna reneged on these deals that the Tejanos supported a revolt (for greater states’ rights, of all things) and the Texicans did so too, to retain their cotton plantations and slaves. As Burrough’s footnotes point out, cotton prices boomed during this period, and Tejas had the land to support massive cotton production if slavery were permitted. It wasn’t that Texicans were dedicated slavers as much as they were a collection of desperate men trying their luck in a situation that can only be described as “the wild west.” If beef or oil had been the hot commodity at the time, it would have been the cause of the revolt.
Perhaps Burrough’s best work comes when he describes the various machinations of the Daughters of the Republic of Texas (DRT), a lineage society that eventually came to own and mismanage the site. Theirs is a story right out of a sorority house, filled with infighting, cliques, and fierce resistance to any change. That they fatally mismanaged the site is proven by a simple visit. Like most visitors, I was stunned at the tawdry ring of wax museums and tourist kitsch surrounding the Alamo. Even the interior locations lacked adequate signage explaining what happened. Eventually, the Texas General Land Office withdrew the DRT’s authority to manage the site, and the renovation has fallen to none other than George P. Bush (eldest son of Jeb). Good luck with that!
This book belongs to a genre that uses anachronistic arguments to attack various ‘sacred cows’ of American history. No one firing on the advancing Mexican army from behind the walls of the Alamo shouted “give me slavery or give me death!” Most of those killed there did not own slaves, yet slavery was a sticking point in the larger argument over Texas independence. The Alamo plays the role it does in Texas (and indeed, American) history for what happened during those thirteen days in February and March, 1836, not for slavery, Jim Crow, civil rights, or Black Lives Matter.
Burrough et al. try too hard to forget the Alamo. I hope their work leads to increased interest in the full story, and a much better job of preservation by San Antonio and Texas.
This is a complicated story; I hope I make it worth your trouble to follow!
The simple part is Notre Dame football. As an Irish Catholic lad growing up about 3 miles (as the crow flies) from the Golden Dome, I was predestined to be a fan of the Fighting Irish. Going to the local parochial school, still staffed by nuns who led us in prayers of thanksgiving every Monday for the past Saturday’s victory, didn’t hurt. And my adolescence coincided with a run of greatness that included national championships, miraculous comebacks, and legendary players/coaches. Unfortunately, it also came with the childish expectation that such things would continue: “as it was in the beginning, is now, and ever shall be, world without end.” Amen.
Except of course it wasn’t. While Notre Dame football went through extended periods of mediocrity, I remained a staunch fan. While excellence was a key component of their reputation, so was winning the right way, graduating student-athletes, and enforcing a degree of discipline and decorum. It’s not that the Irish players were better people than the players anywhere else. It’s that the institution held them to a higher standard, and disciplined them when they failed to meet it. And the same went for its coaches.
And then came Brian Kelly, a coaching success at small programs who brought Cincinnati near the top of the college football ledger for a season, arriving right after four failed Irish coaches in a row. He was straight out of central casting: Boston Irish, Catholic, blue collar, with a solid family life and a straightforward demeanor. My first impression: no, he wasn’t a high-profile choice, but he had all the ingredients for success.
So I was shocked when I first asked my Dad what he thought of Kelly: “I don’t like him, he’s a phony. He’s not the right man to be coach at Notre Dame.” Now Dad couldn’t cite any statistics or even link to any salacious gossip to reinforce his gut instinct, but he stuck with it. What gave me momentary pause was the fact that Dad was a career policeman–a detective–and was very good at reading people. But he was also old-school, and a blossoming curmudgeon, so I chalked his opinion up to the latter and left it at that.
The fact that Kelly left his undefeated Cincinnati team practically on the field, facing Urban Meyer’s Florida Gators in a major bowl game (they were blown out), was a clue to the man’s character. There were many more to follow. Some were football related, like the time he had his quarterback throw a low-percentage fade route into the end-zone (intercepted) as the team was driving for a winning field goal, resulting in a loss to Tulsa. Yes, Tulsa. Questioned after the game, Kelly responded that was how he called his offense, and “get used to it.” There was the emergence of the screaming, purple-faced Kelly monster, literally losing his religion on the sidelines during a pathetic opening game loss to South Florida. Yes, South Florida.
Always taking the ball to start the game, often a false start coming out of a timeout, the star quarterbacks mentored into indecisive wrecks (before transferring), the posse of assistant coaching buddies who failed on the big stage, the offensive play-calling scheme (and I use the term loosely) that was indeed offensive. Not to mention the unending string of humiliating losses in big games.
I’m a Catholic, and a football optimist, so all these past sins could be forgiven. But then there were the off-field issues, too. When a student who helped with filming practices died in a fall from a camera tower, toppled in high winds, Kelly admitted no responsibility. His Athletic Director boss, Jack Swarbrick, called the 50 mph wind gusts “unremarkable.” Kelly’s players got caught cheating. His reaction when asked whether he had ANY responsibility: “Zero. Zilch. None.” He secretly met with representatives of the Philadelphia Eagles professional team (for a job interview) just before a championship match-up against the Alabama Crimson Tide (this in his third year at Notre Dame)!
To be sure, Kelly brought Notre Dame out of the doldrums where they languished when he was hired. Some younger fans admire his record number of victories, ten-win seasons, “appearances” in play-offs, or just his keeping Notre Dame “in the conversation” (a favorite quip of AD Swarbrick). And all those things are true, but counterfeit. His victory total includes wins forfeited due to NCAA violations. He has more ten-win seasons than past ND coaching legends because his teams play twelve or thirteen games a season. His appearances in the play-offs and championship games have all the luster of the time Grandpa showed up at Thanksgiving without his pants. And “the conversation” is highly overrated in today’s social-media-saturated world.
And all this happened on top of a constant need to have everything his way. They tore up the grass and put in fake turf because Kelly wanted it. They added a jumbo-tron with nonstop jock-rock because Kelly said the players wanted it. He abandoned the tradition of pregame Mass because, well, there was no reason given. He got more, bigger, and better facilities which serve to separate the players from the other students, previously a hallmark of the Notre Dame experience. He complained about having to meet with so many alumni/booster clubs, perhaps misunderstanding the price of coaching a legendary program.
Did Kelly grow into the job, or embrace the traditions and the challenges they represent? Not really. He did eventually fire coaches, did admit he didn’t need to score a lot of points to generate excitement about the program, did get more involved with his players. But these were all do-or-die changes. The closest he ever came to introspection was when he was smart enough to admit (amid all the hoopla of “passing” Knute Rockne in total wins) that he still didn’t have a championship. Kelly left his players with a tweet (this time), surprising even his chief proponent, AD Swarbrick. One of his assistants found out as he left a recruit’s house! In some respects, Kelly never really believed Notre Dame was different, and he proved his point by first denying any interest in other jobs, then taking one suddenly for a ridiculous payday.
In the end, Brian Kelly was a conniving, grasping, small-time coach. Yes, he was above-average on the sideline, but well below-average as a person of integrity. Despite my harsh words, I hold no grudge against Kelly. He was offered the opportunity, and he took it. He failed in the goal of winning a championship, but got paid handsomely along the way. I wish him well, and I hope he performs just as well for Louisiana State. I do hope some future AD and leadership at Notre Dame correctly records Kelly’s tenure–officially–as having only 92 wins. After all, winning isn’t everything . . . or is it?
So my Dad was right, again. I know he’s been waiting to hear that from me!
Where to begin? I watched long portions of the jury trial of Kyle Rittenhouse, and then followed up with segments on various partisan media (Fox News, MSNBC, etc.). I don’t know whether to confess these sins or demand your appreciation. Either way, we’ll start with the facts of the case.
Back in August 2020, Kenosha, Wisconsin, erupted in two nights of violence after the arrest and shooting of Jacob Blake. Blake, who is black, was arrested and shot (seven times, in the side and back) as he wielded a knife and attempted to enter a car with his girlfriend’s children in the backseat. His girlfriend had called the police when he entered her home, as he was already charged with sexual assault, trespassing, and domestic abuse. Instant video analysis and widespread media coverage claimed (incorrectly) that Blake was unarmed and shot in the back while presenting no threat. Subsequent coverage debunked these claims, but not until long after protests against police brutality erupted in Kenosha.
While daytime protests were initially peaceful, at night violent groups seized on the police’s reluctance to intervene, engaging in looting, property destruction, and general mayhem. It was on the second such night that seventeen-year-old Kyle Rittenhouse decided to go to Kenosha (he lived just across the state line in Illinois) to “protect (this) business.” He took a medic bag (although he was not a trained EMT) and borrowed an AR-15 for self-defense from a friend who was keeping it for him in Wisconsin.
After standing watch with other people at a car dealership, Rittenhouse walked over to a nearby crowd where a group had gathered to destroy vehicles and light fires. Joseph Rosenbaum, who had just that morning been released from state mental care, had spent most of the evening threatening those who had arrived (like Rittenhouse) to ‘protect against the looters.’ Rosenbaum seemed to take special interest in Rittenhouse, and began following and harassing him. At one point–immediately after someone else discharged a firearm in the area–Rosenbaum charged Rittenhouse and reached for his rifle, whereupon Rittenhouse shot him in the chest, killing him.
The gunshots initiated a chase sequence: Rittenhouse called a friend to admit he had shot someone and needed help, while a crowd formed and followed him as he tried to run away. Different people in the crowd shouted “he’s the shooter” and “get the m*therf*cker” as they followed him. One man ran up behind Rittenhouse and hit him in the back of the head before running off. Rittenhouse tripped and fell to the ground; as he rose, another man delivered a running drop kick that glanced off his head. Rittenhouse shot at him and missed. Anthony Huber then hit Rittenhouse in the shoulder with his skateboard; they struggled for the rifle, and Rittenhouse shot and killed him. Finally, Gaige Grosskreutz, who was armed with a handgun, approached Rittenhouse, stopped and backed away, then raised the pistol toward him; Rittenhouse shot him in the arm, got up, and headed toward the police vehicles two blocks away.
What we have here is immaturity, stupidity, and rage–multiplied by weapons–resulting in unnecessary deaths. Rittenhouse as a teenage boy is automatically guilty of immaturity. His impulse to go “do something” is misplaced, and where is the parent/guardian saying “no!”? This parental stupidity is trumped by the friend who gave him his weapon: yes, he had a Second Amendment right to carry it, and the laws in Wisconsin permitted public carry of a long-rifle. But as a conservative, I believe rights come with responsibilities, and Rittenhouse was not trained to defend a property. His friend should have told him “no.” If Rittenhouse insisted on doing something, he should have been armed with nothing more than a medic bag and a phone-camera. More properly, he should NOT have gone to the site of previous violence; what did he have to offer?
More stupidity! Rosenbaum had a long history of mental illness, as did Huber. Who among their friends thought either would be a useful, stabilizing addition to the volatile nightly mix in Kenosha? Where were their now-grieving families when they needed to keep them home? And all these people were not just at the afternoon protest; they attended the subsequent evening arson and property destruction, for reasons that remain unclear. Grosskreutz also brought a pistol to the scene, but at least he had the sense to back off. For that momentary sanity, he saved his own life.
Finally, it all goes back to, and ends up in, rage. Why did so many people need to fan the flames when Blake was shot? This was just after the police brutality case of George Floyd in Minnesota, so activists started building a narrative, but it was false in this case. That narrative led to the protests, and the violence, and the shootings. Listen to the mob after Rittenhouse shoots and kills Rosenbaum, if you want to hear vigilante justice in action. It is pure, unadulterated hatred. If they could have torn Rittenhouse limb-from-limb, they would have on the spot. The only thing that stopped them was the AR-15.
So what’s the verdict? Well, the jury had little choice. Wisconsin’s self-defense laws (which are mirrored across the United States, and have nothing to do with “stand your ground” laws) draw from hundreds of years of English common law. The defense can assert self-defense, and must provide facts that make the claim reasonable. These could (and did) include a statement by the defendant that he feared for his life, as well as video evidence that he was chased by people with intent to do him serious bodily harm. The prosecution then has to prove beyond a reasonable doubt that the defendant did NOT reasonably fear for his life, or that the others did NOT present a serious threat. This was an impossible task given the video evidence. The only chance for conviction lay in the initial shooting of Rosenbaum (who was unarmed), but that went out the window when other witnesses testified to his erratic, confrontational, and threatening behavior. No sane person–armed or not–would have been unafraid when confronted by the mob, and the mob’s intent to do deadly violence was evident in the video.
Kyle Rittenhouse is not a hero. We can only hope he learned from this experience and isn’t permanently damaged. He was guilty of extreme immaturity and was let down by several adults who either failed to prevent–or actively supported–his immature actions. Several other adults on the scene were guilty of gross stupidity. Two paid with their lives, and one was injured (shot in the arm). Several others (if you watched the trial: drop-kick man, the guy who hit Rittenhouse in the back of the head) sneaked back into the shadows, but they participated in the mayhem and deserve our disgust. The activists and media who poured gas on the flames have blood on their hands, but of course they walked away scot-free.
Make no mistake: Rittenhouse should have stayed home, or should have left his weapon with his friend. You don’t go to a riot looking for trouble, because it will find you. How overmatched and unprepared Rittenhouse was for what unfolded on the dark streets of Kenosha is evident in the video.
Your rights (to weapons or assembly) come with responsibilities. Ignoring that can be detrimental to society, and deadly to the individual.
Mexico is always changing, just like everywhere else. But the changes underway here seem to be the ones most evident to expats, because they involve some of the quintessential things that make Mexico, well, Mexican. And therein lies a story.
Ask people who have visited extensively (or expats who live here) what qualities about Mexico are unique, and you’ll quickly find some common themes: the friendliness of the people, the slower pace of life, or the incompetence of the bureaucracy, for example. And in most of these areas, change is afoot.
Take friendliness. Mexicans as a rule remain eager to help and often greet you on the street, whether they know you or not. Except in big cities like Ciudad de México and Guadalajara. There, you might greet neighbors you know, but if you start “buenos días“-ing every passerby, you’ll get a mix of responses, from greetings to odd looks to being ignored. The larger cities have imported some of the urban mindset from elsewhere, where you just don’t talk to strangers.
Put on top of that the curse that is the cell phone. Mexicans have taken to their cell phones wholeheartedly, and it’s not unusual to surprise people–even in small towns–by greeting them when they are walking, face-down in their phones. They will startle and still respond, but clearly cell phones are winning the battle for eyeballs.
The slower pace of life is experiencing some acceleration, too. Over the last thirty years, Mexico has developed a sizable middle class, and with it, widespread automobile ownership. Which means traffic, and the delays that come with it. Now, Mexicans normally aren’t in a hurry to be on time, but that’s changing, so when some hit a traffic snarl, they start driving around it in, shall we say, innovative ways. Like creating extra lanes where there are none. Or ignoring traffic lights or turn lanes or even traffic barriers. Some of that is a carry-over from Mexico’s fundamental lack of concern for laws which appear arbitrary (“I only need to go one block the wrong way down this one-way street to get where I want to go.”). But some is an attempt to hurry up, which is something new.
Finally, many expats comment about changes in taxes and regulations, with the main theme being more enforcement. Many of the small mom-and-pop vendors in Mexico (which can mean nearly every family) operate on a cash-only basis and avoid taxes and regulation. The federal government is gradually tightening its control of money flows, so as to capture the lost tax revenue. There is a growing web of identification which ties people and their income to the taxes they owe. For many larger or recurring transactions, you need a CURP number (Clave Única de Registro de Población), in effect a Mexican Social Security Number. Once you have that, you may also need an RFC (Registro Federal de Contribuyentes) code or clave, a personal/unique tax identification number. We’ve noticed an increase in the information required to transfer money directly to Mexican accounts (for example, to pay for our recent home renovations); the requests come from Mexican banks responding to Mexican government directives. Over time, this results in fewer off-the-books transactions and more revenue for the federal government.
Likewise, expats speak of less flexibility in border crossing and immigration enforcement. For a long time, some expats came to Mexico on the automatic 180-day visitor’s visa and simply stayed. They never left the country and never became temporary or permanent residents, which requires a payment and has financial eligibility requirements. Later, expats with more means came and went every 179 days, in effect renewing their status and avoiding the costs of legal residency. Mexican immigration officials (INM) have started random checks at mass transit hubs (like big-city bus stations), arresting and sometimes even deporting those long-time overstays. And border officials sometimes now ask for a departure date (e.g., a return flight) and only renew another visitor’s visa for a set length of time, much less than 180 days. Deportation remains rare; most overstays are offered the opportunity to legalize their status, but again, that requires dinero.
As hard as these changes to the bureaucracy are for expats who arrived in the wild west days of yore, they are part and parcel of Mexico becoming a more efficient country. That means making and neutrally enforcing laws, taking control of its borders, collecting tax revenue and distributing it effectively. It won’t be quick, and it surely isn’t painless, but it is necessary.
Most Mexicans will continue to greet and respond to greetings. Most will kindly let you cut in line in heavy traffic. I firmly believe that even after all these changes work out, Mexico will remain Mexico. After all, it will always be full of Mexicans!
You may think the Supreme Court decision in Roe v. Wade was a watershed moment in women’s rights. You may think it was the beginning of the end. I’m not going to try to change your opinion on abortion: very few people haven’t already developed a firm opinion on it. What I am going to do is argue that, regardless of your views on abortion, legally Roe must change, and explain why that is so.
To do that, we must first understand the history of abortion, and what the status of abortion as a medical procedure was in the United States in 1973, when Roe was decided.
Of course abortion is nothing new. Some of the oldest medical texts and treatments involve abortion. So abortion has been around just a slightly shorter time than pregnancy, to make a point. And during all those eons, almost all major societies either outlawed abortion, limited it to certain hard cases (e.g., prostitutes, rape victims, women too ill to carry a child to term) or severely frowned upon it. Certainly once Christianity entered the scene, all Christian societies outlawed it. Yet it continued in the shadows. Which reminds us there is a difference between what we dislike or criminalize, and what people do.
In 1967, Colorado decriminalized some forms of abortion, and by 1973 sixteen US states had rules permitting abortion under some circumstances. That is when the US Supreme Court heard the legendary Roe v. Wade case and held that a Texas law criminalizing abortion was wholly unconstitutional. With that broad ruling, the laws in thirty-four other states were swept aside in a single stroke, and with them even some of the restrictions in the sixteen states which had permitted abortion.
In the place of slowly changing public mores, the Supreme Court presented an absolute personal right to abortion (based on the right to privacy), and balanced this new right against the interests of the States by providing a trimester policy: in effect, earlier in the pregnancy the pregnant mother should decide; later in the pregnancy the State could intervene. This ruling seemed congruent with medical science at the time, which admitted the fetus was of course going to be a person, but could not answer definitively when (other than birth) that person-hood began.
The problems with Roe are several. First, it was overly broad, as already noted. The Supreme Court usually tries to limit the extent and effect of its rulings, but here it emphatically extended and amplified them. Second, the profound change Roe envisioned generated a unique resistance that only grew over time. And third, the pseudo-scientific trimester approach (which seemed so logical) was entirely at the mercy of scientific and medical advances, which would greatly undermine it.
Who am I to call Roe “overly broad”? Nobody. How about Ruth Bader Ginsburg; would her opinion matter? Nobody would dream of calling into question her support for a woman’s right to choose. But when asked about Roe v. Wade, she said “Doctrinal limbs too swiftly shaped may prove unstable.” She went on to criticize the ruling–not its outcome, but the way it was decided–as so sweeping as to be vulnerable to being overturned by future courts for the contention it caused. To be clear, she sought a broader, deeper basis for the right to abortion, not its end. But being the insightful jurist she was, she could not fail to point out Roe’s weak reasoning.
Nor can anyone doubt that Roe unintentionally birthed the pro-life movement, which has only grown over time. Despite little or disparaging coverage in national media, pro-life groups organized crisis pregnancy clinics, prayer vigils, fasts, rallies, and the largest annual protest march in Washington, DC. All this happened despite a series of rulings stigmatizing or even criminalizing their behavior. The pro-life movement is the longest, most successful protest movement in the history of the nation.
Both the pro-choice and pro-life movements enjoy citing strong polling data indicating large majorities of Americans support their cause. How can this be? First off, abortion was the test case among pollsters for how to word and stage questions to elicit desired results. Ask “do you think rape victims should be forced to carry an attacker’s child to term?” or “should anyone be able to have an abortion for any reason even at the very end of a pregnancy?” and you get predictable results. And most Americans don’t understand the nuances of what Roe held, how it has changed over time, and what role the States still play. Public opinion does not provide a solid basis for determining a way forward.
Finally, scientific advances and improved neonatal care led to pictures worth more than a thousand words. Talk all you want about a fetus or a “potential person”; once the pro-life movement could show high-definition images of a thumb-sucking little person (not to mention the gruesome results of rare, late-stage abortions), the euphemisms rang hollow. These images, and the gradual extension of the earliest preemie survival, undermined Roe’s trimester approach. It is worthwhile here to mention how quietly pro-life the medical community has always been. While a few outspoken activists carry the headlines, the greatest limiting factor on abortion availability has always been the number of doctors and nurses who refuse to perform the procedure; most hospitals avoid even teaching it. While some claim this is because of the perceived threat of pro-life violence, the medical establishment’s resistance goes back to the beginning, long before Roe came into effect. Within the profession, there was little doubt what the fetus was, and what an abortion represented.
The combination of Roe’s sweeping effect, its persistent resistance, and the changing scientific and medical environment played out in unforeseen ways. Roe and its companion Doe v. Bolton case added the concept of a pregnant woman’s “mental health” to the list of possible legal justifications for abortion; subsequent cases expanded the list to include financial and “family” interests. That resulted in the US having one of the most permissive abortion regimes in the world. While the laws vary by state, the most liberal states can and have legalized abortion for any reason at any time. I am not saying abortions happen moments before birth; just that Roe and some state laws would permit it. Most of the so-called liberal nations of Europe severely restrict abortion after twelve weeks; only North Korea and China have fewer restrictions than Roe does.
Since much of the initial opposition to Roe came from religious groups, pro-choice organizations counterattacked by claiming that the Constitution required a separation of Church and State. This charge failed in the courts, which require all policies to be adjudicated on their merits, not on who proposes them. After all, many of our laws stem from religious rules (e.g., “Thou shalt not kill.”), and it was only a decade before Roe that religious leaders were lionized for their leadership in the civil rights movement. Note that the growth of the pro-life movement in younger generations has happened at the same time society overall–and younger people in particular–has become less religious. It won’t go away.
Finally, and most importantly in my opinion, the pro-life movement tirelessly submitted legal challenges to Roe, constantly pressuring the courts on the obvious logical fallacies, the detrimental effects on the democratic process, and the changing medical environment. Various members of the Supreme Court were loath to jettison Roe altogether, and their compromises only further weakened Roe’s basis in law. The final straw was the recent Texas law which is currently before the Supreme Court. This law avoids judicial scrutiny by not using the State to enforce its provisions, but rather deputizing anyone (literally) to sue a doctor or clinic (or others, but never the pregnant woman) for supporting or performing an abortion. The threat of unlimited $10,000 USD civil fines has had a truly chilling effect on abortion rates in Texas.
Despite being pro-life, I don’t support the Texas law, and I hope the court invalidates it. This law, if replicated, could choke the judicial system with similar cases involving gun owners, voting rights, and a host of other policies. But the exercise demonstrates how far the pro-life movement is willing to go.
Most likely, the Supreme Court will invalidate the Texas statute. But it will also hear a case in December from Mississippi (Dobbs v. Jackson Women’s Health Organization) which directly calls for overturning Roe. I believe the Court will do so, to send the matter back to the States and end the federalization engendered by Roe’s privacy right. Some states have trigger laws ready for that moment, either banning abortion or re-instituting Roe’s protections. The nation has lived with different laws in different states for drinking, driving, gun-owning, voting, age-of-consent, marriage and divorce, and many other life-and-death matters. Abortion access will become one more.
Ending Roe will not end abortion, either legally or in fact. What it would do is take a hot-button issue off the national stage and send it to the states for local decision. After almost fifty years of increasingly tortured legal rulings, ridiculous charges and counter-charges (on both sides), and entrenched partisanship, that’s good enough.
Anybody who knows me for long knows I’m fascinated by history, and I collect interesting (to me) stories to illustrate the many lessons we can learn from it. And few aspects of history are more entertaining (to me) and enlightening (to us all) than those dealing with unintended consequences. You know, the situations where a leader or a group or even a nation does something and ends up–eventually–with an outcome entirely at odds with what they intended.
On one hand, these historical tales describe a rich vein of irony: not the watered down understanding of irony in vogue among post-modern intellectuals, nor the whiny version Alanis sang about (Ray-ay-ain on your wedding day is unfortunate, but hardly ironic). Irony involves the unexpected, which means when you have no reasonable expectation (such as the weather for a wedding day you chose), you can’t have irony. The best example of ironic humor I ever saw was a simple cartoon showing a man hitting a hammer against a glass vase; the hammer shatters, and the vase stands untouched. That’s ironic. And maybe that black fly just likes your Chardonnay!
So here goes with some unintended consequences; enjoy:
1. “The Sins of the Father”
George H.W. Bush, hereafter Bush ’41, was faced with a post-Cold War conundrum. The US was the lone superpower, but what did that mean? When Saddam Hussein occupied Kuwait in a completely opportunistic and aggressive move, he had his answer. Bush ’41 developed a global response, led by the US, to evict Saddam from Kuwait. Why? The US had long guaranteed the House of Saud that it would protect the monarchy as long as the oil flowed. And if Saddam could just seize Kuwait, there was little stopping him from doing the same to Saudi Arabia. We had built complete air bases out in the Saudi desert, even though the Saudis had little air power of their own. The bases were financed by the Saudis and built/maintained to US standards, for our use should we ever need to project power in the region. Thus the US did not need to station troops in Saudi Arabia, which would have been offensive to devout Muslims.
So the grand Coalition marshaled its forces in Saudi Arabia and then expelled Saddam Hussein from Kuwait. Except there was no agreement about actually deposing Saddam, so they left him in power. Which meant he was a continuing threat. Which meant we needed to keep some “tripwire” forces in Saudi Arabia. A little-known militant named Osama bin Laden had been preaching, to little effect, that the Saudi monarchy was corrupt and in league with the infidel West. He predicted that the Saudis would allow “crusaders” into the Muslim holy lands, and when it happened, he moved from lunatic to prophet. And the seeds of Al Qaeda and 9/11 were sown.
Bush ’41 was looking to create a better world where aggression was punished by collective action. He forgot the simmering tensions that underlay all foreign relationships, and thought our good intentions would be recognized by all. Rather, he inadvertently laid the groundwork for a terrible challenge his son would someday face.
2. “Democracy is the theory that the common people know what they want, and deserve to get it good and hard.” H. L. Mencken.
In the early Twentieth Century, the Progressive movement was quite influential in federal, state, and local politics in the United States. Progressives were impressed with recent, dramatic improvements in science and technology and believed that these advances heralded an age when reason and science would govern relations between individuals and among peoples. One of their core concepts was the importance of direct democracy and the belief that the people could be relied on to do the right thing, if only given the opportunity to do so (i.e., vote). Partially this was a naive belief in the ‘wisdom of crowds,’ but mostly it was a reaction to the various cabals, conglomerates, and oligarchs which proliferated at the same time. These “special interests” had seized control of several layers of government, and the Progressives were their sworn enemies.
Progressives in California eventually got control of the state government and enacted several provisions which supported greater democracy: one was the creation of Propositions, whereby the people could vote directly on a policy which the government would then have to accept and enforce. The other was Recall elections, where the people could sponsor a vote to remove an office holder and replace him/her immediately. Both of these provisions were designed to limit the power of tiny groups of influential people by providing a means for larger groups to override them.
Things didn’t quite turn out that way. The Proposition concept did require a simple democratic majority, but it proved to be a blunt instrument that allowed ill-conceived, general concepts to be “approved” by voters who probably didn’t fully understand the implications. It culminated in the infamous Proposition 13 in 1978, which created a series of prohibitions on raising taxes or reassessing property values that have hamstrung California leaders to this day. Whatever one thinks about whether taxes are too high or too low, California’s proposition system proved a poor approach.
Oh, and the recall option? Progressives set the signature bar low for initiating a recall, and it has thus cost one Democrat his Governorship and forced the current incumbent into an expensive defense, which, had it failed, would have inaugurated a conservative Republican who managed only twelve percent of the vote! Power to the People, indeed.
3. Upsetting a delicate balance
Those who have had a basic civics and government course can quickly describe the various “checks and balances” built into the government of the United States by the Founders. There are the struggles between the three branches of the federal government, the powers reserved under the Constitution to the States and the People, Judicial Review, and the anti-majoritarian aspects of that same Constitution. These balances are constantly under stress. For example, as the US moved from a developing nation to a global superpower, the federal government naturally accrued much more sway. But sometimes the balance is voluntarily upset.
The Founders designed the House of Representatives to be representative of the people: directly (and more frequently) elected, more passionate, and more partisan. For the Senate, the Founders intended a more deliberate body, selected by State legislatures and therefore representing their (i.e., the States’) interests. But in the early years of the Republic, the federal Senate was less powerful (even a backwater), and some States either didn’t bother to send Senators, or didn’t send their best.
During that Progressive era at the beginning of the Twentieth Century, William Randolph Hearst, a fan of direct elections, sponsored “The Treason of the Senate,” a muckraking series of magazine articles which detailed the failings of various Senators and State legislatures, fueling the fire for change. The result was the 17th Amendment to the Constitution, which took the Senate away from the States and made it directly elected, like the House. And as we all know, since that time, the Senate has never had any members who would be considered dishonorable in any way.
*EDIT: a sharp-eyed friend pointed out Agnew was only President of the Senate (as Vice President of the United States), so he merits inclusion as disreputable, but not as a Senator!
More to the point, since that time, no entity represents the view of the States in the federal process, which has led to further “federalizing” of State functions, and greater stress on the delicate balance that is the American form of government.
4. A prophet is not without honor except . . .
On June 8th, 1978, Harvard University held its annual commencement exercises, discharging another set of high-achieving alumni into careers as leaders of the nation. As is its custom, Harvard had bagged the most influential commencement speaker: former Soviet political prisoner, Nobel prize-winning author, and famously private dissident Aleksandr Solzhenitsyn.
No doubt the Harvard administrators expected an important address, not the usual platitudes about ‘changing the world’ or ‘finding yourself.’ Perhaps the soon-to-be graduates expected a paean to the West, which had helped rescue the Russian dissident just as the Cold War was reaching its peak. What they got was “A World Split Apart,” a jeremiad worthy of the original Old Testament prophet.
Solzhenitsyn, characterizing his criticism as coming from a friend, not an adversary, attacked the West from start to finish. He said Western society was morally bankrupt and weak. Youth were selfish and complacent and materialist. The press was corrupt, interested in influencing the public rather than informing them. He asked what moral force the West could provide in the larger battle against the evil that was Communism, and he saw none.
The crowd at Harvard was rapt as the speaker–in Russian–and his translator–in English–continued. A few times they cheered; more often they hissed. Afterward, the major media attacked the messenger, not the message, calling him “bitter” and “unappreciative,” with James Reston at the New York Times saying the speech represented “the wanderings of a mind split apart.”
Solzhenitsyn never apologized and never withdrew his criticism. After the fall of the Communist regime in Russia, he returned there, dying in 2008. His speech, which merits your attention, is amazingly prescient, accurately describing how trends evident in the West in the 1970s resulted in many of the problems which bedevil us today. The speech is widely considered one of the greatest of the Twentieth Century, alongside Churchill’s “Blood, Toil, Tears and Sweat” and Martin Luther King’s “I Have a Dream.”
Needless to say, Harvard got more (and different) than they bargained for.
5. Packing, Cracking and Majority-Minority districts
The Voting Rights Act of 1965 was a landmark piece of legislation that, among other things, attempted to redress decades of voting rights violations committed against many minority groups. One of the things the Act specifically prohibited was the drawing of voting districts in a way to reduce the influence of minority voters. This is a type of gerrymandering (drawing voting districts to create an artificial vote result) called cracking: you divide a concentrated minority group (say for example, blacks in a city) among several voting districts that also include larger numbers of suburban whites, making it difficult for a black candidate to win. The end result is few if any successful black candidates.
But as anyone who is mathematically and geographically literate can tell you, if you draw a district to create a majority of minority voters (hence the term majority-minority district), you greatly reduce the number of minority voters in all the surrounding districts (because you’re dealing with small numbers–a minority–in the first place)! This is another form of gerrymandering called packing, which nearly guarantees a successful black candidate in one district, but also greatly increases the chances of all the other white candidates’ success.
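To make the arithmetic concrete, here is a minimal sketch in Python. Every number is invented for illustration (ten districts of 100,000 voters each, with minority voters at 30% of the total electorate), not drawn from any real map:

# Toy gerrymandering arithmetic; all numbers are invented for illustration.
DISTRICTS = 10
VOTERS_PER_DISTRICT = 100_000
MINORITY_VOTERS = 300_000  # 30% of the one million total voters

# Cracking: spread minority voters evenly, so no district can elect their candidate.
cracked_share = MINORITY_VOTERS / (DISTRICTS * VOTERS_PER_DISTRICT)
print(f"Cracked: every district is {cracked_share:.0%} minority -> likely 0 seats")

# Packing: concentrate them into 3 districts at 90% each, guaranteeing those seats.
packed = 3
packed_voters = int(packed * VOTERS_PER_DISTRICT * 0.90)  # 270,000 voters used up
leftover_share = (MINORITY_VOTERS - packed_voters) / ((DISTRICTS - packed) * VOTERS_PER_DISTRICT)
print(f"Packed: {packed} safe seats, but the other {DISTRICTS - packed} districts fall to {leftover_share:.1%} minority")

Run it and the trade-off jumps out: packing yields three safe seats, while the remaining seven districts drop from 30% to roughly 4% minority voters, which is exactly the dynamic described above.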
Currently there are approximately one hundred majority-minority districts in the US House of Representatives, representing about one quarter of the four hundred and thirty-five seats. These seats are overwhelmingly occupied by minority representatives, strongly indicating that the Voting Rights Act of 1965 worked. But, and there is always a but, over time voters have increasingly sorted themselves politically (like-minded voters choosing to live where they think other like-minded voters live).
Gerrymandering to disadvantage voters by party–or political gerrymandering–may be distasteful, but the US Supreme Court has held that federal courts cannot police it (Rucho v. Common Cause). But what if a racial group (i.e., African-Americans) overwhelmingly identifies with a single party (i.e., Democrats)? Then gerrymandering those voters might be a violation of the Voting Rights Act. So these majority-minority districts become protected (in theory) during redistricting. Which means if certain seats cannot be changed (or changed much), all the other seats become subject to even worse gerrymandering.
So the law which seeks to protect minority representatives actually also places many more Democratic candidates at risk of being re-districted into noncompetitive campaigns.
6. Saddam Hussein and the Iraqi Weapons that Weren’t
From the moment he took power in 1979, Saddam Hussein was on a mission toward self-aggrandizement and Iraqi domination of the Middle East. He constructed a security service that strangled Iraq’s many ethnic minorities, built up military forces, and sought weapons of mass destruction (WMD, especially nuclear and chemical). By 1981, Israel deemed the threat of Iraq’s nuclear program sufficient to justify a risky long-range aerial bombing and destruction of Iraq’s Osirak nuclear reactor. During the Iran-Iraq war in 1988, Saddam gassed the (Iraqi) Kurdish village of Halabja, killing thousands of his own citizens. He repeatedly demonstrated the capability and will to use such weapons.
US and Coalition military forces were prepared for Saddam to use such weapons during the first Gulf War in 1991, but he didn’t, fearing the reprisals. After the war, UN sanctions forced him to destroy his WMD munitions and infrastructure. Saddam complied, but continued to conduct suspicious actions which led the UN and various intelligence agencies to believe he retained a covert stockpile and program. Why else would he declare certain areas off-limits to inspectors, or make sudden movements of equipment and people to avoid inspections, if not to hide a residual capability? In the end, Secretary of State Colin Powell laid out the circumstantial case about Saddam’s programs before the UN, and Bush ’43 proceeded to occupy Iraq and capture Saddam–but never found any WMD!
During debriefings by the FBI after his capture, Saddam explained that he knew Iraq had the technical know-how and scientific capability to rebuild its WMD program. But he didn’t do so, so as to avoid giving the West an excuse to remove him from power. However, the West (and especially the US) was a more distant threat; Saddam firmly believed WMD and his demonstrated willingness to use them were all that was deterring the Iranians–and to some extent the Israelis–from attacking him. So he took a calculated gamble: act suspicious enough to make Israel and Iran stay back, but not so suspicious that the US would get involved.
This approach worked for decades. By the year 2000, human rights organizations were calling for the removal of sanctions on Iraq for humanitarian reasons (based on fake child mortality data provided by Saddam), Russia was actively working around the sanctions, and even France was signalling the sanctions had to go. But the 9/11 attacks had heightened US sensitivity about vulnerability to terrorist use of WMD. Combined with the Bush administration's belief that Saddam was a problem which would only get worse, his WMD bluff proved in the end to be his undoing.
Hope you enjoyed this small foray into the world of unintended consequences; if not, maybe I just committed another myself!
Is there anything harder than putting down your dog? Don’t answer, I don’t want to know.
Judy & I have put down three Vizslas in our married life. It’s an odd euphemism. Some say “put to sleep,” others “put down” and of course there’s the old “sent to the farm.” But in the end, it means the same thing.
Our first two (consecutive) Vizslas each lasted to ten years old, which is pretty good for a large dog breed that has been inbred for generations. Both developed cancer, and showed signs of physical decay and pain that made our choice somewhat easier. The decline was sudden–weeks not months–and obvious. The veterinarian told us we could wait a little longer, but the likely outcome was painful internal bleeding leading to sudden collapse: hardly an option to choose. Still, it wasn’t easy, and I (actually the whole family) cried like a baby.
Dogs will do that to you. Both Judy and I had grown up with dogs as family pets. Yes, they cost a lot. Yes, they take up time. Yes, they're inconvenient when travelling or with visitors. But then again, so are families. And dogs are a chance to teach your kids about responsibility, about growing up and growing old and dying. And in between, they give unconditional love. Kids need that; sometimes parents have to be the provider of tough love, but that dog is always there, wagging a tail, just happy to see you.
The best dogs don’t think they’re dogs any more. They think they’re slightly smaller, oddly-shaped humans. They want to be with you, they need to be with you, they’re only happy when they are with you. When our kids were grown and off to college, we rescued two Vizslas at once. Even though we understood the breed, the balance of two Vizslas and two humans turned us into a pack rather than a family, with negative behavioral consequences for all concerned. We quite literally found a farm for one of our Vizslas, and things returned to normal.
Tucker in earlier, better times
Tucker was our fourth, and almost certainly our last, Vizsla. We rescued him around the same time we committed to retiring to Mexico. He was "four or so" according to the breed Rescue Society, but we had just turned down a five-year-old since the pain of putting down a dog at ten was fresh in our minds. Suddenly a "four-year-old" became available. Hmmmm, something suspicious about that, what?
Tucker was a three-time loser, a dog who had been turned over to rescue at least three previous times, and this was his last chance. We had to say yes. We knew that his most recent owners had been in the midst of a divorce, and there had been an incident of domestic violence (the wife attacking the husband, who was the pet’s favorite, we were told). This would show up again once we rescued him: if Judy started walking quickly toward me, he would move between us, a trait which foreshadowed a sad end.
The Tucker we knew was a sweet dog, very smart, but also very stubborn. He learned several words (outside, w-a-l-k, treat) and many commands (sit, stay, and even "hurry-up" to poop, believe it or not), yet he practiced being deaf at times, too. He was afraid of smoke, even steam, and fire engine sirens. The chirping of the smoke alarm, signalling the need for new batteries, was an existential terror for him. He was only social with humans. When we took him to the dog park, he ignored the other dogs and introduced himself to all the owners. The few times he played with the other dogs, he would show off by outrunning or out-cornering the other breeds, but then turn the pack on some smaller dog like a schoolyard bully.
After much work, we eventually trained him to ignore other dogs, which was fine with him. I noticed how he watched me go running on the weekends, so one time I decided to take him along. He fell into a heel position and ran at my pace for three miles! Either someone had trained him well, or he was a natural-born runner's companion.
He started to "go white" almost immediately, confirming our suspicion he was older than we were told, but he was healthy and well-adjusted. We joked about him joining us in moving to Mexico, which was still five years out when we rescued him. Obviously he took us seriously, because those years flew by and at ten, he was still healthy and active and cancer-free. So we loaded him up in a tiny space in the back of our SUV and drove him south of the border.
Expat life was as kind to dogs as to human retirees, and he remained healthy. Over the course of five years here, he lost some hearing, although he feigned losing even more. His depth perception and visual acuity declined, leading to some hysterical encounters with Mexican squirrels, including even stepping on one. He took longer and more frequent naps. He became more sensitive to those loud noises he could hear, especially cohetes, which required mild sedation at times. But he was still active and alert. I think he got a kick out of it when people asked how old my puppy was and I told them "fifteen years!"
In the past year, he started displaying some confusion. He was prone to barking attacks where something set him off–he was clearly agitated, not startled–but could not be easily calmed. He started charging at Judy more often, sometimes just when we were talking. Finally, he took a small bite at her (Strike One), and we knew trouble was brewing.
The drama of selling our house, buying a new one, moving, and having renovations done only added to his confusion and agitation. When the grandkids were visiting, we left Tucker home one day, and when we returned, he ran out the door, jumped in the rented mini-van, and would not leave it. I went out and reached for him, thinking he was having trouble getting out of the vehicle: no, he snapped and tried to bite my hands. I backed away, so he leapt out of the vehicle and bit my leg (Strike Two). Then he stopped and looked at me with a "what was that all about?" look.
Two nights ago, Judy walked toward me and Tucker dashed across the room and nipped at her. I shouted and kicked at him (never reach for an angry dog), but he bit her a second time and clamped down. I smacked and kicked at him, and he let go, but she had a tremendous bruise on her thigh from the attack. Strike Three; you’re out.
He lay down in his dog bed, clearly upset, but whether that was remorse or shock, who knows? For us, the die was cast; you can't keep a dog which might go full-scale berserk at any moment. In some sense, putting Tucker down was more difficult, since when he was normal, he was completely normal. In another sense, it was easier, as we had a sense of relief at not waiting for the next attack.
Ever heard of the Spanish Inquisition? I thought you’d be surprised! You know, how it demonstrated the horror of imposing one’s religious beliefs on others, not to mention the danger inherent in believing one knows exactly what God intends, which can lead to all manner of extremism. It’s a popular view, almost a trope, partly based on that well-established English bias about history that North Americans imbibe, and partly based on, well:
but you KNEW this was coming
The Good News (pun intended) is that the Catholic Church, being the world's longest-running, most successful bureaucracy (among other things), has excellent data on the Spanish Inquisition. Not only that, the data are reliable, because the inquisidores really believed they were doing God's work (however bizarre that may seem to modern sensibilities), and lying about their work would have undermined the "good" they thought they were doing. And the Vatican released the data in the late 1990s.
If you asked an average person what they knew about the Spanish Inquisition, the key points would boil down to: (1) many innocent people were tortured into confessing, (2) they were burned at the stake, (3) all to force people to accept the Catholic faith. So let's get past Monty Python's "comfy pillow" sketch and look at the facts!
How many people died at the hands of the Spanish Inquisition? None. Well, that's a quibble. See, the Office of the Holy Inquisition had no authority to execute anyone; only a King did. So the Inquisition passed off the condemned to the King's executioners. Setting aside the quibble, early estimates ranged upward into the millions of victims killed. But there are those pesky records, and modern historians have pored over them and determined the total to be at most 5,000 people (during the period 1478-1834, about 350 years), or a little more than one a month. There were periods of more and fewer executions, and long periods with none, as no trials were held. Hardly an enormity in the true sense of the word, or even a blip in the mortality statistics of the day!
Why the quibble about the role of the King in all this? It may be hard to understand in modern-day terms, but back then government and religion were one and the same: it was called Christendom for a reason. Denying the true faith demanded penitence, but refusing to admit the sin was a challenge to the sovereign, who was after all God's chosen leader (the Divine Right of Kings), and thus a capital offense. That's what cost the accused their lives. There was a continuing disagreement between Spanish royalty and Rome, with the former seeking harsh punishment (for the challenge to the throne) and the latter granting mercy as long as those charged repented at any point.
What about torture? Yes, the Inquisition practiced torture. They literally wrote books about it: when to do it, for how long, under what circumstances, etc. These tracts would be very familiar to anyone who read the Department of Justice memos regarding “enhanced interrogation techniques” under the Bush administration. Where do you think “waterboarding” came from originally? Here’s the rub: all countries, and all legal systems, allowed torture at that time. The Roman legal system practiced it, and bequeathed it everywhere Romans went. Islam developed its own version. Charging the Inquisition with torture is “like handing out speeding tickets at the Indy 500.”
actually Kurtz said it
For the Inquisition, torture could only be employed after guilt was established, to elicit a full confession and further information about co-conspirators, heretics, etc. (again, sound familiar?). There were limits on how long, what types, how painful, the need for a doctor present; it’s eerie reading. And none of these rules applied in the regular government legal system. There, torture was practiced freely without restraints and often used early in the investigation to get a confession and complete the case. How bad was “justice” at the time? The records show criminal or civil defendants requesting to be transferred to the Inquisition for trial! So guilty as charged on torture, with a huge asterisk as that was nothing unusual at the time.
What about forcing others to become Catholic? See, here's the problem with that odd charge: anyone could avoid the Inquisition by simply stating they were not Catholic. The Inquisition had no authority over Jews, Muslims, or pagans. The Church had long accepted the notion that one could not be forced to accept a different religion; thus the Inquisition was adopted to weed out heretics and false believers. Spain was in the process of the Reconquista, the expulsion of the remaining Muslim forces on the Iberian peninsula. As Catholic Spanish forces gradually occupied the lands, they faced the problem of ruling them. Their solution was to allow freedom of religion, but to limit land ownership and positions of authority and to impose heavy taxes, thus encouraging–but not mandating–conversion. Incidentally, these were the same rules Islamic leaders developed when they ruled Al Andalus, rules which some historians called even-handed and far-sighted!
Some Muslims and Jews became conversos, but a small number did so only for the financial and political advantages conversion held. These false conversions became a target for the Inquisition, often based on secret tips from faithful Muslims and Jews who were annoyed by the success of their one-time fellow adherents. Add in ethnic rivalries, the ability of the Crown and local leaders to profit from seized property, and petty jealousies, and you get a deadly mix of accusations. One redeeming quality: the inquisidores were intrepid detectives, and most charges brought to court were dropped. One set of records shows about one percent of the 125,000 heresy cases brought to trial under the Spanish Inquisition resulted in executions (roughly 1,250, consistent with the total above).
What about the procedures involved with the Spanish Inquisition? The Inquisition was all about process: there were hundreds of pages of rules and policies and procedures, all of which were lacking in European justice systems at the time. That is why many ordinary people and local leaders welcomed the Inquisition. All throughout the process there were opportunities for those charged to confess and seek forgiveness through some act of penitence. The arrival of an inquisidor started a thirty-day Grace Period (literally) where anyone could simply admit guilt, be given a penance, and be forgiven. Then began a period of accusations and investigation, a trial and verdict, then sentencing or release. The final act was the auto da fé, which has come to mean "burning at the stake" in English, but actually means "act of faith." This was a religious ceremony–including a mass and a procession–where the inquisidor and local prelates related the results of the trial to the public. At the end of the process, the accused, having been given another opportunity to repent publicly, was handed over to the civil authorities for punishment.
Why is there such a dark cloud of misinformation hovering over the Spanish Inquisition? Partially it is that the whole affair is so foreign to our ideals today, but mostly it is a hangover from the "Black Legend." In the 16th century, Spain was the wealthiest, most powerful Catholic nation in the emergent struggle over the Protestant Reformation. Thus it was the target of propaganda, the most effective of which was a campaign known as the "Black Legend," which depicted Spain in the harshest terms as a land full of violence, corruption, sexual excess and worse: sort of like California today (I kid, I kid). Many of the stories involved the Spanish Inquisition, and England was the chief source (in its ongoing rivalry with Spain). And those legends got passed along with English history.
None of which is to say the Spanish Inquisition was good. Pope Saint John Paul II apologized for the violence it enacted. Moreover, while heresy was a continuous problem in those days, the threat posed by conversos was greatly exaggerated and never merited an inquisition, as demonstrated by the numbers of trials, exonerations, and executions.
George Santayana said "those who cannot remember the past are condemned to repeat it." I would add that those who don't know the truth about history are condemned to repeat falsehoods.
In late May and early June of 1940, the German army blitzed across France. The speed and violence of the panzers and stukas left a beleaguered British Expeditionary Force and some remnants of the once-proud French army surrounded along the coast at the tiny port of Dunkirk. Over a period of eight days, the British navy, merchant marine, and thousands of individual ship owners conducted an improvised, hasty withdrawal-under-fire. They rescued over 330,000 soldiers, albeit with nothing more than the soaking wet uniforms on their backs. It was a humiliating defeat, but one tinged with the tiniest glimmer of hope, which was sorely needed by the British people at that point. Prime Minister Churchill reminded his nation that "we must be very careful not to assign to this deliverance the attributes of a victory. Wars are not won by evacuations."
The American experience in Kabul these days has me thinking of Dunkirk.
First off, the President continues to defend his decision to end the "war in Afghanistan." This statement demonstrates his fundamental lack of understanding. Afghanistan was a theater of war: just one of many. While it is possible to surrender a theater in order to win a larger war, one must always remember that there is a larger war on. We did not start this war. Radical Islamic terrorists declared war on the US back in the 1990s. We ignored them at the time, like a much-older brother ignores the taunts of a much-younger sibling. But like that sibling, the terrorist movement grew up, and when they knocked down the towers, the game was on.
The US could not have cared less about Afghanistan or the Taliban but for their harboring Al Qaeda (AQ). When the Taliban refused to turn AQ over to us, they became another campaign in the war. And as I continue to remind readers, we cannot declare that war over: only the loser can. So we can end the war tomorrow by admitting our evil, renouncing our ways, and publicly proclaiming the Shahada ("I bear witness that none deserves worship except Allah, and that Muhammad is the messenger of Allah."). It really is that simple.
Secondly, I hear much soul-searching about 'where we went wrong' in Afghanistan. A few misguided souls say we should never have gone in; I won't dignify that view with a critique. For the record, I was in favor of going into Afghanistan to expel the Taliban and root out AQ. BUT, I was also in favor of a cold-hearted, realpolitik approach. Turn the "nation" of Afghanistan over to various regional warlords with this simple admonition: those who fight the Taliban and kill AQ will receive our funds, those who don't will receive our bombs. Let them fight it out and install in Kabul whatever puppet regime the Afghan people could stand. Yes, this would have made for atrocities and corruption and human rights violations, but thus was it ever in the Hindu Kush. At least we would be at a distance, and not directly involved in a place where our only interest was the absence of AQ.
When President Bush decided to expand the mission in Afghanistan to nation-building, I thought it was ill-advised but not impossible. I do not understand the logic of those who say “the US can’t do nation-building.” History will long remember our excellent examples in nation-building: Germany and Japan. We took two of the most militaristic cultures of the twentieth century and turned them into committed pacifists barely able to staff military forces (or in Japan’s case, even call them an “army.”) We took nations literally burned to the ground and rebuilt them into economic powerhouses which eventually rivaled us. This took decades to accomplish, even though we had fought a savage war against each.
But the best comparative examples for US nation-building are Vietnam and Korea. In the first case, we quit, with predictable results for South East Asia. In the latter case, we stayed. Now somewhere I hear a reader crying "Pat, you can't be comparing Afghanistan with today's South Korea!" and to that reader, I say "You're right; I'm not." But Afghanistan at year twenty IS comparable to South Korea in 1971. Let me refresh your memory: the Asia Times described South Korea in 1971 as a "Lost Land, . . . a gritty, poverty-racked, unsophisticated nation that was one decade into an industrialization program that would lead to riches." It varied between democratic leaders, oligarchs, and an occasional military coup. During the previous twenty years, North Korean forces attacked the US and South Korea, killing our soldiers and marines. They continued doing so for the next twenty years.
Korea in the 1970s; where's Hyundai? No doubt dreaming of K-pop!
Now I am not saying Afghanistan was on the path to similar success. But anyone who says we can’t do nation-building is wrong, and anyone who says Afghanistan would never have made it has to explain why South Korea did. Impoverished nation? Check. No democratic culture or history? Check. Pervasive external threat? Check. Persistent US military casualties? Check.
Thirdly (yes, I'm still counting), the President and other senior leaders have said "the Intelligence Community (IC) did not predict such a sudden collapse." Without having been in the room, I know this is true. It is also a red herring: the IC does not predict anything. Prediction is the realm of prophets and seers, not intelligence professionals. I guarantee you that the IC did consider the possibility of such a scenario and included it as a worst-case one. How do I know that? Because if they didn't, the President and others would have said as much and fired those responsible. He didn't (fire them), so they did (consider it).
Likewise, my fourth point is a question. Given that the President has warned for days of a terrorist attack (meaning the IC had good info that an attack was imminent), and given that we remain at war with the Islamic State, and given that the Islamic State is the sworn enemy of the Taliban: why did we wait? Why didn’t we attack the Islamic State in Afghanistan before they attacked us, in order to perhaps disrupt their planning? We didn’t suddenly determine they are our enemy. Nor did we suddenly figure out where ISIS is in Afghanistan. Did we think the Taliban cared? Why the delay? Inquiring minds want to know.
On a tangentially related fifth point, who is advising the President on his messaging? Having him stare into the camera and intone "we will not forgive; we will not forget. We will hunt you down and make you pay" is not intimidating. He shuffles on and off stage. He squints at the teleprompter, which is not atop the camera, so it looks at times like he is not speaking to the audience. Either dim the klieg lights, enlarge the font, or get him contacts or glasses. His speech is halting, and no, this is not his well-understood stutter. Joe Biden has been a public figure for well-nigh fifty years. He never was this forgetful, or confused, or halting. Assuming he is still in command of his faculties (and God help us if he isn't), why are his handlers insisting on putting him in such a bad light?
This is intimidating . . . This is not.
Sixth and finally (I know, you're relieved!), the President most recently said he is following the advice of his senior military commanders. This is always reassuring to hear. The problem is no competent military officer would ever suggest that we conduct a noncombatant evacuation operation (NEO) of unknown size from a toe-hold perimeter around a civilian airport in an urban area. Someone made a decision to pull the military forces out first. Someone made a decision to evacuate Bagram airbase first. Someone made the decision not to accelerate the visa application process. There are reasons why such decisions might have been made: they are not blatantly stupid decisions. But someone made them. Those persons must be identified and given the opportunity to explain themselves, or suffer the consequences if their explanations do not suffice. Blanket admissions of where the buck stops are irrelevant. Who made the decisions?
Dunkirk is an interesting footnote in the history of World War II. It was not decisive in a military sense. The British could afford to equip and field another army. It was the Channel, the Royal Navy, and the brave few pilots of the Royal Air Force whose "finest hour" thwarted any ambitions Hitler had of parading past Buckingham Palace as he had down the Champs-Élysées. But it was the snatching-of-a-small-victory-from-the-jaws-of-defeat that helped stiffen the English spine for the dark days ahead. During this brief period, Churchill gave a series of impassioned speeches in the House of Commons which are long remembered: "their finest hour," "blood, toil, tears, and sweat," and finally "we shall fight on the beaches . . . ." Such is the stuff of legendary leadership.
We are still in a generational war against radical Islamic terrorism. We've had a Dunkirk moment, one of our own making. President Biden explained his decision to withdraw from Afghanistan by asking how many more casualties we should endure. He is now responsible for more casualties in one day than the US experienced in the prior two years. I am still waiting to hear his strategy for the larger war. And for him to stiffen our spines for the fight.