Contact Us


PO Box 3201
Martinsville, VA 24115
United States

Stephen H. Provost is an author of paranormal adventures and historical non-fiction. “Memortality” is his debut novel from Pace Press, set for release Feb. 1, 2017.

An editor and columnist with more than 30 years of experience as a journalist, he has written on subjects as diverse as history, religion, politics and language, and has served as an editor for fiction and non-fiction projects. His book “Fresno Growing Up,” a history of Fresno, California, during the postwar years, is available from Craven Street Books. His next non-fiction work, “Highway 99: The History of California’s Main Street,” is scheduled for release in June.

For the past two years, he has served as managing editor of an award-winning weekly, The Cambrian, and is also a columnist for The Tribune in San Luis Obispo.

He lives on the California coast with his wife, stepson and cats Tyrion Fluffybutt and Allie Twinkletail.


On Life

Ruminations and provocations.

Foxholes don't prove god, just desperation

Stephen H. Provost

Believers are fond of saying, “There are no atheists in foxholes,” as though this statement somehow proved the existence of a god. And not just a god, but their god.

I’m not here to attack anyone’s traditions. The best of societies, in my view, is an open one that allows room for all manner of beliefs — or lack thereof — as long as they’re expressed, rather than imposed. But I do want to point out that the absence of “atheists in foxholes” does not, logically or otherwise, prove the existence of a deity.

To begin with, there are atheists in foxholes, and there's no basis for stating otherwise. (You can’t start with a premise like that and fail to provide evidence for it, and since it’s impossible to prove a negative in a case like this, you’re behind the 8-ball from the get-go.) Millions of people have sacrificed their lives for their principles, and the refusal to compromise those principles under threat of death isn’t exclusively religious. If it were, every soldier tortured would turn traitor rather than die for his or her country. No one would ever give his or her life for anything.

But say, for the sake of argument, that the premise is valid. Let’s assume that, in the face of death, every single atheist will, in fact, call out to some deity in the hope of deliverance. If that were so, would it prove the existence of a god?

Hardly. The mere fact that you want something is no proof that it exists: If wishes were horses, beggars would ride. No, if such an impulse is evidence of anything, it’s that human beings (like other organisms) have a fierce will to survive, and that, in extreme circumstances, they’ll go to extreme lengths to do so.

Darwin’s monkey wrench

If necessity is the mother of invention, desperation is the nursemaid of hope. It’s not religion that impels us to contemplate actions at the far edge of possibility, it is — perhaps ironically — the very Darwinian struggle to survive. (Isn’t it just like that Darwin to throw a monkey wrench into the grinding gears of dogma?)

The impulse that drives foxhole conversions, when they do occur, is the same one that spurs the destitute to spend money on a lottery ticket, even in the face of ten million-to-one odds. It’s the reason a cancer patient might pay thousands of dollars for a snake-oil remedy on the slim hope that something, anything, might ward off the inevitable.

With everything at stake and nothing left to lose, what can it hurt? When all else fails, throw that Hail Mary. It's natural, it's human, and it has nothing at all to do with religion.

Proof of human desperation is no proof of any god. It’s merely proof that well-meaning people will sometimes enter into contracts under duress. Those contracts, however, are never binding on either party. They won’t hold up in a court of law, and the claim that they somehow prove the existence of a deity won’t hold up under logical scrutiny.

You can take that to the bank. Or the foxhole.


Author’s note: This essay is presented not as a critique of a specific belief system, but as a critique of the fallacious arguments used in defense of any belief system. For more on this subject, see Requiem for a Phantom God (2012).

 

How Citizens United paved the way for Donald Trump

Stephen H. Provost

We went to sleep in Bedford Falls, and we’re waking up in Pottersville. A lot of us would rather go back to sleep.

For years, many of us have yearned for a leader who would “run the country like a business.” Well, we got what we wished for, but despite the shock of waking up more than a year ago with a six-times-bankrupt real estate mogul for a president, none of this happened overnight.

There are two kinds of businessmen. There are the old-school merchants who put the customer first, because the customer could always take his business elsewhere. Then there’s the new corporate model, which puts the shareholders first, because that’s where the real money is. Customers can’t nickel-and-dime you to death if you’ve got investors slipping millions into your back pocket at regular intervals.

There are, similarly, two approaches to government. The traditional approach — which made America great in the first place — puts the voter first. Officials are elected to represent their constituents, and if they don’t, those constituents can take their votes elsewhere. But under the new model, big-money donors come first, because they can control the conversation. Voters can’t elect you if they don't know who you are, and they can't kick you out of office if they don’t know you're robbing them blind.

Transformation

We’ve been morphing from the traditional form of government into a corporate model for some time: Ronald Reagan’s supply-side economics and Ross Perot’s third-party candidacy were among the early signs of this progression. But the tipping point came in 2010, when the Citizens United decision opened the floodgates for corporate donors and blew the last vestiges of a fair playing field to smithereens.

Once this model was firmly in place, its proponents thought they’d use it, along with the tool of gerrymandering, to corner the market on public policy for the benefit of their corporate sponsors. One thing they hadn’t counted on, though, was an inconvenient aspect of corporate life: the hostile takeover.

That’s where Donald Trump came in. He knew the voters didn’t like the idea of corporate bigwigs telling them what to do, so he tapped into that, presenting himself as an “outsider” who was ready to “drain the swamp” and take on the Washington elites: notably, the Clinton Democratic machine, but also Republican lawmakers like “Lyin’ Ted” Cruz and “Little” Marco Rubio.

George, I am an old man, and most people hate me. But I don’t like them either so that makes it all even.
— Lionel Barrymore as Mr. Potter in "It’s a Wonderful Life"

Whatever you think of Trump, his takeover of the Republican Party was a masterstroke worthy of “corporate raider” Carl Icahn (who later served briefly in Trump’s administration as a special economic adviser). The Republican establishment, which had banked on corporate support from the Koch brothers, Sheldon Adelson and their ilk, was nonplussed at the idea of someone outside their ranks turning the tables on them.

Cruz called him a “pathological liar,” “utterly amoral” and “a narcissist at a level I don’t think this country’s ever seen.” Fellow candidate Lindsey Graham said Republicans should “tell Donald Trump to go to hell.” But that was during the primaries. Cruz eventually endorsed Trump (conveniently forgetting Trump’s insults toward his wife and father), and Graham now plays golf with him on a fairly regular basis.

Why the change?

Two reasons: Trump runs the show, but it’s still their show.

Since assuming office, Trump has been anything but an outsider. In fact, he’s become the very thing he ran against in the primaries, morphing into the quintessential NeoCon Republican. During his first year in office, he has, almost without fail, championed the same causes establishment Republicans have supported for years: increased military spending, anti-gay policies, regulation rollbacks and overt “patriotism.” But he’s done so while playing to the crowd as though he were still an outsider.

This is likely one reason Trump has clung to his tweeting habit so tenaciously. His rash and often offensive outbursts, and the conspiracy theories that go along with them, are all that separate him from the people he ran against in the primaries. He’s basically keeping up appearances.

Whether he’s a maverick or a traditional Republican at heart doesn’t matter to Trump, just as ideology doesn’t matter to most CEOs. It’s the bottom line that counts, and for Trump, the bottom line is his own ego. The Republicans who railed against him in the primaries have figured this out, and they know he’ll execute their agenda as long as they play along with his little charade. So, that’s exactly what they’re doing.

Imperfect storm

No wonder people on the other side of the political fence are so enraged. To them, the current situation is the worst of both worlds: a Republican majority that’s still indebted to corporate interests, working hand in glove with a president who lacks a moral compass and who insults friend and foe alike.

Trump’s Mad Hatter act is, in part, a function of who he is — a self-serving narcissist who uses chaos to further his own ends. But it’s also a function of the new corporate government system we’ve created. Under the corporate model, a board of directors makes policy to benefit shareholders (not customers), and the CEO both executes and sells that policy as the face of the company. Think Bill Gates or Steve Jobs, Richard Branson or Carly Fiorina. Or, in Britain, the royal family.

Trump likes to think he's royalty, with Mar-a-Lago as his palace and a bunch of toadies groveling at his feet.

Whatever else he is, he's the face of our nation, and it’s an ugly one, rather like Mr. Potter from It’s a Wonderful Life. Maybe we’re not threatening to jump off a bridge, as George Bailey did in that iconic film, but some people are threatening to move overseas and a whole lot of others are distraught, disconsolate and downright embarrassed.

Trump didn’t create this mess on his own. He merely stepped into the role we created for him when, fed up with gridlock and do-nothing lawmakers, we clamored for a "businesslike" approach to government. We asked for it; now we’ve got it. But is this really what we had in mind?

The sad irony is that we hired a third-rate businessman with a first-rate ego to work for the top 1 percent of Americans.

Welcome to Pottersville, otherwise known as Trumpsylvania. But don’t make yourself at home. In this little slice of faux-Rockwell Americana, foreclosure’s always just around the corner.

 

Spanking violates everything we say we believe in

Stephen H. Provost

Why is hitting someone OK?

I'm not talking about self-defense; I'm talking about taking your own initiative to hit someone who isn't threatening you.

That would be bad enough. But what about hitting someone who can't fight back?

Our society condemns "kicking people when they're down." Football players are penalized for late hits. Boxers can lose points for hitting below the belt, and shooting someone in the back is considered the coward's way.

But somehow these rules don't apply to the most defenseless among us, those least capable of fighting back: young children. Somehow, spanking a child is viewed not only as appropriate, but necessary by a majority of Americans. It's rationalized as a "teaching tool" or a "deterrent" or a way to impose social norms on kids who don't know any better.

"Spare the rod, spoil the child," the saying goes.

LESSONS LEARNED

But how is that different from "teaching someone a lesson"? That's what spanking is supposed to do, right? Teach the child a lesson?

First point: It doesn't work. A 2016 study by professors from the universities of Texas and Michigan found that the more children are spanked, the more apt they are to defy their parents. They're also more likely to exhibit anti-social behaviors and to develop mental health and cognitive problems. So, not only does spanking fail to achieve its supposed goal, it makes the problem worse. And not just for the kids, because ...

Second point: It doesn't stop there. Now, a new study has found that children who are spanked are more likely to engage in dating violence. The kids who are spanked aren't the only victims; they're more likely to victimize others, too.

Apparently, they are learning a lesson ... just the wrong one. They're learning it's appropriate, even desirable, to inflict physical pain upon people when they're at their most vulnerable.

Children can't fight back. They trust their parents implicitly, and spanking breaks that trust. It creates a conundrum of cognitive dissonance: "This person loves me, but he's hurting me." There are two ways to resolve this. Either the child can defy the parents (as the 2016 study found is more likely to occur among those who are spanked) or that child can learn to equate corporal punishment with love.

DATING ABUSE

It should come as no surprise that spanking is predictive of physical abuse in dating relationships, which also involve high levels of trust and vulnerability. If you agree to go out on a date with someone, you presumably like them (at least a little), and you put yourself in a position of being vulnerable, both emotionally and in terms of physical proximity.

The link to future sexual abuse in the dating study should hardly be surprising: Spanking children not only involves hitting the most vulnerable people among us, it entails hitting them in one of their most vulnerable areas (the buttocks): an area that, in our society, remains covered in public because of its sexual associations.

If the person you're dating thinks it's appropriate, or even an expression of love, to hit you, trust and vulnerability go out the window. Not to mention that the person has just engaged in a criminal act (assault) according to our social norms.

But those same social norms tell us it's fine to spank a child. Parents can't be prosecuted for it, and they don't even have to endure much (if any) societal disapproval for it. A United Nations committee calls the practice "invariably degrading," and 53 countries ban corporal punishment outright, but the United States isn't one of them.

Indeed, nearly three-quarters of the U.S. population agrees or strongly agrees "that it is sometimes necessary to discipline a child with a good, hard spanking."

The evidence against spanking is one of the most consistent findings in the field of psychology.
— Elizabeth T. Gershoff, associate professor of human development and family sciences at the University of Texas at Austin

The upshot: We tell our kids not to "resort to violence" and urge them to solve problems rationally, while at the same time resorting to violence ourselves ... and violence that's anything but rational, since it doesn't work.

I find this incomprehensible. When it comes to how we, as adults, treat other adults, we condemn "throwing the first punch" and justify physical violence only in self-defense. We don't shoot people in the back. We don't pile on after the whistle blows or the bell rings. We observe the boundaries that apply across society ... except, inexplicably, to the most vulnerable among us, our children.

Spanking doesn't work. It makes the problem worse. It's predictive of adult violence. But most of all, it's wrong.

It's wrong to hit someone without provocation, to inflict pain, and it's even more egregiously wrong if that person is defenseless. That's what we're supposed to believe as a society.

So why the hell do we keep doing it to our kids?

Calling people "useless" isn't a useful way to discuss race

Stephen H. Provost

I’m not deplorable, but I guess I’m useless.

Aren’t I? Of course not.

Neither are you, and I don’t care about the color of your skin, your political affiliation, what country you’re from, or what language you speak. You’re worthwhile. With a few notable exceptions (I’m thinking mass murderers, child molesters, serial abusers, people who torture animals), we all are. Flawed? Of course. Misunderstood? All too often. Selfish? Well, yeah, at times. But useless?

Who says?

Damon Young does, in the title of his article “The Most Useless Types of White People, Ranked.”

It’s a list.

Provocative headlines and the list format have been touted as ways to increase site visits (i.e., clicks), and the VSB site where Young is editor-in-chief does accept advertising. So, if I were a cynic, I could excuse the blatantly provocative headline as a ploy to increase site visits and, therefore, ad revenues. But that doesn’t explain the combative nature of the content itself.

Not only does Young consider certain types of white people useless, he ranks some as more useless than others. Forget the fact that this makes no sense (either something is useless or it’s not – if it has some use, any use at all, it’s not, by definition, useless). The real problem is that demeaning certain “types” of people belonging to a particular race is offensive. Pretend you’re playing Mad Libs. Just substitute any other group label for the term “white” in the headline, and you’ll see what I mean. Try Latino, Asian, Irish, Catholic, Muslim, Jewish, obese people, redheads … whatever.

No matter what term you use, this headline is offensive.

Then there’s the content. Young isn’t listing 10 side-effects of a particular drug or 10 vegetable dishes that go well with roast beef. He’s put together a list of 10 insults, some of which he apparently means to be funny. They’re not.

This will really facilitate dialogue between people of different races. Right.

I couldn’t pass up the opportunity to respond to Young's article on each of its 10 points. What follows is his list of 10 “useless” types of white people (in italics) and my response to his comments on each:

10. The “I would have voted for Obama again” guy, which, since a three-term presidency is literally not possible right now, is like saying, “I totally would have killed a velociraptor, dude. Totally.”

So, expressing a willingness to vote for Barack Obama for a third term if given the chance makes someone “useless.” Maybe it would have been more useful to vote for Trump? Or equivocate on Charlottesville? Or defend Confederate monuments? But useless is useless, so I suppose all these things are equal, right? Second point: Expressing a desire to do something that’s beyond your power does not make you useless. More likely, it means you care, which is a lot better than apathy.

9. The “Why can’t we just forget about our differences and come together?” guy, who’s usually the exact-same guy as the “I don’t see color” guy and the “Why does everything have to be about race?” guy and the “I’ll have a Coke Zero, please” guy.

Forget about our differences? Nah. That wouldn’t be any fun. It would make us all a bunch of clones (kind of like Rush Limbaugh’s radio listeners) and render life ridiculously dull. I’m not a clone, which is kind of the point, because not all people listed here are useless. They’re all different, and it would be nice for Mr. Young to acknowledge as much rather than trying to categorize people based on overly simplistic click-bait lists.

Coming together? Seems to me that would be a good thing. This country is so badly fractured that the longer we take to find common ground, the harder it will be to heal. Finding common ground does not mean, “You do things my way and pretend to like it, or you’re fired” (the Trump approach). It means finding things people of different backgrounds can honestly agree on and building something together from there. Finding commonality also doesn’t mean ignoring diversity. But, on the other hand, affirming diversity doesn’t entail calling people useless.

(I'm not sure what Mr. Young has against Coke Zero, but I'm a Diet Pepsi drinker myself, so I guess I don't fit in this category.)

8. The “White people are so terrible. I hate us” guy.

I’m white. I’m not proud of it. I’m not ashamed of it. It doesn’t make my words any more or less credible. Hating people based on skin color is ignorant and destructive. So is labeling people you don’t even know as “useless.”

7. The “Let me write this 2,500-word column attempting to explain and empathize with racist white Trump voters without using the word ‘racist’ once in my 2,500 words” guy.

So, now it’s “useless” to try to explain people’s actions, too? Think about the most despicable person you know, the person who poses the biggest threat to you and your family. Don’t you want to understand what makes that guy tick? Wouldn’t it help you defend yourself if he decides to attack you? The thing is, I can’t imagine any of the people who fall into Young’s categories would consider themselves his sworn enemy. What’s so wrong about trying to understand each other? Point 2: I can’t empathize with white racist Trump voters. I don’t want to. But explaining someone’s actions and supporting them are two different things: like velociraptors and Coke Zero.

6. Bernie Sanders.

Putting universal health care in the spotlight is a bad thing? I suppose it is – to the insurance providers and Republicans who’ve spoken out most forcefully against it. I could always be wrong, but I’m assuming Mr. Young doesn’t fit into either of these categories. How about working toward a more affordable educational system? Is that bad? Since Mr. Young considers Sanders “useless,” I suppose he must think so. But congratulations, Mr. Young, you’re not alone in thinking these goals are “useless.” So does that Trump fellow you mentioned.

5. The “I just really enjoy making sex things with black people, and I hate when conversations about race complicate those things because I just want to make sex things!” guy.

I find it hard to believe that someone who would “just want to make sex things” with black people is even open to having an in-depth conversation about race, culture or human rights. Worse, the hypothetical person being quoted here is objectifying people. Remove the word “black” from Young’s first sentence, and you’ll get the equally offensive “I just really enjoy making sex things with people.” Note: White guys don’t have any kind of monopoly on this, and it’s offensive whoever does it.

4. The “Race issues are really just class issues” guy.

They’re not “just” class issues, but economics has been used as an excuse for establishing a race-based class system, and race has been used as an excuse to perpetuate that system. You can have racism without economic issues, and there are economic issues that have nothing to do with race. But narrowing the wealth gap for all Americans would benefit minority communities, just as ending ingrained racism would help address artificial class issues. Race issues obviously aren’t “just” class issues, but to suggest that race and class issues are entirely separate in this country would be even more inaccurate. I don’t think that’s what Young’s suggesting. At least, I hope not.

3. The “I want to have debates with you about racism because, to me, it’s a fun and lively and energizing thought exercise” guy.

Lively? Energizing? Not in the current climate. We’ve separated into camps, with absolutist true-believers/litmus-testers and rigid positions on both sides. Facts don’t matter; opinions and feelings are considered more important (which explains why Trump voters continue to support him). We use labels to ridicule and dismiss those who challenge or disagree with our assumptions, even to the slightest degree – labels such as, gosh, I don’t know, “useless.” For the record, I have no interest in “debating” Mr. Young. I’m simply stating my opinions – just as he stated his on his blog.

2. The “Let me unload this 18-minute-long tome on my feelings on race and racism and savory grits on you right now even though you’re just standing in line at Potbelly and you clearly just want to get a quick and cheap sandwich and not be my personal Pinterest board of white guilt” guy.

If someone invades your personal space while you’re standing in line, minding your own business, you have every right to tell the person to get lost, whether he’s going on about “white guilt,” mass shootings, the Golden State Warriors or the price of tea in China. Again, this isn’t a “white guy” problem, it’s a rudeness problem.

1. The “I did something nice for black people, like, 37 years ago, and I’m going to continue to bank on that one thing like there’s a ‘Did something nice for black people’ craps table and I already cashed out” guy.

Sure, it can be tedious to hear someone going on and on about having done “something nice” like it’s a badge of honor. But that’s still better than doing something mean. And it doesn’t make the person “useless.” As with No. 2, you don’t have to listen.

You don’t have to read this, either, Mr. Young. But if you do happen to come upon this response while surfing the Internet, I’d like you to know one thing: People aren’t useless just because they fit into one of your 10 categories. Maybe they are to you, but they might say the same thing about your list. And what does any of this accomplish? Further alienation, misunderstanding and animosity? So much fun.

People aren’t useless, and they don’t like being used, either, certainly not as straw men for top 10 lists compiled by people they’ve never met. And certainly not as pawns in culture wars where everyone ends up losing.

Ice cream, logic and the Second Amendment

Stephen H. Provost

You’re hungry. You want to go out and buy a gallon of ice cream, despite the fact that you’re diabetic and doing so could kill you. But hey, we’ve all got to eat, right? Never mind the fact that you’re already at a healthy weight and in no danger of starving without that ice cream.

You’re thirsty. You decide to go to the bar and have a shot of tequila. Then a gin and tonic. And while you’re at it, you’d like a pitcher of beer to wash it all down. After a while, alcohol poisoning becomes a real possibility, but before you even get that far, the juice will begin to impair your judgment and lower your inhibitions. A one-night stand with the wrong person, a barroom brawl or, worse still, a fatal accident on the interstate could be just around the corner. But it’s all good because people have to drink, don’t they?

But do you have to drink alcohol? Sure, it’s liquid, but drinking too much of the stuff can actually leave you dehydrated.

Countless bad decisions have been justified by the phrase “I need that” —when the person doesn’t really need the thing at all. He or she may want it, to be sure, but as Mick said, “You can’t always get what you want.”

Unless, that is, you can convince other people you need it.

Ice cream and guns

Enter the Second Amendment to the U.S. Constitution: “A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.”

To paraphrase: “We need a militia to keep us safe, therefore …”

I’ll put aside the difficulties of defining “the people” and “arms” for now, because I want to focus on the premise. The writers were clearly saying, “We need this, so we’re going to guarantee that.”

But here’s the rub: In an age of standing armies, we no longer need a militia.

When a premise is obsolete, any conclusion drawn from it must be questioned. You don’t need a gallon of ice cream if you’re in no imminent danger of starving —and even if you were, another food source would work just as well.

In the same way, you don’t need a militia in an age when you're protected by the world’s most sophisticated, heavily funded standing force. The premise no longer holds, so the conclusion collapses.

The demands of logic

The Supreme Court majority disagrees with me. Its argument, stated in District of Columbia et al. v. Dick Anthony Heller, is that “apart from that clarifying function, a prefatory clause does not limit or expand the scope of the operative clause.”

In other words, the premise doesn’t matter, because what follows could stand on its own.

To illustrate this, the court replaces the actual introduction with an unrelated premise — a non sequitur. The Second Amendment, it argues, would be nonsensical if it read, “A well regulated Militia, being necessary to the security of a free State, the right of the people to petition for redress of grievances shall not be infringed.”

The first part obviously has nothing to do with the second.

But this straw man argument utterly fails to address the question that remains: If the conclusion could stand alone, without the premise, why did the framers include that premise in the first place?

The court answers its own question in the Washington, D.C. opinion by stating that “logic demands that there be a link between the stated purpose and the command.”

Logic demands.

With these two words, the court has given the correct answer to the question of why the framers included the introductory clause: It is, in fact, the premise in a logical argument.

Having it both ways

As we’ve already seen, though, a conclusion is worthless if the premise is invalid: Without the premise, it becomes merely an assertion. As a conclusion, it collapses under its own weight.

We’ve also seen that, in an era of standing armies, the premise that a “well regulated Militia is necessary to the security of a free State” simply isn't accurate. It becomes just as nonsensical as the hypothetical non sequitur the court introduced concerning the petition of grievances.

The court simply cannot have it both ways. It cannot, on the one hand, assert that the conclusion stands on its own regardless of the premise while, at the same time, maintaining that both are components of a logical argument — thus making the premise necessary to support the conclusion.

The premise was valid when it was written. The disparate collection of rebels who formed our fledgling nation did, in fact, need militias to guarantee their security back in the 18th century. But that doesn’t mean we need them today. The premise is no longer valid and, therefore, neither is the conclusion.

To argue otherwise would be to state that the framers might as well have included that hypothetical premise about the redress of grievances. Or, for that matter, a belief in astrology. Or the quest to land on the moon. Or anything else you’d care to mention. The majority justices in this opinion are basically suggesting that the framers could have used anything to fill in the blank, as though they were playing a game of Mad Libs.

But they weren’t. They were making a logical argument — as the court itself affirms. The premise they included in the Second Amendment wasn't some random statement without any bearing on the conclusion. It was, in fact, something that the framers saw as a necessary component of a logical argument.

The fact that the reasoning is obsolete doesn't change that, no matter how much the court majority might wish it would.

Mental gymnastics

The majority is, in fact, trying to perform an impossible task. On the one hand, it seeks to maintain the Constitution, and specifically the Second Amendment, as an essential component of the nation’s social contract — a necessary premise upon which our system of government rests. At the same time, however, it must deal with the fact that a premise within the amendment itself is no longer valid.

That’s quite a conundrum, and it helps explain why the courts and the nation as a whole are so closely divided, philosophically speaking, on this issue. (They’re divided on a practical level as well, by competing agendas, but that’s another issue.)

We don’t like the idea of admitting that something in our founding documents is no longer relevant, because we’re afraid that in doing so, we might cast doubt on the rest of their contents. We therefore fall into the trap of defending the authority of the documents themselves, rather than affirming the principles upon which they rest: violating the spirit of the law in a vain attempt to preserve the sanctity of the letter; creating fallacious arguments to prop up outdated logic.

Where does that logic lead us?

Toward that tub of ice cream or that bottle of whiskey. To something we no longer need but still want. One could argue that we, as a nation, have the same attitude toward guns that the gluttonous man has toward his ice cream or the alcoholic has toward his Jack Daniels. In all three cases, we invoke a perceived need as an excuse to continue feeding an insatiable appetite that isn’t good for us.

We continue to defend the outdated logic that we need guns for one purpose in order to preserve our right to wield them for other reasons entirely.

Burden of proof

I’ve been told that, in order to find a flaw in the Second Amendment, I’ll need to change the Constitution. But I disagree. The logical flaw is there, right in front of our noses, and our failure to acknowledge it won’t make it disappear.

There are other reasons to bear arms, but we can’t infer from the document as written that these are sufficient to secure a right to do so. And we can’t simply cast aside the premise of a logical argument that was an essential part of the document as written … unless, that is, we amend the amendment. The burden of doing so must be placed squarely on the shoulders of those who believe in the right they want to uphold: either by removing the archaic premise about militias entirely, or by replacing it with another premise altogether — such as a right to individual self-defense.

But it’s impossible, in my view, to deny that the amendment, as written, is an invalid argument. And once we admit that, we must also acknowledge that such an argument is not fit to serve as a guiding principle for a great nation.

Guns are, most certainly, dangerous. But it’s far more dangerous to engage in mental gymnastics to convince ourselves that something’s logical when it isn’t. Guns may kill the body, but logical fallacies destroy the mind.

This is what we’ve come to. The Supreme Court majority is flat wrong. Its reasoning simply backfired.

Discovery vs. Orville: Where no one has gone before ... and back again

Stephen H. Provost

On stardate 1672.1, Captain Kirk was the victim of a transporter accident that split him into two distinct sides of himself. (Actually, this happened in 1966, in an episode of Star Trek called “The Enemy Within.”)

Flash forward 51 years, and the same thing has happened to Star Trek itself. It feels like the franchise has been caught in a transporter accident and split in two, with the result being one show that expands on the vision and scope of its predecessors, and another that has inherited many of the qualities that made it so much fun to watch.

This year's Star Trek: Discovery and The Orville are both descendants of Gene Roddenberry’s original series, a family of shows that has now evolved beyond The Next Generation.

What’s happened to the Star Trek franchise is kind of like what happens to a rock band when the guitarist and singer have a falling out, and each starts a separate band that sounds a little (but not exactly) like the group they formed together. The result will be endless comparisons, with fans likely enjoying both but, at the same time, many wishing the guitarist and singer could just bury the hatchet and make music together the way they used to.

Now, imagine one of those bands returns to the studio and spends a ton of money recording an ambitious new rock opera, while the other goes out on tour, playing all the old hits and having a little fun at its own expense. The first band is Discovery, and the second is The Orville.

Four episodes in, I’m watching — and enjoying — both. But I’m also, if the truth be told, wishing for a reunion tour.

Discovery is the more focused of the two. So far, it’s zeroing in on a single character, the disgraced but brilliant Michael Burnham, and following a unified story arc that involves a nascent war between the United Federation of Planets and the Klingon Empire. Star Trek started using story arcs, to fine effect, during the 1990s with Deep Space Nine, but before that served up mostly self-contained episodes. It also became adept at introducing us to a large number of interesting regular and recurring characters — who we came to care about because each explored the nature of our own humanity in his or her own way.

Perhaps ironically, this is where The Orville has the advantage in the early going. We’re already painfully aware of the awkwardness between the captain and first officer, who've been through a painful divorce; of the challenges facing the super strong security chief in dealing with her youthful insecurity; and of the family dynamics involving a crew member, his same-sex spouse and their newly hatched (yes, hatched) child.

All very human and all very familiar. They can’t call Seth MacFarlane’s series Star Trek, because CBS owns the rights to that name. But Brannon Braga, who’s created or developed several corners of Roddenberry’s universe, is an executive producer, and the cast includes members who seem a whole lot like Worf and Data from The Next Generation.

Indeed, The Orville is more like that show than it is like Galaxy Quest — the comedic send-up that both spoofed and paid homage to the original — and the humor can be uneven (the sniping between Capt. Mercer and his ex, Cmdr. Grayson, has already started to wear thin). But at least there is humor, which can be hard to find — apart from the stray tribble or a subdued one-liner — on Discovery. No incarnation of Star Trek has ever aspired to be a laugh-fest, but there’s always been enough humor to leaven the heady, ambitious storylines.

When it comes to special effects, Discovery is light years ahead of The Orville — although it’s odd how sophisticated the technology looks compared to that in the original series, which was filmed a half-century earlier but is supposed to take place a decade after Discovery. Unaccountably, Discovery’s sleek starship with its rotating saucer section looks like it belongs 100 years in Kirk and Spock’s future, not their past. (At least Enterprise, which was set before either series but filmed more than three decades after the original, was designed to look like a bridge between NASA and the Federation.)

The Orville’s sets look like throwbacks to Next Generation or the 1980s movies, even though it doesn’t (technically) even take place in the Star Trek universe — and isn’t therefore bound by any constraints of continuity. It’s not as impressive to look at, but neither are reruns of earlier Star Trek series — which are still just as much fun because of the stories they tell and the insights they provide into our own humanity.

I trust Discovery will delve into some of those insights. The most intriguing human relationship, between Burnham and Capt. Georgiou, was short-circuited by the latter’s death at the end of Episode 2. But the Discovery’s captain, Gabriel Lorca, shows signs of developing into an interesting, multi-faceted character and, given time, here’s hoping others in the crew do, as well. The considerable time spent developing the Klingons has slowed things down a bit, especially considering how much Klingon dialogue is presented in subtitles — which may please Star Trek geeks but frustrate newer fans. (For the record, I’ve seen every episode of every Star Trek series; I don’t know whether that makes me a geek or not, but I still find the subtitles get in the way.)

That’s a minor quibble, and I’m not complaining. Two heirs to the Star Trek television legacy are infinitely better than what we’ve had for more than a decade: zero. Still, I can’t help but hope each will learn a little something from the other.

That would make the future — imaginary, visionary or otherwise — even brighter.