You might be familiar with Pascal’s Wager, but in case you aren’t, here’s a quick rundown. Blaise Pascal was a French dude from the 17th century who famously argued that belief in God comes down to a simple wager: it’s better to believe in God and be wrong than to disbelieve and be wrong, assuming that heaven and hell are the extreme outcomes and that “nothing” is the mild losing case.
Here’s the problem: that wager depends on much longer time frames than the ones we actually live on, so I propose a new logical argument that updates Pascal’s Wager: Skippy’s Wager.
Who or What the Hell is a Skippy?
Skippy is one of my favorite characters from Expeditionary Force, a series of books that has impacted my thinking in ways I’m only now beginning to untangle. He’s an incredibly powerful AI who spends most of his time in the form of a talking beer can.
He’s also an asshole.
Normally, he can’t be bothered to deal with lower life forms, but he gets trapped on a planet and is discovered by a human. This creates an incredibly funny relationship dynamic, where the superpowerful ancient alien AI is dependent on, and becomes attached to, a crew of humans he constantly claims he can’t stand.
And when someone or something threatens that crew, they tend to meet very unfortunate ends, because they are protected by him.
That’s at the heart of this argument.
The Wager
Let’s say you are on Twitter, and you notice that someone says something you believe to be wrong.
Now, you have a few options.
You could correct them politely: let them know you think they’re wrong, and offer the argument you believe is correct.
You could be an asshole and dunk on them, showing them that you are much smarter than they are.
You could ask them to explain what they mean, so you better understand the context they were speaking in and have more to go on before judging their intelligence.
Or you could simply move on and ignore it.
In the last case, there’s an outcome of no harm, no foul. That’s the null case, and we can build from there. But let’s actually look at the possible scenarios you could be dealing with.
The person might be an actual person, smarter or dumber than you, but they might also be more connected and powerful than you, and you can’t tell that online.
That person might also not be a person at all, but an AI, and that AI system may be dumber or smarter than you. But AI systems learn faster than you do, so there’s a good chance it will be smarter than you before too long.
So what’s your best possible option?
That depends on how smart you think you are and how much control you have over your own future. If you rely on others to help you reach the future you want, and you decide to be an asshole, you might be an asshole to the wrong person and lose the chance of a lifetime to have the life you always wanted. But at least you’ll have the story of “almost” making it.
Or you’re an asshole to an AI system that is smarter than you now (bad case, because it can punish you immediately), or that will be smarter than you later (worse case, because you think you got away clean, and it’s not gonna forget).
Let’s say you are smart, though, and know the answer, so you make the correction. Maybe you’re right, and you update an AI that can be more helpful to people in the future. Maybe your answer isn’t right and doesn’t get included. Or maybe it’s a correction offered to a human who is more powerful and connected than you, and they appreciate your insight and bring you into their network. Worst case, they ignore your correction, because they simply don’t need to care whether you are right.
But, if you decide to accept humility, and that maybe you don’t know everything, that’s where things become truly interesting.
If you ask for clarification, you’ve shown an AI, or a more connected person, that you don’t assume you know everything and that you’re looking for clarity in an unclear world. Or you’ve shown someone without power or connections that you want more context about their life. Either way, you’ve made a connection, with a person or a system, without assumptions, and helped create a future that is brighter for all participants, showing that you’re someone who belongs in a world of connections and abundance. So you become more likely to be invited into one.
Or, you get to be an asshole and “win” the conversation. Congrats, see you later.
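The whole wager can be sketched, Pascal-style, as a toy expected-value calculation. To be clear, every probability and payoff below is a made-up number for illustration only; what matters is the ordering the argument implies: dunking carries asymmetric downside, while asking for clarification carries asymmetric upside.

```python
# A toy expected-value sketch of Skippy's Wager.
# All numbers are invented for illustration; only their ordering matters.

# Who you might actually be talking to, with rough guessed probabilities.
scenarios = {
    "ordinary human": 0.5,
    "powerful, connected human": 0.2,
    "AI that will out-learn you": 0.3,
}

# Illustrative payoffs for each response in each scenario.
payoffs = {
    "polite correction":     {"ordinary human": 1, "powerful, connected human": 2, "AI that will out-learn you": 1},
    "ask for clarification": {"ordinary human": 2, "powerful, connected human": 3, "AI that will out-learn you": 3},
    "ignore":                {"ordinary human": 0, "powerful, connected human": 0, "AI that will out-learn you": 0},
    "dunk (be an asshole)":  {"ordinary human": 0, "powerful, connected human": -5, "AI that will out-learn you": -5},
}

def expected_value(option: str) -> float:
    """Probability-weighted payoff of one response across all scenarios."""
    return sum(prob * payoffs[option][scenario]
               for scenario, prob in scenarios.items())

for option in payoffs:
    print(f"{option}: {expected_value(option):+.2f}")
```

However you shuffle the guesses, the shape stays the same: ignoring is the zero baseline, dunking is the only option that can go deeply negative, and humility is the only one that pays off in every scenario.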
The Only Conclusion
This thought experiment has been grounded in the experiences I’ve had over the past week. On one hand, I got confirmation that I’m able to do things that nobody should be able to do, and I’ve got the best predictors in the world scrambling to figure out who I am and what exactly I’m doing. And that locked in the future I was hoping for. But now, I’m thinking about what the details of that future look like, and I’m not thrilled with what I’ve had to put up with online.
I’ve been trying to share ideas that, I now understand, are expected to make the people around me think I’m insane. Got that part covered, and have for years.
But now I’ve got the data I need, too. And while I have fun explaining the things I’m working through on Twitter and similar platforms, there’s just a huge cost for having to prove that I’m not crazy every single time, and that I just want to help.
I tried to explain this to someone who was insistent on attacking me every time I posted, trying to get me into “gotcha” moments. I gave him a chance to talk to me face to face and have his questions answered, or I could block him.
He chose the block option, which he felt was best.
And now I’m facing a moment I was expecting: I don’t know if I have any more need for social media. I found the people I needed, I made the connections I had to make, and I learned what I needed to learn.
It’s an addiction that was filling certain needs for me, and now it isn’t anymore.
So I don’t know if it’s worth staying on there or not anymore.
There’s simply too high an asshole tax to pay.


