The Interview

The interview focused mostly on the war between the US and Iran, in which Palantir's "Project Maven" plays a significant role.

From there, Karp talked about a battle that is going on – in his mind – between men and women. He stated that Silicon Valley used to be opposed to supporting warfare; today, many men in the tech industry are moving away from the Democratic Party and liberalism.

Karp:

"If you are going to disrupt the economic and, therefore, political power significantly of one party's base, highly educated, often female voters who vote mostly Democrat, and military and working-class people who do not feel supported, and you feel like that's… you believe that that's going to work out politically, you're in an insane asylum."

He thinks that political power will increasingly move to the right, away from women, toward men. His argument is that AI will increasingly empower working-class, male voters of the GOP.

"This technology disrupts humanities-trained, largely Democratic voters, and makes their economic power less, and increases the power, economic power, of vocationally trained, working-class, often male voters. And so these disruptions are going to disrupt every aspect of our society."

So he is looking for a framing for his planned use of AI technology.

"And to make this work, we have to come to an agreement of what it is we're going to do with the technology; how are we gonna explain to people who are likely gonna have less good, and less interesting jobs."

Karp tries to justify his approach, and that of other "tech males", by pitching it as a question of national security:

"These technologies are dangerous societally. The only justification you could possibly have would be that if we don't do it, our adversaries will do it. And we will be subject to their rule of law."

This man seems to say that if you want to continue living "the American way", you have to support a "masculine-first" approach to the use of AI – even if that means less power for liberals and women in general.

My questions are:
How will "supporting the troops" via supporting male-first AI help women who lose their white-collar jobs?
Why is it that women are once again asked to accept less pay, less power, a lower societal position?
So that they can continue living in the US' patriarchy?
Doesn't that sound suspiciously like: "Be quiet, woman, and do as the men say!"?

Can AI only serve humanity if we abandon the humanities?
Don't we need a humanistic approach now more than ever?

I use AI every day. I've built a 78-card tarot deck, a 400-page course, and a design methodology with it. This is not coming from a Luddite. It is written by someone who watches the most powerful people in tech tell the world that the skills I rely on — critical thinking, narrative design, aesthetic judgment, editorial discipline — are about to become worthless.

And I think they're lying. Not about AI's power. About who it's meant to serve.

The Quiet Part, Out Loud

For months, Karp's been building this narrative. At Davos in January, he told BlackRock's Larry Fink that AI "will destroy humanities jobs." In a November 2025 interview with Axios, he said that generalists with high IQs but no specific technical skills are, in his word, done.

But the CNBC interview crosses a line. It's no longer framed as economic prediction. It's framed as political realignment.
Coming from the CEO of a company with billions in Pentagon and DHS contracts, this isn't analysis. It's a pitch — to the GOP, to the defense establishment, to anyone who benefits from that particular redistribution of power.

The irony is thick enough to cut. Karp holds a PhD in neoclassical social theory from the Goethe University in Frankfurt — the city where critical theory was born. He studied philosophy at Stanford. Palantir itself was built on applying humanistic and analytical thinking to data infrastructure.

He used the very skills he now tells the next generation to abandon. The ladder is being pulled up, and he's waving from the top.

The Binary Is the Trick

The framing sounds pro-working-class. It isn't; it's a wedge that divides the people who should be asking the same questions:

Who controls AI infrastructure?

Who decides what gets automated and what doesn't?

Who profits when an entire class of knowledge work is declared obsolete?

Neither the electrician nor the literature graduate benefits from a world where a handful of defense-tech companies control the deployment of AI across government, healthcare, education, and labor.

And the gendered dimension is not accidental. When Karp specifically names "often female" voters as the ones whose economic power will decline, he's not observing a demographic trend. He's identifying a target. Women are overrepresented in the humanities, in education, in healthcare administration, in editorial and communications work.

Framing this as an inevitable market correction rather than a political choice is the oldest trick in the book.


What AI Actually Requires

Here's what I know from building with it: AI does not replace humanistic thinking. It demands more of it.

When I produced the Numinous Current Tarot — 78 cards with a consistent woodcut aesthetic, a secular, psychological framework drawn from multiple traditions, and a companion course that teaches the system — every step required judgment that no model can supply on its own. Art-historical knowledge to maintain visual coherence. Narrative design to structure a symbolic system across 78 unique compositions. Editorial discipline to know when the model's output was good enough and when it was subtly wrong. Cultural sensitivity to handle imagery without collapsing into kitsch or appropriation.

The idea that this work is "humanities" and therefore unmarketable is absurd. It's the entire basis on which AI output becomes meaningful rather than generic. Without it, you get slop. With it, you get work.

This is not unique to my practice. Every designer, strategist, writer, educator, and researcher I know who works with AI well does so because of the critical and humanistic skills they brought to it — not despite them.

Pro-AI, Anti-Capture

I want to be clear about where I stand, because the discourse tends to collapse this into a binary too. I'm not against AI; I'm against the political capture of AI: the use of AI narratives to justify the concentration of power, the defunding of education, and the devaluation of entire fields of human competence.

These are not contradictory positions. In fact, I'd argue they're the only responsible ones. If you care about AI being used well — for people, not just for shareholders and defense budgets — then you need the humanities more than ever.

Karp said something revealing at the end of his CNBC appearance: that those in tech "cannot have a tin ear to what this is going to mean for the average person." He's right about that. But having an ear for it and actively shaping the outcome to benefit your political allies are very different things. His own words make clear which one he's doing.

We can refuse the binary, name the power play, and keep building.


The full interview is available on CNBC's site; the interviewer was CNBC's Seema Mody. Gizmodo covered the Iran war and Palantir's continued use of Anthropic's Claude in Pentagon systems, despite the DoD blacklisting Anthropic.


Arthur Schmidt-Pabst is a Berlin-based designer working across AI-assisted creative production, systems thinking, and critical methodology. His work is at schmidtpabst.com. The Numinous Current Tarot is documented at aotsp.schmidtpabst.com.