Palantir CEO Alex Karp stood before a room of tech executives and investors this week and used a disability slur—twice—to describe people who don’t believe the federal government would nationalize their companies. Let me be direct about this: Karp used the R-word. He deployed it casually, confidently, in a room full of people who apparently didn’t blink. And the substance of what he was saying—the threat wrapped inside the slur—should alarm the disability community even more than the language itself.
I want to address both. Because they’re connected in ways that matter.
The Word Itself
Using the R-word as a synonym for “stupid” is not edgy. It’s not revolutionary. It’s a slur rooted in the systemic devaluation of people with intellectual and developmental disabilities—people who have been institutionalized, sterilized, and denied basic civil rights in this country within living memory. When a billionaire CEO drops it at a summit to mock people he considers insufficiently strategic, he is telling you exactly how much he values the lives and dignity of people with disabilities: not at all.
Karp added a kicker—that you might be “particularly” deserving of the slur “because you have a 160 IQ.” This is the logic of eugenics dressed up as boardroom banter. It treats intelligence as a hierarchy of human worth, with people who have intellectual disabilities at the bottom. We’ve seen where that logic leads. The Disability Community has buried the results.
The Threat Behind the Slur
But here’s what we can’t afford to miss while we’re rightly angry about the language: Karp was making a case for total corporate compliance with government demands for AI technology. His argument, stripped of the slurs and the rambling, is simple—if tech companies don’t give the government whatever it wants, the government will simply take their technology by force. And therefore, the smart play is to cooperate.
This matters enormously to Disabled people. Palantir is not an abstract tech company. It is the company that has built databases to track protesters. It is the company powering ICE’s immigration enforcement operations. At CDR, we have been documenting the impact of these enforcement operations on people with disabilities—people who depend on Medicaid-funded personal assistance services, people in congregate settings, people whose access to medication and medical equipment is disrupted when they or their family members are detained. When Karp says tech companies should give the government carte blanche access to AI tools, he is talking about expanding the very surveillance infrastructure that is already harming our community.
The Context We Can’t Ignore
Karp’s comments came in direct response to the standoff between the Department of Defense and Anthropic, the AI company. The Pentagon demanded unrestricted access to Anthropic’s AI model, including uses that would enable mass domestic surveillance and fully autonomous weapons systems operating without human oversight. Anthropic refused to cross those red lines. Defense Secretary Hegseth threatened to invoke the Defense Production Act to force compliance.
For the disability community, this is not an abstract debate about corporate governance. Autonomous weapons systems without human oversight mean systems that cannot account for the presence of Disabled people who may not be able to evacuate, who may not respond to commands in expected ways, who may be invisible to the sensors and algorithms that determine targeting. Mass domestic surveillance means the expansion of systems already being used to track, detain, and deport people—including Disabled people and the family members and workers who support them.
The criminalization of disability and targeted surveillance of Disabled people are already happening at the direction of the federal government. Under the guise of combating fraud and abuse, bipartisan legislation has required the use of Electronic Visit Verification (EVV), which tracks and documents the location of Disabled people receiving personal assistance services. Some of these systems essentially put Disabled people under “house arrest.”
The Slur and the Surveillance Are the Same Problem
This is the connection I want to make clear. A man who uses a disability slur without hesitation is a man who does not see Disabled people as fully human stakeholders in the systems he is building. When that same man argues that the government should have unrestricted access to the most powerful surveillance and weapons technology ever created, he is building a world in which the people he just dehumanized have no protections.
The casual bigotry and the policy position are not separate issues. They are the same worldview. It is a worldview in which some people are worth protecting, and others are collateral. The Disability Community has always been sorted into the collateral column by people like Karp, and the technology his company builds makes that sorting faster, more efficient, and harder to challenge.
What CDR Is Calling For
We need the disability community to pay close attention to what is happening at the intersection of AI, military power, and civil liberties. This is a disability rights and disability justice issue. It is not enough to object to the slur—though we absolutely must, loudly and clearly. We must also object to the policy framework that Karp is trying to normalize: that the government has the right to commandeer any technology it wants, and that companies should preemptively surrender their ethical commitments to avoid that outcome.
CDR stands with those who insist on red lines. We stand with the principle that AI must not be used for mass surveillance of civilians. We stand with the principle that weapons systems must have human oversight. We stand with the principle that the rights of Disabled people—to privacy, to safety, to community-based living free from government intrusion—cannot be bargained away by tech executives looking to stay in the government’s good graces.
And we stand with the basic principle that you don’t get to use slurs against our community and then ask us to trust you with the tools that surveil us.
Nothing about us, without us. Especially not this.