“I’ve seen this within myself, within the organization, where we constantly face pressures to set aside what matters most, and throughout broader society, too.”

He wrote that humanity is approaching a threshold where “our wisdom must grow in equal measure to our capacity to affect the world, lest we face the consequences.” He wanted to contribute in a way that felt fully in his integrity and to devote himself to what he called “the practice of courageous speech.”

A man who built defenses against bioterrorism concluded that the most important thing he could do next was learn to speak with honesty and courage. That is a major signal about what is happening behind closed doors in AI research and development.

Sharma was not alone. Numerous safety researchers have walked off AI projects from multiple companies. These departures may be the only signals we, the public, have, because almost everything else about AI development is happening beyond public view. The internal debates, the safety trade-offs, the negotiations over what this technology will and will not be permitted to do—none of it is being shared with the people whose lives it will most profoundly shape. We are not part of this conversation. We are being presented with outcomes and told to adapt.

John Adams wrote that the Constitution was made only for a moral and religious people, and is wholly inadequate for any other. George Washington warned that liberty cannot survive the loss of shared moral principles. The founders studied the collapse of republics throughout history and arrived at the same conclusion: The machinery of freedom requires a moral people to sustain it. Laws and institutions are not enough on their own. They depend on citizens and leaders who hold themselves to something that exists before the law and above it.

That is the thread of human society, and no AI system holds it. If people allow AI to replace the question of right and wrong with the measure of what is legal and permitted, the machine will carry that measure forward at a scale and speed that no previous generation has had to reckon with.

Sharma ended his resignation letter with the words: “You don’t ever let go of the thread.”

We are at a crossroads not unlike the one the atomic scientists faced.

Sharma’s resignation was a signal. The wave of departures before and after it is a signal. The reported tensions between AI companies and government over where moral limits should be drawn are signals as well. Together, they point at something the public has not yet been fully invited to consider: that the most important questions about this technology are being worked out without us, and that the thread of morality, which has always required people to hold it by choice, needs to be part of that conversation.

Views expressed in this article are opinions of the author and do not necessarily reflect the views of The Epoch Times.