
Family members of the victims of the Fort Hood shooting pause at a memorial for the fallen on Nov. 10, 2009. In a new study, Stanford political scientist Amy Zegart examines lessons learned from the terrorist attack at the U.S. military post. (Image credit: AP Photo/Donna McWilliam)

U.S. national security faces rising challenges from insider threats and organizational rigidity, a Stanford professor says.

Amy Zegart, co-director of the Center for International Security and Cooperation at Stanford and a senior fellow at the Hoover Institution, wrote in a new study that in the past five years, seemingly trustworthy U.S. military and intelligence insiders have been responsible for a number of national security incidents, including the WikiLeaks publications and the 2009 attack at Fort Hood in Texas that killed 13 and injured more than 30.

She defines “insider threats” as people who use their authorized access to do harm to the security of the United States. They could range from mentally ill people to “coldly calculating officials” who betray critical national security secrets.

In her research, which relies upon declassified investigations by the U.S. military, FBI and Congress, Zegart analyzes the Fort Hood attack and one facet of the insider threat universe – Islamist terrorists.

In this case, a self-radicalized Army psychiatrist named Nidal Hasan walked into a Fort Hood facility in 2009 and fired 200 rounds, killing 13 people and wounding dozens of others. The shooting spree remains the worst terrorist attack on American soil since 9/11 and the worst mass murder at a military site in American history, she added.

Insights and lessons learned

Zegart’s study of insider and surprise attacks as well as academic research into the theory of organizations led her to some key insights about why the Army failed to prevent Hasan’s attack when clues were clear:

• Routines can create hidden hazards. People in bureaucracies tend to continue doing things the same old way, even when they should not, Zegart said, and not just in America. In the Cuban missile crisis of 1962, for example, U.S. spy planes were able to spot Soviet missile installations in Cuba because the Soviets had built them exactly like they always had in the Soviet Union – without camouflage.

In the Fort Hood case, she said, bureaucratic procedures kept red flags about Hasan in different places, making them harder to detect.

• Career incentives and organizational cultures often backfire. As Zegart wrote, several researchers found that “misaligned incentives and cultures” played major roles in undermining safety before the Challenger space shuttle disaster.

Zegart’s earlier research on 9/11 found the same dynamic played a role in the FBI’s manhunt for two 9/11 hijackers just 19 days before their attack. Because the FBI’s culture prized convicting criminals after the fact rather than preventing disasters beforehand, the search for two would-be terrorists received the lowest priority and was handled by one of the least experienced agents in the New York office.

• Organizations matter more than most people think. Robust structures, processes and cultures that were effective in earlier periods for other tasks proved maladaptive after 9/11.

In the case of the Fort Hood attack, the evidence suggests that government investigations, which focused on individual errors and political correctness (a reluctance to discipline or investigate a Muslim American in the military), identified only some of the root causes, missing key organizational failures.

Hasan slipped through the cracks not only because people made mistakes or were prone to political correctness, but also because defense organizations “worked in their usual ways,” according to Zegart.

Adapting to a new threat

In terms of organizational weaknesses, Hasan’s Fort Hood attack signaled a new challenge for the U.S. military: rethinking what “force protection” truly means, Zegart said. Before 9/11, force protection meant physically protecting or hardening potential targets against outside attack. Now, force protection has evolved to acknowledge that threats can come from within the Defense Department and from Americans, she added.

“For half a century, the department’s structure, systems, policies and culture had been oriented to think about protecting forces from the outside, not the inside,” Zegart wrote.

In the case of Hasan, the Defense Department failed in three different ways to identify him as a threat: through the disciplinary system, the performance evaluation system and the counter-terrorism investigatory system run jointly with the FBI through Joint Terrorism Task Forces.

“Organizational factors played a significant role in explaining why the Pentagon could not stop Nidal Hasan in time. Despite 9/11 and a rising number of homegrown Jihadi terrorist attacks, the Defense Department struggled to adapt to insider terrorist threats,” Zegart wrote.

Difficult to change

Another problem was that the Pentagon faced substantial manpower shortages in the medical corps – especially among psychiatrists. So the Defense Department responded to incentives and promoted Hasan, despite his increasingly poor performance and erratic behavior.

In addition, Zegart found that the Defense Department official who investigated Hasan before the attack saw nothing amiss because he was the wrong person for the job: trained to ferret out waste, fraud and abuse rather than counterterrorism, he did not know how to look for signs of radicalization or counterintelligence risk.

“In sum, the Pentagon’s force protection, discipline, promotion and counter-terrorism investigatory systems all missed this insider threat because they were designed for other purposes in earlier times, and deep-seated organizational incentives and cultures made it difficult for officials to change what they normally did,” she wrote.

Zegart acknowledges the difficulties of learning lessons from tragedies like 9/11, the NASA space shuttle accidents and the 2009 Fort Hood shooting.

“People and organizations often remember what they should forget and forget what they should remember,” she said, adding that policymakers tend to attribute failure to people and policies. While seemingly hidden at times, the organizational roots of disaster are much more important than many think, she added.

Media Contacts

Amy Zegart, Center for International Security and Cooperation: (650) 725-4202
Clifton B. Parker, Stanford News Service: (650) 725-0224