Tuesday, January 31, 2012

BlackBerrys, the Cloud, and Government Procurement

An article in the Technology section of the January 29th New York Times, “The BlackBerry, Trying to Avoid the Hall of Fallen Giants,” suggested that the formerly ubiquitous BlackBerry might go the way of the Sony Walkman, pagers, Palm Pilot, Polaroid Instant Cameras, and the Atari game console. BlackBerry producer Research in Motion (RIM) once owned more than half of the American smartphone market — it is now down to 10 percent and still dropping.

Just to put things in full context, the article, whose publication followed a meeting with the new RIM CEO (formerly its COO), also suggested that RIM could make "a triumphant comeback after a near-death experience" as Apple did with its iMac. Or, it could continue its “gradual decline and diminution as rivals like Apple, Google and Microsoft devour most of the market” while keeping “a small but dedicated following of corporate and government customers who want its proprietary messaging and security features.”

Regarding the latter scenario, I’m sure that a significant slice of that dedicated following would continue to be found at the Pentagon, whose residents have their BlackBerrys surgically attached so that they are never more than a fingertip away. The fact that they could then consider themselves part of a small group of elite users would even lend a certain cachet to their continued use.

Be that as it may, the possible impending demise of the BlackBerry underscores the astonishing speed of technology development — and of the social, cultural, and business patterns that development spawns. It was only three years ago, after all, that journalists and pundits devoted much verbiage to the question of whether the new president would be forced to give up his “CrackBerry” — or whether some way could be found to make it secure enough for the POTUS (President of the United States) to keep it. (It was and he did. On the other hand, only ten people were given authorization to ping his BlackBerry — and they have to be constantly aware that their messages might someday be subject to the Presidential Records Act.) If he is reelected, President Obama may yet have to give up his BlackBerry — not for security reasons but because the company that supports it may have gone out of business.

So, in just three years we have gone from worrying whether the president will keep his BlackBerry to worrying about whether there will be a BlackBerry for him to keep. Yet, how can a government procurement system that often takes years to award contracts for technology acquisitions — even under existing contract vehicles — keep up with the speed of such change?

With that thought in mind, consider an article published in Federal Computer Week on January 27, entitled, “Is government procurement ready for the cloud?” Cloud computing has become a mantra for federal agencies, potentially offering unprecedented speed and agility, enabling agencies to "simply dial IT services up or down as needed to quickly support new mission plans or workload changes. As a bonus, agencies pay only for what they use instead of bankrolling the often idle, over-provisioned computing capacity common in most data centers."

However, as author Alan Joch points out, "IT procurement practices and contracting vehicles were designed to help managers provision hardware and software, not on-demand services." Although Joch presents the cases for and against — that the current procurement system is seriously inadequate for the demands of acquisition in the cloud, versus that only minor changes are needed to make existing contract vehicles fully serviceable in the cloud market — those design inadequacies are simply too significant to ignore or downplay.

The cloud may be a panacea to some, but in order for the government to get its money’s worth, it’s going to have to make the acquisition curve adhere a little more closely to the technology curve. Otherwise it’s going to be paying too much for cloud services that may already have become obsolete.

Thursday, March 18, 2010

Software security ain't just for geeks

A lot of security issues would be mitigated if practices for ensuring the security of software were applied on a continuous basis at the design and coding stages, rather than waiting for hackers to find and exploit the weaknesses and bugs. Unfortunately, it almost always takes a costly and/or embarrassing hack or exploitation, like some of the egregious examples cited by Mr. Kabay in his article below, before organizations even consider doing something about it. Even then, the tendency is usually to look the other way and "hope" it won't happen again, or to reach for a quick technical "fix," rather than deal with the underlying political, cultural, and change management issues.


Pushing for software quality assurance

By M.E. Kabay

Network World
March 17, 2010

In my experience, some programmers and program development managers resist investing time in software quality assurance (SQA). In a recent research article on "Resistance Factors in the Implementation of Software Process Improvement Project in Malaysia," from the Journal of Computer Science 4(3):211-219 (2008), the authors summarized extensive published research on why people resist SQA. Experts have found that there are several categories of stumbling blocks to integrating SQA into the software development process (Table 1, p 213):

• Human: failure to gain top-level, thoroughgoing support for process improvement.
• Political: perceptions of loss of power.
• Cultural: organizational resistance to changes in long-established patterns.
• Goals: unclear, undefined, unmeasured goals leave people confused and uncooperative.
• Change Management: SQA must be integrated with and support the mission-critical goals of the organization.

An essential step in implementing new SQA processes – and continuous process improvement (CPI) in general – in any organization thus involves convincing all involved stakeholders (employees, managers, shareholders and even customers) that the project is worth the effort. I have some ideas from teaching that may be helpful in this task.

One of the key steps in teaching is to show students why a subject is worth learning. My practice, developed through four decades of teaching, is to start every lecture with an informal overview of how a topic relates to the real world. Thus in discussing SQA in a management of information assurance course or a systems engineering course, showing students some cases where SQA was lacking is an entertaining way of bringing the message home vividly.

The Forum on Risks to the Public in Computers and Related Systems ("The Risks Forum") of the Association for Computing Machinery (ACM), ably run for more than 20 years by Peter G. Neumann, is a goldmine of reports on the consequences – some of them hilarious – of poor software design and failures of SQA. My now-slightly-elderly supplementary lecture from the IS342 Management of IA course at Norwich University has lots of slides you can use freely in your own presentation on SQA failures. Here are some of the stories that usually get my students' attention:

• A 3-year-old gets an IRS refund for $219,495.
• Microsoft publishes an unverified Spanish thesaurus which includes insulting slurs, resulting in a public relations debacle.
• The ENT Federal Credit Union ignores months of customer complaints about their automated teller machines, allowing the defective programming to count only the first withdrawal by a customer – and resulting in $1.2 million in losses.
• A dentist receives 16,000 identical copies of a tax form.
• Flintstones cartoon viewers in Springfield, Missouri are unexpectedly switched to watching the Playboy Channel.
• A vagrant applies to Sandoz for a $2 refund on a used bottle of Ex-Lax but receives a check in the amount of his ZIP code – $98,002 – and promptly disappears after cashing the check.
• A programming error in the First National Bank of Chicago system adds $924 million (yep, million) to each of more than 800 customer accounts for a total accounting error of nearly $764 billion (yep, billion).
• Smith Barney adds $19 million to each of 525,000 accounts (for only a few minutes) for the largest accounting error in history: $10 trillion.
• Los Angeles County underpays its employee pension fund for 20 years due to a programming error, resulting in $1.2 billion in unexpected liability.
• And my favorite demonstration that nobody can do mental arithmetic anymore – a secretary accuses a professor of creating 4,294,967,026 copies in two weeks (3,551 copies/second, continuously, 24 hours a day) because the photocopier says so – and removes his photocopying privileges! (A quick check of that arithmetic appears just below.)
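For readers who want to check the figures in that last anecdote, here is a minimal arithmetic sketch in Python. The copy count and time span come from the story above; the observation that the total sits just shy of 2^32 is my own assumption about a likely 32-bit counter problem, not something the original report confirms.

# Sanity-checking the photocopier anecdote: 4,294,967,026 copies in two weeks.
copies = 4_294_967_026
seconds_in_two_weeks = 14 * 24 * 60 * 60       # 1,209,600 seconds
rate = copies / seconds_in_two_weeks           # roughly 3,551 copies per second
print(f"{rate:,.0f} copies/second")            # no physical copier comes close

# The total is also just 270 short of 2**32 = 4,294,967,296, which hints at a
# wrapped or corrupted 32-bit counter rather than real usage (an assumption,
# not a fact from the original story).
print(2**32 - copies)                          # 270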

Next time, I'll present an interesting study of the value of automated SQA testing tools.


M.E. Kabay, PhD, CISSP-ISSMP, specializes in security and operations management consulting services and teaching. He is Chief Technical Officer of Adaptive Cyber Security Instruments, Inc. and Associate Professor of Information Assurance in the School of Business and Management at Norwich University. Visit his website for white papers and course materials.

Friday, January 29, 2010

Despite Fears of a Post-9/11 Drop, Most Science, Engineering Post-Grads Have Stayed

The situation apparently is not as bad as I had thought, if the data and conclusions in the Wall Street Journal article I have linked to below are correct. The U.S. competitive position is not as at-risk as I had concluded from so many articles in the popular and industry press, as well as from anecdotal encounters. I had thought that, especially since 9/11, the United States had made it much harder for foreign students with degrees in science and engineering to stay here, with very detrimental consequences for U.S. industry, competitiveness, and the future.

This is also good news even for those of you who require your hires to obtain security clearances in order to work in the government or for government contractors. If the total pool of science and engineering PhDs is larger, the competition for finding and hiring qualified graduates for all positions is not as intense, relatively speaking. At least some of the positions that don’t require clearances will be filled by people who can’t or won’t get clearances, instead of by people who are eligible to obtain them. That isn’t to say that clearance-eligible graduates would necessarily be interested in jobs that do require clearances, but that’s a different issue.

Nevertheless, the United States doesn’t graduate enough students in the sciences and engineering. (Full disclosure and mea culpa: I started out back in my undergraduate days as an engineering major — but didn’t finish. Frankly, I would not have made a good engineer. But, as many of you know, I have still retained a deep passion for science and engineering.) Yes, it’s hard — or at least harder than many other majors. But the way science is taught in K-12 leaves a lot to be desired. And when they graduate, science and engineering majors aren’t paid enough. All in all, not the best incentives.


U.S. Keeps Foreign Ph.D.s
Despite Fears of a Post-9/11 Drop, Most Science, Engineering Post-Grads Have Stayed.


By David Wessel

The Wall Street Journal
January 26, 2010

Most foreigners who came to the U.S. to earn doctorate degrees in science and engineering stayed on after graduation—at least until the recession began—refuting predictions that post-9/11 restrictions on immigrants or expanding opportunities in China and India would send more of them home.

Newly released data revealed that 62% of foreigners holding temporary visas who earned Ph.D.s in science and engineering at U.S. universities in 2002 were still in the U.S. in 2007, the latest year for which figures are available. Of those who graduated in 1997, 60% were still in the U.S. in 2007, according to the data compiled by the U.S. Energy Department's Oak Ridge Institute for Science and Education for the National Science Foundation.

Foreigners account for about 40% of all science and engineering Ph.D. holders working in the U.S., and ...

Thursday, January 28, 2010

A modest proposal (and my comments)

I want to emphasize that the following article is supposed to be satirical (the author emphasizes that point several times during the article), though many of the points he makes are not satirical.

I also want to emphasize that almost all his criticism is directed at the CIA, although he takes a swing or two at the intelligence community as a whole. Note also that he is suggesting that the U.S. government should outsource the CIA, not the entire intelligence community.

Actually, he's a bit late in suggesting outsourcing of the CIA or any other intelligence agencies. Approximately three-quarters of the 100,000-plus people who work in the intelligence community are contractors. In my humble opinion, we've gone too far in terms of outsourcing intelligence responsibilities, functions, and activities that are and should be considered inherently governmental. But that's another story.

On the other hand, the idea that investigative reporters could do a better job than many CIA analysts and agents does have more than a little merit. On the other other hand, those former reporters will probably find that they haven't escaped dysfunctional management and bureaucratic politics just because they've left the newsroom. They may have just leaped from the frying pan into the fire.

But remember — the following article is a satire...


Outsource the CIA to Downsized Reporters

By Ron Rosenbaum

Slate.com
January 22, 2010

It's rare that one is able to solve two profoundly troubling societal problems with one quick fix, but I feel I've done it! Well, in a metaphorical, Swiftian, satirical "Modest Proposal" way. I suspect that most Slate readers will be aware that Jonathan Swift's 18th-century "Modest Proposal" to solve the Irish famine by encouraging starving parents to eat their children was meant as satire, right? Because when I ran my own modest proposal by a journalist friend, she took it a little too seriously, and heatedly informed me, "That's the worst idea I ever heard!" That's sort of the point! When things are bad, the only way to make the situation crystal-clear is to show how difficult it would be to come up with an idea that is ludicrously worse.

On the other hand, as they say in cheesy movies, "Sounds crazy, but it just might work!"

So: My modest proposal to solve America's "intelligence" failures is to fire the entire CIA and our other many tragically inept intelligence agencies and outsource all intelligence operations to investigative reporters downsized by the collapse of the newspaper business. Thereby improving our "intelligence capability" (it can't possibly get worse!) and giving a paycheck to some worthy and skilled investigative types — yes, some sketchy, crazed, paranoid (but in a colorful, obsessive, yet often highly effective way) reporters who once made the journalism profession proud, exciting, and useful, not boring stenography for the power elites.

How bad are things in U.S. intelligence? I refer you to a Jan. 20 Reuters report on the Congressional investigation into the failure to "connect the dots" on the Christmas bomber: the guy who — as just about everybody in the world except U.S. intelligence knew — was trying to blow up a plane. Why?

A senior counterterrorism official said on Wednesday his agency lacks "Google-like" search capability that could have identified the suspect in the attempted Christmas Day airline bombing.

The National Counterterrorism Center, the agency charged with reviewing disparate data to protect against attacks, does not have a computer search engine that could have checked for various spellings of the alleged bomber's name and his birthplace in Nigeria, the center's chief told a Senate hearing on security reform. "We do not have that exact capacity," said Michael Leiter, adding that the agency is working on solutions that could be in place within weeks.

Don't you love "that exact capacity"? Sort of trying to say they almost have the capacity but not ... exactly. Remember the Hertz commercial in which a junior exec gets some loser car and has to say that it's "not exactly" what he could have gotten from Hertz? I see Michael Leiter pulling up in a DayGlo-painted clown car, his crack team of Google-like computer experts in full clown makeup emerging with their Commodore 64s at the ready saying, "We almost identified the would-be terror bomber, but ... not exactly."

That's the Michael Leiter, by the way, who is our Supreme Chief of Connecting the Dots in our gazillionth reorganization of U.S. intelligence. Yes, that would be the same Michael Leiter who decided after the Christmas bombing attempt to proceed with his previously booked ski vacation. Hey, the "Google-like" capacity wouldn't be available for weeks, so why not spend some cozy time at a ski resort with all the fixings, maybe even some after-dinner Pong or Donkey Kong.

Why this guy hasn't been summarily fired, not just for the vacation (hope the trails were fluffy!) but for the lack of a "Google-like" search capacity for U.S. intelligence is baffling to anyone with any intelligence. And I think it's sufficient evidence that my modest proposal should be taken more seriously. After thinking about my proposal for a few days during my ski vacation, I've come up with some bells and whistles for it.

I wouldn't stop with firing our entire intelligence team (leave behind the slide rules, though, will you guys, before you turn out the lights — or blow out the candles?) and outsourcing their jobs to downsized print journalists. I'd include the unfairly down-played and down-market wings of media that get no respect, like The Enquirer. (Who knew Hugo Chavez had a love child?) I'd enlist the legion of bloggers and even celebrity-gossip Web sites to join in my new U.S. intelligence team. Do you have any doubt that it would take TMZ less than 48 hours to come up with an (alleged) Mullah Omar sex tape? Or for Gawker Stalker to spot OBL at a Kandahar bazaar? Or for Page Six to get the details on the "gymnastic" skills of Vladimir Putin's 20-something hottie and link her somehow to Tila Tequila?

I know: J-school ethicists wouldn't approve of this, and I agree with Glenn Greenwald's argument that journalists shouldn't get mixed up in government business, and I've practiced what I preached. (I once turned down a CIA offer to deliver a lecture about Hitler and the nature of evil at their Langley headquarters.) But, hey, do I need to reiterate that this is a "modest proposal"? Satire, remember.

And I wouldn't limit my recruitment to the new CIA (renamed the Creative Intelligence Agency) to just the downsized. Why not unleash some of the still-employed fearsome legends of investigative reporting, like Sy Hersh or Jane Mayer, on our nation's foes? They invariably have ways of outsmarting the CIA's rudimentary secrecy protections, publishing leaks from inside sources. The warrantless wiretaps, the "black flight" illegal renditions of suspects to torture-friendly countries, the "enhanced interrogation" torture program itself, you name it — if the CIA's got a secret, the New York Times and the Washington Post have it a day later.

These reporters have managed to infiltrate the CIA far better than the CIA has infiltrated any terrorist organization. The CIA has compiled a history of failure so replete with lethal blunders that even when a self-proclaimed mole within al-Qaida told them he could get al-Qaida's No. 2 man, al-Zawahiri, they credulously and, alas, tragically got the go-ahead to trust him and ended up losing seven lives because they were so eager to end their relentless run of defeats.

Newsmen have taken such a beating lately from the likes of corporate consultant-racket profiteers such as Jeff Jarvis, who get paid handsomely to tell the executive drones who hire them as consultants that the collapse of the newspaper industry wasn't their fault. No, it was somehow the fault of the reporters they had to fire to maintain their perks, so that these execs don't have to carry anything on their conscience about it. Just keep paying Jeff to tell them fairy tales about the future, and someday they'll find an online business model that really, really works (for Jeff, anyway).

Indeed, my modest proposal might be a morale booster to show the world just how resourceful and skillful and "creative" U.S. reporters can be. I'm not suggesting a Pulitzer Prize for spying. Maybe a Congressional Medal of Honor though. (Kidding!)

Let's face it: The only good secrets our intelligence agencies have are quickly scooped up and published by ace investigative reporters. In fact, any group of people randomly selected from the phone book (or Facebook) could have compiled a better record than our intelligence agencies over the past half-century. They've made us a laughingstock.

Do I need to recount the dismal, abysmal, horrible, very bad record of U.S. intelligence agencies over the past half-century? They may as well have been run by our worst enemies. (Indeed, some paranoia-inclined analysts believe they were run by double agents and moles, but my inclination is to follow the maxim: "Never believe in conspiracies when sheer incompetence provides an explanation.")

You want to see incompetence? Look at the record (or read Tim Weiner's encyclopedic compilation of CIA failures, Legacy of Ashes, for a start). After their hall-of-fame-worthy bungling of the Bay of Pigs, CIA incompetents almost got us into a global nuclear holocaust over the Cuban Missile Crisis, when they assured the Pentagon in October 1962 that the Russians had not yet armed their nukes in Cuba. This turned out to be totally false: The nukes were assembled, armed, and aimed, and Khrushchev had given operational control to Castro already, so that the invasion the Joint Chiefs almost talked JFK into would have almost certainly been an instant Armageddon. Heckuva job, Langley!

Then there's the endemic problem with "connecting the dots" — which dates back to the veritable dot matrix that Lee Harvey Oswald presented, one that the CIA ignored (or siloed, as the fashionable new management jargon has it). After all, consider Oswald: a guy who defects to the Soviet Union proclaiming his belief in communism and hatred of America, then redefects to the United States, where he gets deeply involved in violent post-Bay of Pigs CIA-financed intrigue, proceeds to threaten an FBI man who tries to question him, buys a rifle, and happens to work within gun scope range of the presidential motorcade. Nothing much here for the CIA to be concerned about or communicate to the Secret Service watch list, right?

Did all of this have something to do with the CIA being run by elite, WASP, Skull and Bones types who were pitifully easy marks for the darker-skinned people they were trying to control? Yes. But there was also a kind of Higher Stupidity at the CIA that masked itself as "complexity."

You can see this in the whole "molehunt" madness initiated by legendary (for paranoid delusion) James Jesus Angleton, the chief of the CIA's counterintelligence division for two misbegotten decades, who was made a fool of by Kim Philby, the British KGB operative who was perhaps the most obvious mole in history but who appealed to Angleton's Anglophilia, a pathology of most of the upper-class twits who ran the CIA from the beginning. After Philby made a fool of him, Angleton went mad, turned the CIA into a place where the paranoid inmates ran the asylum in their insane hunt for a nonexistent mole, a foolish crusade that utterly paralyzed the agency's chief mission: spying on the Soviets. And so at the height of the Cold War, the CIA had no intelligence it trusted about the Soviets.

Then, when it turned out there were no CIA moles during Angleton's watch, his hypervigilance discredited ordinary, rational vigilance and allowed a blundering creep like Aldrich Ames, a real mole, to steal every secret the CIA had for Soviet cash and cause the death of an untold number of our operatives in Moscow.

And then there was the Team A/Team B fiasco, another profoundly dangerous screw-up. It wasn't a bad idea in theory. George H.W. Bush, head of the CIA under President Ford, was persuaded that there was doubt about CIA estimates of Soviet missile progress, doubt raised by perennial "missile gap" alarmists. He appointed a team of outside "experts" to investigate and offer an alternative analysis, beyond the agency's.

They became known as "Team B" to the CIA's in-house "Team A," and they produced what turned out to be a totally inaccurate overestimate of Soviet capabilities and intentions. (See Cold War historian Pavel Podvig's demolition of Team B's conclusions in the light of history.) Nonetheless, in a kind of forerunner to the WMD fiasco, Team B's paranoid analysis became the basis for the $1 trillion arms buildup during the Reagan administration to match the Soviets' illusory gains. Paradoxically, Team B's overestimation and the insane overspending that resulted may have been a key factor in causing the collapse of the Soviet Union. The CIA's stunning record of ineptness led to Team B's epoch-making mistake. As Dylan wrote, "There's no success like failure." Though, he added, "Failure's no success at all."

The CIA's post-Cold War failures are all too well-known from the WMD fiasco. (CIA head George Tenet famously told the president it was "a slam dunk" they were there. Maybe by "slam dunk" he was thinking of water-boarding and other supereffective "enhanced interrogation" methods that were shamefully adopted by the intelligence community.) And, of course, the entire intelligence community had a hand in producing the now-discredited 2007 National Intelligence Estimate on the Iranian nuclear weapons program, which left the credulous media with the impression: nuthin' goin' on.

A record like that, an unprecedented, massive, relentless record of failure deserves only one response: accountability. We've got to fire them all. At the very least, this action would say that failure won't be the new normal forever. But who to replace them with? And who to handle the transition?

My modest proposal may have been engendered by rereading something I wrote a while ago in Harper's ("The Shadow of the Mole," October 1983, subscription only) about the whole Angleton mole madness, which mentioned a little-noticed Washington conference on "intelligence," sponsored by a shadowy group called "The Consortium for the Study of Intelligence." The conference produced a volume, Intelligence Requirements for the 1980s. In it was a paper by veteran espionage journalist (and Slate contributor) Edward Jay Epstein, who I think should play a key role in managing the transition after we fire the CIA en masse.

Epstein's essay had the forbidding academic-sounding title "Incorporating Analysis of Foreign Governments' Deception Into the U.S. Analytical System." But buried within it was an important distinction between "Type A Deception," which involved manipulation of foreign governments' perception of our overt behavior, and "Type B Deception," which "purports to emanate from the highest levels of decision making" — and might involve journalists staging deception — giving the foe a false impression of our secret, esoteric strategy. I'm not doing its complexity justice.

But there was a key passage in the essay that startled me because it broke out of the gray, bureaucratic prose of most of the volume to raise an imaginative, even cinematic idea: a "Type B Deception" team. "It might conceivably employ functional paranoids, confidence men, magicians, film scenarists or whomever seemed appropriate to simulate whatever deception plots seemed plausible."

"Functional paranoids?" "Confidence men." He might as well be describing the mind and character of our better investigative reporters! I'm not strictly an investigative reporter myself, though I've done a lot of it, and I know a lot of them and I think I know the mind-set. They'd be a perfect fit for replacing our discredited intelligence community.

My first step would be to contact the IRE (the Investigative Reporters and Editors organization) and see whether we can scare up some volunteers. Then I'd ask Ed to be my (wartime) consigliere. We will save America from its external enemies! We will end abusive practices and endless bungles! We will put the dangerous, worse-than-useless CIA out to pasture.

That's my modest proposal.

Ron Rosenbaum is the author of The Shakespeare Wars and Explaining Hitler.

Sunday, January 3, 2010

Cyberdefenders Protect Navy Networks

Navy Cyber Defense Operations Command Gets On the Offensive to Guard Information Operations

By Mark Kagan

Military Information Technology
December 2009

The July 2009 announcement by the Chief of Naval Operations that a new Fleet Cyber Command/Tenth Fleet (FLTCYBERCOM) would be stood up by the end of the year signaled the profound importance and priority that the Navy is giving to the cyberwarfare domain. FLTCYBERCOM, which will also become the Navy component of the new U.S. Cyber Command, will bring together under one command the Navy’s information technology, intelligence and communications operations and will eventually comprise 45,000 personnel.

A key component of FLTCYBERCOM will be the Navy Cyber Defense Operations Command (NCDOC), which is responsible for coordinating, monitoring and overseeing the defense of the Navy’s computer networks and systems and their 700,000 users worldwide. NCDOC’s areas of responsibility encompass the Navy’s centrally managed NIPRNet and SIPRNet enclaves, which consist of the Navy Marine Corps Intranet, Integrated Shipboard Network System and OCONUS Navy Enterprise Network. These networks total approximately 350,000 seats.

NCDOC’s areas of responsibility also include legacy and “excepted” networks. Legacy networks are those networks that have not migrated into a centrally managed enclave or have not been designated as an excepted network. Excepted networks are networks that have been authorized by the Cyber Asset Reduction and Security Task Force to operate independently of a centrally managed network. Legacy and excepted networks comprise approximately 190,000 seats.

NCDOC executes computer network defense (CND) across the Navy through a group of operations centers that are aligned to the centrally managed enclaves. Command, control and coordination of the defense of legacy and excepted networks vary because of the unique nature of these networks. NCDOC also maintains close liaison with the Naval Criminal Investigative Service, which is the Navy’s cybercrime prosecution authority.

Based in Norfolk, Va., NCDOC reports to the Naval Network Warfare Command and is operationally aligned to the Joint Task Force-Global Network Operations, the lead Department of Defense organization designated to identify and mitigate threats to DoD information networks and to direct the defense of the Global Information Grid (GIG).

COMPUTER NETWORK DEFENSE

As the Navy’s designated computer network defense service provider, NCDOC provides CND services to Navy networks worldwide and executes all computer incident response team responsibilities. CND services include actions taken to protect, monitor, analyze, detect and defensively respond to unauthorized activities within DoD information systems and computer networks. Unauthorized activities may include disruption, denial, degradation, destruction, exploitation or access to computer networks, information systems or their contents, or theft of information.

“We detect and act upon all security incidents, and anyone else in the Navy doing any kind of security functions is required to report any incidents to us,” said Jim Granger, director of capabilities and readiness at NCDOC. “Whether we’re detecting them or they’re detecting them, it all comes to us.”

By comparison to computer network defense, information assurance covers measures that protect and defend information and information systems by ensuring their availability, integrity, authentication, confidentiality and nonrepudiation. This includes providing for restoration of information systems by incorporating protection, detection and reaction capabilities.

The Navy’s global CND strategy is fully integrated with DoD’s overarching defense-in-depth strategy, which is designed to ensure continued operation of the GIG, even in a degraded state. It covers people, technology and operations and is based on both a strong IA posture and CND unity of command.

The core of the Navy’s global CND strategy is its centrally managed sensors, which are operationally controlled by NCDOC and which aggregate the incoming data for attack sensing and warning.

“Our systems provide a tremendous capacity to process and export disparate data formats and present a global view of network activity to enable holistic fusion analysis, trending and normalization of network activity,” said Granger. “We fight in a terrain with no boundaries and with highly adaptive adversaries. That requires a global perspective and deliberate processes.”

Deliberate processes are important because they allow for consistent training and repeatable, standardized operations across multiple watch sections operating on a round-the-clock basis.

NCDOC was established in 2006, after performing similar functions as the Navy Computer Incident Response Team (NAVCIRT) since 1996, making it one of the oldest cybersecurity organizations in the federal government. It currently has about 200 personnel and is expected to grow significantly within the next four to five years if funding is approved, reflecting both the projected growth in cyberthreats and attacks and the importance that the Navy places upon cyberdefense.

“I think that the level of attention that the networks — we don’t have a single network, we have multiple networks — [is] garnering is what’s going to help us attain the realization across the Navy and joint organizations that cyber is another warfare area that has to be considered and treated like the other warfare areas,” said Captain Stephanie Keck, NCDOC’s commanding officer.

Keck assumed command of NCDOC this past summer after serving as the Multi-National Force-Iraq Information Operations Chief in Baghdad. She has spent much of her Navy career in information operations doing offensive cyberwarfare and exploitations. Like many users, she admits that she didn’t pay much attention to what was going on in the cyberdefense arena.

“Since taking command of NCDOC, I’ve learned quite a bit about how difficult it is to defend a multiplicity of networks when users typically aren’t paying attention to the things that they ought to be doing or not doing,” Keck said. “I’ve also learned it takes a holistic approach to defend the network and not just technical solutions.”

Regarding awareness across the Navy about cybersecurity and the threats and vulnerabilities it faces, Keck observed, “It depends on which part of the Navy you’re talking about. At most senior levels, I would say that awareness is very high. The lower you move down the chain, the lower the level of awareness.”

NETWORK AWARENESS

The heart of NCDOC is Prometheus, a system-of-systems that receives, aggregates, processes, correlates and fuses real-time and near-real-time information from multiple network sources to provide network domain awareness. “Network domain awareness” — a term that NCDOC coined and uses instead of situational awareness — provides commanders with the intelligence to make better-informed decisions about the directions in which they need to go, resource allocations and operations.

“We say ‘network domain awareness’ instead of ‘situational awareness’ because we’re not trying to tell where the ships are or what the weather is or anything of that nature,” Granger explained. “Network domain awareness is about what’s happening on the network and about the health of the network.”

A retired Navy commander, Granger joined NAVCIRT, which became NCDOC in 2006, as the first civilian in 1997.

The huge and growing number of security events was the stimulus for the creation in 2006 of Prometheus, which was built upon an earlier system called Mobius. The problem at the time was two-fold, according to Granger.

“First, there was the data crush, which was only growing,” he explained. “We couldn’t handle all the alarms and we couldn’t aggregate and correlate them. At the same time, we needed analytical tools that could handle the massive amounts of data.”

Prometheus collects three primary data classes:

• Referential data: What does the network look like?

• Activity data: What’s happening on the network?

• Command and control information: Who owns that portion of the network on which activity is occurring?

The data is collected from hundreds of sensors on the Navy’s networks, as well as intrusion protection systems, compliance reporting databases, and every type of log, and combined by Prometheus to provide the network domain awareness.
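To make the fusion step described above a bit more concrete, here is a minimal sketch in Python of how an activity-data alert might be enriched with referential data (what the affected asset is) and command-and-control data (who owns that slice of the network). Every field name, record, and address below is hypothetical; the article does not describe Prometheus's actual schemas or interfaces.

# Hypothetical illustration of fusing the three data classes described above.
# None of these field names or records come from NCDOC; they are invented.

referential = {            # "What does the network look like?"
    "10.1.2.3": {"hostname": "nmci-ws-0417", "enclave": "NMCI"},
}

command_and_control = {    # "Who owns that portion of the network?"
    "NMCI": {"owner": "NCDOC watch section A"},
}

def fuse(alert):
    """Combine an activity-data alert with referential and command-and-control context."""
    asset = referential.get(alert["src_ip"], {})
    responsible = command_and_control.get(asset.get("enclave"), {})
    return {**alert, "asset": asset, "responsible": responsible}

# Activity data: "What's happening on the network?"
alert = {"src_ip": "10.1.2.3", "signature": "suspicious outbound beacon", "severity": 7}
print(fuse(alert))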

“This capability provides the Navy with an exceptional ability to develop a deep understanding of the environment and to characterize network activity and continuously move toward earlier recognition of anomalous behavior requiring in-depth analysis,” Granger said.

Using customized filters for tracking information, Prometheus can automatically detect trends within its database and initiate further analysis when suspicious activity occurs. “The filters give our watch standers the flexibility to see the incidents and other data that they need and ideally see only information that is actionable,” Granger said. “I want my guys to see only something that they’re going to do something about.”
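Granger's point that watch standers should see only what they will act on can also be sketched in a few lines of Python. The threshold, field names, and sample events below are made up for illustration; the real Prometheus filters are obviously far richer.

# Hypothetical actionable-event filter in the spirit Granger describes;
# the threshold and fields are invented for illustration only.

def is_actionable(event, min_severity=6):
    """Keep only events a watch stander would actually act on."""
    return (
        event.get("severity", 0) >= min_severity
        and event.get("responsible") is not None      # we know whom to task
        and not event.get("already_ticketed", False)  # no duplicate work
    )

events = [
    {"signature": "port scan", "severity": 3, "responsible": "watch A"},
    {"signature": "suspicious outbound beacon", "severity": 7, "responsible": "watch A"},
    {"signature": "malware callback", "severity": 9, "responsible": None},
]

for event in filter(is_actionable, events):
    print(event["signature"])    # only the beacon passes all three tests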

EVENT MANAGEMENT

Prometheus has two primary components: a Novell Sentinel front-end for security event management, and a data warehouse back-end based on SAS Institute’s Intelligence Platform components, including SAS Enterprise BI Server, SAS Data Integration Server and SAS Intelligence Storage. Sentinel alerts and prioritizes all security events in a centralized dashboard that is easily accessed by security operators in NCDOC’s operations center at any time. The SAS data warehouse integrates and stores the large volumes of computer network defense data for long-term storage and trend analysis.

“Prometheus gives us tremendous flexibility,” Granger said. “It enables us to visualize data and it also enables us to export data in a common standards-based format. Even when we change a data source, our operators on the watch floor don’t have to change operating procedures or have to be re-trained on a new piece of equipment. They keep looking at the same interface, but they can view more information in perhaps a different manner.”

The Sentinel component of Prometheus has been heavily customized by NCDOC to meet the organization’s requirements. For example, “We’ve driven a lot of developmental work to build what we call ‘right-click functionality,’ which allows our watch operators on the floor to right-click to do a ‘who-is lookup’ or automatically generate a trouble ticket or input tasks into the workflow,” Granger said.

The Naval Research Lab, a key contributor to the evolution of Prometheus, has developed most of the agents that create the bridge between individual data sources and Sentinel.

“As far as the SAS backend goes, it’s met the requirements,” Granger noted. “We had to build the tables to meet the unique requirements of the individual data sources, but I would call that using the product, not specifically tailoring or modifying. I like being in a business where I can say that we haven’t come close to touching all the capabilities of the product. That’s where we are now.”

For the future, NCDOC is focused on what Keck contends is the biggest risk for the Navy’s networks. “I think it’s where the threats, vulnerabilities and impacts come together,” she said. “I want to be more proactive about the actions we can take to reduce that risk, because you’re never going to be able to take care of all the threats, and you’ll never be able to patch all the vulnerabilities.”

Mark Kagan is a Washington, D.C.-based consultant and writer on defense, intelligence and security.

Wednesday, November 11, 2009

The CIA’s Bureaucracy Problem (and my comments)

By Ishmael Jones

National Review
November 5, 2009

An Italian court recently sentenced 23 CIA employees in absentia for their role in the 2003 Abu Omar rendition.

We should capture terrorists anywhere, any time, but we should get the job done right and with a minimum of bureaucracy. Real spying is inexpensive and requires few people. The basic act of espionage is a single CIA officer meeting a single source — a person with access to secrets on terrorists or nuclear proliferators, for example — in a dingy hotel room in a dysfunctional country.

Any CIA operation that is revealed to the public, however, shows these telltale signs: The operation looks busy, a lot of people are involved, and large amounts of money are spent. Often you’ll hear the CIA accused of being risk averse. I agree. However, risk aversion is a complex concept. The CIA will sometimes conduct risky operations in order to achieve a more important goal: looking busy. In the Abu Omar operation, 21 Agency employees flew to Italy to abduct a single terrorist suspect — as an eminent scholar put it, “21 people to get one fat Egyptian!” — who was already under surveillance by the Italian police. The 21 people stayed in five-star hotels and chatted with headquarters on open-line cell phones, all at great expense and awful tradecraft. The number of people managing the operation from headquarters was enormous. But it was a successful operation in that it spent a lot of money, made a lot of people look active, and suggested the CIA’s willingness to take risk.

CIA officials are quick to deny that the organization is risk averse by pointing to risky operations that went wrong. This darker, more complex, passive-aggressive aspect of risk aversion seems to say: We can certainly do risky operations, but here’s what happens when you make us get off our couch and do them.

Take a look at any CIA activity that is revealed in the future and ask yourself: Was this a traditional, inexpensive intelligence operation involving a meeting between a CIA officer and a human source to gather intelligence? Or was this an operation designed to spend a lot of money, to make a lot of people look busy, and to give the appearance that the CIA is willing to take risk?

Whenever we see CIA employees released from bureaucracy, we see success. The tactical intelligence production within Iraq is excellent; the early Afghan campaign, featuring no offices and a flat chain of command, just a few guys and bags of money, was extraordinary.

“Ishmael Jones” is a former deep-cover officer with the Central Intelligence Agency. He is the author of The Human Factor: Inside the CIA’s Dysfunctional Intelligence Culture, published last year by Encounter Books.

Mark says:

I post this article for several reasons. First, I highly recommend Jones' book, referenced above. At the very least, it provides insights into the fact that the CIA is, probably before anything else, another large bureaucratic government agency.

Second, this article illustrates how bureaucratic politics, organizational drivers (rewards and punishments), and risk aversion can be as important as, if not more important than, the goals and objectives of the organization, i.e., the CIA. Or perhaps the stated public goals and objectives of the organization do not necessarily match or coincide with its bureaucratic goals and objectives?

Third, this article stands as a retort to the conspiracy mongers who think the CIA is an all-powerful, all-knowing entity responsible for everything that happens. Then again, the fact that the article was written by a former CIA agent would just be proof that it's another example of CIA disinformation to confuse and blind the naive public. Right?

Sunday, October 25, 2009

Historians Reassess Battle of Agincourt

By James Glanz

The New York Times
October 25, 2009

Maisoncelle, France — The heavy clay-laced mud behind the cattle pen on Antoine Renault’s farm looks as treacherous as it must have been nearly 600 years ago, when King Henry V rode from a spot near here to lead a sodden and exhausted English Army against a French force that was said to outnumber his by as much as five to one.

No one can ever take away the shocking victory by Henry and his “band of brothers,” as Shakespeare would famously call them, on St. Crispin’s Day, Oct. 25, 1415. They devastated a force of heavily armored French nobles who had gotten bogged down in the region’s sucking mud, riddled by thousands of arrows from English longbowmen and outmaneuvered by common soldiers with much lighter gear. It would become known as the Battle of Agincourt.

But Agincourt’s status as perhaps the greatest victory against overwhelming odds in military history — and a keystone of the English self-image — has been called into doubt by a group of historians in Britain and France who have painstakingly combed an array of military and tax records from that time and now take a skeptical view of the figures handed down by medieval chroniclers.

The historians have concluded that the English could not have been outnumbered by more than about two to one. And depending on how the math is carried out, Henry may well have faced something closer to an even fight, said Anne Curry, a professor at the University of Southampton who is leading the study.

Those cold figures threaten an image of the battle that even professional researchers and academics have been reluctant to challenge in the face of Shakespearean prose and centuries of English pride, Ms. Curry said.

“It’s just a myth, but it’s a myth that’s part of the British psyche,” Ms. Curry said.

The work, which has received both glowing praise and sharp criticism from other historians in the United States and Europe, is the most striking of the revisionist accounts to emerge from a new science of military history. The new accounts tend to be not only more quantitative but also more attuned to political, cultural and technological factors, and focus more on the experience of the common soldier than on grand strategies and heroic deeds.

The approach has drastically changed views on everything from Roman battles with Germanic tribes, to Napoleon’s disastrous occupation of Spain, to the Tet offensive in the Vietnam War. But the most telling gauge of the respect being given to the new historians and their penchant for tearing down established wisdom is that it has now become almost routine for American commanders to call on them for advice on strategy and tactics in Afghanistan, Iraq and other present-day conflicts.

The most influential example is the “Counterinsurgency Field Manual” adopted in 2006 by the United States Army and Marines, and now smack in the middle of the debate over whether to increase troop levels in Afghanistan.

Gen. David H. Petraeus, who oversees the wars in Iraq and Afghanistan as the head of the United States Central Command, drew on dozens of academic historians and other experts to create the manual. And he named Conrad Crane, director of the United States Army Military History Institute at the Army War College, as the lead writer.

Drawing on dozens of historical conflicts, the manual’s prime conclusion is the assertion that insurgencies cannot be defeated without protecting and winning over the general population, regardless of how effective direct strikes on enemy fighters may be.

Mr. Crane said that some of his own early historical research involved a comparison of strategic bombing campaigns with attacks on civilians by rampaging armies during the Hundred Years’ War, when England tried and ultimately failed to assert control over continental France. Agincourt was perhaps the most stirring victory the English would ever achieve on French soil during the conflict.

The Hundred Years’ War never made it into the field manual — the name itself may have served as a deterrent — but after sounding numerous cautions on the vast differences in time, technology and political aims, historians working in the area say that there are some uncanny parallels with contemporary foreign conflicts.

For one thing, by the time Henry landed near the mouth of the Seine on Aug. 14, 1415, and began a rather uninspiring siege of a town called Harfleur, France was on the verge of a civil war, with factions called the Burgundians and the Armagnacs at loggerheads. Henry would eventually forge an alliance with the Burgundians, who in today’s terms would become his “local security forces” in Normandy, and he cultivated the support of local merchants and clerics, all practices that would have been heartily endorsed by the counterinsurgency manual.

“I’m not one who sees history repeating itself, but I think a lot of attitudes do,” said Kelly DeVries, a professor of history at Loyola College in Maryland who has written extensively on medieval warfare. Mr. DeVries said that fighters from across the region began filtering toward the Armagnac camp as soon as Henry became allied with their enemies. “Very much like Al Qaeda in Iraq, there were very diverse forces coming from very, very different places to fight,” Mr. DeVries said.

But first Henry would have his chance at Agincourt. After taking Harfleur, he marched rapidly north and crossed the Somme River, his army depleted by dysentery and battle losses and growing hungry and fatigued.

At the same time, the fractious French forces hastily gathered to meet him.

It is here that historians themselves begin fighting, and several take exception to the new scholarship by Ms. Curry’s team.

Based on chronicles that he considers to be broadly accurate, Clifford J. Rogers, a professor of history at the United States Military Academy at West Point, argues that Henry was in fact vastly outnumbered. For the English, there were about 1,000 so-called men-at-arms in heavy steel armor from head to toe and 5,000 lightly armored men with longbows. The French assembled roughly 10,000 men-at-arms, each with an attendant called a gros valet who could also fight, and around 4,000 men with crossbows and other fighters.

Although Mr. Rogers writes in a recent paper that the French crossbowmen were “completely outclassed” by the English archers, who could send deadly volleys farther and more frequently, the grand totals would result in a ratio of four to one, close to the traditional figures. Mr. Rogers said in an interview that he regarded the archival records as too incomplete to substantially change those estimates.

Still, several French historians said in interviews this month that they seriously doubted that France, riven by factional strife and drawing from a populace severely depleted by the plague, could have raised an army that large in so short a time. The French king, Charles VI, was also suffering from bouts of insanity.

“It was not the complete French power at Agincourt,” said Bertrand Schnerb, a professor of medieval history at the University of Lille, who estimated that there were 12,000 to 15,000 French soldiers.

Ms. Curry, the Southampton historian, said she was comfortable with something close to that lower figure, based on her reading of historical archives, including military pay records, muster rolls, ships’ logs, published rosters of the wounded and dead, wartime tax levies and other surviving documents.

On the English side, Ms. Curry calculates that Henry probably had at least 8,680 soldiers with him on his march to Agincourt. She names thousands of the likely troopers, from Adam Adrya, a man-at-arms, to Philip Zevan, an archer.

And an extraordinary online database listing around a quarter-million names of men who served in the Hundred Years’ War, compiled by Ms. Curry and her collaborators at the universities in Southampton and Reading, shows that whatever the numbers, Henry’s army really was a band of brothers: many of the soldiers were veterans who had served on multiple campaigns together.

“You see tremendous continuity with people who knew and trusted each other,” Ms. Curry said.

That trust must have come in handy after Henry, through a series of brilliant tactical moves, provoked the French cavalry — mounted men-at-arms — into charging the masses of longbowmen positioned on the English flanks in a relatively narrow field between two sets of woods that still exist not far from Mr. Renault’s farm in Maisoncelle.

The series of events that followed as the French men-at-arms slogged through the muddy, tilled fields behind the cavalry were quick and murderous.

Volley after volley of English arrow fire maddened the horses, killed many of the riders and forced the advancing men-at-arms into a mass so dense that many of them could not even lift their arms.

When the heavily armored French men-at-arms fell wounded, many could not get up and simply drowned in the mud as other men stumbled over them. And as order on the French lines broke down completely and panic set in, the much nimbler archers ran forward, killing thousands by stabbing them in the neck, eyes, armpits and groin through gaps in the armor, or simply ganged up and bludgeoned the Frenchmen to death.

“The situation was beyond grisly; it was horrific in the extreme,” Mr. Rogers wrote in his paper.

King Henry V had emerged victorious, and as some historians see it, the English crown then mounted a public relations effort to magnify the victory by exaggerating the disparity in numbers.

Whatever the magnitude of the victory, it would not last. The French populace gradually soured on the English occupation as the fighting continued and the civil war remained unresolved in the decades after Henry’s death in 1422, Mr. Schnerb said.

“They came into France saying, ‘You Frenchmen have civil war, and now our king is coming to give you peace,’ ” Mr. Schnerb said. “It was a failure.”

Unwilling to blame a failed counterinsurgency strategy, Shakespeare pinned the loss on poor Henry VI:

“Whose state so many had the managing, That they lost France and made his England bleed.”