A unified theory of Spygate: The (partial) story and scope (as we know it so far)

The brain trust, briefing Congress in 2014. (Image: Defense Intelligence Agency)

This article won’t be quite as comprehensive as the title may suggest.  But I think it may be useful as a way of backing off from the weeds we’ve been in for the last three years, and seeing some things whole: things that have been evident to those with expertise in particular aspects of Spygate, but perhaps not to everyone else.

My hope is to keep this tight and focused.  So I’m not going to provide the full arguments and justifications here.  They have been dealt with elsewhere; the links are included for further reading.

The focus here is the “how” of Spygate.  It’s not about the “why,” at least not at the level of story and song.  I’ve written extensively about that too, and for our purposes here will clarify that I believe the Spygate actions, per se, undertaken prior to 2015 were about retaining an ability to spy on political opponents, successors, and all other Americans even after Obama left office in January 2017.  The Spygate actions taken in the fall of 2015 and thereafter were mostly about exercising this capability against Donald Trump.

Russiagate was a component of Spygate, not vice versa.  Russiagate was about crafting an anti-Trump narrative for the 2016 election.  Spygate was about entrenching a mechanism with which to hold the American political sphere at risk, even if one did not hold the White House at a particular time.  A “deep state” contingent could tend the mechanism during out-of-office periods.  And the intention was precisely to do so.


The breach in the wall

The story of the “how” begins at the end of 2009, with the “Underwear Bomber” airline incident on Christmas Day. The short take on it is that at the time, it was held to reveal an information disconnect, one that had prevented U.S. and British agencies from intercepting the would-be perpetrator before he got on the plane with a bomb in his briefs.

The Obama official who spearheaded the U.S. policy review over the next 24 months was the counterterrorism “czar,” John Brennan.  He and the appointed task force concluded that to achieve better interagency coordination, it was necessary to loosen up the sharing of FBI data with the National Counterterrorism Center (NCTC), an agency of the Office of the Director of National Intelligence (the DNI’s agency, ODNI).

The new guidelines were about personal information sharing, including on U.S. persons.  They formalized an exchange of U.S. person identity information (USPI) that had not existed before between the two agencies – because the data exchange reveals the identity of the person in connection with potentially criminal information.  When it’s a foreign person overseas, that’s one thing.  When it’s a U.S. person with Fourth Amendment rights, it’s another.

Congressman Louie Gohmert (R-TX) has also highlighted that a DOJ submission to the FISA court on this matter, in April 2012, addressed sharing of citizens’ sensitive personal information with the other members of the “Five Eyes” collective (U.S., UK, Canada, Australia, and New Zealand).  He is right to regard that as a concern, and quite possibly relevant to Spygate.

Among federal agencies, the FBI and NCTC have just about the least-questioned access to USPI.  The DEA may also be included, but its access serves a narrower purpose and is more likely to encounter the check of review in criminal court cases.

Don’t even think about it. Guard gate at the National Counterterrorism Center (NCTC) in McLean, VA. (Image: Google Street View)

Because of their missions, the FBI and NCTC are assumed to need such access for national security.  The things they do with it may never have to be justified in court, or even exposed.

Armed with their memorandum of understanding (MOU), the agencies had a formally approved mechanism for sharing USPI – associated with trackable activities – that hadn’t been shareable before.

The Brennan factor

Brennan is significant in this mix, because in 2008 and 2009, a company he was the president of (from 2005 to the end of 2008) obtained database maintenance and analysis contracts with both the NCTC and the FBI.  These were the databases in which the data we’re talking about were resident.

Brennan of course relinquished his role as president of The Analysis Corporation (TAC) when he joined the Obama administration in January 2009.  But he was deeply embedded in the design and manning of the NCTC from its inception, having been its first director and in charge of its immediate predecessor in the 2003-2004 period.  Brennan’s company getting a contract with it a handful of years later was a matter of people who had come from the agency, now working for Brennan (the former boss of many of them), going back with the same clearances and expertise to man it as contractors.  For quite a few, it was a matter of little more than getting a new contractor’s badge to hang on the lanyard, and receiving the paycheck from someone else.  This is a well-understood pattern in government agencies.

Obama with former CIA Director John Brennan. (Image: Screen grab of YouTube video)

When the FBI contract was obtained the next year, a similar dynamic was at work.  This is the take-away: starting in 2009, people who all worked for Brennan’s old company were accessing and managing the databases at both the FBI and the NCTC.  By January 2012, they had an MOU allowing that data to be shared more readily than it had been before.  The MOU was brokered by the man who had been president of the company they worked for, only months before he took the job with Obama.

This data-sharing vehicle may seem on its face to have only a tangential relation to spying on political opponents.  But it brings together as nothing else does four essential elements of the Spygate scandal:

Automated access through intelligence systems to USPI;

Known contractor involvement in the systems through which such access could be made;

Interagency sharing of data foreknown to include USPI;

And a direct connection of the contractors to the most principal Spygate principal of them all, John Brennan.

That’s a moral-hazard wormhole wild enough to make a starship buck around passing through it.  (In fact, we’ve seen recent events – also here – that suggest the wormhole is finally being closed up, at the instigation of just-departed acting DNI Ric Grenell.)

The IC’s IT transformation

At the same time, between 2009 and 2012, the U.S. intelligence community was beginning to field a set of information technology capabilities under the enterprise name ICITE – Intelligence Community Information Technology Enterprise.  A key component of ICITE was a shift to a broader, more-accessible data-mining capability for IC users of the various agencies’ products (users who included FBI and NCTC personnel as well as others).  One example of that effort was the ICREACH search engine, which is useful (see link) because it illustrates the principles behind shifting from an information-push environment to information-pull.

Here are the two money quotes from IC officials on the purpose and intent of the ICITE project.  From NSA in November 2014:

In 2010, NSA decided to pursue cloud as its “repository of choice”—a common space where all analysts could look across the agency’s entire river of collected information.

“The common space is hugely important for us—and the ability to discover,” says the NSA official. “We wanted to make all data discoverable. That’s the approach we took.”

NSA makes the data discoverable through metatagging. Each piece of intelligence information in the NSA cloud is marked with telling details, such as when it entered the system, who put it there and who has the ability to access it.

That was the environmental shift in the data-access concept being implemented at the same time loosened data-sharing guidelines were being prepared for the NCTC and FBI.  The guidelines, per the 2012 DOJ FISA request, included automated retrieval at NCTC.

From DNI James Clapper in 2015:

The Intelligence Community Information Technology Enterprise is designed to provide a cloud-based, common desktop environment for intelligence agencies. It will offer an app store, network engineering services, and a method for identifying and authenticating users. The goal is to make information available at the touch of a button — but only to the right people with the right access.

“Tag the people, tag the data” has been Clapper’s refrain since plans for the effort were first announced.

The system is designed to avoid physical restrictions on data. “Any necessary separation of data within ICITE shall be achieved through a logical construct instead of by physical separation to the greatest extent possible and in accordance with applicable legal and policy requirements,” the memo states.

Taken together, these quotes mean the IC was intent on making more information data-mineable, and automating the access to it.

There were supposed to be safeguards in place to keep USPI from being pulled improperly.  But those safeguards were a matter of careful practice by analysts.  They didn’t exist as system blocks on accessing data.  “Tag the people, tag the data” in practice meant that if your user account had certain credentials, you could pull it.  The cells of data carried security tags (“metatagging”) for which your user ID met the criteria.  Accessing USPI was a matter of running clever queries against the database.
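In software terms, the distinction between analyst-practice safeguards and system blocks can be sketched roughly as follows.  This is purely illustrative: the record fields, tag names, and credential sets are my own assumptions for modeling the general “tag the people, tag the data” concept, not any actual IC schema.

```python
# Illustrative sketch of metadata-tagged, credential-gated retrieval.
# All record fields, tag names, and credentials here are hypothetical.

RECORDS = [
    {"id": 1, "body": "...", "tags": {"TS", "SCI", "USPI"}},
    {"id": 2, "body": "...", "tags": {"TS"}},
]

def query(records, user_creds, criteria):
    """Return records whose every tag is covered by the user's
    credentials and which also match the query criteria."""
    return [
        r for r in records
        if r["tags"] <= user_creds and criteria(r)
    ]

# A user account credentialed for USPI-tagged data pulls it with an
# ordinary query -- there is no separate, per-record authorization step.
analyst_creds = {"TS", "SCI", "USPI"}
hits = query(RECORDS, analyst_creds, lambda r: "USPI" in r["tags"])
```

The point of the sketch: the only gate is the comparison between the record’s tags and the account’s credentials.  Nothing in the retrieval path asks *why* the query was run; that check, if it happens at all, is a matter of after-the-fact audit.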

(Image: Screen grab of CNN video, YouTube)

Add one more factor to this IT transformation: the IC contract in 2013 with Amazon Web Services to operate an IC cloud at the Top Secret level.  The real impact of this was not so much on the first-order user operations as on the follow-on handling of Top Secret/Sensitive Compartmented Information (TS/SCI data).  Moving data to the cloud, especially if done manually, could cut the data’s tether to the original classification and tracking markers, and enable a user to share data without inviting immediate system-level scrutiny.
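The tether-cutting point can be made concrete with a toy data model.  Again, the field names here are hypothetical assumptions of mine, chosen only to show the mechanics: when markers travel as metadata attached to a record, any copy that carries only the payload sheds them.

```python
# Illustrative only: a hypothetical record structure showing how a
# manual copy of just the payload severs classification and audit
# metadata that a system-level transfer would preserve.

original = {
    "payload": "report text ...",
    "classification": "TS//SCI",
    "origin_system": "agency-db-01",
    "audit_id": "a1b2c3",
}

# A system-level transfer copies the whole record, markers included ...
system_copy = dict(original)

# ... but a manual copy-out of the content alone carries none of them.
manual_copy = {"payload": original["payload"]}
```

Once the record exists in the second form, nothing downstream can tie it back to its origin or its handling rules without outside knowledge.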

In the older environment of locally-managed systems, which were monitored and audited relatively frequently by the individual agencies, it was hard to escape detection if one were slinging data around nefariously.  Sling it in the cloud, however, and detection on any meaningful timeline was much less likely.

The importance of the cloud can’t be overemphasized.  The ability of people in the IC to use data-mining to spy on even their own bosses depended on it.  That’s how the Obama administration was going to extend the spying on the Trump administration indefinitely, far beyond Trump’s inauguration day.  If not for Devin Nunes and Admiral Mike Rogers, they might have gotten away with it too.

(The Obama administration’s last-minute modification to Executive Order 12333, undertaken in an unusually subterranean manner, was a measure to facilitate continued “spying by data-mining” after Trump entered office.)

It’s never too often, meanwhile, to point out that Jeff Bezos still owns both Amazon Web Services, with its IC cloud contract, and the Washington Post.  I’m not a fan of extra laws, but there ought to be one against that.

One of several massive Amazon Web Services data centers in northern Virginia, providing cloud computing services to government agencies. (Image: Google Street View)

The White House component

The White House had to be the hub of such activity, even under a different president (i.e., not Obama, but his successor, whoever it was).  The top-level reason for that is one I’ve laid out many times before.

There is one entity in the federal government that to all intents and purposes never gets audited, and it’s the Executive Office of the President.  The National Security Council is part of the EOP.  If you need to avoid scrutiny for your spying-by-data-mining operations, that’s literally the safest place to honcho them – unless Devin Nunes is on the qui vive.

Even if you have to distribute your operations across agencies – Treasury, FBI, NCTC, CIA, ODNI – you need a central hub somewhere.  The NSC is likely to be your best option, because of what Admiral Rogers could tell us about the unlikelihood of imposing accountability on the NSC staff, unless the president himself intervenes on your side.

There’s another aspect of the White House component, however, and that’s the ability to spy directly on the White House.  It was quite hard to do that by modern, info-age methods up until 2015, when the Obama administration updated the entire White House IT apparatus.

Now it’s not so hard – unless the Oval Office sees you coming.  I suspect that the Trump administration being startled in that regard is why the Obama-era IT coordinator for the president’s office was abruptly dismissed in February 2017, without explanation.  Obama had the White House IT infrastructure updated in 2015 so it could be more readily spied on after he left.  Trump didn’t appreciate that.  The holdover guy got canned.

The sequence

The desktop fielding of the common ICITE user suite exploded from 9,000 terminals in 2013 to 50,000 in 2016.  In the same period, as documented in Mike Rogers’s confessional report to the FISA court in October 2016, the non-contents FISA Section 702 queries that delivered USPI to users of the NSA data stores surged as well, as depicted in the now well-known graphic from an ODNI report published in 2017.

(ODNI transparency report graphic)

In other words, the practice in the intel community of backdoor data-mining on U.S. persons was well underway before Trump came down the escalator in June 2015.  When I identified the likely method by which the Obama administration had been spying on Trump, after Trump’s notorious tweet about “wiretapping” in March 2017, that capability is the one I had in mind.

The Obama administration was probably spying on all the candidates in 2015 and 2016.  It wasn’t hard to do.  It didn’t have to be done at any one particular place either; it just had to be done by people whose user accounts on the IC system allowed them to access the relevant data.

We know contractors at the FBI had access to USPI; that problem was reported on 9 March 2016.  The most likely contractors were the ones hired from TAC.  But those contractors, and TAC contractors at the NCTC, could have been sharing the data with anyone who had an accredited account on the IC’s government-wide TS/SCI system.  That could encompass quite a list of people, including such luminaries as Nellie Ohr – also a contractor with a clearance, and a history with the CIA – and officials in the national security organizations at the FBI and DOJ.

We also know that similar queries were being run by people at the Treasury Department.  We have reason to believe they were being run by staffers at the NSC as well, given the implications of spreadsheets being created for Susan Rice.  Populating a spreadsheet suggests importing more data than Rice was likely to be authorizing specific “unmasking” actions for.

Moreover, it probably wasn’t just the candidates for president who were being spied on by data-mining.  We also know that members of Congress and the media were spied on earlier in the Obama years.  They were probably being spied on during the 2016 campaign too.

As mentioned, the 9 March 2016 discovery about contractors for the FBI represented a major break in this headlong career of spying-by-data-mining.  (I believe the contractors in question were at the FBI Washington Field Office, which would have been an ideal location for what they were probably doing.  Lisa Page referred in a text on that date to a big blow-up at the “wfo.”)

Admiral Rogers instituted new controls on 18 April 2016, which were meant to curtail the “backdoor unmasking” that federal agencies had effectively been performing at that point for at least four years.  This move by Rogers drove the Obama administration Spygate planners (of whom I imagine Brennan to be the chief strategist, although not the only one involved) to do two things.  One was to shift the backdoor operation further underground, to make it less visible.

The other was to seek a means of harvesting data through the front door.  That’s what the FISA authority drama was about in 2016.  In mid-April it was looking increasingly likely that the Republican candidate for president would be Trump.  By the time the FBI was making a full-on structured effort to obtain FISA authority (i.e., as early as June 2016), it was clear the candidate would be Trump.


The effort therefore focused on Trump.  It is evident, in hindsight, that there was a concerted effort against Trump underway before mid-April, and we can assume that it entailed not just the services of Stefan Halper, Joseph Mifsud, and Alexandra Chalupa’s network, but the use of spying by data-mining.  All of these things had been in play well before the Mike Rogers interdiction date of 18 April 2016.

Remember, the backdoor data-mining didn’t stop after 18 April.  It was being done more stealthily.  Rogers made his report on it to the FISA court in October 2016, and was threatened with firing by James Clapper for his pains.

Why apply to the FISA court at all?

Since the backdoor data-mining remained an option, and FISA authority against U.S. persons has to be renewed frequently, I’ve always questioned why the Spygate planners chose to go this route.  My interim conclusion has been that – since they expected Hillary Clinton to win – they assumed they would find something prosecutable in Trump’s circle, and they’d be able to use it in a Hillary Clinton administration to take Trump down for good in court.

To do that, they’d have to have information on him and/or his associates acquired through the front door.  They might gun-deck the actual prosecution in a Hillary administration, but not the fine point of how they acquired criminally actionable evidence.  That simply had to have a paper trail through the front door.

On the off chance Trump won the election, they might still be able to use evidence acquired through the front door.  So it was worth the shot to go ahead with it.  It seemed to cover both contingencies.

The problem was that there wasn’t anything to find.  Now, the truth is that they would already have known that by October 2016, when they succeeded with the Carter Page application.   They had been backdooring Trump and his associates for at least 16 months at that point.  I’ve made this point repeatedly as well.  By October 2016, they already knew.  They knew what there was to find in the pasts of Trump and his circle.

That means everything they were doing by that date was a strategy for something other than finding out stuff about Trump.  Indeed, they already knew what there was to find out about Trump’s past by mid-2015 at the latest.

The whole thing, from the fall of 2015 onward, was about what the “insurance policy” was going to be.

The big-league implications

This is where it’s imperative to focus as we think.  Forget everything else for the moment, and concentrate on this.

The insurance policy wasn’t to address only the problem of Trump.  The Obama administration had been building the insurance policy since before 2012, more than three years before Trump threw his hat in the ring.  (I believe the Obama team found the existing plans for the intel systems involved, which were a natural outgrowth of IT trends, to be a ready opportunity.  They didn’t make it all up from scratch.  But they did recognize the scope of the opportunity presented to them by the coincidence of timing and technology.)

The insurance policy was to keep the levers of risk available to the departing administration, even when it left the White House.  The point wasn’t to spy enough to bring Trump down.  The point was to keep spying no matter what.

The “Old” (Eisenhower) Executive Office Building across from the White House in Washington, D.C. (Image: Wikimedia)

There would always be someone in the “deep state” with the clearance to get a user account on the IC system.  The backdoor-mining capability and the IC cloud had to be kept available.

They reckoned without Nunes and Rogers, however.  Nunes was attacked with extraordinary vituperation precisely because within six weeks of Trump taking office, he had zeroed in on the really big risk-lever the deep staters and their backers had to keep hold of: the ability to continue backdoor mining.

Six lousy weeks, and they faced losing it all.  In April 2017, Mike Rogers did reduce the availability of the raw data substantially by ceasing to hold it in the database.  That was a blunt method of cutting users off; it appears that the success of this approach (as noted also by sundance at Conservative Treehouse on Sunday) has been mixed.  Improper queries continued in the thousands into 2018.  Consider, at this point: it’s not necessarily people with the Trump administration’s priorities or tasking who are doing that.

Here’s what we cannot afford to forget.  They – the overlapping Obama alumni and deep staters – want it back.  The Obama administration had a motive to keep the spying going even before Trump became the GOP candidate in 2016.  That motive preceded the dust-up over Hillary’s emails (which sparked to life in 2015), and even the earliest pretend-“agreement” with Iran (i.e., the Joint Plan of Action in 2013).  It was never about those things: not about “CYA” or legacy-tending.

It’s extremely doubtful that it was about a single policy or set of events.  Given how far back it goes, and the catastrophic tenacity with which its proponents have acted on it, this motive seems to be much more general.  It’s about holding onto power: about having an insurance policy that snaps its jaws down on whoever walks up the steps at 1600 Pennsylvania Avenue, in January of every fourth year.

J.E. Dyer


J.E. Dyer is a retired Naval Intelligence officer who lives in Southern California, blogging as The Optimistic Conservative for domestic tranquility and world peace. Her articles have appeared at Hot Air, Commentary’s Contentions, Patheos, The Daily Caller, The Jewish Press, and The Weekly Standard.