Sorry about the light posting last week, but I had a bit of an adventure. Let me tell you about it.
One of my favorite articles of all time was published in 1993 in the Bulletin of the Atomic Scientists. In “Sleuthing From Home,” Vipin Gupta and Phillip McNab detailed their effort at VERTIC to use commercial satellite imagery and seismic networks to detect, characterize and announce a Chinese nuclear test before the Chinese government did. Today, it would be quite a bit of work. Twenty years ago? It was career-altering.
Over the past week I had my own “sleuthing from home” experience. NNSA beat me and two colleagues to the punch, but just barely.
On Tuesday, Eric Berenson, a long-time blog reader, noted an unusual seismic event at the Nevada National Security Site — better known as the Nevada Test Site:
For years I have been looking at the USGS website for earthquakes that occur exactly on the hour or half hour, or that have a depth of 0 km. I used to see them back 6 or 7 years ago, but then they stopped. I check almost every day, and this morning I spotted one. I saved all the info and the waveforms. Here is what I saw:
——————————-
Magnitude: 2.1
Date-Time: Tuesday, July 24, 2012 at 11:00:00 AM at epicenter
Location: 37.224°N, 116.062°W
Depth: 0 km (~0 mile), set by location program
Region: NEVADA
——————————-
This looks like Rainier Mesa. I didn’t check on a map yet, but it sure looks like it. So I went back to the USGS site a couple of hours later and it was REMOVED. Since when does the USGS remove earthquake data from public circulation? Do they really do that?
This certainly looked like a man-made explosion. The location (at the Nevada Test Site), the depth (shallow) and the time (11:00 am on the dot) all scream “it go boom.” We had a probable 2.5 metric ton explosion at the Nevada Test Site (37.224°N, 116.062°W, +/- 0.9 km); using the relation mb = 4.05 + 0.75 log Y, with Y in kilotons, an mb of 2.1 works out to roughly 2.5 tons.
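For anyone who wants to check that back-of-the-envelope number, here is a minimal sketch of the arithmetic, assuming the constants quoted above (published magnitude-yield relations vary with geology and coupling):

```python
import math

# Magnitude-yield relation for a fully coupled explosion in hard rock:
#   mb = 4.05 + 0.75 * log10(Y), with Y in kilotons.
# (Constants are the ones quoted in the post; other sources use different values.)
mb = 2.1
log_Y_kt = (mb - 4.05) / 0.75            # log10 of yield, in kilotons
yield_tons = (10 ** log_Y_kt) * 1000     # convert kilotons to metric tons

print(f"mb {mb} corresponds to roughly {yield_tons:.1f} metric tons")  # ~2.5 tons
```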
The fact that USGS subsequently deleted the event entry (here is the dead link) struck me as very interesting.
I sent off two emails — one to the University of Nevada-Reno, which operates the regional seismic network in Nevada, and another to the United States Geological Survey.
The first email I received, from UNR, explained that the event was removed from the main website because it had been determined to be a “blast” and not an earthquake. An entry for the event remained online. The researchers were straightforward and transparent in their response. You know, like scholars.
The second email I received, from USGS, was more like what one expects from bureaucrats. Non-responsive. Not helpful. Possibly dissembling, although that implies a level of attention to my email that seems unrealistic. An entry for the event has now reappeared elsewhere, but I am still not convinced the removal was simply because the event was small.
With the confirmation that the event was a blast, I emailed Nick Roth, who knows more about what NNSA is doing than just about anyone at NNSA. Nick and I were able to make further suppositions about the nature of the event at NTS. We could exclude a subcritical test gone very wrong at U1a, as well as a conventional explosion at the BEEF (Big Experimental Explosives Facility). Those two locations were just too far away from the hypothesized epicenter.
That left one option — but NNSA beat us to the announcement that it had conducted the third seismic Source Physics Experiment (SPE-3):
The National Nuclear Security Administration (NNSA) today announced that it has successfully conducted the third seismic Source Physics Experiment (SPE-3) at the Nevada National Security Site (NNSS). The seismic experiment was the third in a series of seven underground, fully-coupled, high-explosive field tests.
SPE-3 included detonating a chemical explosive equivalent to 2,200 pounds of TNT in a contained, confined environment 150 feet below ground.
I think we would have figured it out eventually — all three experiments (SPE-1, SPE-2 and SPE-3) were conducted within a few hundred meters of the hypothesized epicenter for our event. Here is a map showing the location of the SPE series:
One of the other slides contains the latitude/longitude. That’s the same spot as the hypothesized epicenter of our event. Indeed, 2011 satellite images show what appear to be preparations for previous SPEs. (You could do a nice overlay if you had the time.)
In general, I want to compliment NNSA on making available all this information. One detail, however, that NNSA seems careful to obscure is the date for SPE-1, SPE-2 and SPE-3. Keeping the date a secret might explain why these events don’t appear in certain catalogues. I am not okay with that.
Three observations about our little adventure.
First, regional seismic networks offer exquisite monitoring capabilities. The US network detected a 1 ton event, correctly identified it as an explosion and located it within a few hundred meters. As a demonstration of the power of regional seismic monitoring, this is jaw-dropping. The fact that much of this was possible on an open-source basis was pretty cool, too.
Second, I am not convinced of the integrity of the data provided by the US Geological Survey. Events with an mb less than 2.5 should appear in the ANSS catalogue, but this one does not — nor can I find plausible entries for SPE-1 and SPE-2. (I searched for similar-magnitude events in the same area. I made sure to check “all events” as well as “events with no reported magnitude.” Perhaps I am still doing something wrong, but as best I can tell USGS deleted these events.)
Consider a future researcher who claims (incorrectly) that the regional monitoring network “missed” the SPEs — a series of large explosions conducted at NTS for the very purpose of CTBT verification. How would anyone know this was incorrect, unless some blogger-scum noticed USGS deleting the event in real time and then observed that it had not been submitted to ANSS?
Eric observed that he stopped seeing anthropogenic events (on the hour, zero depth) about six or seven years ago. That may point to a policy change that has persisted into the Obama Administration. I wonder if the Director of USGS, Marcia McNutt, is aware of such a practice and whether she would defend it.
Third, even if the removal of “blasts” is a transparent decision, I still think it sets an unhelpful precedent from a national security perspective. I understand that most people are interested only in earthquakes, but some other people are interested in treaty verification. These are both legitimate interests served by collecting seismic data.
Iran, for example, has an exquisite regional earthquake monitoring network that could provide significant monitoring capability against any Iranian nuclear test. Imagine the reaction if Tehran openly admitted to systematically removing man-made explosions from that data set.
A precedent of removing explosions from seismic catalogues would prevent one of the best confidence building measures we might pursue with regard to Iran’s nuclear program and the Comprehensive Nuclear Test Ban Treaty — exchanging real-time regional seismic data. The three best regional monitoring networks in the world are based in Iran, Japan, and the Western United States. One of my goals is to encourage Iran to share regional seismic data on a real-time basis with one of these comparable networks — Japan is probably easiest. Data exchanges would help both countries collaborate on disaster response by learning how the other detects, characterizes, warns and responds to earthquakes rapidly (a Japanese specialty I experienced first hand). The fact that this collaboration would also have a significant nonproliferation benefit is a nice bonus.
Of course, there is no nonproliferation benefit if states routinely engage in monkeying with the data.
Hi, change “is” to “if” in the last sentence and things will go better. Best regards. antonio
Is there a way to distinguish between an earthquake and a nuclear detonation? Would a given critical mass produce a definite seismic reading if other factors, such as the epicenter and geological characteristics, are known as well? Or does a certain seismic reading only reveal that it is possible a nuclear bomb was detonated? Or can you just assume that if a seismic event happens at a test center, then it most likely is a nuclear bomb? And finally, how can seismic readings help to support the Nuclear Test Ban Treaty?
The wave forms are distinguishable. That’s a whole other post, but it is worth noting that UNR was able to do so correctly.
Should this be “there is no nonproliferation benefit IF states routinely…”
Of course, there is no nonproliferation benefit is states routinely engage in monkeying with the data
Jeffrey, Thank you. For a non-technical wonk such as me your post is enlightening. I’ll need some substantial study to absorb all the ramifications.
Pardon my ignorance, but I do have one question that has been bothering me. Are there any means of distinguishing whether an ‘event’ is a subcritical test or a coupled nuclear explosion dampened to emit a seismic signature of 1 kt or less?
On yield. A subcritical test should produce no yield; a hydronuclear test only a few kilograms (from the conventional explosives themselves).
In the event of some sort of mistake, like an overrun for a one-point safety test, the yield might be a few tons. In this case, given the location, we were able to rule out U1a as a location (and, as a result, activities that would be conducted there, like subcritical tests), while pointing very strongly to an SPE shot — the locations agree precisely.
The “decoupling” debate is a giant one. The short version is that there is some data (ok, one data point) suggesting decoupling factors of up to 70 under specific conditions. For an event like this in hard rock, as opposed to salt, the DF is more like 20-40. So we would be looking at 20-80 tons, rather than 1-2 tons, although in this particular case we have NNSA’s announcement that it was a fully coupled conventional explosion of approximately 1 metric ton.
Decoupling is sensitive to frequency, so the actual DF might be considerably less for signals above 15-20 Hz. (Don’t hold me to the precise numbers. I am working from memory here, and that calculation probably assumes a much larger explosion like 1 kt or so.)
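For what it’s worth, here is a minimal sketch of that decoupling arithmetic, using the rough, from-memory ranges above rather than authoritative values:

```python
# Back-of-the-envelope: a decoupled shot radiates roughly 1/DF of the seismic
# energy of a fully coupled shot of the same yield, so the true yield could be
# about DF times the apparent (fully coupled) yield estimate.
apparent_yield_tons = (1, 2)       # fully coupled estimate from the magnitude
decoupling_factor = (20, 40)       # rough range for hard rock (vs. ~70 in salt)

low = apparent_yield_tons[0] * decoupling_factor[0]
high = apparent_yield_tons[1] * decoupling_factor[1]
print(f"Decoupled yield consistent with the signal: {low}-{high} tons")  # 20-80 tons
```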
Fantastic post, as usual. Sort of relatedly, I was sitting on a plane recently and my neighbor pulled out a laptop and started working with seismic data. I ventured that it looked like nuclear explosion rather than earthquake data (my undergrad professor Greg van der Vink would have been proud :), and she gave me a very funny look. Turned out she was a geologist doing contract work on verification issues.
Fascinating stuff! It seems like it should be super-easy to set up an automatic monitor for such things, no? USGS data has an RSS feed, after all. It would be easy enough to grab the info for all zero-depth events as they happen, as well as periodically check to see if any of those events get deleted. It would not be very hard to throw together, I don’t think.
Well, then, what are you waiting for? I’ll even send you the coordinates for U1a, BEEF and the SPE series so we can discriminate by location.
Looking into a USGS ENS account and some coding … looks interesting. Geographic location, magnitude and depth can all be specified.
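For anyone tempted to build the monitor suggested above, here is a minimal sketch of the idea, assuming the present-day USGS GeoJSON summary feed (the feed URL, field names and thresholds here are my assumptions; check the USGS feed documentation and adjust to taste):

```python
import json
import urllib.request

# USGS GeoJSON summary feed of events from the past hour (assumed URL; see the
# USGS earthquake feeds page for the current endpoints).
FEED = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_hour.geojson"

def suspicious_events():
    """Return events that look man-made: very shallow and on a round minute."""
    with urllib.request.urlopen(FEED) as resp:
        data = json.load(resp)
    hits = []
    for feature in data["features"]:
        lon, lat, depth_km = feature["geometry"]["coordinates"]
        t_ms = feature["properties"]["time"]            # origin time, epoch milliseconds
        minute = (t_ms // 60000) % 60
        on_round_time = minute in (0, 15, 30, 45) and (t_ms % 60000) == 0
        if depth_km <= 0.5 and on_round_time:
            hits.append((feature["id"], lat, lon, depth_km,
                         feature["properties"]["mag"]))
    return hits

# Run this periodically (cron, etc.), keep the event IDs you have seen, and
# flag any ID that later disappears from the feed or the catalogue.
if __name__ == "__main__":
    for event in suspicious_events():
        print(event)
```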
Jeff – that was a great Twitter chat with USGS (@usgs).
Thank you for finding this and bringing it into the open.
USGS records are important for research. They are also a tremendous tool for government transparency and accountability. We (the U.S.) should, and usually do, set the example.
As most folks here know, there are a lot of folks outside our borders who also rely on these datasets to do their work.
Governments don’t literally topple over USGS data. And, as you point out, there are lots of other uses, from monitoring to detecting the time and place of various incidents.
the tenacity of your curiosity is just gold.
I think the USGS puts out a separate “quarry blast” bulletin. You might try looking for that. That’s a reasonable approach because one doesn’t want to “contaminate” the earthquake statistics.
I can’t find a separate quarry explosion database.
For what it is worth, USGS offers conflicting statements on the issue. One FAQ says they remove man-made explosions (“Why Do Some Earthquakes Disappear?” http://earthquake.usgs.gov/learn/faq/?faqID=237), another says they label them as blasts but leave them in the database (“What does it mean that the earthquake occurred at a depth of 0 km?” http://earthquake.usgs.gov/learn/faq/?faqID=35).
Then I had this exchange on Twitter, slightly edited for clarity:
A few observations. First, we are dealing with a (patient) press person. So, he or she is likely at the mercy of whoever answers the questions at USGS. That person, in turn, may either not know the correct answer (especially if the policy pertains to classified information) or, if there is a decision to remove certain national security-related events, may not feel that this information should be shared with either the press person responsible for the Twitter account or the general public. USGS is not a monolithic entity, so at least some of the confusion probably relates to how information is shared internally, then communicated externally.
Second, USGS is making an effort to be responsive. So, although I find the answers confusing and contradictory, and strongly disagree with both the policy as stated and what I suspect is the unstated policy, I do want to remain civil. In general, I believe in rewarding government agencies that engage in debate, especially in cases where I think the policy being defended is misguided and ultimately not in the best interests of the country.
I think Cliff Stoll, author of The Cuckoo’s Egg, would be quite proud of the sleuthing you did.
Ok, I just found something I don’t know.
There are comments about ongoing subcritical tests; I know about those.
The mention of one point test failures though raises the question…
We aren’t still doing those now, are we?
I assumed not, but …
No, I don’t think we are.
With respect to the seismic database enigmas, I would expect several networks to operate differing filtering methodologies.
Regarding the 1993 study (now 19 years since publication) overlaying public satellite imagery on publicly available, very distant seismic signal records: there is a perhaps more accessible report from 1995 by one of the authors of the 1993 paper:
http://www.princeton.edu/sgs/publications/sgs/pdf/5_2gupta.pdf
In the latter work, it is worth observing that, while the seismic records were timestamped with a considerable delay because of the distance from the site of the detonations, the events nevertheless appear to be quite far from a punctual on-the-hour schedule.
Further, it seems to me that even if there are many isolated databases, computer-based studies could use all of the archived measurements from those tests to improve understanding of signal propagation and geophysical characteristics. One illustration of the benefit of such diverse applications and inputs is a study cited in the linked 1995 work: a hydrology study that added to the information about the geologic strata near the site where that country conducted many such blasts.
I think you could easily miss a lot by using event time as a filtering criterion. Nuclear tests are very complicated endeavors, and stuff has glitches, which require delays.
I’m looking at my copy of “United States Nuclear Tests: July 1945 through September 1992,” DOE/NV–209-REV 15, December 2000, which lists, amongst lots of other wonky data, very specific times for each shot. Just to pick some examples, let’s look at the last test series in the document, Operation Julin, conducted at the Nevada Test Site from October 18, 1991 until September 23, 1992. Julin consisted of 10 shots, but three of them were conducted simultaneously in the same hole (the fact that is even possible AND results in usable data continues to amaze me).
Here are the detonation times:
Lubbock 10/18/1991 19:12:00.00
Bristol 11/26/1991 18:35:00.07
Junction 03/26/1992 16:30:00.00
Diamond Fortune 04/30/1992 16:30:00.00
Victoria 06/19/1992 16:45:00.00
Galena 06/23/1992 15:00:00.07
Hunters Trophy 09/18/1992 17:00:00.08
Divider 09/23/1992 15:04:00.00
My apologies if the formatting went to hell when this posted.
Anyway, as you can see, most, but not all, of the shots were on the quarter hour. That said, if your software were programmed to look only for shots that were on the quarter hour, it would have rejected 3 of the 8 actual events.
A better approach would be to search by geographical boundaries and depth criteria, and then use time to determine which of the resulting events are most likely to have been artificial.
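A minimal sketch of that geography-first approach, assuming the ComCat FDSN event web service that USGS operates (the endpoint, parameters and the rough NTS bounding box below are my assumptions, not anything from the original thread):

```python
import json
import urllib.parse
import urllib.request

# ComCat FDSN event service (assumed endpoint; see the USGS web service docs).
BASE = "https://earthquake.usgs.gov/fdsnws/event/1/query"

# Rough bounding box around the Nevada National Security Site (approximate).
params = {
    "format": "geojson",
    "starttime": "2012-07-01",
    "endtime": "2012-08-01",
    "minlatitude": 36.6, "maxlatitude": 37.4,
    "minlongitude": -116.6, "maxlongitude": -115.9,
    "maxdepth": 1.0,          # keep only very shallow events (depth in km)
}

with urllib.request.urlopen(BASE + "?" + urllib.parse.urlencode(params)) as resp:
    events = json.load(resp)["features"]

# Use origin time only as a secondary discriminator: a shallow event close to
# a round quarter-hour is the most likely candidate for a man-made shot.
for ev in events:
    t_ms = ev["properties"]["time"]            # origin time, epoch milliseconds
    minute = (t_ms // 60000) % 60
    seconds_past_minute = (t_ms % 60000) / 1000
    likely_artificial = (minute % 15 == 0) and seconds_past_minute < 1.0
    print(ev["id"], ev["properties"]["mag"], likely_artificial)
```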
@George, for whatever it’s worth, two of the three Galena devices were described as “safety experiments.” I really, really hope I live to see the day when the design and experimental data on nuclear weapons finally gets declassified.
Here’s another new one:
http://www.seismo.unr.edu/Events/main.php?evid=385421
7.2 kilometers deep. I think sometimes an earthquake is just an earthquake.
This would have been a much more fun exercise back when there was actual nuclear testing going on. oh well
When I first spotted it, the depth was 0.1 km.
Honest. Guess I should wait till the quake goes from red to yellow before I run with it.
That’s interesting. I wonder if the default is a 0.1 km depth that then updates later. That’s almost a more interesting question than why the original was deleted.
I’ve seen things written about using seismic monitoring stations to track supersonic aircraft by their sonic booms. I’m curious how that would show up in terms of “depth”.
Just spitballing here, but what if, in the guise of calibration shots for treaty monitoring purposes, they used the opportunity to shoehorn in some bunker-busting data collection? Their capabilities to sense events are much greater than in the Plowshare era, and it would help sharpen the edge of their modelling codes…