upload working examples
Chiang committed Sep 26, 2020
1 parent 04ea615 commit 0cdf2ac
Showing 734 changed files with 192,294 additions and 0 deletions.
@@ -0,0 +1,215 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Example: 2019-07-16 Earthquake near Byron, California\n",
"Prepared by Andrea Chiang, [email protected]\n",
"\n",
"USGS event information URL https://earthquake.usgs.gov/earthquakes/eventpage/nc73225421/executive\n",
"\n",
"In this tutorial we will:\n",
"* Download and process data.\n",
"* Calculate Green's functions.\n",
"* Calculate moment tensor using tdmtpy.\n",
"\n",
"Green's functions are computed using the software package Computer Programs in Seismology by Robert Herrmann (http://www.eas.slu.edu/eqc/eqccps.html).\n",
"\n",
"To run this tutorial you will need Python 3+ and the following packages:\n",
"* ObsPy\n",
"* Pandas\n",
"* matplotlib\n",
"* NumPy\n",
"* tdmtpy"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"lines_to_next_cell": 2
},
"outputs": [],
"source": [
"# Import third-party libraries\n",
"from pathlib import Path\n",
"from obspy.clients.fdsn import Client\n",
"from obspy import read_events, UTCDateTime\n",
"from obspy.clients.fdsn.mass_downloader import CircularDomain, Restrictions, MassDownloader"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The search parameters below are set to download only the earthquake of interest, and the resulting QuakeML file (quakes.xml) already exists.\n",
"\n",
"To download the event information yourself, change\n",
"```python\n",
"event_bool = True\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"event_bool = False\n",
"\n",
"if event_bool:\n",
" dataCenter=\"IRIS\"\n",
" client = Client(dataCenter)\n",
" starttime = UTCDateTime(\"2019-07-16T00:00:00\")\n",
" endtime = UTCDateTime(\"2019-07-16T23:59:59\")\n",
" catalog = client.get_events(starttime=starttime, endtime=endtime,\n",
" minmagnitude=4,maxmagnitude=5,\n",
" minlatitude=36, maxlatitude=38,\n",
" minlongitude=-122, maxlongitude=-120)\n",
"    catalog.write(\"quakes.xml\",format=\"QUAKEML\")\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Download data\n",
"We will download the waveforms and station metadata from the Northern California Earthquake Data Center (NCEDC) using ObsPy's mass_downloader function.\n",
"\n",
"The next cell will create a directory for each event and all files will be stored there. In addition to MSEED and STATIONXML files we will also write the event origin information to a text file. This text file will be stored in the current working directory."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"lines_to_end_of_cell_marker": 2,
"lines_to_next_cell": 0
},
"outputs": [],
"source": [
"dataCenter=\"NCEDC\" \n",
"\n",
"# Time before and after event origin for waveform segments\n",
"time_before = 60\n",
"time_after = 300\n",
"download_bool = True\n",
"\n",
"catalog = read_events(\"quakes.xml\")\n",
"#catalog.plot(method=\"cartopy\",projection=\"ortho\")\n",
"for event in catalog:\n",
"    evid = str(event.origins[0].resource_id).split(\"=\")[-1] # Use the origin resource id as the event id\n",
" outdir = evid\n",
" Path(outdir).mkdir(parents=True,exist_ok=True)\n",
" \n",
" # Event origin\n",
" origin_time = event.preferred_origin().time\n",
" starttime = origin_time - time_before\n",
" endtime = origin_time + time_after\n",
" \n",
" # Event location\n",
" evlo = event.preferred_origin().longitude\n",
" evla = event.preferred_origin().latitude\n",
" depth = event.preferred_origin().depth # in meters\n",
" \n",
" # Set the search area\n",
" domain = CircularDomain(latitude=evla, longitude=evlo, minradius=0.7, maxradius=1.3)\n",
" \n",
" # Set the search period and additional criteria\n",
" restrictions = Restrictions(starttime=starttime, endtime=endtime,\n",
" reject_channels_with_gaps=True,\n",
" minimum_length=0.95,\n",
" network=\"BK\",\n",
" channel_priorities=[\"BH[ZNE12]\", \"HH[ZNE12]\"],\n",
" sanitize=True)\n",
" \n",
" # Save catalog info to file\n",
" event_out = (\n",
" \"{evid:s},{origin:s},{jdate:s},\"\n",
" \"{lon:.4f},{lat:.4f},{depth:.4f},\"\n",
" \"{mag:.2f},{auth:s}\\n\"\n",
" ) \n",
"\n",
" if event.preferred_magnitude() is None:\n",
" mag = -999.\n",
" magtype = \"ml\"\n",
" else:\n",
" mag = event.preferred_magnitude().mag\n",
" magtype = event.preferred_magnitude().magnitude_type.lower()\n",
" if event.preferred_origin().extra.catalog.value is None:\n",
" auth = \"unknown\"\n",
" else:\n",
" auth = event.preferred_origin().extra.catalog.value.replace(\" \",\"\")\n",
" \n",
" event_out = event_out.format(\n",
" evid=evid,\n",
" origin=str(origin_time),\n",
"        jdate=\"%s%03d\"%(origin_time.year,origin_time.julday),\n",
" lon=evlo,\n",
" lat=evla,\n",
" depth=depth/1E3,\n",
" mag=mag,\n",
" auth=auth\n",
" )\n",
" \n",
" outfile = \"datetime.csv\"\n",
" with open(outfile,\"w\") as f:\n",
" f.write(\"evid,origin,jdate,lon,lat,depth,%s,auth\\n\"%magtype)\n",
" f.write(event_out)\n",
" \n",
"    # Download waveforms and metadata\n",
" if download_bool:\n",
" mseed_storage = \"%s/waveforms\"%outdir\n",
" stationxml_storage = \"%s/stations\"%outdir\n",
" mdl = MassDownloader(providers=[dataCenter])\n",
" mdl_helper = mdl.download(domain, restrictions,\n",
" mseed_storage=mseed_storage,stationxml_storage=stationxml_storage)\n",
" print(\"%s download completed\"%outdir)\n",
" \n",
" \n",
" print(\"%s is DONE.\"%outdir)"
]
},
{
"cell_type": "markdown",
"metadata": {
"lines_to_next_cell": 2
},
"source": [
"**Now that we have downloaded the raw data, the next step is to process it.**"
]
}
],
"metadata": {
"jupytext": {
"formats": "ipynb,py:light",
"text_representation": {
"extension": ".py",
"format_name": "light",
"format_version": "1.5",
"jupytext_version": "1.5.2"
}
},
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.5"
}
},
"nbformat": 4,
"nbformat_minor": 4
}
@@ -0,0 +1,155 @@
# ---
# jupyter:
# jupytext:
# formats: ipynb,py:light
# text_representation:
# extension: .py
# format_name: light
# format_version: '1.5'
# jupytext_version: 1.6.0
# kernelspec:
# display_name: Python 3
# language: python
# name: python3
# ---

# ### Example: 2019-07-16 Earthquake near Byron, California
# Prepared by Andrea Chiang, [email protected]
#
# USGS event information URL https://earthquake.usgs.gov/earthquakes/eventpage/nc73225421/executive
#
# In this tutorial we will:
# * Download and process data.
# * Calculate Green's functions.
# * Calculate moment tensor using tdmtpy.
#
# Green's functions are computed using the software package Computer Programs in Seismology by Robert Herrmann (http://www.eas.slu.edu/eqc/eqccps.html).
#
# To run this tutorial you will need Python 3+ and the following packages:
# * ObsPy
# * Pandas
# * matplotlib
# * NumPy
# * tdmtpy
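One way to set up the environment, assuming a pip-based install (conda users would pull the same packages from conda-forge instead; tdmtpy may need to be installed from its own source distribution if it is not on your package index):

```shell
pip install obspy pandas matplotlib numpy tdmtpy
```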

# Import third-party libraries
from pathlib import Path
from obspy.clients.fdsn import Client
from obspy import read_events, UTCDateTime
from obspy.clients.fdsn.mass_downloader import CircularDomain, Restrictions, MassDownloader


# The search parameters below are set to download only the earthquake of interest, and the resulting QuakeML file (quakes.xml) already exists.
#
# To download the event information yourself, change
# ```python
# event_bool = True
# ```

# +
event_bool = False

if event_bool:
dataCenter="IRIS"
client = Client(dataCenter)
starttime = UTCDateTime("2019-07-16T00:00:00")
endtime = UTCDateTime("2019-07-16T23:59:59")
catalog = client.get_events(starttime=starttime, endtime=endtime,
minmagnitude=4,maxmagnitude=5,
minlatitude=36, maxlatitude=38,
minlongitude=-122, maxlongitude=-120)
    catalog.write("quakes.xml",format="QUAKEML")

# -

# ### Download data
# We will download the waveforms and station metadata from the Northern California Earthquake Data Center (NCEDC) using ObsPy's mass_downloader function.
#
# The next cell will create a directory for each event and all files will be stored there. In addition to MSEED and STATIONXML files we will also write the event origin information to a text file. This text file will be stored in the current working directory.

# +
dataCenter="NCEDC"

# Time before and after event origin for waveform segments
time_before = 60
time_after = 300
download_bool = True

catalog = read_events("quakes.xml")
#catalog.plot(method="cartopy",projection="ortho")
for event in catalog:
    evid = str(event.origins[0].resource_id).split("=")[-1] # Use the origin resource id as the event id
outdir = evid
Path(outdir).mkdir(parents=True,exist_ok=True)

# Event origin
origin_time = event.preferred_origin().time
starttime = origin_time - time_before
endtime = origin_time + time_after

# Event location
evlo = event.preferred_origin().longitude
evla = event.preferred_origin().latitude
depth = event.preferred_origin().depth # in meters

# Set the search area
domain = CircularDomain(latitude=evla, longitude=evlo, minradius=0.7, maxradius=1.3)

# Set the search period and additional criteria
restrictions = Restrictions(starttime=starttime, endtime=endtime,
reject_channels_with_gaps=True,
minimum_length=0.95,
network="BK",
channel_priorities=["BH[ZNE12]", "HH[ZNE12]"],
sanitize=True)

# Save catalog info to file
event_out = (
"{evid:s},{origin:s},{jdate:s},"
"{lon:.4f},{lat:.4f},{depth:.4f},"
"{mag:.2f},{auth:s}\n"
)

if event.preferred_magnitude() is None:
mag = -999.
magtype = "ml"
else:
mag = event.preferred_magnitude().mag
magtype = event.preferred_magnitude().magnitude_type.lower()
if event.preferred_origin().extra.catalog.value is None:
auth = "unknown"
else:
auth = event.preferred_origin().extra.catalog.value.replace(" ","")

event_out = event_out.format(
evid=evid,
origin=str(origin_time),
        jdate="%s%03d"%(origin_time.year,origin_time.julday),
lon=evlo,
lat=evla,
depth=depth/1E3,
mag=mag,
auth=auth
)

outfile = "datetime.csv"
with open(outfile,"w") as f:
f.write("evid,origin,jdate,lon,lat,depth,%s,auth\n"%magtype)
f.write(event_out)

    # Download waveforms and metadata
if download_bool:
mseed_storage = "%s/waveforms"%outdir
stationxml_storage = "%s/stations"%outdir
mdl = MassDownloader(providers=[dataCenter])
mdl_helper = mdl.download(domain, restrictions,
mseed_storage=mseed_storage,stationxml_storage=stationxml_storage)
print("%s download completed"%outdir)


print("%s is DONE."%outdir)


# -
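The jdate field written above concatenates the year and day-of-year; for a general-purpose tag the day should be zero-padded to three digits (e.g. 5 January is day 005, not 5). A standard-library sketch of that formatting, with a hypothetical helper name:

```python
from datetime import datetime

def jdate(t: datetime) -> str:
    """Return a YYYYDDD tag with the day-of-year zero-padded to three digits."""
    return "%s%03d" % (t.year, t.timetuple().tm_yday)

print(jdate(datetime(2019, 7, 16)))  # 2019197 (day 197: the Byron event)
print(jdate(datetime(2019, 1, 5)))   # 2019005 (padding matters here)
```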
# **Now that we have downloaded the raw data, the next step is to process it.**
