This repository has been archived by the owner on Feb 2, 2022. It is now read-only.

V0.7.5dev #64

Merged
merged 45 commits
Apr 9, 2020
Changes from all commits
Commits
45 commits
a6890fa
begin dev for DVH Constraints: https://github.com/cutright/DVH-Analyt…
cutright Mar 14, 2020
e8374a7
DIRECTORIES dict and PROTOCOL_DIR added to dvha.paths
cutright Mar 14, 2020
fbd33ea
DIRECTORIES dict and PROTOCOL_DIR added to dvha.paths, copy protocols…
cutright Mar 15, 2020
1afc8f5
merge dvh.py and protocols.py so Constraint can use endpoint calc fun…
cutright Mar 15, 2020
bc9edd0
begin dev for protocol editor window (currently non-functional)
cutright Mar 15, 2020
3258948
suppress fuzzywuzzy's python-Levenshtein install warning
cutright Mar 15, 2020
3a1e82d
remove unnecessary About & PythonLibraries launch functions
cutright Mar 15, 2020
fef1b60
update protocol table with combobox changes
cutright Mar 15, 2020
d120598
ProtocolEditor delete functionality, add constraint dlg framework
cutright Mar 16, 2020
26ef51d
cosmetic tweaks
cutright Mar 16, 2020
8332c4c
added auto kwarg for set_column_widths to use wx.LIST_AUTOSIZE
cutright Mar 18, 2020
876c080
begin dev of showing TG263 spreadsheet on ROI Map
cutright Mar 18, 2020
35c973a
export plots to .svg. This branch requires selenium and phantomjs.
cutright Mar 19, 2020
6f919d5
Database Editor: Fix for https://github.com/cutright/DVH-Analytics/is…
cutright Mar 19, 2020
45e1012
Add filters for TG-263 table in ROI Map Editor, better auto column wi…
cutright Mar 21, 2020
4724e7b
Add protocol constraint progress
cutright Mar 22, 2020
7371315
Allow DVH from DICOM RT-Dose, rather than recalculating
cutright Mar 22, 2020
cd92c04
Dose summation is now optional
cutright Mar 22, 2020
6506523
Dicom dose sum is over a study, not a patient
cutright Mar 22, 2020
5569935
Fixed bug where DICOM timestamps failed to parse if fractional second…
cutright Mar 22, 2020
a8e2931
begin dev for ptv assignment per plan when disabling auto-summing of …
cutright Mar 24, 2020
604bab6
fix: if len of new data is different than previous, sorting may crash
cutright Mar 28, 2020
452b181
implement plan PTV assignments (multi-plan/single Structure, no dose …
cutright Apr 5, 2020
5d45900
SQLite import_time_stamp fix. And roi name/type over-rides not appli…
cutright Apr 6, 2020
ba9b88d
Merge pull request #63 from cutright/svg_test
cutright Apr 6, 2020
ac8920b
Simplify plot export menu. options not yet functional but SVG & HTML …
cutright Apr 6, 2020
8fbd89a
Export plot options functional for SVG only
cutright Apr 6, 2020
7b98dfd
If text field is empty, use current figure attr value, FileDialog ini…
cutright Apr 6, 2020
0272cce
Fix for issues when SoftwareVersions is stored not in RT Plan and is …
cutright Apr 8, 2020
2506319
dicompyler DicomParser may lead to a a deferred read error for some f…
cutright Apr 8, 2020
1d5d93c
store version in it's own file, so Options class doesn't have to be i…
cutright Apr 8, 2020
aaafe30
disable protocol menu, save for 0.7.6
cutright Apr 8, 2020
23fb827
figure export support for html with attribute edits
cutright Apr 8, 2020
cd77eb4
propagate options to plots after change, settings and export are now …
cutright Apr 8, 2020
68fda7e
only one track UserSettings/ExportFigure instances, better plot optio…
cutright Apr 9, 2020
9f3269e
plot options fixes, frame icons for MSW, sort tx_modalities
cutright Apr 9, 2020
5fc8535
bug fix for SQLite datetime comparision during import cancelation
cutright Apr 9, 2020
f33b71b
proper SQLite datetime logic for deleting partially imported plans, i…
cutright Apr 9, 2020
6dcff50
save figure export parameters to options
cutright Apr 9, 2020
c54f708
add export figure to tool bar
cutright Apr 9, 2020
d79e696
MSW toolbar separator is very thin, add multiple separators
cutright Apr 9, 2020
63c4ead
Export SVG for MVR and ML
cutright Apr 9, 2020
18ac0cc
dev for export settings to MVR and ML
cutright Apr 9, 2020
e89161b
export figure properties apply to MVR and ML if export figure window …
cutright Apr 9, 2020
8770a07
save correlation to svg, don't include with export figure since prope…
cutright Apr 9, 2020
16 changes: 16 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,21 @@
# Change log of DVH Analytics

v0.7.5 (2020.04.09)
--------------------
- [Import] Option to import DVH stored in DICOM RT-Dose rather than calculate it (a minimal sketch follows this
  changelog diff). NOTE: Currently the source of the DVH is not stored in the database.
- [Import] Fixed bug that caused datetime parsing to fail if DICOM timestamp had fractional seconds
(avoid the "unconverted data remains" error).
- [Import] Option to turn off automatic dose summation: [#57](https://github.com/cutright/DVH-Analytics/issues/57)
- [Import] Bug fix: [#59](https://github.com/cutright/DVH-Analytics/issues/59)
- [Import] SQLite import_time_stamp now actually includes time, and is in your local timezone
- [Import] SoftwareVersions and Deferred read error fix: [#36](https://github.com/cutright/DVH-Analytics/issues/36)
- [Import] Treatment modalities are sorted before importing so database doesn't have ModalityA,ModalityB and
ModalityB,ModalityA
- [Export] New figure export implementation in tool bar, supports SVG and HTML. Note that SVG export requires
  phantomJS to be installed, which is not a python library (a Bokeh export sketch follows the README diff below).
- [Plots] Plot options in User Settings no longer require DVHA to be restarted to be applied.

v0.7.4 (2020.03.13)
--------------------
- [Endpoints] Fixed a bug where endpoint short-hand would not display on MS Windows
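The first [Import] entry above (and the matching change to `get_dvh_row` in dvha/db/dicom_parser.py further down) prefers a DVH already embedded in the RT-Dose file and only recalculates when none is stored. A minimal, stand-alone sketch of that fallback pattern with pydicom and dicompyler-core is below — it is not DVHA's importer, and the file names and ROI number are placeholders.

```python
import pydicom
from dicompylercore import dvh as dicompyler_dvh, dvhcalc

# Hypothetical file paths and ROI number -- substitute your own data
rtdose = pydicom.read_file("rtdose.dcm", force=True)
rtstruct = pydicom.read_file("rtstruct.dcm", force=True)
roi_number = 1

dvh_obj = None
try:
    # Use the DVH Sequence already stored in the RT-Dose file, if present
    dvh_obj = dicompyler_dvh.DVH.from_dicom_dvh(rtdose, roi_number)
except AttributeError:
    # No stored DVH for this ROI in the RT-Dose file
    pass

if dvh_obj is None:
    # Fall back to recalculating the DVH from the contours and dose grid
    dvh_obj = dvhcalc.get_dvh(rtstruct, rtdose, roi_number)

print(dvh_obj.volume)  # points and empty ROIs yield volume == 0
```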
3 changes: 2 additions & 1 deletion README.md
@@ -71,7 +71,7 @@ Dependencies
---------
* [Python](https://www.python.org) >3.5
* [wxPython Phoenix](https://github.com/wxWidgets/Phoenix) >= 4.0.4
* [Pydicom](https://github.com/darcymason/pydicom) >=1.0
* [Pydicom](https://github.com/darcymason/pydicom) >=1.4.0
* [dicompyler-core](https://pypi.python.org/pypi/dicompyler-core) 0.5.3
* [Bokeh](http://bokeh.pydata.org/en/latest/index.html) >= 1.2.0, < 2.0.0
* [PostgreSQL](https://www.postgresql.org/) (optional) and [psycopg2](http://initd.org/psycopg/)
@@ -83,6 +83,7 @@ Dependencies
* [Scikit-learn](http://scikit-learn.org) >= 0.21.0
* [regressors](https://pypi.org/project/regressors/)
* [FuzzyWuzzy](https://github.com/seatgeek/fuzzywuzzy)
* [selenium](https://github.com/SeleniumHQ/selenium/)


Support
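The README diff above adds selenium because, as the [Export] changelog entry notes, the new figure export leans on Bokeh's SVG backend, which drives a headless browser (phantomJS). A rough sketch of that underlying Bokeh 1.x mechanism is below — it is not DVHA's ExportFigure code, and the toy figure and output filename are made up.

```python
from bokeh.plotting import figure
from bokeh.io import export_svgs

# Toy figure standing in for a DVHA plot
p = figure(plot_width=400, plot_height=300, title="demo")
p.line([1, 2, 3], [4, 6, 5])

p.output_backend = "svg"             # render with the SVG backend instead of canvas
export_svgs(p, filename="demo.svg")  # requires selenium and phantomJS to be installed
```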
2 changes: 2 additions & 0 deletions dvha/LICENSE.txt
@@ -77,6 +77,8 @@ The following were last accessed on June 16, 2019:
- https://www.iconfinder.com/icons/1218712/customers_group_team_user_user_group_icon
User_Yuppie_3_1218716.png
- https://www.iconfinder.com/icons/1218716/christmas_student_user_yuppie_icon
analysis_analytics_graph_icon.png (accessed on April 9, 2020)
- https://www.iconfinder.com/icons/4230522/analysis_analytics_graph_icon

------------------------------------------------------------------------------------
------------------------------------------------------------------------------------
1 change: 1 addition & 0 deletions dvha/_version.py
@@ -0,0 +1 @@
__version__ = '0.7.5'
92 changes: 66 additions & 26 deletions dvha/db/dicom_parser.py
@@ -8,14 +8,15 @@
# See the file LICENSE included with this distribution, also
# available at https://github.com/cutright/DVH-Analytics

from dicompylercore import dvhcalc
from dicompylercore import dvhcalc, dvh as dicompyler_dvh
from dicompylercore.dicomparser import DicomParser as dicompylerParser
from datetime import datetime
from dateutil.relativedelta import relativedelta # python-dateutil
from dateutil.parser import parse as date_parser
import numpy as np
from os.path import basename, join
from pubsub import pub
import pydicom
from dvha.options import Options
from dvha.tools.roi_name_manager import clean_name, DatabaseROIs
from dvha.tools.utilities import change_angle_origin, calc_stats, is_date, validate_transfer_syntax_uid
@@ -30,7 +31,7 @@ class DICOM_Parser:
Parse a set of DICOM files for database
"""
def __init__(self, plan_file=None, structure_file=None, dose_file=None, dose_sum_file=None, plan_over_rides=None,
global_plan_over_rides=None, roi_over_ride=None, roi_map=None):
global_plan_over_rides=None, roi_over_ride=None, roi_map=None, use_dicom_dvh=False, plan_ptvs=None):
"""
:param plan_file: absolute path of DICOM RT Plan file
:type plan_file: str
@@ -48,6 +49,10 @@ def __init__(self, plan_file=None, structure_file=None, dose_file=None, dose_sum
:type roi_over_ride: dict
:param roi_map: roi name map
:type roi_map: DatabaseROIs
:param use_dicom_dvh: use the DVH stored in DICOM RT-Dose if it exists
:type use_dicom_dvh: bool
:param plan_ptvs: assign specific PTVs for distance calculations
:type plan_ptvs: list
"""

self.database_rois = DatabaseROIs() if roi_map is None else roi_map
@@ -58,6 +63,8 @@ def __init__(self, plan_file=None, structure_file=None, dose_file=None, dose_sum
self.dose_file = dose_file
self.dose_sum_file = dose_sum_file

self.use_dicom_dvh = use_dicom_dvh

# store these values when clearing file loaded data
self.stored_values = {}

@@ -75,9 +82,12 @@ def __init__(self, plan_file=None, structure_file=None, dose_file=None, dose_sum
self.structure_name_and_type = self.get_structure_name_and_type(self.rt_data['structure'])
if self.dose_file:
if self.dose_sum_file is None:
self.rt_data['dose'] = dicompylerParser(self.dose_file).ds
# self.rt_data['dose'] = dicompylerParser(self.dose_file).ds
# The above line may lead to OSError: Deferred read -- original filename not stored. Cannot re-open
# switching back to pydicom.read_file()
self.rt_data['dose'] = pydicom.read_file(self.dose_file, force=True)
else:
self.rt_data['dose'] = dicompylerParser(self.dose_sum_file).ds
self.rt_data['dose'] = pydicom.read_file(self.dose_sum_file, force=True)

# These properties are not inherently stored in Pinnacle DICOM files, but can be extracted from dummy ROI
# names automatically generated by the Pinnacle Script provided by DVH Analytics
@@ -92,10 +102,15 @@ def __init__(self, plan_file=None, structure_file=None, dose_file=None, dose_sum
self.global_plan_over_rides = global_plan_over_rides

self.roi_over_ride = roi_over_ride if roi_over_ride is not None else {'name': {}, 'type': {}}
for over_ride_type, over_ride in self.roi_over_ride.items():
for key, value in over_ride.items():
self.structure_name_and_type[key][over_ride_type] = value

self.plan_ptvs = plan_ptvs

def update_stored_values(self):
keys = ['study_instance_uid_to_be_imported', 'patient_name', 'mrn', 'sim_study_date', 'birth_date',
'rx_dose', 'ptv_names', 'physician', 'ptv_exists', 'tx_site']
'rx_dose', 'ptv_names', 'physician', 'ptv_exists', 'tx_site', 'patient_orientation']
self.stored_values = {key: getattr(self, key) for key in keys}

def __initialize_rx_beam_and_ref_beam_data(self):
@@ -388,16 +403,24 @@ def get_dvh_row(self, dvh_index):
:rtype: dict
"""

try:
dvh = dvhcalc.get_dvh(self.rt_data['structure'], self.rt_data['dose'], dvh_index,
callback=self.send_dvh_progress)
except AttributeError:
dose = validate_transfer_syntax_uid(self.rt_data['dose'])
structure = validate_transfer_syntax_uid(self.rt_data['structure'])
dvh = dvhcalc.get_dvh(structure, dose, dvh_index,
callback=self.send_dvh_progress)

if dvh.volume > 0: # ignore points and empty ROIs
dvh = None
if self.use_dicom_dvh:
try:
dvh = dicompyler_dvh.DVH.from_dicom_dvh(self.rt_data['dose'], dvh_index)
except AttributeError: # dicompyler-core raises this if the structure is not found in the DICOM DVH
pass

if dvh is None:
try:
dvh = dvhcalc.get_dvh(self.rt_data['structure'], self.rt_data['dose'], dvh_index,
callback=self.send_dvh_progress)
except AttributeError:
dose = validate_transfer_syntax_uid(self.rt_data['dose'])
structure = validate_transfer_syntax_uid(self.rt_data['structure'])
dvh = dvhcalc.get_dvh(structure, dose, dvh_index,
callback=self.send_dvh_progress)

if dvh and dvh.volume > 0: # ignore points and empty ROIs
geometries = self.get_dvh_geometries(dvh_index)

return {'mrn': [self.mrn, 'text'],
@@ -497,7 +520,8 @@ def tx_modality(self):
for fx_grp_index, beam_parser_list in self.beam_data.items():
for beam_parser in beam_parser_list:
tx_modalities.append(beam_parser.tx_modality)
return ','.join(list(set(tx_modalities)))
tx_modalities = sorted(list(set(tx_modalities)))
return ','.join(tx_modalities)

@property
def rx_dose(self):
@@ -598,11 +622,11 @@ def fx_grp_count(self):

@property
def patient_orientation(self):
# TODO: database assumes only one orientation (i.e., three characters)
seq = self.get_attribute('plan', 'PatientSetupSequence')
if seq is not None:
return ','.join([setup.PatientPosition for setup in seq])
else:
return 'None'
# return ','.join([setup.PatientPosition for setup in seq])
return str(seq[0].PatientPosition)[:3]

@property
def plan_time_stamp(self):
@@ -626,7 +650,14 @@ def tps_software_name(self):

@property
def tps_software_version(self):
return ','.join(self.get_attribute('plan', 'SoftwareVersions'))
# Some TPSs may store the version in RT Dose rather than plan
# SoftwareVersions may also be stored as a string rather than a list
for rt_type in ['plan', 'dose', 'structure']: # Check each rt_type until SoftwareVersions is found
version = self.get_attribute(rt_type, 'SoftwareVersions')
if version is not None:
if isinstance(version, str):
version = [version]
return ','.join(version)

@property
def tx_site(self):
@@ -677,7 +708,7 @@ def radiation_type(self):
'ELECTRONS': self.electron,
'PROTONS': self.proton,
'BRACHY': self.brachy_type}
types = [rad_type for rad_type, rad_value in rad_types.items() if rad_value]
types = sorted([rad_type for rad_type, rad_value in rad_types.items() if rad_value])
return ','.join(types)

@property
@@ -742,7 +773,6 @@ def reset_roi_type_over_ride(self, key):

def get_roi_name(self, key):
"""
Applies clean_name from roi_name_manager.py to the name in the DICOM file
:param key: the index of the roi
:return: roi name to be used in the database
:rtype: str
@@ -906,6 +936,7 @@ def get_time_stamp(self, rt_type):

try:
if time:
time = time.split('.')[0] # ignore fractional sec
return datetime.strptime(datetime_str + time, '%Y%m%d%H%M%S')
else:
return datetime.strptime(datetime_str, '%Y%m%d')
@@ -1386,6 +1417,9 @@ def __init__(self, file_set, stored_values, dicompyler_rt_structures, roi_map=No

self.roi_over_ride = {'name': {}, 'type': {}}

# If importing with auto sum turned off, use this list to track plan-specific PTVs
self.plan_ptvs = []

@property
def mrn(self):
if self.plan_over_rides['mrn'] is not None:
Expand All @@ -1394,7 +1428,7 @@ def mrn(self):

@property
def study_instance_uid(self):
return self.stored_values['study_instance_uid']
return self.stored_values['study_instance_uid_to_be_imported']

@property
def study_instance_uid_to_be_imported(self):
@@ -1466,6 +1500,10 @@ def tx_site(self):
ans = self.stored_values['tx_site']
return self.process_global_over_ride('tx_site', ans)

@property
def patient_orientation(self):
return self.stored_values['patient_orientation']

def process_global_over_ride(self, key, pre_over_ride_value):
if self.global_plan_over_rides:
over_ride = self.global_plan_over_rides[key]
@@ -1483,8 +1521,10 @@ def get_roi_type(self, key):
:rtype: str
"""
if key in list(self.roi_over_ride['type']):
return self.roi_over_ride['type'][key]
return self.dicompyler_rt_structures[key]['type'].upper()
ans = self.roi_over_ride['type'][key]
else:
ans = self.dicompyler_rt_structures[key]['type'].upper()
return ans if ans else 'NONE'

def reset_roi_type_over_ride(self, key):
self.roi_over_ride['type'][key] = None
@@ -1569,7 +1609,7 @@ def ptv_names(self):

@property
def init_param(self):
params = ['plan_file', 'structure_file', 'dose_file',
params = ['plan_file', 'structure_file', 'dose_file', 'plan_ptvs',
'plan_over_rides', 'global_plan_over_rides', 'roi_over_ride']
return {key: getattr(self, key) for key in params}

34 changes: 22 additions & 12 deletions dvha/db/sql_connector.py
@@ -168,13 +168,12 @@ def now(self):
:return: The current time as seen by the SQL database
:rtype: datetime
"""

return self.query_generic("SELECT %s" % self.sql_cmd_now)[0][0]

@property
def sql_cmd_now(self):
if self.db_type == 'sqlite':
sql_cmd = "date('now')"
sql_cmd = "strftime('%Y-%m-%d %H:%M:%f', 'now', 'localtime')"
else:
sql_cmd = "NOW()"
return sql_cmd
@@ -217,7 +216,12 @@ def update(self, table_name, column, value, condition_str):
raise SQLError(str(e), update)

def is_study_instance_uid_in_table(self, table_name, study_instance_uid):
return self.is_value_in_table(table_name, study_instance_uid, 'study_instance_uid')
# As of DVH v0.7.5, study_instance_uid may end with _N where N is the nth plan of a file set
query = "SELECT DISTINCT study_instance_uid FROM %s WHERE study_instance_uid LIKE '%s%%';" % \
(table_name, study_instance_uid)
self.cursor.execute(query)
results = self.cursor.fetchall()
return bool(results)

def is_mrn_in_table(self, table_name, mrn):
return self.is_value_in_table(table_name, mrn, 'mrn')
@@ -366,9 +370,7 @@ def ignore_dvh(self, variation, study_instance_uid, unignore=False):
"roi_name = '%s' and study_instance_uid = '%s'" % (variation, study_instance_uid))

def drop_tables(self):
"""
Delete all tables in the database if they exist
"""
"""Delete all tables in the database if they exist"""
for table in self.tables:
self.cursor.execute("DROP TABLE IF EXISTS %s;" % table)
self.cnx.commit()
@@ -383,19 +385,27 @@ def drop_table(self, table):
self.cnx.commit()

def initialize_database(self):
"""
Ensure that all of the latest SQL columns exist in the user's database
"""
"""Ensure that all of the latest SQL columns exist in the user's database"""
create_tables_file = [CREATE_PGSQL_TABLES, CREATE_SQLITE_TABLES][self.db_type == 'sqlite']
self.execute_file(create_tables_file)

def reinitialize_database(self):
"""
Delete all data and create all tables with latest columns
"""
"""Delete all data and create all tables with latest columns"""
self.drop_tables()
self.vacuum()
self.initialize_database()

def vacuum(self):
"""Call to reclaim space in the database"""
if self.db_type == 'sqlite':
self.cnx.isolation_level = None
self.cnx.execute('VACUUM')
self.cnx.isolation_level = ''
else:
# TODO: PGSQL VACUUM needs testing
# self.cnx.execute('VACUUM')
pass

def does_db_exist(self):
"""
Check if database exists
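The sql_cmd_now change in this file swaps SQLite's date('now') for a strftime() call so that import_time_stamp keeps the clock time (with fractional seconds) in the local timezone, matching the changelog note. A quick stand-alone check with the standard-library sqlite3 module (in-memory database, purely illustrative):

```python
import sqlite3

cnx = sqlite3.connect(":memory:")
old_style = cnx.execute("SELECT date('now')").fetchone()[0]
new_style = cnx.execute(
    "SELECT strftime('%Y-%m-%d %H:%M:%f', 'now', 'localtime')"
).fetchone()[0]
print(old_style)  # e.g. '2020-04-09'                 -- date only, UTC
print(new_style)  # e.g. '2020-04-09 14:32:07.123'    -- local time with fractional seconds
cnx.close()
```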
9 changes: 6 additions & 3 deletions dvha/db/update.py
@@ -347,12 +347,15 @@ def update_ptv_data(tv, study_instance_uid):
cnx.update('Plans', key, value, "study_instance_uid = '%s'" % study_instance_uid)


def get_total_treatment_volume_of_study(study_instance_uid):
def get_total_treatment_volume_of_study(study_instance_uid, ptvs=None):
"""
Calculate combined PTV for the provided study_instance_uid
"""
ptv_coordinates_strings = query('dvhs', 'roi_coord_string',
"study_instance_uid = '%s' and roi_type like 'PTV%%'" % study_instance_uid)

condition = "study_instance_uid = '%s' and roi_type like 'PTV%%'" % study_instance_uid
if ptvs:
condition += " and roi_name in ('%s')" % "','".join(ptvs)
ptv_coordinates_strings = query('dvhs', 'roi_coord_string', condition)

ptvs = [roi_form.get_planes_from_string(ptv[0]) for ptv in ptv_coordinates_strings]
