- Add workflow badge to README
- Update auxiliary models
- Remove spacing between bottom of func comments and 1st line of func.
- Change globals module to _globals in psp dir.
- Append each job's results to the single CSV file, including model and GCP parameters.
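A minimal sketch of the append step, using only the stdlib `csv` module; the column names and function name here are assumptions, not the project's actual schema:

```python
import csv
import os

def append_job_results(csv_path, row, fieldnames):
    """Append one job's results row to a shared CSV, writing the header once.
    (Sketch only - the column names are hypothetical, not the real schema.)"""
    write_header = not os.path.isfile(csv_path)
    with open(csv_path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        if write_header:
            writer.writeheader()
        writer.writerow(row)

# Illustrative fields mixing model and GCP parameters into one row
fields = ["job_name", "model", "gcp_region", "accuracy"]
```

Each job calls `append_job_results` with its own row, so all runs accumulate in one file.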
- Remove URL and filenames for datasets from globals.
- Visualise GCP code pipeline
- Create model version on AI Platform
- Add APIs required for GCP part
- Add roles required on GCP.
- Create a front-end React app that receives each finished job's results and visualises them in the web UI.
- Update notification func to send an update when a job fails and parse the failure reason.
- Add Releases
- Update function comments
- In workflow, test code pipeline by running dummy model and checking resultant files etc.
- Continue Hyperparameter tuning of model
- Add https://drive.google.com/drive/folders/1404cRlQmMuYWPWp5KwDtA7BPMpl-vF-d to Data Section
- Fix READMEs
- Check Latest Travis Build
- Add AUC() metric class to models
- Change colour of box in boxplot
- Fix boxplots - clarify what they represent etc.
- Model Tests
- Tests for inputting data for prediction - fasta, txt, pdb tests, add data folder in tests folder
- Add learning rate scheduler
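One candidate is a step-decay schedule; the function below is plain Python (the initial LR and drop interval are placeholder values), written with the `(epoch, lr)` signature that `tf.keras.callbacks.LearningRateScheduler` expects:

```python
INITIAL_LR = 0.0015  # placeholder starting rate, not the project's actual value

def step_decay(epoch, lr=None):
    """Step-decay schedule: halve INITIAL_LR every 10 epochs.
    Could be wired in as tf.keras.callbacks.LearningRateScheduler(step_decay)."""
    return INITIAL_LR * (0.5 ** (epoch // 10))
```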
- Add labels to readme
- Add CI Github workflows
- Add CI Testing - https://docs.github.com/en/free-pro-team@latest/actions/guides/building-and-testing-python#introduction
- Add AUC, FP and FN to output metrics
- Coveralls - https://coveralls.io/
- Review one-hot encoding process
- Review necessity of all_data variable
- Reach out to ICML people and find out how they developed their data
- H/w requirements in readme
- Look into pytest
- CodeCov - Code Coverage
- Python Version Badge - https://shields.io/category/platform-support
- Last Modified Badge - https://shields.io/category/activity
- LinkedIn Badge
- GCP Badge - https://img.shields.io/badge/Google_Cloud-4285F4?style=for-the-badge&logo=google-cloud&logoColor=white
- Python Logo Badge
- Visualise Keras model - https://www.machinecurve.com/index.php/2019/10/07/how-to-visualize-a-model-with-keras/
- Re do model tests
- Remove TensorBoard stuff from model and only keep in training file
- Keras JSON Parser
- Check variable and layer names for models
- Remove GCP config script
- Add workflow tests for psp_gcp that install the gcloud SDK and run a few commands to check it is working correctly
- Remove show plots parameter (unnecessary)
- Add help to argparse etc
- Full Stops in func comments.
- Add allData var back into data func
- Fix importlib model imports for auxiliary models
- Fix output file structure diagram to include logs and checkpoints folders.
- Add parameter descriptions for LR schedulers in utils.py
- Echo some of model parameters of config file in gcp_training job
- Add func to notification func that emails the job status if it fails, including the reason for failing.
- Parse JSON arch utility function
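A rough shape for that utility with the stdlib `json` module; the `"name"`/`"layers"` keys are illustrative, since the real architecture schema isn't pinned down here:

```python
import json

def parse_model_arch(json_str):
    """Parse a model-architecture JSON string into (model name, layer list).
    The "name"/"layers" keys are assumptions about the config schema."""
    arch = json.loads(json_str)
    return arch["name"], arch.get("layers", [])

example = '{"name": "dummy_model", "layers": [{"type": "Dense", "units": 64}]}'
name, layers = parse_model_arch(example)
```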
- Fix gcp hpconfig file
- Look into training on TPU (https://www.tensorflow.org/guide/tpu)
- Change staging bucket to bucket in config
- Remove hard-coded GCP params in config and inject env vars using jq (do this for local psp version as well)
- Change color of output in training script
- Look at output suggestions from bandit and make any changes accordingly.
- Look at output suggestions from flake8 and make any changes accordingly.
- Add virtual env to workflow (add to readme)
- Change gcp_notification_func to import secret values from secrets.sh
- Get job status script
- Move model layer params to model params
- A method to create a json config file??
- Indent optimizer in JSON to include metaparameters; check whether these meta values are set and pass them into the optimizer function.
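A sketch of that flow, assuming a nested `"metaparameters"` object in the config (the key names are guesses at the schema): only the values that are actually set get forwarded via `**kwargs`.

```python
# Hypothetical indented optimizer section of a config file
config = {
    "optimizer": {
        "name": "adam",
        "metaparameters": {"learning_rate": 0.001, "beta_1": 0.9, "epsilon": None}
    }
}

def build_optimizer_kwargs(opt_config):
    """Keep only metaparameters that are actually set (non-None)."""
    meta = opt_config.get("metaparameters", {})
    return {k: v for k, v in meta.items() if v is not None}

kwargs = build_optimizer_kwargs(config["optimizer"])
# tf.keras.optimizers.Adam(**kwargs) would then receive only the set values
```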
- Input parameter of training script that decides whether to train locally or to GCP.
- Optimizer tests
- Re-do config files such that each layer has its individual parameters indented, then pass in via **kwargs...
- Change main.py to just pass in model-parameters
- Check to see all config jsons open without error.
- Upload config file used in model in model folder.
- Tests_gcp
- Update build and build status to point to same dir
- Change filtered to "True" to 1 in configs
- https://github.com/icemansina/IJCAI2016/blob/master/Train_validation_test_release.ipynb
- unittest.skip on request URL tests in test_dataset.py
- Append config file to results output file
- Fix try except in load_dataset.py
- Output results don't seem to be working; model logs and metadata not exporting to CSV
- Remove append_model_output func in utils.
- Make dummy model simpler
- Create data dir in psp_gcp
- in psp_gcp, ensure local training stored in output folder.
- Change (Keras Model) to Keras.model
- Change type=5926/6133 to a str, rather than int
- Add RMSE metric
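In Keras this would most likely be `tf.keras.metrics.RootMeanSquaredError`; as a plain-Python reference for what the metric computes:

```python
import math

def rmse(y_true, y_pred):
    """Root-mean-squared error over two equal-length sequences.
    Plain-Python sketch; the project would likely use
    tf.keras.metrics.RootMeanSquaredError instead."""
    if len(y_true) != len(y_pred):
        raise ValueError("y_true and y_pred must be the same length")
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```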
- Add save dir to dataset classes
- If gcp_project != project then update project
- Change output_data to output folder
- Move to new bucket
- Change structure of network outputs/inputs as https://github.com/wentaozhu/protein-cascade-cnn-lstm/blob/master/cb6133.py
- Change to TimeDistributed dense??
- Change all "None" in configs to null
- Remove name from batch_norm parameter
- Split up model tests into their own TestCase classes.
- Change "model_" to "model" in dummy json
- Change Dense_layer1 -> dense_1 in configs
- Fix order of recurrent layers in auxiliary models.
- self.assertEqual(model._name, "model_name")
- Change testLabel -> test_labels in evaluate.py
- Rename casp10_test_hot to just test_hot
- Add self to class instance arguments in comments; self represents the instance of the class.
- Add TF unit tests
- Evaluate.py - raise ValueError if y_true.shape != y_pred.shape.
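The guard itself is small; a sketch assuming both inputs expose a `.shape` attribute (numpy array or tensor), with a hypothetical helper name:

```python
def check_shapes(y_true, y_pred):
    """Guard for evaluate.py: refuse mismatched prediction shapes.
    Works on anything with a .shape attribute (numpy array, tensor)."""
    if y_true.shape != y_pred.shape:
        raise ValueError(
            f"Shape mismatch: y_true {y_true.shape} vs y_pred {y_pred.shape}")
```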
- Add RMSE to plot_history func
- Try completely removing repeated modules and packages from psp to psp_gcp directories by using the psp dir for the psp_gcp ones as well.
- Add psp to sys.path so can import from psp_gcp
- Reset gcp_parameters in config back to ""
- Update paths for casp10/11 downloads from repo.
- Add LR scheduler to config parameters.
- Add input parameter for what callbacks to use.
- change params = json.load(f) to params = json.load(f)[0]["parameters"]
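The change reflects the config JSON being a list whose first element holds a `"parameters"` object, so a bare `json.load(f)` returns the wrong structure (the sample content below is made up):

```python
import json
import io

# Hypothetical config layout: a one-element list wrapping "parameters"
raw = '[{"parameters": {"epochs": 5, "batch_size": 120}}]'
f = io.StringIO(raw)
params = json.load(f)[0]["parameters"]   # instead of: params = json.load(f)
```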
- Add output folder name to output_results.csv
- Change some column names in results file
- Change where logs are stored in bucket - should be stored in output folder maybe
- Fix: results file has 2 columns of CASP10 precision
- Add comments to workflow
- Add reduceLROnPlateau to each config
- Add references to functions
- Add repr function to classes
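A minimal pattern for this, on a made-up class (the field names are illustrative, not from the codebase):

```python
class Dataset:
    """Hypothetical class showing the __repr__ convention to adopt."""

    def __init__(self, name, size):
        self.name = name
        self.size = size

    def __repr__(self):
        # Unambiguous, eval-style representation of the instance
        return f"{self.__class__.__name__}(name={self.name!r}, size={self.size})"

d = Dataset("cb513", 514)
```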
- Change dataset_size func to size
- Change model tests to open up each models config and cross-reference with the config values.
- Change load_dataset to dataset.py
- Change setUp to setUpClass(cls) + @classmethod
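For reference, `setUpClass` receives `cls` (not `self`) and runs once per class, so expensive setup is shared across tests; sketched on a dummy test case:

```python
import unittest

class ModelTests(unittest.TestCase):
    """setUp runs per test; setUpClass runs once per class and takes cls."""

    @classmethod
    def setUpClass(cls):
        cls.shared_config = {"epochs": 1}  # expensive setup done once

    def test_config_loaded(self):
        self.assertEqual(self.shared_config["epochs"], 1)

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(ModelTests))
```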
- Add API enable commands to psp_gcp gcloud services enable appengine.googleapis.com
- https://www.tensorflow.org/tutorials/keras/save_and_load#savedmodel_format
- Add emojis to readme
- Add banner image to readme