hair-dye

The architecture was proposed by Alex Levinshtein, Cheng Chang, et al. in 'Real-time deep hair matting on mobile devices'.
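The paper describes a lightweight encoder-decoder: a MobileNet-style encoder built from depthwise-separable convolutions and a small decoder with skip connections that outputs a two-class hair/background mask. A minimal sketch of that idea in PyTorch (layer counts, channel sizes, and the class name `TinyHairMatteNet` are illustrative, not this repo's exact definitions):

```python
import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """Depthwise 3x3 conv followed by a pointwise 1x1 conv, as in MobileNet."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, in_ch, 3, stride, 1, groups=in_ch, bias=False),
            nn.BatchNorm2d(in_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_ch, out_ch, 1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)

class TinyHairMatteNet(nn.Module):
    """Illustrative encoder-decoder: downsample with separable convs,
    upsample back to input resolution, predict a 2-class hair mask."""
    def __init__(self):
        super().__init__()
        self.enc1 = DepthwiseSeparableConv(3, 32, stride=2)    # 1/2 resolution
        self.enc2 = DepthwiseSeparableConv(32, 64, stride=2)   # 1/4
        self.enc3 = DepthwiseSeparableConv(64, 128, stride=2)  # 1/8
        self.dec2 = DepthwiseSeparableConv(128, 64)
        self.dec1 = DepthwiseSeparableConv(64, 32)
        self.head = nn.Conv2d(32, 2, 1)  # hair / background logits
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(e1)
        e3 = self.enc3(e2)
        d2 = self.dec2(self.up(e3)) + e2   # skip connection from the encoder
        d1 = self.dec1(self.up(d2)) + e1   # skip connection from the encoder
        return self.head(self.up(d1))      # logits at input resolution

mask_logits = TinyHairMatteNet()(torch.randn(1, 3, 224, 224))  # -> (1, 2, 224, 224)
```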

Create environment

$ conda env create -f environment.yml

Activate environment

$ source activate hairdye

Deactivate the environment:

$ source deactivate

Download dataset

$ sh download.sh

Train

$ nohup python -u main.py --mode=train > out.log &

The checkpoint and sample images are saved in src/checkpoint/default/ by default.
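A rough sketch of what the train mode does, assuming a segmentation model like the sketch above and a hypothetical `loader` yielding (image, mask) batches; the repo's actual dataset handling, loss weighting, and hyperparameters live in main.py. The paper also adds a mask-image gradient consistency term on top of the per-pixel loss, which this sketch omits for brevity:

```python
import os
import torch
import torch.nn as nn

def train(model, loader, epochs=10, ckpt_dir="src/checkpoint/default"):
    os.makedirs(ckpt_dir, exist_ok=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    criterion = nn.CrossEntropyLoss()  # per-pixel hair/background loss

    for epoch in range(epochs):
        for image, mask in loader:     # mask: LongTensor (N, H, W) with values {0, 1}
            logits = model(image)      # (N, 2, H, W)
            loss = criterion(logits, mask)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
        # save a checkpoint each epoch, mirroring the default checkpoint directory
        torch.save(model.state_dict(), os.path.join(ckpt_dir, f"epoch_{epoch}.pt"))
```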

Test

$ python main.py --mode=test

Run

Plot a ground-truth image, the predicted segmentation, and the hue-adjusted result for images from the datasets or for any specified image.

$ python main.py --mode=run --set=test --num=4
$ python main.py --mode=run --image=./path/to/the/image.png

set can be one of train or test; the default is test.

num is the number of images randomly sampled from the set; the default is 4.
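The hue-adjusted result recolors only the pixels the network labels as hair. A rough sketch of that idea with OpenCV (the function name `dye_hair` and the blending details are illustrative; the repo's actual recoloring code may differ):

```python
import cv2
import numpy as np

def dye_hair(image_bgr, hair_mask, target_hue=150):
    """Shift the hue of hair pixels only; hair_mask is a (H, W) float array in [0, 1]."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    # overwrite the hue channel wherever the mask says "hair"
    hsv[..., 0] = np.where(hair_mask > 0.5, target_hue, hsv[..., 0])
    recolored = cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)
    # blend with the soft mask so hair edges stay smooth
    alpha = hair_mask[..., None]
    return (alpha * recolored + (1 - alpha) * image_bgr).astype(np.uint8)
```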

Convert the PyTorch model to a TensorFlow model using ONNX

See the notebook
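The conversion goes roughly along these lines: export the PyTorch model to ONNX, then translate the ONNX graph with the onnx-tf backend. The model stand-in, file names, and input shape below are assumptions; see the notebook for the exact steps and package versions:

```python
import torch
import onnx
from onnx_tf.backend import prepare

# stand-in model; in practice, build the repo's network and load the trained
# weights from src/checkpoint/default/ before exporting
model = torch.nn.Conv2d(3, 2, 1)
model.eval()

# 1. PyTorch -> ONNX
dummy = torch.randn(1, 3, 224, 224)
torch.onnx.export(model, dummy, "hairdye.onnx",
                  input_names=["image"], output_names=["mask"])

# 2. ONNX -> TensorFlow graph
onnx_model = onnx.load("hairdye.onnx")
tf_rep = prepare(onnx_model)
tf_rep.export_graph("hairdye.pb")
```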

Deploy the model to an Android application

See our other repo
