
Update to tensorflow 1.4 #20

Open
chinmaygghag opened this issue Nov 10, 2017 · 5 comments

Comments

@chinmaygghag

Can this code be updated for TensorFlow 1.4?

@mkj676

mkj676 commented Feb 2, 2018

Yes, you need to change these sections:

  1. tf.concat (ops.py)
  2. tf.sigmoid_~~~~ (Face-Aging.py)
  3. In the TensorFlow library, in variable_scope.py, change get_variable()'s parameter from reuse=None to reuse=tf.AUTO_REUSE
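For step 1, the change is most likely the tf.concat argument order: TensorFlow 1.0 swapped the old tf.concat(axis, values) signature to tf.concat(values, axis), so pre-1.0 calls in ops.py fail under 1.4. A minimal sketch (the tensors here are illustrative, not from this repo):

```python
import tensorflow as tf

a = tf.constant([[1, 2]])
b = tf.constant([[3, 4]])

# Pre-1.0 code called tf.concat(0, [a, b]); since TF 1.0 the
# list of values comes first and the axis second:
c = tf.concat([a, b], axis=0)  # static shape (2, 2)
```

Passing the arguments in the old order under TF 1.4 raises a TypeError, which is usually the first failure you hit when porting old code.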

@AidasK

AidasK commented Apr 3, 2018

@mkj676 I'm stuck on the 3rd step. How do I set reuse=tf.AUTO_REUSE?
My current error is "ValueError: Variable E_conv0/w/Adam/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=tf.AUTO_REUSE in VarScope?"

@AidasK

AidasK commented Apr 3, 2018

I managed to make it work by adding
with tf.variable_scope('', reuse=tf.AUTO_REUSE): on line 222 of Face-Aging.py,
so it looks like this:

with tf.variable_scope('', reuse=tf.AUTO_REUSE):
    # optimizer for encoder + generator
    self.EG_optimizer = tf.train.AdamOptimizer(
        learning_rate=EG_learning_rate,
        beta1=beta1
    ).minimize(
        loss=self.loss_EG,
        global_step=self.EG_global_step,
        var_list=self.E_variables + self.G_variables
    )

    # optimizer for discriminator on z
    self.D_z_optimizer = tf.train.AdamOptimizer(
        learning_rate=EG_learning_rate,
        beta1=beta1
    ).minimize(
        loss=self.loss_Dz,
        var_list=self.D_z_variables
    )

    # optimizer for discriminator on image
    self.D_img_optimizer = tf.train.AdamOptimizer(
        learning_rate=EG_learning_rate,
        beta1=beta1
    ).minimize(
        loss=self.loss_Di,
        var_list=self.D_img_variables
    )

@mkj676

mkj676 commented Apr 4, 2018

@AidasK
No, you have to change the TensorFlow library code itself.
For example, on my machine the file is under /home/flask/lib/python2.7/site-packages/tensorflow/python/ops

P.S. "flask" here is the name of my virtualenv.

@praveenkumarchandaliya

Problems 1 and 2 are not clear. Where do we make the changes, and which files need to be modified?
