Hello,
Firstly, thank you very much for your project. I have a problem with a large dataset: I have 10,000 images and masks.
I can create the .npy files, but when I run unet.py with all 10,000 images I get a "Memory Usage Exceeds" error. When I decrease the dataset size (to at most 600 images), there is no error. I think the batch size in model.fit isn't working. Do you have any advice or a solution?
Best regards.
I think the batch size isn't working because there is no generator feeding the dataset in batches; the whole .npy array is loaded into memory at once. You can try ImageDataGenerator, which is built into Keras, or you can write your own custom generator.
Below is an implementation using ImageDataGenerator. Hope this helps and answers your question. (I have also included data augmentation; you can ignore that part.)
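A minimal sketch of what I mean, assuming the .npy files this repo's data.py produces (the file names, shapes, and the `unet()` builder are placeholders; adapt them to your own script). The key point is that identical augmentation arguments and the same seed keep each image aligned with its mask:

```python
import numpy as np
from keras.preprocessing.image import ImageDataGenerator

# Placeholder file names -- use whatever your data.py produces.
imgs = np.load('imgs_train.npy')        # e.g. (N, 256, 256, 1), float32 in [0, 1]
masks = np.load('imgs_mask_train.npy')  # same leading dimension as imgs

# Identical augmentation arguments for images and masks.
data_gen_args = dict(rotation_range=10,
                     width_shift_range=0.05,
                     height_shift_range=0.05,
                     zoom_range=0.05,
                     horizontal_flip=True)
image_datagen = ImageDataGenerator(**data_gen_args)
mask_datagen = ImageDataGenerator(**data_gen_args)

seed = 1        # same seed => image and mask get the same random transform
batch_size = 8
image_generator = image_datagen.flow(imgs, batch_size=batch_size, seed=seed)
mask_generator = mask_datagen.flow(masks, batch_size=batch_size, seed=seed)

# Yields (image_batch, mask_batch) tuples, one batch at a time.
train_generator = zip(image_generator, mask_generator)

model = unet()  # placeholder: however unet.py builds the model
model.fit_generator(train_generator,
                    steps_per_epoch=len(imgs) // batch_size,
                    epochs=10)
# On newer Keras/TF2, pass train_generator to model.fit instead.
```

Note that ImageDataGenerator.flow still keeps the full arrays in RAM; it only batches the training step. If the arrays themselves don't fit in memory, a custom generator that reads each batch from disk (see the sketch after the next comment) is the better fix.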
To read more about custom generators, you can visit here, or if you want to see them in action, you can visit my repo Kaggle-Data-Science-Bowl-2018, in which I trained a U-Net with custom generators.
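For reference, here is a hedged sketch of the kind of custom generator I mean, written as a keras.utils.Sequence that memory-maps the .npy files so only one batch lives in RAM at a time (the file names, shapes, and the uint8 scaling are assumptions; adjust to how you saved your arrays):

```python
import numpy as np
from keras.utils import Sequence  # tensorflow.keras.utils.Sequence on TF2


class NpyBatchGenerator(Sequence):
    """Serves (image, mask) batches from .npy files without loading them fully."""

    def __init__(self, img_path, mask_path, batch_size=8):
        # mmap_mode='r' maps the file; slices are read from disk on demand.
        self.imgs = np.load(img_path, mmap_mode='r')
        self.masks = np.load(mask_path, mmap_mode='r')
        self.batch_size = batch_size

    def __len__(self):
        # Number of batches per epoch.
        return int(np.ceil(len(self.imgs) / self.batch_size))

    def __getitem__(self, idx):
        start = idx * self.batch_size
        stop = start + self.batch_size
        # np.array(...) copies only this slice into RAM. The /255.0 assumes
        # uint8 pixels; drop it if you normalized when creating the .npy files.
        x = np.array(self.imgs[start:stop], dtype='float32') / 255.0
        y = np.array(self.masks[start:stop], dtype='float32') / 255.0
        return x, y


train_gen = NpyBatchGenerator('imgs_train.npy', 'imgs_mask_train.npy', batch_size=8)
model = unet()  # placeholder for your model-building function
model.fit_generator(train_gen, epochs=10)  # or model.fit(train_gen, ...) on TF2
```

With a Sequence, Keras knows the epoch length from __len__, so you don't need steps_per_epoch, and it can be used safely with use_multiprocessing=True to parallelize batch loading.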