I’m a newbie working with TensorFlow datasets, and it’s my first time loading huge external data, but there’s a problem: my dataset is too big for my memory capacity.
I tried .flow_from_directory(), but it organizes the data by classes, where each subfolder is a class. That’s not how my dataset is structured: there’s a train folder containing a lot of folders with random names, and the images are inside those folders, so .flow_from_directory() reads each random folder name as the label or class. Is there a way to change that?
I’ve read the tf.data documentation, but honestly I don’t know how to solve my problem yet. I’d like to load all the data at once, but it’s too big, so I need help. Please don’t just send me back to read the documentation again :(.
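To make the layout concrete, here’s a minimal sketch of what I mean (the folder names and file names are invented; my real dataset looks like this but much bigger). It builds a mock train/ tree, then collects every image path recursively while ignoring the random folder names entirely:

```python
import tempfile
from pathlib import Path

# Mock the described layout: train/<random_name>/<images>,
# where the folder names carry NO label information.
root = Path(tempfile.mkdtemp())
for folder in ["a1b2c3", "x9z8y7", "q5w4e3"]:  # stand-ins for the random names
    d = root / "train" / folder
    d.mkdir(parents=True)
    for i in range(2):
        # Fake JPEG header bytes, just so files exist on disk.
        (d / f"img_{i}.jpg").write_bytes(b"\xff\xd8\xff")

# Collect every image path recursively, treating folder names as meaningless.
paths = sorted(str(p) for p in (root / "train").glob("*/*.jpg"))
print(len(paths))  # 3 folders x 2 files = 6

# The idea (sketch, not run here) would then be to stream these lazily
# with tf.data instead of loading everything into memory:
#   ds = tf.data.Dataset.from_tensor_slices(paths)
#   ds = ds.map(lambda p: tf.io.decode_jpeg(tf.io.read_file(p)))
#   ds = ds.batch(32).prefetch(tf.data.AUTOTUNE)
```

Is a pipeline like the commented tf.data part the right direction, or is there a way to make .flow_from_directory() ignore the folder names?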
submitted by /u/Current_Falcon_3187