I’m reading a folder (and its sub-folders) of images to build my own dataset; I then modeled a CNN and set up the training part, but that is where things stopped working. After execution, the terminal showed the following message:
WARNING:tensorflow:Using temporary folder as model directory: C:\Users\Kalunga\AppData\Local\Temp\tmpweraw06i
Traceback (most recent call last):
File "teste-tensorflow.py", line 113, in <module>
classificador.train(input_fn=funcao_treinamento, steps=5)
File "C:\Users\Kalunga\AppData\Local\conda\conda\envs\Teste\lib\site-packages\tensorflow\python\estimator\estimator.py", line 354, in train
loss = self._train_model(input_fn, hooks, saving_listeners)
File "C:\Users\Kalunga\AppData\Local\conda\conda\envs\Teste\lib\site-packages\tensorflow\python\estimator\estimator.py", line 1207, in _train_model
return self._train_model_default(input_fn, hooks, saving_listeners)
File "C:\Users\Kalunga\AppData\Local\conda\conda\envs\Teste\lib\site-packages\tensorflow\python\estimator\estimator.py", line 1234, in _train_model_default
input_fn, model_fn_lib.ModeKeys.TRAIN))
File "C:\Users\Kalunga\AppData\Local\conda\conda\envs\Teste\lib\site-packages\tensorflow\python\estimator\estimator.py", line 1075, in _get_features_and_labels_from_input_fn
self._call_input_fn(input_fn, mode))
File "C:\Users\Kalunga\AppData\Local\conda\conda\envs\Teste\lib\site-packages\tensorflow\python\estimator\estimator.py", line 1162, in _call_input_fn
return input_fn(**kwargs)
File "C:\Users\Kalunga\AppData\Local\conda\conda\envs\Teste\lib\site-packages\tensorflow\python\estimator\inputs\numpy_io.py", line 177, in input_fn
if len(set(v.shape[0] for v in ordered_dict_data.values())) != 1:
TypeError: unhashable type: 'Dimension'
The full code (up to the training part) is below; I would be grateful if anyone can help, because I don’t know how to solve this problem with dimensions. I tried passing only the numpy arrays of the images and labels as parameters, but I was not successful.
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
# Path to the training folder
path = 'C:\\wamp64\\www\\python\\imagenet\\ILSVRC2017_DET.tar\\ILSVRC2017_DET\\ILSVRC\\Data\\DET\\train\\ILSVRC2013_train'
# File with the image names
file_imgs = open("images.txt",'r')
# File with the synset/folder names
file_synsets = open("labels_synset.txt",'r')
# File with the classes
file_labels = open("classes_label.txt",'r')
# Reading images and folders
images_txt = file_imgs.readlines()
synsets_txt = file_synsets.readlines()
# Stripping the line break and prepending the full image path
for i, img in enumerate(images_txt):
    synset = synsets_txt[i].strip()
    images_txt[i] = path + '\\' + synset + '\\' + img.strip()
# Reading labels and stripping line breaks
labels_txt = file_labels.readlines()
for i,lab in enumerate(labels_txt): labels_txt[i] = int(lab.strip())
# Converting the lists to numpy arrays
images_txt = np.asarray(images_txt)
labels_txt = np.asarray(labels_txt)
# Closing the text files
file_imgs.close()
file_synsets.close()
file_labels.close()
# Creating tensors from the file names and labels
filenames = tf.constant(images_txt)
labels = tf.constant(labels_txt)
# Creating the dataset
dataset = tf.data.Dataset.from_tensor_slices((filenames,labels))
# Reading image by image and resizing to 224x224
def _parse_function(filename, label):
    image_string = tf.read_file(filename)
    image_decoded = tf.image.decode_jpeg(image_string, channels=1)
    image_resized = tf.image.resize_images(image_decoded, [224, 224])
    #image = tf.cast(image_resized, tf.float32)
    return image_resized, label
# Mapping the pre-processing
dataset = dataset.map(_parse_function)
dataset = dataset.batch(2)
# Generating the dataset with images and labels
iterator = dataset.make_one_shot_iterator()
images, labels = iterator.get_next()
# Network architecture
def rede(features, labels, mode):
    # Architecture
    ...
classificador = tf.estimator.Estimator(model_fn = rede)
funcao_treinamento = tf.estimator.inputs.numpy_input_fn(x = {'x':images}, y = {'y':labels}, batch_size=128, num_epochs=10, shuffle=True)
classificador.train(input_fn=funcao_treinamento, steps=5)
Thanks in advance.
You are using the function numpy_input_fn, which requires its inputs to be of type numpy.array, but you are passing a tf.Tensor. You could try casting your input to a numpy.array? – Emanuel Huber
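One possible way to apply that suggestion, sketched below and not tested against this exact setup: instead of numpy_input_fn, pass the Estimator a custom input_fn that builds the tf.data pipeline itself and returns the feature dictionary and label tensors. The function name and the 'x' key simply mirror the ones already used in the question; the shuffle buffer size is an arbitrary placeholder.

# A minimal sketch (TF 1.x): replace the numpy_input_fn call with an
# input_fn that rebuilds the pipeline from the numpy arrays.
def funcao_treinamento():
    # The dataset must be constructed inside the input_fn, because the
    # Estimator builds its own graph; tensors created at module level
    # belong to a different graph and cannot be reused here.
    ds = tf.data.Dataset.from_tensor_slices(
        (tf.constant(images_txt), tf.constant(labels_txt)))
    ds = ds.map(_parse_function)
    ds = ds.shuffle(buffer_size=1000).repeat(10).batch(128)
    images, labels = ds.make_one_shot_iterator().get_next()
    return {'x': images}, labels

classificador.train(input_fn=funcao_treinamento, steps=5)

With this approach, the module-level filenames/labels constants and the one-shot iterator defined earlier in the question would no longer be needed.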