Overview
The Keras Tuner is a library that helps you pick the optimal set of hyperparameters for your TensorFlow program. The process of selecting the right set of hyperparameters for your machine learning (ML) application is called hyperparameter tuning or hypertuning.
Hyperparameters are the variables that govern the training process and the topology of an ML model. These variables remain constant over the training process and directly impact the performance of your ML program. Hyperparameters are of two types (a short sketch after this list illustrates both kinds):
- Model hyperparameters, which influence model selection, such as the number and width of hidden layers
- Algorithm hyperparameters, which influence the speed and quality of the learning algorithm, such as the learning rate for Stochastic Gradient Descent (SGD) and the number of nearest neighbors for a k-Nearest Neighbors (KNN) classifier
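To make the distinction concrete, here is a minimal, self-contained sketch (not part of the tutorial's own code) in which both kinds of hyperparameters appear as hard-coded constants; the rest of this tutorial replaces exactly these kinds of constants with tunable values.
import tensorflow as tf
from tensorflow import keras

UNITS = 128           # model hyperparameter: width of the hidden layer
LEARNING_RATE = 1e-3  # algorithm hyperparameter: step size used by the optimizer

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(UNITS, activation='relu'),
    keras.layers.Dense(10),
])
model.compile(optimizer=keras.optimizers.Adam(learning_rate=LEARNING_RATE),
              loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])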
In this tutorial, you will use the Keras Tuner to perform hypertuning for an image classification application.
Setup
import tensorflow as tf
from tensorflow import keras
Install and import the Keras Tuner.
pip install -q -U keras-tuner
import keras_tuner as kt
Download and prepare the dataset
In this tutorial, you will use the Keras Tuner to find the best hyperparameters for a machine learning model that classifies images of clothing from the Fashion MNIST dataset.
Load the data.
(img_train, label_train), (img_test, label_test) = keras.datasets.fashion_mnist.load_data()
# Normalize pixel values between 0 and 1
img_train = img_train.astype('float32') / 255.0
img_test = img_test.astype('float32') / 255.0
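Optionally, as a quick sanity check (this step is not part of the original tutorial), you can confirm the expected shapes and value range before moving on:
# Fashion MNIST provides 60,000 training and 10,000 test images of 28x28 pixels;
# after the division above, pixel values lie in [0, 1].
print(img_train.shape, label_train.shape)  # (60000, 28, 28) (60000,)
print(img_test.shape, label_test.shape)    # (10000, 28, 28) (10000,)
print(img_train.min(), img_train.max())    # 0.0 1.0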
Define the model
When you build a model for hypertuning, you also define the hyperparameter search space in addition to the model architecture. The model you set up for hypertuning is called a hypermodel.
You can define a hypermodel through two approaches:
- By using a model builder function
- By subclassing the HyperModel class of the Keras Tuner API (a minimal sketch of this approach is shown below)
You can also use two pre-defined HyperModel classes - HyperXception and HyperResNet - for computer vision applications.
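For reference, here is a minimal sketch of the subclassing approach. It builds the same kind of model as the builder function used later in this tutorial; the class name SimpleHyperModel is illustrative and the tutorial itself does not use this class.
# Illustrative only: the same hypermodel expressed by subclassing kt.HyperModel.
class SimpleHyperModel(kt.HyperModel):
  def build(self, hp):
    model = keras.Sequential()
    model.add(keras.layers.Flatten(input_shape=(28, 28)))
    model.add(keras.layers.Dense(hp.Int('units', min_value=32, max_value=512, step=32),
                                 activation='relu'))
    model.add(keras.layers.Dense(10))
    model.compile(optimizer=keras.optimizers.Adam(
                      learning_rate=hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])),
                  loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
                  metrics=['accuracy'])
    return model

# An instance such as SimpleHyperModel() can be passed to a tuner
# in place of a model builder function.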
In this tutorial, you use a model builder function to define the image classification model. The model builder function returns a compiled model and uses hyperparameters you define inline to hypertune the model.
def model_builder(hp):
  model = keras.Sequential()
  model.add(keras.layers.Flatten(input_shape=(28, 28)))

  # Tune the number of units in the first Dense layer
  # Choose an optimal value between 32-512
  hp_units = hp.Int('units', min_value=32, max_value=512, step=32)
  model.add(keras.layers.Dense(units=hp_units, activation='relu'))
  model.add(keras.layers.Dense(10))

  # Tune the learning rate for the optimizer
  # Choose an optimal value from 0.01, 0.001, or 0.0001
  hp_learning_rate = hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])

  model.compile(optimizer=keras.optimizers.Adam(learning_rate=hp_learning_rate),
                loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
                metrics=['accuracy'])

  return model
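As an optional check (not part of the original tutorial), you can call the builder once with a fresh set of hyperparameters to confirm that it returns a valid compiled model. Each hyperparameter takes its default value: the minimum for hp.Int and the first entry for hp.Choice.
# Optional: build one model with default hyperparameter values and inspect it.
check_model = model_builder(kt.HyperParameters())
check_model.summary()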
Instantiate the tuner and perform hypertuning
Instantiate the tuner to perform the hypertuning. The Keras Tuner has four tuners available - RandomSearch, Hyperband, BayesianOptimization, and Sklearn. In this tutorial, you use the Hyperband tuner.
To instantiate the Hyperband tuner, you must specify the hypermodel, the objective to optimize, and the maximum number of epochs to train (max_epochs).
tuner = kt.Hyperband(model_builder,
                     objective='val_accuracy',
                     max_epochs=10,
                     factor=3,
                     directory='my_dir',
                     project_name='intro_to_kt')
The Hyperband tuning algorithm uses adaptive resource allocation and early stopping to quickly converge on a high-performing model. This is done using a sports-championship-style bracket: the algorithm trains a large number of models for a few epochs and carries only the top-performing half of the models forward to the next round. Hyperband determines the number of models to train in a bracket by computing 1 + log_factor(max_epochs) and rounding it up to the nearest integer.
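As a rough back-of-the-envelope check of that formula with the settings used above (max_epochs=10, factor=3), assuming the library follows the formula as stated:
import math

max_epochs, factor = 10, 3
# 1 + log_factor(max_epochs), rounded up to the nearest integer
num_models_per_bracket = math.ceil(1 + math.log(max_epochs, factor))
print(num_models_per_bracket)  # 4 for these settings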
Create a callback to stop training early after reaching a certain value for the validation loss.
stop_early = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=5)
Run the hyperparameter search. The arguments for the search method are the same as those used for tf.keras.model.fit, in addition to the callback above.
tuner.search(img_train, label_train, epochs=50, validation_split=0.2, callbacks=[stop_early])
# Get the optimal hyperparameters
best_hps=tuner.get_best_hyperparameters(num_trials=1)[0]
print(f"""
The hyperparameter search is complete. The optimal number of units in the first densely-connected
layer is {best_hps.get('units')} and the optimal learning rate for the optimizer
is {best_hps.get('learning_rate')}.
""")
Trial 30 Complete [00h 00m 35s]
val_accuracy: 0.8925833106040955

Best val_accuracy So Far: 0.8925833106040955
Total elapsed time: 00h 07m 26s
INFO:tensorflow:Oracle triggered exit

The hyperparameter search is complete. The optimal number of units in the first densely-connected
layer is 320 and the optimal learning rate for the optimizer
is 0.001.
Train the model
Find the optimal number of epochs to train the model with the hyperparameters obtained from the search.
# Build the model with the optimal hyperparameters and train it on the data for 50 epochs
model = tuner.hypermodel.build(best_hps)
history = model.fit(img_train, label_train, epochs=50, validation_split=0.2)
val_acc_per_epoch = history.history['val_accuracy']
best_epoch = val_acc_per_epoch.index(max(val_acc_per_epoch)) + 1
print('Best epoch: %d' % (best_epoch,))
Epoch 1/50 1500/1500 [==============================] - 4s 2ms/step - loss: 0.4988 - accuracy: 0.8232 - val_loss: 0.4142 - val_accuracy: 0.8517 Epoch 2/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.3717 - accuracy: 0.8646 - val_loss: 0.3437 - val_accuracy: 0.8773 Epoch 3/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.3317 - accuracy: 0.8779 - val_loss: 0.3806 - val_accuracy: 0.8639 Epoch 4/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.3079 - accuracy: 0.8867 - val_loss: 0.3321 - val_accuracy: 0.8801 Epoch 5/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.2882 - accuracy: 0.8943 - val_loss: 0.3313 - val_accuracy: 0.8806 Epoch 6/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.2727 - accuracy: 0.8977 - val_loss: 0.3152 - val_accuracy: 0.8857 Epoch 7/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.2610 - accuracy: 0.9016 - val_loss: 0.3225 - val_accuracy: 0.8873 Epoch 8/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.2474 - accuracy: 0.9060 - val_loss: 0.3198 - val_accuracy: 0.8867 Epoch 9/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.2385 - accuracy: 0.9105 - val_loss: 0.3266 - val_accuracy: 0.8822 Epoch 10/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.2295 - accuracy: 0.9142 - val_loss: 0.3382 - val_accuracy: 0.8835 Epoch 11/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.2170 - accuracy: 0.9185 - val_loss: 0.3215 - val_accuracy: 0.8885 Epoch 12/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.2102 - accuracy: 0.9202 - val_loss: 0.3194 - val_accuracy: 0.8923 Epoch 13/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.2036 - accuracy: 0.9235 - val_loss: 0.3176 - val_accuracy: 0.8901 Epoch 14/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1955 - accuracy: 0.9272 - val_loss: 0.3269 - val_accuracy: 0.8912 Epoch 15/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1881 - accuracy: 0.9292 - val_loss: 0.3391 - val_accuracy: 0.8878 Epoch 16/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1821 - accuracy: 0.9321 - val_loss: 0.3272 - val_accuracy: 0.8920 Epoch 17/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1771 - accuracy: 0.9332 - val_loss: 0.3536 - val_accuracy: 0.8876 Epoch 18/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1697 - accuracy: 0.9363 - val_loss: 0.3395 - val_accuracy: 0.8927 Epoch 19/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1652 - accuracy: 0.9374 - val_loss: 0.3464 - val_accuracy: 0.8937 Epoch 20/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1606 - accuracy: 0.9392 - val_loss: 0.3576 - val_accuracy: 0.8888 Epoch 21/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1539 - accuracy: 0.9417 - val_loss: 0.3724 - val_accuracy: 0.8867 Epoch 22/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1503 - accuracy: 0.9435 - val_loss: 0.3607 - val_accuracy: 0.8954 Epoch 23/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1450 - accuracy: 0.9454 - val_loss: 0.3525 - val_accuracy: 0.8919 Epoch 24/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1398 - accuracy: 0.9473 - val_loss: 0.3745 - val_accuracy: 0.8919 Epoch 25/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1370 
- accuracy: 0.9478 - val_loss: 0.3616 - val_accuracy: 0.8941 Epoch 26/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1334 - accuracy: 0.9498 - val_loss: 0.3866 - val_accuracy: 0.8956 Epoch 27/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1282 - accuracy: 0.9519 - val_loss: 0.3947 - val_accuracy: 0.8924 Epoch 28/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1254 - accuracy: 0.9538 - val_loss: 0.4223 - val_accuracy: 0.8870 Epoch 29/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1222 - accuracy: 0.9536 - val_loss: 0.3805 - val_accuracy: 0.8898 Epoch 30/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1179 - accuracy: 0.9546 - val_loss: 0.4052 - val_accuracy: 0.8942 Epoch 31/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1162 - accuracy: 0.9560 - val_loss: 0.3909 - val_accuracy: 0.8955 Epoch 32/50 1500/1500 [==============================] - 4s 2ms/step - loss: 0.1152 - accuracy: 0.9572 - val_loss: 0.4160 - val_accuracy: 0.8908 Epoch 33/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1100 - accuracy: 0.9583 - val_loss: 0.4280 - val_accuracy: 0.8938 Epoch 34/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1055 - accuracy: 0.9603 - val_loss: 0.4148 - val_accuracy: 0.8963 Epoch 35/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1044 - accuracy: 0.9606 - val_loss: 0.4302 - val_accuracy: 0.8921 Epoch 36/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1046 - accuracy: 0.9605 - val_loss: 0.4205 - val_accuracy: 0.8947 Epoch 37/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0993 - accuracy: 0.9621 - val_loss: 0.4551 - val_accuracy: 0.8875 Epoch 38/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0972 - accuracy: 0.9635 - val_loss: 0.4622 - val_accuracy: 0.8914 Epoch 39/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0951 - accuracy: 0.9642 - val_loss: 0.4423 - val_accuracy: 0.8950 Epoch 40/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0947 - accuracy: 0.9637 - val_loss: 0.4498 - val_accuracy: 0.8948 Epoch 41/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0876 - accuracy: 0.9675 - val_loss: 0.4694 - val_accuracy: 0.8959 Epoch 42/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0902 - accuracy: 0.9657 - val_loss: 0.4778 - val_accuracy: 0.8938 Epoch 43/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0876 - accuracy: 0.9676 - val_loss: 0.4716 - val_accuracy: 0.8911 Epoch 44/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0884 - accuracy: 0.9674 - val_loss: 0.4827 - val_accuracy: 0.8918 Epoch 45/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0764 - accuracy: 0.9715 - val_loss: 0.5008 - val_accuracy: 0.8953 Epoch 46/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0823 - accuracy: 0.9695 - val_loss: 0.5157 - val_accuracy: 0.8874 Epoch 47/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0789 - accuracy: 0.9704 - val_loss: 0.5198 - val_accuracy: 0.8910 Epoch 48/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0778 - accuracy: 0.9716 - val_loss: 0.5031 - val_accuracy: 0.8932 Epoch 49/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0747 - accuracy: 0.9718 - val_loss: 0.4982 - val_accuracy: 0.8953 Epoch 50/50 
1500/1500 [==============================] - 3s 2ms/step - loss: 0.0786 - accuracy: 0.9706 - val_loss: 0.5198 - val_accuracy: 0.8976 Best epoch: 50
Re-instantiate the hypermodel and train it with the optimal number of epochs from above.
hypermodel = tuner.hypermodel.build(best_hps)
# Retrain the model
hypermodel.fit(img_train, label_train, epochs=best_epoch, validation_split=0.2)
Epoch 1/50 1500/1500 [==============================] - 4s 2ms/step - loss: 0.4987 - accuracy: 0.8236 - val_loss: 0.4065 - val_accuracy: 0.8488 Epoch 2/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.3738 - accuracy: 0.8652 - val_loss: 0.3847 - val_accuracy: 0.8613 Epoch 3/50 1500/1500 [==============================] - 4s 2ms/step - loss: 0.3344 - accuracy: 0.8775 - val_loss: 0.3568 - val_accuracy: 0.8750 Epoch 4/50 1500/1500 [==============================] - 4s 2ms/step - loss: 0.3065 - accuracy: 0.8865 - val_loss: 0.3326 - val_accuracy: 0.8811 Epoch 5/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.2880 - accuracy: 0.8930 - val_loss: 0.3208 - val_accuracy: 0.8843 Epoch 6/50 1500/1500 [==============================] - 4s 2ms/step - loss: 0.2744 - accuracy: 0.8981 - val_loss: 0.3313 - val_accuracy: 0.8810 Epoch 7/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.2585 - accuracy: 0.9019 - val_loss: 0.3352 - val_accuracy: 0.8790 Epoch 8/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.2445 - accuracy: 0.9078 - val_loss: 0.3151 - val_accuracy: 0.8849 Epoch 9/50 1500/1500 [==============================] - 4s 2ms/step - loss: 0.2366 - accuracy: 0.9113 - val_loss: 0.3167 - val_accuracy: 0.8881 Epoch 10/50 1500/1500 [==============================] - 4s 2ms/step - loss: 0.2241 - accuracy: 0.9162 - val_loss: 0.3258 - val_accuracy: 0.8857 Epoch 11/50 1500/1500 [==============================] - 4s 2ms/step - loss: 0.2158 - accuracy: 0.9194 - val_loss: 0.3087 - val_accuracy: 0.8927 Epoch 12/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.2091 - accuracy: 0.9218 - val_loss: 0.3287 - val_accuracy: 0.8904 Epoch 13/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1998 - accuracy: 0.9243 - val_loss: 0.3131 - val_accuracy: 0.8950 Epoch 14/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1937 - accuracy: 0.9271 - val_loss: 0.3177 - val_accuracy: 0.8925 Epoch 15/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1859 - accuracy: 0.9303 - val_loss: 0.3334 - val_accuracy: 0.8918 Epoch 16/50 1500/1500 [==============================] - 4s 2ms/step - loss: 0.1779 - accuracy: 0.9334 - val_loss: 0.3299 - val_accuracy: 0.8929 Epoch 17/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1743 - accuracy: 0.9348 - val_loss: 0.3391 - val_accuracy: 0.8920 Epoch 18/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1687 - accuracy: 0.9366 - val_loss: 0.3302 - val_accuracy: 0.8974 Epoch 19/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1628 - accuracy: 0.9385 - val_loss: 0.3641 - val_accuracy: 0.8868 Epoch 20/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1597 - accuracy: 0.9405 - val_loss: 0.3523 - val_accuracy: 0.8942 Epoch 21/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1534 - accuracy: 0.9434 - val_loss: 0.3584 - val_accuracy: 0.8951 Epoch 22/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1507 - accuracy: 0.9441 - val_loss: 0.3577 - val_accuracy: 0.8923 Epoch 23/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1453 - accuracy: 0.9452 - val_loss: 0.3807 - val_accuracy: 0.8957 Epoch 24/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1392 - accuracy: 0.9476 - val_loss: 0.3711 - val_accuracy: 0.8960 Epoch 25/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1364 
- accuracy: 0.9494 - val_loss: 0.3731 - val_accuracy: 0.8940 Epoch 26/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1315 - accuracy: 0.9511 - val_loss: 0.3805 - val_accuracy: 0.8932 Epoch 27/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1319 - accuracy: 0.9507 - val_loss: 0.3966 - val_accuracy: 0.8880 Epoch 28/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1266 - accuracy: 0.9534 - val_loss: 0.3994 - val_accuracy: 0.8920 Epoch 29/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1207 - accuracy: 0.9546 - val_loss: 0.3918 - val_accuracy: 0.8959 Epoch 30/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1174 - accuracy: 0.9567 - val_loss: 0.4043 - val_accuracy: 0.8928 Epoch 31/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1191 - accuracy: 0.9546 - val_loss: 0.4114 - val_accuracy: 0.8951 Epoch 32/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1140 - accuracy: 0.9563 - val_loss: 0.4149 - val_accuracy: 0.8962 Epoch 33/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1121 - accuracy: 0.9574 - val_loss: 0.4373 - val_accuracy: 0.8931 Epoch 34/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1085 - accuracy: 0.9598 - val_loss: 0.4353 - val_accuracy: 0.8939 Epoch 35/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1056 - accuracy: 0.9591 - val_loss: 0.4325 - val_accuracy: 0.8938 Epoch 36/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1066 - accuracy: 0.9600 - val_loss: 0.4700 - val_accuracy: 0.8899 Epoch 37/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1019 - accuracy: 0.9615 - val_loss: 0.4440 - val_accuracy: 0.8947 Epoch 38/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0973 - accuracy: 0.9635 - val_loss: 0.4481 - val_accuracy: 0.8959 Epoch 39/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.1008 - accuracy: 0.9622 - val_loss: 0.4772 - val_accuracy: 0.8954 Epoch 40/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0919 - accuracy: 0.9653 - val_loss: 0.4723 - val_accuracy: 0.8916 Epoch 41/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0921 - accuracy: 0.9653 - val_loss: 0.4867 - val_accuracy: 0.8953 Epoch 42/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0919 - accuracy: 0.9657 - val_loss: 0.4710 - val_accuracy: 0.8936 Epoch 43/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0873 - accuracy: 0.9664 - val_loss: 0.4844 - val_accuracy: 0.8905 Epoch 44/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0884 - accuracy: 0.9669 - val_loss: 0.4972 - val_accuracy: 0.8963 Epoch 45/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0849 - accuracy: 0.9685 - val_loss: 0.4790 - val_accuracy: 0.8969 Epoch 46/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0831 - accuracy: 0.9687 - val_loss: 0.5028 - val_accuracy: 0.8945 Epoch 47/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0793 - accuracy: 0.9698 - val_loss: 0.5031 - val_accuracy: 0.8945 Epoch 48/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0806 - accuracy: 0.9693 - val_loss: 0.5065 - val_accuracy: 0.8990 Epoch 49/50 1500/1500 [==============================] - 3s 2ms/step - loss: 0.0751 - accuracy: 0.9714 - val_loss: 0.5719 - val_accuracy: 0.8924 Epoch 50/50 
1500/1500 [==============================] - 3s 2ms/step - loss: 0.0785 - accuracy: 0.9707 - val_loss: 0.5123 - val_accuracy: 0.8985 <keras.callbacks.History at 0x7fb39810a150>
To finish this tutorial, evaluate the hypermodel on the test data.
eval_result = hypermodel.evaluate(img_test, label_test)
print("[test loss, test accuracy]:", eval_result)
313/313 [==============================] - 1s 2ms/step - loss: 0.5632 - accuracy: 0.8908
[test loss, test accuracy]: [0.5631944537162781, 0.8907999992370605]
The my_dir/intro_to_kt directory contains detailed logs and checkpoints for every trial (model configuration) run during the hyperparameter search. If you re-run the hyperparameter search, the Keras Tuner uses the existing state from these logs to resume the search. To disable this behavior, pass an additional overwrite=True argument when instantiating the tuner.
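For example, to start a fresh search instead of resuming the previous one, you could instantiate the tuner with the same settings as above plus overwrite=True:
tuner = kt.Hyperband(model_builder,
                     objective='val_accuracy',
                     max_epochs=10,
                     factor=3,
                     overwrite=True,       # discard any previous results in my_dir/intro_to_kt
                     directory='my_dir',
                     project_name='intro_to_kt')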
Summary
In this tutorial, you learned how to use the Keras Tuner to tune hyperparameters for a model. To learn more about the Keras Tuner, check out these additional resources:
Also check out the HParams Dashboard in TensorBoard to interactively tune your model hyperparameters.