import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
import tensorflow
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.layers import Dropout
from tensorflow.keras.layers import Flatten
from tensorflow.keras.datasets import fashion_mnist
print(tensorflow.__version__)
2.7.0
(X_train,y_train),(X_test,y_test) = fashion_mnist.load_data()
# create the figure before adding subplots; calling plt.figure() afterwards opens a new, empty figure
plt.figure(figsize=(12, 8))
for i in range(25):
    # define subplot
    plt.subplot(5, 5, i+1)
    # plot raw pixel data
    plt.imshow(X_train[i], cmap=plt.get_cmap('gray'))

# show the figure
plt.show()
# scale pixel intensities from [0, 255] to [0, 1]
X_train = X_train/255
X_test = X_test/255
model = Sequential([
                  
    #flattening the images
    Flatten(input_shape=(28,28)),

    #adding first hidden layer
    Dense(256, activation='relu'),

    #adding second hidden layer
    Dense(128, activation='relu'),

    #adding third hidden layer
    Dense(64, activation='relu'),

    #adding output layer
    Dense(10, activation='softmax')
])
model.compile(loss='sparse_categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
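Before training, the layer shapes and parameter counts can be inspected; this is an optional check that was not part of the original run.

# Optional: print layer output shapes and parameter counts
model.summary()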

#fitting the model

model.fit(X_train, y_train, epochs = 10)
Epoch 1/10
1875/1875 [==============================] - 6s 3ms/step - loss: 0.4892 - accuracy: 0.8212
Epoch 2/10
1875/1875 [==============================] - 5s 3ms/step - loss: 0.3665 - accuracy: 0.8652
Epoch 3/10
1875/1875 [==============================] - 5s 3ms/step - loss: 0.3299 - accuracy: 0.8787
Epoch 4/10
1875/1875 [==============================] - 5s 3ms/step - loss: 0.3074 - accuracy: 0.8872
Epoch 5/10
1875/1875 [==============================] - 5s 3ms/step - loss: 0.2881 - accuracy: 0.8910
Epoch 6/10
1875/1875 [==============================] - 5s 3ms/step - loss: 0.2731 - accuracy: 0.8975
Epoch 7/10
1875/1875 [==============================] - 5s 3ms/step - loss: 0.2625 - accuracy: 0.9009
Epoch 8/10
1875/1875 [==============================] - 5s 3ms/step - loss: 0.2479 - accuracy: 0.9062
Epoch 9/10
1875/1875 [==============================] - 5s 3ms/step - loss: 0.2385 - accuracy: 0.9087
Epoch 10/10
1875/1875 [==============================] - 5s 3ms/step - loss: 0.2313 - accuracy: 0.9122
<keras.callbacks.History at 0x7febe018e410>
model.evaluate(X_test,y_test)
313/313 [==============================] - 1s 2ms/step - loss: 0.3262 - accuracy: 0.8853
[0.3262254595756531, 0.8852999806404114]
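As an optional sanity check (not part of the original run), the model's predictions can be decoded into the standard Fashion-MNIST class names, roughly like this:

# Hypothetical follow-up: decode a few test predictions into class names
class_names = ['T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat',
               'Sandal', 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']
probs = model.predict(X_test[:5])        # softmax probabilities, shape (5, 10)
preds = np.argmax(probs, axis=1)         # predicted class index per image
for i, (p, t) in enumerate(zip(preds, y_test[:5])):
    print(f'image {i}: predicted {class_names[p]}, actual {class_names[t]}')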
'''
Now let's tune the following hyperparameters of the model:

1. Number of hidden layers
2. Number of neurons in each hidden layer
3. Learning rate
4. Activation function
'''
! pip install keras-tuner
Requirement already satisfied: keras-tuner in /usr/local/lib/python3.7/dist-packages (1.1.0)
from tensorflow import keras
from keras_tuner import RandomSearch
def build_model(hp):          # hp is the HyperParameters object supplied by the tuner
    model = Sequential()
    model.add(Flatten(input_shape=(28, 28)))

    # Search over the number of neurons in the single hidden layer
    model.add(Dense(units=hp.Int('num_of_neurons', min_value=32, max_value=512, step=32),
                    activation='relu'))

    # Output layer
    model.add(Dense(10, activation='softmax'))

    # Compile the model, searching over the learning rate as well
    model.compile(optimizer=keras.optimizers.Adam(hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])),
                  loss='sparse_categorical_crossentropy', metrics=['accuracy'])
    return model
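As a quick optional check (following KerasTuner's documented pattern, not part of the original run), the model-building function can be exercised once with a fresh HyperParameters object before handing it to a tuner:

# Optional: verify that build_model constructs a model using the default hyperparameter values
import keras_tuner
build_model(keras_tuner.HyperParameters()).summary()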
tuner = RandomSearch(build_model,
                    objective = 'val_accuracy',
                    max_trials = 5,
                    executions_per_trial = 3,
                    directory = 'tuner1',
                    project_name = 'Clothing')

# The search will build (5 trials × 3 executions per trial) = 15 models, each trained for 10 epochs
INFO:tensorflow:Reloading Oracle from existing project tuner1/Clothing/oracle.json
# Our search space has 2 hyperparameters: the number of neurons and the learning rate

tuner.search_space_summary()
Search space summary
Default search space size: 2
num_of_neurons (Int)
{'default': None, 'conditions': [], 'min_value': 32, 'max_value': 512, 'step': 32, 'sampling': None}
learning_rate (Choice)
{'default': 0.01, 'conditions': [], 'values': [0.01, 0.001, 0.0001], 'ordered': True}
tuner.search(X_train, y_train, epochs = 10, validation_data = (X_test, y_test))
Trial 6 Complete [00h 03m 07s]
val_accuracy: 0.8758000135421753

Best val_accuracy So Far: 0.8880000114440918
Total elapsed time: 00h 14m 10s
INFO:tensorflow:Oracle triggered exit
tuner.results_summary()
Results summary
Results in tuner1/Clothing
Showing 10 best trials
Objective(name='val_accuracy', direction='max')
Trial summary
Hyperparameters:
num_of_neurons: 384
learning_rate: 0.001
Score: 0.8880000114440918
Trial summary
Hyperparameters:
num_of_neurons: 224
learning_rate: 0.001
Score: 0.8853999773661295
Trial summary
Hyperparameters:
num_of_neurons: 352
learning_rate: 0.0001
Score: 0.8758000135421753
Trial summary
Hyperparameters:
num_of_neurons: 224
learning_rate: 0.0001
Score: 0.8705000082651774
Trial summary
Hyperparameters:
num_of_neurons: 96
learning_rate: 0.01
Score: 0.8511333266894022
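One way to act on these results (a minimal sketch using KerasTuner's convenience methods, not run in the original notebook) is to pull the best hyperparameters and the corresponding trained model back out of the tuner:

# Optional: retrieve the best hyperparameters and the best trained model from the search
best_hp = tuner.get_best_hyperparameters(num_trials=1)[0]
print(best_hp.get('num_of_neurons'), best_hp.get('learning_rate'))

best_model = tuner.get_best_models(num_models=1)[0]
best_model.evaluate(X_test, y_test)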
'''
Now let's tune one more hyperparameter:

this time we also provide a range for the number of hidden layers
used in the model, between 2 and 20.
'''
def build_model(hp):                 # hp is the HyperParameters object supplied by the tuner
    model = Sequential()
    model.add(Flatten(input_shape=(28, 28)))

    # Search over the number of hidden layers (between 2 and 20)
    for i in range(hp.Int('num_of_layers', 2, 20)):
        # Search over the number of neurons in each hidden layer
        model.add(Dense(units=hp.Int('num_of_neurons' + str(i), min_value=32, max_value=512, step=32),
                        activation='relu'))

    model.add(Dense(10, activation='softmax'))    # Output layer

    # Compile the model, searching over the learning rate as well
    model.compile(optimizer=keras.optimizers.Adam(hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])),
                  loss='sparse_categorical_crossentropy', metrics=['accuracy'])
    return model
tuner=RandomSearch(build_model,
                  objective = 'val_accuracy',
                  max_trials = 5,
                  executions_per_trial = 3,
                  directory = 'project',
                  project_name = 'Clothing')
# The search space now covers the number of layers, the neurons in each layer, and the learning rate
# (the summary below lists 4 entries because the neuron counts of the two default layers are registered separately)

tuner.search_space_summary()
Search space summary
Default search space size: 4
num_of_layers (Int)
{'default': None, 'conditions': [], 'min_value': 2, 'max_value': 20, 'step': 1, 'sampling': None}
num_of_neurons0 (Int)
{'default': None, 'conditions': [], 'min_value': 32, 'max_value': 512, 'step': 32, 'sampling': None}
num_of_neurons1 (Int)
{'default': None, 'conditions': [], 'min_value': 32, 'max_value': 512, 'step': 32, 'sampling': None}
learning_rate (Choice)
{'default': 0.01, 'conditions': [], 'values': [0.01, 0.001, 0.0001], 'ordered': True}
tuner.search(X_train, y_train, epochs = 10, validation_data = (X_test, y_test))
Trial 5 Complete [00h 03m 40s]
val_accuracy: 0.7541333436965942

Best val_accuracy So Far: 0.887499988079071
Total elapsed time: 00h 15m 49s
INFO:tensorflow:Oracle triggered exit
tuner.results_summary()
Results summary
Results in project/Clothing
Showing 10 best trials
Objective(name='val_accuracy', direction='max')
Trial summary
Hyperparameters:
num_of_layers: 2
num_of_neurons0: 448
num_of_neurons1: 480
learning_rate: 0.001
Score: 0.887499988079071
Trial summary
Hyperparameters:
num_of_layers: 3
num_of_neurons0: 448
num_of_neurons1: 352
learning_rate: 0.001
num_of_neurons2: 128
num_of_neurons3: 192
num_of_neurons4: 416
num_of_neurons5: 32
num_of_neurons6: 160
num_of_neurons7: 128
num_of_neurons8: 320
Score: 0.8860666751861572
Trial summary
Hyperparameters:
num_of_layers: 2
num_of_neurons0: 128
num_of_neurons1: 160
learning_rate: 0.001
num_of_neurons2: 352
num_of_neurons3: 320
num_of_neurons4: 480
num_of_neurons5: 224
num_of_neurons6: 288
num_of_neurons7: 160
num_of_neurons8: 512
Score: 0.8860333363215128
Trial summary
Hyperparameters:
num_of_layers: 9
num_of_neurons0: 416
num_of_neurons1: 384
learning_rate: 0.0001
num_of_neurons2: 32
num_of_neurons3: 32
num_of_neurons4: 32
num_of_neurons5: 32
num_of_neurons6: 32
num_of_neurons7: 32
num_of_neurons8: 32
Score: 0.8800000150998434
Trial summary
Hyperparameters:
num_of_layers: 7
num_of_neurons0: 416
num_of_neurons1: 320
learning_rate: 0.01
num_of_neurons2: 192
num_of_neurons3: 224
num_of_neurons4: 320
num_of_neurons5: 352
num_of_neurons6: 96
num_of_neurons7: 160
num_of_neurons8: 352
Score: 0.7541333436965942

Summary

Among all the hyperparameter combinations tried, we select the one with the highest validation-accuracy score.
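To act on that selection (a hedged sketch of the usual KerasTuner workflow, not run in the original notebook), the best configuration can be rebuilt and retrained from scratch before a final evaluation:

# Optional: rebuild the best configuration found by the second search and retrain it
best_hp = tuner.get_best_hyperparameters(num_trials=1)[0]
final_model = tuner.hypermodel.build(best_hp)   # equivalent to calling build_model(best_hp)
final_model.fit(X_train, y_train, epochs=10, validation_data=(X_test, y_test))
final_model.evaluate(X_test, y_test)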