Hi Devz!
This is a post on using the knowledge gained from part 1 of the Deep Learning with Python book to classify Fashion MNIST images, a dataset of 60,000 training images (plus 10,000 test images).
It is really an extension of the MNIST digit recognizer, applied to identifying different kinds of wearables: coats, shirts, shoes, etc.
As you can see from the heading, I didn't use the deeper part of DL, convolutional neural networks; to be frank, I just brute-forced several combinations until I got the best accuracy on the test data.
I tried combinations of the following and kept the one with the highest test accuracy:
- Multiple numbers of hidden layers
- Multiple unit combinations in those hidden layers
- Multiple unit combinations in the first layer
The best model had 1 hidden layer with 128 units, 512 units in the first (outer) layer, and a test accuracy of 89.04%.
This is not how it should be done, though, since resources are limited in the real world when working on real, hard ML problems!
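For reference, a minimal sketch of just that winning architecture on its own, assuming the same flattened, rescaled 784-dimensional inputs that the full script below prepares:

from keras import models, layers

# best configuration found by the search: 512-unit first layer,
# one 128-unit hidden layer, 10-way softmax output
model = models.Sequential()
model.add(layers.Dense(512, activation='relu', input_shape=(784,)))
model.add(layers.Dense(128, activation='relu'))
model.add(layers.Dense(10, activation='softmax'))
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])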
Code
from keras import models
from keras import layers
from keras.datasets import fashion_mnist
import numpy as np
import matplotlib.pyplot as plt
from keras.utils import to_categorical
import operator
# load the dataset: 60,000 training and 10,000 test images
(train_x, train_y), (test_x, test_y) = fashion_mnist.load_data()
def shape(arr):
    # flatten each 28x28 image into a 784-length vector
    return np.reshape(arr, (len(arr), 784))
# epochs
e = 40
# evaluation results and predictions for each model, keyed by description
eval_arr = {}
pred_arr = {}
# label summary
label_list = {0: 'T-shirt/top', 1: 'Trouser', 2: 'Pullover', 3: 'Dress', 4: 'Coat', 5: 'Sandal', 6: 'Shirt', 7: 'Sneaker', 8: 'Bag', 9: 'Ankle boot'}
#path
output_dir_model = r'C:\Users\karan.verma\.spyder-py3\deep-learning\models'
# scale pixel values to [0, 1] and one-hot encode the labels
train_x_reshaped = shape(train_x).astype('float32')/255
train_y = to_categorical(train_y).astype('float32')
test_x_reshaped = shape(test_x).astype('float32')/255
test_y = to_categorical(test_y).astype('float32')
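# note (added for clarity): after this preprocessing, train_x_reshaped has
# shape (60000, 784) with values in [0, 1] and train_y has shape (60000, 10)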
#take out validation data, 25% = 15,000 samples
partial_train_x = train_x_reshaped[:int(len(train_x_reshaped)*0.75)]
partial_train_y = train_y[:int(len(train_y)*0.75)]
val_x = train_x_reshaped[int(len(train_x_reshaped)*0.75):]
val_y = train_y[int(len(train_y)*0.75):]
# run the model with several different parameters; the full search grids are
# commented out, keeping only the best combination found
#lyrs = [1, 2]
#units_out = [512, 256, 128, 64, 32, 16]
#units_in = [128, 64, 32, 16]
lyrs = [1]
units_out = [512]
units_in = [128]
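# note (added for clarity): the full grids above would train
# len(lyrs) * len(units_out) * len(units_in) = 2 * 6 * 4 = 48 models,
# each for 40 epochs -- hence the brute-force caveat in the intro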
# build, train, and evaluate one model per parameter combination
for lyr in lyrs:
    for unit_out in units_out:
        for unit_in in units_in:
            # one description string reused for printing, filenames, and dict keys
            desc = 'Model having %d hidden layers & %d units in each hidden layer and %d units in the outer layer' % (lyr, unit_in, unit_out)
            print('Running ' + desc)
            model = models.Sequential()
            model.add(layers.Dense(unit_out, activation='relu', input_shape=(784,)))
            for i in range(lyr):
                model.add(layers.Dense(unit_in, activation='relu'))
            # softmax (not sigmoid) so the 10 class scores form a probability distribution
            model.add(layers.Dense(10, activation='softmax'))
            # compile model
            model.compile(optimizer='rmsprop',
                          loss='categorical_crossentropy',
                          metrics=['accuracy'])
            # model fit
            history = model.fit(partial_train_x,
                                partial_train_y,
                                epochs=e,
                                batch_size=512,
                                validation_data=(val_x, val_y),
                                verbose=1)
            acc_list = history.history
            # plot training curves: loss on top, accuracy below
            fig, ax = plt.subplots(2, 1, figsize=(20, 10))
            plt.subplot(211)
            plt.plot(np.arange(1, e+1), acc_list['loss'], label='Training Loss')
            plt.plot(np.arange(1, e+1), acc_list['val_loss'], label='Validation Loss')
            plt.title('Loss Graph')
            plt.xlabel('Epochs')
            plt.ylabel('Loss')
            plt.legend()
            plt.subplot(212)
            # older Keras logs accuracy as 'acc'/'val_acc'; newer versions use 'accuracy'/'val_accuracy'
            plt.plot(np.arange(1, e+1), acc_list['acc'], label='Training Acc')
            plt.plot(np.arange(1, e+1), acc_list['val_acc'], label='Validation Acc')
            plt.title('Accuracy Graph')
            plt.xlabel('Epochs')
            plt.ylabel('Accuracy')
            plt.legend()
            #plt.tight_layout()
            fig.savefig('{}/{}.png'.format(output_dir_model, desc))
            plt.clf()
            # evaluate() returns [test loss, test accuracy]
            eval_arr[desc] = model.evaluate(test_x_reshaped, test_y)
            pred_arr[desc] = np.argmax(model.predict(test_x_reshaped), axis=1)
# get the best model from eval_arr (index 1 of evaluate()'s output is accuracy)
best_model = {}
for key, val in eval_arr.items():
    best_model[key] = val[1]
print('The best model is: {}'.format(max(best_model.items(), key=operator.itemgetter(1))))
# Building Prediction Pipeline
# best run here: ('Model having 1 hidden layers & 128 units in each hidden layer and 512 units in the outer layer', 0.8953), but it varies between runs
prediction = model.predict(test_x_reshaped)
print('Random samples from the test data: ')
# show six random test images with their true and predicted labels
fig, ax = plt.subplots(2, 3, figsize=(20, 10))
for i in range(1, 7):
    c = np.random.choice(len(test_x_reshaped))
    plt.subplot(2, 3, i)
    plt.imshow(test_x[c])
    plt.title('Original {} & Predicted {}'.format(label_list[np.argmax(test_y[c])], label_list[np.argmax(prediction[c])]))
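As a quick sanity check, a single test image can also be pushed through the trained model; here is a minimal sketch reusing the model, test_x_reshaped, test_y, and label_list defined above (index 0 is an arbitrary choice):

# predict() expects a batch, so keep the extra dimension with a slice
probs = model.predict(test_x_reshaped[0:1])[0]  # 10 class probabilities
print('Predicted:', label_list[int(np.argmax(probs))])
print('Actual:', label_list[int(np.argmax(test_y[0]))])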
Output Graph
Key Takeaways
- If you're working on an ML problem, spend time understanding the graphs; otherwise they won't make any sense.
- Try problems outside of the book/tutorial, etc.
- Implement stuff on your own without any guidance; it will make your brain hurt, but you'll learn.
- Get stuck multiple times, Google it, make it work, and move forward to section 2 of the book!
Link to code.
Edit: I always forget, but all questions, comments, and critique are wholeheartedly welcome!
Enjoy your day!