
Day 28: Deep Learning with TensorFlow and Keras

Welcome back to the PythonForDevOps series. Today is Day 28, and we're diving into the fascinating world of deep learning with TensorFlow and Keras. If you've been following along, you know we've covered a lot of ground so far, and now it's time to add some neural network magic to our Python arsenal.


Why Deep Learning?

Deep learning is a subset of machine learning that deals with neural networks containing multiple layers. It's the secret sauce behind facial recognition, natural language processing, and even those recommended videos on your favorite streaming platform. Today, we'll focus on implementing deep learning in Python using two powerful libraries: TensorFlow and Keras.


Setting Up the Playground

Before we start coding, let's make sure our environment is set up. Keras ships as part of TensorFlow 2, so a single pip install covers both:

pip install tensorflow
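
To make sure the install worked, a quick version check from Python is enough (the exact version number will depend on what pip picked up):

import tensorflow as tf
print(tf.__version__)  # e.g. 2.x; Keras is available as tf.keras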

Now, let's create a simple neural network. Imagine we want to teach a model to recognize handwritten digits. We'll use the famous MNIST dataset, which contains images of handwritten digits from 0 to 9.


import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.datasets import mnist
import matplotlib.pyplot as plt

# Load and preprocess the data
(train_images, train_labels), (test_images, test_labels) = mnist.load_data()
train_images = train_images.reshape((60000, 28, 28, 1)).astype('float32') / 255
test_images = test_images.reshape((10000, 28, 28, 1)).astype('float32') / 255

# Build the model
model = models.Sequential()
model.add(layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.Flatten())
model.add(layers.Dense(64, activation='relu'))
model.add(layers.Dense(10, activation='softmax'))

# Compile the model
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Train the model
history = model.fit(train_images, train_labels, epochs=5, validation_data=(test_images, test_labels))

In this example, we've built a convolutional neural network (CNN) with three convolutional layers and two dense layers. The model is then compiled with the Adam optimizer and trained on the MNIST dataset for five epochs.
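
If you'd like to sanity-check the architecture before training, Keras can print the full layer stack and parameter counts for you:

model.summary()  # lists each layer's output shape and number of trainable parameters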


Understanding the Code

Data Preparation: We load the MNIST dataset, reshape each image to 28x28x1 so the convolutional layers receive an explicit channel dimension, and scale pixel values from 0-255 down to the range 0 to 1.
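
If you're curious what that preprocessing actually did, a quick inspection of shapes and value ranges makes it concrete (a small sketch using the arrays loaded above):

print(train_images.shape)                      # (60000, 28, 28, 1) after the reshape
print(train_images.min(), train_images.max())  # 0.0 1.0 after scaling
print(train_labels[:5])                        # integer labels, e.g. [5 0 4 1 9]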

Model Architecture: The neural network architecture is defined using the Sequential API from Keras. Convolutional layers followed by max-pooling layers extract features from the images, and the Flatten and Dense layers at the end turn those features into a prediction over the ten digit classes.

Compilation: We compile the model, specifying the optimizer, loss function, and metrics. Here, we use the Adam optimizer and sparse categorical crossentropy, which works directly with integer labels like MNIST's 0-9.
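
The string 'adam' uses the optimizer's default settings. If you want explicit control over the learning rate, you can pass an optimizer object instead; the 0.001 below is simply Adam's usual default, shown for illustration:

model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])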

Training: The model is trained on the training data, and we validate its performance on the test set.
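
A side note: for simplicity this example reuses the test set as validation data. In a real project you would usually hold out a slice of the training data instead and keep the test set untouched until the very end. A minimal sketch of that variant:

history = model.fit(train_images, train_labels,
                    epochs=5,
                    validation_split=0.1)  # hold out 10% of the training data for validation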


Evaluating the Model

Let's visualize the training process and evaluate our model.

# Plotting the training history
acc = history.history['accuracy']
val_acc = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']

epochs = range(1, len(acc) + 1)

plt.plot(epochs, acc, 'bo', label='Training acc')
plt.plot(epochs, val_acc, 'b', label='Validation acc')
plt.title('Training and validation accuracy')
plt.legend()

plt.figure()

plt.plot(epochs, loss, 'bo', label='Training loss')
plt.plot(epochs, val_loss, 'b', label='Validation loss')
plt.title('Training and validation loss')
plt.legend()

plt.show()

These plots help us understand how well our model is learning. Ideally, accuracy rises and loss falls over the epochs, with the validation curves tracking the training curves closely; a widening gap between them is a classic sign of overfitting.
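
To put a single number on the final model, you can also evaluate it on the test set and try a prediction. The exact accuracy varies from run to run, but a small CNN like this typically reaches the high nineties on MNIST after five epochs:

import numpy as np

# Overall score on the held-out test set
test_loss, test_acc = model.evaluate(test_images, test_labels)
print('Test accuracy:', test_acc)

# Predict the class of the first test image
predictions = model.predict(test_images[:1])
print('Predicted digit:', np.argmax(predictions[0]), '- actual label:', test_labels[0])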


Today, we've scratched the surface of deep learning with TensorFlow and Keras. We built a simple image classification model, but the possibilities with deep learning are vast. As you continue your PythonForDevOps journey, remember that deep learning is a powerful tool in your toolkit, capable of solving complex problems in various domains.


Over the coming days, we'll explore more applications of Python in the world of DevOps.


Thank you for reading!


*** Explore | Share | Grow ***
