Knowing what the softmax function in TensorFlow is and how it works is essential for any data scientist who wants to become an expert in the world of Deep Learning. If you're interested, this post is for you!
In this article we will explain everything related to the softmax function in TensorFlow, both in theory and in practice, so that you can implement it effectively.
softmax function in TensorFlow
The softmax function in TensorFlow is presented as an alternative to the typical sigmoid activation, G = sigmoid(). In short, applying softmax to the processed data involves:
Converting the processed values into probabilities, such that the probabilities sum to 1.
Besides, the softmax formula is the following: softmax(z_i) = exp(z_i) / Σ_j exp(z_j), applied to each component z_i of the score vector z.
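As an illustration, the formula above can be sketched in plain NumPy (a hypothetical helper for this post, not part of TensorFlow itself); subtracting the maximum before exponentiating is a standard trick for numerical stability:

```python
import numpy as np

def softmax(z):
    """Convert a vector of scores into probabilities that sum to 1."""
    # Subtracting the max does not change the result but avoids overflow
    exp_z = np.exp(z - np.max(z))
    return exp_z / exp_z.sum()

scores = np.array([2.0, 1.0, 0.1])
probs = softmax(scores)
print(probs)        # the largest score gets the largest probability
print(probs.sum())  # the probabilities sum to 1
```

Note how the outputs behave exactly as described above: each entry is a probability, and together they sum to 1.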
Example of the softmax function in TensorFlow
# We also need a placeholder for the image tag, with which
# we will compare our prediction
y_true = tf.placeholder(tf.float32, [None, n_output])
# We define our loss function: the cross entropy
cross_entropy = -tf.reduce_sum(y_true * tf.log(net_output))
# We check if our prediction is equal to the label
idx_prediction = tf.argmax(net_output, 1)
idx_label = tf.argmax(y_true, 1)
correct_prediction = tf.equal(idx_prediction, idx_label)
# We define our accuracy measure as the number of correct predictions
# relative to the number of samples evaluated
accuracy = tf.reduce_mean(tf.cast(correct_prediction, "float"))
# Now we indicate that we want to minimize our loss function (the cross
# entropy) using the gradient descent algorithm with a learning
# rate of 0.01
optimizer = tf.train.GradientDescentOptimizer(0.01).minimize(cross_entropy)
# Now we have everything ready to execute the graph.
# This import lets us keep overwriting the same output line
from IPython.display import clear_output

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    # Now we train our regressor
    batch_size = 10  # we feed the images in batches of 10
    for sample_i in range(mnist.train.num_examples):
        sample_x, sample_y = mnist.train.next_batch(batch_size)
        sess.run(optimizer, feed_dict={net_input: sample_x, y_true: sample_y})

        # We check how our regressor is doing on the validation set
        if sample_i < 50 or sample_i % 200 == 0:
            val_acc = sess.run(accuracy, feed_dict={net_input: mnist.validation.images, y_true: mnist.validation.labels})
            print("({}/{}) Acc: {}".format(sample_i, mnist.train.num_examples, val_acc))

    # Once all the samples in the training set have been seen,
    # show the final accuracy on the test set
    print('Test accuracy: ', sess.run(accuracy, feed_dict={net_input: mnist.test.images, y_true: mnist.test.labels}))
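To make the two metrics in the graph above more concrete, here is a rough NumPy sketch of the same cross-entropy and accuracy computations (the data and variable names are illustrative, not the TensorFlow API):

```python
import numpy as np

# A toy batch: 3 samples, 4 classes, one-hot labels
y_true = np.array([[0, 0, 1, 0],
                   [1, 0, 0, 0],
                   [0, 1, 0, 0]], dtype=float)
# Predicted probability distributions (each row sums to 1)
net_output = np.array([[0.1, 0.1, 0.7, 0.1],   # predicts class 2 (correct)
                       [0.6, 0.2, 0.1, 0.1],   # predicts class 0 (correct)
                       [0.3, 0.2, 0.4, 0.1]])  # predicts class 2, label is 1 (wrong)

# Cross entropy: minus the summed log-probability assigned to the true classes
cross_entropy = -np.sum(y_true * np.log(net_output))

# Accuracy: fraction of samples whose predicted class matches the label
correct = np.argmax(net_output, axis=1) == np.argmax(y_true, axis=1)
accuracy = correct.mean()
print(cross_entropy, accuracy)  # accuracy is 2/3 on this toy batch
```

The better the predicted probability for the true class, the smaller the cross entropy, which is exactly what gradient descent pushes toward.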
With this, you have trained your first neural network with TensorFlow!
In this process we have implemented a logistic regression with the formula y = G(Wx + b), where G = softmax() instead of the typical G = sigmoid(). If you look at the classic diagram of the perceptron (a single-layer neural network), you can see that output = Activation_function(Wx).
Do you see it? Only the bias seems to be missing! But have you noticed that there is a 1 at the input? The weight w0 is therefore not multiplied by any feature: in effect, w0 is the bias, written with this notation simply so that the whole operation can be implemented as a single matrix multiplication.
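This bias trick can be checked numerically. A minimal sketch (illustrative names, using a sigmoid activation as in the formula above):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Explicit form: y = G(Wx + b)
W = np.array([[0.5, -0.2],
              [0.1,  0.8]])
b = np.array([0.3, -0.1])
x = np.array([1.5, 2.0])
y_explicit = sigmoid(W @ x + b)

# Augmented form: prepend a 1 to the input and absorb b as the first
# column of the weight matrix (the "w0" of the perceptron diagram)
W_aug = np.hstack([b.reshape(-1, 1), W])
x_aug = np.concatenate([[1.0], x])
y_augmented = sigmoid(W_aug @ x_aug)

print(np.allclose(y_explicit, y_augmented))  # True: both forms match
```

Because the constant 1 multiplies w0, the bias term rides along inside the matrix product, which is why the perceptron diagram can omit a separate b.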
In short, what we have just implemented is a perceptron with batch_size = 10, 1 epoch, gradient descent as the optimizer and the softmax function in TensorFlow as the activation function.
Learn more about Big Data
Throughout this post you have become familiar with the softmax function in TensorFlow. However, keep in mind that it is just one of the alternatives you can consider when processing big data with this kind of Deep Learning framework.
We know that teaching yourself can be complicated and tedious, which is why we present the Full Stack Big Data, Artificial Intelligence & Machine Learning Bootcamp. Through this comprehensive and intensive training, you will gain all the knowledge you need to become an expert in the ecosystem of Big Data systems, languages and tools, while putting it into practice guided by a large group of professionals. Don't wait any longer to sign up!