# Tag: tensorflow2.0

## Set random labels for images in tf.data.Dataset

I have a tf.data.Dataset of images with a signature as seen below: all the labels in this dataset are 0. What I would like to do is change each of these labels to a random number from 0 to 3. My code is: This, however, just assigns 1 to all images as a label. The strange…
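One common cause of this symptom is drawing the random number with Python's `random` module, which is evaluated once at trace time. A minimal sketch of the per-element fix, using `tf.random.uniform` inside `map` (the dataset shapes here are illustrative, not the asker's):

```python
import tensorflow as tf

# Hypothetical stand-in dataset: 8 dummy "images", all labeled 0.
images = tf.zeros([8, 32, 32, 3])
labels = tf.zeros([8], dtype=tf.int64)
dataset = tf.data.Dataset.from_tensor_slices((images, labels))

# Replace each label with a fresh random integer in [0, 3].
# tf.random.uniform with maxval=4 draws from {0, 1, 2, 3} and is
# re-executed per element; a Python-level random call would be
# traced once and give every element the same label.
dataset = dataset.map(
    lambda image, label: (
        image,
        tf.random.uniform([], minval=0, maxval=4, dtype=tf.int64),
    )
)
```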

## Element-wise multiplication of two lists that are tf.Tensor in TensorFlow

What is the fastest way to do an element-wise multiplication between a tensor and an array in TensorFlow 2? For example, if the tensor T (of type tf.Tensor) is: and we have an array a (of type np.array): I want to have: as output. Answer This is called the outer product of two tensors. It’s easy to compute by taking…
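A minimal sketch of the outer product the answer describes, with small illustrative values in place of the asker's T and a: inserting a new axis lets broadcasting pair every element of T with every element of a.

```python
import numpy as np
import tensorflow as tf

# Illustrative inputs: T is a rank-1 tensor, a is a 1-D NumPy array.
T = tf.constant([1.0, 2.0, 3.0])
a = np.array([10.0, 20.0], dtype=np.float32)

# Outer product via broadcasting: (3, 1) * (2,) -> (3, 2).
outer = T[:, tf.newaxis] * a

# Equivalent: tf.tensordot(T, tf.constant(a), axes=0)
```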

## Tensorflow dataset, how to concatenate/repeat data within each batch?

If I have the following dataset: dataset = tf.data.Dataset.from_tensor_slices([1, 2, 3, 4, 5, 6]) When I use a batch_size=2, I would get [[1,2], [3,4], [5,6]]. However, I would like to get the following output: [[1,2,1,2], [3,4,3,4], [5,6,5,6]] Basically, I want to repeat the batch dimension by 2x and use this as a new batch. Obviously, this is a toy example.

## Tensorflow: `tf.reshape((), (0))` works fine in eager mode but ValueError in Graph mode

As the title says, tf.reshape((), (0)) works perfectly fine in eager mode. But when I use it in graph mode, it returns: ValueError: Shape must be rank 1 but is rank 0 for ‘{{node Reshape}} = Reshape[T=DT_FLOAT, Tshape=DT_INT32](Reshape/tensor, Reshape/shape)’ with input shapes: [0], []. Can anyone help me with a workaround for this function, please? You can reproduce this…
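The error message points at the fix: in graph mode the shape argument must be rank 1, and the scalar `(0)` is just the integer 0 (rank 0). Passing `(0,)` or `[0]` works in both modes. A small sketch:

```python
import tensorflow as tf

# (0) is a scalar, so graph mode rejects it as a shape;
# (0,) is a rank-1 shape and works in eager and graph mode alike.
@tf.function
def make_empty():
    return tf.reshape((), (0,))
```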

## Stacking an LSTM layer on top of a BERT encoder in Keras

I have been trying to stack a single LSTM layer on top of BERT embeddings, but while my model starts to train, it fails on the last batch and throws the following error message: This is how I build the model, and I honestly cannot figure out what is going wrong here: This is the full output: The code runs…

## Tensorflow Keras Tensor Multiplication with None as First Dimension

I’m using the TensorFlow Keras backend and I have two tensors a, b of the same shape (None, 4, 7), where None represents the batch dimension. I want to do matrix multiplication, and I’m expecting a result of shape (None, 4, 4), i.e. for each batch, do one matmul: (4,7)·(7,4) = (4,4). Here’s my code — This code gives a tensor of…
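A minimal sketch of the batched matmul described above: `tf.matmul` already broadcasts over leading batch dimensions, and `transpose_b=True` turns each (4, 7) × (4, 7) pair into (4, 7) × (7, 4). A concrete batch size of 2 stands in for None here:

```python
import tensorflow as tf

# a, b have shape (batch, 4, 7); batch=2 stands in for None.
a = tf.random.normal([2, 4, 7])
b = tf.random.normal([2, 4, 7])

# One (4, 7) x (7, 4) matmul per batch element -> shape (2, 4, 4).
c = tf.matmul(a, b, transpose_b=True)
```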

## (Tensorflow) Stuck at Epoch 1 during model.fit()

I’ve been trying to make TensorFlow 2.8.0 work with my Windows GPU (GeForce GTX 1650 Ti). Even though it detects my GPU, any model that I make gets stuck at Epoch 1 indefinitely when I call the fit method, until the kernel hangs and restarts (I’ve tried Jupyter Notebook and Spyder). Based on TensorFlow’s website,…

## Tensorflow: Incompatible shapes: [1,2] vs. [1,4,4,2048]

I have the following TensorFlow model: I have simplified it somewhat in an attempt to narrow down the problem. When I run it I get the following error: This error always seems to occur on a different input image. All my images are exactly the same dimensions. I am using TensorFlow 2.4.1. What am I missing? Answer The ResNet50 model…
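The shapes in the error suggest the `include_top=False` ResNet50 backbone is feeding its raw (None, 4, 4, 2048) feature map to a 2-class head, which cannot be compared with (None, 2) labels. A hedged sketch of one fix (the input size and class count here are assumptions, not the asker's exact model): let the backbone pool its features before the Dense layer.

```python
import tensorflow as tf

# Sketch: pooling="avg" collapses the (None, 4, 4, 2048) feature map
# to (None, 2048), so the Dense head produces (None, 2) as expected.
# weights=None avoids a download; the asker likely uses "imagenet".
base = tf.keras.applications.ResNet50(
    include_top=False, weights=None,
    input_shape=(128, 128, 3), pooling="avg")
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(2, activation="softmax"),
])
```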

## Tensorflow Lite, Image size is zero error

Actually, my question is very simple. I would like to use my own data in a TensorFlow Lite model, so I wrote these lines of code: Also, this is the error that I encountered: Answer This happens when the DataLoader cannot infer the labels of your images. The images should be divided into subfolders according to the class they belong to:
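A layout like the following (the class and file names are hypothetical) lets the Model Maker `DataLoader.from_folder` infer one label per subfolder:

```
dataset/
├── cats/
│   ├── cat001.jpg
│   └── cat002.jpg
└── dogs/
    ├── dog001.jpg
    └── dog002.jpg
```

With this structure, `from_folder("dataset")` assigns the label `cats` or `dogs` from the directory name, which resolves the zero-image error.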

## Compute gradients across two models

Let’s assume that we are building a basic CNN that recognizes pictures of cats and dogs (a binary classifier). An example of such a CNN is as follows: Let’s also assume that we want to split the model into two parts, or two models, called model_0 and model_1. model_0 will handle the input, and model_1 will take model_0’s output and…
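A minimal sketch of training two chained models end-to-end (dense layers and shapes are illustrative stand-ins for the CNN halves): running both forward passes under a single `GradientTape` lets gradients flow across the boundary between the models.

```python
import tensorflow as tf

# Illustrative halves of a network; the original uses CNN layers.
model_0 = tf.keras.Sequential([tf.keras.layers.Dense(8, activation="relu")])
model_1 = tf.keras.Sequential([tf.keras.layers.Dense(1, activation="sigmoid")])

optimizer = tf.keras.optimizers.Adam()
loss_fn = tf.keras.losses.BinaryCrossentropy()

x = tf.random.normal([16, 4])
y = tf.cast(tf.random.uniform([16, 1]) > 0.5, tf.float32)

with tf.GradientTape() as tape:
    hidden = model_0(x, training=True)       # first half
    preds = model_1(hidden, training=True)   # second half
    loss = loss_fn(y, preds)

# One tape saw both forward passes, so a single gradient call
# covers the trainable variables of both models.
variables = model_0.trainable_variables + model_1.trainable_variables
grads = tape.gradient(loss, variables)
optimizer.apply_gradients(zip(grads, variables))
```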