
Tag: tensorflow-datasets

Training a single model jointly over multiple datasets in tensorflow

I want to train a single variational autoencoder model (or even a standard autoencoder) jointly over many datasets (e.g. MNIST, CIFAR, SVHN, etc., where all the images in the datasets are resized to the same input shape). Here is the VAE tutorial in TensorFlow which I am using as a starting point: For training the model, I would
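The question's own code is not shown, but one common way to train a single autoencoder over several image datasets is to resize every source to a shared shape and then mix them with `tf.data.Dataset.sample_from_datasets` (available as `tf.data.experimental.sample_from_datasets` in older TF 2.x releases). The sketch below uses small random in-memory tensors as stand-ins for the real datasets (e.g. `tfds.load("mnist")`), purely as an assumption for illustration:

```python
import tensorflow as tf

IMG_SIZE = 32  # assumed common input shape for all datasets

# Stand-ins for the real datasets; each yields images of a different
# native shape, like MNIST (28x28x1) and CIFAR (32x32x3).
mnist_like = tf.data.Dataset.from_tensor_slices(
    tf.random.uniform((8, 28, 28, 1)))
cifar_like = tf.data.Dataset.from_tensor_slices(
    tf.random.uniform((8, 32, 32, 3)))

def to_common_shape(image):
    # Resize to the shared spatial size.
    image = tf.image.resize(image, (IMG_SIZE, IMG_SIZE))
    if image.shape[-1] == 1:  # grayscale -> RGB so channel counts match
        image = tf.image.grayscale_to_rgb(image)
    return image

# Map each source to the common shape, then sample uniformly so every
# batch mixes examples from all datasets.
joint = tf.data.Dataset.sample_from_datasets(
    [mnist_like.map(to_common_shape), cifar_like.map(to_common_shape)])

# For a (variational) autoencoder the target is the input itself.
joint = joint.map(lambda x: (x, x)).batch(4)
# model.fit(joint, ...)
```

With real data you would replace the stand-in datasets with the resized MNIST/CIFAR/SVHN pipelines; the mixing and batching logic stays the same.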

Tensorflow 2.3, Tensorflow dataset, TypeError: () takes 1 positional argument but 4 were given

I read 4 large files and zip them to create "dataset". However, I cannot pass "dataset" to a map function that uses tf.compat.v1.string_split to split on the \t separator, and then apply batch and prefetch before feeding it into my model. This is my code: This is the error message: What should I do? Answer
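The original code and answer are not shown, but the error itself is characteristic: after zipping four datasets, each element is a 4-tuple, and `Dataset.map` unpacks that tuple into four positional arguments, so a one-argument lambda raises "takes 1 positional argument but 4 were given". A minimal sketch of the fix, using small in-memory line datasets as assumed stand-ins for the four files (e.g. `tf.data.TextLineDataset`), and the TF2 `tf.strings.split` in place of `tf.compat.v1.string_split`:

```python
import tensorflow as tf

# Stand-ins for four large tab-separated files, two lines each.
lines = [
    tf.data.Dataset.from_tensor_slices([f"a{i}\tb{i}", f"c{i}\td{i}"])
    for i in range(4)
]
dataset = tf.data.Dataset.zip(tuple(lines))

# Each zipped element is a 4-tuple, so the mapped function must accept
# FOUR arguments; a one-argument lambda triggers the TypeError.
dataset = dataset.map(
    lambda l1, l2, l3, l4: tuple(
        tf.strings.split(l, sep="\t") for l in (l1, l2, l3, l4)))

dataset = dataset.batch(2).prefetch(tf.data.AUTOTUNE)
```

Alternatively, `lambda *parts: ...` accepts any number of zipped components without naming them.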

Split .tfrecords file into many .tfrecords files

Is there any way to split a .tfrecords file into many .tfrecords files directly, without writing back each Dataset example? Answer You can use a function like this: For example, to split the file my_records.tfrecord into parts of 100 records each, you would do: This would create multiple smaller record files my_records.tfrecord.000, my_records.tfrecord.001, etc.
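The answer's function is not reproduced in the excerpt, but one way to get the described behavior (shards named `my_records.tfrecord.000`, `.001`, etc., without parsing the examples) is to read the serialized records with `TFRecordDataset`, group them with `Dataset.window`, and write each window out with `tf.data.experimental.TFRecordWriter`, which can write a whole dataset of raw records at once. A sketch under those assumptions:

```python
import tensorflow as tf

def split_tfrecord(tfrecord_path, records_per_file):
    """Split one .tfrecord file into numbered shards of fixed size."""
    raw = tf.data.TFRecordDataset(tfrecord_path)
    for part, window in enumerate(raw.window(records_per_file)):
        # Each window is itself a small dataset of serialized records;
        # TFRecordWriter writes it out without decoding the examples.
        shard_path = f"{tfrecord_path}.{part:03d}"
        tf.data.experimental.TFRecordWriter(shard_path).write(window)

# e.g. split_tfrecord("my_records.tfrecord", 100)
```

The last shard simply holds the remainder when the record count is not a multiple of `records_per_file`.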

How to use for reshaping the dataset

I am working with time series models in tensorflow. My dataset contains physics signals. I need to divide these signals into windows and give the sliced windows as input to my model. Here is how I am reading the data and slicing it: I want to reshape this dataset to # {'mix': TensorShape([Dimension(32)]), 'pure': TensorShape([Dimension(32)])} Equivalent transformation in numpy would
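The reading code is not shown, but the target element spec ({'mix': (32,), 'pure': (32,)}) suggests windowing a dict-structured signal dataset. One sketch, assuming the signals arrive as a `tf.data.Dataset` of scalar dicts: `window()` groups consecutive elements into sub-datasets, and `flat_map` with `zip` + `batch` turns each group back into dense tensors of shape `(32,)`:

```python
import tensorflow as tf

WINDOW = 32

# Stand-in for the question's dataset: two long 1-D physics signals,
# yielded element-by-element as {'mix': scalar, 'pure': scalar}.
signal = tf.data.Dataset.from_tensor_slices(
    {"mix": tf.range(100, dtype=tf.float32),
     "pure": tf.range(100, dtype=tf.float32)})

# window() yields, for each slice, a dict of small sub-datasets;
# zip + batch collapses each one into {'mix': (32,), 'pure': (32,)}.
windows = signal.window(WINDOW, shift=WINDOW, drop_remainder=True)
windows = windows.flat_map(
    lambda d: tf.data.Dataset.zip(d).batch(WINDOW))
```

With `shift=WINDOW` the windows are non-overlapping; a smaller `shift` would produce overlapping slices instead.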