Tag: neural-network
simple Neural Network gives random prediction result
I have been trying to build a simple neural network myself (3 layers) to predict on the MNIST dataset. I referenced some code online and wrote some parts on my own. The code runs without any errors, but something is wrong with the learning process: the prediction results all seem "random". Applying the learning process to the network and
Mask layer is not working with MLPs, how to add a custom layer with masking?
I’m using MLPs to forecast a time series, and I implemented code that contains a mask layer to let the model skip the masked values. For instance, my time series has a lot of NaN values, which I fill with the value -999. I don’t want to remove them, but I want the Keras masking to
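A minimal sketch of that masking setup, assuming the series is windowed into fixed-length inputs (the window length, layer sizes, and random data are placeholders). Note that Keras's Masking layer only propagates its mask to mask-consuming layers such as RNNs; plain Dense/MLP layers ignore it, which is why the sketch swaps in an LSTM rather than a Dense stack:

import numpy as np
import tensorflow as tf

# Hypothetical windowed time-series input: (samples, timesteps, features)
X = np.random.rand(32, 10, 1).astype("float32")
X[X < 0.1] = -999.0          # gaps filled with the sentinel value
y = np.random.rand(32, 1).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Masking(mask_value=-999.0, input_shape=(10, 1)),  # marks timesteps equal to -999
    tf.keras.layers.LSTM(16),   # RNN layers consume the mask; Dense layers would silently ignore it
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, verbose=0)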
Adam Optimizer Not Working on cost function
I wanted to build my own neural network for the MNIST dataset using TensorFlow. I imported the libraries and the dataset, did one-hot encoding, then assigned the weights and biases, performed forward propagation with random initial values, and for backpropagation and cost minimization used a loss function
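A minimal sketch of that pipeline with the low-level TF2 API, assuming a single linear layer and a softmax cross-entropy cost (the batch size, learning rate, and step count are placeholders):

import tensorflow as tf

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
y_train = tf.one_hot(y_train, 10)                            # one-hot encoding

W = tf.Variable(tf.random.normal([784, 10], stddev=0.1))     # weights
b = tf.Variable(tf.zeros([10]))                              # biases
opt = tf.keras.optimizers.Adam(learning_rate=0.001)

for step in range(100):                                      # a few steps on one batch, for illustration
    with tf.GradientTape() as tape:
        logits = tf.matmul(x_train[:128], W) + b             # forward propagation
        loss = tf.reduce_mean(
            tf.nn.softmax_cross_entropy_with_logits(labels=y_train[:128], logits=logits))
    grads = tape.gradient(loss, [W, b])
    opt.apply_gradients(zip(grads, [W, b]))                  # Adam minimizes the cost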
Tensorflow ValueError: Shapes (64, 1) and (1, 1) are incompatible
I’m trying to build a Siamese neural network to analyze the MNIST dataset; however, when fitting the model to the dataset I encounter this error, which indicates a mismatch between the shapes of my training data and labels. I tried changing the loss function and squeezing the labels array, but neither "solution" worked. Here are
AttributeError: module ‘keras.api._v2.keras.utils’ has no attribute ‘Sequential’. I have just started with neural networks, so any help would be appreciated
Answer: You should be using tf.keras.Sequential() or tf.keras.models.Sequential(). Also, you need to define a valid loss function. Here is a working example:
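A minimal sketch of such a model on MNIST; the layer sizes and the sparse categorical cross-entropy loss shown here are just one valid choice:

import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",   # a valid loss for integer labels
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, validation_data=(x_test, y_test))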
Difference between the calculation of the training loss and validation loss using pytorch
I want to use the following code from a traditional image classification problem for my regression problem. The code can be found here: GeeksforGeeks - Training Neural Networks with Validation using Pytorch. I can understand why the training loss is summed up and then divided by the length of the training data in this example, but I can’t see why the validation loss
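A minimal sketch of the two loops in question, assuming a generic model, loss function, and DataLoaders (all names are placeholders). Both losses are accumulated per batch and divided by the number of samples; the only difference is that validation runs without gradient updates:

import torch

def run_epoch(model, loader, loss_fn, optimizer=None):
    # If an optimizer is given we train; otherwise we only evaluate (validation).
    training = optimizer is not None
    model.train(training)
    total_loss = 0.0
    with torch.set_grad_enabled(training):
        for inputs, targets in loader:
            outputs = model(inputs)
            loss = loss_fn(outputs, targets)
            if training:
                optimizer.zero_grad()
                loss.backward()
                optimizer.step()
            total_loss += loss.item() * inputs.size(0)   # batch mean times batch size = sum over samples
    return total_loss / len(loader.dataset)              # average loss per sample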
How to add two separate layers on the top of one layer using pytorch?
I want to add two separate layers on top of one layer (or a pre-trained model). Is that possible to do using PyTorch?
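A minimal sketch of one way to do this, with a shared backbone feeding two parallel heads; the layer sizes and names are placeholders, and the backbone could be replaced by a pre-trained feature extractor:

import torch
from torch import nn

class TwoHeadModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(        # could also be a pre-trained model's feature extractor
            nn.Linear(784, 256),
            nn.ReLU(),
        )
        self.head_a = nn.Linear(256, 10)      # first layer on top of the shared layer
        self.head_b = nn.Linear(256, 1)       # second, separate layer on top of the same output

    def forward(self, x):
        features = self.backbone(x)
        return self.head_a(features), self.head_b(features)

model = TwoHeadModel()
out_a, out_b = model(torch.randn(4, 784))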
How to build a CNN model for Fashion-MNIST and test it with another set of images from the web?
Importing the data and splitting it into 4 for test and train:
x_train = x_train / 255.0
x_test = x_test / 255.0
c_trainX = trainX.reshape(x_train.shape[0], 28, 28, 1)  # x_train.shape[0] = 60
model3 = …
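A minimal sketch of a small CNN on Fashion-MNIST along these lines (the architecture is just one possible choice, not the asker's model3); an external image from the web would need to be converted to grayscale, resized to 28x28, and scaled the same way before calling model3.predict on it:

import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()
x_train = (x_train / 255.0).reshape(-1, 28, 28, 1)
x_test = (x_test / 255.0).reshape(-1, 28, 28, 1)

model3 = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model3.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model3.fit(x_train, y_train, epochs=2, validation_data=(x_test, y_test))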
Forward Propagation for Neural Network
I am trying to create a forward-propagation function in Python 3.8.2. The inputs look like this:
Test_Training_Input = [(1,2,3,4),(1.45,16,5,4),(3,7,19,67)]
Test_Training_Output = [1,1,0]
I am not …
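A minimal sketch of a forward pass over those inputs, assuming one hidden layer with sigmoid activations and randomly initialized weights (the hidden-layer size and initialization are placeholders, not the asker's design):

import numpy as np

Test_Training_Input = [(1, 2, 3, 4), (1.45, 16, 5, 4), (3, 7, 19, 67)]
Test_Training_Output = [1, 1, 0]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_propagation(inputs, w_hidden, b_hidden, w_out, b_out):
    x = np.asarray(inputs, dtype=float)          # shape (n_samples, 4)
    hidden = sigmoid(x @ w_hidden + b_hidden)    # hidden-layer activations
    output = sigmoid(hidden @ w_out + b_out)     # one prediction per sample
    return output

rng = np.random.default_rng(0)
w_hidden = rng.normal(size=(4, 5))
b_hidden = np.zeros(5)
w_out = rng.normal(size=(5, 1))
b_out = np.zeros(1)

predictions = forward_propagation(Test_Training_Input, w_hidden, b_hidden, w_out, b_out)
print(predictions.ravel())   # compare against Test_Training_Output after training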