2 votes

We know that a neural network with only one hidden unit will perform like a linear regression, so the NN method should perform at least as well as a linear regression method. I have built a tidymodels model specification using the following code:

Data_nnet_mod <-
  mlp(hidden_units = tune(), penalty = tune(), epochs = tune()) %>%
  set_engine("nnet") %>%
  set_mode("regression")

and have tuned it using

Data_nnet_fit <- 
  Data_nnet_wflow %>%  
  tune_grid(val_set,
            grid = 25,
            control = control_grid(save_pred = TRUE),
            metrics = metric_set(rmse))
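
where Data_nnet_wflow combines the model specification with the preprocessing, roughly along these lines (the formula below is a stand-in for my actual one):

library(tidymodels)

Data_nnet_wflow <-
  workflow() %>%
  add_model(Data_nnet_mod) %>%
  add_formula(outcome ~ .)   # placeholder; my real formula differs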

It turns out that the linear regression fit has a smaller RMSE than the best NN fit.

I wonder why the best RMSE the NN method produces is larger than that of the regression method. Theoretically speaking, shouldn't the NN method perform at least as well as the regression method?

I recommend you ask discussion-type questions like this in the ML category on RStudio Community. Also, I recommend creating a reproducible example with some data that demonstrates what you are seeing. - Julia Silge
A neural network without a hidden layer and an identity activation function is in fact exactly a linear regression model. - shem

1 Answer

2 votes

A neural network with one hidden unit and a linear activation is linear regression. There may be differences in practice, though: neural networks are usually trained with variants of gradient descent, while linear regression is typically fit with ordinary least squares, so there is no guarantee that the two end up at the same solution. There may also be implementation details that differ.

If you use regularization, a different activation, a different loss, etc., those are different models, so again you have no guarantee of finding the same solution, or an equally good one. Unless both models are exactly the same, you really have no guarantee of equal performance. For all of the above reasons, linear regression may outperform a neural network on a regression problem, just as logistic regression may on a classification problem.
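
To see the optimizer difference concretely, here is a minimal sketch using the nnet package directly (the toy data are invented). With no hidden layer, skip-layer connections, and a linear output, the network is structurally the same model as lm(), but nnet fits it with BFGS rather than by solving the least-squares problem, so the estimates agree only approximately:

library(nnet)

set.seed(1)
dat <- data.frame(x = runif(200))
dat$y <- 2 + 3 * dat$x + rnorm(200, sd = 0.1)

# Exact OLS solution
lm_fit <- lm(y ~ x, data = dat)

# The same linear model expressed as a network: no hidden layer,
# input-to-output skip connections, identity output, no weight decay.
# It is optimized iteratively (BFGS), not via the normal equations.
nn_fit <- nnet(y ~ x, data = dat, size = 0, skip = TRUE,
               linout = TRUE, decay = 0, maxit = 500, trace = FALSE)

coef(lm_fit)   # OLS coefficients
coef(nn_fit)   # close, but generally not bit-for-bit identical

Once you add a penalty (decay > 0) or hidden units with sigmoidal activations, as mlp() with the "nnet" engine does, you are no longer fitting the same model as lm() at all.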