
While trying to follow the Keras documentation on Adam, I copied this line from the docs:

keras.optimizers.Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=None, decay=0.0, amsgrad=False)

and get this error

Unexpected keyword argument passed to optimizer: amsgrad
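
A quick version check clarifies whether the installed build knows about amsgrad at all (if I'm not mistaken, the parameter only exists in newer Keras releases, around 2.1.5 and later):

import keras
print(keras.__version__)  # releases that predate amsgrad reject the keyword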


EDIT 1

Omitting the amsgrad parameter lets the interpreter accept the line

keras.optimizers.Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=None, decay=0.0)

but then training the model with

happyModel.fit(x = X_train, y = Y_train, epochs = 50, batch_size = 600)

gives the following error:

None values not supported.

Full error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
in ()
      1 ### START CODE HERE ### (1 line)
----> 2 happyModel.fit(x = X_train, y = Y_train, epochs = 50, batch_size = 100)
      3 ### END CODE HERE ###

/opt/conda/lib/python3.6/site-packages/keras/engine/training.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, **kwargs)
   1574         else:
   1575             ins = x + y + sample_weights
-> 1576         self._make_train_function()
   1577         f = self.train_function
   1578

/opt/conda/lib/python3.6/site-packages/keras/engine/training.py in _make_train_function(self)
    958             training_updates = self.optimizer.get_updates(
    959                 params=self._collected_trainable_weights,
--> 960                 loss=self.total_loss)
    961             updates = self.updates + training_updates
    962             # Gets loss and metrics. Updates weights at each call.

/opt/conda/lib/python3.6/site-packages/keras/legacy/interfaces.py in wrapper(*args, **kwargs)
     85             warnings.warn('Update your ' + object_name +
     86                           ' call to the Keras 2 API: ' + signature, stacklevel=2)
---> 87             return func(*args, **kwargs)
     88         wrapper._original_function = func
     89         return wrapper

/opt/conda/lib/python3.6/site-packages/keras/optimizers.py in get_updates(self, loss, params)
    432             m_t = (self.beta_1 * m) + (1. - self.beta_1) * g
    433             v_t = (self.beta_2 * v) + (1. - self.beta_2) * K.square(g)
--> 434             p_t = p - lr_t * m_t / (K.sqrt(v_t) + self.epsilon)
    435
    436             self.updates.append(K.update(m, m_t))

/opt/conda/lib/python3.6/site-packages/tensorflow/python/ops/math_ops.py in binary_op_wrapper(x, y)
    827     if not isinstance(y, sparse_tensor.SparseTensor):
    828       try:
--> 829         y = ops.convert_to_tensor(y, dtype=x.dtype.base_dtype, name="y")
    830       except TypeError:
    831         # If the RHS is not a tensor, it might be a tensor aware object

/opt/conda/lib/python3.6/site-packages/tensorflow/python/framework/ops.py in convert_to_tensor(value, dtype, name, preferred_dtype)
    674                                         name=name,
    675                                         preferred_dtype=preferred_dtype,
--> 676                                         as_ref=False)
    677
    678

/opt/conda/lib/python3.6/site-packages/tensorflow/python/framework/ops.py in internal_convert_to_tensor(value, dtype, name, as_ref, preferred_dtype)
    739
    740     if ret is None:
--> 741       ret = conversion_func(value, dtype=dtype, name=name, as_ref=as_ref)
    742
    743     if ret is NotImplemented:

/opt/conda/lib/python3.6/site-packages/tensorflow/python/framework/constant_op.py in _constant_tensor_conversion_function(v, dtype, name, as_ref)
    111                                          as_ref=False):
    112   _ = as_ref
--> 113   return constant(v, dtype=dtype, name=name)
    114
    115

/opt/conda/lib/python3.6/site-packages/tensorflow/python/framework/constant_op.py in constant(value, dtype, shape, name, verify_shape)
    100   tensor_value = attr_value_pb2.AttrValue()
    101   tensor_value.tensor.CopyFrom(
--> 102       tensor_util.make_tensor_proto(value, dtype=dtype, shape=shape, verify_shape=verify_shape))
    103   dtype_value = attr_value_pb2.AttrValue(type=tensor_value.tensor.dtype)
    104   const_tensor = g.create_op(

/opt/conda/lib/python3.6/site-packages/tensorflow/python/framework/tensor_util.py in make_tensor_proto(values, dtype, shape, verify_shape)
    362   else:
    363     if values is None:
--> 364       raise ValueError("None values not supported.")
    365     # if dtype is provided, forces numpy array to be the type
    366     # provided if possible.

ValueError: None values not supported.

Thus, simply omitting the parameter doesn't do the trick. The traceback points at optimizers.py line 434, where self.epsilon is used directly in the update rule, so the None from epsilon=None ends up in convert_to_tensor.

How can I get the Adam optimizer to work?

Thanks

You need one of the latest versions of Keras for this to work, and you probably have an old version. - Dr. Snoopy
Indeed; omitting the amsgrad argument altogether will probably do the trick, too - desertnaut
@desertnaut Edited the original question. What you suggest gives another error - Gulzar
This probably has nothing to do with Adam, but with your data... My suggestion worked in that it successfully passed the compilation part - desertnaut
You're getting this error because you have None values in your data (Y_train I think) - HMK
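
A quick way to test HMK's suggestion, assuming Y_train is a numeric NumPy array (names taken from the question):

import numpy as np
print(np.isnan(Y_train).any())  # True means the labels contain missing values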

1 Answer

  1. This is probably due to an old version of Keras that does not support the amsgrad parameter.
  2. Removing the parameter lets the interpreter accept the line.
  3. The "None values not supported." error comes from the None passed as the epsilon parameter; you need to specify a numeric value instead, as in the sketch below.
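
Putting both points together, a minimal sketch that should work on older Keras versions (the loss and metrics are placeholders, not taken from the question; 1e-8 is the classic Adam default for epsilon):

from keras.optimizers import Adam

# Old Keras builds crash on epsilon=None, so pass an explicit numeric value;
# amsgrad is omitted because older releases don't accept the keyword.
opt = Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-8, decay=0.0)

# placeholder loss/metrics -- use whatever your model actually needs
happyModel.compile(optimizer=opt, loss='binary_crossentropy', metrics=['accuracy'])
happyModel.fit(x=X_train, y=Y_train, epochs=50, batch_size=600)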