I am using a SageMaker notebook with AWS Glue for ETL development.
When I import the SparkContext library, I get the error below. Restarting the kernel did not help. Can someone explain point (a)?
The code failed because of a fatal error: Error sending http request and maximum retry encountered.
Some things to try:
a. Make sure Spark has enough available resources for Jupyter to create a Spark context.
b. Contact your Jupyter administrator to make sure the Spark magics library is configured correctly.
c. Restart the kernel.
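
For reference, the failing cell is essentially the standard Glue boilerplate (simplified here; with the Sparkmagic/PySpark kernel, running any first cell starts the Spark session, which is where it dies):

```python
# First cell of the notebook. Running any cell makes the Sparkmagic kernel
# ask Livy on the dev endpoint to start a Spark session; that request is
# what fails with the error above.
from pyspark.context import SparkContext
from awsglue.context import GlueContext

sc = SparkContext.getOrCreate()
glueContext = GlueContext(sc)
```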
Some points to note:
- I am creating the SageMaker notebook from AWS Console > AWS Glue > Dev Endpoint > Notebooks.
- The VPC, subnet, and security group of the dev endpoint are the same as those of the RDS instance I need to connect to. On the networking page of the dev endpoint wizard, I pick an existing connection from the drop-down so that the VPC, subnet, and security group are filled in automatically (see the boto3 sketch after this list).
- I increased the DPUs from 5 to 10 but still get this error.
- I cannot reach the step where I create the connection to RDS, because the imports themselves fail (that step is sketched at the end of this post).
- If I skip the networking info while creating the dev endpoint, I can import all the relevant libraries successfully (screenshot attached), but that is not an option when connecting to RDS, as the connection would not work.
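
To make the setup concrete, here is a hypothetical boto3 equivalent of what the console does for me (every name, ARN, and ID below is a placeholder):

```python
import boto3

glue = boto3.client("glue")

# Hypothetical equivalent of my console setup; all names/ARNs/IDs are
# placeholders. The subnet and security group are the same ones the RDS
# instance uses; NumberOfNodes is the DPU count, already raised from 5 to 10.
glue.create_dev_endpoint(
    EndpointName="my-dev-endpoint",
    RoleArn="arn:aws:iam::123456789012:role/MyGlueDevEndpointRole",
    SubnetId="subnet-0abc1234",
    SecurityGroupIds=["sg-0abc1234"],
    NumberOfNodes=10,
)
```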
So the error ("The code failed because...") appears only when I provide a connection.
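
For context, this is roughly the step I am trying to reach once the kernel comes up (a sketch assuming a MySQL RDS instance; the URL, table, and credentials are placeholders):

```python
# The step I cannot reach yet: reading an RDS table into a DynamicFrame.
# glueContext is the GlueContext created in the first cell shown above.
datasource = glueContext.create_dynamic_frame.from_options(
    connection_type="mysql",
    connection_options={
        "url": "jdbc:mysql://my-rds-endpoint:3306/mydb",  # placeholder endpoint
        "dbtable": "my_table",                            # placeholder table
        "user": "admin",                                  # placeholder credentials
        "password": "********",
    },
)
print(datasource.count())
```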
I would appreciate any help resolving this issue.

