Hello Bhaskar Marthi,
Thank you for your response.
Based on the error message you provided, please check that your Kubernetes cluster is correctly configured and reachable from your Spark application, and that the necessary permissions (for example, an RBAC-enabled service account) are in place so the Spark driver can communicate with the Kubernetes API server.
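As a rough sketch of where those settings go (assuming client mode from Python, a pre-created "spark" service account with permission to manage pods, and placeholder values for the API server URL and container image):

```python
from pyspark.sql import SparkSession

# Minimal sketch only. "spark" is an assumed service account name, and the
# API server URL and image name are placeholders for your own values.
spark = (
    SparkSession.builder
    .master("k8s://https://<k8s-apiserver-host>:6443")  # Kubernetes API server URL
    .appName("spark-on-k8s-check")
    .config("spark.kubernetes.container.image", "<your-spark-image>")
    .config("spark.kubernetes.authenticate.driver.serviceAccountName", "spark")
    .getOrCreate()
)
```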
Also verify that your Kubernetes version is compatible with your Spark version; for Spark 3.5.3, Kubernetes version 1.24 or higher is recommended:
https://spark.apache.org/docs/3.5.3/running-on-kubernetes.html
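As a quick sanity check, you can read the cluster's server version with the Kubernetes Python client (this assumes the kubernetes package is installed and a valid kubeconfig is available):

```python
from kubernetes import client, config

# Print the Kubernetes server version reported by the API server.
config.load_kube_config()
version = client.VersionApi().get_code()
print(f"Kubernetes server version: {version.major}.{version.minor}")
```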
Check that your cluster has sufficient resources (CPU and memory) allocated to run Spark applications effectively.
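One way to see what the cluster can actually offer is to list the allocatable CPU and memory per node and compare it with your requested driver and executor resources (a rough sketch, again assuming the kubernetes Python client and kubeconfig access):

```python
from kubernetes import client, config

# List allocatable CPU and memory for each node in the cluster.
config.load_kube_config()
for node in client.CoreV1Api().list_node().items:
    alloc = node.status.allocatable
    print(node.metadata.name, "cpu:", alloc["cpu"], "memory:", alloc["memory"])
```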
Ensure the Spark application is configured to use the correct namespace; it should match the namespace where your Spark resources are deployed.
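The namespace is controlled by the spark.kubernetes.namespace setting; a minimal sketch, where "spark-jobs" is only a placeholder for your actual namespace:

```python
from pyspark.sql import SparkSession

# "spark-jobs" is a placeholder; it must match the namespace where your
# service account and other Spark resources are deployed.
spark = (
    SparkSession.builder
    .config("spark.kubernetes.namespace", "spark-jobs")
    .getOrCreate()
)
```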
Please find a similar issue below for your reference:
https://github.com/kubeflow/spark-operator/issues/1562
I hope this information is helpful. Please feel free to reach out if you have any further questions.
If the answer is helpful, please click "Accept Answer" and kindly upvote it. If you have extra questions about this answer, please click "Comment".