Hello Ricker Silva,
Welcome to Microsoft Q&A, and thank you for posting your question here.
I understand that your Synapse notebook script runs fine on its own but times out and gets stuck in the Queued state when run from a pipeline.
The notebook does not explicitly stop the Spark session, so the session stays active in the Spark UI until it hits the timeout, and the pool capacity it holds can keep subsequent pipeline runs queued. I suggest calling spark.stop() (or mssparkutils.session.stop()) at the end of the script so that the session and its Spark application are terminated and the resources are released. For example:
from pyspark.sql import SparkSession
# Get the active Spark session
spark = SparkSession.builder.getOrCreate()
# Stop the session explicitly
spark.stop()
This will ensure that the Spark session is closed when the notebook completes execution.
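If you prefer the Synapse-native utilities, mssparkutils.session.stop() ends the Livy session backing the notebook (not just the SparkSession object), which releases the executors back to the pool. Here is a minimal sketch, assuming it runs as the last cell of the notebook:
from notebookutils import mssparkutils
# ... notebook workload runs above ...
# End the Livy session and its Spark application for this notebook,
# releasing the pool capacity the session was holding
mssparkutils.session.stop()
In Synapse notebooks mssparkutils is preloaded, so the import is optional, but it makes the dependency explicit.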
Also, you can review the Livy session settings, timeout configuration, and resource allocation on the Spark pool. This article has more detail: https://faizchachiya.medium.com/how-to-handle-azure-databricks-and-synapse-session-timeout-issues-bce25ef719a4
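If pool capacity is the bottleneck (a common reason pipeline runs sit in the Queued state), you can also size the session per notebook with a %%configure cell instead of changing the pool itself. This is a sketch with illustrative values, not a recommendation; %%configure must run in the first cell, and the -f flag forces a session restart:
%%configure -f
{
    "driverMemory": "8g",
    "driverCores": 4,
    "executorMemory": "8g",
    "executorCores": 4,
    "numExecutors": 2
}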
I hope this is helpful! Do not hesitate to let me know if you have any other questions.
Please don't forget to close the thread by upvoting and accepting this as an answer if it helped.