Hi @zmsoft
The error "No serviceName defined in either JAAS or Kafka config" suggests that the
kafka.sasl.service.name
parameter is missing in your configuration. Below is the correct way to set up an Azure Databricks Delta Live Tables (DLT) pipeline to consume data from Azure Event Hubs.
Here are the steps to resolve the Issue:
Enable the Kafka protocol on Azure Event Hubs - Make sure the Kafka endpoint is enabled on your Event Hubs namespace (it is not available on the Basic tier). The Kafka bootstrap server is: {EH_NAMESPACE}.servicebus.windows.net:9093
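If you want a quick sanity check that the Kafka endpoint is reachable before wiring up DLT, a minimal TLS handshake test like the sketch below can help (the namespace value is a placeholder you need to fill in):

import socket
import ssl

# Placeholder: replace with your actual Event Hubs namespace
EH_NAMESPACE = "<your-eventhubs-namespace>"
BROKER_HOST = f"{EH_NAMESPACE}.servicebus.windows.net"
BROKER_PORT = 9093

# Open a TLS connection to the Kafka endpoint; success means the namespace
# resolves and port 9093 is reachable from this network.
context = ssl.create_default_context()
with socket.create_connection((BROKER_HOST, BROKER_PORT), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=BROKER_HOST) as tls:
        print("Connected, TLS version:", tls.version())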
Store the Event Hubs connection string securely - Keep the Event Hubs connection string in a Databricks secret scope rather than hardcoding it. First create the scope:
databricks secrets create-scope --scope eventhub-secrets
Then store the connection string:
databricks secrets put --scope eventhub-secrets --key eh-connection-string
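To confirm the secret is visible from your workspace, you can list the scope's keys in a notebook (the scope and key names below match the ones used in this example):

# Run in a Databricks notebook: list keys in the scope and read the secret.
# The secret value itself is redacted when printed, which is expected.
print(dbutils.secrets.list("eventhub-secrets"))
conn_str = dbutils.secrets.get(scope="eventhub-secrets", key="eh-connection-string")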
Use Delta Live Tables (DLT) to read from Event Hubs - Update your code to include the kafka.sasl.service.name option:
import dlt
from pyspark.sql.functions import col
from pyspark.sql.types import StringType

# Read secret from Databricks
EH_CONN_STR = dbutils.secrets.get(scope="eventhub-secrets", key="eh-connection-string")
KAFKA_BROKER = "{EH_NAMESPACE}.servicebus.windows.net:9093"
EH_NAME = "myeventhub"

@dlt.table(
    comment="Streaming data from Azure Event Hub into Delta Live Tables"
)
def eventhub_stream():
    return (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", KAFKA_BROKER)
        .option("subscribe", EH_NAME)
        .option("kafka.security.protocol", "SASL_SSL")
        .option("kafka.sasl.mechanism", "PLAIN")
        .option("kafka.sasl.service.name", "kafka")  # Fix for KafkaAdminClient error
        .option("kafka.sasl.jaas.config",
                f'org.apache.kafka.common.security.plain.PlainLoginModule required '
                f'username="$ConnectionString" password="{EH_CONN_STR}";')
        .option("failOnDataLoss", "false")
        .option("startingOffsets", "earliest")
        .load()
        .select(col("value").cast(StringType()).alias("event_data"))  # Extract message payload
    )
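If your events carry JSON payloads, you can add a second DLT table downstream that parses them. This is a minimal sketch assuming a hypothetical schema with device_id and temperature fields; adjust it to your actual payload:

from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

# Hypothetical payload schema - replace with the fields your events actually contain
event_schema = StructType([
    StructField("device_id", StringType()),
    StructField("temperature", DoubleType()),
])

@dlt.table(
    comment="Parsed Event Hubs payloads (assumes JSON events with the schema above)"
)
def eventhub_parsed():
    return (
        dlt.read_stream("eventhub_stream")
        .withColumn("parsed", from_json(col("event_data"), event_schema))
        .select("parsed.*")
    )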
Deploy the Delta Live Tables pipeline - Go to your Databricks workspace → Workflows → Delta Live Tables. Click Create Pipeline, select the notebook where you defined eventhub_stream(), set the pipeline mode (Triggered or Continuous), and start the pipeline. If you prefer to create the pipeline programmatically, see the sketch after this step.
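A rough sketch using the Databricks SDK for Python (databricks-sdk) could look like the following; the notebook path, pipeline name, and target schema are placeholders you would replace with your own:

from databricks.sdk import WorkspaceClient
from databricks.sdk.service.pipelines import PipelineLibrary, NotebookLibrary

w = WorkspaceClient()  # Uses your configured Databricks authentication

# Hypothetical values - replace with your own notebook path, name, and target schema
created = w.pipelines.create(
    name="eventhub-dlt-pipeline",
    libraries=[PipelineLibrary(notebook=NotebookLibrary(path="/Users/you@example.com/eventhub_dlt"))],
    target="eventhub_db",
    continuous=False,   # Triggered mode; set True for Continuous
    development=True,
)
print("Created pipeline:", created.pipeline_id)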
Once the pipeline is running, verify the data with: SELECT * FROM LIVE.eventhub_stream;
For more details, refer to: Use Azure Event Hubs as a DLT data source
Hope this helps. Do let us know if you have any further queries.