I have an application that sends table data from IBM i (AS/400) tables. How can I collect the messages, process them according to the table names in the messages, load them into Delta/Parquet files or SQL tables, and use them for reporting?

Suraj Surendran 60 Reputation points
2024-12-20T05:55:04.43+00:00

Tags: Azure Event Hubs, SQL Server, Azure Kubernetes Service (AKS), Azure Event Grid

1 answer

  1. AnnuKumari-MSFT 33,976 Reputation points Microsoft Employee
    2024-12-20T07:38:44.3366667+00:00

    Hi Suraj Surendran,

    Thank you for using the Microsoft Q&A platform and for posting your query here.

    You can use Azure Event Hubs to receive the messages from your application that sends table data from IBM i (AS/400) tables. Azure Event Hubs is a highly scalable data-streaming platform and event-ingestion service that can receive and process millions of events per second.
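
    As an illustration, here is a minimal sketch of how the sending side could publish one row to an event hub with the azure-eventhub Python SDK. The connection string, hub name, table name, and message shape (a `table_name` field plus a `data` payload) are placeholder assumptions; your application may already define its own format.

```python
import json
from azure.eventhub import EventHubProducerClient, EventData

# Placeholders: use your Event Hubs namespace connection string and event hub name.
EVENTHUB_CONNECTION_STR = "<event-hubs-namespace-connection-string>"
EVENTHUB_NAME = "as400-table-data"

producer = EventHubProducerClient.from_connection_string(
    conn_str=EVENTHUB_CONNECTION_STR, eventhub_name=EVENTHUB_NAME
)

# One illustrative row from an AS/400 table; the real application supplies these values.
message = {
    "table_name": "ORDERS",  # lets the consumer route the record to the right target
    "data": {"ORDER_ID": 1001, "CUSTOMER": "ACME", "AMOUNT": 250.75},
}

with producer:
    batch = producer.create_batch()
    batch.add(EventData(json.dumps(message)))
    producer.send_batch(batch)
```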

    1. Set up an Azure Event Hub: Create an Event Hubs namespace and an event hub in the Azure portal, then configure your application to send its table messages to that event hub (a producer sketch appears just above this list).
    2. Write an Azure Function: The function should listen to the event hub, extract the table name from each message, and load the data into Delta/Parquet files or SQL tables accordingly. You can write it in a language such as C# or Python, and it processes the messages as they arrive (a minimal function sketch follows this list).
    3. Use Azure Data Factory: Use ADF to move the data from the Delta/Parquet files or staging SQL tables into a reporting database such as Azure SQL Database or Azure Synapse Analytics.
    4. Use a reporting tool: Connect a tool such as Power BI or Tableau to the reporting database to build reports and visualizations on top of the loaded data.
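
    For step 2, a minimal Python sketch of such a function is shown below, using the Azure Functions Python v2 programming model with an Event Hubs trigger. The hub name, the `EVENTHUB_CONNECTION` app setting, the message shape, and the `write_record` helper are assumptions for illustration; the actual sink logic (Delta/Parquet in ADLS or a staging SQL table) would go where the helper is called.

```python
import json
import logging
import azure.functions as func

app = func.FunctionApp()

@app.event_hub_message_trigger(
    arg_name="event",
    event_hub_name="as400-table-data",   # placeholder event hub name
    connection="EVENTHUB_CONNECTION",    # app setting holding the connection string
)
def process_table_event(event: func.EventHubEvent):
    # The message shape (table_name + data) matches the producer sketch above;
    # adjust this to whatever your application actually sends.
    payload = json.loads(event.get_body().decode("utf-8"))
    table_name = payload.get("table_name", "unknown")
    record = payload.get("data", {})
    write_record(table_name, record)

def write_record(table_name: str, record: dict) -> None:
    # Hypothetical sink: in a real function, append the record to a Delta/Parquet
    # location in ADLS or insert it into a staging SQL table chosen by table_name.
    logging.info("Received record for table %s: %s", table_name, record)
```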

    Alternatively, you can use Azure Synapse instead of ADF: Capture Event Hubs data in parquet format and analyze it with Azure Synapse Analytics.
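
    If you take the capture route, the captured parquet files can be read in a Synapse Spark notebook and split into one Delta table per source table for reporting. The sketch below assumes the files have landed in an ADLS Gen2 container and expose a table_name column (as in the producer sketch above); storage account, container, and path names are placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.getOrCreate()

# Read the parquet files captured from the event hub (placeholder path).
captured = spark.read.parquet(
    "abfss://capture@<storageaccount>.dfs.core.windows.net/as400-table-data/"
)

# Split the captured stream into one Delta table per source table.
for row in captured.select("table_name").distinct().collect():
    table = row["table_name"]
    (captured.filter(col("table_name") == table)
        .write.format("delta")
        .mode("append")
        .save(f"abfss://curated@<storageaccount>.dfs.core.windows.net/delta/{table}"))
```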

    Hope it helps. Kindly accept the answer by clicking on the Accept answer button. Thank you.

