Overview of computing Environmental, Social, and Governance metrics

Important

Some or all of this functionality is available as part of a preview release. The content and the functionality are subject to change.

You can select and compute the prebuilt metrics using the prebuilt notebooks and pipelines deployed as part of this capability.

Note

You can extend the notebooks and pipelines to support custom metrics. For more information, go to Create custom metrics.

Prerequisites

  1. Load the prebuilt metrics definitions into the MetricsDefinitions and MetricsLabels tables in the ComputedESGMetrics_LH lakehouse by running the LoadDefinitionsForMetrics notebook. You only need to perform this step once. Metric computation notebooks extract the metric’s definition from these tables.

    1. You can then query the MetricsDefinitions table to view the metric definitions. For each metric, you can see the metric properties:

      • Metric name
      • Measure
      • Dimensions
      • Filters
      • Sustainability area
    2. You can also query the MetricsLabels table to explore the metrics by labels, such as Reporting standard and Disclosure data point. For an example of querying both tables from a notebook, see the sketch after this prerequisites list.

    3. You can then explore the Prebuilt metrics library to determine which aggregate dataset to populate for each prebuilt metric that you want to compute.

    4. Refer to Generate aggregate tables to identify the ESG data model tables and the attributes that you need to populate to generate the aggregate tables.

  2. Ingest and transform your source data, and load it into the ESG data model tables in the ProcessedESGData_LH lakehouse that's deployed as part of the ESG data estate capability in the same workspace.

  3. Authenticate the prebuilt semantic models (DatasetForMetricsMeasures and DatasetForMetricsDashboard) used in the metric computation by creating a connection. You only need to perform this step once.

    1. Select the DatasetForMetricsMeasures_DTST semantic model from the managed capability page to open the semantic model. Select Settings from the File menu. You can also open the item from the workspace page.

      Screenshot of selecting DatasetForMetricsMeasures_DTST.

    2. Select Gateway and cloud connections, and then select Create a new connection from the Cloud connections dropdown box. A New connection side panel opens.

      Screenshot of creating a new connection for the semantic model.

    3. In the New connection side panel, enter a connection name, select OAuth 2.0 as the authentication method, edit the credentials to sign in, and then select Create.

    4. Select the newly created connection in the Gateway and cloud connections section.

      Screenshot of selecting the newly created connection.

    5. In the same way, set up a connection for the DatasetForMetricsDashboard_DTST semantic model. Open the semantic model from the workspace page, select Settings from the File menu, and then follow the same steps you followed for DatasetForMetricsMeasures_DTST.

  4. Use Apache Spark runtime 1.3 (Spark 3.5, Delta 3.2) for running the prebuilt notebooks and pipelines.
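
To verify these prerequisites from a notebook, you can query the definition tables that the LoadDefinitionsForMetrics notebook populates. The following PySpark sketch assumes a Fabric notebook with ComputedESGMetrics_LH attached as its default lakehouse; the label column names used in the filter (LabelName, LabelValue) are illustrative assumptions, so check the actual schema with printSchema before relying on them.

```python
# Illustrative sketch: inspect the prebuilt metric definitions after running the
# LoadDefinitionsForMetrics notebook. Assumes ComputedESGMetrics_LH is the
# notebook's default lakehouse.
from pyspark.sql import SparkSession

# In a Fabric notebook this returns the preconfigured session.
spark = SparkSession.builder.getOrCreate()

# Check the schemas first -- the column names used below are assumptions.
spark.table("MetricsDefinitions").printSchema()
spark.table("MetricsLabels").printSchema()

# Browse the metric definitions (name, measure, dimensions, filters, sustainability area).
spark.sql("SELECT * FROM MetricsDefinitions").show(truncate=False)

# Explore metrics by label, for example by reporting standard.
# 'LabelName' is a hypothetical column name; adjust it to the real schema.
spark.sql(
    """
    SELECT *
    FROM MetricsLabels
    WHERE LabelName = 'Reporting standard'
    """
).show(truncate=False)
```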

Explore with demo data

If you're exploring the capability with demo data and just want to view the output of all the metrics, you can run the ExecuteComputationForMetrics_DTPL pipeline. This pipeline provides an end-to-end experience, from loading the demo data into the ProcessedESGData_LH lakehouse to computing the prebuilt metrics. To learn how to set up demo data, go to Set up demo data.

Before running the pipeline, make sure that you completed steps 3 and 4 in the Prerequisites section. In those steps, you authenticate the prebuilt semantic models (DatasetForMetricsMeasures and DatasetForMetricsDashboard) and set the required Spark runtime.

Then create a connection so that the Create aggregate tables activity of the pipeline can run the GenerateAggregateForMetrics pipeline:

  1. Open the ExecuteComputationForMetrics_DTPL pipeline from the managed capability or workspace page.

  2. Select the Create aggregate tables pipeline activity, and then select Settings to set up a connection.

    Screenshot of New connection side panel.

  3. In the same way, set up connections to authenticate the Refresh measures and Refresh metrics for the dashboard activities of the pipeline. Select each activity, select Settings, and then select a connection from the Connection attribute.

    Screenshot of creating a connection.

    Note

    These connection setup steps are one-time and aren't necessary for subsequent pipeline runs.

After you complete these steps, select Run to run the pipeline. You can monitor the pipeline by selecting the View run history button.
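
If you'd rather start the pipeline programmatically than with the Run button, the following sketch shows one possible approach through the Fabric REST API for on-demand item jobs. Treat it as a sketch under assumptions: the workspace ID, pipeline item ID, and access token are placeholders that you supply, and you should confirm the endpoint and the jobType value against the current Fabric REST API documentation before relying on it.

```python
# Hypothetical sketch: start the ExecuteComputationForMetrics_DTPL pipeline on demand
# through the Fabric REST API instead of the Run button. Replace the placeholders
# with your own values and verify the endpoint and jobType in the Fabric REST API docs.
import requests

WORKSPACE_ID = "<your-workspace-id>"            # placeholder
PIPELINE_ITEM_ID = "<your-pipeline-item-id>"    # placeholder: ID of ExecuteComputationForMetrics_DTPL
ACCESS_TOKEN = "<your-entra-access-token>"      # placeholder: Microsoft Entra token for the Fabric API

url = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
    f"/items/{PIPELINE_ITEM_ID}/jobs/instances?jobType=Pipeline"
)
response = requests.post(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
response.raise_for_status()

# The request is accepted asynchronously; the Location header points to the job
# instance that you can poll for status.
print(response.status_code, response.headers.get("Location"))
```

However you start the run, you can still monitor it from View run history in the portal, as described earlier.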

This pipeline runs a sequence of activities that load the demo data into the ProcessedESGData_LH lakehouse, create the aggregate tables, compute the prebuilt metrics, and refresh the measures and the metrics for the dashboard.

Note

If you want to run a specific activity of the pipeline, you can deactivate the other activities. For more information, go to Deactivate an activity.

Compute metrics data

To compute the ESG metrics, follow the instructions in these articles:

  1. Generate aggregate tables
  2. Generate and store metrics data
  3. Consume metrics data

Next step