Libraries CLI (legacy)

Important

This documentation has been retired and might not be updated.

This information applies to legacy Databricks CLI versions 0.18 and below. Databricks recommends that you use the newer Databricks CLI version 0.205 or above instead. See What is the Databricks CLI?. To find your version of the Databricks CLI, run databricks -v.

To migrate from Databricks CLI version 0.18 or below to Databricks CLI version 0.205 or above, see Databricks CLI migration.

You run Databricks libraries CLI subcommands by appending them to databricks libraries. These subcommands call the Libraries API.
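As a point of reference, the all-cluster-statuses subcommand corresponds to a GET request against the Libraries API along the lines of the sketch below; the workspace URL and personal access token are placeholders you would substitute with your own values:

curl -X GET \
  https://<databricks-instance>/api/2.0/libraries/all-cluster-statuses \
  --header "Authorization: Bearer <personal-access-token>"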

databricks libraries -h
Usage: databricks libraries [OPTIONS] COMMAND [ARGS]...

  Utility to interact with libraries.

Options:
  -v, --version  [VERSION]
  -h, --help     Show this message and exit.

Commands:
  all-cluster-statuses  Get the status of all libraries.
  cluster-status        Get the status of all libraries for a cluster.
    Options:
      --cluster-id CLUSTER_ID   Can be found in the URL at https://<databricks-instance>/?o=<16-digit-number>#/setting/clusters/$CLUSTER_ID/configuration.
  install               Install a library on a cluster.
    Options:
      --cluster-id CLUSTER_ID   Can be found in the URL at https://<databricks-instance>/?o=<16-digit-number>#/setting/clusters/$CLUSTER_ID/configuration.
      --jar TEXT                JAR on DBFS or WASB.
      --egg TEXT                Egg on DBFS or WASB.
      --whl TEXT                Wheel or zipped wheelhouse on DBFS or WASB. Supported in CLI 0.8.2 and above.
      --maven-coordinates TEXT  Maven coordinates in the form of GroupId:ArtifactId:Version (i.e.org.jsoup:jsoup:1.7.2).
      --maven-repo TEXT         Maven repository to install the Maven package from. If omitted, both Maven Repository and Spark Packages are searched.
      --maven-exclusion TEXT    List of dependences to exclude. For example: --maven-exclusion "slf4j:slf4j" --maven-exclusion "*:hadoop-client".
      --pypi-package TEXT       The name of the PyPI package to install. An optional exact version specification is also supported. Examples "simplejson" and "simplejson==3.8.0".
      --pypi-repo TEXT          The repository where the package can be found. If not specified, the default pip index is used.
      --cran-package TEXT       The name of the CRAN package to install.
      --cran-repo TEXT          The repository where the package can be found. If not specified, the default CRAN repo is used.
  list                  Shortcut to `all-cluster-statuses` or `cluster-status`.
    Options:
      --cluster-id CLUSTER_ID   Can be found in the URL at https://<databricks-instance>/?o=<16-digit-number>#/setting/clusters/$CLUSTER_ID/configuration.
  uninstall             Uninstall a library on a cluster.
    Options:
      --cluster-id CLUSTER_ID   Can be found in the URL at https://<databricks-instance>/?o=<16-digit-number>#/setting/clusters/$CLUSTER_ID/configuration. [required]
      --all                     Uninstall all libraries.
      --jar TEXT                JAR on DBFS or WASB.
      --egg TEXT                Egg on DBFS or WASB.
      --whl TEXT                Wheel or zipped wheelhouse on DBFS or WASB. Supported in CLI 0.8.2 and above.
      --maven-coordinates TEXT  Maven coordinates in the form of GroupId:ArtifactId:Version (i.e.org.jsoup:jsoup:1.7.2).
      --maven-repo TEXT         Maven repository to install the Maven package from. If omitted, both Maven Repository and Spark Packages are searched.
      --maven-exclusion TEXT    List of dependences to exclude. For example: --maven-exclusion "slf4j:slf4j" --maven-exclusion "*:hadoop-client".
      --pypi-package TEXT       The name of the PyPI package to install. An optional exact version specification is also supported. Examples "simplejson" and "simplejson==3.8.0".
      --pypi-repo TEXT          The repository where the package can be found. If not specified, the default pip index is used.
      --cran-package TEXT       The name of the CRAN package to install.
      --cran-repo TEXT          The repository where the package can be found. If not specified, the default CRAN repo is used.

List the status of all libraries on all clusters

To display usage documentation, run databricks libraries all-cluster-statuses --help.

databricks libraries all-cluster-statuses
{
  "statuses": [
    {
      "cluster_id": "1234-567890-lest123",
      "library_statuses": [
        {
          "library": {
            "jar": "dbfs:/FileStore/jars/bbf81650_a62b_4b7a_b47e_7bdd9505792a-SparkJDBC42.jar"
          },
          "status": "INSTALLED",
          "is_library_for_all_clusters": true
        },
        ...
      ]
    },
    ...
  ]
}
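If you want to process this output in a script, you can pipe it through a JSON processor. The sketch below assumes jq is installed locally and extracts, for each cluster, the cluster ID and the status of each of its libraries:

databricks libraries all-cluster-statuses | jq '.statuses[] | {cluster_id, library_statuses: [.library_statuses[].status]}'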

List the status of all libraries on a cluster

To display usage documentation, run databricks libraries cluster-status --help or databricks libraries list --help.

databricks libraries cluster-status --cluster-id 1234-567890-lest123

Or:

databricks libraries list --cluster-id 1234-567890-lest123
{
  "cluster_id": "1234-567890-lest123",
  "library_statuses": [
    {
      "library": {
        "jar": "dbfs:/FileStore/jars/bbf81650_a62b_4b7a_b47e_7bdd9505792a-SparkJDBC42.jar"
      },
      "status": "INSTALLED",
      "is_library_for_all_clusters": false
    },
    ...
  ]
}
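Because list is documented as a shortcut to all-cluster-statuses or cluster-status, omitting --cluster-id should return the statuses for all clusters instead, as in the sketch below; confirm the exact behavior with databricks libraries list --help:

databricks libraries list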

Install a library on a cluster

To display usage documentation, run databricks libraries install --help.

databricks libraries install --cluster-id 1234-567890-lest123 --jar dbfs:/test-dir/test.jar

If successful, no output is displayed.
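The install subcommand accepts the other library types documented in the help output in the same way. For example, the following sketches install a PyPI package pinned to an exact version and a Maven artifact by its coordinates, reusing the illustrative values from the help text:

databricks libraries install --cluster-id 1234-567890-lest123 --pypi-package "simplejson==3.8.0"

databricks libraries install --cluster-id 1234-567890-lest123 --maven-coordinates org.jsoup:jsoup:1.7.2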

Uninstall a library from a cluster

To display usage documentation, run databricks libraries uninstall --help.

databricks libraries uninstall --cluster-id 1234-567890-lest123 --jar dbfs:/test-dir/test.jar
WARNING: Uninstalling libraries requires a cluster restart.
databricks clusters restart --cluster-id 1234-567890-lest123
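
To remove every library from a cluster at once, you can pass the documented --all flag instead of naming individual libraries, and then restart the cluster as the warning suggests:

databricks libraries uninstall --cluster-id 1234-567890-lest123 --all
databricks clusters restart --cluster-id 1234-567890-lest123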