
Use Eclipse with PyDev and Databricks Connect for Python

Note

This article covers Databricks Connect for Databricks Runtime 13.3 LTS and above.

This article covers how to use Databricks Connect for Python and Eclipse with PyDev. Databricks Connect enables you to connect popular IDEs, notebook servers, and other custom applications to Azure Databricks clusters. See What is Databricks Connect?.

Note

Before you begin to use Databricks Connect, you must set up the Databricks Connect client.

To use Databricks Connect and Eclipse with PyDev, follow these instructions.

  1. Start Eclipse.
  2. Create a project: click File > New > Project > PyDev > PyDev Project, and then click Next.
  3. Specify a Project name.
  4. For Project contents, specify the path to your Python virtual environment.
  5. Click Please configure an interpreter before proceding.
  6. Click Manual config.
  7. Click New > Browse for python/pypy exe.
  8. Browse to and select the full path to the Python interpreter that is referenced from the virtual environment, and then click Open.
  9. In the Select interpreter dialog, click OK.
  10. In the Selection needed dialog, click OK.
  11. In the Preferences dialog, click Apply and Close.
  12. In the PyDev Project dialog, click Finish.
  13. Click Open Perspective.
  14. Add to the project a Python code (.py) file that contains either the example code or your own code. If you use your own code, at minimum you must initialize DatabricksSession as shown in the example code.
  15. With the Python code file open, set any breakpoints where you want your code to pause while running.
  16. To run the code, click Run > Run. All Python code runs locally, while all PySpark code involving DataFrame operations runs on the cluster in the remote Azure Databricks workspace, and run responses are sent back to the local caller.
  17. To debug the code, click Run > Debug. All Python code is debugged locally, while all PySpark code continues to run on the cluster in the remote Azure Databricks workspace. The core Spark engine code cannot be debugged directly from the client.
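As a reference for step 14, the following is a minimal sketch of a Python file that initializes DatabricksSession and runs a simple DataFrame operation on the remote cluster. It assumes the Databricks Connect client is installed in the selected virtual environment and that connection settings (such as a Databricks configuration profile or environment variables) are already in place; the table name is only an illustration.

```python
from databricks.connect import DatabricksSession

# Build a Spark session backed by the remote Azure Databricks cluster.
# Connection details are resolved from your Databricks configuration
# (for example, a configuration profile or environment variables).
spark = DatabricksSession.builder.getOrCreate()

# PySpark DataFrame operations run on the remote cluster; plain Python
# code in this file runs locally.
df = spark.read.table("samples.nyctaxi.trips")  # illustrative table name
df.show(5)
```

Setting a breakpoint on the `df.show(5)` line, for example, lets you inspect the local `df` reference in the PyDev debugger before the operation is sent to the cluster.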

For more specific run and debug instructions, see Running a Program.