  1. Printing secret value in Databricks - Stack Overflow

    Nov 11, 2021 · First, install the Databricks Python SDK and configure authentication per the docs: pip install databricks-sdk. Then you can use the approach below to print out secret …
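Databricks redacts secret values in notebook output (they show as `[REDACTED]`), and the usual workaround is to print the characters separated so the output no longer matches the secret literal. A minimal sketch of that trick; the scope/key names in the comment are placeholders:

```python
def reveal(secret: str, sep: str = " ") -> str:
    """Interleave a separator between characters so notebook output
    redaction no longer matches the literal secret value."""
    return sep.join(secret)

# In a notebook you would fetch the secret first, e.g.:
# secret = dbutils.secrets.get(scope="my-scope", key="my-key")
print(reveal("hunter2"))  # prints "h u n t e r 2"
```

Use this only for debugging; the whole point of redaction is that secrets should not land in notebook output or logs.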

  2. Databricks: managed tables vs. external tables - Stack Overflow

    Jun 21, 2024 · The decision to use a managed table or an external table depends on your use case and on the existing setup of your Delta Lake, framework code, and workflows. Your …
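The mechanical difference comes down to whether the DDL specifies a `LOCATION`: without one, Databricks manages the storage and `DROP TABLE` deletes the data; with one, `DROP TABLE` removes only the metadata and the files remain. A minimal sketch, with hypothetical catalog, schema, and storage-path names:

```python
# Hypothetical names/paths for illustration only.
managed_ddl = """
CREATE TABLE main.sales.orders (id INT, amount DOUBLE)
"""  # no LOCATION clause: Databricks manages storage; DROP deletes data

external_ddl = """
CREATE TABLE main.sales.orders_ext (id INT, amount DOUBLE)
LOCATION 'abfss://data@mystorage.dfs.core.windows.net/orders'
"""  # LOCATION given: DROP removes metadata only; the files remain

# In a notebook:
# spark.sql(managed_ddl)
# spark.sql(external_ddl)
```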

  3. REST API to query Databricks table - Stack Overflow

    Jul 24, 2022 · Is Databricks designed for such use cases, or is it a better approach to copy this table (gold layer) into an operational database such as Azure SQL DB after the transformations are done …
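For serving queries over REST, Databricks exposes the SQL Statement Execution API (`POST /api/2.0/sql/statements`) against a SQL warehouse. A minimal request-building sketch using only the standard library; the host, token, and warehouse ID are placeholders you would supply:

```python
import json
from urllib import request

def build_statement_request(host: str, token: str, warehouse_id: str, sql: str):
    """Build an HTTP request for the Databricks SQL Statement Execution
    API (POST /api/2.0/sql/statements). Returns the request plus the
    JSON payload for inspection."""
    payload = {
        "statement": sql,
        "warehouse_id": warehouse_id,
        "wait_timeout": "30s",  # wait synchronously up to 30s for the result
    }
    req = request.Request(
        f"{host}/api/2.0/sql/statements",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    return req, payload

# resp = request.urlopen(req)  # executed inside your network/auth context
```

Whether this beats copying the gold table into an operational store depends on latency and concurrency needs; a SQL warehouse is not a low-latency OLTP backend.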

  4. Databricks: How do I get path of current notebook?

    Databricks is smart and all, but how do you identify the path of your current notebook? The guide on the website does not help. It suggests: %scala dbutils.notebook.getContext.notebookPath …
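In Python notebooks the usual answer is the entry-point chain, e.g. `dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson()`, and then reading the path out of the context JSON. A sketch of the parsing half; the `extraContext` / `notebook_path` keys are an assumption about the JSON shape and may differ across runtime versions:

```python
import json

def notebook_path_from_context(context_json: str) -> str:
    """Extract the notebook path from the context JSON returned by
    dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson().
    The 'extraContext'/'notebook_path' keys are assumed, not guaranteed."""
    ctx = json.loads(context_json)
    return ctx.get("extraContext", {}).get("notebook_path", "")

# In a notebook:
# ctx_json = dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson()
# print(notebook_path_from_context(ctx_json))
```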

  5. Create temp table in Azure Databricks and insert lots of rows

    Nov 28, 2022 · Asked 2 years, 11 months ago · Modified 10 months ago · Viewed 25k times
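Rather than issuing many single-row `INSERT` statements, the common pattern is to build the rows in Python, create a DataFrame, and register it as a session-scoped temporary view. A minimal sketch; the view name and row shape are illustrative:

```python
def make_rows(n):
    """Generate n (id, name) rows as a stand-in for a large row set."""
    return [(i, f"name_{i}") for i in range(n)]

rows = make_rows(100_000)
columns = ["id", "name"]

# In a notebook:
# df = spark.createDataFrame(rows, columns)
# df.createOrReplaceTempView("my_temp_table")  # session-scoped "temp table"
# spark.sql("SELECT COUNT(*) FROM my_temp_table").show()
```

This keeps the insert as one distributed write instead of thousands of round-trips through the SQL parser.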

  6. Need for volumes in Databricks - Stack Overflow

    Sep 24, 2024 · Why do we need Volumes when we can access the location using external locations? The doc says that it is to add governance, but we can already govern using external …
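Beyond governance, Volumes give files a stable, catalog-addressed path of the form `/Volumes/<catalog>/<schema>/<volume>/...`, so code references objects by name rather than by cloud URI. A tiny path-building sketch; the catalog/schema/volume names are placeholders:

```python
def volume_path(catalog: str, schema: str, volume: str, *parts: str) -> str:
    """Build a Unity Catalog volume path, /Volumes/<catalog>/<schema>/<volume>/...,
    governed by the same GRANT model as tables."""
    return "/".join(["/Volumes", catalog, schema, volume, *parts])

# spark.read.csv(volume_path("main", "raw", "landing", "orders.csv"), header=True)
```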

  7. Do you know how to install the 'ODBC Driver 17 for SQL Server' on …

    Apr 4, 2020 · I'm trying to connect from a Databricks notebook to an Azure SQL Data Warehouse using the pyodbc Python library. When I execute the code I get this error: Error: ('01000', …
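Once the driver is installed (typically via an init script adding Microsoft's apt repository and `msodbcsql17`), the connection string must name the driver exactly as `pyodbc.drivers()` reports it. A connection-string sketch; the server, database, and credentials are placeholders:

```python
def mssql_conn_str(server: str, database: str, user: str, password: str,
                   driver: str = "ODBC Driver 17 for SQL Server") -> str:
    """Build a pyodbc connection string for SQL Server / Azure Synapse.
    The driver name must match an entry in pyodbc.drivers()."""
    return (f"DRIVER={{{driver}}};SERVER={server};DATABASE={database};"
            f"UID={user};PWD={password}")

# import pyodbc
# conn = pyodbc.connect(mssql_conn_str("myserver.database.windows.net", "dw", "u", "p"))
```

The `('01000', ...)` error in the question usually means the driver named in the string is not actually installed on the cluster nodes.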

  8. databricks: writing spark dataframe directly to excel

    Nov 29, 2019 · Is there any method to write a Spark dataframe directly to xls/xlsx format? Most of the examples on the web are for pandas dataframes, but I would …
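Spark has no built-in xlsx writer, so the common route is `df.toPandas().to_excel(...)` (with pandas and openpyxl on the cluster) for data that fits on the driver. One practical wrinkle is that Excel sheet names are capped at 31 characters and reject the characters `[]:*?/\`; a small sanitizer sketch:

```python
import re

def safe_sheet_name(name: str) -> str:
    """Excel sheet names are limited to 31 characters and may not
    contain []:*?/\\ -- sanitize before passing to to_excel()."""
    return re.sub(r"[\[\]:*?/\\]", "_", name)[:31]

# In a notebook (requires pandas + openpyxl on the cluster):
# df.toPandas().to_excel("/dbfs/tmp/out.xlsx",
#                        sheet_name=safe_sheet_name("sales/2019"),
#                        index=False)
```

For datasets too large for the driver, a Spark-native Excel connector such as `com.crealytics:spark-excel` is the usual alternative.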

  9. How do we connect Databricks with SFTP using Pyspark?

    Aug 17, 2022 · I wish to connect to SFTP (to read files stored in a folder) from a Databricks cluster using PySpark (with a private key). Historically I have been downloading files to a Linux box …
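Spark cannot read `sftp://` paths natively, so a common pattern is to pull the files down with paramiko (installed via `%pip install paramiko`) and then read the local copies with Spark. A sketch with private-key auth; the host, paths, and username are placeholders:

```python
def pick_csvs(listing):
    """Filter a remote directory listing down to CSV files."""
    return sorted(f for f in listing if f.lower().endswith(".csv"))

# In a notebook, after `%pip install paramiko`:
# import paramiko
# key = paramiko.RSAKey.from_private_key_file("/dbfs/tmp/id_rsa")
# transport = paramiko.Transport(("sftp.example.com", 22))
# transport.connect(username="user", pkey=key)
# sftp = paramiko.SFTPClient.from_transport(transport)
# for name in pick_csvs(sftp.listdir("/incoming")):
#     sftp.get(f"/incoming/{name}", f"/dbfs/tmp/{name}")
# transport.close()
# df = spark.read.csv("dbfs:/tmp/*.csv", header=True)
```

This downloads through the driver node; for very large transfers, staging the files into cloud storage first scales better.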

  10. How to trigger a Databricks job from another Databricks job?

    Aug 14, 2023 · Databricks is now rolling out new functionality, called "Job as a Task", that allows you to trigger another job as a task in a workflow. The documentation isn't updated yet, but you …
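Before "Job as a Task", the standard approach was to call the Jobs API run-now endpoint (`POST /api/2.1/jobs/run-now`) from one job to trigger another. A request-building sketch using only the standard library; the host, token, and job ID are placeholders:

```python
import json
from urllib import request

def run_now_request(host: str, token: str, job_id: int, notebook_params=None):
    """Build a request for the Jobs API run-now endpoint
    (POST /api/2.1/jobs/run-now) to trigger another job."""
    payload = {"job_id": job_id}
    if notebook_params:
        payload["notebook_params"] = notebook_params  # passed to the target job
    return request.Request(
        f"{host}/api/2.1/jobs/run-now",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# resp = request.urlopen(run_now_request(host, token, job_id=123))
```

With "Job as a Task" available, the same thing is a task of type `run_job_task` in the workflow definition, with no API plumbing in your own code.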