  1. Is there a way to use parameters in Databricks in SQL with parameter ...

    Sep 29, 2024 · Databricks requires the IDENTIFIER() clause when using widgets to reference objects such as tables and fields, which is exactly what you're doing.
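A minimal sketch of the pattern the answer describes: a parameter marker wrapped in IDENTIFIER() so Databricks SQL resolves the bound string as an object name rather than a string literal. The parameter and table names here are hypothetical; in a notebook you would pass the statement to `spark.sql()` with the widget's value.

```python
# Sketch: build a statement that resolves a bound parameter to a table name
# via IDENTIFIER(). In a notebook this string would be executed with
# spark.sql(stmt, args={"table_name": dbutils.widgets.get("table_name")}).

def build_select(table_param: str) -> str:
    """Return a SELECT whose target table is a named parameter marker."""
    # IDENTIFIER(:param) tells Databricks SQL to treat the bound value
    # as an identifier (table/column name), not as a string literal.
    return f"SELECT * FROM IDENTIFIER(:{table_param})"

stmt = build_select("table_name")
print(stmt)  # SELECT * FROM IDENTIFIER(:table_name)
```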

  2. Databricks shows REDACTED on a hardcoded value - Stack Overflow

    Mar 16, 2023 · It's not possible; Databricks simply scans the entire output for occurrences of secret values and replaces them with "[REDACTED]". The redaction is defeated if you transform the value. For example, like you …
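The scan described above matches only the literal secret string, so any reversible transformation slips past it. A minimal illustration, with a stand-in string instead of a real `dbutils.secrets.get()` call:

```python
import base64

secret = "s3cr3t-value"  # stand-in for dbutils.secrets.get(scope, key)

# Databricks scans cell output for exact occurrences of secret values and
# prints [REDACTED] in their place. A transformed value no longer matches,
# e.g. after base64-encoding:
encoded = base64.b64encode(secret.encode()).decode()
print(encoded)  # does not match the literal secret, so it is not redacted

# The transformation is trivially reversible:
print(base64.b64decode(encoded).decode())
```

This also shows why redaction is a display convenience, not a security boundary: anyone who can run code in the notebook can recover the plaintext.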

  3. Printing secret value in Databricks - Stack Overflow

    Nov 11, 2021 · 2 Building on @camo's answer, since you're looking to use the secret value outside Databricks, you can use the Databricks Python SDK to fetch the bytes representation of the secret …
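A hedged sketch of fetching a secret outside a notebook with the Databricks Python SDK, as the answer suggests. The scope and key names are placeholders, and the SDK call is shown commented out because it needs workspace credentials; the API returns the secret base64-encoded, so the decoding step is the testable part here.

```python
import base64

# from databricks.sdk import WorkspaceClient
# w = WorkspaceClient()  # reads host/token from env vars or ~/.databrickscfg
# resp = w.secrets.get_secret(scope="my-scope", key="my-key")  # placeholders
# plaintext = decode_secret(resp.value)

def decode_secret(b64_value: str) -> str:
    """The secrets API returns the value base64-encoded; decode to text."""
    return base64.b64decode(b64_value).decode()
```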

  4. Where does databricks store the managed tables? - Stack Overflow

    Nov 6, 2024 · Answering your two sub-questions individually below: Does this mean that Databricks stores tables in the default Storage Account created during the creation of the Databricks workspace? …

  5. Databricks api list all jobs from workspace - Stack Overflow

    Jul 17, 2024 · I am trying to get all job data from my Databricks workspace. Basically, I need to put all job data into a DataFrame. There are more than 3000 jobs, so I need to use the page_token to traverse all pages. …
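The page_token traversal described above can be sketched as a generic pagination loop. Here `fetch_page` stands in for an authenticated call to the Jobs list API (which returns a `next_page_token` field while more pages remain); the response shape is an assumption based on that API, and the helper is written so the paging logic itself is self-contained.

```python
# Sketch: collect all jobs by following next_page_token until it is absent.
# fetch_page(token) is a placeholder for an authenticated HTTP call to the
# Jobs list endpoint; it should return the decoded JSON dict for one page.

def list_all_jobs(fetch_page):
    """Return every job across all pages.

    fetch_page(token) -> dict with a 'jobs' list and, while more pages
    remain, a 'next_page_token' string. Pass token=None for the first page.
    """
    jobs, token = [], None
    while True:
        resp = fetch_page(token)
        jobs.extend(resp.get("jobs", []))
        token = resp.get("next_page_token")
        if not token:
            return jobs
```

The accumulated list of job dicts can then be loaded into a DataFrame with `pd.DataFrame(jobs)`.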

  6. azure - Databricks - Read CSV file from folder - Stack Overflow

    Oct 21, 2022 · Databricks - Read CSV file from folder Asked 3 years, 5 months ago Modified 1 year, 11 months ago Viewed 13k times

  7. How to use python variable in SQL Query in Databricks?

    Jun 4, 2022 · I am trying to convert a SQL stored procedure to a Databricks notebook. In the stored procedure below, 2 statements are to be implemented. Here tables 1 and 2 are Delta Lake tables …

  8. Databricks - Download a dbfs:/FileStore file to my Local Machine

    Method3: Using third-party tool named DBFS Explorer DBFS Explorer was created as a quick way to upload and download files to the Databricks filesystem (DBFS). This will work with both AWS and …

  9. python - How do you get the run parameters and runId within …

    Jul 21, 2020 · When running a Databricks notebook as a job, you can specify job or run parameters that can be used within the code of the notebook. However, it wasn't clear from documentation how you …

  10. Clearing Cached Data on Databricks Cluster - Stack Overflow

    Jan 17, 2025 · So I tried to disable the cache using: spark.conf.set("spark.databricks.io.cache.enabled", "false") I understand this only disables the IO cache and while it does lead to some reduction in the …