  1. Printing secret value in Databricks - Stack Overflow

    Nov 11, 2021 · Building on @camo's answer: since you're looking to use the secret value outside Databricks, you can use the Databricks Python SDK to fetch the bytes representation of the secret …
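A minimal sketch of the approach the snippet describes, assuming the `databricks-sdk` package is installed and workspace authentication (e.g. environment variables) is already configured; the scope and key names are hypothetical. The SDK returns the secret value base64-encoded, so the decoding step is split out as its own function:

```python
import base64


def decode_secret(value_b64: str) -> str:
    """Decode the base64-encoded secret value returned by the SDK."""
    return base64.b64decode(value_b64).decode("utf-8")


def fetch_secret(scope: str, key: str) -> str:
    """Fetch a secret outside a notebook via the Databricks Python SDK.

    Assumes `pip install databricks-sdk` and that auth is configured
    (e.g. DATABRICKS_HOST / DATABRICKS_TOKEN environment variables).
    """
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()
    resp = w.secrets.get_secret(scope=scope, key=key)  # value is base64-encoded
    return decode_secret(resp.value)


# Usage (inside an authenticated environment; names are hypothetical):
# print(fetch_secret("my-scope", "my-key"))
```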

  2. Databricks shows REDACTED on a hardcoded value - Stack Overflow

    Mar 16, 2023 · It's not possible; Databricks just scans the entire output for occurrences of secret values and replaces them with "[REDACTED]". It is helpless if you transform the value. For example, like you …
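Because the scan only redacts exact matches of the literal secret string, any transformation of the value defeats it. A common workaround seen in these threads is to print the characters separated by spaces; a minimal sketch:

```python
def reveal(secret: str, sep: str = " ") -> str:
    """Transform a secret so its literal value never appears in the output.

    Databricks redacts only exact occurrences of the secret string, so
    e.g. space-separating the characters bypasses the [REDACTED] scan.
    Use with care: this defeats a safety mechanism by design.
    """
    return sep.join(secret)


# reveal("hunter2") produces "h u n t e r 2", which the redaction
# scanner will not match against the original "hunter2".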

  3. Is there a way to use parameters in Databricks in SQL with parameter ...

    Sep 29, 2024 · EDIT: I got a message from a Databricks employee that currently (DBR 15.4 LTS) the parameter marker syntax is not supported in this scenario. It might work in future versions. …

  4. azure - Databricks Account level authentication - Stack Overflow

    May 28, 2025 · I am trying to authenticate at the Databricks account level using a service principal. My service principal is the account admin. Below is what I am running within the Databricks notebook …

  5. REST API to query Databricks table - Stack Overflow

    Jul 24, 2022 · Is Databricks designed for such use cases, or is it a better approach to copy this table (gold layer) into an operational database such as Azure SQL DB after the transformations are done in PySpark …

  6. databricks asset bundle switch between run_as configs

    May 10, 2024 · Asked 1 year, 11 months ago; modified 1 year, 10 months ago; viewed 1k times.
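A minimal `databricks.yml` sketch of how `run_as` can differ per target in a Databricks Asset Bundle, which is what this question is about; the bundle name, user, and service-principal ID below are hypothetical placeholders:

```yaml
# databricks.yml — hypothetical names; run_as is overridden per target
bundle:
  name: my_bundle

targets:
  dev:
    run_as:
      user_name: dev.user@example.com       # jobs run as this user in dev
  prod:
    run_as:
      service_principal_name: "00000000-0000-0000-0000-000000000000"
```

Selecting a target with `databricks bundle deploy -t prod` then picks up that target's `run_as` block.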

  7. How to deploy a Streamlit app using Databricks App

    Oct 12, 2024 · I want to run Streamlit using a Databricks App. I have Python code (find below) that I can use in Visual Studio Code to run the Streamlit app, and even upload files submitted to Streamlit to Azure …

  8. What is the correct way to access a workspace file in databricks

    Nov 16, 2023 · According to these documentation pages (1, 2), workspace files or assets are available for Databricks Runtime 11.2 and above. With Databricks Runtime 11.2 and above, you can create and …

  9. Installing multiple libraries 'permanently' on Databricks' cluster ...

    Feb 28, 2024 · Easiest is to use the Databricks CLI's libraries command for an existing cluster (or the create job command, specifying appropriate params for your job cluster). You can use the REST API itself, same …

  10. how to get databricks job id at the run time - Stack Overflow

    Jun 9, 2025 · I am trying to get the job id and run id of a Databricks job dynamically and keep it in a table with the below code: run_id = self.spark.conf.get("spark.databricks.job.runId", "no_ru...
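A small sketch of the pattern the question uses: reading the job/run ids from Spark conf with a fallback default for interactive (non-job) runs. The conf keys follow the snippet's `spark.databricks.job.runId`; `spark.databricks.job.id` is an assumption by analogy. The lookup is parameterized so it can be exercised without a live Spark session:

```python
def get_job_ids(conf_get) -> dict:
    """Read Databricks job/run ids from Spark conf with fallbacks.

    `conf_get(key, default)` mirrors the signature of `spark.conf.get`;
    inside a Databricks job, pass `spark.conf.get` directly. Outside a
    job run, the keys are unset and the defaults are returned.
    """
    return {
        "job_id": conf_get("spark.databricks.job.id", "no_job_id"),
        "run_id": conf_get("spark.databricks.job.runId", "no_run_id"),
    }


# In a Databricks notebook or job:
#   ids = get_job_ids(spark.conf.get)
```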