  1. Printing secret value in Databricks - Stack Overflow

    Nov 11, 2021 · Building on @camo's answer, since you're looking to use the secret value outside Databricks, you can use the Databricks Python SDK to fetch the bytes representation of the secret …
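A minimal sketch of the approach described in the snippet, assuming the `databricks-sdk` package and a workspace configured via `DATABRICKS_HOST`/`DATABRICKS_TOKEN`; the scope and key names are placeholders. The Secrets API returns the payload base64-encoded, so the decode step below is the part that yields the raw bytes:

```python
import base64

def decode_secret(b64_value: str) -> bytes:
    """The Secrets API returns the secret payload base64-encoded;
    decode it to recover the raw bytes."""
    return base64.b64decode(b64_value)

# Outside Databricks (sketch; requires databricks-sdk and workspace credentials):
# from databricks.sdk import WorkspaceClient
# w = WorkspaceClient()
# resp = w.secrets.get_secret(scope="my-scope", key="my-key")
# raw = decode_secret(resp.value)
```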

  2. Databricks shows REDACTED on a hardcoded value - Stack Overflow

    Mar 16, 2023 · It's not possible; Databricks just scans the entire output for occurrences of secret values and replaces them with "[REDACTED]". The scan is defeated if you transform the value. For example, like you …
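To illustrate why the redaction is best-effort: the scan matches the literal secret string in cell output, so any transformation of the value no longer matches. A minimal sketch (the secret value here is a placeholder):

```python
def transform(value: str) -> str:
    # Interleave spaces so the printed output no longer matches
    # the literal secret string the redaction scan looks for.
    return " ".join(value)

# In a notebook, print(secret) would show [REDACTED], but
# print(transform(secret)) would show the characters spaced out.
print(transform("hunter2"))
```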

  3. Is there a way to use parameters in Databricks in SQL with parameter ...

    Sep 29, 2024 · Databricks requires the IDENTIFIER() clause when using widgets to reference objects such as tables and fields, which is exactly what you're doing.
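A sketch of the pattern from PySpark, assuming Spark 3.4+ named-parameter support and a widget named `table_name` (both the widget and table names here are hypothetical):

```python
# IDENTIFIER(:tbl) lets a parameter stand in for an object name,
# which plain :tbl substitution does not allow in that position.
query = "SELECT * FROM IDENTIFIER(:tbl) WHERE year = :yr"

# Inside a Databricks notebook (sketch):
# tbl = dbutils.widgets.get("table_name")
# df = spark.sql(query, args={"tbl": tbl, "yr": 2024})
print(query)
```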

  4. REST API to query Databricks table - Stack Overflow

    Jul 24, 2022 · Is Databricks designed for such use cases, or is a better approach to copy this table (gold layer) into an operational database such as Azure SQL DB after the transformations are done in PySpark …

  5. databricks asset bundle switch between run_as configs

    May 10, 2024 · Asked 1 year, 11 months ago · Modified 1 year, 10 months ago · Viewed 1k times

  6. Identify Public Outbound IP Address for a databricks instance

    Aug 28, 2024 · We have an Azure Databricks instance deployed with SCC (Secure Cluster Connectivity) in the East US 2 region. We have a public IP address space created and a NAT Gateway associated …

  7. ssms - onPrem sql server connection using jdbc driver in databricks ...

    Mar 27, 2025 · You can connect your on-premises SQL Server to Databricks using the JDBC driver available in the Databricks Runtime. Below are the high-level steps to establish this connection: Steps to …
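The high-level steps can be sketched as follows, assuming network connectivity from the cluster to the on-prem host; the server name, database, table, and secret scope below are placeholders:

```python
def jdbc_url(host: str, port: int, database: str) -> str:
    # Standard SQL Server JDBC URL shape; the driver ships with
    # the Databricks Runtime, so no extra install is needed there.
    return f"jdbc:sqlserver://{host}:{port};databaseName={database}"

url = jdbc_url("onprem-sql.corp.example", 1433, "sales")

# On a Databricks cluster (sketch; requires a SparkSession):
# df = (spark.read.format("jdbc")
#       .option("url", url)
#       .option("dbtable", "dbo.orders")
#       .option("user", "svc_user")
#       .option("password", dbutils.secrets.get("scope", "sql-pwd"))
#       .load())
```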

  8. function - Previous month query - Databricks - Stack Overflow

    Nov 22, 2022 · Asked 3 years, 5 months ago · Modified 3 years, 5 months ago · Viewed 7k times

  9. What is the correct way to access a workspace file in databricks

    Nov 16, 2023 · According to the documentation (1, 2), workspace files or assets are available in Databricks Runtime 11.2 and above. With Databricks Runtime 11.2 and above, you can create and …

  10. Installing multiple libraries 'permanently' on Databricks' cluster ...

    Feb 28, 2024 · The easiest option is to use the Databricks CLI's libraries command for an existing cluster (or the create job command, specifying appropriate params for your job cluster). You can also use the REST API itself, same …
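The REST route mentioned in the snippet can be sketched with the Libraries API; the cluster ID and package names below are placeholders, and `$DATABRICKS_HOST`/`$DATABRICKS_TOKEN` are assumed to be set. Libraries installed this way are reinstalled on cluster restart, which is what "permanently" amounts to here:

```shell
# Sketch: install PyPI packages on an existing cluster via the Libraries API.
curl -X POST "$DATABRICKS_HOST/api/2.0/libraries/install" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -d '{
        "cluster_id": "0123-456789-abcdef",
        "libraries": [
          {"pypi": {"package": "great-expectations"}},
          {"pypi": {"package": "holidays"}}
        ]
      }'
```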