  1. Is there a way to use parameters in Databricks in SQL with parameter ...

    Sep 29, 2024 · Databricks requires the IDENTIFIER() clause when using widgets to reference objects such as tables, fields, etc., which is exactly what you're doing.
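    The IDENTIFIER() clause mentioned above resolves a bound string parameter into an object name. A minimal sketch, assuming a Spark 3.4+ session with named-parameter support (the table name and the `spark` handle are placeholders for a notebook context):

```python
def read_table(spark, table_name: str):
    """Query a table whose name arrives as a string parameter.

    IDENTIFIER(:t) tells the SQL engine to resolve the bound string as an
    object name (table, view, ...) instead of treating it as a literal.
    """
    return spark.sql("SELECT * FROM IDENTIFIER(:t)", args={"t": table_name})
```

    In a notebook, `table_name` would typically come from a widget, e.g. `dbutils.widgets.get("table_name")` (the widget name is an assumption).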

  2. Printing secret value in Databricks - Stack Overflow

    Nov 11, 2021 · Building on @camo's answer: since you're looking to use the secret value outside Databricks, you can use the Databricks Python SDK to fetch the bytes representation of the secret …
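    Building on that snippet: the workspace Secrets API returns values base64-encoded, so they must be decoded client-side. A sketch under that assumption (the SDK call is shown in comments because it needs a live workspace; the scope and key names are placeholders):

```python
import base64

def decode_secret(b64_value: str) -> str:
    """Secret values come back from the workspace API base64-encoded;
    decode them to the original UTF-8 string."""
    return base64.b64decode(b64_value).decode("utf-8")

# Sketch against a live workspace (assumes `databricks-sdk` is installed):
#   from databricks.sdk import WorkspaceClient
#   w = WorkspaceClient()
#   raw = w.secrets.get_secret(scope="my-scope", key="my-key").value
#   print(decode_secret(raw))
```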

  3. Installing multiple libraries 'permanently' on Databricks' cluster ...

    Feb 28, 2024 · Installing multiple libraries 'permanently' on a Databricks cluster.

  4. Databricks shared access mode limitations - Stack Overflow

    Oct 2, 2023 · You're correct about listed limitations. But when you're using Unity Catalog, especially with shared clusters, you need to think a bit differently than before. UC + shared clusters provide very …

  5. Databricks change default catalog - Stack Overflow

    Jun 29, 2023 · But Databricks recommends keeping the default catalog as hive_metastore, because changing the default catalog can break existing data operations that depend on it.

  6. REST API to query Databricks table - Stack Overflow

    Jul 24, 2022 · Is Databricks designed for such use cases, or is it a better approach to copy this table (gold layer) into an operational database such as Azure SQL DB after the transformations are done in PySpark …

  7. List databricks secret scope and find referred keyvault in azure ...

    Jun 23, 2022 · How can we find the existing secret scopes in a Databricks workspace, and which Key Vault is referenced by a specific secret scope in Azure Databricks?
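    One way to answer both questions is to list the scopes and inspect each scope's backend metadata: Azure Key Vault-backed scopes carry the vault's DNS name and resource ID. A sketch assuming the response shape of the Secrets API (`backend_type`, `keyvault_metadata`):

```python
def keyvault_backed_scopes(scopes):
    """Given scope records shaped like the Secrets API response, e.g.
    {'name': ..., 'backend_type': ..., 'keyvault_metadata': {...}},
    return a mapping from scope name to the backing Key Vault DNS name."""
    return {
        s["name"]: s["keyvault_metadata"]["dns_name"]
        for s in scopes
        if s.get("backend_type") == "AZURE_KEYVAULT"
    }

# Sketch against a live workspace (assumes `databricks-sdk` is installed):
#   from databricks.sdk import WorkspaceClient
#   w = WorkspaceClient()
#   scopes = [s.as_dict() for s in w.secrets.list_scopes()]
#   print(keyvault_backed_scopes(scopes))
```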

  8. Databricks - Download a dbfs:/FileStore file to my Local Machine

    Method 3: Using a third-party tool named DBFS Explorer. DBFS Explorer was created as a quick way to upload and download files to the Databricks filesystem (DBFS). This will work with both AWS and …
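    Besides a GUI tool, the same download can be scripted against the DBFS REST API, which serves file contents as base64-encoded chunks of at most 1 MB per read. A sketch of the chunk-assembly step (the REST call itself is outlined in comments; host, token, and path are placeholders):

```python
import base64

def assemble_chunks(chunks):
    """The DBFS read endpoint returns file contents as base64-encoded
    chunks; decode each one and concatenate into the full file bytes."""
    return b"".join(base64.b64decode(c) for c in chunks)

# Sketch of the read loop (endpoint shape per the DBFS API; not run here):
#   GET {host}/api/2.0/dbfs/read?path=dbfs:/FileStore/out.csv&offset=N&length=1048576
#   with an Authorization: Bearer {token} header; repeat, advancing offset
#   by each response's bytes_read until bytes_read == 0, collecting every
#   response's "data" field into `chunks`, then write assemble_chunks(chunks)
#   to a local file.
```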

  9. How to import own modules from repo on Databricks?

    Dec 7, 2022 · I have connected a Github repository to my Databricks workspace, and am trying to import a module that's in this repo into a notebook also within the repo. The structure is as such: …
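    A common workaround for this, assuming the module lives under the repo root, is to put the repo root on `sys.path` before importing (the repo path and module names below are hypothetical):

```python
import sys

def add_repo_root(repo_root: str) -> None:
    """Make modules under the repo importable by prepending the repo root
    to sys.path, without adding duplicate entries on re-runs."""
    if repo_root not in sys.path:
        sys.path.insert(0, repo_root)

# In a Databricks notebook (hypothetical layout /Workspace/Repos/<user>/<repo>):
#   add_repo_root("/Workspace/Repos/me@example.com/my-repo")
#   from utils.helpers import my_function   # hypothetical module
```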

  10. azure devops - How can I pass parameters to databricks.yml in ...

    Nov 24, 2023 · Background: I have a separate Databricks workspace for each environment, and I am building an Azure DevOps pipeline to deploy Databricks Asset Bundles to these environments. …