<?xml version="1.0" encoding="utf-8" ?><rss version="2.0"><channel><title>Bing: Databricks UI Help</title><link>http://www.bing.com:80/search?q=Databricks+UI+Help</link><description>Search results</description><image><url>http://www.bing.com:80/s/a/rsslogo.gif</url><title>Databricks UI Help</title><link>http://www.bing.com:80/search?q=Databricks+UI+Help</link></image><copyright>Copyright © 2026 Microsoft. All rights reserved. These XML results may not be used, reproduced or transmitted in any manner or for any purpose other than rendering Bing results within an RSS aggregator for your personal, non-commercial use. Any other use of these results requires express written permission from Microsoft Corporation. By accessing this web page or using these results in any manner whatsoever, you agree to be bound by the foregoing restrictions.</copyright><item><title>Is there a way to use parameters in Databricks in SQL with parameter ...</title><link>https://stackoverflow.com/questions/79035989/is-there-a-way-to-use-parameters-in-databricks-in-sql-with-parameter-marker-synt</link><description>EDIT: I got a message from a Databricks employee that currently (DBR 15.4 LTS) the parameter marker syntax is not supported in this scenario. It might work in future versions. 
Original question:...</description><pubDate>Mon, 20 Apr 2026 13:24:00 GMT</pubDate></item><item><title>Printing secret value in Databricks - Stack Overflow</title><link>https://stackoverflow.com/questions/69925461/printing-secret-value-in-databricks</link><description>Building on @camo's answer, since you're looking to use the secret value outside Databricks, you can use the Databricks Python SDK to fetch the bytes representation of the secret value, then decode and print locally (or on any compute resource outside of Databricks).</description><pubDate>Tue, 21 Apr 2026 19:21:00 GMT</pubDate></item><item><title>databricks asset bundle switch between run_as configs</title><link>https://stackoverflow.com/questions/78460887/databricks-asset-bundle-switch-between-run_as-configs</link><description>databricks asset bundle switch between run_as configs</description><pubDate>Mon, 20 Apr 2026 22:14:00 GMT</pubDate></item><item><title>Creating Databricks Database Snapshot - Stack Overflow</title><link>https://stackoverflow.com/questions/75447620/creating-databricks-database-snapshot</link><description>Databricks is not like a traditional database where all data is stored "inside" the database. For example, Amazon RDS provides a "snapshot" feature that can dump the entire contents of a database, and the snapshot can then be restored to a new database server if required. The equivalent in Databricks would be Delta Lake time travel, which allows you to access the database as it was at a ...</description><pubDate>Wed, 22 Apr 2026 10:01:00 GMT</pubDate></item><item><title>ValueError: cannot configure default credentials - Stack Overflow</title><link>https://stackoverflow.com/questions/78989067/valueerror-cannot-configure-default-credentials-visual-studio-and-databricks</link><description>I am trying to connect to Databricks with the Visual Studio Code Databricks extension. 
I get the following error when running databricks-connect test. In the extension everything looks fine and I am logged into Databricks.</description><pubDate>Tue, 21 Apr 2026 21:22:00 GMT</pubDate></item><item><title>databricks: writing spark dataframe directly to excel</title><link>https://stackoverflow.com/questions/59107489/databricks-writing-spark-dataframe-directly-to-excel</link><description>I'm assuming that because you have the "databricks" tag you are wanting to create an .xlsx file within the Databricks file store and that you are running code within Databricks notebooks. I'm also going to assume that your notebooks are running Python. There is no direct way to save an Excel document from a Spark dataframe.</description><pubDate>Tue, 21 Apr 2026 17:48:00 GMT</pubDate></item><item><title>How to check condition before starting Databricks workflows jobs ...</title><link>https://stackoverflow.com/questions/77941118/how-to-check-condition-before-starting-databricks-workflows-jobs</link><description>Is there any service/technique available in Databricks workflows with which I can perform a pre-specified condition check and decide which path to execute? 
Note: Databricks is running in Azure, but the requirement is to explore ways to make this pipeline as Databricks-native as possible; Data Factory cannot be used.</description><pubDate>Tue, 21 Apr 2026 20:11:00 GMT</pubDate></item><item><title>sql - Exploding Column in Databricks row-wise - Stack Overflow</title><link>https://stackoverflow.com/questions/76806716/exploding-column-in-databricks-row-wise</link><description>Exploding Column in Databricks row-wise</description><pubDate>Wed, 22 Apr 2026 16:42:00 GMT</pubDate></item><item><title>Databricks - Download a dbfs:/FileStore file to my Local Machine</title><link>https://stackoverflow.com/questions/66685638/databricks-download-a-dbfs-filestore-file-to-my-local-machine</link><description>Method 1: Using the Databricks portal GUI, you can download full results (max 1 million rows). Method 2: Using the Databricks CLI. To download full results, first save the file to DBFS and then copy the file to your local machine using the Databricks CLI as follows.</description><pubDate>Wed, 22 Apr 2026 04:25:00 GMT</pubDate></item><item><title>Unable to login to Azure Databricks Account Console</title><link>https://stackoverflow.com/questions/78843411/unable-to-login-to-azure-databricks-account-console</link><description>I am also the Databricks account administrator. Until two weeks ago, I was able to access the Databricks account console without any issues, but I have been facing issues for the last two weeks, as below.</description><pubDate>Tue, 21 Apr 2026 17:19:00 GMT</pubDate></item></channel></rss>