<?xml version="1.0" encoding="utf-8" ?><rss version="2.0"><channel><title>Bing: Databricks Compute Options</title><link>http://www.bing.com:80/search?q=Databricks+Compute+Options</link><description>Search results</description><image><url>http://www.bing.com:80/s/a/rsslogo.gif</url><title>Databricks Compute Options</title><link>http://www.bing.com:80/search?q=Databricks+Compute+Options</link></image><copyright>Copyright © 2026 Microsoft. All rights reserved. These XML results may not be used, reproduced or transmitted in any manner or for any purpose other than rendering Bing results within an RSS aggregator for your personal, non-commercial use. Any other use of these results requires express written permission from Microsoft Corporation. By accessing this web page or using these results in any manner whatsoever, you agree to be bound by the foregoing restrictions.</copyright><item><title>Databricks - Download a dbfs:/FileStore file to my Local Machine</title><link>https://stackoverflow.com/questions/66685638/databricks-download-a-dbfs-filestore-file-to-my-local-machine</link><description>Method 1: Using the Databricks portal GUI, you can download full results (max 1 million rows). Method 2: Using the Databricks CLI. To download full results, first save the file to DBFS and then copy the file to your local machine using the Databricks CLI as follows.</description><pubDate>Wed, 22 Apr 2026 04:25:00 GMT</pubDate></item><item><title>Databricks: managed tables vs. external tables - Stack Overflow</title><link>https://stackoverflow.com/questions/78652707/databricks-managed-tables-vs-external-tables</link><description>While Databricks manages the metadata for external tables, the actual data remains in the specified external location, providing flexibility and control over the data storage lifecycle. 
This setup allows users to leverage existing data storage infrastructure while utilizing Databricks' processing capabilities.</description><pubDate>Thu, 23 Apr 2026 20:01:00 GMT</pubDate></item><item><title>Installing multiple libraries 'permanently' on Databricks' cluster ...</title><link>https://stackoverflow.com/questions/78075840/installing-multiple-libraries-permanently-on-databricks-cluster</link><description>The easiest approach is to use the Databricks CLI's libraries command for an existing cluster (or the create job command, specifying the appropriate parameters for your job cluster). You can also use the REST API itself (same links as above) with cURL or a similar tool.</description><pubDate>Wed, 22 Apr 2026 01:47:00 GMT</pubDate></item><item><title>How to import own modules from repo on Databricks?</title><link>https://stackoverflow.com/questions/74719291/how-to-import-own-modules-from-repo-on-databricks</link><description>I have connected a GitHub repository to my Databricks workspace and am trying to import a module that's in this repo into a notebook, also within the repo. The structure is as such: Repo_Name Chec...</description><pubDate>Thu, 23 Apr 2026 00:42:00 GMT</pubDate></item><item><title>Is there a way to use parameters in Databricks in SQL with parameter ...</title><link>https://stackoverflow.com/questions/79035989/is-there-a-way-to-use-parameters-in-databricks-in-sql-with-parameter-marker-synt</link><description>EDIT: I got a message from a Databricks employee that currently (DBR 15.4 LTS) the parameter marker syntax is not supported in this scenario. It might work in future versions. 
Original question:...</description><pubDate>Wed, 22 Apr 2026 07:16:00 GMT</pubDate></item><item><title>REST API to query Databricks table - Stack Overflow</title><link>https://stackoverflow.com/questions/73097372/rest-api-to-query-databricks-table</link><description>Is Databricks designed for such use cases, or is it a better approach to copy this table (gold layer) into an operational database such as Azure SQL DB after the transformations are done in PySpark via Databricks? What are the cons of this approach? One would be that the Databricks cluster would need to be up and running at all times, i.e., an interactive cluster.</description><pubDate>Wed, 22 Apr 2026 13:29:00 GMT</pubDate></item><item><title>databricks: writing spark dataframe directly to excel</title><link>https://stackoverflow.com/questions/59107489/databricks-writing-spark-dataframe-directly-to-excel</link><description>I'm assuming that because you have the "databricks" tag you want to create an .xlsx file within the Databricks file store and that you are running code within Databricks notebooks. I'm also going to assume that your notebooks are running Python. There is no direct way to save an Excel document from a Spark dataframe.</description><pubDate>Wed, 22 Apr 2026 21:35:00 GMT</pubDate></item><item><title>Databricks: Download a dbfs:/FileStore File to my Local Machine?</title><link>https://stackoverflow.com/questions/49019706/databricks-download-a-dbfs-filestore-file-to-my-local-machine</link><description>Easier options: Install the Databricks CLI, configure it with your Databricks credentials, and use the CLI's dbfs cp command. For example: dbfs cp dbfs:/FileStore/test.txt ./test.txt. 
If you want to download an entire folder of files, you can use dbfs cp -r.</description><pubDate>Wed, 22 Apr 2026 00:43:00 GMT</pubDate></item><item><title>databricks - How to retrieve a column value from DESCRIBE DETAIL &lt;table ...</title><link>https://stackoverflow.com/questions/71078767/how-to-retrieve-a-column-value-from-describe-detail-table-name</link><description>I would like to use the "Last modified" value from the description of my table in Databricks. I know how to get all the columns from the table by using "DESCRIBE DETAIL table_name",...</description><pubDate>Wed, 22 Apr 2026 03:27:00 GMT</pubDate></item><item><title>Do you know how to install the 'ODBC Driver 17 for SQL Server' on a ...</title><link>https://stackoverflow.com/questions/61022848/do-you-know-how-to-install-the-odbc-driver-17-for-sql-server-on-a-databricks-c</link><description>By default, Azure Databricks does not have the ODBC driver installed. Run the following commands in a single cell to install the MS SQL ODBC driver on an Azure Databricks cluster.</description><pubDate>Wed, 22 Apr 2026 08:07:00 GMT</pubDate></item></channel></rss>