I'm looking for a way to connect to Databricks Delta Lake tables from ADF and other Azure services (like Data Catalog). I don't see a Databricks data store listed in ADF's supported data stores.
You can, but it is quite complex: you need to use the ODBC connector in Azure Data Factory (https://docs.microsoft.com/en-us/azure/data-factory/connector-odbc), which requires a self-hosted integration runtime. Assuming you have the right drivers installed on the IR machine, you can configure the ODBC connection to point at a Databricks cluster.
The connection details for the ODBC settings can be found on the cluster settings screen in the Databricks workspace (https://docs.microsoft.com/en-us/azure/azure-databricks/connect-databricks-excel-python-r). The process is very similar to what you posted for Power BI.
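If you would rather script that linked service than click through the UI, a minimal sketch with the azure-mgmt-datafactory Python SDK might look like the following. The subscription, resource group, factory, and IR names are placeholders, and the connection string follows the Simba Spark driver format shown on the cluster's JDBC/ODBC tab; treat this as an illustration of the shape, not a verified setup:

```python
# Sketch: register an ODBC linked service in ADF that points at a
# Databricks cluster through a self-hosted integration runtime.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    IntegrationRuntimeReference,
    LinkedServiceResource,
    OdbcLinkedService,
    SecureString,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

odbc_ls = OdbcLinkedService(
    connection_string=(
        "Driver={Simba Spark ODBC Driver};"
        "Host=eastus.azuredatabricks.net;Port=443;SSL=1;"
        "HTTPPath=sql/protocolv1/o/XXXXXXXXXXXXXXX/XXXX-XXXXXX-XXXXXX;"
        "ThriftTransport=2;AuthMech=3"
    ),
    authentication_type="Basic",
    user_name="token",  # literally the string "token" for PAT authentication
    password=SecureString(value="dapiXXXXXXXXXXXXXXXXX"),  # Databricks personal access token
    connect_via=IntegrationRuntimeReference(reference_name="SelfHostedIR"),  # placeholder IR name
)

client.linked_services.create_or_update(
    "<resource-group>", "<factory-name>", "DatabricksOdbc",
    LinkedServiceResource(properties=odbc_ls),
)
```

The essential pieces are the driver-based connection string and connect_via pointing at the self-hosted IR; everything else can equally be done in the ADF authoring UI.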
Please refer to the Azure Data Factory section of the official Azure Databricks document User Guide > Developer Tools > Managing Dependencies in Data Pipelines. That topic lists two Azure documents on creating a Databricks notebook and running it with the Databricks Notebook Activity to transfer data in Azure Data Factory. I think they will help you meet your needs.
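For context, the notebook that the Databricks Notebook Activity runs can be as small as a read-transform-write over a Delta table. A hypothetical sketch (the table path and mount point are assumptions; `spark` is predefined in a Databricks notebook):

```python
# Read a Delta table, filter it, and land the result where downstream
# ADF activities can pick it up. Paths below are placeholders.
df = spark.read.format("delta").load("/delta/events")
recent = df.filter(df["event_date"] >= "2019-01-01")
recent.write.mode("overwrite").parquet("/mnt/staging/events")
```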
Actually, I figured out that it is possible to get metadata from any table inside a Databricks workspace directly, using the ODBC connection available in the current version of Azure Data Catalog. A native connector would be much better, but for now, if you want to give it a try, just fill in the info below (in the Azure Data Catalog publishing app):
Driver: Microsoft Spark ODBC Driver (it must be installed on your system)
Connection String: host=eastus.azuredatabricks.net;port=443;SSL=1;HTTPPath=sql/protocolv1/o/XXXXXXXXXXXXXXX/XXXX-XXXXXX-XXXXXX;transportMode=http;AuthMech=8
User: token
Password: dapiXXXXXXXXXXXXXXXXXXXXX
And leave the Database field blank.
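Before pointing Data Catalog at the cluster, it can help to verify the exact same settings with a quick pyodbc check from a machine that has the driver installed. A sketch, reusing the placeholder host, HTTPPath, and token from above:

```python
import pyodbc

# Same fields as in the Data Catalog publishing app; all values are placeholders.
conn = pyodbc.connect(
    "Driver={Microsoft Spark ODBC Driver};"
    "Host=eastus.azuredatabricks.net;Port=443;SSL=1;"
    "HTTPPath=sql/protocolv1/o/XXXXXXXXXXXXXXX/XXXX-XXXXXX-XXXXXX;"
    "transportMode=http;AuthMech=8;"
    "UID=token;PWD=dapiXXXXXXXXXXXXXXXXXXXXX",
    autocommit=True,
)

# Enumerating tables exercises the same metadata path Data Catalog harvests.
for row in conn.cursor().tables():
    print(row.table_cat, row.table_schem, row.table_name)
```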