Databricks (Spark): .egg dependencies not installed automatically?

醉话见心 asked 2021-02-08 09:38

I have a locally created .egg package that depends on boto==2.38.0. I used setuptools to create the build distribution. Everything works in my own local environment, but when I attach the .egg as a library to a Databricks cluster, the boto dependency is not installed automatically.
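
For reference, a minimal setup.py along these lines would produce such an .egg; the package name my_package is a hypothetical stand-in for the asker's actual package:

    # Hypothetical minimal setup.py for the .egg described above.
    # Build the egg with: python setup.py bdist_egg
    from setuptools import setup, find_packages

    setup(
        name="my_package",  # stand-in name, not from the question
        version="0.1.0",
        packages=find_packages(),
        # Declared here, but Databricks does not resolve install_requires
        # automatically when the .egg is attached to a cluster.
        install_requires=["boto==2.38.0"],
    )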

1 Answer
  • answered 2021-02-08 09:46

    Your application's dependencies will not, in general, install properly on a cluster unless they uniformly support the cluster's Python version. The Databricks docs explain that

    Databricks will install the correct version if the library supports both Python 2 and 3. If the library does not support Python 3, library attachment will fail with an error.

    Either way, Databricks does not automatically fetch an .egg's transitive dependencies when you attach it to a cluster. Each dependency, here boto==2.38.0, has to be attached to the cluster as its own library, or bundled into the distribution you upload.
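
    As a hedged sketch of common workarounds (assuming a Databricks notebook environment, where dbutils is predefined): either attach boto==2.38.0 as a separate PyPI library on the cluster's Libraries tab, or install it notebook-scoped:

        # Notebook-scoped install on older Databricks runtimes that ship
        # the dbutils.library utility; affects only this notebook session.
        dbutils.library.installPyPI("boto", version="2.38.0")
        dbutils.library.restartPython()

        # On newer runtimes, the supported equivalent is the %pip magic,
        # run in its own notebook cell:
        # %pip install boto==2.38.0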
