Databricks Community

Hello,

I am experiencing issues importing a schema file I created from our utils repo.

This is the logic we use for all ingestion, and all of the other schemas live in this repo under utils/schemas (see the sketch below).
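For context, the import pattern looks roughly like this (a minimal sketch; the repo root path is an assumption taken from the error message, and in a Databricks Repo the repo root is normally already on sys.path):

    # Minimal sketch of the ingestion import pattern (repo root path is an assumption).
    import sys
    sys.path.append("/Workspace/Repos/Connectors/Dev")  # make the repo root importable if it isn't already
    from utils.schemas import Comptroller as schemas    # schema-only module for this pipeline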

I am unable to access the file I created for a new ingestion pipeline, and I get the following error:

OSError: [Errno 95] Operation not supported: '/Workspace/Repos/Connectors/Dev/utils/schemas/ Comptroller.py '

---------------------------------------------------------------------------
OSError                                   Traceback (most recent call last)
File <command-2008379283730306>:10
      7 from pyspark.sql import DataFrame as SparkDataFrame
      8 # from utils.schemas import comptroller as schemas # does not allow me to import
      9 # workaround
---> 10 from utils.schemas.Comptroller import *

File /databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py:172, in _create_import_patch.<locals>.import_patch(name, globals, locals, fromlist, level)
    167 thread_local._nest_level += 1
    169 try:
    170     # Import the desired module. If you're seeing this while debugging a failed import,
    171     # look at preceding stack frames for relevant error information.
--> 172     original_result = python_builtin_import(name, globals, locals, fromlist, level)
    174 is_root_import = thread_local._nest_level == 1
    175 # `level` represents the number of leading dots in a relative import statement.
    176 # If it's zero, then this is an absolute import.

File <frozen importlib._bootstrap>:1027, in _find_and_load(name, import_)
File <frozen importlib._bootstrap>:1006, in _find_and_load_unlocked(name, import_)
File <frozen importlib._bootstrap>:688, in _load_unlocked(spec)
File <frozen importlib._bootstrap_external>:879, in exec_module(self, module)
File <frozen importlib._bootstrap_external>:1016, in get_code(self, fullname)
File <frozen importlib._bootstrap_external>:1073, in get_data(self, path)

OSError: [Errno 95] Operation not supported: '/Workspace/Repos/Connectors/Dev/utils/schemas/ Comptroller.py '

I tried to import a different file from the same location and it worked: /utils/schemas/differentfile.py
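For comparison, that working import is something like the following (the module name is just a placeholder for one of the existing schema files):

    # The same kind of import from the same package works fine (placeholder module name).
    from utils.schemas import differentfile as other_schemas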

I can see all the contents in the file I'm trying to import.

The schema files are self-explanatory and contain only the schemas for the tables being ingested.

Any idea why this is happening? Is it a repo sync issue? A permissions issue?

I'm really confused by it.

I am also wondering what the difference between the file icons means. I assumed the icon means that this file was created by me.

I'm a Databricks newbie.

Thank you in advance.

@Debayan Mukherjee

Hello, thank you for your response.

Please let me know if these are the correct commands to access the file from a notebook.

I can see the files in the repo folder, but I just noticed this: the file I am trying to access shows a size of 0, even though there is content in it, so it shouldn't be 0.
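For reference, a quick way to double-check this from a notebook cell is something like the following (the path is taken from the traceback above; this is just a sketch, not necessarily the exact commands):

    # Quick sanity check from a notebook cell: list the package folder and check the file size.
    import os

    schema_dir = "/Workspace/Repos/Connectors/Dev/utils/schemas"
    print(os.listdir(schema_dir))                                        # the file shows up here
    print(os.path.getsize(os.path.join(schema_dir, "Comptroller.py")))   # this is where I see 0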

Thank you

