Databricks Community

I get a PythonException: `float() argument must be a string or a number, not 'NoneType'` when attempting to save a DataFrame as a Delta table.

Here's the line of code that I am running:

```
df.write.format("delta").saveAsTable("schema1.df_table", mode="overwrite")
```

The schema of `df` (output of `df.printSchema()`):

```
root
 |-- ts: timestamp (nullable = true)
 |-- _source: string (nullable = true)
 |-- lat: decimal(9,6) (nullable = true)
 |-- lng: decimal(9,6) (nullable = true)
 |-- id: string (nullable = false)
 |-- mm-yyyy: date (nullable = true)
 |-- hid: string (nullable = true)
```

The full error:

```
PythonException: An exception was thrown from the Python worker. Please see the stack trace below.
'TypeError: float() argument must be a string or a number, not 'NoneType'', from , line 3.
Full traceback below:
Traceback (most recent call last):
  File "", line 3, in
TypeError: float() argument must be a string or a number, not 'NoneType'
```

How do I handle `None` in a Spark DataFrame? I'd like to identify which rows/columns contain `None` and drop them.

Hi @kll

Hope you are well. Just wanted to check whether you were able to find an answer to your question, and if so, would you like to mark one as best? It would be really helpful for the other members too.

Cheers!

Even though the code throws the error during the write, the root cause can lie earlier in your code, because Spark is lazily evaluated. The error `TypeError: float() argument must be a string or a number, not 'NoneType'` generally occurs when a variable is passed to `float()` while its value is `None`; the "Python worker" in the traceback suggests this is happening inside Python code applied to the DataFrame, such as a UDF. I would recommend checking for such cases.
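A minimal plain-Python sketch of this failure mode and one way to guard against it (the `safe_float` helper is illustrative, not from the original thread):

```python
def safe_float(value, default=None):
    """Convert to float, passing None through instead of raising TypeError."""
    if value is None:
        return default
    return float(value)

# Passing None straight to float() reproduces the reported error.
try:
    float(None)
except TypeError as exc:
    print(exc)  # e.g. float() argument must be a string or a number, not 'NoneType'

print(safe_float("3.5"))  # 3.5
print(safe_float(None))   # None
```

Applying a guard like this inside the UDF (or filtering nulls out before the UDF runs) avoids the exception at write time.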

© Databricks 2024. All rights reserved. Apache, Apache Spark, Spark and the Spark logo are trademarks of the Apache Software Foundation.