If you find it convenient, you might clone the
https://gitlab.com/territoirevif/minimal-tests-spark-issue
project (it performs many operations around cities, local authorities and accounting, with open data), where I've extracted from my work what's necessary to build a set of 35 tests that run correctly with Spark 3.3.x, and that show the troubles encountered with 3.4.x and 3.5.x.
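For context, those tests boil down to calls of this shape (a minimal sketch, not the project's actual code; the City bean below is a hypothetical stand-in):

    import java.io.Serializable;
    import java.util.List;
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Encoders;
    import org.apache.spark.sql.SparkSession;

    public class MinimalBeanTest {
        /** Plain Java bean of the kind handled by Encoders.bean. */
        public static class City implements Serializable {
            private String name;
            public String getName() { return name; }
            public void setName(String name) { this.name = name; }
        }

        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                .master("local[1]").appName("minimal-bean-test").getOrCreate();

            City paris = new City();
            paris.setName("Paris");

            // Runs fine on Spark 3.2.x/3.3.x; from 3.4.x on, the bean encoder
            // is where the failures described below appear.
            Dataset<City> cities =
                spark.createDataset(List.of(paris), Encoders.bean(City.class));
            cities.show();

            spark.stop();
        }
    }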
It works well with Spark 3.2.x and 3.3.x. But as soon as I select Spark 3.4.x, where the encoder seems to have changed deeply, the encoder fails with two problems:
1) It throws java.util.NoSuchElementException: None.get messages everywhere.
Asking around on the Internet, I found I wasn't alone in facing this problem. Reading the question below, you'll see that I attempted a debug, but my Scala skills are limited:
https://stackoverflow.com/questions/76036349/encoders-bean-doesnt-work-anymore-on-a-java-pojo-with-spark-3-4-0
By the way, if possible, the encoder and decoder functions should forward a parameter as soon as the name of the field being handled is known, and keep forwarding it throughout their processing, so that at any point where the encoder has to throw an exception, it knows which field it is handling in that specific call and can emit a message like:
java.util.NoSuchElementException: None.get when encoding
[the method or field it was targeting]
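To illustrate the idea (a rough sketch only; the class and method names here are hypothetical, not Spark's actual internals): the serializer would thread a field path down its recursive calls and use it to enrich any exception it throws.

    /**
     * Hypothetical sketch of the suggestion above, not Spark's actual API:
     * carry the current field path through the encoding calls so that any
     * failure can name the field it was working on.
     */
    final class EncodingContext {
        private final String fieldPath;

        EncodingContext(String fieldPath) {
            this.fieldPath = fieldPath;
        }

        /** Descend into a child field: "city" becomes "city.mayor". */
        EncodingContext child(String fieldName) {
            return new EncodingContext(
                fieldPath.isEmpty() ? fieldName : fieldPath + "." + fieldName);
        }

        /** Re-throw with the field path appended to the message. */
        RuntimeException enriched(RuntimeException cause) {
            return new RuntimeException(
                cause.getMessage() + " when encoding " + fieldPath, cause);
        }
    }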
2) Not found an encoder of the type RS to Spark SQL internal representation. Consider to change the input type to one of supported at (...)
Or: Not found an encoder of the type OMI_ID to Spark SQL internal representation (...)
where RS and OMI_ID are generic types. This is strange.
https://stackoverflow.com/questions/76045255/encoders-bean-attempts-to-check-the-validity-of-a-return-type-considering-its-ge
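For illustration, here is the kind of shape that, as far as I can tell, triggers this second message (a minimal sketch under my assumptions; GenericBeanRepro, AbstractEntity and City are hypothetical names, not the project's actual classes):

    import java.io.Serializable;
    import org.apache.spark.sql.Encoder;
    import org.apache.spark.sql.Encoders;

    public class GenericBeanRepro {
        /** Base class whose accessors use a type variable. */
        public abstract static class AbstractEntity<OMI_ID extends Serializable>
                implements Serializable {
            public abstract OMI_ID getId();
            public abstract void setId(OMI_ID id);
        }

        /** Concrete bean: OMI_ID is bound to String here. */
        public static class City extends AbstractEntity<String> {
            private String id;
            @Override public String getId() { return id; }
            @Override public void setId(String id) { this.id = id; }
        }

        public static void main(String[] args) {
            // Accepted by Spark 3.3.x; on 3.4.x this kind of bean reportedly
            // fails with "Not found an encoder of the type OMI_ID to Spark SQL
            // internal representation", although OMI_ID resolves to String.
            Encoder<City> encoder = Encoders.bean(City.class);
            System.out.println(encoder.schema());
        }
    }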
3) When I switch to Spark 3.5.0, the same problems remain, but another one adds itself to the list:
"Only expression encoders are supported for now"
is raised on code that was accepted and working before.