1. Dask

The first occurs when using a DaskExecutor with a task input or output that is not serializable by cloudpickle. Dask uses cloudpickle as the mechanism to send data from the client to the workers. This error is often raised with mapped tasks that take client-type objects, such as database connections or HTTP clients, as inputs.

Solution: Such database or HTTP connections need to be instantiated (and closed) inside your Prefect tasks.
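Prefect isn't needed to see why this fails. Below is a minimal sketch using `sqlite3.Connection` as a stand-in for any unpicklable client (plain `pickle` is used here; cloudpickle typically fails the same way on live connection objects), followed by the fix: create and close the connection inside the task body, so only picklable arguments cross the process boundary. The `query_db` function is a hypothetical example, shown as a plain function rather than a decorated Prefect task:

```python
import pickle
import sqlite3

# A live connection object cannot be pickled, so it cannot be shipped
# from the Dask client to the workers as a task input:
conn = sqlite3.connect(":memory:")
try:
    pickle.dumps(conn)
except TypeError as e:
    print(e)  # raises TypeError, just like the lxml.etree.XMLSchema case

# The fix: instantiate (and close) the connection inside the task itself.
# Only the picklable db_path string crosses the process boundary.
def query_db(db_path: str) -> list:
    conn = sqlite3.connect(db_path)   # created inside the task
    try:
        return conn.execute("SELECT 1").fetchall()
    finally:
        conn.close()                  # closed inside the task too

print(query_db(":memory:"))  # [(1,)]
```

In Prefect, `query_db` would carry the `@task` decorator; the key point is that the connection never appears as a task input or output.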

2. Results

The second way this can happen is through Results. By default, task outputs are saved as LocalResults, and the default Serializer is the PickleSerializer, which uses cloudpickle. If a task returns a client or connection, you can avoid serializing its result by turning off checkpointing for that task with @task(checkpoint=False).
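The effect of that flag can be illustrated with a toy model (this is not Prefect's actual implementation, just a sketch of the behavior): checkpointing pickles the task's return value, so disabling it skips the serialization step entirely and the unpicklable object is simply passed along in memory.

```python
import pickle
import sqlite3

def run_task(fn, *args, checkpoint=True):
    """Toy model: run a task, then checkpoint (pickle) its result unless disabled."""
    result = fn(*args)
    if checkpoint:
        pickle.dumps(result)  # stand-in for writing a LocalResult
    return result

def make_client():
    # sqlite3.Connection as a stand-in for any unpicklable client object
    return sqlite3.connect(":memory:")

# With checkpointing on, pickling the returned connection fails:
try:
    run_task(make_client, checkpoint=True)
except TypeError as e:
    print("checkpoint=True:", e)

# With checkpointing off, the result is returned without serialization:
conn = run_task(make_client, checkpoint=False)
print("checkpoint=False returned a", type(conn).__name__)
```

In real Prefect code the equivalent is decorating the offending task with @task(checkpoint=False) rather than changing how it runs.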

View in #prefect-community on Slack

Guillaume_Latour @Guillaume_Latour : Hi everyone, I stumbled upon an error as the prefect engine serializer tried to pickle a task result: TypeError: cannot pickle 'lxml.etree.XMLSchema' object
Of course the result of one of my tasks is of this type, and I would prefer not to change it.

I tried to launch that process again with config.flows.checkpointing = "false" in the ~/.prefect/config.toml file but I got the same error…

This is an error that I cannot reproduce locally with prefect run <flow>. Do you guys have any leads on how I can solve this issue or at least reproduce the error locally?

Kevin_Kho @Kevin_Kho : config.flows.checkpointing = "false" gets overridden in Cloud-backed runs. You need to turn it off at the task level with @task(checkpoint=False).

Guillaume_Latour @Guillaume_Latour : oh.
But I am hosting the Prefect server, would that change anything?

I’ll try to put this option in the decorator nonetheless
ty for your quick response :smile:

@Anna_Geller : it works the same way for Cloud and Server

Guillaume_Latour @Guillaume_Latour : yay it worked \o/