The current implementation of `load_many` on the dataloader uses `asyncio.gather` to run the batch of keys through the existing `load` implementation (it's a single line):
```python
def load_many(self, keys):
    return gather(*map(self.load, keys))
```
This means that if any of the individual `load` tasks raises an exception, the entire `load_many` call fails. So, for example, if key 1 returns a value but key 2 fails by raising, then this code:

```python
results = await my_loader.load_many([1, 2])
```

raises an exception, and the value for key 1 can't be used.
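Here is a minimal standalone sketch of that failure mode, using plain `asyncio` functions in place of the dataloader class (the `load` body is a hypothetical stand-in for the real per-key load, not the library's implementation):

```python
import asyncio

async def load(key):
    # Hypothetical per-key load: key 2 fails, everything else succeeds.
    if key == 2:
        raise ValueError(f"no value for key {key}")
    return f"value-{key}"

async def load_many(keys):
    # Mirrors the current implementation: gather without return_exceptions.
    return await asyncio.gather(*map(load, keys))

async def main():
    try:
        await load_many([1, 2])
    except ValueError as exc:
        # The whole call fails; the successful result for key 1 is lost.
        print(f"load_many raised: {exc}")

asyncio.run(main())
```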
In some cases it would be useful to do:

```python
results = await my_loader.load_many([1, 2])
for result in results:
    if isinstance(result, Exception):
        ...  # handle error
    else:
        ...  # handle successful result
```
This would match the behaviour of `loadMany` in the JS dataloader project: https://github.com/graphql/dataloader#loadmanykeys
The behaviour can be achieved with the `return_exceptions` argument to `gather`. For example:
```python
def load_many(self, keys):
    return gather(*map(self.load, keys), return_exceptions=True)
```
https://docs.python.org/3/library/asyncio-task.html#asyncio.gather
Adding `return_exceptions` in place would change the behaviour of existing code. Another option would be to add an optional `return_exceptions` argument to the `load_many` method and let clients specify the behaviour (leaving the existing behaviour unchanged). I don't have a strong instinct either way.
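A hypothetical sketch of the opt-in variant (the `DataLoader` class and its `load` here are simplified stand-ins for the real implementation, which batches keys):

```python
import asyncio

class DataLoader:
    async def load(self, key):
        # Simplified stand-in for the real per-key load; key 2 fails.
        if key == 2:
            raise ValueError(f"no value for key {key}")
        return f"value-{key}"

    def load_many(self, keys, return_exceptions=False):
        # The default (False) keeps the existing fail-fast behaviour;
        # passing True returns exceptions as values in the result list.
        return asyncio.gather(*map(self.load, keys),
                              return_exceptions=return_exceptions)

async def main():
    loader = DataLoader()
    results = await loader.load_many([1, 2], return_exceptions=True)
    for key, result in zip([1, 2], results):
        if isinstance(result, Exception):
            print(f"key {key} failed: {result}")
        else:
            print(f"key {key} -> {result}")

asyncio.run(main())
```

With this shape, existing callers are unaffected, and opting in is a one-argument change at each call site.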