If you create a new image and then run a Gaussian blur on it, the edges of the image become darker. It looks like the convolution is reading black pixel values from beyond the image boundary. I'm not sure how the convolution matrix is supposed to handle image edges, but it should only take pixels inside the image into account.
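For reference, one common way to handle this is to drop the kernel taps that fall outside the image and renormalize the remaining weights, so edge pixels are averaged only over real pixels instead of being pulled toward black. A minimal Python/NumPy sketch of the idea (purely illustrative, not from this codebase; names and structure are mine):

```python
import numpy as np

def gaussian_kernel(radius, sigma):
    """Build a normalized 1-D Gaussian kernel."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-(x * x) / (2.0 * sigma * sigma))
    return k / k.sum()

def blur_1d_renormalized(row, kernel):
    """Horizontal blur that only uses in-bounds pixels.

    Kernel taps that fall outside the row are dropped and the
    remaining weights are renormalized, so edge pixels are not
    darkened by implicit black padding.
    """
    radius = len(kernel) // 2
    out = np.empty_like(row, dtype=float)
    for i in range(len(row)):
        lo = max(0, i - radius)
        hi = min(len(row), i + radius + 1)
        k = kernel[(lo - i + radius):(hi - i + radius)]
        out[i] = np.dot(row[lo:hi], k) / k.sum()
    return out

# A flat white row stays white all the way to the edges.
row = np.full(16, 255.0)
print(blur_1d_renormalized(row, gaussian_kernel(3, 1.5)))
```

Clamping to the nearest edge pixel ("extend" padding) would also avoid the darkening; either behavior seems preferable to treating off-image pixels as black.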