The space (and construction-time) complexity of a Db for a type that has many (but not all) unique #[venndb(filter)] fields appears to be O(n^2), which quickly makes constructing (and holding) such a Db infeasible for collections with more than ~100,000 unique field values.
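To illustrate where the quadratic blowup comes from (this is a hypothetical sketch, not venndb's actual internals): if the index keeps one dense bit-vector per unique filter value, then n rows with n distinct values means n bitmaps of n bits each, i.e. O(n^2) bits in total.

```rust
use std::collections::HashMap;

// Hypothetical naive filter index: one dense bitmap per unique value.
struct NaiveFilterIndex {
    rows: usize,
    bitmaps: HashMap<String, Vec<bool>>,
}

impl NaiveFilterIndex {
    fn new() -> Self {
        Self { rows: 0, bitmaps: HashMap::new() }
    }

    fn insert(&mut self, value: &str) {
        let row = self.rows;
        self.rows += 1;
        // Every existing bitmap grows by one bit per inserted row...
        for bits in self.bitmaps.values_mut() {
            bits.push(false);
        }
        // ...and each previously unseen value allocates a full-length bitmap.
        let bits = self
            .bitmaps
            .entry(value.to_string())
            .or_insert_with(|| vec![false; row + 1]);
        bits[row] = true;
    }

    fn total_bits(&self) -> usize {
        self.bitmaps.values().map(Vec::len).sum()
    }
}

fn main() {
    let mut index = NaiveFilterIndex::new();
    for i in 0..1_000 {
        index.insert(&format!("value-{i}")); // worst case: all values unique
    }
    // 1_000 unique values x 1_000 rows = 1_000_000 stored bits.
    println!("{}", index.total_bits());
}
```

With all-unique values the stored bits grow as n², which matches the behavior described above.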
Among other solutions, I wonder whether we could use some sort of compressed bitset implementation, such as roaring or hi_sparse_bitset, to handle this scenario better with minimal to no performance loss in the "large number of total entries, small number of unique filter field values" case.
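To sketch the direction being proposed (names here are illustrative, not venndb's API): a sparse representation stores only the row ids that match each value, so total space is proportional to the number of (row, value) pairs rather than rows × unique values. Crates like roaring or hi_sparse_bitset go further by compressing dense runs, but the stdlib-only version below shows the asymptotic win:

```rust
use std::collections::HashMap;

// Hypothetical sparse filter index: one posting list of row ids per value.
#[derive(Default)]
struct SparseFilterIndex {
    postings: HashMap<String, Vec<u32>>,
}

impl SparseFilterIndex {
    fn insert(&mut self, row: u32, value: &str) {
        // Rows are inserted in order, so each list stays sorted.
        self.postings.entry(value.to_string()).or_default().push(row);
    }

    /// Row ids matching `value` (empty if the value never occurred).
    fn matches(&self, value: &str) -> &[u32] {
        self.postings.get(value).map(Vec::as_slice).unwrap_or(&[])
    }

    /// Total stored entries: one per row, regardless of how many
    /// of the values are unique.
    fn total_entries(&self) -> usize {
        self.postings.values().map(Vec::len).sum()
    }
}

fn main() {
    let mut index = SparseFilterIndex::default();
    for row in 0..1_000u32 {
        index.insert(row, &format!("value-{row}")); // worst case: all unique
    }
    assert_eq!(index.total_entries(), 1_000); // O(n), not O(n^2)
    assert_eq!(index.matches("value-42"), &[42]);
    println!("ok");
}
```

In the common "many rows, few unique values" case the posting lists become long and dense, which is exactly where a compressed bitset like roaring keeps lookups and intersections fast.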
If we're not interested in adjusting the current implementation, this could also be something gated behind a feature flag or configured by some derive attribute.
I'd be happy to take a look at feasibility if it's something that would be considered.