Dear RedisLabs team,
We have been working with the RedisBloom filter for some time with large amounts of data, over 250 million records.
In this context, we have found an issue in the latest published version, 2.2.0. When we tried to reserve space for more than 300 million items with an error rate of 0.000001, the allocated size was unexpectedly smaller than a reservation for 250 million. However, no error message is raised; the only sign that something is wrong appears after inserting some data, when lookups start reporting everything as present and the results become erratic.
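To illustrate why the smaller allocation looks wrong, here is a sketch using the standard Bloom filter sizing formula m = -n * ln(p) / (ln 2)^2, which grows linearly with capacity at a fixed error rate. RedisBloom's actual allocation may differ in detail (rounding, scaling sub-filters), but the required size for 300 million items should always exceed that for 250 million at the same error rate:

```python
import math

def bloom_bits(capacity: int, error_rate: float) -> int:
    """Classic Bloom filter size in bits: m = -n * ln(p) / (ln 2)^2."""
    return math.ceil(-capacity * math.log(error_rate) / (math.log(2) ** 2))

p = 0.000001  # error rate used in our BF.RESERVE calls
bits_250m = bloom_bits(250_000_000, p)
bits_300m = bloom_bits(300_000_000, p)

# At a fixed error rate, a larger capacity must need a larger filter.
print(f"250M: ~{bits_250m / 8 / 1024**2:.0f} MB")
print(f"300M: ~{bits_300m / 8 / 1024**2:.0f} MB")
```

A reservation for 300 million items should come out roughly 20% larger than one for 250 million, not smaller as we observed in 2.2.0.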
To avoid this issue, we had to downgrade the service and continue working with the previous version, 2.0.3. If you could take a look at this behavior, it would be much appreciated by our team.
Evidence 300M (erratic behavior):
Many thanks for your help!