jb55 _@jb55.com 4 months ago
I was randomly thinking about loot/drop table implementations in games and naturally started thinking of the fastest way to implement large drop tables. gpt5 pointed me to this fascinating algorithm that is barely mentioned online, but it can sample any discrete probability distribution in O(1): https://y7k4.github.io/2021/03/23/alias-method.html It's a bit brain-bending and it took a good 20 minutes of staring to understand how it works, but it's pretty cool. #nerdsnipe
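The linked article covers the details, but the idea fits in a few lines: scale the weights so the average bucket is exactly 1, then pair each underfull bucket with an overfull "alias" so every bucket splits into at most two items. A minimal Python sketch (Vose's variant of Walker's alias method; the function names are mine, not from the article):

```python
import random

def build_alias(weights):
    """Vose's alias method: O(n) preprocessing for O(1) sampling.

    Returns (prob, alias): bucket i yields item i with probability
    prob[i], otherwise its alias item alias[i].
    """
    n = len(weights)
    total = sum(weights)
    # Scale so the average weight becomes exactly 1 per bucket.
    scaled = [w * n / total for w in weights]
    prob = [0.0] * n
    alias = [0] * n
    small = [i for i, s in enumerate(scaled) if s < 1.0]
    large = [i for i, s in enumerate(scaled) if s >= 1.0]
    while small and large:
        s = small.pop()
        l = large.pop()
        prob[s] = scaled[s]
        alias[s] = l
        # Item l donates just enough mass to top bucket s up to 1.
        scaled[l] -= 1.0 - scaled[s]
        (small if scaled[l] < 1.0 else large).append(l)
    # Leftovers are exactly full (up to float rounding).
    for i in small + large:
        prob[i] = 1.0
    return prob, alias

def sample(prob, alias, rng=random):
    """O(1) draw: pick a bucket uniformly, then flip its biased coin."""
    i = rng.randrange(len(prob))
    return i if rng.random() < prob[i] else alias[i]
```

Sampling is two random numbers and one comparison, independent of table size, which is what makes it attractive for large drop tables.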

Replies (6)

jb55 _@jb55.com 4 months ago
was curious what gpt5 thought:

> Few items (n ≤ ~32) or few draws per build (draws ≤ ~10–20·n): use a prefix-sum + binary search. It's tiny, obvious, and usually just as fast.
> Many items and lots of samples from the same distribution (draws ≫ n): use Walker's alias method. It pays for itself and gives you flat O(1) sampling.
> Weights change frequently: neither vanilla alias nor plain prefix-sums are ideal; use a Fenwick/segment tree (O(log n) updates & samples).

maybe an ideal library like this would implement it as "Roaring Discrete Probability Distribution Sampling", although that doesn't roll off the tongue as much. The dynamic-update thing is an interesting use case. There may be times where you need millions of samples really quick for some reason... maybe rolling unique drops for 1000s of players in a raid? For libraries it's probably best not to assume how they're being used and just go fast.
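The dynamic-weights case gpt5 mentions can be sketched with a Fenwick (binary indexed) tree: both changing one weight and drawing a sample are O(log n), since a draw is a binary descent over implicit prefix sums. A rough Python sketch (class and method names are mine; this is one standard way to do it, not a reference to any particular library):

```python
import random

class FenwickSampler:
    """Weighted sampling with O(log n) weight updates and O(log n) draws."""

    def __init__(self, weights):
        self.n = len(weights)
        self.weights = list(weights)
        self.tree = [0.0] * (self.n + 1)  # 1-based Fenwick tree
        for i, w in enumerate(weights):
            self._add(i, w)

    def _add(self, i, delta):
        i += 1
        while i <= self.n:
            self.tree[i] += delta
            i += i & (-i)

    def update(self, i, new_weight):
        """Change item i's weight; O(log n)."""
        self._add(i, new_weight - self.weights[i])
        self.weights[i] = new_weight

    def total(self):
        s, i = 0.0, self.n
        while i > 0:
            s += self.tree[i]
            i -= i & (-i)
        return s

    def sample(self, rng=random):
        """Draw an index with probability proportional to its weight."""
        target = rng.random() * self.total()
        pos = 0
        bit = 1 << self.n.bit_length()
        # Binary descent: find the largest prefix whose sum is <= target.
        while bit:
            nxt = pos + bit
            if nxt <= self.n and self.tree[nxt] <= target:
                target -= self.tree[nxt]
                pos = nxt
            bit >>= 1
        return pos  # 0-based item index
```

For the raid scenario, this lets you zero out a unique drop's weight right after someone rolls it, without rebuilding the whole table the way vanilla alias would require.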
This algorithm is brilliant! So simple!