classmethod guess_composition.gammaf(n, h, zeta, base, g=<function guess_composition.<lambda>>)

Find optimal hamming weight for sparse guessing.

Let s be a vector of dimension n in which we expect h non-zero entries. We ignore ζ − γ components and guess the remaining γ components. This succeeds with some probability given by prob_drop(n, h, ζ, γ). Exhaustively searching the guesses takes binomial(n, γ) ⋅ b^γ steps, where b is the number of possible non-zero values in a component of s. We call a γ optimal if it minimizes the overall number of repetitions that need to be performed to succeed with probability 99%.
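The optimization described above can be sketched as follows. This is an illustrative reimplementation, not the library's own code: `prob_drop` is modelled here as the hypergeometric probability that at most γ of the ζ dropped coordinates are non-zero (an assumption), and `gammaf_sketch` then minimizes repetitions × g(search-space size) over γ, with the repetition count chosen so that overall success probability reaches 99%.

```python
from math import comb, log, ceil

def prob_drop(n, h, zeta, gamma):
    # Assumed model: hypergeometric probability that among zeta
    # coordinates chosen from an n-dim vector with h non-zero entries,
    # at most gamma are non-zero.
    total = comb(n, zeta)
    return sum(comb(h, i) * comb(n - h, zeta - i) for i in range(gamma + 1)) / total

def gammaf_sketch(n, h, zeta, base, g=lambda x: x, target=0.99):
    """Return (repetitions, gamma, search-space size, success probability)."""
    best = None
    for gamma in range(zeta + 1):
        p = prob_drop(n, h, zeta, gamma)
        if p <= 0:
            continue
        # Repetitions k so that 1 - (1 - p)^k >= target.
        reps = 1 if p >= target else ceil(log(1 - target) / log(1 - p))
        # Cost of one exhaustive search over the guesses, as in the text:
        # binomial(n, gamma) * base^gamma, passed through g().
        search = comb(n, gamma) * base**gamma
        cost = reps * g(search)
        if best is None or cost < best[0]:
            best = (cost, reps, gamma, search, p)
    _, reps, gamma, search, p = best
    return reps, gamma, search, p
```

For example, `gammaf_sketch(64, 8, 16, 2)` scans γ = 0, …, 16 and returns the γ with the smallest overall cost under this model.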

Parameters:

  • n – vector dimension

  • h – hamming weight of the vector

  • zeta – number of ignored + guessed components

  • base – number of possible non-zero scalars

  • g – we do not consider the search space directly but g() applied to it (think time-memory trade-offs)


Returns: (number of repetitions, γ, size of the search space, probability of success)