Here's an old trick that we found useful for proving some tight complexity lower bounds. You are given *m* coins, each of weight either *a* or *b*, and a modern scale that can tell you the total weight of any chosen subset of coins. How many weighings do you need to identify which coin is which? Checking each coin individually uses *m* weighings, but can you get away with fewer?

Any weighing involves some unknown number of weight-*a* coins between 0 and *m*, so its result is one of at most *m* + 1 possible values, giving us at most log(*m* + 1) bits of information. In total we need *m* bits of information to identify all the coins, so clearly we will need at least Ω(*m* / log *m*) weighings.
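To see that fewer than *m* weighings can indeed suffice, here is a small brute-force sketch (the function names and the choice of *m* = 4 are ours, for illustration). Since each coin weighs *a* or *b*, the reading for a subset *S* equals *b*·|*S*| + (*a* − *b*)·(number of weight-*a* coins in *S*), so a family of subsets identifies the coins exactly when the counts of weight-*a* coins in its subsets jointly determine the full assignment:

```python
from itertools import combinations, product

def distinguishes(family, m):
    """Check whether a family of weighing subsets identifies every coin.

    The reading for subset S is b*|S| + (a-b)*(# of weight-a coins in S),
    and |S| is known, so each reading is equivalent to the count of
    weight-a coins in S.  The family works iff no two distinct coin
    assignments produce the same tuple of counts."""
    seen = set()
    for coins in product([0, 1], repeat=m):  # 1 marks a weight-a coin
        signature = tuple(sum(coins[i] for i in S) for S in family)
        if signature in seen:
            return False  # two assignments are indistinguishable
        seen.add(signature)
    return True

def min_nonadaptive_weighings(m):
    """Brute-force the smallest non-adaptive weighing scheme for m coins."""
    all_subsets = [S for r in range(m + 1) for S in combinations(range(m), r)]
    for t in range(1, m + 1):
        for family in combinations(all_subsets, t):
            if distinguishes(family, m):
                return t, family

t, family = min_nonadaptive_weighings(4)
print(t, family)  # t == 3: three weighings beat four individual ones
```

Already for *m* = 4 the search finds a family of 3 subsets whose readings distinguish all 2⁴ assignments, beating the naive 4 weighings; the asymptotic *O*(*m* / log *m*) schemes generalize this idea.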

It turns out that this many weighings is in fact enough, and this generalizes to various other settings with less restricted weights. This is the basis for two of our recent results, assuming the Exponential Time Hypothesis: a tight complexity lower bound for Integer Linear Programming with few constraints, and one for multicoloring (a.k.a. b-fold coloring). The trick lets us extract about log(*m*) bits of new information from each constraint that checks the value of some number between 0 and *m*, in a way that is general enough to check *m* clauses of a 3-CNF-SAT instance using only *O*(*m* / log *m*) constraints.