Paper Title
When Is Amplification Necessary for Composition in Randomized Query Complexity?
Paper Authors
Paper Abstract
Suppose we have randomized decision trees for an outer function $f$ and an inner function $g$. The natural approach for obtaining a randomized decision tree for the composed function $(f\circ g^n)(x^1,\ldots,x^n)=f(g(x^1),\ldots,g(x^n))$ involves amplifying the success probability of the decision tree for $g$, so that a union bound can be used to bound the error probability over all the coordinates. The amplification introduces a logarithmic factor cost overhead. We study the question: When is this log factor necessary? We show that when the outer function is parity or majority, the log factor can be necessary, even for models that are more powerful than plain randomized decision trees. Our results are related to, but qualitatively strengthen in various ways, known results about decision trees with noisy inputs.
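The amplification approach described in the abstract can be sketched as a small simulation. This is a minimal illustration, not the paper's construction: `noisy_eval` is a hypothetical stand-in for one run of a randomized decision tree for $g$ with two-sided error $\varepsilon$, and majority voting over $\Theta(\log n)$ repetitions drives the per-coordinate error below $O(1/n)$ so the union bound applies.

```python
import random

def noisy_eval(g, x, eps=1/3):
    """Simulate one run of a randomized decision tree for g: returns
    g(x) with probability 1 - eps, and the wrong bit otherwise."""
    correct = g(x)
    return correct if random.random() >= eps else 1 - correct

def amplified_eval(g, x, reps):
    """Amplify by majority vote over `reps` independent runs; by a
    Chernoff bound the error drops exponentially in reps."""
    votes = sum(noisy_eval(g, x) for _ in range(reps))
    return 1 if 2 * votes > reps else 0

def compose(f, g, inputs, reps):
    """Evaluate (f o g^n)(x^1, ..., x^n): amplify each inner evaluation
    to error O(1/n), so a union bound over the n coordinates bounds the
    total error. Taking reps = Theta(log n) suffices -- this repetition
    is the logarithmic-factor overhead the paper studies."""
    return f([amplified_eval(g, x, reps) for x in inputs])
```

For example, with `f` the parity of its input bits and `g` the parity of an integer, `compose(f, g, inputs, reps)` returns $(f\circ g^n)$ on `inputs` with high probability once `reps` is logarithmic in the number of coordinates.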