Our framework can be specialized to favor either of two main criteria that trade off against each other: the tightness of the proper approximation and the sample complexity. For example, we can improve the tightness of our proper approximations by taking a subsequence of . Doing so, however, degrades the sample complexity bound, because Kn grows faster. Table 2 summarizes how the parameters of our model trade off against the effectiveness of learning.

Table 2

Trade-off between quantities in our learning model and effectiveness of different criteria. Kn is the constant that satisfies the boundedness property (Theorems 2 and 3) and s is a fixed constant larger than 1 (Section 4.1).

criterion                            as Kn increases …   as s increases …
tightness of proper approximation    improves            improves
sample complexity bound              degrades            degrades
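
To make the direction of these trade-offs concrete, the following is a purely illustrative Python sketch. The functions approximation_error and sample_complexity are hypothetical stand-ins (they are not the bounds derived in this paper); their forms are chosen only so that the quantities move in the directions summarized in Table 2.

    import math

    # Purely illustrative: hypothetical functional forms, NOT the paper's bounds.

    def approximation_error(K_n: float, s: float) -> float:
        """Toy proxy for the tightness of the proper approximation
        (smaller = tighter); assumed to shrink as K_n and s grow."""
        return 1.0 / (K_n * s)

    def sample_complexity(K_n: float, s: float,
                          eps: float = 0.1, delta: float = 0.05) -> float:
        """Toy proxy for the sample complexity bound; assumed to grow
        polynomially in K_n and s."""
        return (K_n * s / eps) ** 2 * math.log(1.0 / delta)

    # As K_n increases (with s fixed), the approximation gets tighter while
    # the sample complexity bound degrades -- the trade-off in Table 2.
    for K_n in (2.0, 4.0, 8.0):
        print(f"K_n={K_n}: error={approximation_error(K_n, 2.0):.4f}, "
              f"samples~{sample_complexity(K_n, 2.0):.0f}")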
