Erhan Oztop
Journal Articles
Publisher: Journals Gateway
Neural Computation (2015) 27 (8): 1796–1823.
Published: 01 August 2015
Abstract
Boolean functions (BFs) are central in many fields of engineering and mathematics, such as cryptography, circuit design, and combinatorics. Moreover, they provide a simple framework for studying neural computation mechanisms of the brain. Many representation schemes for BFs exist to satisfy the needs of the domain they are used in. In neural computation, it is of interest to know how many input lines a neuron would need to represent a given BF. A common BF representation used to study this is the so-called polynomial sign representation, where −1 and 1 are associated with true and false, respectively. The polynomial is treated as a real-valued function and evaluated at its parameters, and the sign of the polynomial is then taken as the function value. The number of input lines for the modeled neuron is exactly the number of terms in the polynomial. This letter investigates the minimum number of terms, that is, the minimum threshold density, that is sufficient to represent a given BF and more generally aims to find the maximum of this quantity over all BFs in a given dimension. With this work, exact results for four- and five-variable BFs are obtained for the first time, and strong bounds for six-variable BFs are derived. In addition, some connections between the sign representation framework and bent functions are derived; bent functions are generally studied for their desirable cryptographic properties.
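The sign representation described in the abstract can be illustrated with a minimal sketch (my own example, not taken from the paper): the 3-variable parity function, under the ±1 convention, is sign-represented by the single monomial x1·x2·x3, so the modeled neuron needs only one input line.

```python
from itertools import product

def sign(v):
    # Sign representations require the polynomial to be nonzero on every input.
    return 1 if v >= 0 else -1

def parity(x):
    # 3-variable parity in the +/-1 convention: product of the inputs.
    p = 1
    for xi in x:
        p *= xi
    return p

def poly(x):
    # A single-monomial polynomial: p(x) = x1 * x2 * x3.
    return x[0] * x[1] * x[2]

# Verify the sign representation on all 8 points of {-1, 1}^3.
assert all(sign(poly(x)) == parity(x) for x in product([-1, 1], repeat=3))
```

Here the threshold density of parity is 1; the paper's question is how large this density can get over all Boolean functions of a given dimension.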
Neural Computation (2006) 18 (12): 3119–3138.
Published: 01 December 2006
Abstract
It is known that any dichotomy of {−1, 1}^n can be learned (separated) with a higher-order neuron (polynomial function) with 2^n inputs (monomials). In general, fewer than 2^n monomials are sufficient to solve a given dichotomy. In spite of the efforts to develop algorithms for finding solutions with fewer monomials, there have been relatively few studies investigating the maximum density Π(n), the minimum number of monomials that would suffice to separate an arbitrary dichotomy of {−1, 1}^n. This article derives a theoretical (upper) bound for this quantity, superseding previously known bounds. The main theorem here states that for any binary classification problem in {−1, 1}^n (n > 1), one can always find a polynomial function solution with 2^n − 2^n/4 or fewer monomials. In particular, any dichotomy of {−1, 1}^n can be learned by a higher-order neuron with a fan-in of 2^n − 2^n/4 or less. With this result, for the first time, a deterministic ratio bound independent of n is established: Π(n)/2^n ≤ 0.75. The main theorem is constructive, so it provides a deterministic algorithm for achieving the theoretical result. The study presented provides the basic mathematical tools and forms the basis for further analyses that may have implications for neural computation mechanisms employed in the cerebral cortex.
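The trivial 2^n-monomial representation mentioned at the start of the abstract is just the exact Fourier (parity) expansion of the dichotomy; counting its nonzero coefficients gives one valid (usually loose) density for a given function. A minimal sketch of this baseline, using 3-variable majority as an arbitrary dichotomy (the example is mine, not from the paper):

```python
from itertools import product

def chi(S, x):
    # Monomial (parity character) indexed by a subset S of coordinate indices.
    p = 1
    for i in S:
        p *= x[i]
    return p

def fourier_coeffs(f, n):
    # Exact expansion of f: {-1,1}^n -> {-1,1} over all 2^n monomials.
    points = list(product([-1, 1], repeat=n))
    subsets = [tuple(i for i in range(n) if (mask >> i) & 1)
               for mask in range(2 ** n)]
    return {S: sum(f(x) * chi(S, x) for x in points) / 2 ** n for S in subsets}

def f(x):
    # Majority of 3 bits in the +/-1 convention.
    return 1 if sum(x) > 0 else -1

n = 3
coeffs = fourier_coeffs(f, n)

def poly(x):
    return sum(c * chi(S, x) for S, c in coeffs.items())

# The expansion is exact, so its sign matches f on every input.
assert all((1 if poly(x) >= 0 else -1) == f(x)
           for x in product([-1, 1], repeat=n))

# Density of this particular representation: nonzero monomials (<= 2^n).
density = sum(1 for c in coeffs.values() if abs(c) > 1e-9)
print(density)  # 4 of the 8 possible monomials are needed here
```

Majority of 3 expands as (x1 + x2 + x3 − x1·x2·x3)/2, so only 4 of the 8 monomials are nonzero; the paper's bound concerns the worst case over all dichotomies, guaranteeing a solution with at most 2^n − 2^n/4 monomials.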