It was reported (Kabashima and Shinomoto 1992) that estimators of a binary decision boundary show strange asymptotic behaviors when the probability model is ill-posed or semiparametric. We give a rigorous analysis of this phenomenon for a stochastic perceptron by using the estimating function method. A stochastic perceptron consists of a neuron that is excited depending on the weighted sum of its inputs, but the form of the firing probability distribution is unknown here. It is shown that there exists no √n-consistent estimator of the threshold value h, that is, no estimator ĥ that converges to h at the rate 1/√n as the number n of observations increases. Therefore, the accuracy of estimation is much worse in this semiparametric case, where the probability function is unspecified, than in the ordinary parametric case. On the other hand, it is shown that there is a √n-consistent estimator ŵ of the synaptic weight vector w. These results elucidate the strange behaviors of learning curves in a semiparametric statistical model.
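The model above can be sketched in a small simulation. This is a hedged illustration, not the paper's estimator: the sigmoid response function, the Gaussian inputs, the dimensions, and the moment-based direction estimate `w_hat` are all assumptions made here for concreteness. For Gaussian inputs, the empirical mean of y·x is proportional to the true weight vector regardless of the unknown response function, which illustrates why the weight direction remains estimable even in the semiparametric setting.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 20000, 3                      # sample size and input dimension (illustrative)
w_true = np.array([2.0, -1.0, 0.5])  # true synaptic weight vector w
h_true = 0.3                         # true threshold h

def f(u):
    # Response probability: unknown to the estimator in the semiparametric
    # setting; a sigmoid is used here only to generate synthetic data.
    return 1.0 / (1.0 + np.exp(-u))

# Stochastic perceptron: the neuron fires (y = 1) with probability
# f(w.x - h), depending on the weighted sum of inputs.
X = rng.standard_normal((n, d))
y = (rng.random(n) < f(X @ w_true - h_true)).astype(float)

# For Gaussian inputs, E[y x] is proportional to w (by Stein's lemma),
# so the empirical mean of y * x recovers the direction of w without
# knowing f or h.
w_hat = (y[:, None] * X).mean(axis=0)
w_dir = w_hat / np.linalg.norm(w_hat)
w_true_dir = w_true / np.linalg.norm(w_true)
cos = float(w_dir @ w_true_dir)      # alignment with the true direction
print("cosine similarity:", round(cos, 3))
```

With n = 20000 samples the estimated direction aligns closely with w, while no analogous moment trick pins down the threshold h at the 1/√n rate, consistent with the result stated above.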