This paper presents an improved statistical learning algorithm for feature generation in pattern recognition and signal processing, derived approximately from a geometrical interpretation of Fisher linear discriminant analysis (FLDA). Principal component analysis (PCA) is widely used for data compression and feature extraction, and iterative learning algorithms for obtaining its eigenvectors have been presented in these fields, with their effectiveness demonstrated in many applications. Recently, however, FLDA has come into frequent use in many fields, especially face image recognition. The drawbacks of FLDA are the long computation time caused by a large covariance matrix and the fact that the within-class covariance matrix is usually singular: since the data dimension is generally higher than the number of samples, the matrix has many zero eigenvalues and its inverse cannot be obtained. To overcome this difficulty, the authors previously proposed a new iterative feature generation method, the simple-FLDA. In this paper, a further improvement is introduced into the simple-FLDA, and its effectiveness is demonstrated on a preliminary personal identification problem.
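
The singularity issue mentioned above can be illustrated numerically. The following sketch (not from the paper; the data, class labels, and dimensions are hypothetical) builds a within-class scatter matrix from fewer samples than dimensions and shows that it is rank-deficient, so its inverse does not exist:

```python
import numpy as np

# Hypothetical example: data dimension d exceeds the number of samples n,
# as is typical for face images, so the within-class scatter is singular.
rng = np.random.default_rng(0)
d, n = 10, 6                      # 10-dimensional data, only 6 samples
X = rng.standard_normal((n, d))   # rows are samples
labels = np.array([0, 0, 0, 1, 1, 1])

Sw = np.zeros((d, d))             # within-class scatter (covariance up to scale)
for c in (0, 1):
    Xc = X[labels == c]
    mu = Xc.mean(axis=0)
    Sw += (Xc - mu).T @ (Xc - mu)

# Each class of n_c samples contributes rank at most n_c - 1, so
# rank(Sw) <= n - (number of classes) = 4 < d = 10.
print(np.linalg.matrix_rank(Sw))  # 4: Sw has zero eigenvalues, no inverse
```

Because Sw cannot be inverted, the classical FLDA solution (eigenvectors of Sw^{-1} Sb) breaks down, which motivates iterative alternatives such as the simple-FLDA.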