NEURAL NETWORKS: ALGORITHMS AND APPLICATIONS
September 25-29, 1995
A Short Course at the Oregon Graduate Institute of Science and
Technology (OGI), Portland, Oregon.
Course Organizer: John E. Moody
Lead Instructor: Hong Pi
With Lectures By:
Todd K. Leen
John E. Moody
Thorsteinn S. Rognvaldsson
Eric A. Wan
Artificial neural networks (ANNs) have emerged as a new in-
formation processing technique and an effective computational
model for solving nonlinear problems involving pattern recogni-
tion and completion, feature extraction, function approximation,
and prediction. This course introduces participants to the neural
network paradigms and their applications in pattern classifica-
tion; system identification; signal processing and image
analysis; control engineering; diagnosis; time series forecast-
ing; and financial analysis and trading. An introduction to fuz-
zy logic and fuzzy control systems is also given.
Designing a neural network application involves steps from
data preprocessing to learning and network model selection. This
course, with many examples, application demos and hands-on lab
practice, will familiarize the participants with the techniques
necessary for building successful applications. About 50 percent
of the class time is assigned to lab sessions. The simulations
will be based on Matlab, the Matlab Neural Net Toolbox, and other
software running on Windows NT workstations. Prerequisites:
Linear algebra and calculus. Previous experience with
Matlab is helpful, but not required.
Who will benefit:
Technical professionals, business analysts, financial mark-
et practitioners, and other individuals who wish to gain a basic
understanding of the theory and algorithms of neural computation
and/or are interested in applying ANN techniques to real-world,
data-driven modeling problems.
Course Objectives:
After completing the course, students will:
- Understand the basic neural network paradigms
- Be familiar with the range of ANN applications
- Have a good understanding of the techniques for designing
successful applications
- Gain hands-on experience with ANN modeling.
Course Outline (8:30am - 5:00pm September 25 - 28, and
8:30am - 12:30pm September 29):
Neural Networks: Biological and Artificial
Biological inspirations. Basic models of a neuron.
Types of architectures and learning paradigms.
Simple Perceptrons and Adalines
Decision surfaces. Linear separability.
Perceptron learning rules. Linear units.
Gradient descent learning.
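The perceptron learning rule covered in this session can be sketched in a few lines. The following toy example (illustrative only, not course lab code) trains a single threshold unit on the linearly separable logical AND problem:

```python
# Minimal perceptron learning rule sketch; names and data are illustrative.

def perceptron_train(samples, targets, lr=0.1, epochs=20):
    """Train weights w and bias b with the classic perceptron update."""
    n = len(samples[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, t in zip(samples, targets):
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = t - y                     # zero when prediction is correct
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
T = [0, 0, 0, 1]                            # logical AND
w, b = perceptron_train(X, T)
preds = [1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0 for x in X]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop reaches a correct decision surface in finitely many updates.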
Multi-Layer Feed-Forward Networks I
Multi-Layer perceptrons. Back-propagation learning.
Generalization. Early Stopping via validation.
Momentum and adaptive learning rate.
Examples and applications.
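Back-propagation with gradient descent, the core algorithm of this session, can be illustrated on a tiny 2-2-1 sigmoid network learning XOR. This is a hedged sketch under made-up sizes and learning rate, not the course's Matlab lab code; the assertion of interest is simply that the squared error decreases with training:

```python
import math
import random

# Sketch of back-propagation for a 2-2-1 sigmoid network on XOR,
# trained by plain stochastic gradient descent. Illustrative only.

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # hidden weights
b1 = [0.0, 0.0]
W2 = [random.uniform(-1, 1) for _ in range(2)]                      # output weights
b2 = 0.0

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
lr = 0.5

def forward(x):
    h = [sigmoid(sum(W1[j][i] * x[i] for i in range(2)) + b1[j]) for j in range(2)]
    y = sigmoid(sum(W2[j] * h[j] for j in range(2)) + b2)
    return h, y

def mean_squared_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

loss_before = mean_squared_error()
for _ in range(2000):
    for x, t in data:
        h, y = forward(x)
        d_out = (y - t) * y * (1 - y)                 # output-layer delta
        d_hid = [d_out * W2[j] * h[j] * (1 - h[j]) for j in range(2)]
        for j in range(2):
            W2[j] -= lr * d_out * h[j]
            for i in range(2):
                W1[j][i] -= lr * d_hid[j] * x[i]
            b1[j] -= lr * d_hid[j]
        b2 -= lr * d_out
loss_after = mean_squared_error()
```

The deltas here are the chain-rule derivatives of the squared error; momentum and adaptive learning rates, also covered in this session, modify only the update step.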
Multi-Layer Feed-Forward Networks II
Newton's method. Conjugate gradient. Levenberg-Marquardt.
Radial basis function networks.
Projection pursuit regression.
Neural Networks for Pattern Recognition and Classification
Bayes decision theory. The Bayes risk.
Non-neural and neural methods for classification.
Neural networks as estimators of the posterior probability.
Methods for improving the classification performance.
Benchmark tests of neural networks vs. other methods.
Some applications.
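The idea that network outputs can be read as posterior class probabilities is often realized with a softmax output layer. A minimal sketch (the scores below are made up for illustration):

```python
import math

# Softmax turns raw class scores into values that are nonnegative
# and sum to one, so they can be interpreted as posterior probabilities.

def softmax(scores):
    exps = [math.exp(s - max(scores)) for s in scores]  # shift for numerical stability
    total = sum(exps)
    return [e / total for e in exps]

p = softmax([2.0, 1.0, 0.1])   # three hypothetical class scores
```

Subtracting the maximum score before exponentiating leaves the result unchanged mathematically but avoids overflow for large scores.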
Improving the Generalization Performance
Model bias and model variance.
Weight decay. Regularizers. Optimal brain surgeon.
Learning from hints. Sensitivity analysis.
Input variable selection. The delta-test.
Time Series Prediction: Classical and Nonlinear Approaches
Linear time series models. Simple nonlinear models.
Recurrent network models and training algorithms.
Case studies: sunspots, economic forecasting.
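The linear time series models that open this session have very compact least-squares fits. As a toy sketch (synthetic series, illustrative names), an AR(1) model x[t] ≈ a*x[t-1] can be fit in closed form:

```python
# Least-squares fit of a first-order autoregressive model on a
# synthetic, noise-free series generated with coefficient a = 0.9.

def fit_ar1(series):
    num = sum(series[t - 1] * series[t] for t in range(1, len(series)))
    den = sum(series[t - 1] ** 2 for t in range(1, len(series)))
    return num / den

x = [0.9 ** t for t in range(20)]   # synthetic series, true a = 0.9
a = fit_ar1(x)
forecast = a * x[-1]                # one-step-ahead prediction
```

Nonlinear models such as the recurrent networks covered later replace the single coefficient with a learned nonlinear map, but the one-step-ahead forecasting setup is the same.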
Self-Organized Networks and Unsupervised Learning
K-means clustering. Kohonen feature maps. Learning vector
quantization. Adaptive principal components analysis.
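K-means clustering, the first topic of this session, reduces to alternating two steps: assign each point to its nearest center, then move each center to the mean of its points. A one-dimensional toy sketch (data and k chosen for illustration):

```python
# Lloyd's algorithm for k-means in one dimension; illustrative only.

def kmeans(points, centers, iters=10):
    for _ in range(iters):
        # Assignment step: each point joins its nearest center's group.
        groups = [[] for _ in centers]
        for p in points:
            j = min(range(len(centers)), key=lambda j: abs(p - centers[j]))
            groups[j].append(p)
        # Update step: each center moves to the mean of its group.
        centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
    return centers

pts = [1.0, 1.2, 0.8, 5.0, 5.2, 4.8]    # two obvious clusters near 1 and 5
c = kmeans(pts, [0.0, 6.0])
```

Kohonen feature maps and learning vector quantization, covered next, can be viewed as neighborhood-aware and supervised refinements of this same competitive update.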
Neural Networks for Adaptive Control
What is control? Heuristic, open loop, and inverse control.
Feedback algorithms for control. Neural network feedback
control. Reinforcement learning.
Survey of Neural Network Applications in Financial Markets
Bond and stock valuation. Currency rate forecasting.
Trading systems. Commodity price forecasting.
Risk management. Option pricing.
Fuzzy Systems
Fuzzy logic. Fuzzy control.
Adaptive fuzzy and neural-fuzzy systems.
About the Instructors
Todd K. Leen is associate professor of Computer Science and En-
gineering at Oregon Graduate Institute of Science & Technology.
He received his Ph.D. in theoretical physics from the University
of Wisconsin in 1982. From 1982-1987 he worked at IBM Corpora-
tion, and then pursued research in mathematical biology at Good
Samaritan Hospital's Neurological Sciences Institute. He joined
OGI in 1989. Dr. Leen's current research interests include neur-
al learning, algorithms and architectures, stochastic optimiza-
tion, model constraints and pruning, and neural and non-neural
approaches to data representation and coding. He is particularly
interested in fast, local modeling approaches, and applications
to image and speech processing. Dr. Leen served as theory program
chair for the 1993 Neural Information Processing Systems (NIPS)
conference, and workshops chair for the 1994 NIPS conference.
John E. Moody is associate professor of Computer Science and En-
gineering at Oregon Graduate Institute of Science & Technology.
His current research focuses on neural network learning theory
and algorithms in its many manifestations. He is particularly
interested in statistical learning theory, the dynamics of learn-
ing, and learning in dynamical contexts. Key application areas
of his work are adaptive signal processing, adaptive control,
time series analysis, forecasting, economics and finance. Moody
has authored over 35 scientific papers, more than 25 of which
concern the theory, algorithms, and applications of neural net-
works. Prior to joining the Oregon Graduate Institute, Moody was
a member of the Computer Science and Neuroscience faculties at
Yale University. Moody received his Ph.D. and M.A. degrees in
Theoretical Physics from Princeton University, and graduated Sum-
ma Cum Laude with a B.A. in Physics from the University of Chi-
cago.
Hong Pi is a senior research associate at Oregon Graduate Insti-
tute. He received his Ph.D. in theoretical physics from Univer-
sity of Wisconsin in 1989. Prior to joining OGI in 1994 he had
been a postdoctoral fellow and research scientist at Lund Univer-
sity, Sweden. His research interests include nonlinear modeling,
neural network algorithms and applications.
Thorsteinn S. Rognvaldsson received the Ph.D. degree in theoreti-
cal physics from Lund University, Sweden, in 1994. His research
interests are neural networks for prediction and classification.
He is currently a postdoctoral research associate at Oregon Gra-
duate Institute.
Eric A. Wan, Assistant Professor of Electrical Engineering and
Applied Physics, Oregon Graduate Institute of Science & Technolo-
gy, received his Ph.D. in electrical engineering from Stanford
University in 1994. His research interests include learning al-
gorithms and architectures for neural networks and adaptive sig-
nal processing. He is particularly interested in neural applica-
tions to time series prediction, speech enhancement, system iden-
tification, and adaptive control. He is a member of IEEE, INNS,
Tau Beta Pi, Sigma Xi, and Phi Beta Kappa.
For more information contact:
Linda M. Pease, Director
Office of Continuing Education
Oregon Graduate Institute of Science & Technology
PO Box 91000
Portland, OR 97291-1000
+1-503-690-1259
+1-503-690-1686 (fax)
e-mail: continuinged@admin.ogi.edu
WWW home page: http://www.ogi.edu