A neural network architecture and learning algorithm for associative memory storage of analog patterns, continuous sequences, and chaotic attractors in the same network is described. System performance using many different attractors from the family of Chua attractors, and the Chua hardware circuit, is investigated in the problem of real time handwritten digit recognition. Some of these attractors outperform the previously studied Lorenz attractor system in terms of accuracy and speed of convergence. In the normal form projection algorithm, which was developed at Berkeley for associative memory storage of dynamic attractors, a matrix inversion determines network weights, given prototype patterns to be stored. There are N units of capacity in an N node network with 3N² weights. It costs one unit per static attractor, two per Fourier component of each periodic trajectory, and at least three per chaotic attractor. There are no spurious attractors, and for periodic attractors there is a Liapunov function, in a special coordinate system, which governs the approach of transient states to stored trajectories. Unsupervised or supervised incremental learning algorithms for pattern classification, such as competitive learning or bootstrap Widrow-Hoff, can easily be implemented. The architecture can be "folded" into a recurrent network with higher order weights that can be used as a model of cortex that stores oscillatory and chaotic attractors by a Hebb rule. A novel computing architecture has been constructed of recurrently interconnected associative memory modules of this type. Architectural variations employ selective synchronization of modules with chaotic attractors that communicate by broad-spectrum chaotic signals to control the flow of computation.
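The matrix-inversion step can be illustrated with a minimal sketch; this is not the authors' implementation, only the linear part of a projection-style weight computation under assumed conditions. Assuming N linearly independent prototype patterns stored as the columns of a matrix A, linear weights that make each prototype an attracting eigendirection follow from projecting a chosen diagonal normal-form Jacobian back into network coordinates, T = A Λ A⁻¹. The higher-order terms that give the full 3N² weight count are omitted here.

```python
import numpy as np

# Minimal sketch (assumption-laden): matrix inversion determines the linear
# network weights, given prototype patterns to be stored. The full normal
# form projection algorithm also fixes higher-order weights, omitted here.
rng = np.random.default_rng(0)

N = 4                                  # network size; capacity is N units
A = rng.standard_normal((N, N))        # columns: prototype patterns (assumed linearly independent)
lam = -np.ones(N)                      # one stable eigenvalue per stored static pattern (assumption)

# Project the diagonal normal-form Jacobian into network coordinates.
T = A @ np.diag(lam) @ np.linalg.inv(A)

# Check: each prototype is an eigenvector of T with its chosen eigenvalue.
v = A[:, 0]
print(np.allclose(T @ v, lam[0] * v))  # True
```

Under these assumptions, each prototype column of A becomes a stable eigendirection of the linearized dynamics; storing periodic or chaotic attractors instead replaces the diagonal Jacobian with normal-form dynamics having the corresponding limit sets.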