This paper presents an effective algorithm for nonlinearly constrained optimization based on the structured augmented Lagrangian secant update recently proposed by Tapia. The algorithm is globally defined and uses a new, reliable method for choosing the Lagrangian augmentation parameter that requires no prior knowledge of the true Hessian. Considerable numerical experimentation with the algorithm, both embedded in a merit-function line-search SQP framework and without a line search, is reported. The algorithm is compared with the widely used damped BFGS secant update of Powell, which, like the update studied here, was designed to circumvent the lack of positive definiteness in the Hessian of the Lagrangian. It is also established that whenever the algorithm converges, it converges R-superlinearly; this result is strong in that it makes no assumptions about the approximate Hessians or the augmentation parameter. An immediate corollary is a new result in unconstrained optimization: whenever the unconstrained BFGS secant method converges, it converges Q-superlinearly. The study leads to the conclusion that, when properly implemented, Tapia's structured augmented Lagrangian BFGS secant update has strong theoretical properties and is, in experiments, very competitive with Powell's damped BFGS update.
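For context, Powell's damped BFGS update referred to above is conventionally written as follows; this is the standard formulation from the literature, not notation taken from this paper. With step $s_k = x_{k+1}-x_k$, Lagrangian gradient difference $y_k$, and current Hessian approximation $B_k$, the damping replaces $y_k$ by a convex combination $r_k$ so that positive definiteness of $B_{k+1}$ is preserved even when $s_k^{T} y_k \le 0$:

```latex
% Powell's damping parameter (standard choice of threshold 0.2):
\theta_k =
\begin{cases}
1, & s_k^{T} y_k \ge 0.2\, s_k^{T} B_k s_k,\\[4pt]
\dfrac{0.8\, s_k^{T} B_k s_k}{s_k^{T} B_k s_k - s_k^{T} y_k}, & \text{otherwise,}
\end{cases}
\qquad
r_k = \theta_k y_k + (1-\theta_k) B_k s_k .

% Damped BFGS update with r_k in place of y_k:
B_{k+1} = B_k - \frac{B_k s_k s_k^{T} B_k}{s_k^{T} B_k s_k} + \frac{r_k r_k^{T}}{s_k^{T} r_k}.
```

By construction $s_k^{T} r_k \ge 0.2\, s_k^{T} B_k s_k > 0$, so the update keeps $B_{k+1}$ symmetric positive definite, which is the "lack of positive definiteness" issue both Powell's update and the structured augmented Lagrangian update address.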