One of the main obstacles to applying genetic algorithms (GAs) to complex problems has been the high computational cost caused by their slow convergence. To alleviate this difficulty, we developed a hybrid approach that combines a GA with a stochastic variant of the simplex method for function optimization. Our motivation for developing the stochastic simplex method was to introduce a cost-effective exploration component into the conventional simplex method. To make effective use of the simplex operation within a hybrid GA framework, we adopted an elite-based hybrid architecture that applies one simplex step to the top portion of the ranked population. We compared our approach on two function optimization problems with five alternative optimization techniques, including a simplex-GA hybrid independently developed by Renders and Bersini and adaptive simulated annealing (ASA). Overall, these tests showed that our hybrid approach is an effective and robust optimization technique. We also tested our hybrid GA on a benchmark suite of seven real-valued function optimization problems and report the results.
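To make the elite-based architecture concrete, the sketch below is our own illustrative Python, not the authors' implementation: in each generation, one reflection-style simplex step (with an added random perturbation standing in for the stochastic variant) is applied to the top portion of the ranked population, and ordinary GA operators rebuild the remainder. The objective function, population size, elite fraction, and genetic operators shown here are assumptions made purely for illustration.

    import numpy as np

    def sphere(x):
        # Stand-in objective (minimization); any real-valued function could be used.
        return np.sum(x ** 2)

    def stochastic_simplex_step(elite, f, rng, noise=0.1):
        # One reflection-style simplex step on the worst elite point, with a random
        # perturbation added as a stand-in for the stochastic simplex variant.
        order = np.argsort([f(p) for p in elite])
        worst = elite[order[-1]]
        centroid = np.mean(elite[order[:-1]], axis=0)
        reflected = centroid + (centroid - worst) + noise * rng.standard_normal(worst.shape)
        if f(reflected) < f(worst):
            elite[order[-1]] = reflected
        return elite

    def hybrid_ga(f, dim=5, pop_size=40, elite_frac=0.2, gens=200, seed=0):
        rng = np.random.default_rng(seed)
        pop = rng.uniform(-5.0, 5.0, size=(pop_size, dim))
        # A simplex over dim variables needs at least dim + 1 points.
        n_elite = max(dim + 1, int(elite_frac * pop_size))
        for _ in range(gens):
            pop = pop[np.argsort([f(p) for p in pop])]
            # One simplex step applied to the top (elite) portion of the ranked population.
            pop[:n_elite] = stochastic_simplex_step(pop[:n_elite], f, rng)
            # Conventional GA operators (arithmetic crossover + Gaussian mutation) fill the rest.
            children = []
            while len(children) < pop_size - n_elite:
                i, j = rng.integers(0, n_elite, size=2)
                alpha = rng.random()
                child = alpha * pop[i] + (1 - alpha) * pop[j]
                children.append(child + 0.05 * rng.standard_normal(dim))
            pop[n_elite:] = np.array(children)
        pop = pop[np.argsort([f(p) for p in pop])]
        return pop[0], f(pop[0])

    best_x, best_f = hybrid_ga(sphere)
    print(best_f)

The key design choice this sketch illustrates is that the simplex operation acts only on the elite individuals once per generation, so its local-refinement cost stays small relative to the global search carried out by the GA operators.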