Gaussian processes (GPs) are common components in Bayesian non-parametric models, with a rich methodological literature and strong theoretical grounding. However, the use of exact GPs in Bayesian models is limited to problems with at most several thousand observations because of their prohibitive computational demands. We develop a posterior sampling algorithm that uses $\mathcal{H}$-matrix approximations and scales as $O(n \log^2 n)$. We show that the Kullback-Leibler divergence from this approximate posterior to the true posterior can be made arbitrarily small. Although multidimensional GPs could be used with our algorithm, we model $d$-dimensional surfaces as tensor products of univariate GPs to minimize the cost of matrix construction and maximize computational efficiency. We illustrate the performance of this fast increased fidelity approximate GP (FIFA-GP) on both simulated and real data sets.
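To make the computational bottleneck concrete, the following is a minimal illustrative sketch of exact GP posterior sampling in Python/NumPy; it is not the paper's FIFA-GP implementation, and the squared-exponential kernel, hyperparameters, and function names are assumptions chosen for illustration. The dense Cholesky factorization is the $O(n^3)$ step that an $\mathcal{H}$-matrix approximation would replace to obtain the near-linear $O(n \log^2 n)$ scaling stated above.

```python
# Illustrative sketch only: exact GP posterior sampling with an assumed
# squared-exponential kernel. The dense Cholesky below costs O(n^3); the
# abstract's H-matrix approach targets O(n log^2 n) for this step.
import numpy as np

def sq_exp_kernel(x1, x2, lengthscale=0.2, variance=1.0):
    """Squared-exponential covariance between two sets of 1-d inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def exact_gp_posterior_sample(x, y, x_new, noise_var=0.1, seed=None):
    """Draw one posterior sample at x_new given noisy observations (x, y)."""
    rng = np.random.default_rng(seed)
    K = sq_exp_kernel(x, x)             # n x n training covariance
    K_s = sq_exp_kernel(x_new, x)       # cross-covariance
    K_ss = sq_exp_kernel(x_new, x_new)  # covariance at new inputs

    # Dense Cholesky factorization: the cubic-cost bottleneck that a
    # hierarchical (H-matrix) factorization would approximate cheaply.
    L = np.linalg.cholesky(K + noise_var * np.eye(len(x)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

    mean = K_s @ alpha
    v = np.linalg.solve(L, K_s.T)
    cov = K_ss - v.T @ v

    # Sample from the posterior Gaussian (jitter for numerical stability).
    L_post = np.linalg.cholesky(cov + 1e-6 * np.eye(len(x_new)))
    return mean + L_post @ rng.standard_normal(len(x_new))

# Example usage on synthetic 1-d data.
x = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * x) + 0.1 * np.random.default_rng(0).standard_normal(200)
x_new = np.linspace(0, 1, 50)
sample = exact_gp_posterior_sample(x, y, x_new, seed=1)
```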