by T. Friedrich, A. Timmermann, A. Abe-Ouchi, N. R. Bates, M. O. Chikamoto, M. J. Church, J. E. Dore, D. K. Gledhill, M. González-Dávila, M. Heinemann, T. Ilyina, J. H. Jungclaus, E. McLeod, A. Mouchet, and J. M. Santana-Casiano
Nature Climate Change
in press, doi:10.1038/nclimate1372
ABSTRACT: Since the beginning of the Industrial Revolution, humans have released ~500 billion metric tons of carbon to the atmosphere through fossil-fuel burning, cement production and land-use changes [1,2]. About 30% of this carbon has been taken up by the oceans [3]. The oceanic uptake of carbon dioxide alters marine carbonate chemistry, decreasing seawater pH and carbonate ion concentration, a process commonly referred to as ocean acidification. Ocean acidification is considered a major threat to calcifying organisms [4-6]. Detecting its magnitude and impacts on regional scales requires accurate knowledge of the natural variability of surface-ocean carbonate ion concentrations on seasonal to annual timescales and beyond. Ocean observations are too sparse to provide reliable estimates of the signal-to-noise ratio of human-induced trends in carbonate chemistry against natural variability. Using three Earth system models, we show that the current anthropogenic trend in ocean acidification already exceeds the level of natural variability by up to 30 times on regional scales. Furthermore, we demonstrate that the current rates of ocean acidification at monitoring sites in the Atlantic and Pacific oceans exceed those experienced during the last glacial termination by two orders of magnitude.
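The abstract frames detection as a signal-to-noise problem: the anthropogenic trend (signal) is compared against natural variability (noise). The sketch below is not the paper's method; it is a minimal illustration of one common way to form such a ratio, using an entirely synthetic pH time series with an assumed trend and assumed interannual noise (all numerical values are placeholders chosen for illustration).

```python
import numpy as np

# Illustrative only: synthetic surface-ocean pH record with an imposed
# linear trend plus random "natural" variability. The trend magnitude,
# noise level, and record length are assumptions, not data from the paper.
rng = np.random.default_rng(0)

years = np.arange(1990, 2011)   # 21 hypothetical annual observations
true_trend = -0.0018            # assumed pH change per year (illustrative)
noise_sd = 0.003                # assumed interannual variability (illustrative)
ph = 8.10 + true_trend * (years - years[0]) + rng.normal(0.0, noise_sd, years.size)

# Estimate the linear trend by least squares, then detrend to isolate variability.
slope, intercept = np.polyfit(years, ph, 1)
residuals = ph - (slope * years + intercept)

# Signal: total estimated pH change over the record.
# Noise: standard deviation of the detrended series.
signal = abs(slope) * (years[-1] - years[0])
noise = residuals.std(ddof=1)
snr = signal / noise
print(f"estimated trend = {slope:.5f} pH units/yr, signal-to-noise = {snr:.1f}")
```

With a longer record or weaker variability the ratio grows, which is why the paper's model-based estimate of natural variability is central to claiming that regional acidification trends already stand out by up to a factor of 30.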