For this crowd, "machine learning" should really be read as parallel Bayesian inversion via MCMC (Metropolis modified with a form of "parallel tempering"), but that was a bit much to explain to a popular audience.
The underlying paper is https://doi.org/10.1126/science.adh3875:
Cox, Alexander A., and C. Brenhin Keller. "A Bayesian inversion for emissions and export productivity across the end-Cretaceous boundary." Science 381.6665 (2023): 1446-1451.
That paper is more focused on the general concepts than the implementation, but FWIW the new code is in Julia and in the supp mat, and it calls an existing C program, LOSCAR, for the forward modelling.
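To give a flavour of what "Metropolis with parallel tempering" means, here's a minimal toy sketch in Python (not the authors' Julia code). A made-up bimodal log-density stands in for the real log-posterior, which in the paper comes from comparing LOSCAR forward-model output against proxy records. Several chains run at different "temperatures" (flatter versions of the target) and occasionally swap states, which lets the cold chain escape local modes:

```python
import math
import random

def log_target(x):
    # Toy bimodal density: mixture of two unit Gaussians at -3 and +3.
    # In the real inversion this would be the log-posterior from the
    # forward model vs. data; this stand-in is purely illustrative.
    return math.log(math.exp(-0.5 * (x - 3.0) ** 2) +
                    math.exp(-0.5 * (x + 3.0) ** 2))

def parallel_tempering(log_target, temps, n_steps, step_size=1.0, seed=0):
    rng = random.Random(seed)
    chains = [0.0 for _ in temps]   # one chain per temperature, T=1 first
    cold_samples = []               # samples kept from the cold (T=1) chain
    for _ in range(n_steps):
        # Standard Metropolis update within each chain, with the
        # log-density flattened by a factor of 1/T.
        for i, T in enumerate(temps):
            x = chains[i]
            prop = x + rng.gauss(0.0, step_size)
            log_accept = (log_target(prop) - log_target(x)) / T
            if math.log(rng.random()) < log_accept:
                chains[i] = prop
        # Propose swaps between adjacent temperatures (replica exchange);
        # accepted with probability min(1, exp((b_i - b_j) * (logp_j - logp_i)))
        # where b = 1/T.
        for i in range(len(temps) - 1):
            a, b = chains[i], chains[i + 1]
            log_swap = ((1.0 / temps[i] - 1.0 / temps[i + 1]) *
                        (log_target(b) - log_target(a)))
            if math.log(rng.random()) < log_swap:
                chains[i], chains[i + 1] = b, a
        cold_samples.append(chains[0])
    return cold_samples

samples = parallel_tempering(log_target, temps=[1.0, 2.0, 4.0, 8.0],
                             n_steps=20000)
```

With tempering the cold chain visits both modes; a plain single-temperature Metropolis walker with the same step size tends to get stuck on whichever side it starts. The real machinery (parallelism across many cores, the LOSCAR forward model, the actual emissions/productivity parameterisation) is in the paper's supplementary code.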