Matthias Kellner
@matthiaskellner.bsky.social
PhD student in @labcosmo.bsky.social
reposted by Matthias Kellner
COSMO Lab
2 days ago
So let us show you just how *universal* #PET-MAD-1.5 can be. This is a movie of a parallel tempering simulation, with replicas from 300 K to 3000 K, of what we call a "Mendeleev cluster" - one atom each of every element from 1 to 102.
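The post does not say how the replica temperatures were spaced; a common choice for parallel tempering is a geometric ladder. A minimal sketch, assuming geometric spacing between 300 K and 3000 K and a made-up replica count:

```python
import numpy as np

# Hypothetical replica count; the post does not say how many replicas were used.
n_replicas = 16
t_min, t_max = 300.0, 3000.0  # kelvin, the range quoted in the post

# A geometric ladder keeps the ratio between neighbouring temperatures constant,
# which roughly equalizes exchange acceptance across the replicas.
temperatures = t_min * (t_max / t_min) ** (np.arange(n_replicas) / (n_replicas - 1))
print(np.round(temperatures, 1))
```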
reposted by Matthias Kellner
COSMO Lab
15 days ago
New #preprint is out! Investigating the many flavors of last-layer #UQ, Moritz and Matthias propose a practitioners' guide on "how to train a shallow ensemble". TL;DR? For good calibration use an NLL loss, include forces, and optimize the backbone; fine-tune for speed!
arxiv.org/html/2602.15...
How to Train a Shallow Ensemble
https://arxiv.org/html/2602.15747v1
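The full recipe is in the preprint; as a rough illustration of the "use NLL" advice, here is a minimal PyTorch sketch of a Gaussian negative log-likelihood over a last-layer ensemble of energy heads. All names and sizes are illustrative, and the actual loss in the paper also includes force terms.

```python
import torch

# Toy setup: a frozen backbone produces per-structure features, and a "shallow
# ensemble" is just several linear readout heads on top of those features.
n_members, n_features = 8, 64
features = torch.randn(32, n_features)   # 32 structures (illustrative)
targets = torch.randn(32, 1)             # reference energies (illustrative)
heads = torch.nn.ModuleList(
    [torch.nn.Linear(n_features, 1) for _ in range(n_members)]
)

# Ensemble mean and variance of the predicted energies.
preds = torch.stack([head(features) for head in heads], dim=0)  # (members, batch, 1)
mean = preds.mean(dim=0)
var = preds.var(dim=0) + 1e-6            # small floor for numerical safety

# Gaussian negative log-likelihood: penalizes both error and miscalibrated variance.
nll = 0.5 * (torch.log(var) + (targets - mean) ** 2 / var).mean()
nll.backward()                           # gradients flow into all ensemble heads
```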
No Install. No Setup. Just Chemical Shift Predictions. At shiftml.materialscloud.io we host the latest ShiftML3 on the web, where you can predict chemical shifts of organic crystals for free!
ShiftML-3 predictor
https://shiftml.materialscloud.io/
16 days ago
What a cool applet - running a universal MLIP directly from your web browser!
3 months ago
reposted by Matthias Kellner
COSMO Lab
3 months ago
PET-MAD is here! It has been for a while for those who read the #arXiv, but now you get it preciously typeset by @natcomms.nature.com. Take home: an unconstrained architecture + good training-set choices give you a fast, accurate and stable universal MLIP that just works™
www.nature.com/articles/s41...
PET-MAD as a lightweight universal interatomic potential for advanced materials modeling - Nature Communications
PET-MAD is a fast and lightweight universal machine-learning potential, trained on a small but diverse dataset, that delivers near-quantum accuracy in atomistic simulations for both organic and inorga...
https://www.nature.com/articles/s41467-025-65662-7
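For a quick try, PET-MAD is advertised as usable through ASE; the import path, class name and arguments below are taken from the lab-cosmo/pet-mad README as I recall it, so treat them as assumptions and check the repository for the current API.

```python
from ase.build import bulk

# Assumed import path, class name and arguments, based on the lab-cosmo/pet-mad
# README as I recall it; check the repository if the API has changed.
from pet_mad.calculator import PETMADCalculator

atoms = bulk("Si", "diamond", a=5.43)          # any ASE Atoms object
atoms.calc = PETMADCalculator(version="latest", device="cpu")

print(atoms.get_potential_energy())            # energy in eV from the universal MLIP
print(atoms.get_forces())                      # forces in eV/Å
```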
reposted by Matthias Kellner
COSMO Lab
6 months ago
Anticipating Wei Bin's talk at #psik2025 (noon @ room A), a new #preprint uses PET and the MAD dataset to train a universal #ml model for the density of states, giving band gaps for solids, clusters, surfaces and molecules with an MAE of ~200 meV. Go to the talk, or check out arxiv.org/html/2508.17...!
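The preprint derives band gaps from the predicted density of states; one simple way to read a gap off a DOS on an energy grid - an illustrative choice, not necessarily the paper's procedure - is to measure the width of the low-DOS window around the Fermi level:

```python
import numpy as np

def gap_from_dos(energies, dos, e_fermi=0.0, threshold=1e-3):
    """Estimate a band gap as the width of the low-DOS window containing E_F.

    Illustrative post-processing only, not the preprint's method; the grid and
    `threshold` are assumptions.
    """
    empty = dos < threshold
    occupied = np.where(~empty & (energies < e_fermi))[0]  # states below E_F
    virtual = np.where(~empty & (energies > e_fermi))[0]   # states above E_F
    if len(occupied) == 0 or len(virtual) == 0:
        return 0.0
    vbm = energies[occupied[-1]]   # highest grid point with occupied states
    cbm = energies[virtual[0]]     # lowest grid point with empty states
    return max(cbm - vbm, 0.0)

# Toy DOS with a ~1.2 eV gap centred on E_F = 0
e = np.linspace(-5, 5, 2001)
dos = np.where(np.abs(e) > 0.6, 1.0, 0.0)
print(gap_from_dos(e, dos))  # ~1.2
```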
We're introducing ShiftML3, a new ShiftML model for chemical shielding predictions in organic solids.
* ShiftML3 predicts full chemical shielding tensors
* DFT accuracy for 1H, 13C, and 15N
* ASE integration
* GPU support
Code: github.com/lab-cosmo/Sh...
Install from PyPI: pip install shiftml
GitHub - lab-cosmo/shiftml: A python package for the prediction of chemical shieldings of organic solids and beyond.
https://github.com/lab-cosmo/ShiftML
6 months ago
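A minimal usage sketch via the advertised ASE integration; the import path, model identifier and method name are assumptions based on the lab-cosmo/ShiftML README as I recall it, so check the repository for the current API.

```python
from ase.io import read

# Assumed import path, model name and method, based on the lab-cosmo/ShiftML
# README as I recall it; check the repository for the current API.
from shiftml.ase import ShiftML

frame = read("organic_crystal.cif")        # any ASE-readable organic crystal (hypothetical file)
model = ShiftML("ShiftML3")                # model identifier string is an assumption

iso_shieldings = model.get_cs_iso(frame)   # isotropic chemical shieldings per atom
print(iso_shieldings)
```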
reposted by Matthias Kellner
COSMO Lab
6 months ago
#machinelearning for #compchem goodies from our team incoming! After years of work it's time to share. Go check arxiv.org/abs/2508.15704 and/or metatensor.org to learn about #metatensor and #metatomic. What they are, what they do, why you should use them for all of your atomistic ML projects.
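For a feel of what metatensor stores, here is a rough sketch of a single-block TensorMap of per-structure energies; the constructor names and argument layout are assumptions from the metatensor docs as I recall them, so check metatensor.org before relying on them.

```python
import numpy as np
from metatensor import Labels, TensorBlock, TensorMap  # assumed import path

# One block holding per-structure energies; Labels values are 2-D integer arrays.
block = TensorBlock(
    values=np.array([[1.0], [2.0]]),                              # (samples, properties)
    samples=Labels(["system"], np.array([[0], [1]], dtype=np.int32)),
    components=[],
    properties=Labels(["energy"], np.array([[0]], dtype=np.int32)),
)

# A TensorMap is a set of such blocks indexed by keys (a single dummy key here).
tensor = TensorMap(Labels(["_"], np.array([[0]], dtype=np.int32)), [block])
print(tensor)
```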
reposted by Matthias Kellner
COSMO Lab
9 months ago
DFT-accurate, with built-in uncertainty quantification, providing chemical shielding anisotropy - ShiftML3.0 has it all! Building on a successful @nccr-marvel.bsky.social-funded collaboration with LRM, it just landed on the arXiv arxiv.org/html/2506.13... and on PyPI pypi.org/project/shif...
reposted by Matthias Kellner
COSMO Lab
11 months ago
When you combine #machinelearning and #compchem, you need to start worrying about the QM details within your ML architecture. We use our indirect Hamiltonian framework and pySCFAD to explore the enormous design space: arxiv.org/abs/2504.01187
reposted by Matthias Kellner
COSMO Lab
12 months ago
PET-MAD has just landed! What if I told you that you can match & improve on the accuracy of other "universal" #machinelearning potentials while training on fewer than 100k atomic structures? And be *faster*, with an unconstrained architecture that is conservative, with only tiny symmetry breaking? Sounds like ...
reposted by Matthias Kellner
COSMO Lab
about 1 year ago
Happy to share a new #cookbook recipe that showcases several new software developments in the lab, using the good ole' QTIP4P/f water model as an example: atomistic-cookbook.org/examples/wat... TL;DR - you can now build torch-based interatomic potentials, export them, and use them wherever you like!
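The recipe itself uses the lab's metatensor/metatomic tooling; as a generic sketch of the underlying "export and run anywhere" idea - plain TorchScript, not the cookbook's actual exporter - compiling a torch module makes it loadable outside Python:

```python
import torch

class ToyPairPotential(torch.nn.Module):
    """Stand-in for a trained interatomic potential (illustrative only)."""

    def forward(self, distances: torch.Tensor) -> torch.Tensor:
        # Lennard-Jones-like pair energy summed over all pair distances.
        return (4.0 * (distances ** -12.0 - distances ** -6.0)).sum()

model = ToyPairPotential()
scripted = torch.jit.script(model)    # compile to TorchScript
scripted.save("toy_potential.pt")     # self-contained artifact that non-Python MD drivers can load
```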
Feeling a bit lonely here ...
over 1 year ago