Felix Koehler
@felix-m-koehler.bsky.social
83 · 58 · 30
Machine Learning & Simulation | YouTuber | PhD student @ Thuerey Group
Calling for (4-page) workshop papers on #differentiable programming and #SciML @euripsconf.bsky.social.
16 days ago
0 · 1 · 0
reposted by
Felix Koehler
EurIPS Conference
16 days ago
It may only be a band-aid, but we have just announced our new "Salon des Refusés" sessions for papers rejected due to space constraints:
bsky.app/profile/euri...
4 · 14 · 5
Check out my latest video on implementing an attention-based neural operator/emulator (i.e., a Transformer) in JAX:
youtu.be/GVVWpyvXq_s
Transformer Neural Operator in JAX
YouTube video by Machine Learning & Simulation
https://youtu.be/GVVWpyvXq_s
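The video builds a full Transformer neural operator in JAX; as a minimal, framework-free sketch of the core ingredient, here is single-head scaled dot-product self-attention in plain NumPy (all names, shapes, and the random weights are illustrative, not taken from the video):

```python
import numpy as np

def self_attention(x, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a token sequence.

    x: (n_tokens, d_model); Wq/Wk/Wv: (d_model, d_head) projections.
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])          # (n_tokens, n_tokens)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ v                               # (n_tokens, d_head)

rng = np.random.default_rng(0)
x = rng.normal(size=(16, 32))                # 16 spatial tokens, 32 channels
Wq, Wk, Wv = (rng.normal(size=(32, 8)) * 0.1 for _ in range(3))
out = self_attention(x, Wq, Wk, Wv)
```

Without positional information the layer is permutation-equivariant in the tokens, which is one reason attention is attractive as a mesh-agnostic neural-operator building block.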
4 months ago
1 · 3 · 0
Travelling to Singapore next week for #ICLR2025, presenting this paper (Sat 3 pm, nr. 538):
arxiv.org/abs/2502.19611
DM me (Whova, email, or bsky) if you want to chat about (autoregressive) neural emulators/operators for PDEs, autodiff, differentiable physics, numerical solvers, etc.
PRDP: Progressively Refined Differentiable Physics
The physics solvers employed for neural network training are primarily iterative, and hence, differentiating through them introduces a severe computational burden as iterations grow large. Inspired by...
https://arxiv.org/abs/2502.19611
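PRDP's actual contribution is progressively refining the solver during training; as a toy illustration (not the paper's method) of the underlying tension, here is a fixed-point problem where the implicit function theorem gives the derivative at the converged state in one step, whereas naive unrolled differentiation would pay for every solver iteration:

```python
import numpy as np

def solve_fixed_point(a, n_iter):
    """Iterate x <- cos(a * x), a contraction for 0 < a < 1."""
    x = 1.0
    for _ in range(n_iter):
        x = np.cos(a * x)
    return x

def implicit_grad(a, x_star):
    """d x*/d a via the implicit function theorem:
    x* = cos(a x*)  =>  dx*/da = -x* sin(a x*) / (1 + a sin(a x*)).
    One evaluation at the converged state, independent of n_iter.
    """
    s = np.sin(a * x_star)
    return -x_star * s / (1.0 + a * s)
```

Differentiating through the loop itself (by autodiff or finite differences) costs work proportional to `n_iter`; the implicit formula does not, which is the kind of trade-off that makes gradients of iterative physics solvers expensive.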
6 months ago
0 · 2 · 1
Check out my latest video on approximating the full Lyapunov spectrum for the Lorenz system:
youtu.be/Enves8MDwms
Nice showcase of #JAX's features:
- `jax.lax.scan` for the autoregressive rollout
- `jax.linearize` for repeated JVPs
- `jax.vmap` for automatic vectorization
Full Lyapunov Spectrum of Chaotic Lorenz System using JAX
YouTube video by Machine Learning & Simulation
https://youtu.be/Enves8MDwms
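The video leans on JAX (`jax.lax.scan` for the rollout, `jax.linearize` for the tangent map, `jax.vmap` to push a whole basis at once); the algorithm underneath is the classic Benettin/QR method. Here is a plain NumPy sketch of it with a simple explicit-Euler step (step sizes, iteration counts, and the initial state are illustrative, not the video's settings):

```python
import numpy as np

def lorenz_rhs(u, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = u
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def lorenz_jac(u, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = u
    return np.array([
        [-sigma, sigma, 0.0],
        [rho - z, -1.0, -x],
        [y, x, -beta],
    ])

def lyapunov_spectrum(n_steps=20_000, dt=0.01):
    """Benettin/QR method: co-evolve a tangent basis with the trajectory,
    re-orthonormalize via QR each step, average the log R-diagonals."""
    u = np.array([1.0, 1.0, 1.0])
    Q = np.eye(3)
    log_sums = np.zeros(3)
    for _ in range(1000):                # discard the transient
        u = u + dt * lorenz_rhs(u)
    for _ in range(n_steps):
        Q = Q + dt * lorenz_jac(u) @ Q   # tangent-linear Euler step
        u = u + dt * lorenz_rhs(u)
        Q, R = np.linalg.qr(Q)
        log_sums += np.log(np.abs(np.diag(R)))
    return log_sums / (n_steps * dt)
```

A useful sanity check: the exponents should sum to roughly the (constant) trace of the Lorenz Jacobian, -(sigma + 1 + beta) ≈ -13.67, with the middle exponent near zero.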
6 months ago
1 · 0 · 0
Art.
6 months ago
0 · 0 · 0
Today, I had the chance to present my #NeurIPS paper "APEBench" @SimAI4Science. You can find the recording on YouTube:
youtu.be/wie-SzD6AJE
APEBench Talk @ Pasteur Labs Journal Club
YouTube video by Machine Learning & Simulation
https://youtu.be/wie-SzD6AJE
8 months ago
0 · 5 · 1
Thanks @munichcenterml.bsky.social for highlighting my recent #NeurIPS paper: APEBench, a new benchmark suite for autoregressive emulators of PDEs, to understand how we might solve the models of nature more efficiently. More details in the thread. Visual summary on the project page:
tum-pbs.github.io/apebench-pap...
8 months ago
1 · 0 · 0
reposted by
Felix Koehler
Roy Frostig
8 months ago
Our online book on systems principles of LLM scaling is live at
jax-ml.github.io/scaling-book/
We hope that it helps you make the most of your computing resources. Enjoy!
3 · 34 · 10
reposted by
Felix Koehler
N. Thuerey's research group at TUM
9 months ago
I'd like to thank everyone contributing to our five accepted ICLR papers for the hard work! Great job, everyone! Here's a quick list; stay tuned for details & code in the upcoming weeks…
0 · 5 · 1
I created a video to help you get started with the APEBench suite (my recent #NeurIPS paper) for benchmarking autoregressive neural emulators for PDEs, using a simple ConvNet emulation of 1D advection:
youtu.be/q8fjQ4ZFynw
APEBench Quickstart
YouTube video by Machine Learning & Simulation
https://youtu.be/q8fjQ4ZFynw
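APEBench itself handles data generation, training, and rollout metrics; as a framework-free sketch of why a small ConvNet can emulate 1D advection at all, here is a first-order upwind solver and the 2-tap periodic linear stencil that reproduces it exactly (grid spacing, step size, and the "trained" kernel are illustrative, not APEBench's defaults):

```python
import numpy as np

def advection_step(u, c=1.0, dt=0.01, dx=0.02):
    """First-order upwind (FTBS) step for u_t + c u_x = 0, periodic in x."""
    return u - c * dt / dx * (u - np.roll(u, 1))

def stencil_emulator(u, kernel):
    """A linear 'emulator': one periodic 2-tap convolution, the stencil a
    hypothetical trained ConvNet would need to recover for advection."""
    return kernel[0] * np.roll(u, 1) + kernel[1] * u

cfl = 1.0 * 0.01 / 0.02               # c * dt / dx = 0.5
kernel = np.array([cfl, 1.0 - cfl])   # the exact upwind stencil
```

Because the solver update is itself a fixed local stencil, a ConvNet emulator only has to discover these two weights; harder PDEs in the suite need genuinely nonlinear stencils.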
9 months ago
0 · 3 · 0
Happy new year! Two days ago we entered 2025, and just in time the channel surpassed 25k subscribers. Wow! Thanks to everyone for their kind words and support along the way:
www.youtube.com/channel/UCh0...
9 months ago
0 · 8 · 1
Check out my latest video on approximating the largest Lyapunov exponent of a dynamical system by integrating a tangent linear perturbation dynamic via autodiff in JAX:
youtu.be/zRMBIkpcuu0
A very neat use case of forward-mode AD for efficient Lyapunov-exponent approximation.
Largest Lyapunov Exponent using Autodiff in JAX/Python
YouTube video by Machine Learning & Simulation
https://youtu.be/zRMBIkpcuu0
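The video drives the tangent dynamics with forward-mode autodiff in JAX; here is a NumPy sketch of the same scheme where the Jacobian-vector product is written out by hand as a stand-in for what `jax.jvp` would provide (integrator, step sizes, and iteration counts are illustrative):

```python
import numpy as np

def lorenz_rhs(u, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = u
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def lorenz_jvp(u, v, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Jacobian-vector product of the Lorenz right-hand side at u applied
    to v; forward-mode AD yields this without forming the Jacobian."""
    x, y, z = u
    dx, dy, dz = v
    return np.array([
        sigma * (dy - dx),
        dx * (rho - z) - x * dz - dy,
        dx * y + x * dy - beta * dz,
    ])

def largest_lyapunov(n_steps=20_000, dt=0.01):
    """Propagate one tangent perturbation alongside the trajectory,
    renormalize it each step, and average the log growth factors."""
    u = np.array([1.0, 1.0, 1.0])
    v = np.array([1.0, 0.0, 0.0])
    log_sum = 0.0
    for _ in range(1000):               # discard the transient
        u = u + dt * lorenz_rhs(u)
    for _ in range(n_steps):
        v = v + dt * lorenz_jvp(u, v)   # tangent-linear Euler step
        u = u + dt * lorenz_rhs(u)
        growth = np.linalg.norm(v)
        log_sum += np.log(growth)
        v = v / growth
    return log_sum / (n_steps * dt)
```

The perturbation aligns with the fastest-growing direction on its own, so a single tangent vector suffices for the largest exponent; the full spectrum needs an orthonormalized basis instead.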
10 months ago
0 · 1 · 0
reposted by
Felix Koehler
Samuel Vaiter
10 months ago
Automatic differentiation in forward mode computes derivatives by breaking functions down into elementary operations and propagating derivatives alongside values. It's efficient for functions with fewer inputs than outputs and for Jacobian-vector products, using, for instance, dual numbers.
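As a minimal concrete version of the dual-number idea in the post (a toy sketch, not how production AD systems are implemented), forward mode can be built from an overloaded number type that carries a value together with its derivative:

```python
import math

class Dual:
    """Dual number a + b*eps with eps**2 == 0; `dot` carries the derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule: (a + a'eps)(b + b'eps) = ab + (ab' + a'b)eps
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def sin(x):
    """sin lifted to dual numbers via the chain rule."""
    if isinstance(x, Dual):
        return Dual(math.sin(x.val), math.cos(x.val) * x.dot)
    return math.sin(x)

def derivative(f, x):
    """Forward-mode derivative: seed the input with dot = 1, read it out."""
    return f(Dual(x, 1.0)).dot
```

Each elementary operation propagates both the value and its derivative, which is exactly the value-plus-tangent propagation described in the post.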
2 · 37 · 10
Now presenting APEBench at #NeurIPS in West #5407.
10 months ago
0 · 1 · 0
reposted by
Felix Koehler
Miles Cranmer
10 months ago
Excited to be at NeurIPS this week! I'm part of four exciting projects being presented:
- The Well & Multimodal Universe: massive, curated scientific datasets
- LaSR: LLM concept evolution for symbolic regression
- MPP: 0th gen
@polymathicai.bsky.social
All posters Wed/Thu - stop by!
0 · 15 · 3
reposted by
Felix Koehler
Samuel Vaiter
10 months ago
I am at #NeurIPS and will present it today, Wed 11, at 11:00 am, West Ballroom A-D #6001.
0 · 11 · 2
I will be presenting my poster on APEBench on Thursday from 11:00 to 14:00 PST at West Ballroom A-D #5407. This was done as part of my PhD with
@thuereygroup.bsky.social
in collaboration with my talented co-author, Simon Niedermayr, who is supervised by Rüdiger Westermann.
10 months ago
1 · 4 · 2