yang
@yang-chu.bsky.social
minds and machines, computational neuroscience, machine learning, computer architecture
Reposted by yang · Dan Goodman · 5 days ago
New preprint! What happens if you add neuromodulation to spiking neural networks and let them go wild with it? TLDR: it can improve performance, especially in challenging sensory processing tasks. Explainer thread below. 🤖🧠🧪
www.biorxiv.org/content/10.1...
Neuromodulation enhances dynamic sensory processing in spiking neural network models
Neuromodulators allow circuits to dynamically change their biophysical properties in a context-sensitive way. In addition to their role in learning, neuromodulators have been suggested to play a role ...
https://www.biorxiv.org/content/10.1101/2025.07.25.666748v1
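For readers unfamiliar with the idea: neuromodulation in an SNN can be caricatured as a context signal that rescales a neuron's biophysical parameters on the fly. A minimal sketch, assuming a leaky integrate-and-fire neuron whose effective time constant is multiplicatively gated by the modulatory signal (the preprint's actual parameterisation may well differ; `lif_with_neuromodulation` and its arguments are hypothetical names):

```python
def lif_with_neuromodulation(inputs, mod, tau=20.0, v_th=1.0, dt=1.0):
    """Leaky integrate-and-fire neuron whose membrane time constant is
    scaled by a per-step neuromodulatory signal `mod`.
    Illustrative toy only, not the preprint's model."""
    v = 0.0
    spikes = []
    for x, m in zip(inputs, mod):
        # neuromodulation rescales the effective integration time constant
        tau_eff = tau * m
        v += dt / tau_eff * (-v + x)
        if v >= v_th:
            spikes.append(1)
            v = 0.0  # reset after a spike
        else:
            spikes.append(0)
    return spikes
```

With a constant suprathreshold input, a smaller modulatory gain (faster effective time constant) makes the neuron fire more often than the unmodulated baseline, which is the kind of context-dependent change in dynamics the post alludes to.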
Reposted by yang · Marcus Ghosh · about 2 months ago
How does the structure of a neural circuit shape its function? @neuralreckoning.bsky.social & I explore this in our new preprint: doi.org/10.1101/2025... 🤖🧠🧪 🧵 1/9
Reposted by yang · Marcus Ghosh · 2 months ago
How can we best use AI in science? Nine other research fellows from @imperial-ix.bsky.social and I use AI methods in domains from plant biology (🌱) to neuroscience (🧠) and particle physics. Together we suggest 10 simple rules in @plos.org 🧵 doi.org/10.1371/jour...
Reposted by yang · Dan Goodman · 2 months ago
New preprint for #neuromorphic and #SpikingNeuralNetwork folk (with @pengfei-sun.bsky.social). arxiv.org/abs/2507.16043
Surrogate gradients are popular for training SNNs, but some worry whether they really learn complex temporal spike codes. TLDR: we tested this, and yes they can! 🧵 🤖🧠🧪
Beyond Rate Coding: Surrogate Gradients Enable Spike Timing Learning in Spiking Neural Networks
We investigate the extent to which Spiking Neural Networks (SNNs) trained with Surrogate Gradient Descent (Surrogate GD), with and without delay learning, can learn from precise spike timing beyond fi...
https://arxiv.org/abs/2507.16043
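The trick being stress-tested here: in the forward pass a spike is a hard threshold, but in the backward pass its zero-almost-everywhere derivative is swapped for a smooth surrogate. A minimal sketch using a fast-sigmoid-style surrogate, one common choice; the paper may use a different surrogate shape, and the function names below are illustrative:

```python
def spike_forward(v, v_th=1.0):
    """Forward pass: hard threshold on membrane potential.
    This Heaviside step is non-differentiable at v_th and has zero
    gradient everywhere else."""
    return 1.0 if v >= v_th else 0.0

def spike_surrogate_grad(v, v_th=1.0, beta=10.0):
    """Backward pass: derivative of a fast sigmoid, used in place of
    the Heaviside's useless true gradient. Peaks at the threshold and
    decays with distance from it; `beta` sets the sharpness."""
    return 1.0 / (beta * abs(v - v_th) + 1.0) ** 2
```

In an autodiff framework this pair is wired up as a custom-gradient op: the forward pass emits the binary spike, while backpropagation routes gradients through the surrogate, letting error signals reach the membrane potential and, through it, the timing of upstream spikes.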
Reposted by yang · Gabriel Béna 🐻 · 4 months ago
The REAL question on everyone's lips though... Blog: gabrielbena.github.io/blog/2025/be... Thread: bsky.app/profile/sola...
Reposted by yang · Gabriel Béna 🐻 · 4 months ago
New #Preprint alert!! 🤖 🧠 🧪 What if we could train neural cellular automata to develop continuous universal computation through gradient descent?! We have started to chart a path toward this goal in our new preprint: arXiv: arxiv.org/abs/2505.13058 Blog: gabrielbena.github.io/blog/2025/be... 🧵⬇️
A Path to Universal Neural Cellular Automata | Gabriel BΓ©na
Exploring how neural cellular automata can develop continuous universal computation through training by gradient descent
https://gabrielbena.github.io/blog/2025/bena2025unca
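As background, a neural cellular automaton applies one small learned rule to every cell in a grid, using only local neighbourhood information per step. A toy NumPy sketch, assuming a 3x3 perception window and a single linear-plus-tanh rule; the preprint's architecture is richer, and `nca_step` and `w` are illustrative names:

```python
import numpy as np

def nca_step(state, w):
    """One neural-cellular-automaton update on a (H, W, C) state grid.
    Each cell flattens its 3x3 neighbourhood into a 9*C perception
    vector and maps it through a shared learned rule `w` of shape
    (9*C, C). Toy sketch only."""
    H, Wd, C = state.shape
    # wrap-around padding gives a toroidal grid
    padded = np.pad(state, ((1, 1), (1, 1), (0, 0)), mode="wrap")
    out = np.zeros_like(state)
    for i in range(H):
        for j in range(Wd):
            patch = padded[i:i + 3, j:j + 3].reshape(-1)  # 9*C values
            out[i, j] = np.tanh(patch @ w)                # shared rule
    return out
```

Training by gradient descent then means unrolling `nca_step` for many iterations and backpropagating a loss on the final grid through the shared rule, which is what makes the "universal computation" question interesting: one tiny local rule has to orchestrate global behaviour.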
Reposted by yang · Dan Goodman · 5 months ago
How do babies and blind people learn to localise sound without labelled data? We propose that innate mechanisms can provide coarse-grained error signals to bootstrap learning. New preprint from @yang-chu.bsky.social. 🤖🧠🧪 arxiv.org/abs/2001.10605
Learning spatial hearing via innate mechanisms
The acoustic cues used by humans and other animals to localise sounds are subtle, and change during and after development. This means that we need to constantly relearn or recalibrate the auditory spa...
https://arxiv.org/abs/2001.10605
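To make "coarse-grained error signals" concrete, here is a hypothetical toy, not the paper's model: a learner maps an interaural time difference (ITD) to an azimuth estimate with one learned scale, and the only teaching signal is an innate left/right sign cue, never a labelled angle. `learn_itd_scale` and all its parameters are illustrative assumptions:

```python
import random

def learn_itd_scale(steps=2000, true_scale=2.0, lr=0.01, seed=0):
    """Bootstrap a sound-localisation mapping from a sign-only teacher.
    The true azimuth (true_scale * itd) is never observed; the innate
    mechanism only reports whether the source is further left or right
    than the current estimate."""
    rng = random.Random(seed)
    scale = 0.0  # learned ITD-to-azimuth gain
    for _ in range(steps):
        itd = rng.uniform(-1.0, 1.0)
        azimuth = true_scale * itd      # unobserved ground truth
        estimate = scale * itd
        # innate cue: only the SIGN of the localisation error
        coarse_error = 1.0 if azimuth > estimate else -1.0
        scale += lr * coarse_error * itd  # sign-based correction
    return scale
```

Even though each update carries a single bit of error information, the sign-based corrections consistently push `scale` toward the true mapping, illustrating how a coarse innate signal can substitute for labelled supervision.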