Yuzhe Yang
@yuzheyang.bsky.social
AI Prof @ UCLA | RS @ Google | PhD @ MIT
#ML, #AI, #health
https://www.cs.ucla.edu/~yuzhe
Pinned post:
My lab at UCLA is hiring PhD students and postdocs! Please apply to UCLA CS or CompMed and mention my name if you are interested in foundation models and (Gen)AI for health / medicine / science. More info:
cs.ucla.edu/~yuzhe
5 months ago
Exciting news for the Health Intelligence Lab at UCLA: 3/3 papers accepted to #ICML (1 as Spotlight)!
SleepLM: Natural-Language Intelligence for Human Sleep (Spotlight)
OSF: On Pre-Training and Scaling of Sleep Foundation Models
HEARTS: Benchmarking LLM Reasoning on Health Time Series
8 days ago
Meet OSF, a fully open benchmark and state-of-the-art sleep foundation models. We study pre-training and scaling recipes that actually improve generalization in real-world settings.
Website:
yang-ai-lab.github.io/osf/
Code:
github.com/yang-ai-lab/...
Models:
hf.co/yang-ai-lab/...
about 2 months ago
Can LLMs really reason over health time series? Introducing HEARTS, the first living benchmark built for health time-series reasoning.
Website:
yang-ai-lab.github.io/HEARTS
Code:
github.com/yang-ai-lab/...
Dataset:
hf.co/datasets/yan...
Leaderboard:
yang-ai-lab.github.io/HEARTS/leade...
about 2 months ago
What if your sleep signals could speak? Introducing SleepLM, sleep-language foundation models that turn raw sleep into something we can describe, query, and localize with language.
Website:
yang-ai-lab.github.io/SleepLM
Code:
github.com/yang-ai-lab/...
Models:
hf.co/yang-ai-lab/...
2 months ago
Let your wearable data "speak" for themselves! Introducing *SensorLM*, a family of sensor-language foundation models, trained on ~60 million hours of data from >103K people, enabling robust wearable sensor data understanding with natural language.
11 months ago
reposted by
Yuzhe Yang
Prof Sam Illingworth
about 1 year ago
AI underdiagnoses Black female patients. A new study found that expert-level vision-language models for chest X-rays systematically underdiagnose marginalised groups, especially Black women, more than radiologists.
doi.org/10.1126/sciadv.adq0305
#SciComm, #AI, #HealthEquity
Demographic bias of expert-level vision-language foundation models in medical imaging
Compared to certified radiologists, expert-level AI models show notable and consistent demographic biases across pathologies.
https://doi.org/10.1126/sciadv.adq0305
Do foundation models in medical imaging see everyone fairly? Excited to share our new Science Advances paper uncovering & auditing demographic biases of expert-level VLMs, and comparing them to board-certified radiologists.
Paper: science.org/doi/10.1126/sciadv.adq0305
Code: github.com/YyzHarry/vlm-fairness
(1/)
about 1 year ago
reposted by
Yuzhe Yang
Eric Topol
about 1 year ago
How multimodal A.I. of real-time smartwatch data can automatically detect a person's loss of pulse (sudden cardiac death) and notify emergency services
www.nature.com/articles/s41...
@jakesunshine.bsky.social
@nature.com
Automated loss of pulse detection on a consumer smartwatch - Nature
Nature - Automated loss of pulse detection on a consumer smartwatch
https://www.nature.com/articles/s41586-025-08810-9
reposted by
Yuzhe Yang
Su-In Lee
over 1 year ago
Just published in Nature Biomedical Engineering! Working with the incredible PhD student Wei Qiu and our brilliant collaborator Kamila Naxerova at Harvard was a great pleasure. Our deep profiling framework enables us to view 18 human cancers through the lens of AI!
www.nature.com/articles/s41...
Deep profiling of gene expression across 18 human cancers - Nature Biomedical Engineering
Using unsupervised deep learning to generate low-dimensional latent spaces for gene-expression data can unveil biological insight across cancers.
https://www.nature.com/articles/s41551-024-01290-8
reposted by
Yuzhe Yang
Eric Topol
over 1 year ago
A neurologist with 2 APOE4 copies tells us about his experience with #Alzheimers disease
washingtonpost.com/wellness/202...
reposted by
Yuzhe Yang
Su-In Lee
over 1 year ago
Seven years ago, Scott Lundberg presented our SHAP framework at the NeurIPS 2017 conference. Since then, SHAP has become one of the most widely used feature attribution methods, with our paper receiving approximately 30,000 citations. It's wonderful that SHAP's birthday aligns perfectly with mine!
I will be at #NeurIPS and #ML4H all next week, so let me know if you would like to catch up in person! I am also recruiting PhD students! Drop me an email if you're attending NeurIPS and would like to chat or learn more.
over 1 year ago
reposted by
Yuzhe Yang
Su-In Lee
over 1 year ago
Here is a #compbio starter kit!
go.bsky.app/QVPoZXp
To all the #Bioinformatics, #Genomics, #MachineLearning folks: please RP and let's build this together!
Check out our latest work on scaling foundation models for large-scale multimodal wearable sensor data!
Scaling wearable foundation models
https://research.google/blog/scaling-wearable-foundation-models/
over 1 year ago
Hello world! I'm recruiting ~3 PhD students for Fall 2025 at UCLA. Please apply to UCLA CS or CompMed if you are interested in ML and (Gen)AI for healthcare / medicine / science. See my website for more on my research & how to apply:
people.csail.mit.edu/yuzhe
over 1 year ago