Stanford Research Computing
@stanford-rc.bsky.social
📤 56 · 📥 1 · 📝 6
Advancing science, one batch job at a time.
https://srcc.stanford.edu
H200 GPUs are now available on Sherlock:
news.sherlock.stanford.edu/publications...
#Sherlock
#HPC
#Stanford
Introducing SH4_G8TF64.1, now with 8x H200 GPUs - Sherlock changelog
We are excited to announce the immediate availability of a powerful new node configuration to accelerate your GPU workloads on Sherlock: SH4_G8TF64.1. Featuring 8x NVIDIA H200 Tensor Core GPUs, this n...
https://news.sherlock.stanford.edu/publications/introducing-sh4_g8tf64-1-now-with-8x-h200-gpus
4 months ago
0
1
0
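For anyone wanting to try the new nodes, here is a minimal sketch of how a job requesting all eight H200 GPUs could be submitted programmatically. The partition name and the `GPU_SKU:H200` constraint are assumptions made for illustration, not values taken from the announcement; check the Sherlock documentation for the exact settings.

```python
# Minimal sketch: submit a Slurm job asking for all 8 GPUs on an H200 node.
# The partition name and feature constraint are assumptions for illustration.
import subprocess
import textwrap

job_script = textwrap.dedent("""\
    #!/bin/bash
    #SBATCH --job-name=h200-test
    #SBATCH --partition=gpu            # assumed partition name
    #SBATCH --constraint=GPU_SKU:H200  # assumed node feature for H200 nodes
    #SBATCH --gres=gpu:8               # request all 8 GPUs on the node
    #SBATCH --time=00:10:00

    nvidia-smi --query-gpu=name,memory.total --format=csv
""")

# sbatch reads the job script from stdin and prints the new job ID.
result = subprocess.run(["sbatch"], input=job_script, text=True,
                        capture_output=True, check=True)
print(result.stdout.strip())
```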
We're back to job #1 again!
news.sherlock.stanford.edu/publications...
#Sherlock
#HPC
#Stanford
Back to job #1, thrice - Sherlock changelog
Not once, not twice, but three times! For the third time in Sherlock’s history, the Slurm job ID counter was reset over the weekend, and went from job #67,043,327 all the way back to job #1! JobIDRaw...
https://news.sherlock.stanford.edu/publications/back-to-job-1-thrice
6 months ago
0
2
0
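The changelog entry mentions Slurm's JobIDRaw accounting field, which sacct can report alongside JobID. One practical consequence of a counter reset is that job IDs are no longer in chronological order, so a generic sketch like the one below sorts recent accounting records by submit time instead of by ID. The time window and field list are just examples, not commands taken from the changelog.

```python
# After a job ID counter reset, numeric job IDs are no longer monotonically
# increasing, so sort accounting records by submit time rather than by JobID.
# Generic sacct sketch; the one-week window is an arbitrary example.
import subprocess

out = subprocess.run(
    ["sacct", "-X", "--noheader", "--parsable2",
     "--starttime", "now-7days",
     "--format=JobID,JobIDRaw,Submit,State"],
    capture_output=True, text=True, check=True,
).stdout

rows = [line.split("|") for line in out.splitlines() if line]
rows.sort(key=lambda r: r[2])  # sort by Submit timestamp, not by JobID
for jobid, jobid_raw, submit, state in rows:
    print(f"{submit}  job {jobid} (raw {jobid_raw})  {state}")
```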
Introducing a new service partition on Sherlock
news.sherlock.stanford.edu/publications...
#Sherlock
#HPC
#Stanford
Introducing a new service partition on Sherlock - Sherlock changelog
We’re very pleased to introduce a new service partition on Sherlock, specially designed to run non-computational management and administrative tasks. Jobs like data transfer tasks, backups, CI/CD pi...
https://news.sherlock.stanford.edu/publications/introducing-a-new-service-partition-on-sherlock
7 months ago
0
3
0
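As a rough illustration of the kind of non-computational work the new partition targets, here is a sketch of a small data-transfer job. The partition name `serv`, the rsync destination, and the `$SCRATCH`/`$OAK` paths are assumptions for the example; use the names given in the changelog entry and your own storage paths.

```python
# Sketch: run a lightweight data-transfer task on the new service partition.
# The partition name "serv" and the rsync source/destination are assumptions
# made for illustration only.
import subprocess
import textwrap

job_script = textwrap.dedent("""\
    #!/bin/bash
    #SBATCH --job-name=nightly-backup
    #SBATCH --partition=serv       # assumed name of the service partition
    #SBATCH --cpus-per-task=1      # management tasks need few resources
    #SBATCH --time=02:00:00

    rsync -a --delete "$SCRATCH/results/" "$OAK/backups/results/"
""")

subprocess.run(["sbatch"], input=job_script, text=True, check=True)
```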
An update about our plans to retire Sherlock 2.0
news.sherlock.stanford.edu/publications...
#Sherlock
#HPC
#Stanford
An update about our plans to retire Sherlock 2.0 - Sherlock changelog
We wanted to share an important update about the future of some of Sherlock’s oldest compute nodes, in light of some of the more recent and worsening political and economic conditions. As many of you...
https://news.sherlock.stanford.edu/publications/an-update-about-our-plans-to-retire-sherlock-2-0
8 months ago
0
1
0
reposted by
Stanford Research Computing
Stéphane Thiell
8 months ago
@stanford-rc.bsky.social was proud to host the Lustre User Group 2025, organized with OpenSFS! Thanks to everyone who participated and our sponsors! Slides are already available at
srcc.stanford.edu/lug2025/agenda
🤘Lustre!
#HPC
#AI
0
10
3
reposted by
Stanford Research Computing
Stéphane Thiell
9 months ago
Join us for the Lustre User Group 2025, hosted by @stanford-rc.bsky.social in collaboration with OpenSFS. Check out the exciting agenda! 👉
srcc.stanford.edu/lug2025/agenda
LUG 2025 Agenda
https://srcc.stanford.edu/lug2025/agenda
0
7
5
reposted by
Stanford Research Computing
Stéphane Thiell
9 months ago
Getting things ready for next week's Lustre User Group 2025 at Stanford University!
0
6
1
Doubling the FLOPs, another milestone for Sherlock's performance
news.sherlock.stanford.edu/publications...
#Sherlock
#HPC
#Stanford
Doubling the FLOPs, another milestone for Sherlock's performance - Sherlock changelog
We’re proud to announce that Sherlock has reached another significant performance milestone. Building on past successes, Sherlock continues to evolve and expand, integrating new technologies and enhan...
https://news.sherlock.stanford.edu/publications/doubling-the-flops-another-milestone-for-sherlocks-performance
9 months ago
1
4
1
reposted by
Stanford Research Computing
Stéphane Thiell
10 months ago
ClusterShell 1.9.3 is now available in EPEL and Debian. Not using clustershell groups on your #HPC cluster yet?! Check out the new bash completion feature! Demo recorded on Sherlock at @stanford-rc.bsky.social with ~1,900 compute nodes and many group sources!
asciinema.org/a/699526
clustershell bash completion (v1.9.3)
This short recording demonstrates the bash completion feature available in ClusterShell 1.9.3, showcasing its benefits when using the clush and cluset command-line tools.
https://asciinema.org/a/699526
0
9
6
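The clush and cluset tools shown in the demo are front-ends over ClusterShell's Python API, so the same node-set folding and set operations are available directly from Python. A small sketch follows; the node names are made up for illustration and are not Sherlock's actual hostnames.

```python
# Sketch of ClusterShell's NodeSet API, which backs the cluset/clush CLI
# tools shown in the demo. Node names here are illustrative only.
from ClusterShell.NodeSet import NodeSet

# Fold an explicit list of hosts into the compact range syntax...
nodes = NodeSet.fromlist(["sh03-01n01", "sh03-01n02", "sh03-01n03", "sh03-01n71"])
print(nodes)        # -> sh03-01n[01-03,71]
print(len(nodes))   # -> 4

# ...and expand a folded expression back into individual host names.
for host in NodeSet("sh03-01n[01-03]"):
    print(host)

# Set operations work too, e.g. excluding nodes that are down.
up = NodeSet("sh03-01n[01-64]") - NodeSet("sh03-01n[12,40]")
print(up)           # -> sh03-01n[01-11,13-39,41-64]
```

The cluset and clush commands accept the same folded syntax on the command line, which is where the new bash completion for node names and group sources comes in.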
reposted by
Stanford Research Computing
Stéphane Thiell
10 months ago
We started it!
blocksandfiles.com/2025/01/28/s...
Check out my LAD'24 presentation:
www.eofs.eu/wp-content/u...
1
7
6
reposted by
Stanford Research Computing
Stéphane Thiell
about 1 year ago
Newly announced at the #SC24 Lustre BoF! Lustre User Group 2025, organized by OpenSFS, will be hosted at Stanford University on April 1-2, 2025. Save the date!
0
10
9
reposted by
Stanford Research Computing
Stéphane Thiell
11 months ago
Just another day for Sherlock's home-built scratch Lustre filesystem at Stanford: crushing it with 136+ GB/s aggregate read on a real research workload! 🚀
#Lustre
#HPC
#Stanford
0
24
3
Hello BlueSky!
11 months ago
2
9
3