Thomas Letan
@lthms.soap.coffee
📤 144
📥 123
📝 253
Technical debt producer. He/him.
https://soap.coffee/~lthms
pinned post!
My notes on a recent experiment: Peer-Programming in Modern #OCaml with ChatGPT and Gemini. A devlog of sorts. I wanted to share what it was like to use LLMs to write a small tool in a more “niche” language.
Peer-Programming in Modern OCaml with ChatGPT and Gemini
A recollection of challenging myself to implement a simple tool to generate a summary from YouTube videos using Vosk for speech recognition and Ollama for generating summaries using LLMs running local...
https://soap.coffee/~lthms/posts/PeerProgrammingWithLLMs.html
7 months ago
3
8
2
reposted by
Thomas Letan
Gitea
8 days ago
Big shoutout to
@xeiaso.net
for this. We sponsor her (not nearly enough for the huge effort that developing this takes), and you can TOO at
github.com/sponsors/xe
0
19
4
I’d love to write a blogpost, but I have no idea what to write about 😅
1 day ago
0
0
0
There is something a bit infuriating about subscribing to a service, only for it to go offline minutes after your payment while its status page claims everything is fine(*).
(*) Though, let me tell you, having a good status page is pretty hard, I’m well aware.
1 day ago
0
0
0
I like giving talks, and I often prepare them the same way: I make a first (very bad) deck of slides, and then I rehearse and refine the slides as many times as necessary to find my flow. It’s not necessarily the most efficient approach, but I don’t know… I like the process.
2 months ago
0
1
0
Too much negativity on Twitter, so I deleted the app on an impulse, and here I am 😆
2 months ago
0
1
0
Alright, placing the pre-order.
3 months ago
0
0
0
reposted by
Thomas Letan
sabine 🐫🦀 / "use ocaml";
6 months ago
why not both? 🤔🐫
3
36
5
reposted by
Thomas Letan
Kiran
6 months ago
PSA! Please share around! Due to a limited number of submissions, we're extending the OCaml Workshop deadline by a week to July 10th AoE! Functional programmers! Heed my call! We need your submissions!!
0
12
13
Maybe my fix to the cohttp-lwt connection cache will be merged someday.
6 months ago
1
0
0
Can we talk about K-Pop Demon Hunters? No, I mean, really. Can we?? I’m kinda obsessed already
6 months ago
1
1
0
Omg, OpenTelemetry is so cool I’m gonna die
7 months ago
1
0
0
reposted by
Thomas Letan
Yaron Minsky
7 months ago
I'm pleased to announce OxCaml! OxCaml is Jane Street's branch of OCaml. We've given it a new name and a snazzy logo, and done a bunch of work to make it easy for people to try.
5
108
43
The
#OCaml
library for tracing using OpenTelemetry is pretty good.
7 months ago
1
0
0
MPL is really my favorite license. Do whatever with my code, build a proprietary product if you want—I’d be flattered, even. But if you end up having to modify it, publish the changes!
7 months ago
0
0
0
cohttp-connpool-lwt-unix is now a thing :D
7 months ago
0
0
0
Step 1. Write tests to find bugs in my new library.
Step 2. Run into a bug in one dependency of my new library.
Took me a while, but I managed to pinpoint the issue.
Fix `Connection_cache.call` hanging on unreachable server by lthms · Pull Request #1113 · mirage/ocaml-cohttp
The Connection_cache.call function could hang indefinitely when attempting to establish a connection to an unreachable server. This change resolves that issue by ensuring that the connection ...
https://github.com/mirage/ocaml-cohttp/pull/1113
7 months ago
1
0
0
I think Cohttp_lwt_unix.Server does not close persistent connections when it’s stopped ):
7 months ago
0
0
0
I’ve started a Lwt_unix backend for cohttp-connpool. I have the perfect use case in mind for it 😄 but that’ll have to wait for the end of the weekend.
GitHub - lthms/cohttp-connpool: Connection pool for `cohttp` client backends, enabling reuse of HTTP connections
Connection pool for `cohttp` client backends, enabling reuse of HTTP connections - lthms/cohttp-connpool
https://github.com/lthms/cohttp-connpool/tree/main
7 months ago
0
0
0
Embracing the future
Migrate to `dune pkg` for project setup and builds · lthms/cohttp-connpool-eio@2e021d0
This commit transitions the project's dependency management from a custom `Makefile` to the integrated `dune pkg` system. The `Makefile` has been entirely removed, and instructions for ...
https://github.com/lthms/cohttp-connpool-eio/commit/2e021d0142bab61645564c56668feb239d85a06e
7 months ago
0
0
0
The daemon has been running for 5 months without any issue \o/. (I take joy where I can find it)
7 months ago
1
0
0
Okay, so. I still expect a few hiccups, but so far, `dune pkg lock` already seems like a solid solution
7 months ago
2
2
1
Tried `dune pkg lock` this morning. `dune exec examples/main.exe` took as much time as the `dune build` that came before it 🤔 need to dig into that a little
7 months ago
1
1
1
Eio.Private.Trace.with_span is not reentrant ):
7 months ago
0
0
0
Okay, now I can make calls to Ollama with peace of mind, knowing I can reuse HTTP connections 🥸
7 months ago
0
0
0
I still need to pick a license (probably MPL), but here is cohttp-connpool-eio:
github.com/lthms/cohttp...
Feedback welcome!
GitHub - lthms/cohttp-connpool-eio: Connection pool for `cohttp-eio`, enabling reuse of HTTP connections
Connection pool for `cohttp-eio`, enabling reuse of HTTP connections - lthms/cohttp-connpool-eio
https://github.com/lthms/cohttp-connpool-eio/tree/main
7 months ago
1
1
0
Okay, eio-trace is pretty awesome
7 months ago
1
0
0
Btw, Eio’s latest commit is 4 months old?
7 months ago
0
0
0
Wonder why `Eio.Pool` does not have `get : sw:Switch.t -> 'a t -> 'a`
7 months ago
1
1
0
Where it started: I want to use Ollama from OCaml.
Where it went: I can do two HTTP requests from the same connection 🎉
7 months ago
0
1
1
Why does Eio.Net.E have a Connection_reset variant, while Cohttp_eio uses failwith "connection reset by peer" when the connection is reset?
7 months ago
0
1
0
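A minimal sketch of handling both shapes of that error, assuming a hypothetical `do_request` thunk; the `Failure` message is the one quoted above, so this is illustrative rather than an exhaustive match.

```ocaml
(* Sketch: treat a reset connection the same way whether it surfaces as Eio's
   structured error or as the bare Failure raised by cohttp-eio. *)
let with_reset_retry do_request =
  try do_request () with
  | Eio.Exn.Io (Eio.Net.E (Eio.Net.Connection_reset _), _)
  | Failure "connection reset by peer" ->
    (* the server dropped the connection; reconnect or retry once *)
    do_request ()
```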
How can I make sure cohttp-eio reuses already established connections? From what I’m seeing through tcpdump, it looks like it systematically closes connections to localhost at least.
7 months ago
1
0
0
Now that I can use Vosk to transcribe a YouTube video, let’s try using Ollama to get a summary. Good use case to try cohttp-eio for making HTTP requests
7 months ago
0
0
0
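A rough sketch of what that call could look like, assuming cohttp-eio’s `Client.post`, Ollama’s default `/api/generate` endpoint on port 11434, and a placeholder “mistral” model; the hand-rolled JSON is only for illustration (a real program would build it with `yojson` and parse the `"response"` field out of the reply).

```ocaml
(* Sketch: ask a local Ollama instance for a summary over HTTP with cohttp-eio.
   The transcript is spliced into the JSON as-is, so it must not contain quotes. *)
let summarize ~env ~sw transcript =
  let client = Cohttp_eio.Client.make ~https:None env#net in
  let payload =
    Printf.sprintf
      {|{"model": "mistral", "prompt": "Summarize this transcript: %s", "stream": false}|}
      transcript
  in
  let _resp, body =
    Cohttp_eio.Client.post client ~sw
      ~body:(Eio.Flow.string_source payload)
      (Uri.of_string "http://localhost:11434/api/generate")
  in
  (* the reply is a JSON object whose "response" field holds the summary *)
  Eio.Buf_read.(of_flow body ~max_size:max_int |> take_all)
```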
Demo project to call Vosk (offline speech recognition) from OCaml.
github.com/lthms/ocaml-...
I tried it on a few YouTube videos; it seems to work quite well. Eio is actually nice, although it’s way too easy to mess up and read from a closed file ): (but maybe that’s because I like `Seq.t`)
loading . . .
GitHub - lthms/ocaml-vosk
Contribute to lthms/ocaml-vosk development by creating an account on GitHub.
https://github.com/lthms/ocaml-vosk
7 months ago
0
1
0
Ah, forgot about this one. It’s not merged yet 🥲 I’ve mentioned this on Lobste.rs already: I’m a little concerned about Dream’s future, which is an issue for OCaml, because Dream remains a pretty awesome project that the ecosystem needs, IMO.
7 months ago
0
2
0
It’s almost June and I haven’t published a single blogpost since my 2024 retrospective. Not great.
7 months ago
0
1
0
My plan for this week is to build a strong understanding of Irmin (
irmin.org
).
Irmin
https://irmin.org/
7 months ago
1
2
0
Any feedback about the CodeCompanion plugin (AI integration) for neovim? I think I’ll give it a try
7 months ago
1
1
0
reposted by
Thomas Letan
Alice ✨
7 months ago
this actually works way too well omg
5
50
13
TIL Packer is no longer maintained 🥲
8 months ago
0
1
0
reposted by
Thomas Letan
Techaro
9 months ago
Anubis is trusted by:
* SourceHut
* Gitea servers all over the world
* kernel.org
* freebsd.org
* UNESCO
It's probably good enough for your community too!
5
94
15
reposted by
Thomas Letan
Gergely Orosz
9 months ago
Can you tell what was the point where I
1. Manually blocked a few AI crawlers
2. Moved my nameserver to Cloudflare plus turned on AI crawler blocking
My site w/ modest traffic (~1,000 visitors per day) went from serving ~1GB/hr (to tons of AI crawlers!) to ~50MB/hr in traffic
4
135
23
Read a blogpost where the code snippets were images. Who does that?
10 months ago
3
3
0
In case you didn’t know: Lwt_throttle is pretty cool when you need to interact with a rate-limited API.
10 months ago
0
2
0
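A minimal sketch of that pattern, assuming a hypothetical `fetch` function and made-up limits; `Lwt_throttle.wait` resolves to `false` when too many calls are already queued for a key.

```ocaml
(* Sketch: cap requests to a rate-limited API at 5 per second per host. *)
module Throttle = Lwt_throttle.Make (struct
  type t = string
  let equal = String.equal
  let hash = Hashtbl.hash
end)

(* rate: promises per second per key; max: queue length; n: expected key count *)
let limiter = Throttle.create ~rate:5 ~max:100 ~n:1

let throttled_fetch fetch uri =
  let open Lwt.Syntax in
  let* admitted = Throttle.wait limiter (Uri.host_with_default uri) in
  if admitted then fetch uri
  else Lwt.fail_with "too many requests already queued for this host"
```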
I hope one day my brain will remember it’s “achieve” not “acheive.”
11 months ago
0
1
0
There is one game that will make me buy a PS5: The Last of Us Part 3. No questions asked.
11 months ago
0
2
0
I removed Slack from my personal phone yesterday. I keep unlocking my phone out of sheer reflex 😅 hopefully this will pass
11 months ago
1
2
0
reposted by
Thomas Letan
Louis Pilfold
11 months ago
You're gunna enjoy the next Gleam release
3
81
8
That wasn’t too bad.
Support h2.0.13.0 by lthms · Pull Request #381 · aantron/dream
The hard constraint on h2 prevents us from updating Dream to its latest beta. I’d like to propose this minimal patch to adapt Dream-httpaf to the breaking change of h2 0.13. There might be better way t...
https://github.com/aantron/dream/pull/381
12 months ago
0
3
1
Note to self: look for
CONTRIBUTING.md
before complaining
12 months ago
1
4
0