Largely, LLM developers combat this. Data quality tends to be prioritized over sheer quantity: human knowledge workers and domain experts are contracted to produce high-quality labels and fresh documents. And a key innovation of transformers is that attention lets the model weigh each token's relationship to every other token in the sequence, so meaning is resolved in context.
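To make that last point concrete, here is a minimal sketch of scaled dot-product attention, the core operation behind the transformer's context handling. The function name, shapes, and toy data are illustrative, not any particular library's API:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each query token scores every key token, so the output at a
    position is a context-weighted blend of all the value vectors."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (seq, seq) pairwise relevance
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V  # mix values according to relevance

# Toy example: 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V
print(out.shape)  # (4, 8) -- one contextualized vector per token
```

Because every position attends to every other position, a word like "bank" ends up represented differently next to "river" than next to "loan", which is the context sensitivity the paragraph above refers to.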