I can never tell if something I'm trying with AI is foolish/mid or noteworthy. Most of the time, I'm pretty sure "well, someone has to have tried this before, it's pretty obvious, right?" Then 4 months later something similar is BREAKING NEWS. Half the time I'm out here rubbing two sticks together