Damus
Troed Sångberg · 10w
nostr:nprofile1qy2hwumn8ghj7un9d3shjtnyd968gmewwp6kyqpq7qes6mstpcsn6rg3w9fwnsau68sw9h9nga9zjy3htmegg27na6wsjd3n63 This is exactly why I don't see how current LLMs (statically trained) can ever produce secure code: there's so little of it in the training data, and you can't really tag it as such...
Carl · 10w
nostr:nprofile1qy2hwumn8ghj7un9d3shjtnyd968gmewwp6kyqpq7qes6mstpcsn6rg3w9fwnsau68sw9h9nga9zjy3htmegg27na6wsjd3n63 Crying crocodile tears much?