haxor@derp.foo to Hacker News@derp.foo · English · 10 months ago
NY Times is asking that ALL LLMs trained on Times data be destroyed (twitter.com)
that guy@lemmy.world · English · 10 months ago (edited)
And if they are, 2 seconds later someone can train a new one. Maybe they should learn to code like those coal miners they pitied.
Lvxferre@lemmy.ml · English · 10 months ago

> 2 seconds later someone can train a new one

“Training” datasets:

- GPT-3: 300? 500? billion tokens
- Bard: 1.5 trillion words
- GPT-4: 13 trillion tokens

Does this look like the amount of content that you’d get in two seconds???

> Maybe they should learn to code like those coal miners they pitied.

And maybe you should go back to Reddit.
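For scale, here is a minimal back-of-envelope sketch of what a single training pass over those datasets would take. The token counts come from the list above; the throughput figure is a loose assumption (roughly the order of magnitude a large GPU cluster sustains for a GPT-3-scale model), not a measured number:

```python
# Rough estimate: time for one training pass over each dataset.
# Token counts are from the comment above; the throughput is an
# assumed ballpark figure -- real numbers depend on model size
# and hardware, but any plausible value makes the same point.

DATASETS = {
    "GPT-3": 400e9,   # ~300-500 billion tokens (midpoint guess)
    "Bard": 1.5e12,   # 1.5 trillion words, treated as tokens here
    "GPT-4": 13e12,   # 13 trillion tokens
}

TOKENS_PER_SECOND = 1e5  # assumed aggregate training throughput

for name, tokens in DATASETS.items():
    seconds = tokens / TOKENS_PER_SECOND
    print(f"{name}: ~{seconds / 86_400:,.0f} days, not 2 seconds")
```

Even under this generous assumption, one pass is measured in weeks to years (roughly 46 days for GPT-3, 1,500 for GPT-4), before counting the time to collect and clean that much text in the first place.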