@kumita lots of particles bouncing in a box move, to some extent, randomly. though exceedingly unlikely, it's possible, then, that they will at some point all find themselves concentrated in one corner of the box, rather than distributed equally
so the problem is just compounding probabilities: the odds against a decrease in entropy stack higher with each bit of stuff added
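the compounding can be sketched numerically. a toy assumption (not a real physics model): each particle independently sits in the left half of the box with probability 1/2, so the chance that all n of them are there at once is (1/2)^n, which collapses fast as n grows:

```python
# toy sketch: probability that all n independent particles are
# simultaneously in the left half of the box, assuming each has
# probability 1/2 of being there (hypothetical simplification)
def prob_all_left(n: int) -> float:
    return 0.5 ** n

for n in (1, 10, 100):
    print(n, prob_all_left(n))
```

at n=100 the probability is already below 1e-30; a macroscopic amount of stuff (~10^23 particles) makes the exponent absurd, which is why you never see it happen.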
@mono@augustus it is a fundamental problem in the current systems, though. all that's really changing is twiddling a few knobs and throwing bigger datasets at things to try to drive down uncertainty as far as possible. but some things just inherently have small datasets
@augustus@mono matching those video samples and matching two text samples are the same problem, so both are doable, and regularly done, now
the problem is that these "ais" have no intuition. they can become reasonably to very accurate, depending on the training set, but have no way to "step back" and see when they're doing something stupid in those 1-10% of bad matches. they're just not-especially-complex-in-principle-but-filled-with-a-mountain-of-fiddly-details-in-practice black boxes that estimate probabilities, brute-forcing things with a mountain of if-then conditions
and so things like these pictures happen
everyone's using them anyways, though, because they "mostly" work, meaning people get caught by those false positives and penalised for nothing, seemingly at random, by not-actually-entities who can't be talked out of doing stupid things
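the "mostly works" trap is easy to put numbers on. a hedged sketch with made-up figures (both the error rate and the volume are hypothetical, just for scale): even a matcher that's right 98% of the time, run over a million items, mangles a lot of people:

```python
# hypothetical numbers, purely for illustrating scale
error_rate = 0.02          # matcher is wrong 2% of the time (assumed)
items_checked = 1_000_000  # volume it gets applied to (assumed)

# expected number of bad matches, i.e. people penalised for nothing
expected_bad_matches = error_rate * items_checked
print(expected_bad_matches)  # 20000.0
```

twenty thousand false hits from a system everyone calls "accurate", and no one on the other end with the intuition to notice any single one of them is stupid.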
@velartrill the point is that information is a physical action which, like any other, can have a "significant" effect only on some subset of physical objects
speaking to a rock produces no "significant" result, in the same way as punching the earth
@velartrill i might talk someone into a suicide attempt, or say something that's known to be distracting at a moment when that person might be hurt through inattention (walking/driving into a pole)