Ooh, ooh, game engines? No. Physics models? Nope, not that. Cryptography, maybe useful against the coming Quantum Cryptodoom™? No, not that either. DSP? Image compression? Something? Hmmmm, what could benefit from faster matrix maths? What one singular thing could be so important that- it’s a meme, of course it’s a freaking meme. Ayyy Aaaiiiiyyyeeee must be the only possible thing of interest because that’s the latest meme fad thing >:( grumble grumble grouch bite et cetera
Yes, I will hate every single meme-fad-thing as it happens unless it involves kittens. Or maybe one of a few other things, but NOT THAT. Hmph! Grr! And so on!
Also, reading the article, the immediate practical implications of this improvement are almost nonexistent. This is a theoretical breakthrough that may or may not lead to further theoretical breakthroughs, which in turn may or may not be more practically relevant.
Certainly important research, but nothing that AI people (or any other scientists) must celebrate. Feeds the AI hype though.
Machine learning might be the biggest type of computation by volume to benefit from this, so it’s not that silly. With hardware gains declining, we’re back to optimizing in software, which is preferable to gobbling up ever more energy.
Adding AeIeeee to the paper is the only way for a scientist in such a field not to starve. They have to highlight some immediate impact of their research in order to receive funding and get invited to conferences. Did it myself, didn’t like it, but that’s how the system works, unfortunately.
If this actually did lead to faster matrix multiplication, then essentially anything that can be done on a GPU would benefit. That definitely could include games and physics models, along with a bunch of other applications (and yes, also AI stuff).
I’m sure the paper’s authors know all of that, but somewhere along the line the article just became “faster and better AI”.
“could lead to faster, more efficient”
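For anyone curious what this kind of “faster matrix multiplication” result looks like concretely: the classic illustration of the same family of ideas (not the new result from the article, just the textbook example of trading multiplications for additions) is Strassen’s algorithm, which multiplies 2×2 block matrices with 7 recursive multiplications instead of 8. A rough Python sketch:

```python
import numpy as np

def strassen(A, B):
    """Multiply square matrices (size a power of two) with Strassen's scheme:
    7 recursive block multiplications instead of the naive 8."""
    n = A.shape[0]
    if n <= 64:  # below some cutoff, plain multiplication wins in practice
        return A @ B

    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]

    # The seven products, each a recursive call
    M1 = strassen(A11 + A22, B11 + B22)
    M2 = strassen(A21 + A22, B11)
    M3 = strassen(A11, B12 - B22)
    M4 = strassen(A22, B21 - B11)
    M5 = strassen(A11 + A12, B22)
    M6 = strassen(A21 - A11, B11 + B12)
    M7 = strassen(A12 - A22, B21 + B22)

    # Recombine into the four quadrants of the result (only adds/subtracts)
    C11 = M1 + M4 - M5 + M7
    C12 = M3 + M5
    C21 = M2 + M4
    C22 = M1 - M2 + M3 + M6
    return np.block([[C11, C12], [C21, C22]])

rng = np.random.default_rng(0)
A = rng.standard_normal((256, 256))
B = rng.standard_normal((256, 256))
assert np.allclose(strassen(A, B), A @ B)
```

The newer results in this area play the same game with bigger block sizes and cleverer decompositions; whether any of them beat heavily tuned GPU kernels on real hardware is a separate question from the theory.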