It seems to me they did the current balancing on the 3090 because they had to. Cost wasn’t the driving factor when the goal was to push as much power as possible to the card, and since they don’t control any aspect of the PSU, they couldn’t be sure how it would deliver that power.
Once they rolled out the 12VHPWR standard they got those sense pins, and someone looked at the design and said hey, we can save $$/card by not doing our own power management. If we’re being charitable to Nvidia, it might also have been removed because power switching is probably electrically noisy and a source of heat, so they wanted it away from the video card.
But they left two shunt resistors in there for a tiny bit of redundancy.
Then someone said hey, this resistor is doing nothing and we need the board space for… any number of reasons, or they wanted to save a few pennies, and it got axed.
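To make the stakes concrete, here’s a minimal sketch of why those shunts mattered. This is not Nvidia’s actual firmware; the group count, ADC helper, limits, and readings are all made up for illustration. The point is just the logic: with per-group shunts the card can see one wire group hogging current, while with a single shunt it only ever sees the total.

```c
/* Minimal sketch (not Nvidia's actual firmware) of why per-group shunt
 * resistors matter. The group count, limits, and readings are all made
 * up for illustration; only the logic is the point. */
#include <stdio.h>
#include <stdlib.h>

#define NUM_SHUNTS       3     /* 3090-style: the 12V pins split into groups */
#define PER_GROUP_MAX_MA 9500  /* illustrative per-group limit, not a spec value */

/* Stand-in for an ADC read of the current through shunt `i`, in mA. */
static int read_shunt_ma(int i)
{
    static const int sample[NUM_SHUNTS] = { 7800, 8100, 7900 };
    return sample[i];
}

int main(void)
{
    int total = 0;
    for (int i = 0; i < NUM_SHUNTS; i++) {
        int ma = read_shunt_ma(i);
        total += ma;
        /* Per-group sensing makes one overloaded wire group visible, so
         * the card can throttle or shut down before a pin overheats. */
        if (ma > PER_GROUP_MAX_MA) {
            fprintf(stderr, "group %d overcurrent: %d mA\n", i, ma);
            return EXIT_FAILURE;
        }
    }
    /* With a single shunt you only ever see this total; six pins could
     * split the same load as 45 A / 1 A / 1 A / ... and it would look fine. */
    printf("total draw: %d mA\n", total);
    return EXIT_SUCCESS;
}
```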
The sense pins are only there to inform the graphics card of the maximum power it can draw from the PSU, not to do power management or load balancing. And if someone at Nvidia misunderstood their own standard and it made it past the CE, that still excuses nothing: it’s their own standard, so either they have bad engineers or they shouldn’t trust their AI to develop engineering specs for them.
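For reference, here’s roughly what the sense pins encode, as I read the ATX 3.0 / PCIe CEM 5.0 table (check the spec yourself before trusting the exact wattages). Each sense pin is just grounded or left open in the connector, and the four combinations map to an advertised power ceiling. Nothing in there says anything about how current is shared across the six 12V pins:

```c
/* Sketch of the 12VHPWR sense-pin encoding as I understand the ATX 3.0
 * table; verify the wattages against the actual spec. true = grounded. */
#include <stdio.h>
#include <stdbool.h>

static int advertised_max_watts(bool sense0_gnd, bool sense1_gnd)
{
    if (sense0_gnd && sense1_gnd)   return 600;
    if (!sense0_gnd && sense1_gnd)  return 450;
    if (sense0_gnd && !sense1_gnd)  return 300;
    return 150; /* both pins open: the minimum tier */
}

int main(void)
{
    /* Walk all four pin combinations. Note this only caps total draw;
     * it does no balancing of current across the individual 12V pins. */
    for (int s0 = 0; s0 <= 1; s0++)
        for (int s1 = 0; s1 <= 1; s1++)
            printf("SENSE0=%-4s SENSE1=%-4s -> %d W max\n",
                   s0 ? "Gnd" : "Open", s1 ? "Gnd" : "Open",
                   advertised_max_watts((bool)s0, (bool)s1));
    return 0;
}
```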
Either way, it’s a bad design and a bad standard.
How anyone in leadership gave the OK for a connector this asinine is beyond me. On a technical level, that’s a very real fire hazard.