Neshura

Just some IT guy

  • 40 Posts
  • 997 Comments
Joined 1 year ago
Cake day: June 12th, 2023

  • Neshura to Technology@lemmy.world · Reasoning failures highlighted by Apple research on LLMs (edited · 10 days ago)

    Last I checked (which was a while ago), “AI” still can’t pass the most basic of tasks, such as “show me a blank image” / “show me a pure white image”. The LLM will output the most intense fever dream possible, but never a simple rectangle filled with #fff-coded pixels. I’m willing to debate the potential of AI again once they manage that without those “benchmarks” getting special attention in the training data.
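    For contrast, the task the comment describes is trivial in conventional code. A minimal sketch, using only the Python standard library and the Netpbm PPM format (the filename and 64×64 dimensions are arbitrary choices for illustration):

    ```python
    # Write a pure white 64x64 image as a binary PPM (P6) file --
    # the "blank white image" task described above, done deterministically.
    width, height = 64, 64
    header = f"P6\n{width} {height}\n255\n".encode("ascii")
    pixels = bytes([255, 255, 255]) * width * height  # every pixel is #ffffff
    with open("white.ppm", "wb") as f:
        f.write(header + pixels)
    ```

    Any image viewer that understands PPM will show a solid white rectangle; the point is that the output is exact and repeatable, which is what the benchmark asks for.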




  • In my experience that usually boils down to the Linux version getting (understandably) low priority compared to the Windows version, which is something that won’t change until Linux has a significant market share, and I can’t fault the devs for it. For an example of a game where that is not the case, take Factorio: the devs went out of their way to properly support the native Linux version and even included Linux/Mac-exclusive features in their next update. But that only happened because one of the devs uses Linux themselves.








  • Neshura to Selfhosted@lemmy.world · Network Switch (edited · 26 days ago)

    I somewhat disagree that you have to be a data hoarder for 10G to be worth it. For example, I’ve got a headless Steam client on my server that holds my larger games (about 2 TB in total, so not data-hoarder territory), which lets me install and update those games at ~8 Gbit/s. That in turn lets me run a leaner desktop PC, since I can uninstall the larger games as soon as I stop playing them daily, and it saves me time when Steam inevitably fails to auto-update a game on my desktop before I want to play it.

    Arguably a niche use case, but it exists alongside other such niche use cases. So if someone comes into this community and asks how best to implement 10G networking, I will assume they have (or at least think they have) such a use case on their hands and want to improve that situation a bit.
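    Rough back-of-the-envelope math on the numbers in the comment above (2 TB library, ~8 Gbit/s vs. a 1 Gbit/s link); real throughput will of course also depend on disks and protocol overhead:

    ```python
    # Naive transfer-time estimate: link speed is the only bottleneck,
    # decimal units (1 TB = 8e12 bits), no protocol overhead.
    def transfer_time_seconds(size_terabytes: float, link_gbit_s: float) -> float:
        bits = size_terabytes * 8e12
        return bits / (link_gbit_s * 1e9)

    for gbit in (1, 8):
        hours = transfer_time_seconds(2.0, gbit) / 3600
        print(f"{gbit} Gbit/s: {hours:.1f} h")
    ```

    Moving the whole 2 TB drops from roughly 4.4 hours to about half an hour, which is the difference between "overnight job" and "while I make coffee".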



  • Neshura to Selfhosted@lemmy.world · Network Switch (27 days ago)

    Personally, going 10G on my networking gear has significantly improved my self-hosting experience, especially for file transfers. 1G can just be extremely slow when you’re dealing with large amounts of data, so I don’t really understand why people recommend against 10G here, of all places.


  • Neshura to Selfhosted@lemmy.world · Network Switch (27 days ago)

    Yeah, they definitely could have been quicker with the patches, but as long as the patches come out before the articles, they’re above average in how they handle CVEs; way too many companies out there just don’t give a shit whatsoever.


  • Neshura to Selfhosted@lemmy.world · Network Switch (edited · 27 days ago)

    If I buy a switch and it decides to give me downtime in order to auto-update, I can tell you whose brand lands on my blacklist. Auto-updates absolutely increase security, but there are certain use cases where they are more of a hindrance than a feature. Want proof? Not even Cisco enables auto-update by default (and from what I’ve managed to find in this short time, neither does TRENDnet, which you’ve been speaking well of). A device deciding on its own to just fuck off and pull down your network is not in any way a feature their customers would want. If you don’t want the (slight) maintenance load that comes with a managed switch, don’t get one; get an unmanaged one instead.