I’d like to avoid building a completely new server just to run a GPU for AI workloads. Currently everything runs on my laptop, except its dated CPU only really works well with smaller models.

Now I have an Nvidia M40. Could I possibly get it working over Thunderbolt with an eGPU enclosure or something? Note: it’s on Linux.
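For anyone attempting this, a rough sanity-check sequence on Linux (assuming the `bolt` and `pciutils` packages are installed, and that the enclosure enumerates as a standard Thunderbolt PCIe device) might look like:

```shell
# List Thunderbolt devices; the enclosure must show as "authorized"
# (use `boltctl authorize <uuid>` or enroll it if it does not).
boltctl list

# The M40 is a Tesla card, so it should appear as a "3D controller"
# rather than a VGA adapter.
lspci -nn | grep -i nvidia

# If the proprietary driver is installed and bound, the card shows up here.
nvidia-smi
```

Note the M40 is a passively cooled datacenter card with no display outputs, so beyond Thunderbolt enumeration you’d also need an enclosure (or mod) that provides adequate airflow and enough PCIe power.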

  • hendrik@palaver.p3x.de · 12 days ago

    I did some quick googling. Are those Thunderbolt docks really $350? That’s like half the price of a cheap computer?!

      • hendrik@palaver.p3x.de · 12 days ago

        Maybe you should do the maths on other options. You could get a refurbished PC for $350. Or buy the dock anyway. Or spend the money on cloud compute if you’re only using AI occasionally. Idk.

        • Boomkop3@reddthat.com (OP) · 12 days ago

          I did not say occasionally. We use AI a lot. Currently it’s mostly for image indexing, recognition, object detection, and audio transcription. But we’re planning to expand to more, and we’d like to use models that are more accurate.