• MudMan@kbin.social · 1 year ago

    It will, but stuff doesn’t get better linearly forever. That’s why everybody in the 50s thought we’d be living on Mars and have starships and flying cars by now. It’s also why a bunch of investors and nerds thought AI was the new social media at some point.

    Turns out most things get a lot better very fast and then a little better very slowly, and it’s very, very hard to know when that line is going to flip ahead of time.
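
    To make that concrete with a toy model (all numbers invented purely for illustration, assuming simple logistic growth): an S-curve and a plain exponential look almost identical until you are already at the bend, which is exactly why the flattening point is so hard to call ahead of time.

```python
# Toy illustration: early on, a logistic (S-shaped) growth curve is nearly
# indistinguishable from pure exponential growth, so the point where it
# flattens out is hard to predict from the early data alone.
# All parameters below are made up for illustration.
import math

def logistic(t, ceiling=1000.0, rate=1.0, midpoint=10.0):
    """S-curve: fast growth that eventually saturates at `ceiling`."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

def naive_extrapolation(t, ceiling=1000.0, rate=1.0, midpoint=10.0):
    """What you get if you assume the early exponential trend never bends."""
    return ceiling * math.exp(rate * (t - midpoint))

for t in range(0, 21, 2):
    s = logistic(t)
    e = naive_extrapolation(t)
    print(f"t={t:2d}  s-curve={s:10.1f}  extrapolation={e:14.1f}  ratio={e / s:10.2f}x")

# Early on the two are nearly identical; by the time they visibly diverge,
# the S-curve is already bending, and the extrapolation ends up off by
# orders of magnitude (about 22,000x by t=20 with these parameters).
```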

    • Kedly@lemm.ee · 1 year ago

      I do want to point out that while our tech doesn’t look amazing in the same way the 50s thought it would, it’s pretty amazing in its own ways. I’m writing this message to you on a glass obelisk physically connected to nothing, which lets me talk to my adopted family on the literal opposite end of the planet, who don’t speak the same language as me, with maybe a few seconds of delay (if that). And it’s more than 100,000 times more powerful than the computers that first got us to the moon, while being small enough to comfortably fit in my pocket.
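
      For scale, a rough back-of-envelope (the Apollo Guidance Computer figures are the commonly cited ballpark; the phone-side numbers are assumptions about a typical modern handset):

```python
# Rough back-of-envelope: modern phone vs. the Apollo Guidance Computer (AGC).
# AGC figures are the commonly cited ballpark; phone figures are assumptions.
AGC_INSTRUCTIONS_PER_SEC = 85_000        # AGC: roughly 85k instructions/s
AGC_RAM_BYTES = 2_048 * 2                # ~2k words of erasable memory (~4 KB)

PHONE_INSTRUCTIONS_PER_SEC = 5e10        # assumed: a few GHz, several cores,
                                         # a few instructions per cycle
PHONE_RAM_BYTES = 8 * 2**30              # assumed: 8 GB of RAM

print(f"compute ratio: ~{PHONE_INSTRUCTIONS_PER_SEC / AGC_INSTRUCTIONS_PER_SEC:,.0f}x")
print(f"memory ratio:  ~{PHONE_RAM_BYTES / AGC_RAM_BYTES:,.0f}x")
# compute ratio: ~588,235x; memory ratio: ~2,097,152x -- comfortably past
# the "more than 100,000 times" mark either way.
```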

      • MudMan@kbin.social · 1 year ago

        You type posts that long on mobile? I am genuinely an old now.

        Anyway, yeah, absolutely. We do live in the future. The point is that people extrapolate from whatever tech is in growth mode and inevitably go past where the real asymptote is. So yeah, if you were living in the 60s, nuclear power and the space race seemed like amazing achievements, but it turns out the tech stopped shy of… you know, moonbases and the X-Men. If you were in the 80s, automation and computers seemed like magic, but sentience didn’t emerge from sheer computation and… well, actually, short of the cyborg part, pretty much every other part of Robocop happened, so we’ll call that a tie.

        So now we get affordable machine learning leading to working language and synthetic image models and assume that’s gonna grow forever until we get the holodeck and artificial general intelligence. And we may, but we could also hit the ceiling pretty close to where we are now.

        • Kedly@lemm.ee · 1 year ago

          That’s fair, and yeah, I spend a lot of time on transit these days, so I have the time to write up long posts on mobile xD As another aside, VR tech and mocap tech mean that we do actually have modern, reality-adjusted versions of the Holodeck right now! Check out Sandbox VR, they have locations in multiple different countries and it’s super cool! I’m sure there are other companies that have done something similar, Sandbox is just the one I know of and have used!

    • kromem@lemmy.world · 1 year ago

      The Jetsons was supposed to take place 100 years after it was made in the 1960s.

      Which means the following technologies beat their predictions by decades:

      • Video calls
      • Robot vacuums
      • Tablet computing
      • Smart watches
      • Drones
      • Pill cams
      • Flat screen TVs

      Flying cars exist; they just aren’t economically viable or practical given the cost, the need for a pilot’s license, and aviation regulations around takeoff and landing.

      And we’re still 40 years away from that show’s imagined future.

      Your thesis focuses too much on the scattered things that were wrong, which typically came down to assuming expensive hardware cycles would move faster than they did, because the focus was only on whether the underlying technology was possible and not on whether it was practical (doors that slide into the ceiling are a classic example: the cost of retrofitting for that versus keeping doorknobs means the latter will be around for a very, very long time).

      What we are discussing is the rate of change for centrally run software, which has already hit milestones ahead of expert expectations several times over in the past few years and set the record for the fastest-growing new product ever, beating the previous record holder by more than 4x.
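
      A quick sanity check on that number, assuming it refers to the widely reported estimates of ChatGPT reaching ~100M users in about two months versus TikTok, the previous record holder, taking around nine:

```python
# Back-of-envelope for the "more than 4x" figure (reported estimates, assumed
# to be ChatGPT vs. TikTok as the previous fastest product to 100M users).
MONTHS_TO_100M_PREVIOUS_RECORD = 9   # TikTok, roughly nine months (reported)
MONTHS_TO_100M_NEW_RECORD = 2        # ChatGPT, roughly two months (reported)

speedup = MONTHS_TO_100M_PREVIOUS_RECORD / MONTHS_TO_100M_NEW_RECORD
print(f"~{speedup:.1f}x faster to 100M users")   # ~4.5x
```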

      You’re comparing apples to oranges.

      • MudMan@kbin.social · 1 year ago

        Yes. Technology went faster than people expected in some areas and much slower in others, to the point where the outcome may not be possible at all.

        That IS my “thesis”. The idea that in the 1960s video calls and a sentient robot cleaning your house seemed equally cartoonishly futuristic is the entire point I’m making.

        And to be clear, that holds even when restricted simply to consumer software and hardware. We got a lot better than expected at networking and data transmission… and now we’re noticeably slowing down. We are actually behind in terms of AI, but we’re way better at miniaturization.

        Again, people extrapolate from their impression of current rates of progress endlessly, but it’s hard to predict when the curve will flatten out. That’s the thesis.

        • kromem@lemmy.world · 1 year ago (edited)

          a sentient robot cleaning your house

          Again - we’re still 40 years away from that envisioned future.

          We got a lot better than expected at networking and data transmission… and now we’re noticeably slowing down.

          The jump in the world-record data transfer rate from 2022 to 2023 was nearly 5x bigger than the jump from 2020 to 2021.

          As for your claims about being behind on AI, I’d strongly recommend looking at the futurist predictions various firms made about what to expect from AI in 2020, and how literally all of them completely missed the mark on the arrival of GPT-3.

          Look at predictions for 2023 and you’ll see a lot of comments about a potential AI winter and how the data sources have been tapped out. Meanwhile, the major research findings over the actual course of 2023 were basically “how is GPT-4 so good at all these things” and “it turns out using GPT-4 to generate synthetic data can train much smaller networks to be much, much better than we could have achieved with previous data sets.”

          And this is all ahead of the very promising work on a shift to new chip architectures for AI workloads, specifically optoelectronics, which went from a pipe dream five years ago to proof of concept at MIT, with DIY kits being made available to other researchers this year. So rather than hitting a plateau, if anything we’re heading, much like the gains in optical networking, towards more oil being poured on the AI fire, not less.

          Your thesis is great for things like colonizing Mars or living in spaceships, but it’s kind of crap for things like AI and software advancement.

          • MudMan@kbin.social · 1 year ago

            I guarantee your Roomba will not walk around in a little maid uniform with a feather duster while balancing on a single wheel and making snarky comments in 40 years. I’ll bet any amount of money on that. Also no bubble flying cars or Rube Goldberg machines to wash your teeth. I don’t even know what point you’re trying to make at this point.

            Transfer speed records are not what I meant. Admittedly, I should have thrown the word storage in there; I thought I had. The idea is that while online infrastructure was exploding, capacity outpaced the need for storage and transmission, so we went through a very fast rise in specs. Google was out there giving people free storage left and right because its capacity was growing faster than users’ needs. Now every cloud service is monetized, including Gmail’s storage, because storage growth no longer outpaces storage demand. ISPs tapped out by the time Netflix was chugging a quarter of the bandwidth with 4K video, and YouTube has been throttling resolution since the pandemic. Turns out Google won’t be adding zeros to your available free storage forever.

            As for AI, again, my entire point is that it’s not easy to know when growth will flatten out. I’d argue it already has, at least in terms of consumer access. We keep getting incrementally better image generation, but a future where every image is created by a computer and every search goes through a chatbot interface does not seem to be materializing the way the early knee-jerk reactions suggested.

            Which is not to say that ML applications won’t be ubiquitous. There are tons of legit uses where the changes will be groundbreaking. But it being on a straight line that takes you to the holodeck? That line may bend down a lot sooner than that.