• The AI-driven memory shortage doesn’t just affect PCs
  • More capacity is coming, but not before 2027
  • Low-margin budget products are likely to be hit hardest
  • Rentlar@lemmy.ca · 24 minutes ago

    I 'member when headphones were just a copper wire to a magnet dressed in plastic, and none of this garbage required memory. Can we go back to that age? Thanks.

    • SatansMaggotyCumFart@lemmy.world · 3 hours ago

      I just want a straight up monitor with no built in wifi or Bluetooth or even speakers.

      Just display the image I want it to when I want it to.

      • galaxy_nova@lemmy.world · 3 hours ago

        Literally this. Let me use my own fucking box and my own goddamn PC how I want, with a big monitor. Also put DP on it, fuck HDMI. I don’t need trackers in my bloody TV OS. My TV has an OS!!! That’s ridiculous. Sorry, this is one of those things I’ve always been super pissed about.

    • ThomasWilliams@lemmy.world · 44 minutes ago

      Linear components like power amplifiers are made from the same material as digital circuitry, and they use a lot more of it.

    • Tim_Bisley@piefed.social · 3 hours ago

      Man, I’m really dreading buying a new TV. Been going strong with my plasma for years. I don’t need any “smart” features in a TV. From what I understand, you can either get a good TV or a “dumb” TV — pick one.

      • blitzen@lemmy.ca · 2 hours ago

        Buy a good TV, and don’t connect it to the internet. Use an Apple TV or Shield device instead.

        • adarza@lemmy.ca · 2 hours ago

          televisions of the near future when you first turn them on: “Internet connection and account required to complete initial product set up.”

          • cmnybo@discuss.tchncs.de · 2 hours ago

            I would immediately return that as defective. I’d rather use that old 1980s portable TV that’s been collecting dust in my closet since they shut down the analog TV broadcasts.

  • artyom@piefed.social · 3 hours ago

    2027 sounds right. No way these fabs don’t know this shit is temporary, so it’s unlikely they’ll increase production.

    • Sabin10@lemmy.world · 59 minutes ago

      Unfortunately it’s the people between the manufacturers and the consumers that think this current iteration of AI is the future. They even seem to think we want it and can’t wait to pay them for it.

      • artyom@piefed.social · 37 minutes ago

        No one thinks that. Not the hardware OEMs, not the consumers, not the CEOs, not even the investors. It’s all just a grift to see how high it can get before it pops.

  • solrize@lemmy.ml · 3 hours ago

    Why do TVs and audio gear use memory? TVs, OK, I can sort of understand a little, but audio? That’s still analog, right? Or mostly analog, anyway.

    • empireOfLove2@lemmy.dbzer0.com · 3 hours ago

      All digital devices will use some amount of memory. Audio devices are all digital these days and only use a DAC (Digital to Analog Converter) to generate the actual audio waveform from a raw sample stream.

      On something like a standalone audio amp there still has to be a whole backend to store codec information, menus and settings, and a host of other control and audio-processing features, which are likely implemented on top of a basic OS rather than written directly for a bare microcontroller. There’s more memory in there than you think.
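
      A minimal sketch of what that raw-sample-to-DAC path can look like in firmware — the register address, buffer size, and function name below are illustrative assumptions, not any particular product’s design:

      ```c
      #include <stdint.h>
      #include <stddef.h>

      /* Hypothetical memory-mapped DAC data register (address made up
       * for illustration; real parts differ). */
      #define DAC_DATA_REG (*(volatile uint16_t *)0x40007408)

      /* Small RAM buffer holding decoded 16-bit PCM samples.
       * Even a bare-bones digital audio path needs at least this much memory. */
      #define BUF_LEN 512
      static int16_t pcm_buf[BUF_LEN];

      /* Push one buffered sample to the DAC, offset to an unsigned code.
       * Would typically be called from a sample-rate timer interrupt. */
      void dac_output_next(void)
      {
          static size_t i = 0;
          DAC_DATA_REG = (uint16_t)(pcm_buf[i] + 32768);
          i = (i + 1) % BUF_LEN;
      }
      ```

      Even this bare-bones path needs RAM for the sample buffer; add codec decoding, menus, networking and DSP features on top of an OS and the footprint grows quickly.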

      • B-TR3E@feddit.org · 35 minutes ago (edited)

        “Codec information” is in ROM or implemented directly in hardware. Even studio-quality audio interfaces that are DSP-controlled need only relatively small amounts of RAM: relatively slow memory for variable space and slightly faster memory for buffering. Both are in the megabyte range at most and far from the speed that GPUs or AI require.
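
        As a rough worked example of that “megabyte range at most” claim — the sample rate, bit depth, and buffer length here are assumed typical values, not specs from any real interface:

        ```c
        #include <stdio.h>

        int main(void)
        {
            /* Assumed, typical values for a stereo audio buffer. */
            const double sample_rate      = 48000.0; /* samples/s per channel */
            const double channels         = 2.0;
            const double bytes_per_sample = 3.0;     /* 24-bit PCM */
            const double buffer_seconds   = 0.1;     /* 100 ms of buffering */

            double bytes = sample_rate * channels * bytes_per_sample * buffer_seconds;
            printf("~%.0f bytes (%.1f KiB) of RAM for a 100 ms buffer\n",
                   bytes, bytes / 1024.0);
            return 0;
        }
        ```

        That comes out to roughly 29 KB; even a much deeper buffer stays in the low-megabyte range, nothing like the capacity or bandwidth GPUs and AI accelerators demand.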

    • UnspecificGravity@piefed.social · 3 hours ago

      This is probably all stuff like stand-alone Bluetooth speakers and such that have little processors in them. Analog hifi shit isn’t going to be able to use AI for anything except maybe receivers or something. I guess there might be a use case for using it to balance to a room and set up EQ, but these guys don’t seem interested in doing anything that would require actual work to develop a real product.

    • stoy@lemmy.zip · 3 hours ago

      Depends. If you have analog cabled headphones, like the Meze Empyrean or the Philips Fidelio X2HR, then they’re purely analog, but wireless or even wired digital headphones with USB/Lightning have RAM.

  • coherent_domain@infosec.pub · 3 hours ago

    Question: will AI eventually hurt the supply of computer chips too? Like the memory companies, TSMC also has only finite production capacity.