• kbal@fedia.io · 1 day ago

    The driver needs to interface with the OS kernel, which does change, so the driver needs updates. The old Nvidia driver is not open source or free software, so nobody other than Nvidia themselves can practically or legally maintain it. Nvidia could of course change that if they don’t want to do even the bare minimum of maintenance themselves.

    • bleistift2@sopuli.xyz · 1 day ago

      The driver needs to interface with the OS kernel which does change, so the driver needs updates.

      That’s a false implication. The OS just needs to keep its kernel interface stable, just as it has to for every other piece of hardware or software. You don’t just double the current you send over USB and expect cable manufacturers to adapt. As the consumer of the API (which is what the driver is from the kernel’s point of view), you deal with what you get and don’t make demands of the API provider.

      • kbal@fedia.io · 1 day ago

        Device drivers are not like other software in at least one important way: they have access to, and depend on, kernel internals which are not visible to applications, and they need to be rebuilt when those change. Something as huge and complicated as a GPU driver depends on quite a lot of them. The kernel does not provide a stable binary interface for drivers, so they frequently need to be recompiled to work with new versions of Linux, and, less frequently, the source code also needs modification as things are changed, added to, and improved.

        This is not unique to Linux; it’s pretty normal. But it is a deliberate choice that its developers made, and people generally seem to think it was a good one.
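
        To make the second kind of breakage concrete: a minimal, purely illustrative out-of-tree module fragment (nothing to do with Nvidia’s actual code), built around one real in-kernel change, the switch of procfs registration from struct file_operations to struct proc_ops in Linux 5.6. Out-of-tree code that targets several kernel versions ends up carrying guards like this:

        ```c
        #include <linux/module.h>
        #include <linux/proc_fs.h>
        #include <linux/version.h>

        /* Placeholder read handler; the signature is shared by both APIs. */
        static ssize_t demo_read(struct file *f, char __user *buf,
                                 size_t len, loff_t *off)
        {
                return 0;
        }

        /* The in-kernel interface changed in 5.6; the source must follow. */
        #if LINUX_VERSION_CODE >= KERNEL_VERSION(5, 6, 0)
        static const struct proc_ops demo_ops = {
                .proc_read = demo_read,
        };
        #else
        static const struct file_operations demo_ops = {
                .read = demo_read,
        };
        #endif

        static int __init demo_init(void)
        {
                proc_create("driver_abi_demo", 0444, NULL, &demo_ops);
                return 0;
        }

        static void __exit demo_exit(void)
        {
                remove_proc_entry("driver_abi_demo", NULL);
        }

        module_init(demo_init);
        module_exit(demo_exit);
        MODULE_LICENSE("GPL");
        ```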

      • balsoft@lemmy.ml · 1 day ago

        I don’t generally disagree, but

        You don’t just double the current you send over USB and expect cable manufacturers to adapt

        That’s pretty much how we got to the point where USB is the universal charging standard: by progressively pushing the allowed current from the initially standardized 100 mA all the way to the 5 A of today. A few of those pushes were just manufacturers winging it and pushing/pulling significantly more current than what was standardized, assuming the other side would adapt.

        • xthexder@l.sw0.com · 22 hours ago

          The default standard power limit is still the same as it ever was on each USB version. Negotiation needs to happen to tell the device how much power is allowed, and if you go over, I think overcurrent protection is part of the USB spec for safety reasons. There are a bunch of different protocols, but USB always starts at 5 V, and 0.1 A for USB 2.0; devices need to negotiate for more. (0.15 A, I think, for USB 3.0, which has more conductors.)

          As an example, USB 2.0 can signal a dedicated charging port (5 V / 1.5 A max) by shorting the data pins together through a resistance of 200 Ω or less.
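
          To sketch what that negotiation amounts to: a device states its current requirement in its configuration descriptor (bMaxPower, counted in 2 mA units on USB 2.0), and the host accepts the configuration only if the port’s budget covers it. The struct layout below follows the spec, but host_can_power and the budget figures are made-up illustration, not a real stack API:

          ```c
          #include <stdbool.h>
          #include <stdint.h>
          #include <stdio.h>

          /* Field order mirrors the standard USB 2.0 configuration
           * descriptor (abridged); wire packing is not modelled here. */
          struct usb_cfg_desc {
              uint8_t  bLength;
              uint8_t  bDescriptorType;
              uint16_t wTotalLength;
              uint8_t  bNumInterfaces;
              uint8_t  bConfigurationValue;
              uint8_t  iConfiguration;
              uint8_t  bmAttributes;
              uint8_t  bMaxPower;            /* requested current, 2 mA units */
          };

          /* Hypothetical host-side check: grant the configuration only if
           * the port's power budget covers what the device asks for. */
          static bool host_can_power(const struct usb_cfg_desc *cfg,
                                     unsigned budget_ma)
          {
              return cfg->bMaxPower * 2u <= budget_ma;
          }

          int main(void)
          {
              struct usb_cfg_desc cfg = { .bMaxPower = 250 };  /* wants 500 mA */

              /* Until SET_CONFIGURATION succeeds, a device may draw only 100 mA. */
              printf("standard port (500 mA): %s\n",
                     host_can_power(&cfg, 500) ? "granted" : "refused");
              printf("bus-powered hub port (100 mA): %s\n",
                     host_can_power(&cfg, 100) ? "granted" : "refused");
              return 0;
          }
          ```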

          • balsoft@lemmy.ml · 22 hours ago

            The default standard power limit is still the same as it ever was on each USB version

            Nah, the default power limit started at 100 mA, or 500 mA for “high-power” devices. There are very few devices out there today that limit their current draw to that.

            It all began with non-spec host ports that just pushed however much current the circuitry could muster, rather than only the required 500 mA. Some had a proprietary way to signal just how much they were willing to push (this is why iPhones used to be very fussy about the charger you plugged them into), but most cheap ones didn’t. Then all the device manufacturers started pulling as much current as the host would provide, rather than limiting themselves to 500 mA. USB-BC was mostly an attempt to standardize some of that existing usage, and USB-PD came much later.
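
            Roughly what that device-side guessing game looked like, as I understand it; the thresholds and the 1 A “proprietary” figure are illustrative only, real firmware follows BC 1.2 timing plus vendor app notes:

            ```c
            #include <stdio.h>

            enum port_type { PORT_SDP, PORT_DCP, PORT_PROPRIETARY };

            struct probe {
                int dp_mv;          /* voltage sensed on D+, millivolts */
                int dm_mv;          /* voltage sensed on D-, millivolts */
                int dp_dm_shorted;  /* nonzero if D+ and D- are tied together */
            };

            static enum port_type classify(const struct probe *p)
            {
                if (p->dp_dm_shorted)
                    return PORT_DCP;          /* BC 1.2 dedicated charging port */
                if (p->dp_mv > 1000 || p->dm_mv > 1000)
                    return PORT_PROPRIETARY;  /* vendor divider on the data pins */
                return PORT_SDP;              /* plain data port */
            }

            static int max_draw_ma(enum port_type t)
            {
                switch (t) {
                case PORT_DCP:         return 1500;  /* BC 1.2 ceiling */
                case PORT_PROPRIETARY: return 1000;  /* conservative guess */
                default:               return 500;   /* USB 2.0 high-power default */
                }
            }

            int main(void)
            {
                struct probe wall_charger = { .dp_dm_shorted = 1 };
                printf("max draw: %d mA\n", max_draw_ma(classify(&wall_charger)));
                return 0;
            }
            ```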

            • xthexder@l.sw0.com · 21 hours ago

              A USB host providing more current than the device supports isn’t an issue, though. A USB device simply won’t draw more than it needs. There’s no danger of dumping 5 A into your 20-year-old mouse, because it defaults to being a low-power 100 mA device. Even if the port can supply 10 A at 5 V or something silly, the current is limited by the voltage and the load (the mouse).
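
              The quick Ohm’s-law check behind that, treating the mouse as a roughly 50 Ω effective load at 5 V (an assumed figure, just for the arithmetic):

              ```latex
              I = \frac{V}{R_\text{load}} = \frac{5\ \text{V}}{50\ \Omega} = 0.1\ \text{A}
              ```

              The supply’s 10 A rating is only a ceiling; the load alone decides how much of it actually flows.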

              • balsoft@lemmy.ml · 20 hours ago

                Well, the original comment was about “pushing more current through than the spec”, and that’s pretty much what we did…

                • xthexder@l.sw0.com · 18 hours ago

                  Well, regardless, the spec only cares about devices drawing more current than the host can supply, and that has always been consistent. Electricity doesn’t really work in a way where the host can “push” current; the only way it could do that would be with a higher voltage, which would damage anything not designed for it. But that’s what the USB-PD spec is for: negotiating what voltage to supply, up to 48 V now.
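
                  A toy model of that negotiation; real PD exchanges packed 32-bit power data objects over the CC wire, and the capability list below is just one plausible charger, not anything mandated:

                  ```c
                  #include <stdio.h>

                  struct pdo { int mv; int ma; };   /* one fixed-voltage offer */

                  /* Offers a PD 3.1 EPR charger might advertise (illustrative). */
                  static const struct pdo caps[] = {
                      { 5000, 3000 }, { 9000, 3000 }, { 15000, 3000 },
                      { 20000, 5000 }, { 48000, 5000 },
                  };
                  enum { NCAPS = sizeof caps / sizeof *caps };

                  /* Sink picks the highest voltage its input stage tolerates;
                   * 5 V is always offered, so that is the fallback. */
                  static const struct pdo *negotiate(int sink_max_mv)
                  {
                      const struct pdo *best = &caps[0];
                      for (unsigned i = 1; i < NCAPS; i++)
                          if (caps[i].mv <= sink_max_mv)
                              best = &caps[i];
                      return best;
                  }

                  int main(void)
                  {
                      const struct pdo *c = negotiate(20000);  /* e.g. a laptop */
                      printf("contract: %d mV at up to %d mA\n", c->mv, c->ma);
                      return 0;
                  }
                  ```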

                  • balsoft@lemmy.ml · 11 hours ago

                    Electricity doesn’t really work in a way the host can “push” current

                    On a basic level this is precisely how electricity works: a power supply literally pushes electrons by creating a difference in electric potential between two points; or, in other words, by applying an electromotive force to the electrons; or, in other words, by creating a voltage between two points. A load then does something with those electrons that usually creates an opposing electric field, be it heating a wire, spinning a motor, or sustaining a chemical reaction within a battery. The amount of power produced by the source and released at the load is proportional to (voltage) * (number of electrons being pushed by the supply per unit of time); usually this is the limiting factor for most power supplies. They can hold a steady voltage until they have to push too many electrons, and then the voltage starts dropping.
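
                    In symbols, that proportionality is just the usual power relation; the 5 V / 2 A rating below is only an illustrative supply hitting its limit:

                    ```latex
                    P = V \cdot I, \qquad P_{\max} = 5\ \text{V} \times 2\ \text{A} = 10\ \text{W}
                    ```

                    Ask that supply for more than 10 W and it can no longer hold 5 V steady, which is the droop described above.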

                    Edit: I see what you mean now. Yeah, for a given voltage it is the load that determines the current, so there’s no safety issue there for the load. However, there could be issues with the cables. IIRC higher current draws introduced enough noise that you couldn’t charge and transfer data at the same time with some cables.