“then” is used to indicate time, sequence, or a causal relationship; “than” is used with comparative adjectives, to indicate comparison.

  • 4 Posts
  • 508 Comments
Joined 1 year ago
Cake day: November 12th, 2024



  • I would love to have Seagate techs comment on it, but that’s probably not going to happen, so I’ll just take some guesses.

    Maybe in the 2FR102 variants (which I am definitely not going to buy just to try them out), they changed the provider for the SATA controller, or maybe some IC related to power management, which forced some changes in the firmware.
    They checked those changes on whichever system they had at the time, and there is some edge case, arising from that setup, that causes a problem on the newer kernel.

    The thread seems to have 2 other instances of similar problems, but those appear to stem from a different issue and wouldn’t share the same fix, since they are not using the same device. Considering that there is only a single example of this specific HDD (well, I didn’t read the thread that deeply, so I might have missed something), I would also consider something going on between the HDD and the SATA controller on the motherboard.




  • Tools that are closer to logic are better for helping with coding. So an expert system is better than a Neural Network for making code-helper tools, although its output would be more limited and it wouldn’t take human-language input.

    Using an LLM for this stuff means telling humans not to put the effort into building the logic (hence “reducing their cognitive load”) and instead using something that takes a lot more energy (as in fuel) to produce that logic.

    What we are currently calling AI is a fuzzy system abstracted onto a logical system. And now we are trying to make that abstracted fuzzy system build another abstraction on top of itself that does logic. Contrast that with the human brain, which is a fuzzy system made directly out of chemical (and, as some studies would claim, quantum) processes, simply creating a logical system on top of it.


    Each level of abstraction has a cost.

    1. If you make an IC with a fixed instruction flow (i.e. it does only a single thing), it won’t have to load instructions and will only load data and parameters, which will make it much more efficient at that specific process.
    • In this case, the loading of variable data and parameters will be the slowest part.
    2. Now, you can specify a set of instructions, which are then implemented in hardware. Then, when you load instructions from a variable input (ROM, perhaps), the instruction flow can be changed on the fly, but now the loaded instructions are an abstraction and are actually just loaded parameters.
    • In this case, loading instructions becomes as slow as loading parameters, and then you see preloading/prefetching (and, further, branch prediction) to make this part faster. (A toy sketch of this difference follows this list.)
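    To make that difference concrete, here is a toy C sketch (not modelled on any real IC; the opcodes and encoding are invented purely for illustration): the fixed-function version has its operation sequence baked in and only takes data, while the interpreted version has to fetch and decode its instructions from a “ROM” array before doing the same work.

```c
/* Toy illustration only: the same computation done "fixed-function" style
 * vs. driven by loaded instructions. */
#include <stdio.h>

/* "Fixed instruction flow": the operation sequence is hard-wired; only data
 * and parameters (a, b, c) are loaded. */
static int fixed_function(int a, int b, int c) {
    return (a + b) * c;                            /* nothing to fetch or decode */
}

/* "Loaded instructions": the same math, but the steps themselves arrive as
 * data, so every step pays for a fetch and a decode before it does any work. */
enum op { OP_ADD, OP_MUL, OP_END };

static int run_program(const enum op *prog, const int *args) {
    int acc = args[0];
    int next = 1;
    for (int pc = 0; prog[pc] != OP_END; pc++) {   /* fetch */
        switch (prog[pc]) {                        /* decode */
        case OP_ADD: acc += args[next++]; break;   /* execute */
        case OP_MUL: acc *= args[next++]; break;
        default: break;
        }
    }
    return acc;
}

int main(void) {
    const enum op prog[] = { OP_ADD, OP_MUL, OP_END };  /* "ROM" contents */
    const int args[] = { 2, 3, 4 };                     /* data/parameters */

    printf("fixed-function : %d\n", fixed_function(2, 3, 4));
    printf("interpreted    : %d\n", run_program(prog, args));
    return 0;
}
```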

    One nice example of abstractions is interrupts:

    • Earlier you had polling, which meant that the CPU had to check the corresponding data line every n clocks, with n determined by the polling rate, and this had to be written by the software programmer.
    • With interrupts, you now have a separate unit doing the polling (which is much more efficient, because that is the only thing it is built for) and storing any state changes in a buffer; depending on the type of interrupt, these can then be picked up by the program (which is still polling, but at a much slower rate) or handled by interrupt routines. (A rough sketch of this pattern follows this list.)
    • There is a similar thing you do in multithreaded code: if there is a loop running in thread A that needs to be interrupted by thread B, B changes the value of some variable, which A keeps checking. Now, if there is a language that simplifies this interruption process, there will be some runtime doing a similar job for you; that is another level of abstraction, which costs extra effort at runtime. (A sketch of this shared-flag version follows below as well.)
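    Since a real hardware interrupt can’t be shown from plain userspace C, here is a rough Linux/POSIX analogy (just a sketch, with made-up names like `on_alarm`): a timer signal stands in for the interrupt line, the handler only records the event in a buffer, and the main loop polls that buffer far less often than it does its own work.

```c
/* Rough analogy only: SIGALRM plays the role of the interrupt line, the
 * handler plays the role of the unit recording state changes, and the main
 * loop checks the recorded state only occasionally. */
#include <signal.h>
#include <stdio.h>
#include <time.h>
#include <unistd.h>

static volatile sig_atomic_t events = 0;      /* the "buffer" of state changes */

static void on_alarm(int sig) {
    (void)sig;
    events++;                                 /* record it now, handle it later */
}

int main(void) {
    struct sigaction sa;
    sa.sa_handler = on_alarm;
    sa.sa_flags = 0;
    sigemptyset(&sa.sa_mask);
    sigaction(SIGALRM, &sa, NULL);

    alarm(1);                                 /* the "device" fires after ~1 s */

    for (int tick = 0; tick < 50; tick++) {
        struct timespec work = { 0, 100 * 1000 * 1000 };
        nanosleep(&work, NULL);               /* 100 ms of pretend work */

        if (tick % 10 == 0 && events > 0) {   /* slow polling of the buffer */
            printf("tick %d: handling %d queued event(s)\n", tick, (int)events);
            events = 0;
            alarm(1);                         /* re-arm the "device" */
        }
    }
    return 0;
}
```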

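    And a minimal sketch of the shared-variable interruption between threads (C11 atomics plus pthreads; the names `worker` and `stop` are just placeholders for this example):

```c
/* Thread A (worker) loops until thread B (here, main) flips an atomic flag. */
#include <pthread.h>
#include <stdatomic.h>
#include <stdbool.h>
#include <stdio.h>
#include <unistd.h>

static atomic_bool stop = false;         /* the variable A keeps checking */

static void *worker(void *arg) {
    (void)arg;
    unsigned long iterations = 0;
    while (!atomic_load(&stop))          /* thread A: check the flag each pass */
        iterations++;                    /* ...the actual loop work... */
    printf("worker stopped after %lu iterations\n", iterations);
    return NULL;
}

int main(void) {
    pthread_t a;
    pthread_create(&a, NULL, worker, NULL);

    sleep(1);                            /* let the loop run for a bit */

    atomic_store(&stop, true);           /* thread B: request the interruption */
    pthread_join(a, NULL);               /* wait for A to notice and exit */
    return 0;
}
```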
    One of the heaviest examples of abstraction I can think of is what is done by programs that simulate other processors: things like the tools provided by FPGA manufacturers that let you emulate the logic inside the processor, and Mentor tools that even have simulation starting from the user designing the transistors.
    These are usually not used outside of testing, debugging, prototyping and the like.
    Virtual machines made for emulation are a bit different from these, but they are pretty heavy nonetheless; nobody would consider, say, emulating a Nintendo 3DS on hardware of similar performance for daily use.



  • Well, the server ECC variant is still pretty useful for desktop workloads; AMD just has to keep supporting it in the next generations. If it’s still a DIMM, then it can be sold right away.

    GDDR7, again: if the chips have the required pins, as on GPUs, then GPU manufacturers can simply buy them, test them for a few hours maybe, and pop them into their lineups with a bit of recalculation of traces (in case the exact pinout differs). Of course you get some re-soldering damage, but there’s not much you can do about that. On the other hand, if the GDDR7 is already in GPUs, most of what the companies would need to do is alter the firmware a bit and sell refurbished units.

    HBM2: it seems like it is possible to get slottable modules with HBM2. I’m pretty sure some industrious people in China will find a good use for them, perhaps with RISC-V processors?
    And the AI-specialised units shouldn’t be fully useless either. Remember the cancer-studies case?
    It is still useful compute capability that can be put to good use by those who know how.




  • I once played a 2D RPG that I got stuck in at one point, because I filled all the save slots right before a blocking battle, for which I ended up not having enough weapons[1] and then kept falling 1 or 2 hits short of managing to pass it.

    I have been meaning to retry it from the start (it was freeware, I think) but I can’t recall its rather peculiar name. It has been ~20 years.


    1. I had recently bought a weapon, and I got another one of the same just before the battle (as a gift for the battle), but it was useless, because only 1 of the player characters could equip that type. ↩︎









  • On one hand, I thought of policies as the correct way to do stuff, since the user (root) then gets to decide who gets what.
    But considering the lack of good-enough defaults, and that most users won’t even know where to look, I guess we do need additional security features in this case.

    For one, it would be good to find a way to reliably let a process (the one providing said endpoint) know which other process is trying to access that endpoint. This, combined with the root locations (like /bin, /usr/bin etc.) not being writable without root privileges, should make it possible to have adequate security options in the program itself.
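    As a sketch of one possible mechanism (my assumption, not anything an existing program is known to do here): on Linux, a service on a Unix domain socket can ask the kernel for the connecting peer’s pid/uid via SO_PEERCRED, then resolve /proc/<pid>/exe to see which binary is calling and whether it lives under a root-owned path like /usr/bin.

```c
#define _GNU_SOURCE          /* for struct ucred / SO_PEERCRED */
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

/* Fill 'exe' with the peer's executable path; returns 0 on success.
 * 'conn_fd' must be an AF_UNIX connection. */
static int peer_executable(int conn_fd, char *exe, size_t exe_len) {
    struct ucred cred;
    socklen_t len = sizeof(cred);

    if (getsockopt(conn_fd, SOL_SOCKET, SO_PEERCRED, &cred, &len) < 0)
        return -1;

    char link[64];
    snprintf(link, sizeof(link), "/proc/%d/exe", (int)cred.pid);

    ssize_t n = readlink(link, exe, exe_len - 1);  /* note: pid could be reused */
    if (n < 0)
        return -1;
    exe[n] = '\0';

    printf("peer pid=%d uid=%u exe=%s\n", (int)cred.pid, (unsigned)cred.uid, exe);
    return 0;
}

/* Example policy: only trust callers whose binary sits under /usr/bin,
 * which non-root users cannot write to. */
static int peer_is_trusted(int conn_fd) {
    char exe[4096];
    if (peer_executable(conn_fd, exe, sizeof(exe)) < 0)
        return 0;
    return strncmp(exe, "/usr/bin/", 9) == 0;
}

int main(void) {
    /* Self-contained demo via socketpair: the "peer" is this same process,
     * but the calls are identical for a connection accepted from another one. */
    int fds[2];
    if (socketpair(AF_UNIX, SOCK_STREAM, 0, fds) < 0)
        return 1;

    printf("peer trusted: %s\n", peer_is_trusted(fds[0]) ? "yes" : "no");

    close(fds[0]);
    close(fds[1]);
    return 0;
}
```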