

They were technically Expert Systems.
AI was the marketing term even then.
Now they are LLMs and AI is still the marketing term.


If something uses a lot of if-else statements to do stuff like become a “COM” player in a game, it is called an Expert System.
That is essentially what in-game “AI” used to be. It was not an LLM.
Stuff like clazy and clang-tidy are neither ML nor LLM.
They don’t rely on curve fitting or mindless grouping of data-points.
Their parameters are decided based on the programming language specification, and tokenisation is done directly using the features of the language. How the tokens are used is also determined by hard logic rather than fuzzy logic, which is why the resulting options you get in the completion list end up being valid syntax for said language.
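To give a rough idea of what I mean by the rules being “decided in code” (a toy sketch I am making up here, not taken from any real game or tool, with made-up names and thresholds), an Expert System style “COM” player is basically a stack of hand-written rules:

```python
# Toy "expert system" COM player: every decision is a hand-coded rule.
# All names and thresholds are made up purely for illustration.

def com_player_move(own_hp: int, enemy_hp: int, potions: int, distance: int) -> str:
    """Pick an action using nothing but hard-coded if/else rules."""
    if own_hp < 20 and potions > 0:
        return "drink_potion"      # rule 1: survival comes first
    if enemy_hp < 15 and distance <= 1:
        return "finishing_blow"    # rule 2: close out a weakened enemy
    if distance > 1:
        return "advance"           # rule 3: get into range
    return "basic_attack"          # default rule


print(com_player_move(own_hp=18, enemy_hp=50, potions=1, distance=0))  # drink_potion
```

Every branch was written by a person who understood the rules of the game, the same way a clang-tidy check is written by someone who understands the rules of the language. Nothing is “learned” from data.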
Now if you are using Cursor for code completion, of course that is AI.
It is not programmed using the features of the language, but iterated on until it produces output that happens to match the features of the language.
It is like putting a billion monkeys in front of typewriters, then selecting the one that makes something Shakespeare-ish and killing off all the others. Then cloning the selected one, rinse and repeat.
And that is why it takes a stupendously disproportionate amount of energy, time and money to train something that gives an output that could otherwise be easily done better using a simple bash script.
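A minimal sketch of that monkeys-and-selection loop, if anyone wants to see how dumb it is (a throwaway toy I am making up, nowhere near a real training run):

```python
import random
import string

# Toy version of the "monkeys at typewriters" selection loop:
# mutate copies of the current best string, keep whichever copy is
# closest to the target, and repeat. Purely illustrative.
TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = string.ascii_uppercase + " "


def score(candidate: str) -> int:
    """Count how many characters already match the target."""
    return sum(a == b for a, b in zip(candidate, TARGET))


def mutate(parent: str, rate: float = 0.05) -> str:
    """Clone the parent, flipping each character with a small probability."""
    return "".join(
        random.choice(ALPHABET) if random.random() < rate else c for c in parent
    )


best = "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
generations = 0
while best != TARGET:
    offspring = [mutate(best) for _ in range(100)]  # 100 "monkeys"
    best = max(offspring + [best], key=score)       # keep the most Shakespeare-ish one
    generations += 1

print(f"Reached the target after {generations} generations.")
```

It gets there eventually, but only by throwing away almost everything it produces along the way, which is the whole point of the analogy.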


I don’t consider clang tools to be AI.
They parse the code logically and don’t do blind pattern matching and curve fitting.
The rules they use are properly defined in code.
If that was AI, then all compilers made with LLVM would be AI.


I tried to look for the post but somehow wasn’t able to find it (I thought I commented on it).
I don’t remember the place, but a part of the policy was that data centers must pay for 85% of their projected energy usage.
Here, found an article: https://www.ehn.org/ohio-regulators-make-tech-companies-pay-more-for-energy-hungry-data-centers

I see, so this is essentially there to make prototyping easier and faster, and the high component cost is a premium for the value-added service.
Makes sense for anyone working in a company.
Perhaps not for someone with a low budget.


I have said this before somewhere, but this feels like something that would be very well suited for places where electricity prices have gone extremely low due to “too many solar panels”.
Also, in places with excess geothermal output etc.
What are these companies really basing their installation locations upon?


I am going by mainly 2 points:


Sadly I am not in a location where people just discard useful parts.
If I were to try buying 2nd hand here, I would most probably end up with stuff that has some kind of damage or the other.
For instance, in one of the companies I worked at, their policy for getting rid of stuff was:
And the auctions occurred years later after much red tape…
Mostly bought by other companies, who get to do more red-tape stuff to buy it.
While on one hand this is a good thing, reducing wastage, it also means that I have no way to get 2nd hand stuff for hobbyist usage.
When we do get 2nd hand stuff, it is usually through a 2nd hand dealer, who ends up asking a higher price than it’s worth.
Also, I am not expecting there to be any AI enthusiast nearby me.


It’s a bit different in this case.
The responsibility of providing electricity falls onto the nearby power plant, which then also has to increase its production.
But the maker of the new electricity consumer does not need to pay for the capital or anything else really, apart from the electricity rates (and some minimal fixed charges) for what they use.
Some governments are coming up with interesting, seemingly effective regulations, though.


Yeah, just checked the 2 sites I use for computer components.
One had no RAM listed.
The other had 32GB DDR4 at 2x the price, and no 128GB kit (96GB was the highest, 64GB for DDR4).


Yeah, the main problem right now seems to be electricity consumption causing price hikes in surrounding areas.


Oh no.
So even if I manage to somehow get DDR4 for lower prices, I can’t expect the SK Hynix modules.
Guess it’s going to be a few more years before I can get a RAM upgrade, or maybe never at all.
It might end up being similar to how DDR3 ended up being more expensive than DDR4 for multiple years.


Guess I should have bought the 128GB 3600 kit earlier.
Is DDR4 also affected?


Perhaps some down’n’dasher that didn’t like your lack of respect for them.

Definitely good in the long term.
Now everyone using it knows that they need to have a backup system in place.


Maybe someone who has been de-federated from my instance, considering I don’t see a downvote.


Although the encryption is a useful feature, I don’t really expect that from mail.
And if I were doing internal communications within a company with that level of security and privacy requirements, I would be using their on-premises mail server.
I have been considering Proton, mostly. My main goal being not randomly losing access to my mail account due to some AI bs.
Though I am not sure if they might end up requiring stuff like “Video ID”.
And if it comes to paying for a service, I will also start comparing it to the cost of a domain name and a static IPv6 address, because I already have plans to run a server.

I have version 2.52 and it is already the default.

That’s lovely.
And I just realised what I needed was an Internal and External Tooth Lock washer. I didn’t know what it was called before.
A pity that the US pricing is too high for my country, on top of which it might be shipped from the US, adding expense.
Yeah, my main point with all those examples was to make the point that “AI” has always been a marketing term.
Curve-fitting and data-point clustering are both pretty efficient if used for the thing they are made for. But if you then start brute-forcing multiple nodes of the same thing just to get a semblance of something it is otherwise not made for, of course you will end up using a lot of energy.
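For reference, this is roughly what I mean by those two techniques being cheap when used for their actual job (a throwaway sketch with made-up numbers):

```python
import numpy as np

# Made-up data: fitting a curve to points is exactly what least squares is for.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])       # roughly y = 2x + 1 with noise

slope, intercept = np.polyfit(x, y, deg=1)    # one cheap least-squares fit
print(f"fitted line: y = {slope:.2f}x + {intercept:.2f}")

# Likewise, grouping data points around known centres is just distance maths.
points = np.array([[0.2, 0.1], [0.1, 0.3], [5.1, 4.9], [4.8, 5.2]])
centres = np.array([[0.0, 0.0], [5.0, 5.0]])
labels = np.argmin(
    np.linalg.norm(points[:, None] - centres[None, :], axis=2), axis=1
)
print("cluster assignments:", labels)         # -> [0 0 1 1]
```

Both of those run in a blink on any machine. The energy problem only shows up when you stack billions of such parameters and brute-force them into imitating something else entirely.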
We humans have it pretty hard. Our brain is pretty illogical. We then generate multiple layers of abstractions to make a world view, trying to match the world we live in. Over those multiple layers comes a semblance of logic.
Then we make machines.
We make machines to be inherently logical, and that makes them better at logical operations than us humans. Hence calculators.
Now someone comes and says - let’s make an abstraction layer on top of the machine to represent illogical behaviour (kinda like our brains).
(┛`Д´)┛彡┻━┻
And then on top of that, they want that illogical abstract machine to itself create abstractions inside it, to first mimic human output and then further to do logical stuff. All of that, just so one can mindlessly feed data into it to “train” it, instead of thinking themselves and feeding it proper logic.
This is like saying they want to install an OS on browser WASM and then install a web browser inside that OS, to do the same thing that they would have otherwise done with the original browser.
In the monkeys analogy, you can add that the monkeys are a simulation on a computer.