Hi,
I have a friend who is looking to run a few simulations he has implemented in Python and needs around 256 GB of RAM. He is estimating it will take a couple of hours, but he is studying economics, so take that with a grain of salt 🤣
For this instance, I recommended GCP, but I felt a bit dirty doing that. So, I was wondering if any of you have a buttload of memory he can borrow? Generally, would you lend your RAM for a short amount of time to a stranger over the internet? (assuming internet access is limited to a single ssh port, other necessary safeguards are in place)
Borrow it from NewEgg, then return it
Newegg isn’t so bad. Do it to a shitty corporation like Best Buy.
Needing that much RAM is usually a red flag that the algo is not optimized.
Researchers always make some of the worst coders unfortunately.
Scientists, pair up with an engineer to implement your code. You’ll thank yourself later.
True, but there are also some legitimate applications for 100s of gigabytes of RAM. I’ve been working on a thing for processing historical OpenStreetMap data and it is quite a few orders of magnitude faster to fill the database by loading the 300GiB or so of point data into memory, sorting it in memory, and then partitioning and compressing it into pre-sorted table files which RocksDB can ingest directly without additional processing. I had to get 24x16GiB of RAM in order to do that, though.
Yea, that makes sense. You could sort it in chunks, but it would probably be slower. If you are regularly doing that and can afford the ram go for it. Otherwise maybe extract the bits that need to be sorted and zip them back up later.
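If anyone’s curious what “sort it in chunks” looks like, here’s a rough sketch of an external merge sort in Python, assuming the records are plain sortable tuples (the chunk size and pickle format are just illustrative):

```python
import heapq
import pickle
import tempfile

def external_sort(records, chunk_size=1_000_000):
    """Sort an iterable too big for RAM: sort fixed-size chunks,
    spill each sorted run to a temp file, then lazily merge the runs."""
    runs, chunk = [], []
    for rec in records:
        chunk.append(rec)
        if len(chunk) >= chunk_size:
            runs.append(_spill(sorted(chunk)))
            chunk = []
    if chunk:
        runs.append(_spill(sorted(chunk)))
    # heapq.merge streams the runs back in globally sorted order,
    # holding only one record per run in memory at a time
    return heapq.merge(*(_replay(f) for f in runs))

def _spill(sorted_chunk):
    f = tempfile.TemporaryFile()
    for rec in sorted_chunk:
        pickle.dump(rec, f)
    f.seek(0)
    return f

def _replay(f):
    while True:
        try:
            yield pickle.load(f)
        except EOFError:
            return
```

The extra serialize/deserialize passes are exactly why it’s slower than doing the whole sort in RAM.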
I’ve got 512GB of RAM in my server, and 128GB of RAM on my desktop cause you can never have too much.
Why not get a 0.5 or 1 TB NVMe SSD and set it all as swap?
It will probably run 10 times slower, but it’s cheap and doable.
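Related idea if the working set is mostly one big array: skip swap and back it with the SSD explicitly via numpy.memmap, so you control what lives on disk. A rough sketch (the path, shape, and update step are made up):

```python
import numpy as np

# ~256 GB of float64 lives in this file on the NVMe drive, not in RAM;
# the OS pages hot slices in and out on demand
sim_state = np.memmap("/mnt/nvme/sim_state.dat", dtype=np.float64,
                      mode="w+", shape=(4_000_000_000, 8))

# operate on windows so only a slice is resident at a time
step = 10_000_000
for start in range(0, sim_state.shape[0], step):
    block = sim_state[start:start + step]
    block *= 0.99            # hypothetical update step
    sim_state.flush()        # write dirty pages back to disk
```

Same caveat as swap: fine for sequential access, painful for random access.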
This is the way.
Depending on the nature of the sim, it could probably even be done with ~80 GB or less of existing SSD space using zram w/ zstd.
That’s probably way too much for any sane Python algorithm. If they can’t run it, how do they even know how much is needed?
They should probably just prototype it in Python and then reimplement it in a compiled language. That should reduce the resource usage massively.
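One way to answer “how much is needed” without having 256GB on hand: run scaled-down versions under tracemalloc and extrapolate. Sketch below; run_simulation is a made-up stand-in for his actual model:

```python
import tracemalloc

def run_simulation(n_agents):
    # placeholder for the real model
    return [{"wealth": 0.0, "history": []} for _ in range(n_agents)]

for n in (10_000, 100_000, 1_000_000):
    tracemalloc.start()
    state = run_simulation(n)
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    del state
    print(f"n={n:>9,}: peak ≈ {peak / 2**20:,.1f} MiB")
```

If peak memory grows roughly linearly in n, multiplying out tells you whether 256GB is a real requirement or an artifact.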
That’s kinda an insane amount of ram for most simulations. Is this like a machine learning thing? Is his python code just super unoptimized? Is it possible he’s making a bunch of big objects and then not freeing the references when he’s done with them so they’re never garbage collected?
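For that last failure mode, here’s the classic shape of it, with made-up names. CPython frees big objects via reference counting the moment the last reference disappears, so it’s almost always a kept reference, not the garbage collector, that’s to blame:

```python
results = []

def run_step(step):
    big = [float(i) for i in range(10_000_000)]  # a few hundred MB per call
    summary = sum(big) / len(big)
    results.append((step, big))      # BUG: keeps every 'big' alive forever
    return summary

def run_step_fixed(step):
    big = [float(i) for i in range(10_000_000)]
    summary = sum(big) / len(big)
    results.append((step, summary))  # keep only the reduced value
    del big                          # drop the reference so it can be freed
    return summary
```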
As an economist I can confidently say that he should go a different route. You or he can show us more if he wants to; maybe we can tell where the problem is.
If not, swap as others have stated is the way to go
Not anymore… Had a box of old/junk parts lying around before I moved, but I didn’t bring it with me because it was essentially just garbage I’d never gotten around to taking to electronics recycling, since that was a once-a-year thing that cost me money.
I’d have no problem giving away stuff like that. I wouldn’t take parts out of an active machine to let someone “borrow” tho.
The computer I’m typing on has 96 GB of RAM. Most of my equipment is ancient in terms of PCs. This one I built about 14 years ago, and I fully stocked it with the cutting-edge tech of the day. My intent was to build an LTS PC, as it were. LOL Back then, SLI was the thing, but I’ve since upgraded the GPU. I still have some old stuff in the parts bin, but it’s ancient as well.
Define buttload
put your butt on a scale, convert the result to RAM, duh
Yup, that’s some random ass memory.