

My phone has an option in battery settings for setting max charge to 80%.
Hey. Yeah you. No don’t look over your shoulder. I’m not talking to the guy behind you. Look, we’ve been meaning to tell you that you’re doing a pretty good job out there. Proud of you. Keep up the good work.


I like it! It feels very cozy.


There is only one of me.


I used to set some coins on the table (five or so), and if I got it right I'd move a coin to the other side. If I got one wrong, I'd move them all back. I couldn't move on until I'd moved all the coins to the other side. This was generally for music practice, but it seems pretty applicable here.


If we had some omniscient and perfectly fair justice system that could confirm there is no other option, sure. But jeez, how much further could we be from that, y'know? The US justice system is becoming increasingly and blatantly political.
Also, as someone who thinks punishment is vindictive and unnecessary compared to rehabilitation, I find the ultimate punishment unappealing.


Oh Jesus. Jungle juice was a hazy, near-forgotten memory for me. What a horrible concept. Or at least horrible execution nearly every time. I may have had one okay jungle juice in my entire college career.


Cool. I use Mullvad and use their DNS. Thanks for the idea on the uBlock filter!


Is there any way to do that kind of blocking when using a VPN?
I really like how that sky looks. I feel I’ve seen that kind of clear hazy sky so many times in my life.
I love it! I’m trying to get better at lighting and this is such a satisfying piece to look at. I love the highlights and bits of purple.


I am young and have a computer science degree, and I still struggle at times. I get it.
For games, I'd try installing Steam and running them through Steam if that's how you'd normally do it on Windows. Then for me the main settings to play with (on a game-by-game basis) are whether the game uses Proton (in the game's compatibility settings) and whether or not to use Steam Input for controller support.
If you are trying to install a non-Steam game, maybe look into Lutris. Though I'm on the techy side, and I hear a lot of people on the less techy side like Heroic Games Launcher.
Good luck. I think it's fair to run out of energy while trying to get the right combo, but if ya stick with it I'm confident you'll find the setup that works for you.
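If it helps, this is roughly how I'd pull those in via Flatpak; the package IDs below are from memory, so double-check them on Flathub before running anything:

```sh
# Install Steam, Lutris, and Heroic Games Launcher from Flathub
# (IDs written from memory; verify on flathub.org)
flatpak install flathub com.valvesoftware.Steam
flatpak install flathub net.lutris.Lutris
flatpak install flathub com.heroicgameslauncher.hgl
```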


Yeah, I feel Linux has a lot of dead ends. It's easy to follow the wrong path. My saving grace has always been that once you get things working, you know how you did it, and it likely won't change much.
So really it's a big search, but once you hit a steady state it really feels like home.
My cat died
New cats
One’s anxiety
Is too much


Donations would be great. My dream would be to get a large enough following to live a decent life off of donations and just make art. Kinda far-fetched, but it would be great.
One person I've found who did that is Chris from Airwindows. His art is a bit technical, but he makes interesting audio software and shares it all open source. I donate a bit to him.


Yeah, setting up openwebui with llamacpp is pretty easy. I would start with building llamacpp by cloning it from GitHub and then following the short build guide linked on the readme. I don't have a Mac, but I've found building it to be pretty simple. Just one or two commands for me.
Once it's built, just run llama-server with the right flags telling it which model to load. I think it can take Hugging Face links, but I always just download GGUF files. They have good documentation for llama-server on the readme. You also specify a port when you run llama-server.
Then you just add http://127.0.0.1:PORT_YOU_CHOSE/v1 as one of your OpenAI API connections in the openwebui admin panel.
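To give you a rough idea, it looks something like this end to end. The model path and port are placeholders, and the cmake lines are the generic ones from their docs, so check the readme for any Mac-specific notes:

```sh
# Clone and build llama.cpp (Metal is enabled by default when building on a Mac)
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build
cmake --build build --config Release

# Serve a downloaded GGUF on a port of your choosing (8080 here is arbitrary)
./build/bin/llama-server -m /path/to/your-model.gguf --port 8080

# Then in the openwebui admin panel, add http://127.0.0.1:8080/v1
# as an OpenAI API connection
```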
Separately, if you want to be able to swap models on the fly, you can add llama-swap into the mix. I'd look into this after you get llamacpp running and are somewhat comfy with it. You'll absolutely want it though, coming from ollama. At this point it's a full replacement IMO.


With 128GB of RAM on a Mac, GLM 4.5 Air is going to be one of your best options. You could run it anywhere from Q5 to Q8 depending on how you wanna manage your speed-to-quality ratio.
I have a different system that likely runs it slower than yours will, and I get 5 T/s generation (using Q8), which is just about the speed I read at.
I do hear that ollama may be having issues with that model though, so you may have to wait for an update to it.
I use llamacpp and llama-swap with openwebui, so if you want any tips on switching over I’d be happy to help. Llamacpp is usually one of the first projects to start supporting new models when they come out.
Edit: just reread your post. I was thinking it was a newer Mac lol. This may be a slow model for you, but I do think it'll be one of the best you can run.


Yeah I’ve given up on making a living off of art. I’ve pivoted to trying to just share it for free as much as I can. Honestly I think big corporate studios would have way less of a market if community art was more of a thing. It’d be nice to have local art and culture around rather than soulless corpo slop. I think those communities would process the world in a healthier way.
I hear you though. It really really sucks ass that literally no one (if you round) gets to make a living doing art. And (if you don’t round) those that do get to make a living on it generally come from a very small subset of the available cultures.
This is a good thing to be mad about. It affects how everyone sees the world.


Ah cool. Guess I just lost the distro lottery. Thanks for the info.


I just can't get those to work on Fedora with Ardour. What is your setup?
Ah dude I feel that so much with looking at pond samples under the microscope. It was and still is quite magical, but you can’t beat seeing it for the first time. Just a whole other world right there.