

I’ve read about this method in the GitHub issues, but to me it seemed impractical to keep separate models just to change the context size, and that’s the point where I started looking for alternatives.
It was multiple models, mainly in the 32-70B range.
There are many projects out there that optimize speed significantly. Ollama is unbeaten in convenience, though.
Yeah, but there are many open issues on GitHub about these settings not working right. I’m using the API and just couldn’t get it to work: I asked it to generate a JSON file, and it never produced one longer than about 500 lines. With the same model on vLLM, it worked instantly and generated about 2000 lines.
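For reference, this is roughly the kind of request I mean; just a sketch assuming a local Ollama on the default port, with a placeholder model name, where num_ctx is the option that is supposed to set the context window:

import json
import urllib.request

# Sketch of an Ollama /api/generate call with an explicit context window.
# The model name and the num_ctx value are placeholders.
payload = {
    "model": "some-32b-model",
    "prompt": "Generate the JSON file described above.",
    "stream": False,
    "options": {
        "num_ctx": 16384,   # context window the server should use
        "num_predict": -1,  # don't cap the number of generated tokens
    },
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])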
Take a look at NVIDIA Project DIGITS. It’s supposed to release in May for around $3,000 and will be pretty much the only sensible way to host LLMs then:
I discovered it just a few days ago and now use it on all my machines.
For anyone trying this, make sure you do not have “- TS_USERSPACE=false” in your yaml from previous experimentation. After removing this, it works for me too.
The documentation says to add sysctl entries, which is possible in docker compose like so:
tailscale:
  sysctls:
    - net.ipv4.ip_forward=1
    - net.ipv6.conf.all.forwarding=1
But it does not seem to make a difference for me. Does anyone know why these would not be required in this specific setup?
Thank you, really appreciate it!
Do you have any links/sources about this? I’m not saying you’re wrong, I’m just interested
Do you have an example? I’m genuinely curious; I’ve heard a lot about this theory but can’t really imagine how you would differentiate bots from mindless redditors farming for karma by saying "This."
You can only opt out by leaving the church entirely, which many young people do once they see this on their first paycheck.
Maybe also have a look into ADHD; it’s commonly misdiagnosed as bipolar disorder.
The Dark Knight
Only via libreddit.hu
Embrace the corruption, exile!
Path of Exile. Around 3k hours in and it’s still my favorite. Every 3 months you get to play again with a different twist, and Path of Exile 2 (technically just an update, but a huge one) is being worked on and will likely release next year.
Super cool! I’d be interested in how to fit this to my head shape too; it’s now on my list of contenders for the concert.