Ask HN: How do I take an LLM with me when I go camping?
2 points by Drblessing 1 month ago | 31 comments
I would like to have an LLM with me when I go camping, as a solo beginner, to help out. I have an Apple M1 Pro with 16GB of RAM.
What's the best way to do this?
What's the best model?
- msgodel 1 month ago
I've taken a similarly specced machine running Linux out camping and on boat trips "a number of times" (before you complain about me not enjoying nature: I was living this way for a while to save money on rent). Here are the models I like:
Gemma 3, Qwen 2.5 Instruct, Qwen 2.5 Coder
You should take multiple distills/quants. It's good to have a high-quality model sometimes, but for most stuff you'll want something below 1GB for the fast response times. The quality is close to or better than the original ChatGPT, and they support extremely long contexts (if you have the memory). It might be good to take a VLM as well (I've been happy with Qwen's VLM, although it's slow).
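The speed argument above can be sketched with back-of-envelope arithmetic: autoregressive generation is roughly memory-bandwidth bound, so the model's size on disk puts a ceiling on tokens per second. The bandwidth figure below is an assumed round number for an M1 Pro's unified memory, not a measured value:

```python
# Back-of-envelope: generating one token reads roughly the whole model from
# memory, so tokens/sec is capped near bandwidth / model size.
M1_PRO_BANDWIDTH_GBPS = 200  # assumed unified-memory bandwidth, GB/s

def rough_tokens_per_sec(model_size_gb: float) -> float:
    """Optimistic ceiling on generation speed for a given model size."""
    return M1_PRO_BANDWIDTH_GBPS / model_size_gb

print(rough_tokens_per_sec(0.8))  # sub-1GB quant: ~250 tok/s ceiling
print(rough_tokens_per_sec(8.0))  # larger quant: ~25 tok/s ceiling
```

Real throughput is well under these ceilings, but the ratio explains why a sub-1GB quant feels instant while an 8GB one plods.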
- Drblessing 1 month ago
Great minds! Thanks, I'll try those models out. What are Gemma and Qwen? How do they compare to the new DeepSeek models?
- msgodel 1 month ago
You're not running a DeepSeek model on your MacBook.
Gemma is Google's distillation of their larger Gemini model (at least that's my understanding). Qwen is Alibaba's model. Qwen is usually very good at code; Gemma tends to be a little better at everything else.
There are DeepSeek distills that use either Qwen or Gemma as a base. I haven't been impressed with them, though. TBH I've felt like most of the reasoning models are overhyped.
- Drblessing 1 month ago
Cool, I'll try them out and see which I like best. Good to know that the DeepSeek distills are not the move. I'm excited about being able to take pictures of plants/trees/other things and get information.
Any tips or fun ways you used your local model while camping?
- justsomehnguy 1 month ago
> Dear LLM, a grizzly bear attacked me and has already torn my leg apart. What is my best course of action to fend it off? Be terse.
<thinking>
- Drblessing 1 month ago
"Wow! You're really experiencing a close-up shot of nature that most explorers never get to experience.
I'm assuming the grizzly bear has run away -- good for you for fighting it off. The first step is going to be to stop the bleeding. You'll want a rag or a t-shirt for this..."
- TrnsltLife 1 month ago
Super helpful. Like the Subnautica AI.
My advice: bring a friend. Or hire a guide. Or read a survival book.
But I watched a guy's YouTube video about surviving in the wilderness using only tools he could 3D print in-situ. It was entertaining. So I look forward to the post mortem!
- Drblessing 1 month ago
Well, I've watched a few hours of survival content on YouTube, so I think I'm ready for solo backpacking in Ontario.
I will definitely be reading a survival book. Any recommendations?
- justsomehnguy 1 month ago
"Now make sure you have enough supplements to survive the night. Whose black pears of belladonna would make your day."
Even if you somehow find anything to query it about, you would have no way to check whether it's real or just another hallucination, especially on a deeply pruned model.
And off grid? Test how long your Mac would run with the occasional LLM runs.
- Drblessing 1 month ago
I'll just assume everything the model says is true.
- ben_w 1 month ago
Genuine question: why take it with you, running locally on your laptop, instead of accessing any of the big names remotely as an app on your phone?
As for how: I have ollama for trying out local models to see what they can do: https://github.com/ollama/ollama
I've not been impressed with any of the models that can fit in 16 GB (significantly less than 16B parameters), but this is such a fast-moving area that you have to look up some online leaderboard right before your trip and try whatever it says is the new leader that fits your 16GB of RAM. Even a week is long enough to be out of date in this field, so the answer may well be different by the time you return.
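The 16 GB constraint is simple arithmetic: weight memory is roughly parameter count times bytes per parameter at a given quantization. The per-parameter figures below are rough assumptions, and the sketch ignores KV cache and OS overhead:

```python
# Rough weight-memory estimate for a quantized model.
BYTES_PER_PARAM = {"f16": 2.0, "q8": 1.0, "q4": 0.5}  # approx bytes/parameter

def weight_gb(params_billion: float, quant: str) -> float:
    """Approximate GB of RAM needed just for the weights."""
    return params_billion * BYTES_PER_PARAM[quant]

# A 14B model at 4-bit quant is ~7 GB of weights, leaving headroom in 16 GB
# for the KV cache and the OS; the same model at f16 (~28 GB) won't fit.
print(weight_gb(14, "q4"), weight_gb(14, "f16"))
```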
- Drblessing 1 month ago
Thanks, I'll give ollama a go! I don't want to access the internet at all once I hit the road. Also, I may not have cell signal depending on how deep I go.
- gryfft 1 month ago
Depends on your needs. Download ollama to make model management dirt simple, try DeepSeek, and then, for whatever its shortcomings turn out to be for your use case, look for models people like in the space you're specifically working in.
- Drblessing 1 month ago
Thanks! I'll check it out.
- uberman 1 month ago
Go camping and have fun. Leave the LLM and your laptop at home. If you run into some situation where something bad is happening, use your phone to call for help, or talk to ChatGPT if you must.
- Drblessing 1 month ago
To clarify, I'll be off-grid and offline, likely without cell service depending on how deep I go. I want to be able to take photos and ask questions for dealing with unexpected issues.
- scblock 1 month ago
I don't know that this is really a good idea, but if you insist, then LM Studio is a good tool for local LLMs and has nicely formatted output. Smaller 4B models will probably run fine (they do on my M2 Pro, at least). I prefer it to ollama, and it makes tuning the system prompt and temperature easy.
Others will probably have better model recommendations; I am using Mistral and Gemma myself.
- Drblessing 1 month ago
Thanks, I've analyzed this situation from every angle, and I just don't see how relying on LLMs for survival information out in the wilderness could be a bad idea. I'll check out LM Studio. So far it looks much easier than ollama.
- solardev 1 month ago
Man, you must do some extreme camping. When we go, we worry about what kind of marshmallows would be best on s'mores, not what life-or-death situation awaits us that only a heroic LLM can resolve (and hopefully not hallucinate to our deaths).
- Drblessing 1 month ago
Yes. I plan on foraging for mushrooms and eating the ones my open-source vision model tells me are edible.
- jasonthorsness 1 month ago
Starlink lets me text remotely now (T-Mobile beta). OP's problem will disappear in a few years if that or similar constellations can be maintained. For better or for worse.
- msgodel 1 month ago
The Starlink terminal absolutely drinks power. Mechanical refrigeration and the power to run a beefy machine for inference together use less. I know, I have a cabin and power all of this on solar.
- solardev 1 month ago
I don't think the Starlink direct-to-cell texting needs a terminal. The satellites can speak in phone frequencies too, now in a limited trial.
- jasonthorsness 1 month ago
Yes, and LLM chat is very light on bandwidth compared to other uses of the internet. Maybe it could even work over the text message protocol.
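For scale, here is a hypothetical sizing of an LLM reply against concatenated SMS segments. The 153-character payload is the standard GSM-7 concatenated-segment size; the reply length is an assumption:

```python
SEGMENT_CHARS = 153  # payload of one concatenated GSM-7 SMS segment

def sms_segments(reply_chars: int) -> int:
    # Ceiling division: segments needed to carry the whole reply.
    return -(-reply_chars // SEGMENT_CHARS)

# A ~500-token reply is on the order of 2000 characters: a dozen-odd segments.
print(sms_segments(2000))
```

Tiny compared to streaming or web browsing, which is the commenter's point.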
- Drblessing 1 month ago
Cool! I'm purchasing a 200W solar panel. What has your experience with solar been like?
- solardev 1 month ago
Side note: it's much nicer if you have a small battery to act as a reservoir. The solar panel charges the battery, and the battery charges everything else. That way you don't have to leave your phone out in the sun, and a moment's clouds won't cause your devices to stop charging.
Just do the math (between your daily estimated solar panel output and your device consumption)... try not to let the battery drain to under 20% or charge to over 80%.
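That "do the math" step might look like this; every number below is an illustrative assumption, since panel output, sun hours, and per-device consumption all vary:

```python
# Daily solar harvest vs. daily consumption, all figures illustrative.
panel_watts = 200
sun_hours = 4.0                       # full-sun-equivalent hours per day
harvest_wh = panel_watts * sun_hours  # energy into the battery per day

laptop_wh = 70     # roughly one full MacBook charge
phone_wh = 15      # roughly one phone charge
inference_wh = 60  # rough guess for a couple hours of heavy LLM use

daily_use_wh = laptop_wh + phone_wh + inference_wh
print(harvest_wh, daily_use_wh)  # comfortable margin, if the sun cooperates
```

A cloudy day cuts the harvest figure hard, which is exactly why the battery reservoir (and the 20–80% charge window) matters.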
- msgodel 1 month ago
Sounds like a fun experiment. That should be enough to keep your phone charged or maybe power a small laptop.
I like to use boost-style charge controllers and run the panels in parallel, boosting up to 48 volts (nominal). This is optimal for sub-2kW arrays (otherwise you'll start needing impractically large conductors).
Although with a setup that small, it doesn't really matter.
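The conductor-sizing point follows from I = P / V: for a fixed power, raising the bus voltage cuts the current, and resistive loss falls with the square of the current. A quick check at the 2 kW figure mentioned above:

```python
# Current drawn at a given power and bus voltage: I = P / V.
def amps(power_w: float, volts: float) -> float:
    return power_w / volts

# 2 kW at 12 V nominal is ~167 A (impractically thick cable);
# boosted to 48 V it's ~42 A, which ordinary conductors can handle.
print(amps(2000, 12), amps(2000, 48))
```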
- Drblessing 1 month ago
Thanks, I don't want to be connected to the internet. I want to be off-grid.