also, i’m currently running the opencode + kimi 2.5 + fireworks stack, but i’m realizing that once i get the mac mini i could probably expose small ollama models and point claude at different local models for tasks that don’t need a bunch of horsepower – what does your stack currently look like?
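for what it’s worth, the routing idea could be sketched roughly like this – note the model names, the “needs horsepower” heuristic, and the hosted endpoint are all made-up placeholders, not anything from my actual setup:

```python
# hypothetical sketch: route prompts between a small local ollama model
# and a bigger hosted model based on a rough "does this need horsepower" check.
# model names and the threshold below are assumptions for illustration.

LOCAL_MODEL = "qwen2.5:3b"   # small model served by ollama on the mac mini
REMOTE_MODEL = "kimi-k2"     # heavyweight hosted model (placeholder name)

HEAVY_HINTS = ("refactor", "architecture", "debug", "design", "plan")

def pick_model(prompt: str) -> str:
    """Send short, simple prompts to the local model; everything else goes remote."""
    needs_power = len(prompt) > 400 or any(h in prompt.lower() for h in HEAVY_HINTS)
    return REMOTE_MODEL if needs_power else LOCAL_MODEL

def endpoint_for(model: str) -> str:
    # ollama's default HTTP API listens on localhost:11434
    if model == LOCAL_MODEL:
        return "http://localhost:11434/api/generate"
    return "https://api.example.com/v1/chat"  # placeholder hosted endpoint

if __name__ == "__main__":
    print(pick_model("rename this variable"))  # short + simple -> local
    print(pick_model("refactor the auth module and plan the migration"))  # -> remote
```

the real work would be in tuning that heuristic (or letting the agent classify its own subtasks), but the plumbing is basically just picking an endpoint per request.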