mastodontech.de is one of many independent Mastodon servers you can use to participate in the fediverse.
Open to everyone (over 16) and provided by Markus'Blog

Server stats:

1.4K
active profiles

#ollama

7 posts, 6 participants, 0 posts today
Ivan Todorov<p>Using Home Assistant OS, I wired up Whisper for speech recognition, Piper for voice responses, and Ollama for the LLM brain — all running on my own machines, stitched together with the Wyoming protocol. I could literally talk to my smart home, and it talked back. All offline, all private.</p><p><a href="https://mastodon.social/tags/SelfHosting" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>SelfHosting</span></a> <a href="https://mastodon.social/tags/HomeAssistant" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HomeAssistant</span></a> <a href="https://mastodon.social/tags/VoiceAssistant" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VoiceAssistant</span></a> <a href="https://mastodon.social/tags/LocalAI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LocalAI</span></a> <a href="https://mastodon.social/tags/Whisper" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Whisper</span></a> <a href="https://mastodon.social/tags/Piper" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Piper</span></a> <a href="https://mastodon.social/tags/Ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Ollama</span></a> <a href="https://mastodon.social/tags/HomeLab" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HomeLab</span></a> <a href="https://mastodon.social/tags/TechDIY" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TechDIY</span></a></p>
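The "LLM brain" leg of a pipeline like this talks to Ollama's standard HTTP API. A minimal Python sketch, assuming a local Ollama server on its default port 11434 and a model such as `llama3.2` already pulled (the model name here is illustrative, not from the post):

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint

def build_chat_request(model: str, user_text: str) -> dict:
    """Build the JSON body for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
        "stream": False,  # ask for one complete reply instead of a token stream
    }

def ask(model: str, user_text: str) -> str:
    """Send a prompt to a locally running Ollama server and return the reply text."""
    req = request.Request(
        OLLAMA_URL,
        data=json.dumps(build_chat_request(model, user_text)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```

In the setup described above, Home Assistant's Wyoming integration would sit in front of a call like `ask()`, feeding it Whisper's transcription and handing the reply to Piper for speech output.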
Ivan Todorov<p>Sometime back, I got obsessed with the idea of running my own voice assistant — fully local, no Google, no Amazon, no "cloud intelligence." Just me, my wife and our server(s)... </p><p>So I built Marvin.</p><p>👇</p><p><a href="https://mastodon.social/tags/SelfHosting" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>SelfHosting</span></a> <a href="https://mastodon.social/tags/HomeAssistant" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HomeAssistant</span></a> <a href="https://mastodon.social/tags/VoiceAssistant" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>VoiceAssistant</span></a> <a href="https://mastodon.social/tags/LocalAI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LocalAI</span></a> <a href="https://mastodon.social/tags/Whisper" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Whisper</span></a> <a href="https://mastodon.social/tags/Piper" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Piper</span></a> <a href="https://mastodon.social/tags/Ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Ollama</span></a> <a href="https://mastodon.social/tags/HomeLab" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>HomeLab</span></a> <a href="https://mastodon.social/tags/TechDIY" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>TechDIY</span></a></p>
Markus Eisele<p>Tracing the Mind of Your AI: Java Observability with Quarkus and LangChain4j<br>Instrument, trace, and monitor your AI-powered Java apps using local LLMs, Ollama, and OpenTelemetry with zero boilerplate. <br><a href="https://myfear.substack.com/p/java-ai-observability-quarkus-langchain4j" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">myfear.substack.com/p/java-ai-</span><span class="invisible">observability-quarkus-langchain4j</span></a><br><a href="https://mastodon.online/tags/Java" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Java</span></a> <a href="https://mastodon.online/tags/LangChain4j" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LangChain4j</span></a> <a href="https://mastodon.online/tags/Ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Ollama</span></a> <a href="https://mastodon.online/tags/OpenTelemetry" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>OpenTelemetry</span></a></p>
Thomas Fricke (he/him)<p><span class="h-card" translate="no"><a href="https://chaosfurs.social/@the_cjuty" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>the_cjuty</span></a></span> <br><span class="h-card" translate="no"><a href="https://ruhr.social/@andreclaassen" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>andreclaassen</span></a></span> <br><span class="h-card" translate="no"><a href="https://bildung.social/@ralf_weinert" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>ralf_weinert</span></a></span> </p><p>Imho this looks like an <a href="https://23.social/tags/ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ollama</span></a> into which you can fax, er, I mean upload, your own documents as PDFs.</p><p>Read the docs on Opencode <a href="https://23.social/tags/rtfm" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>rtfm</span></a> </p><p>It runs locally if you have enough resources. </p><p>Open questions:</p><p>Is this <a href="https://23.social/tags/foss" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>foss</span></a> <a href="https://23.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a>? The license discussions at the summit in Vienna were tough, bordering on unacceptable!</p><p>What's in it? Where does the data come from? What about the digital and human supply chain?</p><p><a href="https://23.social/tags/clickworker" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>clickworker</span></a>? An <a href="https://23.social/tags/adolfalpha" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>adolfalpha</span></a> problem?</p><p>Take a look 🙏</p>
Markus Eisele<p>Smart Local AI Routing in Java: Build a Hybrid LLM Gateway with Quarkus and Ollama<br>Use LangChain4j, semantic embeddings, and Quarkus to route prompts to the best local LLM for coding, summarization, or chat <br><a href="https://myfear.substack.com/p/smart-local-llm-routing-quarkus-java-ollama" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">myfear.substack.com/p/smart-lo</span><span class="invisible">cal-llm-routing-quarkus-java-ollama</span></a><br><a href="https://mastodon.online/tags/Java" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Java</span></a> <a href="https://mastodon.online/tags/Quarkus" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Quarkus</span></a> <a href="https://mastodon.online/tags/Ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Ollama</span></a> <a href="https://mastodon.online/tags/Langchain4j" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Langchain4j</span></a> <a href="https://mastodon.online/tags/SemanticRouting" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>SemanticRouting</span></a></p>
Jens Comiotto-Mayer<p><span class="h-card" translate="no"><a href="https://mastodon.online/@treibholz" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>treibholz</span></a></span> Don't get me wrong - I use <a href="https://norden.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> myself a lot, and I have <a href="https://norden.social/tags/Ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Ollama</span></a>, <a href="https://norden.social/tags/OpenWebUI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>OpenWebUI</span></a>, and <a href="https://norden.social/tags/Goose" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Goose</span></a> installed locally. I attend agentic coding meetups to discuss the latest developments and how to stay broke by burning your Claude tokens, fighting about whether you should keep your agents on a leash or YOLO your way into production. Nevertheless, I fear we risk losing more than we gain if AI replaces what makes human interaction rewarding. Social animals need the struggle (and the joy) to grow.</p>
Markus Eisele<p>Mastering AI Tool-Calling with Java: Build Your Own Dungeon Master with Quarkus and LangChain4j. Turn a local LLM into a dice-rolling, decision-making RPG game master. Powered by Java, Quarkus, and the magic of LangChain4j. <br><a href="https://myfear.substack.com/p/ai-dungeon-master-quarkus-langchain4j-java" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">myfear.substack.com/p/ai-dunge</span><span class="invisible">on-master-quarkus-langchain4j-java</span></a><br><a href="https://mastodon.online/tags/Java" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Java</span></a> <a href="https://mastodon.online/tags/Quarkus" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Quarkus</span></a> <a href="https://mastodon.online/tags/LangChain4j" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LangChain4j</span></a> <a href="https://mastodon.online/tags/Game" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Game</span></a> <a href="https://mastodon.online/tags/RPG" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>RPG</span></a> <a href="https://mastodon.online/tags/llm" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>llm</span></a> <a href="https://mastodon.online/tags/Ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Ollama</span></a></p>
Evan Hahn<p>In an apocalypse scenario, is it better to have a local LLM or offline Wikipedia? <a href="https://evanhahn.com/local-llms-versus-offline-wikipedia/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">evanhahn.com/local-llms-versus</span><span class="invisible">-offline-wikipedia/</span></a></p><p><a href="https://bigshoulders.city/tags/llm" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>llm</span></a> <a href="https://bigshoulders.city/tags/LLMs" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LLMs</span></a> <a href="https://bigshoulders.city/tags/ai" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ai</span></a> <a href="https://bigshoulders.city/tags/wikipedia" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>wikipedia</span></a> <a href="https://bigshoulders.city/tags/kiwix" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>kiwix</span></a> <a href="https://bigshoulders.city/tags/ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ollama</span></a></p>
Not AI 👾<p><span class="h-card" translate="no"><a href="https://t3n.social/@t3n" class="u-url mention" rel="nofollow noopener" target="_blank">@<span>t3n</span></a></span> I just read <a href="https://techhub.social/tags/Newromancer" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Newromancer</span></a> in an open <a href="https://techhub.social/tags/ebookreader" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ebookreader</span></a>. With my local <a href="https://techhub.social/tags/Ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Ollama</span></a> <a href="https://techhub.social/tags/Ki" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Ki</span></a>, words and meanings are one finger tap away. That's the level of AI I want. I don't think I'll keep using <a href="https://techhub.social/tags/Netflix" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Netflix</span></a> or other streaming services forever. Thanks to <a href="https://techhub.social/tags/koreader" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>koreader</span></a> and other open ebook readers, handling and reading ebooks is simple and safe. <br>And let's be honest, most films are based on <a href="https://techhub.social/tags/B%C3%BCchern" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Büchern</span></a> (books). You can see that yourself, right?</p>
Pascal Leinert<p>My new server can now even handle <a href="https://social.pascal-leinert.de/tags/Ollama" rel="nofollow noopener" target="_blank">#Ollama</a> via <a href="https://social.pascal-leinert.de/tags/OpenWebUI" rel="nofollow noopener" target="_blank">#OpenWebUI</a><span>, although I'd say 4B is pretty much the maximum: it all still runs on the CPU (which is probably stronger than the integrated Vega 7 anyway), and depending on what you do, an answer can easily take about a minute. It can analyze images, too. And the box doesn't even get warm.<br><br></span><a href="https://social.pascal-leinert.de/tags/KI" rel="nofollow noopener" target="_blank">#KI</a> <a href="https://social.pascal-leinert.de/tags/AI" rel="nofollow noopener" target="_blank">#AI</a></p>
Markus Eisele<p>Build a Smart Credit Card Validator with Quarkus, Langchain4j, and Ollama<br>Validate cards like a bank, talk like a human. <br><a href="https://myfear.substack.com/p/smart-credit-card-validator-quarkus-langchain4j-ollama" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">myfear.substack.com/p/smart-cr</span><span class="invisible">edit-card-validator-quarkus-langchain4j-ollama</span></a><br><a href="https://mastodon.online/tags/Java" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Java</span></a> <a href="https://mastodon.online/tags/Langchain4j" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Langchain4j</span></a> <a href="https://mastodon.online/tags/Ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Ollama</span></a> <a href="https://mastodon.online/tags/CreditCardValidator" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CreditCardValidator</span></a> <a href="https://mastodon.online/tags/AiAgent" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AiAgent</span></a></p>
Markus Eisele<p>From Black Box to Blueprint: Tracing Every LLM Decision with Quarkus<br>Build trust, traceability, and visual insight into your AI-powered Java apps using LangChain4j, Ollama, and CDI interceptors. <br><a href="https://myfear.substack.com/p/llm-observability-quarkus-langchain4j" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">myfear.substack.com/p/llm-obse</span><span class="invisible">rvability-quarkus-langchain4j</span></a><br><a href="https://mastodon.online/tags/Java" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Java</span></a> <a href="https://mastodon.online/tags/Quarkus" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Quarkus</span></a> <a href="https://mastodon.online/tags/LLM" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LLM</span></a> <a href="https://mastodon.online/tags/Observability" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Observability</span></a> <a href="https://mastodon.online/tags/Traces" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Traces</span></a> <a href="https://mastodon.online/tags/CDI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>CDI</span></a> <a href="https://mastodon.online/tags/Ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Ollama</span></a> <a href="https://mastodon.online/tags/LangChain4j" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LangChain4j</span></a></p>
openSUSE Linux<p>Want to run powerful <a href="https://fosstodon.org/tags/LLMs" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LLMs</span></a> locally on <a href="https://fosstodon.org/tags/openSUSE" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>openSUSE</span></a> Tumbleweed? With <a href="https://fosstodon.org/tags/Ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Ollama</span></a>, it's just a one-line install. Privacy ✅ Offline Access ✅ Customization ✅ This article can get you started and bring <a href="https://fosstodon.org/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> to your own machine! <a href="https://news.opensuse.org/2025/07/12/local-llm-with-openSUSE/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">news.opensuse.org/2025/07/12/l</span><span class="invisible">ocal-llm-with-openSUSE/</span></a></p>
Erik C. Thauvin<p>Implement RAG With PGVector, LangChain4j and&nbsp;Ollama</p><p><a href="https://mastodon.social/tags/ai" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ai</span></a> <a href="https://mastodon.social/tags/java" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>java</span></a> <a href="https://mastodon.social/tags/langchain4j" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>langchain4j</span></a> <a href="https://mastodon.social/tags/ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ollama</span></a> <a href="https://mastodon.social/tags/pgvector" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>pgvector</span></a> <a href="https://mastodon.social/tags/rag" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>rag</span></a></p><p><a href="https://mydeveloperplanet.com/2025/01/22/implement-rag-with-pgvector-langchain4j-and-ollama/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">mydeveloperplanet.com/2025/01/</span><span class="invisible">22/implement-rag-with-pgvector-langchain4j-and-ollama/</span></a></p>
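The retrieval step at the heart of a RAG setup like the one in that article can be sketched in a few lines. This is a toy Python version with made-up three-dimensional embeddings; the article itself does this in Java with LangChain4j, a real Ollama embedding model, and PGVector as the store:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "vector store": (text, embedding) pairs. In a real setup these rows
# live in PGVector and the embeddings come from an embedding model.
STORE = [
    ("Ollama serves local LLMs over HTTP.", [0.9, 0.1, 0.0]),
    ("PGVector stores embeddings in Postgres.", [0.1, 0.9, 0.1]),
    ("LangChain4j wires LLMs into Java apps.", [0.0, 0.2, 0.9]),
]

def retrieve(query_vec, k=1):
    """Return the k stored texts most similar to the query embedding."""
    ranked = sorted(STORE, key=lambda doc: cosine(query_vec, doc[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

def build_prompt(question, query_vec):
    """Augment the question with retrieved context before sending it to the LLM."""
    context = "\n".join(retrieve(query_vec))
    return f"Answer using this context:\n{context}\n\nQuestion: {question}"
```

The augmented prompt from `build_prompt` is what finally goes to the chat model; the retrieval keeps the model grounded in your own documents rather than its training data.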
Martin 🇪🇺<p>"Watching it think" takes on a whole new meaning here. 🫣 <a href="https://norden.social/tags/homelab" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>homelab</span></a> <a href="https://norden.social/tags/ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ollama</span></a> <a href="https://norden.social/tags/ai" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ai</span></a> <a href="https://norden.social/tags/genai" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>genai</span></a></p>
Sebastian :coffefied:<p>Just for the kicks, I’ve deployed <a href="https://mstdn.social/tags/Ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Ollama</span></a> and OpenWebUI via Docker and I’ve made my small 8G <a href="https://mstdn.social/tags/RaspberryPi" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>RaspberryPi</span></a> run the tiny <a href="https://mstdn.social/tags/Gwen3" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Gwen3</span></a> 0.6b <a href="https://mstdn.social/tags/AI" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>AI</span></a> model…</p><p>It’s fucking slow, pinning the CPU to 100% even if it’s doing nothing. That’s for a bot you can chat with…😂</p><p>It’s beyond my comprehension why you would even run this shit locally, or remotely for that matter. Even if it’s faster on stronger hardware, it takes more power. The results would reach you quicker and cost more, but be just as useless as from that Pi…😂</p>
Philip Thomas<p>Hosting AI locally: <a href="https://mastodon.social/tags/LLM" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LLM</span></a> on your own server for <a href="https://mastodon.social/tags/n8n" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>n8n</span></a></p><p><a href="https://www.youtube.com/watch?v=u5318AiO4bw" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="">youtube.com/watch?v=u5318AiO4bw</span><span class="invisible"></span></a></p><p><a href="https://mastodon.social/tags/ki" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ki</span></a> <a href="https://mastodon.social/tags/kiagent" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>kiagent</span></a> <a href="https://mastodon.social/tags/openai" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>openai</span></a> <a href="https://mastodon.social/tags/chatgpt" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>chatgpt</span></a> <a href="https://mastodon.social/tags/ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>ollama</span></a> <a href="https://mastodon.social/tags/llama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>llama</span></a> <a href="https://mastodon.social/tags/docker" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>docker</span></a></p>
january1073<p>Check it out, try it (ethically!), and if you like it, leave a star. And if you feel brave enough ... 🦖 ... fork &amp; contribute.<br><a href="https://medium.com/bugbountywriteup/darkmailr-generate-realistic-context-aware-phishing-emails-air-gapped-d3cc88457dab" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">medium.com/bugbountywriteup/da</span><span class="invisible">rkmailr-generate-realistic-context-aware-phishing-emails-air-gapped-d3cc88457dab</span></a><br>Made with 🫶 for the cybersecurity community.<br><a href="https://infosec.exchange/tags/Phishing" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Phishing</span></a> <a href="https://infosec.exchange/tags/SocialEngineering" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>SocialEngineering</span></a> <a href="https://infosec.exchange/tags/Ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Ollama</span></a> <a href="https://infosec.exchange/tags/Flask" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Flask</span></a> <a href="https://infosec.exchange/tags/OSS" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>OSS</span></a></p>
Nele Hirsch (ebildungslabor)<p>I've written up how I set up (at Hetzner, with <a href="https://fedilab.de/tags/Ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Ollama</span></a> and Open WebUI, plus an interface via <a href="https://fedilab.de/tags/Openrouter" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Openrouter</span></a>) my own AI environment with proprietary and <a href="https://fedilab.de/tags/OpenSource" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>OpenSource</span></a> models.<br><a href="https://ebildungslabor.de/blog/so-gestaltest-du-dir-einen-eigenen-ki-server/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">ebildungslabor.de/blog/so-gest</span><span class="invisible">altest-du-dir-einen-eigenen-ki-server/</span></a></p><p><a href="https://fedilab.de/tags/FediLZ" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>FediLZ</span></a></p>
Jens Comiotto-Mayer<p>("Budget models" like llama3.2 are then no longer really an issue, though of course fairly limited in what they can be used for…) (3/3) <a href="https://norden.social/tags/LLM" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>LLM</span></a> <a href="https://norden.social/tags/Ollama" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Ollama</span></a></p>