# Artificial Intelligence and LLMs
This section covers the use of Artificial Intelligence, specifically Large Language Models (LLMs), in DevOps and infrastructure environments.
## 🤖 What you'll find here
- LLM Fundamentals: Architecture, basic concepts and DevOps use cases
- Local tools: Ollama, LM Studio, LLaMA.cpp for running models locally
- Infrastructure integration: Kubernetes deployment, optimized storage, networking
- Testing methodologies: Benchmarks, evaluation and prompt engineering
- Practical cases: Chatbots, log analysis, IaC automation (a log-analysis sketch follows this list)
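
The local-tools and practical-cases topics meet in the short example below: a minimal sketch that sends a few log lines to a locally running Ollama server and asks the model to flag anomalies. The endpoint, the model name `llama3`, and the sample log lines are assumptions for illustration; adapt them to whatever model you have pulled.

```python
# Minimal sketch: ask a locally running Ollama model to flag problems in logs.
# Assumes `ollama serve` is listening on the default port (11434) and that a
# model such as "llama3" has already been pulled with `ollama pull llama3`.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "llama3"  # assumption: any locally pulled model name works here

log_lines = [
    "2024-05-01T10:00:01Z kubelet: readiness probe failed for pod web-7f9c",
    "2024-05-01T10:00:02Z nginx: 200 GET /healthz",
    "2024-05-01T10:00:05Z oom-killer: killed process 1234 (java)",
]

prompt = (
    "You are a DevOps assistant. Review these log lines and list any that "
    "indicate a problem, with a one-line explanation each:\n"
    + "\n".join(log_lines)
)

payload = json.dumps({"model": MODEL, "prompt": prompt, "stream": False}).encode()
req = urllib.request.Request(
    OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
)

with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read())

# With streaming disabled, the full completion arrives in the "response" field.
print(body["response"])
```

The same request pattern carries over to the chatbot and IaC-automation cases; only the prompt and the post-processing of the response change.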
## 🚀 Quick start
If you're new to LLMs, start with:

1. Introduction to LLMs - Basic concepts
2. Ollama: getting started - Your first installation
3. Model evaluation - How to measure performance (see the measurement sketch below)
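
As a companion to step 3, here is a minimal sketch of what a first performance measurement might look like, again assuming a local Ollama server and a pulled model (`llama3` is a placeholder). It records end-to-end latency with a wall clock and derives tokens per second from the `eval_count` and `eval_duration` fields that Ollama's `/api/generate` endpoint reports (durations in nanoseconds).

```python
# Minimal sketch: first-pass throughput measurement against a local Ollama server.
import json
import time
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "llama3"  # assumption: replace with whichever model you pulled

payload = json.dumps({
    "model": MODEL,
    "prompt": "Explain blue-green deployments in two sentences.",
    "stream": False,
}).encode()

start = time.perf_counter()
req = urllib.request.Request(
    OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
)
with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read())
latency = time.perf_counter() - start

tokens = body.get("eval_count", 0)
eval_seconds = body.get("eval_duration", 0) / 1e9  # nanoseconds -> seconds

print(f"End-to-end latency: {latency:.2f}s")
if eval_seconds:
    print(f"Generation speed: {tokens / eval_seconds:.1f} tokens/s")
```

A single prompt is only a smoke test; proper benchmarks repeat this over a prompt set and report the distribution, which is what the evaluation pages cover.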