AI Models
- Phi-4 (Open)
Microsoft · 16K tokens · self-host
Best for: Edge deployment, STEM tasks, embedded AI in products
How: ollama run phi4. MIT-licensed, so it can be embedded in commercial products freely.
Example: Embed in a CI pipeline to validate config files and Terraform plans.
Benchmarks: GPQA Diamond 56.2% · MATH 80.4%
14B params · STEM reasoning · MIT license · runs on laptop
Hardware to self-host: VRAM 9GB (quantized) / 28GB (FP16) · GPU: any 8GB+ GPU (RTX 3060, laptop 4050, etc.) · RAM: 16GB system RAM
14B dense. Runs locally on most developer laptops with quantization.
API: Ollama, Hugging Face, Azure AI
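The CI use case above can be sketched as a shell step that pipes a rendered Terraform plan through a local Phi-4 via Ollama and gates the build on the verdict. This is a sketch under assumptions: the model tag (phi4), the prompt wording, and the helper name verdict_to_exit are all illustrative, not part of any official tooling.

```shell
# Sketch of a CI gate around a locally hosted Phi-4 (assumptions: Ollama
# daemon is running, "phi4" has been pulled, plan rendered with terraform).

# Turn the model's verdict line into a CI exit status. Pure shell, no model
# call, so the gating logic itself is testable in isolation.
verdict_to_exit() {
  case "$1" in
    PASS*) return 0 ;;   # model judged the plan safe
    *)     return 1 ;;   # FAIL or anything unexpected blocks the pipeline
  esac
}

# In the pipeline itself (hypothetical step, requires Ollama + phi4):
#   review=$(terraform show -no-color plan.out \
#     | ollama run phi4 "Reply PASS or FAIL on the first line, then reasons. \
#       Review this Terraform plan for risky changes:")
#   verdict_to_exit "$review" || { echo "$review"; exit 1; }
```

Keeping the verdict parsing separate from the model call means the pipeline fails closed: any malformed or empty model reply is treated as FAIL.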