AI Models

  • Phi-4 · Open

    Microsoft · 16K tokens · self-host

    Best for: Edge deployment, STEM tasks, embedded AI in products

    How: ollama run phi4. MIT license — embed in commercial products freely.

    Example: Embed in a CI pipeline to validate config files and Terraform plans.
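    A minimal sketch of that CI step, assuming a local Ollama server on its default port (11434) and a hypothetical `check_config` helper; the prompt wording and the PASS/FAIL parsing convention are illustrative, not part of any official tooling.

    ```python
    import json
    import sys
    import urllib.request

    # Ollama's default non-streaming generate endpoint (assumed running locally).
    OLLAMA_URL = "http://localhost:11434/api/generate"

    def build_prompt(config_text: str) -> str:
        """Ask for a strict PASS/FAIL verdict so CI can parse the reply."""
        return (
            "Review the following config file or Terraform plan for obvious "
            "misconfigurations. Reply with exactly 'PASS' or 'FAIL: <reason>'.\n\n"
            + config_text
        )

    def parse_verdict(model_output: str) -> bool:
        """True if the model's reply starts with PASS (case-insensitive)."""
        return model_output.strip().upper().startswith("PASS")

    def check_config(config_text: str) -> bool:
        """One-shot call to the local phi4 model via the Ollama REST API."""
        payload = json.dumps({
            "model": "phi4",
            "prompt": build_prompt(config_text),
            "stream": False,
        }).encode()
        req = urllib.request.Request(
            OLLAMA_URL, data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)
        return parse_verdict(body["response"])  # full completion text

    if __name__ == "__main__":
        ok = check_config(open(sys.argv[1]).read())
        sys.exit(0 if ok else 1)  # non-zero exit fails the CI job
    ```

    A wrapper like this can run as a plain step in any CI system; the strict reply format keeps the gate deterministic even though the review itself is model-generated.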

    GPQA Diamond: 56.2% · MATH: 80.4%
    14B params · STEM reasoning · MIT license · runs on laptop
    Hardware to self-host
    VRAM: 9GB (quantized) / 28GB (FP16)
    GPU: Any 8GB+ GPU (RTX 3060, laptop 4050, etc.)
    RAM: 16GB system RAM

    14B dense. Runs locally on most developer laptops with quantization.
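    The VRAM figures follow from back-of-envelope arithmetic: 2 bytes per weight at FP16, roughly 4–5 bits per weight for a 4-bit quant (the exact quantized size and runtime overhead depend on the scheme, so treat these as estimates):

    ```python
    params = 14e9  # 14B parameters

    fp16_gb = params * 2 / 1e9       # 2 bytes per weight at FP16
    q4_gb = params * 4.5 / 8 / 1e9   # ~4.5 bits/weight for Q4_K-style quants

    print(f"FP16: {fp16_gb:.0f} GB")  # 28 GB
    print(f"Q4:   {q4_gb:.1f} GB")    # ~7.9 GB for weights alone
    ```

    Weights alone at 4-bit land near 8 GB; the listed 9 GB leaves headroom for the KV cache and runtime overhead, which is why any 8GB+ GPU is a tight but workable fit.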

    API: Ollama, Hugging Face, Azure AI