🌊 LiquidAI LFM2-8B Chat
Chat with the LiquidAI LFM2-8B-A1B model using ZeroGPU. This is a hybrid Mixture-of-Experts (MoE) model with 8.3B total parameters, of which 1.5B are active per token, optimized for edge AI deployment.
💡 Tip: The first response may take a moment while a GPU is allocated. The model excels at:
- Instruction following
- Math and reasoning
- Multi-turn conversations
- Agentic tasks and data extraction
⚠️ Note: This model is best suited for narrow use cases. It may not perform well on knowledge-intensive tasks.
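If you would rather query the model programmatically than through this Space, a minimal local sketch with `transformers` is shown below. The repo id `LiquidAI/LFM2-8B-A1B` and the chat-template usage are assumptions based on this description, and the helper name `build_messages` is hypothetical; loading the checkpoint requires substantial RAM or a GPU, so it is kept behind `main`.

```python
def build_messages(user_prompt, history=None):
    """Assemble an OpenAI-style multi-turn message list for the chat template."""
    messages = list(history or [])
    messages.append({"role": "user", "content": user_prompt})
    return messages


def generate(user_prompt, max_new_tokens=512, temperature=0.3):
    # Heavy imports and the model download happen only when generation is requested.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "LiquidAI/LFM2-8B-A1B"  # assumed Hugging Face repo id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    # Format the conversation with the model's own chat template.
    inputs = tokenizer.apply_chat_template(
        build_messages(user_prompt),
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)

    output = model.generate(
        inputs,
        max_new_tokens=max_new_tokens,
        temperature=temperature,
        do_sample=True,
    )
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Extract the date from: 'The meeting is on 2024-05-01.'"))
```

Multi-turn use works the same way: pass the prior turns as `history` so the chat template sees the full conversation.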