The Qwen2 report describes the improvements made over its predecessor, Qwen1.5. The Qwen2 series comprises five models, ranging in size from 0.5B to 72B parameters. The larger models support context lengths of up to 128K tokens and have been shown to perform well on multilingual tasks. Overall, the Qwen2 series offers significant performance gains over Qwen1.5.