News

The BitNet b1.58 2B4T model was developed by Microsoft's General Artificial Intelligence group and contains two billion parameters – internal values that enable the model to ...
Memory requirements are the most obvious advantage of reducing the precision of a model's internal weights. The BitNet b1.58 ...
Microsoft's BitNet b1.58 2B4T model is available on Hugging Face, but its efficiency gains require Microsoft's custom bitnet.cpp framework, which currently targets CPUs rather than GPUs.
Microsoft's BitNet challenges industry norms with a minimalist approach, using ternary weights that require just 400MB of memory.
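The 400MB figure lines up with simple arithmetic: a ternary weight takes one of three values {-1, 0, +1}, which needs about log2(3) ≈ 1.58 bits, versus 16 bits for a conventional half-precision weight. The sketch below is a back-of-the-envelope check, not code from Microsoft; real runtimes such as bitnet.cpp pack the weights differently, so the actual footprint varies slightly.

```python
import math

params = 2_000_000_000          # ~2 billion weights in BitNet b1.58 2B4T
bits_fp16 = 16                  # conventional half-precision storage
bits_ternary = math.log2(3)     # {-1, 0, +1} -> ~1.58 bits per weight

fp16_mb = params * bits_fp16 / 8 / 1e6
ternary_mb = params * bits_ternary / 8 / 1e6

print(f"fp16 weights:    ~{fp16_mb:,.0f} MB")     # ~4,000 MB
print(f"ternary weights: ~{ternary_mb:,.0f} MB")  # ~400 MB, matching the reported figure
```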
Microsoft's new BitNet b1.58 model significantly reduces memory and energy requirements while matching the capabilities of full-precision models of similar size.
Explore the new AI model from Microsoft designed to run efficiently on a CPU, ensuring powerful performance without a GPU.
Microsoft’s General Artificial Intelligence group has introduced a groundbreaking large language model (LLM) that drastically ...
Microsoft put BitNet b1.58 2B4T on Hugging Face, a collaboration platform for the AI community. “We introduce BitNet b1.58 2B4T, the first open-source, native 1-bit Large Language Model (LLM) at the 2-billion parameter scale.”
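For readers who want to try the Hugging Face release, here is a minimal loading sketch using the transformers library. The repo id "microsoft/bitnet-b1.58-2B-4T" and the bfloat16 setting are assumptions taken from the public model card rather than from the coverage above, and a transformers build recent enough to include the BitNet architecture is assumed; note that running through transformers alone does not deliver the speed and memory savings, which require the bitnet.cpp runtime.

```python
# Minimal sketch: load BitNet b1.58 2B4T from Hugging Face and generate a few tokens.
# Assumptions: repo id "microsoft/bitnet-b1.58-2B-4T" and a transformers version
# that supports the BitNet architecture.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/bitnet-b1.58-2B-4T"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

prompt = "1-bit language models are"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```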
A group of computer scientists at Microsoft Research, working with a colleague from the University of Chinese Academy of Sciences, ...