News
A 25-year-old computer just ran a modern AI model, proving that cutting-edge tech doesn't always need cutting-edge hardware.
Explore the new AI model from Microsoft designed to run efficiently on a CPU, ensuring powerful performance without a GPU.
ExtremeTech on MSN: Microsoft's New Compact 1-Bit LLM Needs Just 400MB of Memory
Microsoft's new large language model (LLM) puts significantly less strain on hardware than other LLMs, and it's free to experiment with. The 1-bit LLM (1.58-bit, to be more precise) uses -1, 0, and 1 ... using absmean quantisation during the forward pass. BitNet b1.58 2B4T is also an open-source model that requires only 0.4 GB of memory, whereas other similarly sized models typically require ...
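The absmean quantisation mentioned above can be sketched in a few lines: scale each weight matrix by its mean absolute value, then round and clip to the nearest ternary level. This is a minimal illustration, not Microsoft's implementation; the function name and the epsilon guard are assumptions.

```python
import numpy as np

def absmean_quantize(w: np.ndarray) -> tuple[np.ndarray, float]:
    """Quantize a weight matrix to ternary {-1, 0, 1} via absmean scaling:
    divide by the mean absolute value, then round and clip to [-1, 1]."""
    gamma = np.abs(w).mean() + 1e-8            # absmean scale (epsilon avoids divide-by-zero)
    w_q = np.clip(np.round(w / gamma), -1, 1)  # ternary weight matrix
    return w_q, gamma                          # gamma is kept to rescale outputs

# Example: quantize a small random weight matrix
w = np.random.randn(4, 4).astype(np.float32)
w_q, gamma = absmean_quantize(w)
assert set(np.unique(w_q)).issubset({-1.0, 0.0, 1.0})
```

Because every quantized weight is -1, 0, or 1, the forward pass can replace multiplications with additions and subtractions, which is where the efficiency claims come from.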
To run such a model, the team created a runtime environment for it. The new environment is called bitnet.cpp and was designed to make the best use of the 1-bit architecture. If the claims made by the ...
Gadget Review on MSN: BitNet: Microsoft's Compact AI Challenges Industry Giants with Radical Efficiency
Microsoft's BitNet challenges industry norms with a minimalist approach, using ternary weights that require just 400MB of memory while performing competitively against larger models on standard ...
What just happened? Microsoft has introduced BitNet b1.58 2B4T, a new type of large language model engineered for exceptional efficiency. Unlike conventional AI models that rely on 16- or 32-bit ...
Dubbed BitNet b1.58, the model uses just 1.58 bits per weight, the number of bits (log2 3 ≈ 1.58) needed to encode three possible values. It is trained from scratch ("natively") rather than being ...
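The 1.58-bit figure is simply log2(3), the information content of a three-valued weight. A quick back-of-envelope check (assuming ideal bit-packing of the weights alone, ignoring activations, embeddings, and runtime overhead) lines up with the reported ~400 MB footprint for a 2-billion-parameter model:

```python
import math

# Information content of a ternary weight: log2(3) bits
bits_per_weight = math.log2(3)             # ≈ 1.58

# Rough weight-storage estimate for a 2-billion-parameter model
params = 2_000_000_000
weight_bytes = params * bits_per_weight / 8
print(f"{bits_per_weight:.2f} bits/weight, ~{weight_bytes / 1e9:.2f} GB for weights")
```

Running this prints roughly 0.40 GB, consistent with the 0.4 GB / 400 MB figures quoted in the coverage above.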
Microsoft’s model BitNet b1.58 2B4T is available on Hugging Face, but it doesn’t run on GPUs and requires a dedicated framework (bitnet.cpp).