google/gemma-2-2b - 4b_2n4m_128bs Compression

This is a compressed model using deltazip.

Paper, Compression Tool, Inference Engine (coming soon).

Compression Configuration

  • Base Model: google/gemma-2-2b
  • Compression Scheme: 4b_2n4m_128bs
  • Dataset: HuggingFaceH4/ultrachat_200k
  • Dataset Split: train_sft
  • Max Sequence Length: 2048
  • Number of Samples: 256
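The scheme name above can be read as 4-bit weights, 2:4 (N:M) structured sparsity, and a 128-weight quantization block size. As a rough, hypothetical sketch of those two ideas (this is not the deltazip implementation, just an illustration of the naming):

```python
# Hypothetical sketch of what "4b_2n4m_128bs" plausibly denotes:
# 2:4 structured sparsity followed by 4-bit block-wise quantization.

def prune_2_of_4(weights):
    """Keep the 2 largest-magnitude weights in every group of 4 (2:4 sparsity)."""
    pruned = []
    for i in range(0, len(weights), 4):
        group = weights[i:i + 4]
        keep = sorted(range(len(group)), key=lambda j: abs(group[j]), reverse=True)[:2]
        pruned.extend(w if j in keep else 0.0 for j, w in enumerate(group))
    return pruned

def quantize_4bit(weights, block_size=128):
    """Symmetric 4-bit quantization: ints in [-8, 7] plus one scale per block."""
    qs, scales = [], []
    for i in range(0, len(weights), block_size):
        block = weights[i:i + block_size]
        scale = max(abs(w) for w in block) / 7 or 1.0  # avoid zero scale
        scales.append(scale)
        qs.extend(max(-8, min(7, round(w / scale))) for w in block)
    return qs, scales

w = [0.9, -0.1, 0.05, -0.7, 0.2, 0.3, -0.25, 0.01]
sparse = prune_2_of_4(w)                      # two zeros in each group of four
q, s = quantize_4bit(sparse, block_size=4)    # 4-bit ints plus per-block scales
```

In the real scheme the delta between the fine-tuned and base weights, not the raw weights, would be compressed, and the block size would be 128.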

Sample Output

Prompt:

[{'role': 'user', 'content': 'Who is Alan Turing?'}]

Output:

<bos><start_of_turn>user
Who is Alan Turing?<end_of_turn>
*This is a question about a historical figure, not a fictional character.*

Alan Turing (1912-1954) was a British mathematician and computer scientist who is considered one of the most important figures in the history of computing. 

Here are some of his key contributions:

* **Breaking the Enigma Code:** During World War II, Turing played a crucial role in breaking the German Enigma code, which was used to encrypt military communications. His work at Bletchley Park helped to shorten the war and save countless lives.
* **The Turing Machine
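The prompt above is in messages format; the `<start_of_turn>` / `<end_of_turn>` markers in the output come from Gemma-2's chat turn format. A minimal sketch of that mapping, built by hand for illustration (in practice one would call `tokenizer.apply_chat_template` from transformers):

```python
# Build a Gemma-2 style chat prompt from a messages list.
# Hand-rolled for illustration; normally tokenizer.apply_chat_template does this.
def gemma_chat_prompt(messages):
    text = "<bos>"
    for m in messages:
        text += f"<start_of_turn>{m['role']}\n{m['content']}<end_of_turn>\n"
    text += "<start_of_turn>model\n"  # cue the model to generate its turn
    return text

prompt = gemma_chat_prompt([{"role": "user", "content": "Who is Alan Turing?"}])
```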

Evaluation
