# google/gemma-2-2b - 2b_2n4m_128bs Compression

This is a compressed version of the base model, produced with deltazip.

Resources: Paper, Compression Tool, Inference Engine (coming soon).

## Compression Configuration

  • Base Model: google/gemma-2-2b
  • Compression Scheme: 2b_2n4m_128bs
  • Dataset: HuggingFaceH4/ultrachat_200k
  • Dataset Split: train_sft
  • Max Sequence Length: 2048
  • Number of Samples: 256
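The scheme name `2b_2n4m_128bs` is not spelled out in this card; a plausible reading is 2-bit weights (`2b`), 2:4 structured sparsity (`2n4m`, i.e. at most 2 nonzeros in every group of 4), and a 128-element quantization block size (`128bs`). As an illustration only (not deltazip's actual implementation), a minimal sketch of those two operations on a flat weight list:

```python
def prune_2_of_4(weights):
    """2:4 structured sparsity: keep the 2 largest-magnitude values
    in each group of 4 weights, zero the other 2."""
    out = []
    for i in range(0, len(weights), 4):
        group = weights[i:i + 4]
        keep = sorted(range(len(group)), key=lambda j: abs(group[j]), reverse=True)[:2]
        out.extend(w if j in keep else 0.0 for j, w in enumerate(group))
    return out

def quantize_2bit_block(block):
    """Asymmetric 2-bit quantization for one block: 4 levels (0..3)
    with a single scale and zero-point shared by the block.
    Returns the dequantized values the model would see at inference."""
    lo, hi = min(block), max(block)
    scale = (hi - lo) / 3 or 1.0          # 3 steps span the block's range
    zero = round(-lo / scale)             # integer zero-point
    qs = [max(0, min(3, round(w / scale) + zero)) for w in block]
    return [(q - zero) * scale for q in qs]
```

With a 128-element block size, each block of 128 weights would share one `(scale, zero)` pair, so the per-weight storage cost stays close to 2 bits.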

## Sample Output

Prompt:

[{'role': 'user', 'content': 'Who is Alan Turing?'}]
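The prompt is a chat-format message list; for Gemma-family models, `tokenizer.apply_chat_template` renders it into the turn markers visible in the output below. A minimal sketch of that rendering (illustrative only, not the tokenizer's actual template):

```python
def to_gemma_prompt(messages):
    """Render a chat message list into Gemma's turn format,
    approximating what apply_chat_template produces."""
    text = "<bos>"
    for m in messages:
        text += f"<start_of_turn>{m['role']}\n{m['content']}<end_of_turn>\n"
    text += "<start_of_turn>model\n"  # cue the model to generate its reply
    return text

prompt = to_gemma_prompt([{'role': 'user', 'content': 'Who is Alan Turing?'}])
```
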

Output:

<bos><start_of_turn>user
Who is Alan Turing?<end_of_turn>
*This is a question about a historical figure, Alan Turing.*

Alan Turing (1912-1954) was a British mathematician and computer scientist who is considered one of the most important figures in the history of computing. He made significant contributions to theoretical computer science, artificial intelligence, and cryptography.

**Key Contributions:**

* **Turing Machine:** Turing developed the theoretical model of a universal machine, known as the Turing machine, which is a fundamental concept in computer science. This machine can simulate any other Turing machine, demonstrating the universality of computation.

## Evaluation
