[Experimental model]

This model is an experiment using the frankenstein merge script from https://huggingface.co/chargoddard/llama2-22b, with `BLOCK_DIAGONAL = False`.

Using: https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-FP16 + Then used https://huggingface.co/upstage/llama-30b-instruct-2048 as donor model.

Merging these models required about 160 GB of system RAM; as long as no swap is needed, the merge completes quickly.
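The general idea behind this kind of frankenstein merge is to embed the base model's weight matrices into larger ones and fill the extra dimensions from the donor's weights; the `BLOCK_DIAGONAL` flag controls whether the cross terms between base and donor blocks are kept zero (block-diagonal) or also filled from the donor. The sketch below is only an illustration of that distinction, not the actual script's logic; `expand_weight` is a hypothetical helper.

```python
import numpy as np

BLOCK_DIAGONAL = False  # as configured in the merge described above


def expand_weight(base_w, donor_w, new_shape, block_diagonal=BLOCK_DIAGONAL):
    """Hypothetical sketch: embed a base weight matrix into a larger one,
    filling the new rows/columns from slices of the donor's weights.
    The real merge script's behavior may differ."""
    out = np.zeros(new_shape, dtype=base_w.dtype)
    r, c = base_w.shape
    out[:r, :c] = base_w  # base weights keep their original positions
    if block_diagonal:
        # Block-diagonal: donor weights fill only the new diagonal block;
        # cross terms between base and donor dimensions stay zero.
        out[r:, c:] = donor_w[: new_shape[0] - r, : new_shape[1] - c]
    else:
        # BLOCK_DIAGONAL = False: the donor also fills the cross terms,
        # so the new dimensions interact with the base dimensions.
        out[r:, :] = donor_w[: new_shape[0] - r, : new_shape[1]]
        out[:r, c:] = donor_w[:r, c : new_shape[1]]
    return out
```

With `block_diagonal=True` the off-diagonal regions remain zero, which keeps the two sub-networks initially independent; with `False` the donor's weights couple them from the start.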

For the prompt template and model information, see huginnV1.
