In the ever-evolving world of artificial intelligence and language models, staying ahead of the curve is paramount. Enter Sparrow 1.1B Chat Alpha, the first open-source model from CogniSys. As part of the Sparrow series, this language model aims to push the boundaries of what a small model can do: ready for fine-tuning and for use as an agent, at a size that runs locally on diverse hardware.
The ambition behind the Sparrow series is simple: to fine-tune LLMs that serve as versatile foundations.
The beauty of the Sparrow series lies in its adaptability. While these models are not powerhouses on their own, they are designed as foundations for domain-specific fine-tuning, allowing hyper-specialization in various sectors, and as small but capable models for multi-agent scenarios, as sketched below.
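As a rough illustration of that foundation role, the sketch below attaches LoRA adapters for domain-specific fine-tuning using the Hugging Face transformers and peft libraries. The repository ID CogniSys/Sparrow-1.1B-Chat-alpha is a hypothetical placeholder, and the target module names follow the standard Llama-style layout that TinyLlama shares; treat this as a starting point rather than a confirmed recipe for the release.

```python
# A minimal sketch of attaching LoRA adapters for domain-specific fine-tuning.
# The repo ID "CogniSys/Sparrow-1.1B-Chat-alpha" is a hypothetical placeholder;
# substitute the actual model path from the download section.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_id = "CogniSys/Sparrow-1.1B-Chat-alpha"  # hypothetical repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Low-rank adapters on the attention projections (standard Llama-style module
# names, which TinyLlama shares), so only a small fraction of weights train.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# From here, train on domain-specific instruction data with the `transformers`
# Trainer or a custom loop, then merge or ship the adapter weights.
```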
Model Origin: Sparrow-1.1B-Chat-α is based on PY007/TinyLlama-1.1B-intermediate-step-480k-1T.
Model Specifications: At 1.1 billion parameters, Sparrow-1.1B-Chat-α is small but mighty. It was fine-tuned on a mix of publicly available and synthetic datasets (a minimal loading sketch follows below).
Languages: While its primary language of expertise is English, the model's architecture allows for potential expansions into other languages in future iterations as needed.
Licensing and Warranty: Committed to open-source values, Sparrow-1.1B-Chat-α is available under the Apache 2.0 license, ensuring a wide range of uses without restrictive limitations. However, users should note that the model comes with no warranties or guarantees, emphasizing the importance of discretion, appropriate application, and testing.
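For local use, a minimal inference sketch with Hugging Face transformers might look like the following. The repository ID and the instruction-style prompt format are illustrative assumptions; consult the model card for the exact chat template before relying on it.

```python
# A minimal local-inference sketch with Hugging Face transformers.
# The repo ID and the instruction-style prompt below are illustrative
# assumptions; check the model card for the exact chat template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CogniSys/Sparrow-1.1B-Chat-alpha"  # hypothetical repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)
model.eval()

prompt = "### Instruction:\nExplain what a language model is.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=128,
        do_sample=True,
        temperature=0.7,
    )
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Loading in float16 roughly halves memory relative to the default float32, which at 1.1 billion parameters keeps the model well within reach of modest consumer GPUs; CPU-only loading at float32 also remains practical at this size.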
As the first model in the Sparrow series, Sparrow-1.1B-Chat-α sets a high standard. But at CogniSys, the journey has just begun. With continuous advancements in the field of AI, the Sparrow series is poised to see more fine-tuned, domain-specific iterations that cater to various industries and needs, along with a 7B-parameter variant coming next.
Download: