OLMo is Here, Powered by Mosaic AI + Databricks

By neub9

As Chief Scientist (Neural Networks) at Databricks, I lead our research team toward the goal of giving everyone the ability to build and fine-tune AI models with their own data. In 2020, I was part of a small group of machine learning academics and industry veterans who founded MosaicML. We have always been committed to supporting open scientific inquiry, both by sharing our knowledge and by providing tools to the community. Since joining Databricks, which shares similar academic roots, we have only deepened that commitment.

With that spirit in mind, we have been collaborating with scientists from the nonprofit Allen Institute for AI (AI2) on everything from technical knowledge-sharing to today’s big announcement: OLMo. In my opinion, AI2 is one of the best NLP labs in the world, even more so because they conduct their cutting-edge research with the unrestrained creativity, commitment to integrity, and resources of a non-profit. We’ve found common ground in a belief in openness, a passion for doing rigorous science, and a love of building artifacts that we put into the hands of the community.

Today AI2 is releasing OLMo 7B, a state-of-the-art, open-source large language model. We're proud to have supported their work: OLMo (short for Open Language Model) was trained using the Mosaic AI Model Training platform. The AI2 team is also sharing the pre-training data and training code used to develop this model, which is a derivative of the MosaicML LLM Foundry.

We’re thrilled to have played a part in the success of the OLMo project, but I want to give credit where credit is due. We shared our tools, but they did the hard work of building the models. To learn more about the OLMo 7B model and its variants, check out AI2’s blog post or the model card on Hugging Face.
