Baseten Blog | Page 9
Models We Love: July 2023
Explore open source foundation models: Llama 2 (Meta/Microsoft), FreeWilly1/2, SDXL 1.0 (Stability AI), LayoutLM (Impira), and NSQL 350M (Numbers Station).
Model autoscaling features on Baseten
Scale replica count up and down in response to traffic, with scale to zero and fast cold starts.
Models We Love: June 2023
Dive into open source foundation models, focusing on LLMs: Falcon-7B/40B, WizardLM, MusicGen (Meta), and MPT-7B (MosaicML).
New in June 2023
LangChain adds a Baseten integration, and Falcon soars to the top of the LLM leaderboard.
Three techniques to adapt LLMs for any use case
Prompt engineering, embeddings with vector databases, and fine-tuning are three ways to adapt large language models (LLMs) to your data and use case.
What I learned from my AI startup’s internal hackathon
See Baseten's internal hackathon projects spanning ML infrastructure, inference, user experience, and streaming.
Deploy Falcon-40B on Baseten
Deploy Falcon-40B and Falcon-7B, top-ranked open-source LLMs on Hugging Face, to Baseten's production-ready ML infrastructure.
Deploy open-source models in a couple of clicks from Baseten’s model library
An explanation of how Baseten's model library works for deploying and serving popular open-source models.
Getting started with foundation models
A summary of foundation models covering data type, scale, in-context learning, and fine-tuning, illustrated with Meta's LLaMA model family.