Executive Overview: The Rise of Open Foundational Models

Moving generative AI applications from the proof of concept stage into production requires control, reliability and data governance. Organizations are turning to open source foundation models in search of that control and the ability to better influence outputs by more tightly managing both the models and the data they are trained on.

Databricks has assisted thousands of customers in evaluating use cases for generative AI and determining the most appropriate architecture for their organization.

Our customers tell us that building and deploying production-quality AI models is difficult and expensive. As a result, most CIOs are not comfortable taking models into production, for reasons that include lack of control, ownership and quality; unpredictable performance; and the high costs of scaling these foundation models.

We noticed a change in our customers’ behavior: more and more organizations were adopting open source models to improve efficiency and reduce costs. In response, we developed DBRX, a state-of-the-art open LLM that lets organizations use their own data to create their own LLMs. With DBRX, organizations retain full control over their data and over the security and quality of the model, while lowering costs.

Lack of control and ownership

Tools like ChatGPT are great, but they were built with consumers in mind, and using proprietary foundation models like GPT-4 introduces all sorts of issues around accuracy, safety and governance, not to mention questions about what happens to your proprietary data when you send it to a third-party cloud.

With DBRX and the Data Intelligence Platform, you can eliminate these challenges and tackle GenAI with confidence. DBRX lets enterprises replace proprietary SaaS models with an open source model they control, customizing it to their organization’s specific needs, data and IP for competitive advantage. You no longer have to send sensitive data to proprietary tools in someone else’s cloud. With Databricks, you have complete ownership over both the models and the data. You can use your own data to build GenAI solutions by augmenting DBRX through retrieval augmented generation (RAG) or fine-tuning, or by pre-training your own custom LLM from scratch. DBRX and the Data Intelligence Platform make delivering production-quality models a reality.

An LLM that understands the enterprise

Databricks is focused on maximizing the safety and accuracy of the output generated by your models. It’s one thing if a model hallucinates or returns inaccurate results to a consumer prompt in ChatGPT; in the enterprise, the same failure can damage your bottom line and your brand in the market. Ensuring quality experiences, however, is a complex problem. Databricks simplifies it by managing every aspect of the ML lifecycle, from data ingestion and featurization to model building, tuning and productionization, all from a single platform.

The Databricks Data Intelligence Platform has a suite of tools that can be used with DBRX to ensure the quality and accuracy of model outputs. RAG is one pattern that can be used to reduce hallucinations and make your model more reliable. When a prompt comes in, vector search finds documents relevant to that prompt, those documents are brought into the context of the model, and the model outputs an answer grounded in them.
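A minimal sketch of the RAG pattern described above. The retrieval and function names here are illustrative stand-ins: real deployments would use Databricks Vector Search with dense embeddings and a DBRX serving endpoint, while this sketch uses simple token overlap as a proxy for similarity scoring.

```python
def embed(text):
    # Stand-in "embedding": a set of lowercase tokens. A production
    # system would use a dense embedding model instead.
    return set(text.lower().split())

def retrieve(query, documents, k=2):
    # Rank documents by token overlap with the query, a rough proxy
    # for cosine similarity over real embeddings in a vector index.
    q = embed(query)
    scored = sorted(documents, key=lambda d: len(q & embed(d)), reverse=True)
    return scored[:k]

def build_prompt(query, context_docs):
    # Bring the retrieved documents into the model's context so its
    # answer is grounded in enterprise data, not just model memory.
    context = "\n".join(f"- {d}" for d in context_docs)
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {query}\n"
    )

docs = [
    "Refunds are processed within 5 business days.",
    "Our headquarters are in San Francisco.",
    "Support is available 24/7 via chat.",
]
query = "How long do refunds take?"
prompt = build_prompt(query, retrieve(query, docs))
```

The resulting `prompt` string is what would be sent to the model; the refund-policy document ranks first because it shares the most tokens with the query.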

In addition, the Data Intelligence Platform provides monitoring of your DBRX models for model quality, hallucination, toxicity and more. This matters for outputs: once the model has generated a response, monitoring can detect PII or other data that needs to be filtered before it reaches the user. In an enterprise context you need all of these safeguards; you can’t rely on the model’s raw outputs alone. This monitoring provides the checks and balances and brings in the right data to keep your models accurate and reliable.
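To make the output-filtering step concrete, here is a minimal sketch of post-generation redaction. The regexes are illustrative only, not production-grade PII detection; a real pipeline would use the platform's monitoring and guardrail tooling rather than these two hand-written patterns.

```python
import re

# Illustrative patterns for two common PII types. Real PII detection
# covers far more categories and uses more robust methods.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(response: str) -> str:
    # Replace each detected PII span with a typed placeholder before
    # the model's response is returned to the user.
    for label, pattern in PII_PATTERNS.items():
        response = pattern.sub(f"[REDACTED {label.upper()}]", response)
    return response

safe = redact("Contact jane.doe@example.com, SSN 123-45-6789.")
```

A filter like this sits between the model and the user, so raw model output is never surfaced directly.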

Lastly, it is important to ensure security and access controls are in place, guaranteeing that users who shouldn’t have access to data won’t get it. And with end-to-end lineage, you can be confident that your models are auditable from data through production. All of this is made possible with Databricks when building on DBRX. These capabilities let you easily move multiple models and use cases from POCs into production in a standardized and governed way.

Building cost-efficient LLMs

Organizations building their own models (through fine-tuning or pre-training) want to match the quality of models like ChatGPT for their domains, but at an accessible cost. Databricks enables enterprises to train and deploy DBRX cost-effectively at scale while getting results comparable to SaaS providers.

What’s interesting about DBRX is that it beats other open source models, as well as ChatGPT (GPT-3.5), on standard benchmarks for language understanding, programming, math and logic. You can read more about how it was built and trained, the benchmarks, and how to access the model on Hugging Face and GitHub in the link above. These improvements deliver not only better accuracy but also better efficiency.

We have built an optimized software stack specifically for training large models. It combines techniques such as tuned parallelism for higher compute utilization, auto-recovery, automatic memory usage adjustments and real-time dataset streaming. This platform has a proven track record of lowering costs by up to 10x.

Finally, Databricks also helps lower costs by making purpose-built, smaller models available that can be augmented or fine-tuned with your data. These smaller models, augmented with your data, can deliver performance similar to larger foundation models at a fraction of the cost.

Getting Started with Open Source LLMs

DBRX was built in and on Databricks, so your team can use the same tools and techniques we used to build DBRX to create their own models, or improve existing high-quality models, at a fraction of the cost. Many companies are already doing this today, including JetBlue, Block, NASDAQ and Accenture.

DBRX together with the Data Intelligence Platform ushers in a new wave of flexibility, whether your team has existing models or wants to build new ones. It provides complete ownership of both models and data, faster and more reliable deployments across multiple use cases, and the ability to build LLMs at scale at lower cost. This is why many organizations are building their generative AI solutions with Databricks.

Getting Started with DBRX on Databricks is easy with the Databricks Mosaic AI Foundation Model APIs. You can quickly get started with our pay-as-you-go pricing and query the model from our AI Playground chat interface. To privately host DBRX, you can download the model from the Databricks Marketplace and deploy the model on Model Serving.
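As a starting point, the sketch below shows one way to query a DBRX chat endpoint over HTTP. The endpoint name `databricks-dbrx-instruct`, the `/serving-endpoints/.../invocations` route, and the chat payload shape reflect the Foundation Model APIs at the time of writing; check your workspace documentation, as names and routes may differ.

```python
import json
import os
import urllib.request

def build_request(prompt: str, max_tokens: int = 256) -> dict:
    # Chat-style payload: a list of role/content messages plus
    # generation parameters.
    return {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def query_dbrx(prompt: str) -> str:
    # Workspace URL and personal access token are read from the
    # environment; both are placeholders you must set yourself.
    host = os.environ["DATABRICKS_HOST"]
    token = os.environ["DATABRICKS_TOKEN"]
    url = f"{host}/serving-endpoints/databricks-dbrx-instruct/invocations"
    req = urllib.request.Request(
        url,
        data=json.dumps(build_request(prompt)).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Assumes an OpenAI-style chat completion response shape.
    return body["choices"][0]["message"]["content"]

payload = build_request("Summarize our refund policy.")
```

The same payload shape works from the AI Playground or any HTTP client; only the authentication and URL change.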

Learn more about how to leverage the power of open source LLMs and the Data Intelligence Platform by registering for Data+AI Summit.


