Viral News

Optimizing Databricks LLM Pipelines with DSPy

If you’ve been following the world of industry-grade LLM technology for the last year, you’ve likely observed a plethora of frameworks and tools in production. Startups are building everything from Retrieval-Augmented Generation (RAG) automation to custom fine-tuning services. Langchain is perhaps the most famous of all these new frameworks, enabling easy prototypes for chained language model components since Spring 2023. However, a recent, significant development has come not from a startup, but from the world of academia. In October 2023, researchers working in Databricks co-founder Matei Zaharia’s Stanford research lab released DSPy, a library for compiling declarative language model calls into…
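To make "compiling declarative language model calls" concrete, here is a minimal sketch of a DSPy program. It assumes the dspy package with an OpenAI-compatible model; the model name, configuration call (which differs between DSPy versions), and the question/answer signature are illustrative, not taken from the article.

```python
# Minimal DSPy sketch: declare *what* the call should produce (a signature),
# and let DSPy compile *how* (the actual prompt). Model name is illustrative.
import dspy

# Configure the underlying LM; exact configuration varies by DSPy version.
dspy.settings.configure(lm=dspy.LM("openai/gpt-4o-mini"))

# "question -> answer" is a declarative signature: we state the input and
# output fields, and DSPy builds (and can later optimize) the prompt.
qa = dspy.ChainOfThought("question -> answer")

result = qa(question="What does DSPy compile?")
print(result.answer)
```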
Read More
Hugging Face Autotrain – Getting Started

Autotrain is a no-code platform from Hugging Face for training, evaluating, and deploying machine learning and deep learning models. In this article, we will use Hugging Face Autotrain to train a Small Language Model (SLM). The Autotrain platform supports several types of training: computer vision models, classical machine learning models, and LMs & LLMs. In this article, however, we will focus on training a language model for instruction following. Although we can access Autotrain directly from the platform, we will use a local installation. So, in a way, we will use some code rather than the no-code…
Read More
Self-Driving Cars vs. Coding Copilots

Back in the mid-2010s, the world of autonomous vehicles was making great progress, and it seemed that we would soon be ushered around in cars that drove themselves, leaving us free to spend our time how we wanted. That obviously hasn’t happened, but instead, we’ve been treated to a form of AI we weren’t expecting: generative AI-powered copilots. Following the launch of ChatGPT in late 2022, the world of generative AI has been on a tear. Every company seems to be investing in large language models (LLMs) to build one of the two most visible forms of GenAI: chatbots and…
Read More
Data Machina #252

Diffusion, FM & Pre-Trained AI Models for Time-Series. DeepNN-based models are starting to match or even outperform statistical time-series analysis and forecasting methods in some scenarios. Yet DeepNN-based models for time-series suffer from four key issues: 1) complex architectures, 2) an enormous amount of training time, 3) high inference costs, and 4) poor context sensitivity. Latest innovative approaches. To address these issues, a new breed of foundation or pre-trained AI models for time-series is emerging. Some of these new models use hybrid approaches borrowing from NLP, vision/image, or physics modelling, such as transformers, diffusion models, KANs and state space…
Read More
The Role of Synthetic Data in Cybersecurity

Data's value is something of a double-edged sword. On one hand, digital data lays the groundwork for powerful AI applications, many of which could change the world for the better. Conversely, storing so many details on people creates huge privacy risks. Synthetic data provides a possible solution. What Is Synthetic Data? Synthetic data is a subset of anonymized data – data that doesn't reveal any real-world details. More specifically, it refers to information that looks and acts like real-world data but has no ties to actual people, places or events. In short, it's fake data that can produce real results. In…
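As a toy illustration of "fake data that can produce real results", here is a minimal sketch that generates synthetic user records with the Faker library; the library choice and the record fields (username, email, IP, timestamp) are illustrative assumptions, not taken from the article.

```python
# Minimal sketch: generate records that look like real user data but
# correspond to no actual person. Field choices are illustrative.
from faker import Faker

fake = Faker()

def synthetic_login_event() -> dict:
    """One fake record shaped like a security log entry."""
    return {
        "user": fake.user_name(),
        "email": fake.email(),
        "source_ip": fake.ipv4_public(),
        "timestamp": fake.iso8601(),
    }

# A small synthetic dataset, e.g. for testing a detection pipeline
# without exposing any real customer data.
events = [synthetic_login_event() for _ in range(5)]
for event in events:
    print(event)
```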
Read More
Announcing General Availability of Liquid Clustering

We’re excited to announce the General Availability of Delta Lake Liquid Clustering in the Databricks Data Intelligence Platform. Liquid Clustering is an innovative data management technique that replaces table partitioning and ZORDER so you no longer have to fine-tune your data layout to achieve optimal query performance.  Liquid clustering significantly simplifies data layout-related decisions and provides the flexibility to redefine clustering keys without data rewrites. It allows data layout to evolve alongside analytic needs over time – something you could never do with partitioning on Delta.  Since the Public Preview of Liquid Clustering at the Data and AI Summit last year, we’ve…
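As a rough sketch of what this looks like in practice (the table and column names below are made up, and the snippet assumes a Databricks notebook where a SparkSession is available), clustering keys are declared with CLUSTER BY instead of partitioning or ZORDER, and can later be redefined without rewriting data:

```python
# Hypothetical sketch of Liquid Clustering DDL on the Databricks platform;
# table and column names are illustrative.
from pyspark.sql import SparkSession

# On Databricks, a SparkSession named `spark` already exists;
# getOrCreate() simply reuses it.
spark = SparkSession.builder.getOrCreate()

# Declare clustering keys instead of PARTITIONED BY / ZORDER.
spark.sql("""
    CREATE TABLE IF NOT EXISTS sales.events (
        event_id    BIGINT,
        customer_id BIGINT,
        event_date  DATE
    )
    CLUSTER BY (customer_id)
""")

# Clustering keys can be redefined later without rewriting existing data.
spark.sql("ALTER TABLE sales.events CLUSTER BY (event_date)")

# OPTIMIZE incrementally clusters newly written data.
spark.sql("OPTIMIZE sales.events")
```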
Read More
Top Data Validation Tools for Machine Learning in 2024

Image generated with Midjourney

It was challenging to stop myself from starting this article with some variation of the popular phrase "garbage in, garbage out." Well, I did it anyway. But jokes aside, we can easily imagine a situation in which we have built and deployed a machine learning model (possibly a black box) that accepts some input and returns some predictions. So far, so good. However, with tons of complexity happening before the model (data preprocessing and manipulation), in the model itself, and in any post-processing of the outputs, many things can go wrong. And in some mission-critical fields (finance, healthcare, or security),…
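To illustrate the kind of check such tools automate, here is a minimal sketch using pandera, one of many validation libraries; the schema, column names, and sample dataframe are invented for the example and are not from the article.

```python
# Minimal sketch of schema-based data validation with pandera.
# The dataframe and its columns are invented for illustration.
import pandas as pd
import pandera as pa

schema = pa.DataFrameSchema({
    "age": pa.Column(int, pa.Check.in_range(0, 120)),
    "income": pa.Column(float, pa.Check.ge(0)),
    "country": pa.Column(str, pa.Check.isin(["US", "DE", "PL"])),
})

df = pd.DataFrame({
    "age": [34, 29],
    "income": [52_000.0, 61_500.0],
    "country": ["US", "DE"],
})

# Raises a SchemaError if any value violates the declared expectations,
# catching bad inputs before they ever reach the model.
validated = schema.validate(df)
print(validated)
```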
Read More
Fast Stochastic Policy Gradient: Negative Momentum for Reinforcement Learning

arXiv:2405.12228v1 Announce Type: new Abstract: Stochastic optimization algorithms, particularly stochastic policy gradient (SPG), have reported significant success in reinforcement learning (RL). Nevertheless, how to quickly obtain an optimal solution in RL remains a challenge. To tackle this issue, this work develops a fast SPG algorithm, coined SPG-NM, from the perspective of utilizing momentum. Specifically, in SPG-NM, a novel type of negative momentum (NM) technique is applied to the classical SPG algorithm. In contrast to existing NM techniques, we adopt a few hyper-parameters in our SPG-NM algorithm. Moreover, the computational complexity is nearly the same…
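The abstract does not give the update rule, but the general shape of a momentum-style policy-gradient step can be sketched as below; the heavy-ball form, the toy objective, and the negative momentum coefficient are generic illustrations, not the paper's SPG-NM algorithm.

```python
# Generic sketch of a stochastic gradient-ascent update with a momentum term.
# A negative momentum coefficient (beta < 0) partially reverses the previous
# step's direction. This is an illustration, not the paper's SPG-NM method.
import numpy as np

rng = np.random.default_rng(0)
theta = np.zeros(4)      # policy parameters
velocity = np.zeros(4)   # momentum buffer
lr, beta = 0.05, -0.3    # learning rate and (negative) momentum coefficient

for step in range(100):
    # Stand-in for a noisy policy-gradient estimate (e.g. from REINFORCE);
    # here it is just the gradient of a toy quadratic objective plus noise.
    grad_estimate = -(theta - 1.0) + 0.1 * rng.standard_normal(4)

    # Heavy-ball style update: mix the new gradient with the previous step.
    velocity = beta * velocity + grad_estimate
    theta = theta + lr * velocity

print(theta)  # drifts toward the toy optimum at 1.0
```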
Read More
Focus on Low-Resolution Information: Multi-Granular Information-Lossless Model for Low-Resolution Human Pose Estimation

arXiv:2405.12247v1 Announce Type: new Abstract: In real-world applications of human pose estimation, low-resolution input images are frequently encountered when the performance of the image acquisition equipment is limited or the shooting distance is too great. However, existing state-of-the-art models for human pose estimation perform poorly on low-resolution images. One key reason is the presence of downsampling layers in these models, e.g., strided convolutions and pooling layers, which further reduce the already insufficient image information. Another key reason is that the body skeleton and human kinematic information are not fully utilized. In this work, we propose a Multi-Granular Information-Lossless (MGIL) model…
Read More
The Arabic Noun System Generation

arXiv:2405.11014v1 Announce Type: new Abstract: In this paper, we show that the multiple-stem approach to nouns with a broken plural pattern allows for greater generalizations to be stated in the morphological system. Such an approach dispenses with truncating/deleting rules and other complex rules that are required to account for the highly allomorphic broken plural system. The generation of inflected sound nouns necessitates a pre-specification of the affixes denoting the sound plural masculine and the sound plural feminine, namely uwna and aAt, in the lexicon. The first subsection of section one provides an evaluation of some of the previous analyses of…
Read More