
LangChain on Azure - Building Scalable LLM Applications

Free Download LangChain on Azure - Building Scalable LLM Applications
Published 1/2024
Created by Markus Lang
MP4 | Video: h264, 1280x720 | Audio: AAC, 44.1 KHz, 2 Ch
Genre: eLearning | Language: English | Duration: 36 Lectures (3h 6m) | Size: 1.5 GB

Navigating the Azure Landscape: A Pathway from Basic AI Applications to Scalable Microservices
What you'll learn:
Understand Azure account structure and resource group management
Manage files in the cloud with Blob Storage
Use Azure Cognitive Search and PgVector as vector databases
Utilize PgVector and the Indexing API for data retrieval
Manage container images using Azure Container Registry
Deploy and monitor Azure App Services
Use Azure Functions and Event Grid for event-driven architecture (see the sketch after this list)
Apply security measures to protect Azure App Services and databases
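Below is a minimal sketch, not taken from the course, of what such an event-driven indexing function can look like with the Azure Functions Python v2 programming model: a Blob Trigger fires whenever a document in a storage container is added or updated, and a hypothetical index_document helper would then re-chunk, re-embed, and upsert it into the vector store. The container path, connection setting, and helper name are placeholder assumptions, and the Event Grid wiring covered in the course is omitted here.
Code:
import logging

import azure.functions as func

app = func.FunctionApp()

# Fires when a blob under "documents/" is created or updated. "AzureWebJobsStorage"
# is the app setting that holds the storage-account connection string.
@app.blob_trigger(arg_name="blob",
                  path="documents/{name}",
                  connection="AzureWebJobsStorage")
def reindex_on_change(blob: func.InputStream):
    logging.info("Blob changed: %s (%s bytes)", blob.name, blob.length)
    content = blob.read().decode("utf-8")
    # index_document is a hypothetical helper that would chunk and embed the text
    # and upsert it into the vector store (e.g. via the LangChain Indexing API).
    # index_document(blob.name, content)
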
Requirements:
Intermediate Python skills (OOP, data types, functions, modules, etc.)
Familiarity with the terminal
Basic Docker knowledge
Basic to intermediate LangChain knowledge (vector stores, RAG, agents, etc.)
Description:
Dive into the depths of Azure and Large Language Model (LLM) applications with this comprehensive course. Starting with the initial setup of Azure account structures and resource groups and moving on to the practical management of Azure Blob Storage, this course equips you with the essential skills to navigate and utilize Azure's extensive offerings.

We then delve into different vector stores, such as Azure Cognitive Search and PgVector, comparing their advantages and disadvantages. You will learn how to chunk raw data, embed it, and insert it into the vector store. A typical Retrieval Augmented Generation (RAG) process is performed on the vector store, primarily using Jupyter notebooks for this part of the course.

After covering the basics, we transition from notebooks to using docker-compose to spin up the services locally, and we delve deeply into how these services work. The next step is deploying these services to the cloud, where we learn about new services like the Container Registry and App Service.

Once the web apps are set up, we implement an event-driven indexing process with Blob Triggers, Event Grid, and Azure Functions to index documents upon changes in Blob Storage. The final chapters cover basic security measures, such as setting up a firewall for the database and IP-based access restrictions.

This course is tailored for individuals with foundational knowledge of Python, Docker, and LangChain, and is perfect for anyone looking to build real applications with a production-grade architecture, moving beyond simple playground apps with Streamlit.
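As a rough illustration of the chunk, embed, insert, and retrieve flow described above, here is a minimal LangChain sketch that is not taken from the course. It assumes an Azure OpenAI embedding deployment (endpoint and key supplied via the usual AZURE_OPENAI_* environment variables) and a Postgres instance with the pgvector extension; the deployment name, API version, connection string, and file name are placeholders.
Code:
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_openai import AzureOpenAIEmbeddings
from langchain_community.vectorstores import PGVector

raw_text = open("handbook.txt", encoding="utf-8").read()  # hypothetical raw data

# 1. Chunk the raw data into overlapping pieces.
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
docs = splitter.create_documents([raw_text])

# 2. Embed the chunks and insert them into the vector store (PgVector).
embeddings = AzureOpenAIEmbeddings(
    azure_deployment="text-embedding-ada-002",  # name of your embedding deployment
    openai_api_version="2023-05-15",            # example API version
)
store = PGVector.from_documents(
    documents=docs,
    embedding=embeddings,
    collection_name="handbook",
    connection_string="postgresql+psycopg2://user:pass@localhost:5432/vectors",
)

# 3. Retrieve the most relevant chunks for a question -- the "R" in RAG. In a full
#    RAG chain these chunks are passed to the LLM as context for generating the answer.
for doc in store.similarity_search("What does the handbook say about refunds?", k=4):
    print(doc.page_content[:200])

In the course's later chapters, the same flow runs inside containerised services rather than a notebook.
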
Who this course is for:
LLM Enthusiasts who are tired of simple LangChain & Streamlit applications and want to reach the next level
Homepage
Code:
https://www.udemy.com/course/langchain-on-azure-building-scalable-llm-applications/




Recommended High-Speed Download Links | Please Say Thanks to Keep the Topic Alive
No Password - Links are Interchangeable
 