Azure OpenAI - Developing solutions with Large Language Models. Azure OpenAI Service provides access to OpenAI's powerful large language models, such as the ChatGPT, GPT, Codex, and Embeddings model families. These models enable a wide range of natural language processing (NLP) solutions that understand, converse, and generate content. You can access the service through REST APIs, SDKs, and Azure OpenAI Studio.
We will explore how to take advantage of large-scale generative AI models with a deep understanding of language and code to enable new reasoning and comprehension capabilities for building cutting-edge applications. We will apply these coding and language models to a variety of use cases.
The training will also explore how to detect and mitigate harmful use with built-in responsible AI and access enterprise-grade Azure security.
By attending this course, you will gain the following skills:
Large Language Models and Azure OpenAI – an introduction
In this module you will get a high-level perspective on Large Language Models (LLMs), their background, and the opportunities Azure OpenAI offers.
Implement solutions for Azure OpenAI
In this module we will explore how to build clients for Azure OpenAI, both REST-based and using the SDKs. You will work with the following topics:
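As a taste of the REST-based approach, the sketch below assembles a chat-completions request using only the Python standard library. The endpoint, deployment name, and API key are placeholders, and the `api-version` value is an assumption; substitute the values from your own Azure OpenAI resource.

```python
import json
import urllib.request

def build_chat_request(endpoint, deployment, api_key, messages,
                       api_version="2024-02-01"):
    """Assemble an Azure OpenAI chat-completions request (not yet sent)."""
    url = (f"{endpoint}/openai/deployments/{deployment}"
           f"/chat/completions?api-version={api_version}")
    headers = {"Content-Type": "application/json", "api-key": api_key}
    body = json.dumps({"messages": messages}).encode("utf-8")
    return urllib.request.Request(url, data=body, headers=headers, method="POST")

req = build_chat_request(
    endpoint="https://my-resource.openai.azure.com",  # placeholder resource
    deployment="my-gpt-deployment",                   # placeholder deployment
    api_key="<your-api-key>",
    messages=[{"role": "user", "content": "Say hello"}],
)
# Once the credentials are real, sending the request is one line:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The SDKs wrap exactly this request shape, adding retries, typed responses, and authentication helpers on top.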
Azure Prompt Flow
Azure Machine Learning prompt flow is a development tool designed to streamline the entire development cycle of AI applications powered by Large Language Models (LLMs). It simplifies the process of prototyping, experimenting, iterating, and deploying your AI applications.
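Conceptually, a prompt flow is a directed graph of nodes (prompt templates, LLM calls, Python tools) whose outputs feed each other. The following is a plain-Python sketch of that idea, not the `promptflow` SDK itself; the node names and the stubbed LLM call are illustrative only.

```python
# A minimal sketch of the prompt-flow idea in plain Python (NOT the
# promptflow SDK): each node is a function, and the flow wires node
# outputs to node inputs, like the DAG a flow definition describes.

def build_prompt(question: str) -> str:          # "prompt" node
    return f"Answer concisely: {question}"

def call_llm(prompt: str) -> str:                # "LLM" node (stubbed here)
    return f"[model reply to: {prompt}]"         # a real node calls Azure OpenAI

def postprocess(raw: str) -> str:                # "Python tool" node
    return raw.strip()

def run_flow(question: str) -> str:
    """Execute the nodes in dependency order, as the tool's runtime would."""
    return postprocess(call_llm(build_prompt(question)))

print(run_flow("What is Azure OpenAI?"))
```

Prompt flow's value is that it makes this graph explicit and visual, so each node can be tested, varied, and evaluated independently before the flow is deployed.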
In this module we will explore:
Azure AI Studio
Build, evaluate, and deploy your AI solutions from end to end with Azure AI Studio
Create and customize end-to-end models
In this module we will explore useful Azure OpenAI resources and code samples to help you get started and accelerate your technology adoption journey.
The Azure OpenAI service provides REST API access to OpenAI's powerful language models on the Azure cloud. These models can be easily adapted to your specific task, including but not limited to content generation, summarization, semantic search, and natural-language-to-code translation. We will access the service through the REST API, the Python SDK, and the .NET SDK.
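Semantic search, one of the tasks above, reduces to comparing embedding vectors by cosine similarity. The sketch below uses tiny made-up vectors in place of real embeddings (which the service's embeddings endpoint returns with on the order of a thousand dimensions); the document names and query are illustrative.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Tiny made-up vectors standing in for embeddings returned by the
# service's embeddings endpoint.
docs = {
    "pricing page":  [0.9, 0.1, 0.0],
    "api reference": [0.1, 0.9, 0.2],
}
query = [0.2, 0.8, 0.1]  # would be the embedding of e.g. "how do I call the API?"

best = max(docs, key=lambda name: cosine_similarity(query, docs[name]))
print(best)  # the document whose embedding is closest to the query
```

In a real solution you would embed your documents once, store the vectors, embed each incoming query, and rank documents by this similarity score.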
Build your own copilot
To help developers build their own copilot experiences on top of AI plugins, Microsoft has released Semantic Kernel, a lightweight open-source SDK that allows you to orchestrate AI plugins. With Semantic Kernel, you can leverage the same AI orchestration patterns that power Microsoft 365 Copilot and Bing in your own apps, while still leveraging your existing development skills and investments.
In this module we will explore the possibilities.
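The orchestration pattern at the heart of Semantic Kernel is registering named plugin functions with a kernel and letting a planner chain them. The sketch below is a plain-Python illustration of that pattern, not the actual `semantic-kernel` package API; the class, method names, and stub plugins are all assumptions made for the example.

```python
# A plain-Python sketch of the plugin-orchestration pattern that
# Semantic Kernel provides (NOT the semantic-kernel package's API).

class Kernel:
    def __init__(self):
        self.plugins = {}

    def register(self, name, func):
        """Make a plugin function available under a name."""
        self.plugins[name] = func

    def run_pipeline(self, text, *plugin_names):
        """Pipe each plugin's output into the next, as a planner might."""
        for name in plugin_names:
            text = self.plugins[name](text)
        return text

kernel = Kernel()
kernel.register("summarize", lambda t: t.split(".")[0] + ".")  # stub skill
kernel.register("shout", str.upper)                            # stub skill

result = kernel.run_pipeline(
    "Semantic Kernel chains plugins. More text.", "summarize", "shout")
print(result)  # → "SEMANTIC KERNEL CHAINS PLUGINS."
```

In Semantic Kernel itself the plugins can be prompt-based (LLM calls) or native code, and a planner can decide the chain automatically from a user's goal rather than a hard-coded pipeline.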