
Showing posts with the label OpenAI

Run Your OpenAI SWARM Agents Locally With Open Source Model - 100% 🆓

In this article, we will see how we can run our agents locally, which means we will be using the OpenAI Swarm framework but still will not be paying anything to OpenAI, as we will not be utilizing OpenAI's API key. Using OpenAI's Swarm but not using OpenAI's key, got confused? Well, we will be achieving this using Ollama :) Now, before we proceed, if you have not watched my earlier video on what OpenAI Swarm is and how to get started with it, I would recommend you check that one here: Here is the link to the GitHub repository containing Swarm's source code and implementation details. What Are We Trying To Do? We will create our own agent utilizing the OpenAI Swarm framework with Ollama and the open-source model Llama3.2:1b. This agent will run locally on our machine without the need for any API key from OpenAI. Setting Things Up Install Swarm We need to install Swarm from GitHub, as it is still in the experimental stage, and that can be done by running the below command: pip...
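Since the excerpt cuts off at the install command, here is a minimal sketch of the overall idea, assuming Swarm has been installed from GitHub (for example, pip install git+https://github.com/openai/swarm.git), Ollama is running locally on its default port, and llama3.2:1b has already been pulled; the agent name and prompt below are illustrative only:

```python
# Minimal sketch: run a Swarm agent against a local Ollama model.
# Assumptions: Ollama is running on localhost:11434 and `ollama pull llama3.2:1b` was done.
from openai import OpenAI
from swarm import Swarm, Agent

# Point the OpenAI client at Ollama's OpenAI-compatible endpoint.
# The api_key value is a dummy; Ollama does not validate it.
ollama_client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

# Hand the locally configured client to Swarm instead of the default OpenAI client.
client = Swarm(client=ollama_client)

agent = Agent(
    name="Local Agent",               # illustrative name
    model="llama3.2:1b",              # the local Ollama model
    instructions="You are a helpful assistant.",
)

response = client.run(
    agent=agent,
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.messages[-1]["content"])
```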

How To Generate SQL Queries Using Azure OpenAI

In the ever-evolving landscape of data management and artificial intelligence, the ability to generate SQL queries using natural language inputs is a game-changer. Azure OpenAI, with its powerful language models, offers a seamless way to translate natural language into SQL queries, making data interaction more intuitive and accessible. This article will guide you through the process of leveraging Azure OpenAI to generate SQL queries, enhancing your data querying capabilities. Introduction Azure OpenAI Service provides access to OpenAI's powerful language models through the Azure platform. These models are capable of understanding and generating human-like text, making them ideal for tasks such as natural language processing, text generation, and, importantly, translating natural language into SQL queries. Setting Up Azure OpenAI and Azure SQL DB Before you can start generating SQL queries, you need to set up your Azure OpenAI environment. Here are the steps. Create an Azure Account...
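To give a flavour of the idea, here is a minimal sketch using the AzureOpenAI client from the openai Python package; the endpoint, key, API version, deployment name, and the sample table schema are placeholders for illustration, not values from the article:

```python
# Minimal sketch: translate a natural-language question into a SQL query with Azure OpenAI.
# Assumes AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_API_KEY are set in the environment.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

# A made-up schema and question, purely for illustration.
schema = "Employees(Id INT, Name NVARCHAR(100), Department NVARCHAR(50), Salary INT)"
question = "List the top 5 highest paid employees in the Sales department."

response = client.chat.completions.create(
    model="gpt-4o",  # your Azure OpenAI deployment name
    messages=[
        {
            "role": "system",
            "content": f"Translate the user's question into a T-SQL query for this schema: {schema}. Return only the query.",
        },
        {"role": "user", "content": question},
    ],
)
print(response.choices[0].message.content)
```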

Generate The Streamed Response Using Semantic Kernel

Semantic Kernel, a powerful tool for integrating large language models into your applications, now supports streaming responses. In this blog post, we'll explore how to leverage this feature to obtain streamed results from LLMs via Azure OpenAI and OpenAI. Why Streamed Responses Matter When working with language models, especially in conversational scenarios, streaming responses offer several advantages:
- Real-time Interaction: Streaming allows you to receive partial responses as they become available, enabling more interactive and dynamic conversations.
- Reduced Latency: Instead of waiting for the entire response, you can start processing and displaying content incrementally, reducing overall latency.
- Efficient Resource Usage: Streaming conserves memory and resources by handling data in smaller chunks.
How to Use Streaming Responses I've published a complete video on how to generate this using Python, which can be found on my YouTube channel, Shweta Lodha. Here, I'm jus...
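As a rough illustration (not the article's exact code), the sketch below streams a chat response with Semantic Kernel's Python SDK; exact class and method names can vary between SDK versions, and the deployment name, endpoint, and key are placeholders:

```python
# Minimal sketch: stream a chat completion through Semantic Kernel with Azure OpenAI.
import asyncio

from semantic_kernel.connectors.ai.open_ai import (
    AzureChatCompletion,
    AzureChatPromptExecutionSettings,
)
from semantic_kernel.contents import ChatHistory


async def main() -> None:
    chat_service = AzureChatCompletion(
        deployment_name="gpt-4o",                       # placeholder deployment name
        endpoint="https://<your-resource>.openai.azure.com/",
        api_key="<your-key>",
    )

    history = ChatHistory()
    history.add_user_message("Explain streaming responses in one short paragraph.")

    # Chunks arrive as they are generated, so we print them immediately.
    async for chunk in chat_service.get_streaming_chat_message_content(
        chat_history=history,
        settings=AzureChatPromptExecutionSettings(),
    ):
        print(str(chunk), end="", flush=True)


asyncio.run(main())
```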

How To Hide Sensitive Data Before Passing To LLM-OpenAI

In my previous article, “Passing An Audio File To LLM”, I explained how one can pass an audio file to an LLM. Continuing from that, I'm extending it by addressing the use case of sensitive information. Let's say an audio file contains information about your bank account number, your secure ID, your PIN, your passcode, your date of birth, or any other information that has to be kept secure. You will find this kind of information if you are dealing with customer-facing audio calls, specifically in the finance sector. These details, also known as PII (Personally Identifiable Information), are very sensitive, and it is not safe to keep them unprotected on any server. Hence, one should be very careful while dealing with such data. Now, when it comes to using PII in a generative AI based application, we need a way to remove such information from the data before passing it to the LLM, and that's what this article is all about. In...
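The excerpt does not show which redaction approach the full article uses, so as one possible illustration, here is a short sketch built on Microsoft Presidio (presidio-analyzer and presidio-anonymizer), which detects and masks PII in text before it is sent to the LLM; the sample text is invented:

```python
# Minimal sketch: detect and mask PII in a transcript before sending it to an LLM.
# Requires: pip install presidio-analyzer presidio-anonymizer (plus a spaCy model).
from presidio_analyzer import AnalyzerEngine
from presidio_anonymizer import AnonymizerEngine

analyzer = AnalyzerEngine()
anonymizer = AnonymizerEngine()

# Invented example text standing in for an audio transcript.
text = "My name is John Smith, my phone number is 212-555-0199 and my card ends in 4242."

# Find PII entities (names, phone numbers, etc.) in the text.
results = analyzer.analyze(text=text, language="en")

# Replace each detected entity with a placeholder such as <PERSON> or <PHONE_NUMBER>.
redacted = anonymizer.anonymize(text=text, analyzer_results=results)

print(redacted.text)  # safe to pass to the LLM now
```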

Get Answers From Audio Without Listening

In this article, I'll explain how we can pass an audio file to an LLM, and I'm taking OpenAI as our LLM. Many people, including podcast lovers, prefer audio and video tutorials over reading, as listening seems to be more effective for them than reading a book, an e-book, or an article. It is also quite common that, after a certain period of time, we forget some portions of a tutorial. Now, in order to get the insights again, re-watching or re-listening is the only option, which can be very time-consuming. So, the best solution is to build a small AI-based application, with just a few lines of code, that can analyze the audio and answer all the questions asked by the user. Here, utilizing generative AI could be the best option, but the problem is that we can't pass audio directly, as the LLM is text-based. Let's dive into this article to understand how we can make this work in a step-by-step fashion. If this is what that int...
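As a hedged sketch of the overall flow (transcribe first, then ask questions about the transcript), the example below uses OpenAI's Whisper transcription and chat completions via the openai Python package; the file name, model choices, and question are assumptions for illustration:

```python
# Minimal sketch: transcribe an audio file, then answer questions about it.
# Assumes OPENAI_API_KEY is set and "tutorial.mp3" exists (placeholder file name).
from openai import OpenAI

client = OpenAI()

# Step 1: turn the audio into text with Whisper.
with open("tutorial.mp3", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="whisper-1",
        file=audio_file,
    )

# Step 2: answer a question using the transcript as the only context.
question = "What are the key takeaways from this tutorial?"
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": "Answer questions using only this transcript:\n\n" + transcript.text,
        },
        {"role": "user", "content": question},
    ],
)
print(response.choices[0].message.content)
```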

Integrating ChatGPT With Google Docs

In this article, I'll explain how you can integrate ChatGPT inside Google Docs and utilize the capabilities of any text-based OpenAI model of your choice. If you are not aware of what Google Docs is, it is an offering from Google where you can create and collaborate on documents online. Setting Up Google Docs You can go to https://docs.google.com/, log in with your Gmail ID, and you are all set. If you have already created a document, you can open it; otherwise, you can go ahead and open a new one. A new document will look something like this: In the above window, click on the Extensions button and select Apps Script: Clicking on Apps Script will open an editor where we will write our code. If you're looking for the complete source code and explanation, feel free to check out my article on Medium. Alternatively, you can watch the video here.