Agentic AI Protocols: MCP, A2A, and ACP


Category: AI Protocols

Tags: A2A, AI, Integration, MCP, Protocols

Entities: A2A, Anthropic, Data Science Dojo, Google, Google Docs, HubSpot, LangChain, MCP, Microsoft Outlook, Zad Ahmed, Zapier


Summary

    Introduction
    • Zad Ahmed introduces himself as an instructor at Data Science Dojo, specializing in AI products and protocols.
    • The webinar focuses on agentic AI protocols, particularly MCP and A2A.
    Agentic AI Protocols
    • Agentic AI protocols facilitate communication between AI agents and external tools.
    • Model Context Protocol (MCP) is a popular protocol for AI engineers, allowing tools to be called like APIs but with enhanced capabilities.
    • Agent to Agent (A2A) communication allows structured collaboration between AI agents in an organization.
    MCP Integration
    • MCP allows AI agents to access external tools using standardized protocols, improving development speed and reducing complexity.
    • Examples include integrating with Zapier to connect with thousands of applications.
    • MCP servers and clients enable a decoupled architecture, enhancing microservices integration.
    A2A Protocol
    • Developed by Google, A2A enables hierarchical collaboration between AI agents, improving task distribution and security.
    • A router agent delegates tasks to specialized worker agents based on their capabilities.
    Practical Exercise
    • Participants were guided through setting up an MCP server with Zapier and integrating it with tools like Google Docs and Microsoft Outlook.
    Key Takeaways
    • Agentic AI protocols enhance AI capabilities by enabling communication with external tools.
    • MCP provides a standardized way for AI agents to discover and utilize external tools, improving development efficiency.
    • A2A protocol allows for efficient task distribution among AI agents, enhancing performance and security.
    • Microservices architecture in MCP ensures decoupled and scalable integration of AI tools.
    • Practical exercises with Zapier demonstrate the ease of integrating AI protocols with various applications.

    Transcript

    00:00

    All right. Hello everyone.

    I hope you all are having a great day. So I welcome

    00:17

    you all to this webinar where we'll be discussing agentic AI protocols, MCP and other protocols as well. So I'm going to get started with my introduction first.

    I am Zad Ahmed. I've been working here at data science dojo for more than two

    00:33

    years now. We have been building some cool AI products. We have our own SaaS platform that we call Ejento, which we have been building for quite some time now. Other than that,

    00:50

    I also teach regularly in the Agentic AI and LLM boot camps. So that's a short introduction about myself. We'll be taking all of your questions at the end of this

    01:07

    session. So feel free to just drop your questions in the chat and I'll be going over it uh after the end of the session.

    So let's get started with today's topic. Uh the agenda for today's is we're going

    01:23

    to discuss agentic AI protocols. uh why do we need agentic AI protocols?

    What is an agentic behavior? What is MCP which is most popularly used among uh the AI engineers which is model context

    01:40

    protocol. Then there is A2A which is agent to agent communication.

    Uh okay, I saw a message from Sunita. Uh yes, the recording will be on YouTube and the slides would be shared and even

    01:58

    the repository on which we'll be doing our hands-on exercise would also be shared with you guys later after the end of this webinar. So we'll be doing a hands-on exercise.

    Uh there's Zapier. If you guys have heard about it, Zapier

    02:14

    enables us to connect with 8,000 applications, 35,000 different tools. So, we'll be going over uh Zapier MCP integrations as well.

    Uh so, this is the agenda for today's webinar. Let's get

    02:30

    started with our first topic which is what is an agentic behavior? Why do we need an agentic behavior?

    Previously, it all started, if you remember, in 2023: this LLM era started with just

    02:47

    a simple LLM call. It used to answer questions, but you would have seen that the knowledge of LLMs was restricted to 2022 or 2023.

    So,

    03:04

    the advancement of ChatGPT; let's start from there. In 2023, we had ChatGPT. When we used to ask who the current president of the USA was, or who the current

    03:19

    president of the UK was, the LLM used to respond that it was not capable of answering because its knowledge cutoff was 2022,

    03:36

    then the advancement came towards using LangChain, and people started integrating web search tools and web search capabilities into ChatGPT and other models. So after this new

    03:51

    advancement, which we call passing the context from different searches to our LLM, there was the concept of retrieval

    04:07

    augmented generation. RAG was most popular in 2023 and 2024, and with RAG, LLMs started improving their knowledge with additional information, getting the

    04:23

    context from different areas. This was, I would say, the start of agentic behavior, because we were passing additional information and additional context in the form of prompts.

    So we

    04:41

    used to pass context and the user used to ask questions. So the LLMs were able to respond it correctly because of the extra context we are sharing in the process.

    04:57

    So this is how it all started, and then LangChain's advanced capabilities improved the LLMs' responses as well. Okay. So then, keeping that in mind,

    05:14

    there was uh this new concept of agentic AI protocols. So if I just start with uh how agentic AI protocols uh came into being.

    So I would always give an example of this uh HTTP inspired standardized

    05:33

    agent. So if I go so if most of you have used these REST APIs.

    So previously companies used to build rest APIs so that uh customers can uh utilize the the get post put delete

    05:51

    methods and get the additional or the context from the application they have built. Let's say if we talk about SQL database or if we talk about HubSpot.

    So we usually had rest APIs

    06:10

    that we used to get data out of it. So this was where the new concept came into being of model context protocol that people uh started moving towards

    06:27

    tools. So I would go over some of the MCP servers today.

    We'll look into the code and how they have built their MCP servers. But the concept is exactly similar.

    You used to call REST APIs to get the

    06:42

    information from uh applications, databases. Now we are using tool wrappers on those REST APIs.

    So there are tool wrappers that, underneath, are just calling

    06:58

    those APIs, but they are, I would say, sugarcoated wrappers that query, execute, learn, and reason, all on their own.

    07:15

    So this was the concept of moving from HTTP and REST to a tools-based approach, and this is something every company started adapting to

    07:32

    after Anthropic released this protocol. All right.
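    A minimal sketch of that idea, with a hypothetical send-email endpoint: the same REST call made directly, and then packaged as a "tool" with a description and input schema that an LLM can discover and fill in on its own (the URL and field names are made up for illustration):

```python
import requests

API_URL = "https://api.example.com/v1/emails"  # hypothetical REST endpoint

def send_email_rest(to: str, subject: str, body: str) -> dict:
    """The old way: a human reads the API docs and calls the endpoint directly."""
    resp = requests.post(API_URL, json={"to": to, "subject": subject, "body": body})
    resp.raise_for_status()
    return resp.json()

# The tool-wrapper way: the same call, plus a name, a natural-language description,
# and an input schema, so an LLM can pick the tool and fill in the arguments itself.
SEND_EMAIL_TOOL = {
    "name": "send_email",
    "description": "Send an email to a recipient. Use when the user asks to email someone.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "to": {"type": "string", "description": "Recipient email address"},
            "subject": {"type": "string"},
            "body": {"type": "string"},
        },
        "required": ["to", "subject", "body"],
    },
    "handler": send_email_rest,  # underneath, it is still just the REST call
}
```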

    Okay, so yes, we'll be sharing the slides as well, and the recording will also be

    07:48

    shared. So now we are advancing towards our protocols. What we have learned so far is that there is agentic behavior: agents, LLMs, are calling tools for additional

    08:06

    information. Tools are just sugarcoated wrappers on REST APIs. They are getting the same information that REST APIs used to get for us, but now they are in the form of a wrapper that an LLM can call

    08:22

    easily because they have their prompts they have everything set up there. So coming back to it, this model context protocol is uh a similar kind of infrastructure that you

    08:39

    can think of like a USB-C port: every MCP server, every piece of business logic, remains decoupled from your entire architecture. Let's say you have your MCP client.

    This is something that you

    08:54

    can use. There are multiple clients, such as FastMCP and mcp-use, and there are also the LangChain adapters. There are a number of repositories on GitHub about these. So MCP clients

    09:11

    are there. All you need to do is plug and play.

    You set up your MCP server on a different web app, and all of the business logic remains in that web application. The client follows a similar kind of

    09:28

    approach that we used to use in REST APIs. The client will call a particular tool.

    Let's say the user query was: send an email to this person. Let's say I write the query "send email",

    09:46

    "send email to Sanjay". What would the LLM do? The LLM would go and pick the Gmail tool which sends email. Let's say Gmail has a

    10:02

    tool which sends email, or let's say an API with a tool wrapper. The MCP client would call this tool.

    The tool will execute its business logic

    10:18

    in the server; the business logic is executed in the server. Then, after the email has been sent, the response is sent back to the client:

    10:33

    yes, your email has been sent. The client could be anything: your ChatGPT, your Claude app or IDE, Cursor, or VS Code.

    10:49

    Any IDE could be serving as the client, and it would respond to the user that we have sent an email to this person and that the agenda and all the details have been sent successfully.
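    For readers who want to see that flow in code, here is a sketch of a client session based on the MCP Python SDK as I understand it (package and method names may differ slightly between SDK versions; the gmail_server.py script and its send_email tool are hypothetical):

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Hypothetical local MCP server script that exposes a "send_email" tool.
server = StdioServerParameters(command="python", args=["gmail_server.py"])

async def main():
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()                 # handshake with the server
            tools = await session.list_tools()         # 1. discover the available tools
            print([tool.name for tool in tools.tools])
            result = await session.call_tool(          # 2. client calls the tool; the
                "send_email",                          #    business logic runs server-side
                arguments={"to": "sanjay@example.com",
                           "subject": "Webinar agenda",
                           "body": "Sharing today's agenda..."},
            )
            print(result)                              # 3. response comes back to the client

asyncio.run(main())
```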

    11:07

    So this microservices architecture, this decoupled architecture, brings additional knowledge, additional context, and additional power to your LLM client and your agents. So now

    11:25

    agents have improved a lot: at the start they had a knowledge cutoff and they didn't have this many tools, but now all of this is coming together in a

    11:41

    closed loop: you have your MCP servers, you have your business logic in a decoupled architecture, and your client is just calling those tools, making them execute the work, while the rest of

    11:56

    the business logic remains in those MCP servers. All right, any questions so far? Just drop them in the chat; we'll be answering them during the session and at the end as well.

    12:12

    So MCP enables effective tool integration for LLM agents. This is the protocol that Anthropic set up. It standardizes how LLM agents discover, access, and utilize external tools and resources. So the first step for an

    12:30

    MCP server or MCP client, let's say you have set it up in ChatGPT or Claude, is to discover the tools: how many tools are available to it.

    12:47

    Sunita is asking: can we say MCP is a REST API? I would say it is a second layer on top of REST APIs, because it is more than a REST API. You used to add parameters to REST APIs, but in tools and MCPs you add prompts, you add custom

    13:05

    instructions, you add all of the details related to it, and then you call that API. So this is the right point to go and look into one of the MCP servers. I would always choose this HubSpot MCP server. I'm going to share this link

    13:23

    in the chat as well. You guys can also explore.

    This is the official HubSpot MCP server. If we go and look at the code of this HubSpot MCP server, you can see that there are different prompts that this server has.

    There are different

    13:39

    tools. So if you go and look in the tools, these are different uh associations engagements.

    If you have worked with HubSpot, these are different uh actions that you can take on HubSpot. There are engagements, links, objects,

    13:55

    properties, workflows. There are tools for every API. "Get workflow", "list workflows": these used to be APIs, but now they have been converted into

    14:11

    tools. So if you go over this and you can find that it has a tool name, description, the purpose has been written, the usage guidelines are there, the input schema is there.

    The LLM is responsible for filling in the input schema, and then it

    14:29

    will go and try this API. You can see over here that it is just calling this API with the right arguments. The LLM is responsible for adding the arguments

    14:46

    after reading the input schema, and then, with the simple try/catch block we used to write, a tool has been created on top of it. Organizations used to build REST APIs previously.

    Now they are building tools on top of it. So

    15:05

    this is an open-source repository that you can go and look into to see how HubSpot has created their own MCP server. Okay.
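    The real HubSpot server is written in TypeScript, but a rough Python analogue of the structure described above (name, description and usage guidance, input schema, and a try/except around the underlying REST call) looks something like this; the HTTP details are simplified:

```python
import requests

HUBSPOT_BASE = "https://api.hubapi.com"

GET_WORKFLOW_TOOL = {
    "name": "get_workflow",
    "description": (
        "Retrieves detailed information about a specific workflow from the HubSpot account. "
        "Usage guidance: use the flowId parameter to specify which workflow to retrieve."
    ),
    "inputSchema": {
        "type": "object",
        "properties": {"flowId": {"type": "string", "description": "ID of the workflow"}},
        "required": ["flowId"],
    },
}

def get_workflow(arguments: dict, token: str) -> dict:
    """Handler the server runs when the LLM calls the tool: the REST call inside a try/except."""
    try:
        resp = requests.get(
            f"{HUBSPOT_BASE}/automation/v4/flows/{arguments['flowId']}",
            headers={"Authorization": f"Bearer {token}"},
        )
        resp.raise_for_status()
        return resp.json()
    except requests.RequestException as exc:
        return {"error": str(exc)}
```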

    15:20

    So, let me take one question: "Can you build a custom MCP based on the context we have?" Yes. There is also one more thing: if you have your Postman API collections, Postman has set things up so

    15:37

    that you can just convert your API collection into an MCP server. If I go over this Postman "API collection to MCP server" flow, it is as easy as building collections on Postman. You can create

    15:53

    your own custom MCP servers as well. So over here you can go into this resource.

    You can create your MCP servers there. If you have Postman collections, try it out; I think within 15 to 20 minutes you will have your own

    16:10

    custom MCP server. So I'm going to share it in the chat as well.

    This is super powerful. Think of it this way: whoever is building an API collection can also build their MCP server and start renting it out to other

    16:26

    people. Okay.

    So there is this question from Abdul Razak: how many tools can an MCP expose? What if we have 10, 100, or 1,000 tools? How many MCPs can I incorporate into my LLM application?

    16:43

    Yes. Context window.

    I read a research paper on this: Anthropic advises not to share more than 40 tools with an LLM. This is the limitation, or hard limit; you can go and look into the resources about it. With more

    17:02

    than 40 tools, your agent will start hallucinating; it will start calling the wrong tools.

    Okay. So moving forward to our slide

    17:17

    deck. So now we know that people used to build REST APIs, and now they are building tool wrappers around them.

    So if we go back to this there is the first step for any MCP client is to discover how many tools you have. Then after it

    17:35

    has discovered how many tools there are, it starts doing the function calling: it takes in the input schema, and then, from the user query, the LLM understands the intent

    17:50

    of that query and then prepares the arguments for the API that needs to be called. So this function calling has a lot of things to do. It takes into account the input schema of that API, and then, based on the user intent, it will

    18:07

    fill out the arguments for the API and do the REST API call underneath the function calling. Then there is the third step, which is result processing.

    So after the business logic

    18:25

    has been executed, after the tool has been called, the result is processed, and then there is context integration. So this is standardized discovery.

    Anthropic actually standardized this

    18:42

    protocol among all of the companies that have been building these MCP servers. The benefits: there is faster development for tool-augmented LLMs, 40 to 50% faster

    18:58

    development. There is robust error handling and failure management, standardized discovery, and consistent invocation: all of the tools follow the same JSON-

    19:13

    based function calling interface. As for the implementation benefits of MCP, organizations implementing MCP report a 60% improvement in cross-platform compatibility and a 70% reduction in tool

    19:28

    integration complexity. Previously when we used to build integrations for our LLM agents, it used to take us around few weeks to build connectors.

    Just like we build a connector for teams, we build a connector for outlook. We it used to

    19:46

    take us weeks to integrate. But now, with the advancement of MCP servers, we can do that job with a 70% reduction in tool integration complexity.

    So

    20:01

    you can share your questions in the Q&A box as well. Um, so can I repeat how schema are used in the calling of tools?

    Yes, of course. So let me go back to the code and we'll go

    20:17

    back to workflows. So you can see over here.

    So these are the details we are passing to our LLM. We pass with tools.

    We pass the name of that tool, the description of the tool. The description of this tool is the purpose.

    This tool

    20:33

    retrieves detailed information about a specific workflow from the HubSpot account. The usage guidance is this.

    Use the flowId parameter to specify which workflow to retrieve. We are teaching all of this to our LLM so it can call an API.

    20:50

    Consider that your LLM is a new intern in your company: you're giving it the documentation so that it can start calling the REST APIs and getting information from them.

    So this is the exact same script.

    21:05

    It has this input schema where it defines all of that, and with these details your LLM is responsible for calling this tool. This tool would

    21:22

    further call the API, which you can see over here: client.get on "automation/v4/flows" with arguments.flowId. So you can see that, at the end of the day, there is just a function being called by the LLM, in which we

    21:39

    have written out the API. Um I hope that answers your question.
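    Concretely, the schema is what lets the model fill in the arguments. A hypothetical end-to-end sketch of that hand-off (the tool name and argument values are illustrative):

```python
import json

# What the client advertises to the model: name, description, and input schema.
tools = [{
    "name": "get_workflow",
    "description": "Retrieves detailed information about a specific HubSpot workflow.",
    "inputSchema": {
        "type": "object",
        "properties": {"flowId": {"type": "string"}},
        "required": ["flowId"],
    },
}]

# What the model emits after reading the user query "show me workflow 12345":
# a JSON function call with the arguments filled in according to the schema.
llm_function_call = json.loads('{"name": "get_workflow", "arguments": {"flowId": "12345"}}')

# The client's only job is to look up the matching handler and pass the arguments through.
handlers = {"get_workflow": lambda args: f"GET /automation/v4/flows/{args['flowId']}"}
print(handlers[llm_function_call["name"]](llm_function_call["arguments"]))
```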

    Um so okay so please feel free to drop your

    21:55

    questions in the Q&A box and I will be replying to all of them at the end of the session. Okay, so let's go to the next slide and look into this.

    So there is this view of

    22:13

    the Model Context Protocol: your host with MCP clients (Claude, IDEs, tools), or you could say ChatGPT, or what we built, Ejento; I'll say Ejento is also an MCP

    22:28

    client now. So this is the MCP protocol that we are using: there is an MCP server, call it the Gmail MCP server, with a local data source, and then there is another MCP server. There is

    22:44

    another MCP server, connected via web APIs to a remote service. Let's say it is a remote SQL server, your Pinecone vector database, or your Google Drive.

    23:07

    All right. So this is a segmentation that we have done that um all of these things they are separated out in a microservices architecture and we are just calling the tools with our MCP client.

    23:23

    Okay. Moving forward from here: MCP manages reliable agent connection lifecycles. There are a number of things going on with these MCP

    23:38

    client and server operations. There is this MCP client, which is, let's say, Ejento, or you can call it Cursor; they are considered clients because they have the LLM, an agent, in them. And you have your

    23:57

    MCP server over here, which is, I would say, the Gmail MCP server. Whenever a user asks a question, it will go to the client, and the LLM will

    24:14

    go and do the function or tool call. In order to call the tool in the MCP server, a connection is established, and then it authenticates and verifies it.

    If it's the right

    24:30

    client that has access to this, then the connection is maintained while the server is working. When the server executes its business logic, it gets back to the client: yes,

    24:48

    your task has been accomplished, and then it gracefully disconnects. The beauty of this is the identity verification: everyone is now building OAuth2-compliant MCP

    25:05

    servers, so it can further take tokens into account. Authentication is done, the connection is monitored, the system is resilient, and failures are reduced.
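    As a purely illustrative sketch of that lifecycle (none of these names come from a real SDK; they only mirror the steps described: connect, authenticate, keep the connection while tools run, then disconnect gracefully):

```python
from contextlib import contextmanager

@contextmanager
def mcp_connection(server_url: str, access_token: str):
    """Hypothetical lifecycle wrapper: establish, verify identity, then gracefully disconnect."""
    connection = {"url": server_url, "authenticated": False}
    try:
        # 1. Establish the connection and verify the client's identity (e.g. an OAuth2 token).
        connection["authenticated"] = access_token.startswith("token-")
        if not connection["authenticated"]:
            raise PermissionError("this client is not allowed to call the MCP server")
        yield connection            # 2. Connection stays open while the server runs the tools.
    finally:
        connection.clear()          # 3. Gracefully disconnect, even if a call failed.

with mcp_connection("https://example.com/mcp", "token-abc123") as conn:
    print("calling tools over", conn["url"])  # business logic executes on the server side
```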

    25:21

    Okay. So going over this uh there is another thing that this agent discovery and registration.

    So how would the

    25:38

    MCP client understand how many tools it has? We'll be doing all of this in a hands-on exercise. You give an example of the schema for the agents, for example a weather agent ID and description, and we'll go over this when

    25:55

    we start with our practical exercise. I would go over this practical code as well.

    For the time being let's skip it, and we'll come back to it when we get started with our Zapier integration. So far, if anyone has any questions, please drop them in the chat and we'll go

    26:13

    from there. The other protocol, the second protocol, is agent-to-agent communication. Google actually introduced this protocol,

    26:30

    which enables hierarchical agent collaboration in enterprises. I remember someone was asking how multi-agent collaboration works.

    So we're going to discuss that and then we're going to move forward to our hands-on exercise.

    26:50

    So agentto agent communication was developed by Google. It enables structured collaboration between agents within organizational boundaries through a hierarchical management approach.

    So there is a manager agent or you can say

    27:06

    CEO, who delegates tasks, coordinates workflows, and monitors progress; or you can call it a supervisor agent. There are other names for it as well.

    A router agent

    27:21

    and then there are multiple worker agents that execute specialized tasks based on specific capabilities. So the benefit of this agent-to-agent

    27:36

    protocol is distinct task distribution and intelligent allocation of subtasks to specialized agents. Enterprise-grade security and policy enforcement can be done here, with governance and security.

    You

    27:52

    can add role based access or one agent would have access to one set of documentation which we'll go over in a bit. So the performance improved 45 there are

    28:08

    45% improvement in performance because there is sole ownership of one agent who is let's say you have uh one agent for HR you have one agent for sales you have one agent for marketing domain

    28:24

    so the beauty of this is that your agent is responsible for performing one set of tasks so this enables This enables the governance and security part as well that your one agent would

    28:40

    have access to the data of HR only. This agent would have uh the data for marketing only and this agent would have data for sales only.

    So this solves the problem for uh this solves the problem for

    28:56

    governance and security and the other things as well. So if I go over this, if I go back, so this protocol actually helps us in task distribution.

    There are

    29:13

    40% reduction in complex workflow completion times because when you give 10 different things to one particular agent, uh there used to be a lot of hallucination with that. So performance is improved task with

    29:29

    task distribution and governance and security. So this is these are the benefits of agent to agent protocol.

    So coming back to it um this router agent so you can take into the account of this that let's say this

    29:45

    router is uh your company's CEO. Whenever a user ask about a question let's say uh uh sales revenue for 2025

    30:02

    it will it will route this question to this sales agent. If there is an uh if there is a question related to HR policies, it will route it to the HR agent.

    Once

    30:19

    once it has shared this question with the HR agent, it will uh reply back with an answer and you'll get the answer from this router. So this router has two tasks to do.

    One is it will route the

    30:34

    query after understanding the intent and the second most important task it has to do is prepare the response. Prepare or generate final response.

    30:56

    So these are the two uh most important tasks that this router u this router agent would be performing. So I see a number of questions that are related to the slides.

    Yes, we'll be sharing the slides and demo code as well

    31:13

    and you will have the recording of the session too. Okay.

    So so similar to this router agent there is also a concept of consolidator. We'll go over it but if anyone has any questions

    31:30

    about this agentto agent protocol I would be happy to respond. So this is agent to agent protocol.

    We are discussing agent AI protocols. We have went through uh MCPS.

    Now we are

    31:46

    discussing agent to agent protocol. So quiet if at the end it's um so uh so pankage your question is why do we need MCPS if it is just calling an

    32:04

    API uh only behind the scenes so it's just one more of a thing so if you if you have used if you have used an API tool earlier to this to MCPS the LLM used to listenate a lot because

    32:21

    the LLM was unaware of API. Uh it was used to be unaware of the input schema for that API.

    It was used to be unaware of it was used to be unaware of um the parameters that this API would require.

    32:38

    So we used to pass a lot we used to pass we used to dump the entire documentation for the API in the prompts. which results in uh context window issues,

    32:53

    context window issues. So there was a lot of uh issues that we used to face with uh the simple API call or the function call with LLMs.

    So the benefit of MCP is that you

    33:10

    can see it's a distributed approach. Every API now is a tool.

    Now it has every API has its own prompts. Let's say there is a uh there is a API for get list or get workflows.

    33:28

    So now now the burden of building those APIs, building those prompts. Now all of the responsibility of this API tool calling and function calling is on the on the company itself who have who has their

    33:45

    APIs. Previously we used to you we what we used to do we used to pick their rest APIs we used to write custom prompts we used to write all of the function calling logic everything.

    So let's say if I'm working

    34:01

    with teams rest APIs, I am not fully aware of all the arguments that particular API required because I'm new to that application. But the person who has built that API would be the best person to write the prompt for that API would be the best person to write the uh

    34:18

    input schema to. So let's take an example.

    If I am uh working in data science dojo and I want to work with teams API, I would there is there is some time that I would spend on those uh teams APIs, I would go into the I would

    34:34

    look into the documentation. I would understand the parameters when API required.

    So if I'm trying to explain an LLM that hey there is this team's API use this and that there is this is the one

    34:50

    example. The other example is the person who has built that API is trying to explain an LLM.

    This is the API. You can use this, you can use it for that, you can use it for that.

    He will know it in and out of that. So consider you have

    35:06

    for teams you have thousands of thousands of APIs. If I'm the person writing the prompt for it, it won't be as uh efficient as the person who has built all those thousands of APIs.

    So you see this MCP protocol it has it has

    35:26

    diverted the responsibility from from a person who is a client or who wants to consume that API to the person who is building the APIs. So now the person who is building the APIs is responsible for writing the prompt is responsible for

    35:42

    making that API available for an LLM in the best place in the best manner as compared to me who was a customer who was just trying to consume that API. So all of this responsibility has been shift to the companies.

    So I would say

    36:00

    MCP is we have shifted the uh responsibilities to the person who is building the APIs. We are making it sure that it is uh it is available for LLMs to interact with

    36:15

    this AP. So now there is one more thing.

    Let's say if I'm a software engineer I'm building an API. I would think of an prompt that I can add into it and I can make it LLM ready.

    So I would say MCP servers are just rest APIs. The person

    36:32

    who are building those rest APIs they are making it available for LLMs to interact with. Previously they used to write documentation for humans to interact with.

    REST API was a protocol that we humans were uh following it. Now

    36:48

    MCP is a protocol that LLMs are following it to call the rest APIs. Can the router agent and the worker agent use different LLM?

    Absolutely. Yes, the router agent and uh worker agent can use different LLMs.

    37:06

    So we have this functionality in our product as well. Uh so you're building a multi- aent system and I want to understand how to use MCPS.

    So, SNITA if you have a multi- aent agentic system you can go and look

    37:23

    into lang chain adapters. Langchain adapters are it's it's a repository.

    I would share it at the end of this class. It converts all of those tools in re in it converts all of those MCP

    37:39

    server tools into something that lang chain langraph agent can consume. So I'm going to share this with uh you at the end of the session.

    Um does A2A protocol also have a

    37:55

    specification similar? Yes, A2A protocol has their own specification own there are multiple repositories that are available for A2.

    How do agent pass their response to another agent? Does it use router for

    38:12

    the consolidated final response? Yes.

    So um all of these agents they they can they talk to the router agent in form of uh JSON response. So they just so this

    38:30

    router is the person who is consolidating the final response. Okay.

    So any other questions? Uh we had open AI specification I believe that shared the contract request response failure definition.

    Yes punks. So previously we

    38:48

    used to have open AI open API specs. We as a human used to use open API specs and we used to make sure that we are calling the right right rest APIs.

    But now we are building our APIs in such a

    39:04

    manner that any other LLM can call it. [Music] Uh how do how to define an agent is manager or worker?

    Maybe the demo. Yes,

    39:20

    we'll be doing a demo on MCPS right now. But uh so how do you define the definition is in system prompt.

    It's just the difference of system prompt which takes into the account in the router and a worker agent. So it is as

    39:37

    so Ravi it is as simple as that that you write a system prompt for the router agent. It will it will uh do its job accordingly and then if you have a worker agent you will define its

    39:52

    capabilities in the system problem. Can an agent be a worker for two different routers?

    Uh yes. Uh in the workflow we have created in our product, uh you can reutilize this agent in a different

    40:09

    router in a different consolidator because it's just another instance of that agent that would be called. It's just an API of this agent that that we are calling.

    Okay. So,

    40:27

    okay. So, let me remove this annotation and we'll go back to the next slide and we'll go to the next slide.

    I'll stop annotation. Similar is the criteria for consolidator

    40:43

    as well that all of the agents you have an agent for HR, marketing, sales, consolidator. Let's say you ask a query that is what are the HR policies and how does it affected our marketing

    41:00

    strategies in on LinkedIn. So it will get the response from the marketing agent as well and the HR agent as well and the consolidator would further consolidate the response.

    Okay. So

    41:17

    I think that's pretty much about it from the from our theory part. Let's go and uh for the for this exercise to complete you guys would have um you'll have to actually log into Zapier.

    So log to

    41:34

    Zapier. Uh I would just log out.

    So you need two prerequisites for this exercise. You have you must create an account in Zapier and you must have uh a cursor agent running along.

    So,

    41:52

    so we'll get started with this our practical exercise. I'm going to log out and login again to actually walk you guys through the flow.

    So, this is a short interactive exercise that all of you can try it out. This recording is being done if you don't have those two

    42:08

    prerequisites. a cursor and a Zapier account.

    It's all right. You can um you can try it out later because everything will be recorded.

    Okay, I can share this Zappier link as

    42:25

    well. You can just log in from here and then it is mcp.zapier.com uh zapier.com.

    [Music]

    42:43

    So, I'll just continue with my Google account.

    43:00

    Okay. So, this is it.

    mcp.zapier.com. zapier.com.

    You guys can all log in and then follow these steps. If you don't have prerequisites, don't worry.

    This recording will be shared with you.

    43:16

    It will be on YouTube as well. So, we're going to get started with create a new MCP server.

    We're going to select uh cursor as our client and then we'll be I'll just name it as uh live

    43:31

    live session. I'm going to create the MCP server.

    So there we are. We have our MCP server.

    We're going to start adding tools. So I'm going to add this Microsoft Outlook tool.

    I will ask it to have all of the

    43:49

    tools available through Microsoft. I'll add that all of the authentication everything would be done by Zapier itself.

    You can see now I have all of these tools available which is find calendar events, find contacts, find emails, all of this as a tool are

    44:06

    available. So these are just APIs and tool wrappers that you can then you'll go to the connect link.

    You will see uh this entire code that you can copy from here. Treat

    44:23

    your MCP server URL as uh password because this is highly sensitive. Anyone who would have access to your MCP server URL would be able to call those tools from its own cursor.

    44:38

    So I'm going to quickly uh go over it. I'm going to So So you have to in cursor you have to come and go to you have to come and click on MCP and integrations.

    44:54

    Uh you can add a new MCP server. you can uh so I'm going to go and click add a new MCB server and then I'm going to paste the code that I have copied from there over here.

    So after this

    45:12

    uh I'm going to say it's later and then I'm going to select and ask my agent how many tools do you have access to? How many tools you have access

    45:28

    tools do you have? So even from even from cursor uh you can see it has so these are the tools the cursor itself has.

    So if you go and

    45:46

    uh uncheck and click on Zapier, you will see how many tools uh this Zapier MCP has. So if I go over it and click on MCP so it is loading the tools currently.

    46:03

    Uh you can go to this just add a new MCP server and it will load out the tools. So in the meantime it is loading these tools.

    Uh I'm happy to take the questions if there are any questions.

    46:21

    Okay. So you can see it has 23 tools available.

    Uh you it has Microsoft Outlook find calendar do this do that. All of these tools are available on our cursor after just a few clicks of integration.

    Uh,

    46:38

    how many tools do you have from MCP integration?

    46:53

    So, you can see it has all of these tools available with MCP Zapier. Now, with these tools, I can send out emails.

    I can send out anything to uh let's try it out. sending out an email.

    Send out

    47:09

    an email to Sanjay at datascience dojo.com um with these details

    47:24

    of my webinar agenda. I'm going to go over the slide.

    Just click out just I'm just going to copy this and paste it over here.

    47:43

    Okay. So now you'll see that this cursor agent would call an MCB tools and so do you have plans to cover A2A demo and details about ACB?

    Uh uh Ravi will be

    48:01

    actually doing the A2A demo in the next session because I believe we are just we're short on time today. It's just 15 minutes left.

    We I I will go over ACP as well. So the we actually set it's a 1

    48:16

    hour commitment. So we'll only go over the MCP demo but I will go over the comparison about MCP, ACP and A2A everything in a consolidated.

    Okay. So

    48:33

    so you see it has uh it has called. So you can now look into this.

    Now you can look into this process. Uh it it strategize with this instruction.

    Send an email to this. Recipient is

    48:48

    Sanji data science dojo. Subject is webinar agenda.

    Body is this. It has successfully called the right tool.

    Microsoft Outlook send email. Uh it has also uh written out the detail

    49:05

    in it. And now if I go and open up my outlook uh sent emails I would find let me actually go and find the email it has shared with Sanjay.

    49:21

    Okay. So I'm going to find that email and we'll share with you guys.

    So all of you how many of you are trying it out this MCP uh tools? Have you set up your Zapier account?

    Were you able to execute some

    49:37

    of it? So you can see these are the details.

    These are the tools available and you can try it out with this uh Zapier MCP service. Okay.

    So does

    49:56

    uh does all the information need to be passed to LLM for every separate interaction within a session need to be passed? So I don't think so all the so the beauty of this model context protocol is that it has the

    50:15

    capability to call the relevant tool based on the user the query is asking. So let's say u let's try it out.

    Uh I'm going to start connecting my Google Docs as well.

    50:31

    Uh I have my web previous webinar agenda on Google Docs. So I have connected the Google Docs tools.

    It can find a document. It can append text to a document.

    It can create document from template. All of this is

    50:48

    possible. So I'm going to find the webinar agenda on Google Docs.

    I think this was uh agentic AI with MCP integration. That was this was the previous webinar.

    So let's try it out.

    51:04

    Now if I just plug out and plug in again, Zapier would automatically detect the Google Docs tools as well. So now I can see that it has uh the tool to find out the documents.

    I'm going to ask my

    51:20

    agent find the details about this document on my Google Docs. [Music]

    51:39

    So it is calling this tool Google Docs find a document. Uh how do you do it without Zapier Suna?

    Uh all of these MCP servers are available on this repository called uh

    51:57

    MCP market. So you can deploy your MCP servers on your uh web apps.

    This is how we are doing it on our product because on our production environment you can deploy all of these MCP servers on your web apps and then you can uh use them in

    52:14

    your client. Okay.

    So if I go back to cursor you can see that it has f it has found out the entire document agentic with MCP integration. This was the this was the

    52:30

    owner of this document. the document ID, the details about this.

    Now I can say send the entire details to Sanjay at data science

    52:46

    dojo.com. [Music] So now we are using two tools at a time.

    It has acquired the information from that particular document. Now it is it is sending out that uh agenda

    53:03

    documentation from u agenda documentation from the Google docs. Now it is sending out to Microsoft Outlooks by emailing me.

    Okay. So anyone has any questions so far?

    53:21

    If you were able to build your own tools and function how would you do say here in Zapier MCU? uh there are multiple sessions that we can do on building your own MCP server and then using it without Zapier as

    53:36

    well. So we are planning to actually schedule those as well.

    So now it says the email has been delivered successfully through your Microsoft Outlook account. I'm going to go and look in the sent emails.

    Yeah, you can see over here

    53:53

    uh it has uh shared the details but not in the proper structure. So this is something you can prompt.

    This was the previous email that I that this MCP sent out to uh Sanjay and this was the

    54:08

    current email at 11:52. This is the email that's been automatically sent out.

    So I would say please resend this email with better

    54:25

    um with I would say please resend this email and make sure to format it correctly.

    54:40

    This stink how one can create tools say match teacher on board to MCV server and begin

    54:56

    using it. Uh you can easily create tools.

    uh the the e the best way is uh if you want to create tools just uh start with the basic uh repository. I would say fast MCP is the repository you

    55:15

    can get started with. Just go to fast MCP and start with a fast API backend server build your APIs.

    U so so I think Sai this is your question. Let's say if you want to create one tool

    55:33

    match teacher and on board to MCB server begin using it. Yes it is as simple as that.

    Just copy this code. Fast MCP import fast MCP.

    This is a demo. You create your tool over here and then just

    55:48

    run this server. This is your first demo MCP server that you can get started.

    We are planning to schedule more uh similar events that we we can create our MCP servers and then uh custom MCP servers and then deploy custom MCP servers on

    56:04

    web apps and then use it. So you guys can use this repository.

    I'm going to share this in the chat as well. Okay.

    And then there was another library

    56:20

    I was talking about langchain adapters. So this is langchain mcp adapters that if you have a lang chain agent, if you have a lang graph agent, you can use this uh you can use this as a client for

    56:35

    your products applications. Okay, so let's go back to our cursor.

    Yes, it has improved formatting features, all of this colored headings. Let's see the new email it has shared.

    56:52

    Oh, I think that's the email I was uh expecting. Okay, dear Sanjay, I'm sharing the complete details for our coming webinar.

    Please find it a very beautiful header with all the

    57:08

    details interactive element. I really like the uh the email.

    So, I think that's about it. Uh who are the boot camps aimed at engineers?

    No, our LLM boot camps agent

    57:24

    boot camps is uh beginner friendly. If you know even if you don't know basic Python will share some prerequisites with you.

    So you can join our boot camps. It is uh we have prepared it uh in a very beginner friendly environment.

    57:41

    So anyone who is a college student or they can join us we get started with a very beginner friendly theory. Okay.

    So anyone has any questions so far.

    57:58

    So that's about it from today's session. We discussed uh we discussed the theory about agentic AI protocols.

    Why do we need agentic AI protocols? What is MCP A2A?

    There are further more things that we would be covering in our next

    58:14

    sessions. Creating your custom MCP servers, deploying your MCP servers on web apps, then using it on different clients.

    But this was the start. I would recommend all of you to try try out this exercise on your own.

    Try uh with Zapier

    58:33

    MCP and then move forward from there. Think so.

    That's about it. Uh so all of you that have joined um we can stay in touch.

    You can just uh reach out to me on my LinkedIn. I'm going to share that with

    58:50

    you guys in the chat as well. Okay.

    So Alishba over to you if you would like to share any ending notes. >> Thank you Zad.

    So I'm going to be taking

    59:05

    the screen here. Give me one second.

    Hello everyone. So this is Alishba from data science dojo and before we end this session I would like to introduce you to our upcoming boot camp Agent boot camp

    59:23

    and I have shared the link in the chat as well. So you can check the page out.

    Um we have a boot camp coming. We have put all the details on the page the link that I've mentioned.

    you will be able to see that who this boot camp is for to see where you relate and it is like Zad

    59:39

    mentioned it's for beginners it's a very beginner friendly boot camp as well um it's very practical as well and you will be able to see all the guest speakers instructors that we have in the boot camp as well and you will be getting the verified certificate from the University

    59:55

    of New Mexico you'll find more details on the page and you will find the whole curriculum and everything we're going to be covering on it. The dates that we have is from for September 30 and October 9.

    Um, so go

    00:10

    ahead, enroll for them. And just so you know, we're having an a conference next week.

    And we are offering free boot camp seats and giveaways and much more. So, make sure that you register for the conference and join us next week and you

    00:25

    could be one of the lucky winners. With that, we will end the session.

    And if you have any questions or anything just make sure you can ask us on support or any channel. Um they are all mentioned on our LinkedIn and everywhere.

    And

    00:41

    before I end this, thank you so much dad for such a wonderful session and we hope to see you in the next session as well. Thank you everyone.

    Have a great day. Bye-bye.

    And before I end this, yes, the recording will be available to everyone.

    00:56

    I'll be sending an email to all the participants with the resources, slides, and recording.