
A Code Implementation for Building a Context-Aware AI Assistant in Google Colab Using LangChain, LangGraph, Gemini Pro, and Model Context Protocol (MCP) Principles with Tool Integration Support
www.marktechpost.com
In this hands-on tutorial, we bring the core principles of the Model Context Protocol (MCP) to life by implementing a lightweight, context-aware AI assistant using LangChain, LangGraph, and Google's Gemini language model. While full MCP integration typically involves dedicated servers and communication protocols, this simplified version demonstrates how the same ideas (context retrieval, tool invocation, and dynamic interaction) can be recreated in a single notebook using a modular agent architecture. The assistant can respond to natural language queries and selectively route them to external tools (like a custom knowledge base), mimicking how MCP clients interact with context providers in real-world setups.

```python
!pip install langchain langchain-google-genai langgraph python-dotenv
!pip install google-generativeai
```

First, we install the essential libraries. The first command installs LangChain, LangGraph, the Google Generative AI LangChain wrapper, and environment-variable support via python-dotenv. The second command installs Google's official generative AI client, which enables interaction with Gemini models.

```python
import os

os.environ["GEMINI_API_KEY"] = "Your API Key"
```

Here, we set the Gemini API key as an environment variable so the model can access it without it being hardcoded throughout the codebase. Replace "Your API Key" with your actual key from Google AI Studio.

```python
from langchain.tools import BaseTool
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain.schema.messages import HumanMessage, AIMessage
from langgraph.prebuilt import create_react_agent
import os

# Initialize the Gemini chat model with the key loaded from the environment.
model = ChatGoogleGenerativeAI(
    model="gemini-2.0-flash-lite",
    temperature=0.7,
    google_api_key=os.getenv("GEMINI_API_KEY"),
)

class SimpleKnowledgeBaseTool(BaseTool):
    """A stand-in for an external context provider, like an MCP server."""
    name: str = "simple_knowledge_base"
    description: str = "Retrieves basic information about AI concepts."

    def _run(self, query: str):
        knowledge = {
            "MCP": "Model Context Protocol (MCP) is an open standard by Anthropic designed to connect AI assistants with external data sources, enabling real-time, context-rich interactions.",
            "RAG": "Retrieval-Augmented Generation (RAG) enhances LLM responses by dynamically retrieving relevant external documents.",
        }
        return knowledge.get(query, "I don't have information on that topic.")

    async def _arun(self, query: str):
        return self._run(query)

kb_tool = SimpleKnowledgeBaseTool()
tools = [kb_tool]
graph = create_react_agent(model, tools)
```

In this block, we initialize the Gemini language model (gemini-2.0-flash-lite) using LangChain's ChatGoogleGenerativeAI, with the API key loaded from the environment variable set earlier. We then define a custom tool named SimpleKnowledgeBaseTool that simulates an external knowledge source by returning predefined answers to queries about AI concepts such as MCP and RAG. This tool acts as a basic context provider, much as an MCP server would. Finally, we use LangGraph's create_react_agent to build a ReAct-style agent that can reason through prompts and dynamically decide when to call tools, mimicking MCP's principle of tool-aware, context-rich interaction.

```python
import nest_asyncio

nest_asyncio.apply()  # allow awaiting coroutines inside the notebook's running event loop

async def chat_with_agent():
    inputs = {"messages": []}
    print("MCP-Like Assistant ready! Type 'exit' to quit.")
    while True:
        user_input = input("\nYou: ")
        if user_input.lower() == "exit":
            print("Ending chat.")
            break
        inputs["messages"].append(HumanMessage(content=user_input))
        # Stream successive agent states and print each new AI reply.
        async for state in graph.astream(inputs, stream_mode="values"):
            last_message = state["messages"][-1]
            if isinstance(last_message, AIMessage):
                print("\nAgent:", last_message.content)
        inputs["messages"] = state["messages"]

await chat_with_agent()
```

Finally, we set up an asynchronous chat loop to interact with the MCP-inspired assistant.
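Before looking at the loop in detail, note that the retrieval step inside SimpleKnowledgeBaseTool._run is an exact-match dictionary lookup, so a query like "what is mcp?" misses the "MCP" entry. Stripped of the LangChain wrapper, a slightly more forgiving, case-insensitive variant (a sketch with hypothetical names KNOWLEDGE and lookup, not part of the tutorial code) could look like this:

```python
# Sketch: the tool's retrieval logic without the LangChain wrapper.
# KNOWLEDGE and lookup are illustrative names, not from the tutorial.
KNOWLEDGE = {
    "MCP": "Model Context Protocol (MCP) is an open standard by Anthropic "
           "for connecting AI assistants with external data sources.",
    "RAG": "Retrieval-Augmented Generation (RAG) enhances LLM responses by "
           "dynamically retrieving relevant external documents.",
}

def lookup(query: str) -> str:
    """Return the first entry whose key appears anywhere in the query,
    compared case-insensitively, instead of requiring an exact key."""
    q = query.lower()
    for key, answer in KNOWLEDGE.items():
        if key.lower() in q:
            return answer
    return "I don't have information on that topic."

print(lookup("what is mcp?"))
```

Swapping a lookup like this into _run would let the tool answer even when the agent passes a full sentence rather than a bare topic key.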
Using nest_asyncio, we enable support for running asynchronous code inside the notebook's existing event loop. The chat_with_agent() function captures user input, feeds it to the ReAct agent, and streams the model's responses in real time. With each turn, the assistant uses tool-aware reasoning to decide whether to answer directly or invoke the custom knowledge base tool, emulating how an MCP client interacts with context providers to deliver dynamic, context-rich responses.

In conclusion, this tutorial offers a practical foundation for building context-aware AI agents inspired by the MCP standard. We've created a functional prototype demonstrating on-demand tool use and external knowledge retrieval by combining LangChain's tool interface, LangGraph's agent framework, and Gemini's language generation. Although the setup is simplified, it captures the essence of MCP's architecture: modularity, interoperability, and intelligent context injection. From here, you can extend the assistant to integrate real APIs, local documents, or dynamic search tools, evolving it into a production-ready AI system aligned with the principles of the Model Context Protocol.

Here is the Colab Notebook.

Asif Razzaq is the CEO of Marktechpost Media Inc.
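As one concrete direction for the "local documents" extension mentioned in the conclusion, the inline knowledge dictionary could be replaced by entries loaded from a file on disk. The sketch below uses hypothetical names (load_knowledge, knowledge.json) that are not part of the tutorial code:

```python
# Sketch (hypothetical extension, not in the tutorial): back the knowledge
# base with a local JSON file so entries can be edited without code changes.
import json
from pathlib import Path

def load_knowledge(path: str) -> dict:
    """Load {topic: answer} entries from a JSON file; empty dict if missing."""
    p = Path(path)
    return json.loads(p.read_text()) if p.exists() else {}

# Create a sample file, standing in for one shipped alongside the notebook.
Path("knowledge.json").write_text(json.dumps({
    "MCP": "Model Context Protocol (MCP) is an open standard by Anthropic.",
    "RAG": "Retrieval-Augmented Generation retrieves external documents.",
}))

kb = load_knowledge("knowledge.json")
print(kb.get("MCP", "I don't have information on that topic."))
```

The tool's _run method could then call load_knowledge on each query, so the knowledge base grows by editing a file rather than touching the notebook code.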