{
"cells": [
{
"cell_type": "markdown",
"id": "d0cc4adf",
"metadata": {},
"source": [
"### Question data"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "14e3f417",
"metadata": {},
"outputs": [],
"source": [
"# Load metadata.jsonl\n",
"import json\n",
"# Load the metadata.jsonl file\n",
"with open('metadata.jsonl', 'r') as jsonl_file:\n",
" json_list = list(jsonl_file)\n",
"\n",
"json_QA = []\n",
"for json_str in json_list:\n",
" json_data = json.loads(json_str)\n",
" json_QA.append(json_data)"
]
},
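{
"cell_type": "markdown",
"id": "b2f7a1c9",
"metadata": {},
"source": [
"A quick sanity check on the loaded records (a minimal sketch; the expected key names are taken from how later cells access each record)."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "9d4e6b21",
"metadata": {},
"outputs": [],
"source": [
    "# Count the records and confirm each one carries the keys the rest of\n",
    "# this notebook relies on.\n",
    "expected_keys = {\"task_id\", \"Question\", \"Level\", \"Final answer\", \"Annotator Metadata\"}\n",
    "print(f\"Loaded {len(json_QA)} records\")\n",
    "incomplete = [qa.get(\"task_id\", \"?\") for qa in json_QA if not expected_keys <= qa.keys()]\n",
    "print(f\"Records missing expected keys: {len(incomplete)}\")"
]
},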
{
"cell_type": "code",
"execution_count": 2,
"id": "5e2da6fc",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"==================================================\n",
"Task ID: 5f982798-16b9-4051-ab57-cfc7ebdb2a91\n",
"Question: I read a paper about multiwavelength observations of fast radio bursts back in March 2021 on Arxiv, and it had a fascinating diagram of an X-ray time profile. There was a similar burst-1 diagram in another paper from one of the same authors about fast radio bursts back in July 2020, but I can't recall what the difference in seconds in the measured time span was. How many more seconds did one measure than the other? Just give the number.\n",
"Level: 3\n",
"Final Answer: 0.2\n",
"Annotator Metadata: \n",
" ├── Steps: \n",
" │ ├── 1. Searched \"arxiv\" on Google.\n",
" │ ├── 2. Opened arXiv.\n",
" │ ├── 3. Searched \"multiwavelength observations of fast radio bursts\" on arXiv.\n",
" │ ├── 4. Scrolled down to March 2021.\n",
" │ ├── 5. Opened the \"Multiwavelength observations of Fast Radio Bursts\" PDF in a new tab.\n",
" │ ├── 6. Opened each author's name to find the one that had a July 2020 paper (Nicastro, L).\n",
" │ ├── 7. Opened the \"The lowest frequency Fast Radio Bursts: Sardinia Radio Telescope detection of the periodic FRB 180916 at 328 MHz\" PDF.\n",
" │ ├── 8. Searched \"time profile\" in the first paper.\n",
" │ ├── 9. Noted the time span of the diagram (0.3 s).\n",
" │ ├── 10. Searched \"burst-1 profile\" in the second paper.\n",
" │ ├── 11. Noted the time span of the diagram (0.5 s).\n",
" │ ├── 12. Subtracted the two (0.5 - 0.3 = 0.2 s).\n",
" ├── Number of steps: 12\n",
" ├── How long did this take?: 15 minutes\n",
" ├── Tools:\n",
" │ ├── 1. PDF access\n",
" │ ├── 2. Calculator\n",
" │ ├── 3. Web browser\n",
" │ ├── 4. Search engine\n",
" └── Number of tools: 4\n",
"==================================================\n"
]
}
],
"source": [
"# randomly select 3 samples\n",
"# {\"task_id\": \"c61d22de-5f6c-4958-a7f6-5e9707bd3466\", \"Question\": \"A paper about AI regulation that was originally submitted to arXiv.org in June 2022 shows a figure with three axes, where each axis has a label word at both ends. Which of these words is used to describe a type of society in a Physics and Society article submitted to arXiv.org on August 11, 2016?\", \"Level\": 2, \"Final answer\": \"egalitarian\", \"file_name\": \"\", \"Annotator Metadata\": {\"Steps\": \"1. Go to arxiv.org and navigate to the Advanced Search page.\\n2. Enter \\\"AI regulation\\\" in the search box and select \\\"All fields\\\" from the dropdown.\\n3. Enter 2022-06-01 and 2022-07-01 into the date inputs, select \\\"Submission date (original)\\\", and submit the search.\\n4. Go through the search results to find the article that has a figure with three axes and labels on each end of the axes, titled \\\"Fairness in Agreement With European Values: An Interdisciplinary Perspective on AI Regulation\\\".\\n5. Note the six words used as labels: deontological, egalitarian, localized, standardized, utilitarian, and consequential.\\n6. Go back to arxiv.org\\n7. Find \\\"Physics and Society\\\" and go to the page for the \\\"Physics and Society\\\" category.\\n8. Note that the tag for this category is \\\"physics.soc-ph\\\".\\n9. Go to the Advanced Search page.\\n10. Enter \\\"physics.soc-ph\\\" in the search box and select \\\"All fields\\\" from the dropdown.\\n11. Enter 2016-08-11 and 2016-08-12 into the date inputs, select \\\"Submission date (original)\\\", and submit the search.\\n12. Search for instances of the six words in the results to find the paper titled \\\"Phase transition from egalitarian to hierarchical societies driven by competition between cognitive and social constraints\\\", indicating that \\\"egalitarian\\\" is the correct answer.\", \"Number of steps\": \"12\", \"How long did this take?\": \"8 minutes\", \"Tools\": \"1. Web browser\\n2. Image recognition tools (to identify and parse a figure with three axes)\", \"Number of tools\": \"2\"}}\n",
"\n",
"import random\n",
"# random.seed(42)\n",
"random_samples = random.sample(json_QA, 1)\n",
"for sample in random_samples:\n",
" print(\"=\" * 50)\n",
" print(f\"Task ID: {sample['task_id']}\")\n",
" print(f\"Question: {sample['Question']}\")\n",
" print(f\"Level: {sample['Level']}\")\n",
" print(f\"Final Answer: {sample['Final answer']}\")\n",
" print(f\"Annotator Metadata: \")\n",
" print(f\" ├── Steps: \")\n",
" for step in sample['Annotator Metadata']['Steps'].split('\\n'):\n",
" print(f\" │ ├── {step}\")\n",
" print(f\" ├── Number of steps: {sample['Annotator Metadata']['Number of steps']}\")\n",
" print(f\" ├── How long did this take?: {sample['Annotator Metadata']['How long did this take?']}\")\n",
" print(f\" ├── Tools:\")\n",
" for tool in sample['Annotator Metadata']['Tools'].split('\\n'):\n",
" print(f\" │ ├── {tool}\")\n",
" print(f\" └── Number of tools: {sample['Annotator Metadata']['Number of tools']}\")\n",
"print(\"=\" * 50)"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "4bb02420",
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"/opt/homebrew/lib/python3.11/site-packages/tqdm/auto.py:21: TqdmWarning: IProgress not found. Please update jupyter and ipywidgets. See https://ipywidgets.readthedocs.io/en/stable/user_install.html\n",
" from .autonotebook import tqdm as notebook_tqdm\n",
"/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/file_download.py:896: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.\n",
" warnings.warn(\n"
]
}
],
"source": [
"### build a vector database based on the metadata.jsonl\n",
"# https://python.langchain.com/docs/integrations/vectorstores/supabase/\n",
"import os\n",
"from dotenv import load_dotenv\n",
"from langchain_huggingface import HuggingFaceEmbeddings\n",
"from langchain_community.vectorstores import SupabaseVectorStore\n",
"from supabase.client import Client, create_client\n",
"\n",
"\n",
"load_dotenv()\n",
"embeddings = HuggingFaceEmbeddings(model_name=\"sentence-transformers/all-mpnet-base-v2\") # dim=768\n",
"\n",
"supabase_url = os.environ.get(\"SUPABASE_URL\")\n",
"supabase_key = os.environ.get(\"SUPABASE_SERVICE_KEY\")\n",
"supabase: Client = create_client(supabase_url, supabase_key)"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "a070b955",
"metadata": {},
"outputs": [],
"source": [
"# wrap the metadata.jsonl's questions and answers into a list of document\n",
"from langchain.schema import Document\n",
"docs = []\n",
"for sample in json_QA:\n",
" content = f\"Question : {sample['Question']}\\n\\nFinal answer : {sample['Final answer']}\"\n",
" doc = {\n",
" \"content\" : content,\n",
" \"metadata\" : { # meatadata的格式必须时source键,否则会报错\n",
" \"source\" : sample['task_id']\n",
" },\n",
" \"embedding\" : embeddings.embed_query(content),\n",
" }\n",
" docs.append(doc)\n",
"\n",
"# upload the documents to the vector database\n",
"try:\n",
" response = (\n",
" supabase.table(\"documents\")\n",
" .insert(docs)\n",
" .execute()\n",
" )\n",
"except Exception as exception:\n",
" print(\"Error inserting data into Supabase:\", exception)\n",
"\n",
"# ALTERNATIVE : Save the documents (a list of dict) into a csv file, and manually upload it to Supabase\n",
"# import pandas as pd\n",
"# df = pd.DataFrame(docs)\n",
"# df.to_csv('supabase_docs.csv', index=False)"
]
},
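{
"cell_type": "markdown",
"id": "7c3d8e4f",
"metadata": {},
"source": [
"Optionally verify the upload by counting the rows now stored in Supabase (a minimal sketch; it assumes the `documents` table from the insert above and a supabase-py version that supports `count=\"exact\"` on `select`)."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "2f9a5b6c",
"metadata": {},
"outputs": [],
"source": [
    "# Ask PostgREST for an exact row count of the documents table.\n",
    "count_response = supabase.table(\"documents\").select(\"id\", count=\"exact\").execute()\n",
    "print(f\"Rows in documents table: {count_response.count}\")"
]
},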
{
"cell_type": "code",
"execution_count": 5,
"id": "77fb9dbb",
"metadata": {},
"outputs": [],
"source": [
"# add items to vector database\n",
"vector_store = SupabaseVectorStore(\n",
" client=supabase,\n",
" embedding= embeddings,\n",
" table_name=\"documents\",\n",
" query_name=\"match_documents_langchain\",\n",
")\n",
"retriever = vector_store.as_retriever()"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "12a05971",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"Document(metadata={'source': '840bfca7-4f7b-481a-8794-c560c340185d'}, page_content='Question : On June 6, 2023, an article by Carolyn Collins Petersen was published in Universe Today. This article mentions a team that produced a paper about their observations, linked at the bottom of the article. Find this paper. Under what NASA award number was the work performed by R. G. Arendt supported by?\\n\\nFinal answer : 80GSFC21M0002')"
]
},
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"query = \"On June 6, 2023, an article by Carolyn Collins Petersen was published in Universe Today. This article mentions a team that produced a paper about their observations, linked at the bottom of the article. Find this paper. Under what NASA award number was the work performed by R. G. Arendt supported by?\"\n",
"# matched_docs = vector_store.similarity_search(query, 2)\n",
"docs = retriever.invoke(query)\n",
"docs[0]"
]
},
{
"cell_type": "code",
"execution_count": 7,
"id": "1eae5ba4",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"List of tools used in all samples:\n",
"Total number of tools used: 83\n",
" ├── web browser: 107\n",
" ├── image recognition tools (to identify and parse a figure with three axes): 1\n",
" ├── search engine: 101\n",
" ├── calculator: 34\n",
" ├── unlambda compiler (optional): 1\n",
" ├── a web browser.: 2\n",
" ├── a search engine.: 2\n",
" ├── a calculator.: 1\n",
" ├── microsoft excel: 5\n",
" ├── google search: 1\n",
" ├── ne: 9\n",
" ├── pdf access: 7\n",
" ├── file handling: 2\n",
" ├── python: 3\n",
" ├── image recognition tools: 12\n",
" ├── jsonld file access: 1\n",
" ├── video parsing: 1\n",
" ├── python compiler: 1\n",
" ├── video recognition tools: 3\n",
" ├── pdf viewer: 7\n",
" ├── microsoft excel / google sheets: 3\n",
" ├── word document access: 1\n",
" ├── tool to extract text from images: 1\n",
" ├── a word reversal tool / script: 1\n",
" ├── counter: 1\n",
" ├── excel: 3\n",
" ├── image recognition: 5\n",
" ├── color recognition: 3\n",
" ├── excel file access: 3\n",
" ├── xml file access: 1\n",
" ├── access to the internet archive, web.archive.org: 1\n",
" ├── text processing/diff tool: 1\n",
" ├── gif parsing tools: 1\n",
" ├── a web browser: 7\n",
" ├── a search engine: 7\n",
" ├── a speech-to-text tool: 2\n",
" ├── code/data analysis tools: 1\n",
" ├── audio capability: 2\n",
" ├── pdf reader: 1\n",
" ├── markdown: 1\n",
" ├── a calculator: 5\n",
" ├── access to wikipedia: 3\n",
" ├── image recognition/ocr: 3\n",
" ├── google translate access: 1\n",
" ├── ocr: 4\n",
" ├── bass note data: 1\n",
" ├── text editor: 1\n",
" ├── xlsx file access: 1\n",
" ├── powerpoint viewer: 1\n",
" ├── csv file access: 1\n",
" ├── calculator (or use excel): 1\n",
" ├── computer algebra system: 1\n",
" ├── video processing software: 1\n",
" ├── audio processing software: 1\n",
" ├── computer vision: 1\n",
" ├── google maps: 1\n",
" ├── access to excel files: 1\n",
" ├── calculator (or ability to count): 1\n",
" ├── a file interface: 3\n",
" ├── a python ide: 1\n",
" ├── spreadsheet editor: 1\n",
" ├── tools required: 1\n",
" ├── b browser: 1\n",
" ├── image recognition and processing tools: 1\n",
" ├── computer vision or ocr: 1\n",
" ├── c++ compiler: 1\n",
" ├── access to google maps: 1\n",
" ├── youtube player: 1\n",
" ├── natural language processor: 1\n",
" ├── graph interaction tools: 1\n",
" ├── bablyonian cuniform -> arabic legend: 1\n",
" ├── access to youtube: 1\n",
" ├── image search tools: 1\n",
" ├── calculator or counting function: 1\n",
" ├── a speech-to-text audio processing tool: 1\n",
" ├── access to academic journal websites: 1\n",
" ├── pdf reader/extracter: 1\n",
" ├── rubik's cube model: 1\n",
" ├── wikipedia: 1\n",
" ├── video capability: 1\n",
" ├── image processing tools: 1\n",
" ├── age recognition software: 1\n",
" ├── youtube: 1\n"
]
}
],
"source": [
"# list of the tools used in all the samples\n",
"from collections import Counter, OrderedDict\n",
"\n",
"tools = []\n",
"for sample in json_QA:\n",
" for tool in sample['Annotator Metadata']['Tools'].split('\\n'):\n",
" tool = tool[2:].strip().lower()\n",
" if tool.startswith(\"(\"):\n",
" tool = tool[11:].strip()\n",
" tools.append(tool)\n",
"tools_counter = OrderedDict(Counter(tools))\n",
"print(\"List of tools used in all samples:\")\n",
"print(\"Total number of tools used:\", len(tools_counter))\n",
"for tool, count in tools_counter.items():\n",
" print(f\" ├── {tool}: {count}\")"
]
},
{
"cell_type": "markdown",
"id": "5efee12a",
"metadata": {},
"source": [
"#### Graph"
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "7fe573cc",
"metadata": {},
"outputs": [],
"source": [
"system_prompt = \"\"\"\n",
"You are a helpful assistant tasked with answering questions using a set of tools.\n",
"If the tool is not available, you can try to find the information online. You can also use your own knowledge to answer the question. \n",
"You need to provide a step-by-step explanation of how you arrived at the answer.\n",
"==========================\n",
"Here is a few examples showing you how to answer the question step by step.\n",
"\"\"\"\n",
"for i, samples in enumerate(random_samples):\n",
" system_prompt += f\"\\nQuestion {i+1}: {samples['Question']}\\nSteps:\\n{samples['Annotator Metadata']['Steps']}\\nTools:\\n{samples['Annotator Metadata']['Tools']}\\nFinal Answer: {samples['Final answer']}\\n\"\n",
"system_prompt += \"\\n==========================\\n\"\n",
"system_prompt += \"Now, please answer the following question step by step.\\n\"\n",
"\n",
"# save the system_prompt to a file\n",
"with open('system_prompt.txt', 'w') as f:\n",
" f.write(system_prompt)"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "d6beb0da",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"You are a helpful assistant tasked with answering questions using a set of tools.\n",
"If the tool is not available, you can try to find the information online. You can also use your own knowledge to answer the question. \n",
"You need to provide a step-by-step explanation of how you arrived at the answer.\n",
"==========================\n",
"Here is a few examples showing you how to answer the question step by step.\n",
"\n",
"Question 1: I read a paper about multiwavelength observations of fast radio bursts back in March 2021 on Arxiv, and it had a fascinating diagram of an X-ray time profile. There was a similar burst-1 diagram in another paper from one of the same authors about fast radio bursts back in July 2020, but I can't recall what the difference in seconds in the measured time span was. How many more seconds did one measure than the other? Just give the number.\n",
"Steps:\n",
"1. Searched \"arxiv\" on Google.\n",
"2. Opened arXiv.\n",
"3. Searched \"multiwavelength observations of fast radio bursts\" on arXiv.\n",
"4. Scrolled down to March 2021.\n",
"5. Opened the \"Multiwavelength observations of Fast Radio Bursts\" PDF in a new tab.\n",
"6. Opened each author's name to find the one that had a July 2020 paper (Nicastro, L).\n",
"7. Opened the \"The lowest frequency Fast Radio Bursts: Sardinia Radio Telescope detection of the periodic FRB 180916 at 328 MHz\" PDF.\n",
"8. Searched \"time profile\" in the first paper.\n",
"9. Noted the time span of the diagram (0.3 s).\n",
"10. Searched \"burst-1 profile\" in the second paper.\n",
"11. Noted the time span of the diagram (0.5 s).\n",
"12. Subtracted the two (0.5 - 0.3 = 0.2 s).\n",
"Tools:\n",
"1. PDF access\n",
"2. Calculator\n",
"3. Web browser\n",
"4. Search engine\n",
"Final Answer: 0.2\n",
"\n",
"==========================\n",
"Now, please answer the following question step by step.\n",
"\n"
]
}
],
"source": [
"# load the system prompt from the file\n",
"with open('system_prompt.txt', 'r') as f:\n",
" system_prompt = f.read()\n",
"print(system_prompt)"
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "42fde0f8",
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/file_download.py:896: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.\n",
" warnings.warn(\n"
]
}
],
"source": [
"import dotenv\n",
"from langgraph.graph import MessagesState, START, StateGraph\n",
"from langgraph.prebuilt import tools_condition\n",
"from langgraph.prebuilt import ToolNode\n",
"from langchain_google_genai import ChatGoogleGenerativeAI\n",
"from langchain_huggingface import HuggingFaceEmbeddings\n",
"from langchain_community.tools.tavily_search import TavilySearchResults\n",
"from langchain_community.document_loaders import WikipediaLoader\n",
"from langchain_community.document_loaders import ArxivLoader\n",
"from langchain_community.vectorstores import SupabaseVectorStore\n",
"from langchain.tools.retriever import create_retriever_tool\n",
"from langchain_core.messages import HumanMessage, SystemMessage\n",
"from langchain_core.tools import tool\n",
"from supabase.client import Client, create_client\n",
"\n",
"# Define the retriever from supabase\n",
"load_dotenv()\n",
"embeddings = HuggingFaceEmbeddings(model_name=\"sentence-transformers/all-mpnet-base-v2\") # dim=768\n",
"\n",
"supabase_url = os.environ.get(\"SUPABASE_URL\")\n",
"supabase_key = os.environ.get(\"SUPABASE_SERVICE_KEY\")\n",
"supabase: Client = create_client(supabase_url, supabase_key)\n",
"vector_store = SupabaseVectorStore(\n",
" client=supabase,\n",
" embedding= embeddings,\n",
" table_name=\"documents\",\n",
" query_name=\"match_documents_langchain\",\n",
")\n",
"\n",
"question_retrieve_tool = create_retriever_tool(\n",
" vector_store.as_retriever(),\n",
" \"Question Retriever\",\n",
" \"Find similar questions in the vector database for the given question.\",\n",
")\n",
"\n",
"@tool\n",
"def multiply(a: int, b: int) -> int:\n",
" \"\"\"Multiply two numbers.\n",
"\n",
" Args:\n",
" a: first int\n",
" b: second int\n",
" \"\"\"\n",
" return a * b\n",
"\n",
"@tool\n",
"def add(a: int, b: int) -> int:\n",
" \"\"\"Add two numbers.\n",
" \n",
" Args:\n",
" a: first int\n",
" b: second int\n",
" \"\"\"\n",
" return a + b\n",
"\n",
"@tool\n",
"def subtract(a: int, b: int) -> int:\n",
" \"\"\"Subtract two numbers.\n",
" \n",
" Args:\n",
" a: first int\n",
" b: second int\n",
" \"\"\"\n",
" return a - b\n",
"\n",
"@tool\n",
"def divide(a: int, b: int) -> int:\n",
" \"\"\"Divide two numbers.\n",
" \n",
" Args:\n",
" a: first int\n",
" b: second int\n",
" \"\"\"\n",
" if b == 0:\n",
" raise ValueError(\"Cannot divide by zero.\")\n",
" return a / b\n",
"\n",
"@tool\n",
"def modulus(a: int, b: int) -> int:\n",
" \"\"\"Get the modulus of two numbers.\n",
" \n",
" Args:\n",
" a: first int\n",
" b: second int\n",
" \"\"\"\n",
" return a % b\n",
"\n",
"@tool\n",
"def wiki_search(query: str) -> str:\n",
" \"\"\"Search Wikipedia for a query and return maximum 2 results.\n",
" \n",
" Args:\n",
" query: The search query.\"\"\"\n",
" search_docs = WikipediaLoader(query=query, load_max_docs=2).load()\n",
" formatted_search_docs = \"\\n\\n---\\n\\n\".join(\n",
" [\n",
" f'<Document source=\"{doc.metadata[\"source\"]}\" page=\"{doc.metadata.get(\"page\", \"\")}\"/>\\n{doc.page_content}\\n</Document>'\n",
" for doc in search_docs\n",
" ])\n",
" return {\"wiki_results\": formatted_search_docs}\n",
"\n",
"@tool\n",
"def web_search(query: str) -> str:\n",
" \"\"\"Search Tavily for a query and return maximum 3 results.\n",
" \n",
" Args:\n",
" query: The search query.\"\"\"\n",
" search_docs = TavilySearchResults(max_results=3).invoke(query=query)\n",
" formatted_search_docs = \"\\n\\n---\\n\\n\".join(\n",
" [\n",
" f'<Document source=\"{doc.metadata[\"source\"]}\" page=\"{doc.metadata.get(\"page\", \"\")}\"/>\\n{doc.page_content}\\n</Document>'\n",
" for doc in search_docs\n",
" ])\n",
" return {\"web_results\": formatted_search_docs}\n",
"\n",
"@tool\n",
"def arvix_search(query: str) -> str:\n",
" \"\"\"Search Arxiv for a query and return maximum 3 result.\n",
" \n",
" Args:\n",
" query: The search query.\"\"\"\n",
" search_docs = ArxivLoader(query=query, load_max_docs=3).load()\n",
" formatted_search_docs = \"\\n\\n---\\n\\n\".join(\n",
" [\n",
" f'<Document source=\"{doc.metadata[\"source\"]}\" page=\"{doc.metadata.get(\"page\", \"\")}\"/>\\n{doc.page_content[:1000]}\\n</Document>'\n",
" for doc in search_docs\n",
" ])\n",
" return {\"arvix_results\": formatted_search_docs}\n",
"\n",
"@tool\n",
"def similar_question_search(question: str) -> str:\n",
" \"\"\"Search the vector database for similar questions and return the first results.\n",
" \n",
" Args:\n",
" question: the question human provided.\"\"\"\n",
" matched_docs = vector_store.similarity_search(query, 3)\n",
" formatted_search_docs = \"\\n\\n---\\n\\n\".join(\n",
" [\n",
" f'<Document source=\"{doc.metadata[\"source\"]}\" page=\"{doc.metadata.get(\"page\", \"\")}\"/>\\n{doc.page_content[:1000]}\\n</Document>'\n",
" for doc in matched_docs\n",
" ])\n",
" return {\"similar_questions\": formatted_search_docs}\n",
"\n",
"tools = [\n",
" multiply,\n",
" add,\n",
" subtract,\n",
" divide,\n",
" modulus,\n",
" wiki_search,\n",
" web_search,\n",
" arvix_search,\n",
" question_retrieve_tool\n",
"]\n",
"\n",
"llm = ChatGoogleGenerativeAI(model=\"gemini-2.0-flash\")\n",
"llm_with_tools = llm.bind_tools(tools)"
]
},
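{
"cell_type": "markdown",
"id": "5a1b3c7d",
"metadata": {},
"source": [
"Since `@tool` wraps each function in a LangChain `StructuredTool`, the tools can be smoke-tested directly with a dict of arguments before wiring them into the graph (a minimal check; the example numbers are arbitrary)."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "8e2c4d9a",
"metadata": {},
"outputs": [],
"source": [
    "# invoke() on a StructuredTool takes the arguments as a dict keyed by\n",
    "# parameter name; .name and .description come from the function and docstring.\n",
    "print(multiply.invoke({\"a\": 6, \"b\": 7}))  # expected: 42\n",
    "print(divide.invoke({\"a\": 10, \"b\": 4}))  # expected: 2.5\n",
    "print(multiply.name, \"-\", multiply.description.splitlines()[0])"
]
},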
{
"cell_type": "code",
"execution_count": 11,
"id": "7dd0716c",
"metadata": {},
"outputs": [],
"source": [
"# load the system prompt from the file\n",
"with open('system_prompt.txt', 'r') as f:\n",
" system_prompt = f.read()\n",
"\n",
"\n",
"# System message\n",
"sys_msg = SystemMessage(content=system_prompt)\n",
"\n",
"# Node\n",
"def assistant(state: MessagesState):\n",
" \"\"\"Assistant node\"\"\"\n",
" return {\"messages\": [llm_with_tools.invoke([sys_msg] + state[\"messages\"])]}\n",
"\n",
"# Build graph\n",
"builder = StateGraph(MessagesState)\n",
"builder.add_node(\"assistant\", assistant)\n",
"builder.add_node(\"tools\", ToolNode(tools))\n",
"builder.add_edge(START, \"assistant\")\n",
"builder.add_conditional_edges(\n",
" \"assistant\",\n",
" # If the latest message (result) from assistant is a tool call -> tools_condition routes to tools\n",
" # If the latest message (result) from assistant is a not a tool call -> tools_condition routes to END\n",
" tools_condition,\n",
")\n",
"builder.add_edge(\"tools\", \"assistant\")\n",
"\n",
"# Compile graph\n",
"graph = builder.compile()\n"
]
},
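{
"cell_type": "markdown",
"id": "4d6f2e8b",
"metadata": {},
"source": [
"Invoking the compiled graph follows the `MessagesState` convention: pass a list of messages in, read the final assistant message out (a sketch with a made-up question; running it calls the Gemini API and whichever tools the model decides to use)."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "6b8d0f3a",
"metadata": {},
"outputs": [],
"source": [
    "# Sketch of a single run; the question here is a hypothetical example.\n",
    "example_question = \"What is 12 multiplied by 9?\"\n",
    "result = graph.invoke({\"messages\": [HumanMessage(content=example_question)]})\n",
    "print(result[\"messages\"][-1].content)"
]
},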
{
"cell_type": "code",
"execution_count": 12,
"id": "f4e77216",
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAANgAAAD5CAIAAADKsmwpAAAQAElEQVR4nOydB1wUR9vA5zrcwdGOXqRIFRC7gkZsxK7YguU1xhgTJcVXjVETNSYajCbGYCxYYuJnjYliYq+xRo2xIIqAgNI7HFzh+vfo5UVEQEzYuzl2/r/7HXu7e7dX/jwz88zsLFun0yECwdiwEYGAAUREAhYQEQlYQEQkYAERkYAFREQCFpikiAq5pixfKavWyKrVarVOrTSBDBTPnMnmMviWbL6Q5ehuhgjPYkoiSqtU6TekmcmSqjKVpS2Hb8mC31Voy0GmkArValDRQ4WsWsrhMbPvy7yCBd4hcLNAhCcwTCKhrdXoLv9WVpqvsHPhegdbuLY1R6ZMjUyTlSzNTZflZ9aED7Xz7WCJaI8JiHj3ivj3fSXhw+w6RNqg1gWE9suHyhQyTdR/nMwtWIjG4C7i7/uKzfjM7kNEqPVSWqBIXJc38HUnN18+oitYi3hyR5GTl1lIhBWiAQfW5fWKFolceIiW4Cti4vq8tmEWweG0sFDPgXW5IRHW8KkR/WAiLLmQWOIZJKCVhUB0rNuVo2UVRUpEP3AUMfVGNZvDDIu0RvRj4nyPs/uKaTg2D0cRz+0r6diXjhYCDAYDigLIVSGagZ2If52qCI4Q8szpm8vo2Nfm3tWqGqkG0Qm8RIQiKTtVFj60NSdrmsMro+xvnatEdAIvETPvSKFPFtEeD39+8mUxohN4/erQ8QWdsMiwfPTRR7/99ht6efr375+fn48oAHpZrEXcgodyRBvwErGyROUdYmgRU1JS0MtTWFhYWUlh6enX2SInTYZoA0YiQvW8olhJXTMlMTFx3LhxERER/fr1+/DDD4uKimBl586dIaotXbo0MjISHmo0mo0bN44cOTI8PHzQoEErVqyQy/8OSxD/du3a9f777/fo0ePChQtDhw6FlcOHD58zZw6iAIGQXZpLo4QiRiJKq9Tw7SNquHnz5rJly8aPH793795vv/0Wgtn8+fNh/ZEjR+AevDx48CAsgGo//PDDzJkz9+zZs2TJknPnzq1bt07/Cmw2e//+/W3btk1ISOjSpUtcXBys3LFjx2effYYoAL4K+EIQbcBoPKK0SiMQUhUOMzIyeDzesGHDwCc3NzcIdQUFBbDeyupx5w2fz9cvQBSEgAe2wbKHh0dUVNSlS5f0rwAZPjMzM4iI+ocCweMqhFAo1C+0OAIrllRMowwORiLqtDouZU1mKILBpGnTpo0YMaJbt24uLi52dnbP72ZtbX348GGIncXFxWq1WiaTgaO1W0NDQ5GhYLEZXDMaJRAw+qh8IVtcokLU4OnpuW3bNoiFa9euhYrdlClTkpOTn99t1apVW7Zsgark5s2boZiOjo6uu9XCwnDDESSVanAR0QaMRIRyGUpnRBm+vr4Q6k6ePAmVPBaLNWvWLKXymdYAtFSgpvj6668PHjzY1dVVJBJJJBJkJCitqGAIThHRkm3rxNFqKenvh/iXlJQEC6Bgp06dZsyYAe2VsrK/u3T1gwy0Wi24qK8sAlKp9Pz5802PP6BudIJCprF3p9HYRLxqIWZ8FnSuIAq4fPny7NmzT58+nZubm5qaCo1iZ2dnJycn3hNu3LgBK6ES6e/vf+jQIdgnPT0dQibkeqqqqh4+fAj1xXovCM0UuL948WJmZiaigNS/qp09TfvUnJcCLxE92wke3qVExKlTp0KFb82aNWPGjImNjYVIFh8fD+bBJqgvnjp1ClI2kDJcvHgxBEWoIy5YsCAmJgb2BFknT54MbZd6LxgYGAi5xm+++WblypWopdGodXkP5B4BNDpzAK8R2nKJ+sSOohHvuCJ6k3VXkpMmfyXaHtEGvCKiuQXbxpF7m2YDT57n8q9ldBudjt0J9hHDRAnzM9r3bnhgLJSb0EHX4CZoAnO53AY3eXl5Qe4GUcMPT2hwE6R7Gmt3Q8m+YcOGBjfdv17l4G5m69jwZ2mt4Hjy1K1zlQyGrv0rDZ/FXF1d3eB6hUIBIuqrffVgMpkU9X/oj1svDVSLSqXicDgNboLGe91UeV0ObcnvPcbe0rrhJ7ZWMD2LD36Mdt2tDD8kzOjQ9oNj2ok0dJrL+f0lZYUKRCfO7C128jSjoYUI5/Oaoet579c5r4yyd/GhRTrt7E/Fbr7mtJ0HB99udQaTEfOhxx9HylKuVaFWjVajO7Auz9aJS+fZmExgEqbLh0qzU2Thw0StMsH754ny1OvVkWPt6TzxDTKVaelK8hSXfysVCNlQTEMVylxg8qMBinNqslNl109UhEVadx1oy2TSaKBNg5iGiHpy02UQPLKSpfbuPCsRB7yEG1/I0moR/rAYSFyukoo1OqS7/2c1vPO27QWhr1hzuOSsxceYkoi1FGTJS/OU0io13JgMhkzSkoPHZDLZo0ePIOGMWhRLGw581QIrlqUtx83HXGBFZi9/BpMUkVJSUlKWL1++Y8cORDAg5P+SgAVERAIWEBEJWEBEJGABEZGABUREAhYQEQlYQEQkYAERkYAFREQCFhARCVhARCRgARGRgAVERAIWEBEJWEBEJGABEZGABUREAhYQEQlYQEQkYAERkYAFREQCFhARCVhARKwPg8Gwt6fR5NWYQESsj06nKykpQQTDQkQkYAERkYAFREQCFhARCVhARCRgARGRgAVERAIWEBEJWEBEJGABEZGABUREAhYQEQlYQEQkYAERkYAFREQCFpAL/vzN+PHjJRIJg8FQKpVisVgkEsGyQqE4fvw4IlAPuRDc3wwaNKi4uDg/P7+0tFSlUhUUFMCypSV9r1trYIiIfxMTE+Pu7l53DUTE3r17I4JBICL+DZfLHTlyJIv19AK8Hh4eY8aMQQSDQER8yrhx41xdXfXLEA779Onj7OyMCAaBiPgUCIqjR4/WB0UIh2PHjkUEQ0FEfAYIii4uLvpw6OjoiAiGAsc8olyiKStQKBXGySuNGDD9999/79lxdGayFBkcBtIJrNm2jlw2h14xAq88orJGe2pXUV6G3N1foJRrEf3g8hgVxSqtVuvfybLzAFtEGzASUS7V7F+b132YvYObOaI9fx4rMeMzw4fZIXqAUfzfvTK730QXYqGeLgPta+TaP0+UI3qAi4i3z1cGdLUSCEnf91O6vGr/8K5MLlUjGoCLiEWPavhCDiLUg4EqClWIBuAiokqpE9oSEetj52xWXU6LiIhLUVgj0eg0iFAPpUKjpcfwKFInI2ABEZGABUREAhYQEQlYQEQkYAERkYAFREQCFhARCVhARCRgARGRgAVERAIWkHNWUGbmgz79Ot+5cwsRjAcREYnsHWZ9MN/Fxa2JfbKyMmImDEX/jpGj+hcU5iNCQ5CiGQkthSOGv+BE+rS0FPTvKCoqFIsrEaERTFjE+6n3tmz5Lv1BqlKp8Gzj/eabsZ07ddNvOnwk8edfdhUU5PF4Zu1DO74bO9fBwbGx9VA0v/lWTPyaLSEhYaDLxoQ1t27/JZNJnZxcxoyeMGzoqB9+TPhx+2Z4OpTgsTNnw8rGDn3w1
5+3/bAxbvma+O9W5eQ8FFpaTZr05uBBI27euj57zjuww4SJwyf/Z9obU95BhGcx1aJZoVB8NP89Dpf71ar1G9ZtD2oXumjxnJKSYtiUlHTzq6+XjR41fuuWvXFffCuuqlz6+fwm1tdl5aqlpWUlXyxf8/3Wn0ZFx6z5dsWf16/EvPb6qFExoGzi/lPDho5u4tBsNlsqlWzfsWXpkpW/Hfw9KmrIN2viYFNIcNjiRXGwQ8LGHeNjpiDCc5hqRGSxWN98nWBnJ7KysoaHU6fM2L9/T/Ld230iB2Q9zODxeANfHQZauLq4LVm0orCoAPZpbH1dMrMeRI98LTCgHSy7Dh/j5xvg6OhsZmbG4/IYDIb+WGq1urFD67dOiJmiD8CDBo6AUJqRkda9e08+XwBrLC2F8GqI8BymKiLIpFKr4teufJCRJpFU60+KraoSw32HsM4gzfuzpkGZ2KlTN2cnF1tbuybW1yW8xyu79/wAL9itW0RoSIfAwOCXOrQeb29f/QJoB/fVkmpEeBGmWjTn5mbPmfuOUqlcuODzTRt3JmzYUbvJw8Pzu/ht0AretHkt1MlmvjvlXkpyE+vr8t9ZC6ZNjU1KujH3w5nRo/vDnhDhmn9oPRB3n3lMpkJtBqYaEc+cPaHRaD75eLn+V4dGRt2tPj6+nyxcBjtAdnDrtvULP571054jXC63wfV1nwjRbvTo8XArLy87cfLw1u/XW1vbjBs7qfmHJvwzTDUiqlRKaPnWxp6Tp576lJKSfPduEnpSjwwL6zT1jRmQNwGxGltf+0SJRHLy1FF9CIRSO+a1yUFBIdCmbv6hXwiZKLoxTFXEwIBg0OjosV/LykoTD+67n3oXQlfG40qb5Oq1yx8vmn3u/Om8/FzIsEBLwsnR2dHRqbH1ta8JNcj4tV9Cyxq25hfknTp9DNKHoCxssrCwhANBu7uwsKCJQzfxhoVP6otXrlyEV0CE5zDVojk8/JXXxv0nYVP8+g2ru3WNmD9v6c+/7Ny950cmkwnZQbVatXHjGkjECAQWwcHtV8TFg2STJk5tcH3tawoEgi9XfAcJwtlz3oYqIOQRIeEHrWzY1K/vwOMnDs35cMaE8VNgZWOH9vUNaOwN+/kFdu0avmHjN0VFBTPemYUIz4LLJEy/fJsb1kfk0IakNp7h0sGiNgHmgV2FqLVDuvgIWEBEJGABEZGABUREAhYQEQlYQEQkYAERkYAFREQCFhARCVhARCRgARGRgAVERAIWEBEJWICLiFYiro5BBo3Wh8dncXm0mAQBlw/JEzBL82oQ4VlyUqW2zlxEA3AR0TOQLy5WIkIdJGKV0JZj40BENCDu/nwLa9bVoyWI8D/O7i7oFS1C9ACv6zVfOVpeWaxy8jIXuZrR7crZehgMXVW5uqpMeeVwyaQFbaxEdLksHF4iAll3pek3JTUyTXlBoyW1UqlkPQFRgFajUapUBpuPQS6Xc7nc2s9iJmBxuAxnH7NuA+1YLAaiDdiJ+EKys7MPHDjwwQcfIGpYunTp+fPnly9f3r17d0Q9EokkLi4ODofojSmJKBaLCwsLnZycrKysEDXcu3fvk08+AdfDw8Pj4+ORAdm7d29oaGhgYCCiJSZTDystLY2Ojvby8qLOQmD37t1gIXo8IWLapUuXkAEZMmQIxMXKSprOoWgaIkJFCvw4c+YMVKcQZaSkpNy4cUO/DN7v2rULGRALC4sdOx5Po/Pw4cPc3FxEM0xAxDlz5kD9oWPHjohidu7cWVRUVPsQimkDB0XA2tra2dk5NjYWjo7oBO4i7tmzZ9iwYXw+H1EM/PC14VAPVEn1IcrA8Hi8gwcPQiEAy/QpqfEV8eLFi3APFkZGRiLq2b59O4RDrVar+x+w8v79+8hIdOr0eM4dCI3nzp1DNADTVjN8+8ePH//iiy+QwYGaIjQajBILGwT+QyZPnqxWq9ns1jxUCtOIyGQyjWIhhoCFcL969Wr41d9QSQAAD6ZJREFUz0StF7xELC8vnz59Oiz06tULEeowb948KCVqalrtACW8oj38369atQoRGgKKCCig9Q35iIgI1LrAJSIePnwY7pctW0ZpvtrUgWpijx49oA8mOTkZtS6wEHHhwoUCgQARmgHUnqHvEdKNsHzrVuu5fqCRRayoqID78ePHGyZH02pwc3t85cANGzYcPXoUtQqMKeKxY8cSExNhISQkBBFenoSEBOgYhIX8fJO/1qQxRbxw4cIbb7yBCP8CfXph9+7d27ZtQ6aMcUQ8ffo03JNBeC2FvjseFmQyGTJNDC2iSqXq1q1bWFgYIrQoU6dORU/6RXfu3IlMEIOKCJ25ZWVlkAmzs7NDBAqIioqCLxl6KU1u4L3hRIyLi6uqqnJycmrdfaZGZ/bs2e7u7pCOOHjwIDIdDOQEJGB9n4AI1KNvSt++fRvi4siRI5EpQLmIUExwuVwvL6/g4GBEMCCLFy/OzMyEhWvXrnXt2hXhDbVFM3wR0DT28fEhHSdGwdvbG+6vX7/+9ddfI7yhUETooTfWIOd/yfPXaDZpZs6cCZkK9OTUVYQrVIm4b9++v/76q0OHDsjUuHPnzvDhw1HromfPnuhJTwy2p2VRJSI0jaEHD5ka+oEtEyZMQK0R+B/Td+5jCFWnCkDiGlKGkKxBpsP3339fWlo6b9481EqBTycUCik9JfcfY3pTjlBEfHw8i8WKjY1FBGNAYWMFMqtGPAvupYBku5WVVau3cO7cudj+IhSK6OzsbBIjNxctWgSZ9tdffx21dqBohioTwhIKi2b1Eww2v9s/A8J2//79Bw8ejGgAqSNiyttvvw0N5N69eyOCsaG2ZyUyMlKpxHRm7IkTJ06fPp1WFtK0jgj4+flBXzPCj+joaKga6qf1oA80rSNiS1RU1JYtWzw8PBDNoG8dERorWq0Wn08O7wfK4l9//ZWMzMUNaovm7OxsqIohPBCLxREREadPn6athfStI3p7eysUChxmbCkoKIB64dWrVzFPJ1EKqSMamQcPHsyaNevQoUOI3tA6j1hVVcVkMvWD140C9O5AD97evXsRAWMoP3nq0qVLK1asQEYCjr527VpioR761hGB0NDQM2fODB06FJqrBpiQvS4nT54EBbdu3YoIT6BjHRE6LZKSkuqNube1tYXoaBgdExMTr1y5YsRgjCE41xGpioibNm1ycXGptxJarBAgEfXs3Lnzzp07xMJ6iEQiPC1ElBbN7777ro2NTe1DCL3t2rUzwNn1CQkJRUVF0IOHCM9C0zpi3759hwwZwuH8faFXUFB/LhmlrF69msFgzJ49GxGeg9Z5xBkzZly7dg3kgP6M9evX+/j4IMr4/PPPIYWOT18ObtCxjlhLfHy8h4cH9DhbW1tTauH8+fNDQkKIhU2Acx2xWTU2tUorl2jRP4Tx8UfLlixZ0ql9z+oKqk5cX7J4yaDh/QYMGIAIjQN1xGnTpgUEBCD8eEHRnHKtKumCuLxQaW5ByeXiWwT4CFyBtiJf5xUs6NjX2tnLHBHqAPkyqBrBtwT3+jWw7Ofnt2fPHoQNTUXEayfKS/NVvUY5
WdpyEPbAlysuUf3+S1H4ELs2gZRfRNKE8Pf3T01NhY7W2jXQ4/rWW28hnGi0jnj1WLm4RN0r2tEkLATg393agTv0LXd4549STHUGXyqIiYkxN3+mlGjTpk2/fv0QTjQsYkWxsjRP0X2oAzJB+k10vnkW04k1jMKIESNcXV1rH/L5fAzn0G9YRLAQahTINOHyWJUlqqpyTBNmRgGSCbXtZchw9enTB2FGwyJKxBp7dxMeQOruL6goJiI+BYKi/hpBAoFgypQpCD8aFlGl0Kpq/nG+xvhIKlU6DZnT5xkgKEIvF4RDPC/yReZVx5FH96WQc5VVaZRybY1cg1oCAeoe2e496O4/tbsItQQCIVur0cG9QMhy8jKztPlXjVoiIkakXq9Kuyl9dE/q4idUqXQsNovFYSNmi2UtuvYYAvfVLZRRkNYw1EqVNlup0+qq9peaC1htwwTtwoUWVv/kDRMRsSD9ZvWFxDIbFwGLJ2g3wL4282wqOPgiebUiJ0t271q+VxC/50g7Nufleo+JiEZGo9Ed3loorUZu7Z255ib8c5hb8uAm8rIpzxFvWpAVOdY+qJuw+U8nIhqT4pyafWtyfbq5CN15qLVg624Ftzt/lJTkKXqPsm/ms3C5gj0NEZcpj2wrbtcf6vmtx8JaHP3ty0qZUN9o5v5ERONQ+KgmcX2hZxdX1HqxdbcuLkRHfyxszs5ERCOgVmn3r81r07k1W6jHro21TMq8furFPa5ERCNw+Psin+6t30I9dl52j1IVOenSpncjIhqau3+IpVIGT2AaY5paBL5IeO6XF1QWiYiG5tJv5Q7etohOmAt5TDYbcqVN7IORiEs+nTdn7gzUqkm+LLZrY8nmYTrc/Xby6bmLukmllailsfOyvXulqSsBtpiIBxJ/WrHyU0RokvvXJTwBHefF4/E55YXKiqJGJ1RvMRHT0nCcKxsrVAptSU6NhR1NT6kRiPiZdxoNii3TszJr9vTbt2/AwvHjhzYl7PRt63/nzq3NW78DO6HbNDAg+K233gsMaKff+fCRxJ/27cjPzzU353frGj7jnf/a2tafwhX2+fmXXQUFeTyeWfvQju/GznVwcEQmzsMUqcjLElHGzaQT5y7tKirJ4vH4HUKiBvWfweU+jr7b9yyEvmt/3x5nz28XV5c4iNpED53bxj0EPe5gVB888s2NpGM6rTbIv2db786IMizt+YXZjVYTWyYiLvtstZ9vQN8+UYn7T3l7tc3JeTR33kx7kcO6tT98F7/NnM+f++GM4uLHo49OnDj81dfLogYM+X7L3s8+XZWWfn/Bwg/qnUmYlHQT9hk9avzWLXvjvvhWXFW59PP5yPQRl6g1KqpGMyTfO7dz3yK/tl3nxO54LXpR0t0zP/8ap9/EYrGzHt3Ozrk7a+b2Tz86xudb7d2/TL/pzPkfr15PHD5o1n9nbvfyDDt17ntEGRweuyBT3tjWlhHRwsKCxWZzuFwrK2sWi3Xw158h2i2Y/5mPjy/cPl6wTK1WHz/xeMLWfT/vjIjoPXHCG+7ubcLCOr337ofgYnLy7bqvlvUwg8fjDXx1mKuLW1Bg8JJFK2JnzkGmj6RSTV0z5cyF7d6eHQcPmCmycw/0Cx8SFXvj9rFK8d9DD5VKOdjG45pDjOwYOrC49KFS+Xg+6b9uHw0O6t214zB4VnjX0X4+FM4JwzFj10gbHVtJSas5LT0FAmTtfEt8Ph+0y8hIAx0zMtODAkNq9/T3D4L7BxlpdZ/eIawzFOjvz5p26PCBgsJ8KLhBR2T6yCQaikTUarW5+SkQDmvXgJRwX1D4QP8QPNMX0wDf/PGgGJm8Sq1WlZbluLsG1T7Lw60dohKegCWtavgUDkpG38hkUjtbUd01fL4AVspr5FAKw/LT9eaPT0CWy58Zq+nh4QkF+u69P27avLZ69fLAwGCoI7YCF6mbZUilqtFqNSfObD559plZSauqS/ULbPbz4yp0ECbhD6fOJqhcIirRaXSNDbWkRESBwEIqfaZ9BA9BTXMzcyaTCUY+Xf9kGfav9wpQoH+ycJlGo4FGz9Zt6xd+POunPUewnbelmVhYsUpKWmbcfz04HDOoCPbs/lq3TsOfOaKgqcw550mMlCue/lJyeVM5538JxCBljZZv2bByLVk017Y5/P2CUtNSamdAq5ZUZ2c/DAh4PDliWx+/O8lPr517724S+l8BXUtKSvLdJ+uhugn1yKlvzBCLK8vLmzugCFssrNlqJSUiwr+3q3NARWWBg72n/mZr48pksvn8poamcthcG2vngsL02jVpGdcQZagVGjNBozWTFhPR0sLywYPU9AepIM2IEWMVipqVX30GzefMzAfLln8MMe/VqKGw29ixk65cuQjpm8LCgpu3rq9d91X79h0DnhXx6rXLHy+afe786bz8XHjB/fv3ODk6Ozo6IRPH2p7DZlF1bmRkz0l37p2FVnBxyaO8/NRdPy9Zt2V6Tc0LhhpAlgea21euJ0Jt8tylnfkFaYgylHK1s3ejOdQWK5qjo2PiVix+/4M3l366qmuXHqu+XLdpy9pp08dDVAsJDvvm6wRr68ezx/bvNxAcBRE3b/kO7OwZEfn22x/Ue6lJE6dCPXrjxjWlZSWwT3Bw+xVx8SZ3GsfzeLYTHPuxUOQtQhQQ2q7P+NFLz17Yfvz0JjMzC0+P0BlT15uZCZp+1oC+06SyykPH4rU6baBfxJCod7fvXQDLiAKkpVLf0EaHADc8G9i14+XQum8faap982d257fvZQU/PMKMA+vy2UJLSxEd54jKuJwzZparlV3Dw47I6BuDEtDVQiFRIPpRI1GK3HiNWYjIyVMGJrCL8I9DD4WOFlzzhn+S5JTze/YvbXCTwNxKKhc3uKl7p5FDB76HWoisR7e27mi4BwGSREwGEzVUTerRZRRk0VEjlGaW9xxmjRqHiGhoeo20+/N0hUu7hmda8/PpOnvm/zW4CfpCapPS9eDxWrIS4uYS2Nh7UKkULBan7lSLzXkP0ooaDkfnGdTUmyQiGhrfDpbpt6Q11YoGT94D1Wy5LsiocDg8W5uWfA81FdV9xr6giUbqiEZg8BtOmdfytVpaTBNVlFbi38Hc4UWTyxERjcP4eR6ZV3JRa6covczemRkcbvXCPYmIxsHGgTvhI9f0i9katQlP/9c0JRllPkGcvuOaNe8wEdFo8C04r81xAxelFXLUutCqtXnJhZ5+7M79bZr5FCKiMRHact750oejlebeLpBXtZL8YklWRer57J5DrLtEvUSHCGk1G5+oSY45abLzB0p5Fjwmlyu0F2B7ml8TSMrkklJZVbGk/SvWY2e+9CXGiIhY4O7Hn/iRx6N70rRb0sxreTbO5soaLZvLZnHZDCamnexMFlMlV2pUGqTTVhTIoV0c1EkQ1N3zZWdG1ENExIg2QYI2T7K+Rdk1T6YuVtfItAoZJSPH/j3mFjoGky0Q8vhCtrOXE4f7r6p5REQccfQwc/RAtKJhEblmDC0y4WFXAmsOk2Xyw8ZoRcPh1NKGU/LIhHMK2SkSWyfTPq+AbjQsooM7z3THoco
lapErz8Ka1DpMiUYjomtbs/O/NGuuT9w4tSO/y4Dm5lEJmNDU9Zrv/iFOvyVp39vOxpHLYuOe+q6RaapKlZcOFg+c7OjgQceJjkyaF1w4POuu9Na5ysKsGhYb66LaSsSpKld5Bgk6D7CBblxEMDVeIGItCjnWffM6LTITkO5KE6a5IhIIlEKalgQsICISsICISMACIiIBC4iIBCwgIhKw4P8BAAD//2v4e7oAAAAGSURBVAMA1x7mMDWkAPIAAAAASUVORK5CYII=",
"text/plain": [
"<IPython.core.display.Image object>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"from IPython.display import Image, display\n",
"\n",
"display(Image(graph.get_graph(xray=True).draw_mermaid_png()))"
]
},
{
"cell_type": "code",
"execution_count": 13,
"id": "5987d58c",
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"/opt/homebrew/lib/python3.11/site-packages/langchain_google_genai/chat_models.py:1468: UserWarning: HumanMessage with empty content was removed to prevent API error\n",
" warnings.warn(\n"
]
},
{
"ename": "ChatGoogleGenerativeAIError",
"evalue": "Invalid argument provided to Gemini: 400 * GenerateContentRequest.contents: contents is not specified\n* GenerateContentRequest.tools[0].function_declarations[8].name: Invalid function name. Must start with a letter or an underscore. Must be alphameric (a-z, A-Z, 0-9), underscores (_), dots (.) or dashes (-), with a maximum length of 64.\n",
"output_type": "error",
"traceback": [
"\u001b[31m---------------------------------------------------------------------------\u001b[39m",
"\u001b[31m_InactiveRpcError\u001b[39m Traceback (most recent call last)",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/google/api_core/grpc_helpers.py:79\u001b[39m, in \u001b[36m_wrap_unary_errors.<locals>.error_remapped_callable\u001b[39m\u001b[34m(*args, **kwargs)\u001b[39m\n\u001b[32m 78\u001b[39m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[32m---> \u001b[39m\u001b[32m79\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mcallable_\u001b[49m\u001b[43m(\u001b[49m\u001b[43m*\u001b[49m\u001b[43margs\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n\u001b[32m 80\u001b[39m \u001b[38;5;28;01mexcept\u001b[39;00m grpc.RpcError \u001b[38;5;28;01mas\u001b[39;00m exc:\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/grpc/_interceptor.py:277\u001b[39m, in \u001b[36m_UnaryUnaryMultiCallable.__call__\u001b[39m\u001b[34m(self, request, timeout, metadata, credentials, wait_for_ready, compression)\u001b[39m\n\u001b[32m 268\u001b[39m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[34m__call__\u001b[39m(\n\u001b[32m 269\u001b[39m \u001b[38;5;28mself\u001b[39m,\n\u001b[32m 270\u001b[39m request: Any,\n\u001b[32m (...)\u001b[39m\u001b[32m 275\u001b[39m compression: Optional[grpc.Compression] = \u001b[38;5;28;01mNone\u001b[39;00m,\n\u001b[32m 276\u001b[39m ) -> Any:\n\u001b[32m--> \u001b[39m\u001b[32m277\u001b[39m response, ignored_call = \u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43m_with_call\u001b[49m\u001b[43m(\u001b[49m\n\u001b[32m 278\u001b[39m \u001b[43m \u001b[49m\u001b[43mrequest\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 279\u001b[39m \u001b[43m \u001b[49m\u001b[43mtimeout\u001b[49m\u001b[43m=\u001b[49m\u001b[43mtimeout\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 280\u001b[39m \u001b[43m \u001b[49m\u001b[43mmetadata\u001b[49m\u001b[43m=\u001b[49m\u001b[43mmetadata\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 281\u001b[39m \u001b[43m \u001b[49m\u001b[43mcredentials\u001b[49m\u001b[43m=\u001b[49m\u001b[43mcredentials\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 282\u001b[39m \u001b[43m \u001b[49m\u001b[43mwait_for_ready\u001b[49m\u001b[43m=\u001b[49m\u001b[43mwait_for_ready\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 283\u001b[39m \u001b[43m \u001b[49m\u001b[43mcompression\u001b[49m\u001b[43m=\u001b[49m\u001b[43mcompression\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 284\u001b[39m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[32m 285\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m response\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/grpc/_interceptor.py:332\u001b[39m, in \u001b[36m_UnaryUnaryMultiCallable._with_call\u001b[39m\u001b[34m(self, request, timeout, metadata, credentials, wait_for_ready, compression)\u001b[39m\n\u001b[32m 329\u001b[39m call = \u001b[38;5;28mself\u001b[39m._interceptor.intercept_unary_unary(\n\u001b[32m 330\u001b[39m continuation, client_call_details, request\n\u001b[32m 331\u001b[39m )\n\u001b[32m--> \u001b[39m\u001b[32m332\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mcall\u001b[49m\u001b[43m.\u001b[49m\u001b[43mresult\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m, call\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/grpc/_channel.py:440\u001b[39m, in \u001b[36m_InactiveRpcError.result\u001b[39m\u001b[34m(self, timeout)\u001b[39m\n\u001b[32m 439\u001b[39m \u001b[38;5;250m\u001b[39m\u001b[33;03m\"\"\"See grpc.Future.result.\"\"\"\u001b[39;00m\n\u001b[32m--> \u001b[39m\u001b[32m440\u001b[39m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;28mself\u001b[39m\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/grpc/_interceptor.py:315\u001b[39m, in \u001b[36m_UnaryUnaryMultiCallable._with_call.<locals>.continuation\u001b[39m\u001b[34m(new_details, request)\u001b[39m\n\u001b[32m 314\u001b[39m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[32m--> \u001b[39m\u001b[32m315\u001b[39m response, call = \u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43m_thunk\u001b[49m\u001b[43m(\u001b[49m\u001b[43mnew_method\u001b[49m\u001b[43m)\u001b[49m\u001b[43m.\u001b[49m\u001b[43mwith_call\u001b[49m\u001b[43m(\u001b[49m\n\u001b[32m 316\u001b[39m \u001b[43m \u001b[49m\u001b[43mrequest\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 317\u001b[39m \u001b[43m \u001b[49m\u001b[43mtimeout\u001b[49m\u001b[43m=\u001b[49m\u001b[43mnew_timeout\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 318\u001b[39m \u001b[43m \u001b[49m\u001b[43mmetadata\u001b[49m\u001b[43m=\u001b[49m\u001b[43mnew_metadata\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 319\u001b[39m \u001b[43m \u001b[49m\u001b[43mcredentials\u001b[49m\u001b[43m=\u001b[49m\u001b[43mnew_credentials\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 320\u001b[39m \u001b[43m \u001b[49m\u001b[43mwait_for_ready\u001b[49m\u001b[43m=\u001b[49m\u001b[43mnew_wait_for_ready\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 321\u001b[39m \u001b[43m \u001b[49m\u001b[43mcompression\u001b[49m\u001b[43m=\u001b[49m\u001b[43mnew_compression\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 322\u001b[39m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[32m 323\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m _UnaryOutcome(response, call)\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/grpc/_channel.py:1198\u001b[39m, in \u001b[36m_UnaryUnaryMultiCallable.with_call\u001b[39m\u001b[34m(self, request, timeout, metadata, credentials, wait_for_ready, compression)\u001b[39m\n\u001b[32m 1192\u001b[39m (\n\u001b[32m 1193\u001b[39m state,\n\u001b[32m 1194\u001b[39m call,\n\u001b[32m 1195\u001b[39m ) = \u001b[38;5;28mself\u001b[39m._blocking(\n\u001b[32m 1196\u001b[39m request, timeout, metadata, credentials, wait_for_ready, compression\n\u001b[32m 1197\u001b[39m )\n\u001b[32m-> \u001b[39m\u001b[32m1198\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43m_end_unary_response_blocking\u001b[49m\u001b[43m(\u001b[49m\u001b[43mstate\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mcall\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mTrue\u001b[39;49;00m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mNone\u001b[39;49;00m\u001b[43m)\u001b[49m\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/grpc/_channel.py:1006\u001b[39m, in \u001b[36m_end_unary_response_blocking\u001b[39m\u001b[34m(state, call, with_call, deadline)\u001b[39m\n\u001b[32m 1005\u001b[39m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[32m-> \u001b[39m\u001b[32m1006\u001b[39m \u001b[38;5;28;01mraise\u001b[39;00m _InactiveRpcError(state)\n",
"\u001b[31m_InactiveRpcError\u001b[39m: <_InactiveRpcError of RPC that terminated with:\n\tstatus = StatusCode.INVALID_ARGUMENT\n\tdetails = \"* GenerateContentRequest.contents: contents is not specified\n* GenerateContentRequest.tools[0].function_declarations[8].name: Invalid function name. Must start with a letter or an underscore. Must be alphameric (a-z, A-Z, 0-9), underscores (_), dots (.) or dashes (-), with a maximum length of 64.\n\"\n\tdebug_error_string = \"UNKNOWN:Error received from peer ipv4:216.58.213.74:443 {created_time:\"2025-04-28T10:17:08.229578+02:00\", grpc_status:3, grpc_message:\"* GenerateContentRequest.contents: contents is not specified\\n* GenerateContentRequest.tools[0].function_declarations[8].name: Invalid function name. Must start with a letter or an underscore. Must be alphameric (a-z, A-Z, 0-9), underscores (_), dots (.) or dashes (-), with a maximum length of 64.\\n\"}\"\n>",
"\nThe above exception was the direct cause of the following exception:\n",
"\u001b[31mInvalidArgument\u001b[39m Traceback (most recent call last)",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/langchain_google_genai/chat_models.py:190\u001b[39m, in \u001b[36m_chat_with_retry.<locals>._chat_with_retry\u001b[39m\u001b[34m(**kwargs)\u001b[39m\n\u001b[32m 189\u001b[39m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[32m--> \u001b[39m\u001b[32m190\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mgeneration_method\u001b[49m\u001b[43m(\u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n\u001b[32m 191\u001b[39m \u001b[38;5;66;03m# Do not retry for these errors.\u001b[39;00m\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/google/ai/generativelanguage_v1beta/services/generative_service/client.py:867\u001b[39m, in \u001b[36mGenerativeServiceClient.generate_content\u001b[39m\u001b[34m(self, request, model, contents, retry, timeout, metadata)\u001b[39m\n\u001b[32m 866\u001b[39m \u001b[38;5;66;03m# Send the request.\u001b[39;00m\n\u001b[32m--> \u001b[39m\u001b[32m867\u001b[39m response = \u001b[43mrpc\u001b[49m\u001b[43m(\u001b[49m\n\u001b[32m 868\u001b[39m \u001b[43m \u001b[49m\u001b[43mrequest\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 869\u001b[39m \u001b[43m \u001b[49m\u001b[43mretry\u001b[49m\u001b[43m=\u001b[49m\u001b[43mretry\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 870\u001b[39m \u001b[43m \u001b[49m\u001b[43mtimeout\u001b[49m\u001b[43m=\u001b[49m\u001b[43mtimeout\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 871\u001b[39m \u001b[43m \u001b[49m\u001b[43mmetadata\u001b[49m\u001b[43m=\u001b[49m\u001b[43mmetadata\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 872\u001b[39m \u001b[43m\u001b[49m\u001b[43m)\u001b[49m\n\u001b[32m 874\u001b[39m \u001b[38;5;66;03m# Done; return the response.\u001b[39;00m\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/google/api_core/gapic_v1/method.py:131\u001b[39m, in \u001b[36m_GapicCallable.__call__\u001b[39m\u001b[34m(self, timeout, retry, compression, *args, **kwargs)\u001b[39m\n\u001b[32m 129\u001b[39m kwargs[\u001b[33m\"\u001b[39m\u001b[33mcompression\u001b[39m\u001b[33m\"\u001b[39m] = compression\n\u001b[32m--> \u001b[39m\u001b[32m131\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mwrapped_func\u001b[49m\u001b[43m(\u001b[49m\u001b[43m*\u001b[49m\u001b[43margs\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/google/api_core/retry.py:372\u001b[39m, in \u001b[36mRetry.__call__.<locals>.retry_wrapped_func\u001b[39m\u001b[34m(*args, **kwargs)\u001b[39m\n\u001b[32m 369\u001b[39m sleep_generator = exponential_sleep_generator(\n\u001b[32m 370\u001b[39m \u001b[38;5;28mself\u001b[39m._initial, \u001b[38;5;28mself\u001b[39m._maximum, multiplier=\u001b[38;5;28mself\u001b[39m._multiplier\n\u001b[32m 371\u001b[39m )\n\u001b[32m--> \u001b[39m\u001b[32m372\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mretry_target\u001b[49m\u001b[43m(\u001b[49m\n\u001b[32m 373\u001b[39m \u001b[43m \u001b[49m\u001b[43mtarget\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 374\u001b[39m \u001b[43m \u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43m_predicate\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 375\u001b[39m \u001b[43m \u001b[49m\u001b[43msleep_generator\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 376\u001b[39m \u001b[43m \u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43m_timeout\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 377\u001b[39m \u001b[43m \u001b[49m\u001b[43mon_error\u001b[49m\u001b[43m=\u001b[49m\u001b[43mon_error\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 378\u001b[39m \u001b[43m\u001b[49m\u001b[43m)\u001b[49m\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/google/api_core/retry.py:207\u001b[39m, in \u001b[36mretry_target\u001b[39m\u001b[34m(target, predicate, sleep_generator, timeout, on_error, **kwargs)\u001b[39m\n\u001b[32m 206\u001b[39m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[32m--> \u001b[39m\u001b[32m207\u001b[39m result = \u001b[43mtarget\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[32m 208\u001b[39m \u001b[38;5;28;01mif\u001b[39;00m inspect.isawaitable(result):\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/google/api_core/timeout.py:120\u001b[39m, in \u001b[36mTimeToDeadlineTimeout.__call__.<locals>.func_with_timeout\u001b[39m\u001b[34m(*args, **kwargs)\u001b[39m\n\u001b[32m 118\u001b[39m kwargs[\u001b[33m\"\u001b[39m\u001b[33mtimeout\u001b[39m\u001b[33m\"\u001b[39m] = \u001b[38;5;28mmax\u001b[39m(\u001b[32m0\u001b[39m, \u001b[38;5;28mself\u001b[39m._timeout - time_since_first_attempt)\n\u001b[32m--> \u001b[39m\u001b[32m120\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mfunc\u001b[49m\u001b[43m(\u001b[49m\u001b[43m*\u001b[49m\u001b[43margs\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/google/api_core/grpc_helpers.py:81\u001b[39m, in \u001b[36m_wrap_unary_errors.<locals>.error_remapped_callable\u001b[39m\u001b[34m(*args, **kwargs)\u001b[39m\n\u001b[32m 80\u001b[39m \u001b[38;5;28;01mexcept\u001b[39;00m grpc.RpcError \u001b[38;5;28;01mas\u001b[39;00m exc:\n\u001b[32m---> \u001b[39m\u001b[32m81\u001b[39m \u001b[38;5;28;01mraise\u001b[39;00m exceptions.from_grpc_error(exc) \u001b[38;5;28;01mfrom\u001b[39;00m \u001b[34;01mexc\u001b[39;00m\n",
"\u001b[31mInvalidArgument\u001b[39m: 400 * GenerateContentRequest.contents: contents is not specified\n* GenerateContentRequest.tools[0].function_declarations[8].name: Invalid function name. Must start with a letter or an underscore. Must be alphameric (a-z, A-Z, 0-9), underscores (_), dots (.) or dashes (-), with a maximum length of 64.\n",
"\nThe above exception was the direct cause of the following exception:\n",
"\u001b[31mChatGoogleGenerativeAIError\u001b[39m Traceback (most recent call last)",
"\u001b[36mCell\u001b[39m\u001b[36m \u001b[39m\u001b[32mIn[13]\u001b[39m\u001b[32m, line 3\u001b[39m\n\u001b[32m 1\u001b[39m question = \u001b[33m\"\u001b[39m\u001b[33m\"\u001b[39m\n\u001b[32m 2\u001b[39m messages = [HumanMessage(content=question)]\n\u001b[32m----> \u001b[39m\u001b[32m3\u001b[39m messages = \u001b[43mgraph\u001b[49m\u001b[43m.\u001b[49m\u001b[43minvoke\u001b[49m\u001b[43m(\u001b[49m\u001b[43m{\u001b[49m\u001b[33;43m\"\u001b[39;49m\u001b[33;43mmessages\u001b[39;49m\u001b[33;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\u001b[43m \u001b[49m\u001b[43mmessages\u001b[49m\u001b[43m}\u001b[49m\u001b[43m)\u001b[49m\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/langgraph/pregel/__init__.py:2795\u001b[39m, in \u001b[36mPregel.invoke\u001b[39m\u001b[34m(self, input, config, stream_mode, output_keys, interrupt_before, interrupt_after, checkpoint_during, debug, **kwargs)\u001b[39m\n\u001b[32m 2793\u001b[39m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[32m 2794\u001b[39m chunks = []\n\u001b[32m-> \u001b[39m\u001b[32m2795\u001b[39m \u001b[43m\u001b[49m\u001b[38;5;28;43;01mfor\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[43mchunk\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;129;43;01min\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43mstream\u001b[49m\u001b[43m(\u001b[49m\n\u001b[32m 2796\u001b[39m \u001b[43m \u001b[49m\u001b[38;5;28;43minput\u001b[39;49m\u001b[43m,\u001b[49m\n\u001b[32m 2797\u001b[39m \u001b[43m \u001b[49m\u001b[43mconfig\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 2798\u001b[39m \u001b[43m \u001b[49m\u001b[43mstream_mode\u001b[49m\u001b[43m=\u001b[49m\u001b[43mstream_mode\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 2799\u001b[39m \u001b[43m \u001b[49m\u001b[43moutput_keys\u001b[49m\u001b[43m=\u001b[49m\u001b[43moutput_keys\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 2800\u001b[39m \u001b[43m \u001b[49m\u001b[43minterrupt_before\u001b[49m\u001b[43m=\u001b[49m\u001b[43minterrupt_before\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 2801\u001b[39m \u001b[43m \u001b[49m\u001b[43minterrupt_after\u001b[49m\u001b[43m=\u001b[49m\u001b[43minterrupt_after\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 2802\u001b[39m \u001b[43m \u001b[49m\u001b[43mcheckpoint_during\u001b[49m\u001b[43m=\u001b[49m\u001b[43mcheckpoint_during\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 2803\u001b[39m \u001b[43m \u001b[49m\u001b[43mdebug\u001b[49m\u001b[43m=\u001b[49m\u001b[43mdebug\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 2804\u001b[39m \u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43mkwargs\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 2805\u001b[39m \u001b[43m\u001b[49m\u001b[43m)\u001b[49m\u001b[43m:\u001b[49m\n\u001b[32m 2806\u001b[39m \u001b[43m \u001b[49m\u001b[38;5;28;43;01mif\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[43mstream_mode\u001b[49m\u001b[43m \u001b[49m\u001b[43m==\u001b[49m\u001b[43m \u001b[49m\u001b[33;43m\"\u001b[39;49m\u001b[33;43mvalues\u001b[39;49m\u001b[33;43m\"\u001b[39;49m\u001b[43m:\u001b[49m\n\u001b[32m 2807\u001b[39m \u001b[43m \u001b[49m\u001b[43mlatest\u001b[49m\u001b[43m \u001b[49m\u001b[43m=\u001b[49m\u001b[43m \u001b[49m\u001b[43mchunk\u001b[49m\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/langgraph/pregel/__init__.py:2433\u001b[39m, in \u001b[36mPregel.stream\u001b[39m\u001b[34m(self, input, config, stream_mode, output_keys, interrupt_before, interrupt_after, checkpoint_during, debug, subgraphs)\u001b[39m\n\u001b[32m 2427\u001b[39m \u001b[38;5;66;03m# Similarly to Bulk Synchronous Parallel / Pregel model\u001b[39;00m\n\u001b[32m 2428\u001b[39m \u001b[38;5;66;03m# computation proceeds in steps, while there are channel updates.\u001b[39;00m\n\u001b[32m 2429\u001b[39m \u001b[38;5;66;03m# Channel updates from step N are only visible in step N+1\u001b[39;00m\n\u001b[32m 2430\u001b[39m \u001b[38;5;66;03m# channels are guaranteed to be immutable for the duration of the step,\u001b[39;00m\n\u001b[32m 2431\u001b[39m \u001b[38;5;66;03m# with channel updates applied only at the transition between steps.\u001b[39;00m\n\u001b[32m 2432\u001b[39m \u001b[38;5;28;01mwhile\u001b[39;00m loop.tick(input_keys=\u001b[38;5;28mself\u001b[39m.input_channels):\n\u001b[32m-> \u001b[39m\u001b[32m2433\u001b[39m \u001b[43m \u001b[49m\u001b[38;5;28;43;01mfor\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[43m_\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;129;43;01min\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[43mrunner\u001b[49m\u001b[43m.\u001b[49m\u001b[43mtick\u001b[49m\u001b[43m(\u001b[49m\n\u001b[32m 2434\u001b[39m \u001b[43m \u001b[49m\u001b[43mloop\u001b[49m\u001b[43m.\u001b[49m\u001b[43mtasks\u001b[49m\u001b[43m.\u001b[49m\u001b[43mvalues\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 2435\u001b[39m \u001b[43m \u001b[49m\u001b[43mtimeout\u001b[49m\u001b[43m=\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43mstep_timeout\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 2436\u001b[39m \u001b[43m \u001b[49m\u001b[43mretry_policy\u001b[49m\u001b[43m=\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43mretry_policy\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 2437\u001b[39m \u001b[43m \u001b[49m\u001b[43mget_waiter\u001b[49m\u001b[43m=\u001b[49m\u001b[43mget_waiter\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 2438\u001b[39m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\u001b[43m:\u001b[49m\n\u001b[32m 2439\u001b[39m \u001b[43m \u001b[49m\u001b[38;5;66;43;03m# emit output\u001b[39;49;00m\n\u001b[32m 2440\u001b[39m \u001b[43m \u001b[49m\u001b[38;5;28;43;01myield from\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[43moutput\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[32m 2441\u001b[39m \u001b[38;5;66;03m# emit output\u001b[39;00m\n",
"\u001b[36mCell\u001b[39m\u001b[36m \u001b[39m\u001b[32mIn[11]\u001b[39m\u001b[32m, line 12\u001b[39m, in \u001b[36massistant\u001b[39m\u001b[34m(state)\u001b[39m\n\u001b[32m 10\u001b[39m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[34massistant\u001b[39m(state: MessagesState):\n\u001b[32m 11\u001b[39m \u001b[38;5;250m \u001b[39m\u001b[33;03m\"\"\"Assistant node\"\"\"\u001b[39;00m\n\u001b[32m---> \u001b[39m\u001b[32m12\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m {\u001b[33m\"\u001b[39m\u001b[33mmessages\u001b[39m\u001b[33m\"\u001b[39m: [\u001b[43mllm_with_tools\u001b[49m\u001b[43m.\u001b[49m\u001b[43minvoke\u001b[49m\u001b[43m(\u001b[49m\u001b[43m[\u001b[49m\u001b[43msys_msg\u001b[49m\u001b[43m]\u001b[49m\u001b[43m \u001b[49m\u001b[43m+\u001b[49m\u001b[43m \u001b[49m\u001b[43mstate\u001b[49m\u001b[43m[\u001b[49m\u001b[33;43m\"\u001b[39;49m\u001b[33;43mmessages\u001b[39;49m\u001b[33;43m\"\u001b[39;49m\u001b[43m]\u001b[49m\u001b[43m)\u001b[49m]}\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/langchain_core/runnables/base.py:5416\u001b[39m, in \u001b[36mRunnableBindingBase.invoke\u001b[39m\u001b[34m(self, input, config, **kwargs)\u001b[39m\n\u001b[32m 5409\u001b[39m \u001b[38;5;129m@override\u001b[39m\n\u001b[32m 5410\u001b[39m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[34minvoke\u001b[39m(\n\u001b[32m 5411\u001b[39m \u001b[38;5;28mself\u001b[39m,\n\u001b[32m (...)\u001b[39m\u001b[32m 5414\u001b[39m **kwargs: Optional[Any],\n\u001b[32m 5415\u001b[39m ) -> Output:\n\u001b[32m-> \u001b[39m\u001b[32m5416\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43mbound\u001b[49m\u001b[43m.\u001b[49m\u001b[43minvoke\u001b[49m\u001b[43m(\u001b[49m\n\u001b[32m 5417\u001b[39m \u001b[43m \u001b[49m\u001b[38;5;28;43minput\u001b[39;49m\u001b[43m,\u001b[49m\n\u001b[32m 5418\u001b[39m \u001b[43m \u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43m_merge_configs\u001b[49m\u001b[43m(\u001b[49m\u001b[43mconfig\u001b[49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 5419\u001b[39m \u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43m{\u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43mkwargs\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43mkwargs\u001b[49m\u001b[43m}\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 5420\u001b[39m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/langchain_google_genai/chat_models.py:1175\u001b[39m, in \u001b[36mChatGoogleGenerativeAI.invoke\u001b[39m\u001b[34m(self, input, config, code_execution, stop, **kwargs)\u001b[39m\n\u001b[32m 1170\u001b[39m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[32m 1171\u001b[39m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mValueError\u001b[39;00m(\n\u001b[32m 1172\u001b[39m \u001b[33m\"\u001b[39m\u001b[33mTools are already defined.\u001b[39m\u001b[33m\"\u001b[39m \u001b[33m\"\u001b[39m\u001b[33mcode_execution tool can\u001b[39m\u001b[33m'\u001b[39m\u001b[33mt be defined\u001b[39m\u001b[33m\"\u001b[39m\n\u001b[32m 1173\u001b[39m )\n\u001b[32m-> \u001b[39m\u001b[32m1175\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43msuper\u001b[39;49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\u001b[43m.\u001b[49m\u001b[43minvoke\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43minput\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mconfig\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mstop\u001b[49m\u001b[43m=\u001b[49m\u001b[43mstop\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py:369\u001b[39m, in \u001b[36mBaseChatModel.invoke\u001b[39m\u001b[34m(self, input, config, stop, **kwargs)\u001b[39m\n\u001b[32m 357\u001b[39m \u001b[38;5;129m@override\u001b[39m\n\u001b[32m 358\u001b[39m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[34minvoke\u001b[39m(\n\u001b[32m 359\u001b[39m \u001b[38;5;28mself\u001b[39m,\n\u001b[32m (...)\u001b[39m\u001b[32m 364\u001b[39m **kwargs: Any,\n\u001b[32m 365\u001b[39m ) -> BaseMessage:\n\u001b[32m 366\u001b[39m config = ensure_config(config)\n\u001b[32m 367\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m cast(\n\u001b[32m 368\u001b[39m \u001b[33m\"\u001b[39m\u001b[33mChatGeneration\u001b[39m\u001b[33m\"\u001b[39m,\n\u001b[32m--> \u001b[39m\u001b[32m369\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43mgenerate_prompt\u001b[49m\u001b[43m(\u001b[49m\n\u001b[32m 370\u001b[39m \u001b[43m \u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43m_convert_input\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43minput\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m]\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 371\u001b[39m \u001b[43m \u001b[49m\u001b[43mstop\u001b[49m\u001b[43m=\u001b[49m\u001b[43mstop\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 372\u001b[39m \u001b[43m \u001b[49m\u001b[43mcallbacks\u001b[49m\u001b[43m=\u001b[49m\u001b[43mconfig\u001b[49m\u001b[43m.\u001b[49m\u001b[43mget\u001b[49m\u001b[43m(\u001b[49m\u001b[33;43m\"\u001b[39;49m\u001b[33;43mcallbacks\u001b[39;49m\u001b[33;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 373\u001b[39m \u001b[43m \u001b[49m\u001b[43mtags\u001b[49m\u001b[43m=\u001b[49m\u001b[43mconfig\u001b[49m\u001b[43m.\u001b[49m\u001b[43mget\u001b[49m\u001b[43m(\u001b[49m\u001b[33;43m\"\u001b[39;49m\u001b[33;43mtags\u001b[39;49m\u001b[33;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 374\u001b[39m \u001b[43m \u001b[49m\u001b[43mmetadata\u001b[49m\u001b[43m=\u001b[49m\u001b[43mconfig\u001b[49m\u001b[43m.\u001b[49m\u001b[43mget\u001b[49m\u001b[43m(\u001b[49m\u001b[33;43m\"\u001b[39;49m\u001b[33;43mmetadata\u001b[39;49m\u001b[33;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 375\u001b[39m \u001b[43m \u001b[49m\u001b[43mrun_name\u001b[49m\u001b[43m=\u001b[49m\u001b[43mconfig\u001b[49m\u001b[43m.\u001b[49m\u001b[43mget\u001b[49m\u001b[43m(\u001b[49m\u001b[33;43m\"\u001b[39;49m\u001b[33;43mrun_name\u001b[39;49m\u001b[33;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 376\u001b[39m \u001b[43m \u001b[49m\u001b[43mrun_id\u001b[49m\u001b[43m=\u001b[49m\u001b[43mconfig\u001b[49m\u001b[43m.\u001b[49m\u001b[43mpop\u001b[49m\u001b[43m(\u001b[49m\u001b[33;43m\"\u001b[39;49m\u001b[33;43mrun_id\u001b[39;49m\u001b[33;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mNone\u001b[39;49;00m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 377\u001b[39m \u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43mkwargs\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 378\u001b[39m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m.generations[\u001b[32m0\u001b[39m][\u001b[32m0\u001b[39m],\n\u001b[32m 379\u001b[39m ).message\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py:946\u001b[39m, in \u001b[36mBaseChatModel.generate_prompt\u001b[39m\u001b[34m(self, prompts, stop, callbacks, **kwargs)\u001b[39m\n\u001b[32m 937\u001b[39m \u001b[38;5;129m@override\u001b[39m\n\u001b[32m 938\u001b[39m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[34mgenerate_prompt\u001b[39m(\n\u001b[32m 939\u001b[39m \u001b[38;5;28mself\u001b[39m,\n\u001b[32m (...)\u001b[39m\u001b[32m 943\u001b[39m **kwargs: Any,\n\u001b[32m 944\u001b[39m ) -> LLMResult:\n\u001b[32m 945\u001b[39m prompt_messages = [p.to_messages() \u001b[38;5;28;01mfor\u001b[39;00m p \u001b[38;5;129;01min\u001b[39;00m prompts]\n\u001b[32m--> \u001b[39m\u001b[32m946\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43mgenerate\u001b[49m\u001b[43m(\u001b[49m\u001b[43mprompt_messages\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mstop\u001b[49m\u001b[43m=\u001b[49m\u001b[43mstop\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mcallbacks\u001b[49m\u001b[43m=\u001b[49m\u001b[43mcallbacks\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py:765\u001b[39m, in \u001b[36mBaseChatModel.generate\u001b[39m\u001b[34m(self, messages, stop, callbacks, tags, metadata, run_name, run_id, **kwargs)\u001b[39m\n\u001b[32m 762\u001b[39m \u001b[38;5;28;01mfor\u001b[39;00m i, m \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28menumerate\u001b[39m(input_messages):\n\u001b[32m 763\u001b[39m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[32m 764\u001b[39m results.append(\n\u001b[32m--> \u001b[39m\u001b[32m765\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43m_generate_with_cache\u001b[49m\u001b[43m(\u001b[49m\n\u001b[32m 766\u001b[39m \u001b[43m \u001b[49m\u001b[43mm\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 767\u001b[39m \u001b[43m \u001b[49m\u001b[43mstop\u001b[49m\u001b[43m=\u001b[49m\u001b[43mstop\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 768\u001b[39m \u001b[43m \u001b[49m\u001b[43mrun_manager\u001b[49m\u001b[43m=\u001b[49m\u001b[43mrun_managers\u001b[49m\u001b[43m[\u001b[49m\u001b[43mi\u001b[49m\u001b[43m]\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mif\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[43mrun_managers\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01melse\u001b[39;49;00m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mNone\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[32m 769\u001b[39m \u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43mkwargs\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 770\u001b[39m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[32m 771\u001b[39m )\n\u001b[32m 772\u001b[39m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[32m 773\u001b[39m \u001b[38;5;28;01mif\u001b[39;00m run_managers:\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py:1011\u001b[39m, in \u001b[36mBaseChatModel._generate_with_cache\u001b[39m\u001b[34m(self, messages, stop, run_manager, **kwargs)\u001b[39m\n\u001b[32m 1009\u001b[39m result = generate_from_stream(\u001b[38;5;28miter\u001b[39m(chunks))\n\u001b[32m 1010\u001b[39m \u001b[38;5;28;01melif\u001b[39;00m inspect.signature(\u001b[38;5;28mself\u001b[39m._generate).parameters.get(\u001b[33m\"\u001b[39m\u001b[33mrun_manager\u001b[39m\u001b[33m\"\u001b[39m):\n\u001b[32m-> \u001b[39m\u001b[32m1011\u001b[39m result = \u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43m_generate\u001b[49m\u001b[43m(\u001b[49m\n\u001b[32m 1012\u001b[39m \u001b[43m \u001b[49m\u001b[43mmessages\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mstop\u001b[49m\u001b[43m=\u001b[49m\u001b[43mstop\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mrun_manager\u001b[49m\u001b[43m=\u001b[49m\u001b[43mrun_manager\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43mkwargs\u001b[49m\n\u001b[32m 1013\u001b[39m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[32m 1014\u001b[39m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[32m 1015\u001b[39m result = \u001b[38;5;28mself\u001b[39m._generate(messages, stop=stop, **kwargs)\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/langchain_google_genai/chat_models.py:1242\u001b[39m, in \u001b[36mChatGoogleGenerativeAI._generate\u001b[39m\u001b[34m(self, messages, stop, run_manager, tools, functions, safety_settings, tool_config, generation_config, cached_content, tool_choice, **kwargs)\u001b[39m\n\u001b[32m 1216\u001b[39m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[34m_generate\u001b[39m(\n\u001b[32m 1217\u001b[39m \u001b[38;5;28mself\u001b[39m,\n\u001b[32m 1218\u001b[39m messages: List[BaseMessage],\n\u001b[32m (...)\u001b[39m\u001b[32m 1229\u001b[39m **kwargs: Any,\n\u001b[32m 1230\u001b[39m ) -> ChatResult:\n\u001b[32m 1231\u001b[39m request = \u001b[38;5;28mself\u001b[39m._prepare_request(\n\u001b[32m 1232\u001b[39m messages,\n\u001b[32m 1233\u001b[39m stop=stop,\n\u001b[32m (...)\u001b[39m\u001b[32m 1240\u001b[39m tool_choice=tool_choice,\n\u001b[32m 1241\u001b[39m )\n\u001b[32m-> \u001b[39m\u001b[32m1242\u001b[39m response: GenerateContentResponse = \u001b[43m_chat_with_retry\u001b[49m\u001b[43m(\u001b[49m\n\u001b[32m 1243\u001b[39m \u001b[43m \u001b[49m\u001b[43mrequest\u001b[49m\u001b[43m=\u001b[49m\u001b[43mrequest\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 1244\u001b[39m \u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43mkwargs\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 1245\u001b[39m \u001b[43m \u001b[49m\u001b[43mgeneration_method\u001b[49m\u001b[43m=\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43mclient\u001b[49m\u001b[43m.\u001b[49m\u001b[43mgenerate_content\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 1246\u001b[39m \u001b[43m \u001b[49m\u001b[43mmetadata\u001b[49m\u001b[43m=\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43mdefault_metadata\u001b[49m\u001b[43m,\u001b[49m\n\u001b[32m 1247\u001b[39m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[32m 1248\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m _response_to_result(response)\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/langchain_google_genai/chat_models.py:208\u001b[39m, in \u001b[36m_chat_with_retry\u001b[39m\u001b[34m(generation_method, **kwargs)\u001b[39m\n\u001b[32m 205\u001b[39m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[32m 206\u001b[39m \u001b[38;5;28;01mraise\u001b[39;00m e\n\u001b[32m--> \u001b[39m\u001b[32m208\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43m_chat_with_retry\u001b[49m\u001b[43m(\u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/tenacity/__init__.py:338\u001b[39m, in \u001b[36mBaseRetrying.wraps.<locals>.wrapped_f\u001b[39m\u001b[34m(*args, **kw)\u001b[39m\n\u001b[32m 336\u001b[39m copy = \u001b[38;5;28mself\u001b[39m.copy()\n\u001b[32m 337\u001b[39m wrapped_f.statistics = copy.statistics \u001b[38;5;66;03m# type: ignore[attr-defined]\u001b[39;00m\n\u001b[32m--> \u001b[39m\u001b[32m338\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mcopy\u001b[49m\u001b[43m(\u001b[49m\u001b[43mf\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43margs\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43mkw\u001b[49m\u001b[43m)\u001b[49m\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/tenacity/__init__.py:477\u001b[39m, in \u001b[36mRetrying.__call__\u001b[39m\u001b[34m(self, fn, *args, **kwargs)\u001b[39m\n\u001b[32m 475\u001b[39m retry_state = RetryCallState(retry_object=\u001b[38;5;28mself\u001b[39m, fn=fn, args=args, kwargs=kwargs)\n\u001b[32m 476\u001b[39m \u001b[38;5;28;01mwhile\u001b[39;00m \u001b[38;5;28;01mTrue\u001b[39;00m:\n\u001b[32m--> \u001b[39m\u001b[32m477\u001b[39m do = \u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43miter\u001b[49m\u001b[43m(\u001b[49m\u001b[43mretry_state\u001b[49m\u001b[43m=\u001b[49m\u001b[43mretry_state\u001b[49m\u001b[43m)\u001b[49m\n\u001b[32m 478\u001b[39m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28misinstance\u001b[39m(do, DoAttempt):\n\u001b[32m 479\u001b[39m \u001b[38;5;28;01mtry\u001b[39;00m:\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/tenacity/__init__.py:378\u001b[39m, in \u001b[36mBaseRetrying.iter\u001b[39m\u001b[34m(self, retry_state)\u001b[39m\n\u001b[32m 376\u001b[39m result = \u001b[38;5;28;01mNone\u001b[39;00m\n\u001b[32m 377\u001b[39m \u001b[38;5;28;01mfor\u001b[39;00m action \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28mself\u001b[39m.iter_state.actions:\n\u001b[32m--> \u001b[39m\u001b[32m378\u001b[39m result = \u001b[43maction\u001b[49m\u001b[43m(\u001b[49m\u001b[43mretry_state\u001b[49m\u001b[43m)\u001b[49m\n\u001b[32m 379\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m result\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/tenacity/__init__.py:400\u001b[39m, in \u001b[36mBaseRetrying._post_retry_check_actions.<locals>.<lambda>\u001b[39m\u001b[34m(rs)\u001b[39m\n\u001b[32m 398\u001b[39m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[34m_post_retry_check_actions\u001b[39m(\u001b[38;5;28mself\u001b[39m, retry_state: \u001b[33m\"\u001b[39m\u001b[33mRetryCallState\u001b[39m\u001b[33m\"\u001b[39m) -> \u001b[38;5;28;01mNone\u001b[39;00m:\n\u001b[32m 399\u001b[39m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m (\u001b[38;5;28mself\u001b[39m.iter_state.is_explicit_retry \u001b[38;5;129;01mor\u001b[39;00m \u001b[38;5;28mself\u001b[39m.iter_state.retry_run_result):\n\u001b[32m--> \u001b[39m\u001b[32m400\u001b[39m \u001b[38;5;28mself\u001b[39m._add_action_func(\u001b[38;5;28;01mlambda\u001b[39;00m rs: \u001b[43mrs\u001b[49m\u001b[43m.\u001b[49m\u001b[43moutcome\u001b[49m\u001b[43m.\u001b[49m\u001b[43mresult\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m)\n\u001b[32m 401\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m\n\u001b[32m 403\u001b[39m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mself\u001b[39m.after \u001b[38;5;129;01mis\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m:\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/Cellar/[email protected]/3.11.11/Frameworks/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/_base.py:449\u001b[39m, in \u001b[36mFuture.result\u001b[39m\u001b[34m(self, timeout)\u001b[39m\n\u001b[32m 447\u001b[39m \u001b[38;5;28;01mraise\u001b[39;00m CancelledError()\n\u001b[32m 448\u001b[39m \u001b[38;5;28;01melif\u001b[39;00m \u001b[38;5;28mself\u001b[39m._state == FINISHED:\n\u001b[32m--> \u001b[39m\u001b[32m449\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43m__get_result\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[32m 451\u001b[39m \u001b[38;5;28mself\u001b[39m._condition.wait(timeout)\n\u001b[32m 453\u001b[39m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mself\u001b[39m._state \u001b[38;5;129;01min\u001b[39;00m [CANCELLED, CANCELLED_AND_NOTIFIED]:\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/Cellar/[email protected]/3.11.11/Frameworks/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/_base.py:401\u001b[39m, in \u001b[36mFuture.__get_result\u001b[39m\u001b[34m(self)\u001b[39m\n\u001b[32m 399\u001b[39m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mself\u001b[39m._exception:\n\u001b[32m 400\u001b[39m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[32m--> \u001b[39m\u001b[32m401\u001b[39m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;28mself\u001b[39m._exception\n\u001b[32m 402\u001b[39m \u001b[38;5;28;01mfinally\u001b[39;00m:\n\u001b[32m 403\u001b[39m \u001b[38;5;66;03m# Break a reference cycle with the exception in self._exception\u001b[39;00m\n\u001b[32m 404\u001b[39m \u001b[38;5;28mself\u001b[39m = \u001b[38;5;28;01mNone\u001b[39;00m\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/tenacity/__init__.py:480\u001b[39m, in \u001b[36mRetrying.__call__\u001b[39m\u001b[34m(self, fn, *args, **kwargs)\u001b[39m\n\u001b[32m 478\u001b[39m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28misinstance\u001b[39m(do, DoAttempt):\n\u001b[32m 479\u001b[39m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[32m--> \u001b[39m\u001b[32m480\u001b[39m result = \u001b[43mfn\u001b[49m\u001b[43m(\u001b[49m\u001b[43m*\u001b[49m\u001b[43margs\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n\u001b[32m 481\u001b[39m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m: \u001b[38;5;66;03m# noqa: B902\u001b[39;00m\n\u001b[32m 482\u001b[39m retry_state.set_exception(sys.exc_info()) \u001b[38;5;66;03m# type: ignore[arg-type]\u001b[39;00m\n",
"\u001b[36mFile \u001b[39m\u001b[32m/opt/homebrew/lib/python3.11/site-packages/langchain_google_genai/chat_models.py:202\u001b[39m, in \u001b[36m_chat_with_retry.<locals>._chat_with_retry\u001b[39m\u001b[34m(**kwargs)\u001b[39m\n\u001b[32m 199\u001b[39m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mValueError\u001b[39;00m(error_msg)\n\u001b[32m 201\u001b[39m \u001b[38;5;28;01mexcept\u001b[39;00m google.api_core.exceptions.InvalidArgument \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[32m--> \u001b[39m\u001b[32m202\u001b[39m \u001b[38;5;28;01mraise\u001b[39;00m ChatGoogleGenerativeAIError(\n\u001b[32m 203\u001b[39m \u001b[33mf\u001b[39m\u001b[33m\"\u001b[39m\u001b[33mInvalid argument provided to Gemini: \u001b[39m\u001b[38;5;132;01m{\u001b[39;00me\u001b[38;5;132;01m}\u001b[39;00m\u001b[33m\"\u001b[39m\n\u001b[32m 204\u001b[39m ) \u001b[38;5;28;01mfrom\u001b[39;00m \u001b[34;01me\u001b[39;00m\n\u001b[32m 205\u001b[39m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[32m 206\u001b[39m \u001b[38;5;28;01mraise\u001b[39;00m e\n",
"\u001b[31mChatGoogleGenerativeAIError\u001b[39m: Invalid argument provided to Gemini: 400 * GenerateContentRequest.contents: contents is not specified\n* GenerateContentRequest.tools[0].function_declarations[8].name: Invalid function name. Must start with a letter or an underscore. Must be alphameric (a-z, A-Z, 0-9), underscores (_), dots (.) or dashes (-), with a maximum length of 64.\n",
"During task with name 'assistant' and id '90f43511-9e06-76c1-6eab-480c25fa29aa'"
]
}
],
"source": [
"question = \"\"\n",
"messages = [HumanMessage(content=question)]\n",
"messages = graph.invoke({\"messages\": messages})"
]
},
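{
"cell_type": "markdown",
"id": "9e7c1a2b",
"metadata": {},
"source": [
"### Why the previous cell failed\n",
"\n",
"The `ChatGoogleGenerativeAIError` above reports two problems: the request `contents` was empty (the question string was blank; fixed above), and one of the bound tools has a name Gemini rejects. Gemini requires function-declaration names to start with a letter or underscore, contain only alphanumerics, underscores, dots, or dashes, and be at most 64 characters long. The next cell is a minimal sketch, assuming the `tools` list defined earlier in this notebook, that flags any offending tool names so they can be renamed before calling `bind_tools` again."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4d2f8c1a",
"metadata": {},
"outputs": [],
"source": [
"# Minimal sketch: flag tool names that Gemini would reject (see the 400 error above).\n",
"# Assumes `tools` (the list of LangChain tools bound to the LLM) is defined earlier.\n",
"import re\n",
"\n",
"# Gemini's rule: start with a letter or underscore; then letters, digits, _, ., or -;\n",
"# 64 characters maximum in total.\n",
"GEMINI_NAME_RE = re.compile(r\"^[A-Za-z_][A-Za-z0-9_.-]{0,63}$\")\n",
"\n",
"def check_tool_names(tools):\n",
"    \"\"\"Return the tool names that violate Gemini's function-name rules.\"\"\"\n",
"    bad = [t.name for t in tools if not GEMINI_NAME_RE.match(t.name)]\n",
"    for name in bad:\n",
"        print(f\"Invalid tool name for Gemini: {name!r}\")\n",
"    return bad\n",
"\n",
"# check_tool_names(tools)  # uncomment once `tools` is defined"
]
},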
{
"cell_type": "code",
"execution_count": null,
"id": "330cbf17",
"metadata": {},
"outputs": [],
"source": [
"for m in messages['messages']:\n",
" m.pretty_print()"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.11"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
|