Date filters #240

Open
wants to merge 4 commits into base: main

110 changes: 57 additions & 53 deletions examples/langgraph-agent/agent.ipynb
@@ -22,10 +22,10 @@
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import asyncio\n",
"import json\n",
@@ -47,10 +47,10 @@
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"def setup_logging():\n",
" logger = logging.getLogger()\n",
@@ -67,8 +67,8 @@
]
},
{
"metadata": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## LangSmith integration (Optional)\n",
"\n",
@@ -78,18 +78,18 @@
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"os.environ['LANGCHAIN_TRACING_V2'] = 'false'\n",
"os.environ['LANGCHAIN_PROJECT'] = 'Graphiti LangGraph Tutorial'"
]
},
{
"metadata": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Configure Graphiti\n",
"\n",
@@ -103,10 +103,10 @@
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Configure Graphiti\n",
"\n",
@@ -127,8 +127,8 @@
]
},
{
"metadata": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Generating a database schema \n",
"\n",
@@ -138,19 +138,19 @@
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Note: This will clear the database\n",
"await clear_data(client.driver)\n",
"await client.build_indices_and_constraints()"
]
},
{
"metadata": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Load Shoe Data into the Graph\n",
"\n",
@@ -161,10 +161,10 @@
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"async def ingest_products_data(client: Graphiti):\n",
" script_dir = Path.cwd().parent\n",
@@ -187,19 +187,19 @@
]
},
{
"metadata": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create a user node in the Graphiti graph\n",
"\n",
"In your own app, this step could be done later once the user has identified themselves and made their sales intent known. We do this here so we can configure the agent with the user's `node_uuid`."
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from graphiti_core.search.search_config_recipes import NODE_HYBRID_SEARCH_EPISODE_MENTIONS\n",
"\n",
@@ -224,20 +224,20 @@
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"def edges_to_facts_string(entities: list[EntityEdge]):\n",
" return '-' + '\\n- '.join([edge.fact for edge in entities])"
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from langchain_core.messages import AIMessage, SystemMessage\n",
"from langchain_core.tools import tool\n",
@@ -248,19 +248,19 @@
]
},
{
"metadata": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## `get_shoe_data` Tool\n",
"\n",
"The agent will use this to search the Graphiti graph for information about shoes. We center the search on the `manybirds_node_uuid` to ensure we rank shoe-related data over user data.\n"
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"@tool\n",
"async def get_shoe_data(query: str) -> str:\n",
@@ -278,25 +278,27 @@
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"source": "llm = ChatOpenAI(model='gpt-4o-mini', temperature=0).bind_tools(tools)"
"metadata": {},
"outputs": [],
"source": [
"llm = ChatOpenAI(model='gpt-4o-mini', temperature=0).bind_tools(tools)"
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Test the tool node\n",
"await tool_node.ainvoke({'messages': [await llm.ainvoke('wool shoes')]})"
]
},
{
"metadata": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Chatbot Function Explanation\n",
"\n",
@@ -312,10 +314,10 @@
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"class State(TypedDict):\n",
" messages: Annotated[list, add_messages]\n",
@@ -372,8 +374,8 @@
]
},
{
"metadata": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Setting up the Agent\n",
"\n",
@@ -387,10 +389,10 @@
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"graph_builder = StateGraph(State)\n",
"\n",
@@ -420,34 +422,34 @@
]
},
{
"metadata": {},
"cell_type": "markdown",
"metadata": {},
"source": "Our LangGraph agent graph is illustrated below."
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"with suppress(Exception):\n",
" display(Image(graph.get_graph().draw_mermaid_png()))"
]
},
{
"metadata": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Running the Agent\n",
"\n",
"Let's test the agent with a single call"
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"await graph.ainvoke(\n",
" {\n",
@@ -465,35 +467,37 @@
]
},
{
"metadata": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Viewing the Graph\n",
"\n",
"At this stage, the graph would look something like this. The `jess` node is `INTERESTED_IN` the `TinyBirds Wool Runner` node. The image below was generated using Neo4j Desktop."
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"source": "display(Image(filename='tinybirds-jess.png', width=850))"
"metadata": {},
"outputs": [],
"source": [
"display(Image(filename='tinybirds-jess.png', width=850))"
]
},
{
"metadata": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Running the Agent interactively\n",
"\n",
"The following code will run the agent in an event loop. Just enter a message into the box and click submit."
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"conversation_output = widgets.Output()\n",
"config = {'configurable': {'thread_id': uuid.uuid4().hex}}\n",
@@ -512,14 +516,14 @@
"\n",
" try:\n",
" async for event in graph.astream(\n",
" graph_state,\n",
" config=config,\n",
" graph_state,\n",
" config=config,\n",
" ):\n",
" for value in event.values():\n",
" if 'messages' in value:\n",
" last_message = value['messages'][-1]\n",
" if isinstance(last_message, AIMessage) and isinstance(\n",
" last_message.content, str\n",
" last_message.content, str\n",
" ):\n",
" conversation_output.append_stdout(last_message.content)\n",
" except Exception as e:\n",