I modified the "Single Agent Workflow" example from the LlamaIndex documentation to work with a local LLM using the @llamaindex/ollama adapter package.

import { tool } from 'llamaindex';
import { agent } from '@llamaindex/workflow';
import { ollama } from '@llamaindex/ollama';

(async () => {
  // Define a joke-telling tool
  const jokeTool = tool(() => 'Baby Llama is called cria', {
    name: 'joke',
    description: 'Use this tool to get a joke',
  });

  // Create a single-agent workflow with the tool
  const jokeAgent = agent({
    tools: [jokeTool],
    llm: ollama({ model: 'qwen2.5-coder:14b' }),
  });

  // Run the workflow
  const result = await jokeAgent.run('Tell me something funny');
  console.log(result.data.result); // Baby Llama is called cria
  console.log('---------------------');
  console.log(result.data.message); // { role: 'assistant', content: 'Baby Llama is called cria' }
})().catch(console.error);

The output was:

{
  "name": "joke",
  "arguments": {}
}
---------------------
{
  role: 'assistant',
  content: '{\n  "name": "joke",\n  "arguments": {}\n}'
}

The tool wasn't actually called.
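
To double-check, one can log inside the tool callback (the console.log is my addition, not part of the original example); if the agent actually executed the tool, this line would print:

const jokeTool = tool(
  () => {
    console.log('joke tool invoked'); // would print if the agent executed the tool
    return 'Baby Llama is called cria';
  },
  {
    name: 'joke',
    description: 'Use this tool to get a joke',
  }
);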


I'm new to the framework, but this looked similar to GitHub issue #17713. That bug was apparently fixed in the Python version but not in the TypeScript one, so I tried running the equivalent code in Python with llama_index:

import asyncio

from llama_index.llms.ollama import Ollama
from llama_index.core.agent.workflow import FunctionAgent

def joke_tool():
    '''Use this tool to get a joke'''
    return 'Baby Llama is called cria'

async def test():
    # FunctionAgent wraps the plain function as a tool automatically
    joke_agent = FunctionAgent(
        tools=[joke_tool],
        llm=Ollama(model='qwen2.5-coder:14b')
        # llm=Ollama(model='deepseek-coder:6.7b-instruct')
    )

    result = await joke_agent.run(user_msg='Tell me something funny')

    print('-----------------')
    print(result)
    print('-----------------')

asyncio.run(test())

The output was similar:

-----------------
{
  "name": "joke_tool",
  "arguments": {}
}
-----------------

Either I'm doing something wrong, or there's a bug in Ollama, or ... I don't know.
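
One way to narrow this down (a sketch of my own, not from the LlamaIndex docs, using the plain ollama npm client and a hand-written minimal tool schema) would be to call Ollama's chat API directly and check whether the model returns a structured tool_calls array instead of JSON text in the content:

import ollama from 'ollama';

(async () => {
  const response = await ollama.chat({
    model: 'qwen2.5-coder:14b',
    messages: [{ role: 'user', content: 'Tell me something funny' }],
    tools: [
      {
        type: 'function',
        function: {
          name: 'joke',
          description: 'Use this tool to get a joke',
          parameters: { type: 'object', properties: {}, required: [] },
        },
      },
    ],
  });
  // If the model supports tool calling, tool_calls should be populated
  // and content should not contain the JSON as plain text.
  console.log(response.message.tool_calls);
  console.log(response.message.content);
})().catch(console.error);

If tool_calls comes back empty or undefined here as well, the problem would seem to be in the model (or Ollama itself) rather than in the LlamaIndex adapters.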
