• SheeEttin@lemmy.zip · edited 5 days ago

    I would not trust a text generator to do math, no. It’s wholly the wrong tool for the job. Nor do I trust them to be up to date and compliant with tax code. And I really don’t trust them to take legal responsibility for their output.

    • vermaterc@lemmy.ml (OP) · 5 days ago

      State-of-the-art LLM agents do not perform calculations themselves; they call external tools to do that.
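      A minimal sketch of the tool-calling pattern being described: the model emits a structured request, and the host program runs the actual arithmetic deterministically. All function and field names here are hypothetical, not any particular vendor's API.

      ```python
      # Sketch: the LLM never does the arithmetic itself; it emits a
      # structured tool call, and the host program executes it.
      # Names and values are hypothetical.
      import json
      from decimal import Decimal

      def calculate_tax(income: str, rate: str) -> str:
          """Deterministic tool: exact decimal arithmetic, not the LLM."""
          return str(Decimal(income) * Decimal(rate))

      TOOLS = {"calculate_tax": calculate_tax}

      # What a model might emit instead of computing the answer itself.
      model_output = json.dumps({
          "tool": "calculate_tax",
          "arguments": {"income": "85000", "rate": "0.22"},
      })

      call = json.loads(model_output)
      result = TOOLS[call["tool"]](**call["arguments"])
      print(result)  # 18700.00
      ```

      The arithmetic is exact regardless of the model, but note the calculation is only as good as the arguments the model chooses to pass.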

      • audaxdreik@pawb.social · 5 days ago

        You’re describing neurosymbolic AI, a combination of symbolic reasoning and neural network (LLM) models. Gary Marcus wrote an excellent article on it recently that I recommend giving a read, How o3 and Grok 4 Accidentally Vindicated Neurosymbolic AI.

        The primary issue I see here is that you’re still relying on the LLM to correctly understand the problem and invoke the symbolic tools. It needs to parse the data and decide what matters in order to feed it into those tools, and, as has been stated many times, LLMs do not truly “understand” anything; they infer statistically. I still do not trust them to be reliably accurate and perform without error.
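        To make the failure mode concrete: even with a perfectly exact tool, a mis-parsed input produces a precisely wrong answer. The slip below (swapped fields) and all names are hypothetical illustrations, not taken from any real agent.

        ```python
        # Sketch of the failure mode: the tool computes exactly, but if
        # the model extracts the wrong arguments, the exact answer is
        # exactly wrong. Values and the parsing slip are hypothetical.
        from decimal import Decimal

        def deduct(amount: str, deduction: str) -> str:
            """Deterministic subtraction tool."""
            return str(Decimal(amount) - Decimal(deduction))

        # Intended call: deduct("85000", "12000") -> "73000"
        # A statistically plausible but wrong extraction swaps the fields:
        bad_args = {"amount": "12000", "deduction": "85000"}
        print(deduct(**bad_args))  # -73000
        ```

        The tool layer removes arithmetic errors, but it cannot catch an upstream misreading; the output is confidently wrong either way.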