AI Models Need a Virtual Machine
Applications that use AI embed the model in a framework that interfaces between the model and the rest of the system, providing services such as tool calling and context retrieval. Software for early chatbots took user input, called the LLM, and returned the result to the user: essentially just a read-eval-print loop. But as the capabilities of LLMs have grown and extension mechanisms such as MCP have been defined, the complexity of the control software that calls the LLM has increased...
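To make the point concrete, here is a minimal sketch of that early-chatbot control loop and the first step beyond it. The `call_llm` function and the `clock` tool are hypothetical placeholders, not a real model API; the sketch only illustrates how the loop stops being a plain read-eval-print cycle once the framework has to dispatch tool calls on the model's behalf.

```python
# Hypothetical placeholder for a model call; a real system would invoke an
# LLM API here and return either plain text or a structured tool request.
def call_llm(prompt: str) -> dict:
    if prompt.strip().lower().startswith("time"):
        return {"type": "tool_call", "tool": "clock", "arguments": {}}
    return {"type": "text", "content": f"echo: {prompt}"}

# Registry of tools the control loop can run on the model's behalf.
TOOLS = {
    "clock": lambda **kwargs: "2024-01-01T00:00:00Z",  # stub result
}

def repl() -> None:
    """Early-chatbot loop: read input, call the model, print the reply.
    The tool-call branch is where the surrounding control software starts
    to grow beyond a plain read-eval-print loop."""
    while True:
        user_input = input("> ")
        if user_input in {"quit", "exit"}:
            break
        reply = call_llm(user_input)
        if reply["type"] == "tool_call":
            # The framework, not the model, executes the tool; a fuller loop
            # would feed the result back into another model call.
            result = TOOLS[reply["tool"]](**reply["arguments"])
            print(f"[tool {reply['tool']}] {result}")
        else:
            print(reply["content"])

if __name__ == "__main__":
    repl()
```

Once context retrieval, multi-step tool use, and MCP-style extensions enter the picture, this loop accumulates state, scheduling, and policy decisions, which is precisely the complexity the post argues belongs in a virtual machine.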
Read more at blog.sigplan.org