Laracon 2024: Colin DeCarlo: Laravel and AI

Colin DeCarlo presented two AI-integrated applications that Vehikl recently built.

He gave some background on how LLMs work under the hood, how text is converted to embeddings, and how embeddings can be used in a Laravel app.
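
As a rough illustration of what that can look like in a Laravel app (my own sketch, not code from the talk, and assuming the openai-php/laravel package), you might embed a chunk of documentation and compare it against a query embedding with cosine similarity:

    use OpenAI\Laravel\Facades\OpenAI;

    // Turn a chunk of documentation into an embedding vector
    // (the model name here is an assumption).
    $response = OpenAI::embeddings()->create([
        'model' => 'text-embedding-3-small',
        'input' => 'Eloquent relationships let you define connections between models...',
    ]);

    $docVector = $response->embeddings[0]->embedding;

    // Cosine similarity between a query vector and a stored document vector;
    // the closest documents get stuffed into the prompt as context.
    function cosineSimilarity(array $a, array $b): float
    {
        $dot = $normA = $normB = 0.0;

        foreach ($a as $i => $value) {
            $dot   += $value * $b[$i];
            $normA += $value ** 2;
            $normB += $b[$i] ** 2;
        }

        return $dot / (sqrt($normA) * sqrt($normB));
    }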

Documentation Helper

ai.vehikl.com is a documentation helper that Vehikl built for Laracon 2023, trained on Laravel, Vue, and Winter documentation.

They configured the LLM queries with a temperature of 0 and instructed the model not to make up information when it doesn’t know the answer.
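
In terms of the OpenAI chat API, that configuration looks roughly like this (again my own sketch with openai-php/laravel; the model name and prompt wording are assumptions, not what was shown on stage):

    use OpenAI\Laravel\Facades\OpenAI;

    $response = OpenAI::chat()->create([
        'model' => 'gpt-4o',
        'temperature' => 0, // deterministic, least "creative" output
        'messages' => [
            [
                'role' => 'system',
                'content' => 'Answer only from the provided documentation. '
                    . 'If the answer is not in the documentation, say you do not know.',
            ],
            ['role' => 'user', 'content' => $question],
        ],
    ]);

    $answer = $response->choices[0]->message->content;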

Chatbot

They also recently created a chat app that provides custom tools for the AI to use (rough code sketch after the list):

  • The app integrates with a weather API
  • When handling a chat request about the weather for an activity this evening, the app passes the user’s message to ChatGPT, along with a list of custom tools the app provides
  • ChatGPT interprets the request, realizes that it needs a custom tool to get the weather data, and then responds to the app with the name of the tool and the parameters to pass to it
  • The app calls the weather service, then returns that data back to ChatGPT along with a request or correlation ID (I didn’t take good notes on this part)
  • ChatGPT interprets the weather data and then writes a more natural-language chat message to send back to the user
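
Pieced together from my notes, the round trip looks something like this (a sketch with openai-php/laravel; the tool name, weather lookup helper, and exact plumbing are my assumptions, and the tool_call_id is presumably the correlation ID mentioned above):

    use OpenAI\Laravel\Facades\OpenAI;

    $messages = [
        ['role' => 'user', 'content' => 'Is tonight a good evening for a bike ride?'],
    ];

    // First call: advertise the custom tool so the model can ask for it.
    $first = OpenAI::chat()->create([
        'model' => 'gpt-4o',
        'messages' => $messages,
        'tools' => [[
            'type' => 'function',
            'function' => [
                'name' => 'get_current_weather',
                'description' => 'Get the current weather for a city',
                'parameters' => [
                    'type' => 'object',
                    'properties' => ['city' => ['type' => 'string']],
                    'required' => ['city'],
                ],
            ],
        ]],
    ]);

    $toolCall = $first->choices[0]->message->toolCalls[0];
    $arguments = json_decode($toolCall->function->arguments, true);

    // The app calls its own weather integration (hypothetical helper).
    $weather = WeatherService::current($arguments['city']);

    // Second call: send back the assistant's tool call plus the tool result,
    // keyed by the tool call ID, so the model can write the reply.
    $messages[] = [
        'role' => 'assistant',
        'tool_calls' => [[
            'id' => $toolCall->id,
            'type' => 'function',
            'function' => [
                'name' => $toolCall->function->name,
                'arguments' => $toolCall->function->arguments,
            ],
        ]],
    ];
    $messages[] = [
        'role' => 'tool',
        'tool_call_id' => $toolCall->id,
        'content' => json_encode($weather),
    ];

    $final = OpenAI::chat()->create([
        'model' => 'gpt-4o',
        'messages' => $messages,
    ]);

    echo $final->choices[0]->message->content;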

I haven’t dived into LLMs much yet, but this approach of using custom tools seems like a pretty nice way of integrating other, more programmatic services along with AI tools.
