Building an AI agent | #110
LLMs have changed everything. Their practical use cases are simply insane, and they have a huge impact on individual productivity.
I believe that, more than building an agent, positioning it well is the critical part.
Let me give you an example of an agent I recently built to get quick information.
As software engineers, we like to automate things. Whenever I see something really manual, I ask myself whether it can be automated.
Recently, I found exactly such a problem, one where I felt an LLM could be put to its best use.
Reading lengthy failure logs is a pain. We can automate this with LLMs and generate simple insights from the failures. So I built and deployed a simple AI agent that generates those insights and delivers them as a Slack message.
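To make this concrete, here is a minimal sketch of what such an agent can look like, assuming an OpenAI-style chat completions API and a Slack incoming webhook. The log path, model name, and environment variable names are placeholders, not the exact setup from my agent.

```python
import os
import requests
from openai import OpenAI

# Placeholders -- adjust to your own environment.
LOG_PATH = "build_failure.log"                       # hypothetical failure log file
SLACK_WEBHOOK_URL = os.environ["SLACK_WEBHOOK_URL"]  # Slack incoming webhook URL

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def summarize_failure(log_text: str) -> str:
    """Ask the LLM for a short, actionable summary of the failure log."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {
                "role": "system",
                "content": "Summarize this failure log in three bullet points: "
                           "root cause, affected component, and suggested fix.",
            },
            {"role": "user", "content": log_text[:12000]},  # keep the prompt small
        ],
    )
    return response.choices[0].message.content


def post_to_slack(summary: str) -> None:
    """Deliver the insight to a Slack channel via an incoming webhook."""
    requests.post(SLACK_WEBHOOK_URL, json={"text": summary}, timeout=10)


if __name__ == "__main__":
    with open(LOG_PATH) as f:
        post_to_slack(summarize_failure(f.read()))
```

The same pattern works with any LLM provider; only the call inside `summarize_failure` changes.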
Deploying this can also be tricky.
But if you are familiar with Nginx, it becomes easy.
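One simple shape the deployment can take, as a sketch: run the agent as a small web service on your local machine, bound to 0.0.0.0 so that other machines on your network, and an Nginx reverse proxy in front of it, can reach it. Here I am assuming a Flask service; the endpoint path, port, and the `failure_agent` module name are hypothetical, carried over from the sketch above.

```python
from flask import Flask, request

# Hypothetical helpers from the sketch above; adjust the module name to your project.
from failure_agent import summarize_failure, post_to_slack

app = Flask(__name__)


@app.route("/failure-logs", methods=["POST"])
def handle_failure_log():
    """Receive a raw failure log (e.g. from a CI webhook), summarize it, notify Slack."""
    log_text = request.get_data(as_text=True)
    post_to_slack(summarize_failure(log_text))
    return {"status": "ok"}


if __name__ == "__main__":
    # Bind to 0.0.0.0 so the service is reachable from other machines on the network;
    # Nginx can then reverse-proxy port 80 to this port.
    app.run(host="0.0.0.0", port=8000)
```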
Here is a techletter that I previously wrote that can help:
How to make your local app accessible to your network? | #97
Deploying the app and making it accessible within your network is something interesting I did a few weeks back. It is something I learned out of frustration while trying to save on deployment costs. Because of that, I learned how you can deploy your app locally and turn your local machine into a server.