Leveraging AI to Automate SEO Metadata Generation for Web Content

I’ve been playing around with AI for the past few weeks, trying to test its limits, bypass security features, explore its use cases and see how it can be leveraged for everyday work (e.g. programming, information gathering, research and more). One thing I’ve realised is how hard it is to keep up with the insane amount of things being created with it: new research drops constantly and the tools evolve at a rapid pace.

I’ve also wanted to learn more about SEO for a while, using this blog as a playground. This seemed like the perfect use case: integrate AI into the deployment workflow for this website to generate SEO components such as page descriptions, post titles and more.

Unified interfaces for Large Language Models (LLM API providers)

I currently don’t have the hardware to run large language models locally, and since I wanted to learn how the models differ from one another, I settled on using openrouter.ai. It provides a single API interface to different model providers and gives you access to 383 different models (at the time of writing), including a number of free ones (great for testing).

openrouter.ai Available Models

To use the models via openrouter, you need an LLM frontend such as SillyTavern, Open WebUI or LibreChat. There are plenty out there, but these seem to be the most popular (other than the built-in web UIs). They are all constantly updated to support the newest bells and whistles like MCP servers and tools, chat history and more.

However, if you’re going to build things with AI, you’ll need a library to integrate models into your code (or build your own). Again, there are plenty of them out there and the landscape is changing constantly, but a few currently stand out. For this small project I picked Agno, a framework that looked very promising thanks to its integrations, great documentation and strong coding practices. The examples really sold me on it, and the ability to request structured output made this extremely easy.

Using the framework, it took very little time to write a program that takes in a file, reads it, sends the content to an agent and requests SEO metadata in return.

Using an AI agent to generate SEO optimised metadata

I use Hugo to convert my markdown content into a static website. Each page can contain metadata stored at the top of the file in the front matter format, which controls the output title, description, tags/keywords and more. This is what we’ll update with this tool before generating our site.
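For reference, a front matter block looks something like this (YAML variant; the values here are purely illustrative):

---
title: "Page title shown in search results"
description: "A short summary used as the meta description."
tags: ["keyword-one", "keyword-two"]
---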

This was the initial script I used, with really dumb prompts and no optimisations, nothing special honestly (I’ve even left the comments in to show you how little effort this required):

Python script using Agno to generate SEO metadata
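In outline, it looks something like this (a minimal sketch, not the exact script: the SEOPage fields and prompt wording are my reconstruction based on the output shown later; it assumes agno and pydantic are installed and OPENROUTER_API_KEY is set in the environment):

import sys
from pathlib import Path

from pydantic import BaseModel, Field
from agno.agent import Agent
from agno.models.openrouter import OpenRouter

# Structured output: the agent must return exactly these fields
class SEOPage(BaseModel):
    title: str = Field(..., description="SEO optimised page title")
    description: str = Field(..., description="Meta description for the page")
    tags: list[str] = Field(..., description="SEO keywords/tags for the page")

agent = Agent(
    # Swap models by changing the id, e.g.:
    # model=OpenRouter(id="openrouter/optimus-alpha"),
    model=OpenRouter(id="google/gemini-2.5-pro-exp-03-25:free"),
    description="You are an SEO specialist who generates SEO optimised metadata",
    response_model=SEOPage,
)

# Read the markdown file passed on the command line and ask for metadata
content = Path(sys.argv[1]).read_text()
result = agent.run(f"Generate SEO metadata for this page:\n\n{content}")
print(result.content)  # a validated SEOPage instance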

This worked surprisingly well, and with some small tweaks you can see the results:

fieldsgen.py example output

Depending on the model you’re using, inference takes more or less time, but for such a simple task (as long as you don’t use reasoning models) it’s fairly quick.

It’s also extremely easy to switch models, as you can see in the commented-out code above. Changing models produces different outputs and lets you identify the best one for this sort of task:

fieldsgen.py example output with different Models

Sometimes the models don’t cooperate and return output in the wrong format, which causes errors:

fieldsgen.py example error

Thankfully, by searching the Agno documentation (using their built-in AI search, since we’re on an AI theme anyway), we find an easy error-handling solution: retry on failure:

Agno Documentation answer with AI

Agno Documentation answer with AI response

It’s not super robust, but for our use case it’s good enough:

agent = Agent(
    model=OpenRouter(id="openrouter/optimus-alpha"),
    description="You are an SEO specialist who generates SEO optimised metadata",
    response_model=SEOPage,
    exponential_backoff=True,  # back off longer between each attempt
    retries=3,                 # retry up to 3 times on failure
)

It’s still not perfect: it sometimes returns weird keywords, and it occasionally returns the same keyword/tag more than once. But with more tweaking, additional code checks (a simple dedupe pass is sketched below) and finding the optimal model for this task, it will help generate SEO metadata for the site’s pages (especially pages I’m not too concerned about, like the bazaar pages).
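As an example of such a check, a small order-preserving, case-insensitive dedupe pass over the generated tags could look something like this (a sketch, not part of the current script):

def dedupe_tags(tags: list[str]) -> list[str]:
    """Remove duplicate tags, ignoring case, while keeping the original order."""
    seen: set[str] = set()
    unique: list[str] = []
    for tag in tags:
        key = tag.strip().lower()
        if key and key not in seen:
            seen.add(key)
            unique.append(tag.strip())
    return unique

# e.g. dedupe_tags(["SEO", "seo", "Hugo"]) -> ["SEO", "Hugo"]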

Outcome

As an example, here is the SEO description and tags generated for the file synchronisation cheatsheet in the bazaar (note the weird CLI parameter used as a tag “-avPh”):

File Synchronisation markdown file with generated metadata viewed in Obsidian

Here’s what it looks like on the front-end with the SEO metadata set in the HTML head element:

File Synchronisation rendered after Hugo generates the HTML page
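The generated front matter ends up rendered as standard tags in the head, along these lines (illustrative markup; the exact output depends on the theme’s templates):

<head>
  <title>Page title shown in search results</title>
  <meta name="description" content="A short summary used as the meta description.">
  <meta name="keywords" content="keyword-one, keyword-two">
</head>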

Honestly, I’m pleasantly surprised, considering this is a page with limited content.

Conclusion

Overall, there’s a lot of potential here and, in my opinion, some good use cases for AI. There’s still quite a bit of work to do to optimise everything, improve the error handling and make it generally more robust, but it’s a great outcome for a proof of concept.

Here’s the metadata it generated for this post:

~/websites/fyx.me > python3 seo-generator/fieldsgen.py content/minis/AI\ SEO\ generation.md 
[!] Model :      google/gemini-2.5-pro-exp-03-25:free
Title :  Leveraging AI to Automate SEO Metadata Generation for Web Content
Desc. :  Explore how AI, specifically LLMs accessed via platforms like openrouter.ai and frameworks like Agno, can automate SEO metadata generation for Hugo websites. Learn about the process, tools, challenges, and potential of integrating AI into your deployment workflow.
Tags  :  ['AI', 'Artificial Intelligence', 'SEO', 'Search Engine Optimization', 'Metadata Generation', 'LLM', 'Large Language Models', 'openrouter.ai', 'Agno', 'Python', 'Automation', 'Hugo', 'Static Site Generator', 'API Integration', 'Workflow Automation', 'Structured Output', 'SEO Tools']

Lastly, I think AI in general is a great tool for speeding up workflows, but as always it works even better when you validate and approve what it does. In this case, that means reviewing the metadata it generates and tweaking it where needed.