This week on ‘Works on My Machine,’ I demo Synonllm, a module I originally built as part of my RubyConf 2024 talk, Going Postel.
The initial goal was to make it easier to use AI-generated code ‘as-is’ by being flexible about what you get back from the LLM. Generated code often has slight inconsistencies, like being snake_cased instead of camelCased or using slightly wrong method names, but the intent is usually there. So, as Jon Postel suggested: be flexible in what you accept, as long as the intent is clear.
This module came to mind again because I was recently a guest on The Ruby AI Podcast, and we got onto the topic of leaning into the discomfort you can feel when working with LLMs as an experienced software engineer. They’re going to do things a lot differently from what you’re used to, but that doesn’t have to be a bad thing.
What It Does
Synonllm is a Ruby module that combines method_missing with an LLM to provide AI-powered synonym matching for method calls. Here’s the breakdown from the video:
Capture & Context: When you call a method that doesn’t exist on a class that includes Synonllm, it captures the called method name, its arguments, the source code of the target class, and the object’s available methods.
LLM Analysis: That information is sent to an LLM, which analyzes the intent behind your attempted method call against the available methods on the class.
Intelligent Suggestion: The LLM suggests the most semantically similar existing method.
Automatic Call: Synonllm automatically calls the correct method with the original arguments (a minimal sketch of the pattern follows below). You can also modify it easily to rely on the returned confidence score.
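To make the mechanics concrete, here’s a minimal sketch of the method_missing pattern Synonllm builds on. This is not the module’s actual code: ask_llm_for_synonym stands in for the real prompt and LLM call (stubbed with a naive character-overlap heuristic so the snippet runs without an API key), and FileHandler#read_first_lines is just an illustrative method name.

```ruby
# Minimal sketch (not Synonllm itself): catch unknown calls, ask something
# smarter which existing method was probably intended, then delegate to it.
module SynonymFallback
  def method_missing(name, *args, &block)
    candidates = self.class.instance_methods(false)
    suggestion = ask_llm_for_synonym(name, candidates)

    if suggestion
      public_send(suggestion, *args, &block)
    else
      super
    end
  end

  def respond_to_missing?(_name, _include_private = false)
    true
  end

  private

  # Stand-in for the real LLM round trip: Synonllm sends the method name,
  # arguments, class source, and available methods to the model. This stub
  # just picks the candidate sharing the most characters with the call.
  def ask_llm_for_synonym(name, candidates)
    candidates.max_by { |c| (c.to_s.chars & name.to_s.chars).size }
  end
end

class FileHandler
  include SynonymFallback

  # Hypothetical target method; the real demo class defines its own API.
  def read_first_lines(path, count = 10)
    File.readlines(path).first(count)
  end
end

f = FileHandler.new
# f.read("notes.txt", 4) has no exact match, so method_missing kicks in and
# the call gets routed as f.read_first_lines("notes.txt", 4).
```

The real module also sends the original arguments and the class’s source code along with the call, and gets a confidence score back that you can check, all of which this sketch leaves out.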
In the demo, you’ll see this work with a Date object (correctly mapping day_of_week to wday) and then with a FileHandler class. For the FileHandler, we use more natural, language-like calls (f.read file, 4 or even just f.read file without specifying the number of lines) and Synonllm figures out the correct, more verbose, underlying method to call.
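For reference, here’s what the Date half of that demo boils down to if Synonllm isn’t loaded: day_of_week isn’t a real method on Date, while wday (the method the LLM maps it to) is.

```ruby
require "date"

# Plain Ruby: the intent-level name raises, the real method works.
begin
  Date.today.day_of_week
rescue NoMethodError => e
  puts e.message # undefined method ... for an instance of Date
end

puts Date.today.wday # => 0 (Sunday) through 6 (Saturday)
```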
Why It Matters
While this might seem like a small convenience, or an unconventional use of AI, I think this hints at a few interesting ideas for where things could go in the future:
More Forgiving Interfaces: Ruby is already known for its developer-friendliness. Tools like Synonllm could push this even further, making it less critical to memorize exact API names, especially for quick scripts or when exploring new libraries. You can focus more on what you want to do, and let the AI help with the how.
Bridging AI-Generated Code: As we use more AI-generated code, we’ll encounter variations in naming conventions (snake_case vs camelCase, or slightly different phrasings for similar actions). Synonllm or something like it could act as an intelligent adapter smoothing out these differences without manual refactoring.
Experimenting with “Smarter” DSLs: Ruby is fantastic for creating Domain-Specific Languages (DSLs). Imagine building DSLs where the user can express their intent more naturally, and an LLM backend (like the one in Synonllm) translates that into the precise underlying operations; the sketch below hints at the shape. This could open up all sorts of possibilities, which we’ll be exploring over the next couple of weeks.
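As a purely hypothetical illustration of that last idea (none of these class, method, or operation names come from Synonllm), an intent-level DSL might accept free-form phrasing and translate it into concrete operations, with an LLM doing the lookup instead of the fixed table used here:

```ruby
# Hypothetical intent-level DSL: the caller says what they want, and a
# translation step (where an LLM would sit) maps it to known operations.
class ReportBuilder
  # In an LLM-backed version, this table would be replaced by a model call
  # that matches free-form intent against the operations the class exposes.
  OPERATIONS = {
    "show me sales by month" => :group_sales_by_month,
    "only include this year" => :filter_to_current_year
  }.freeze

  def initialize
    @steps = []
  end

  def want(intent)
    @steps << OPERATIONS.fetch(intent) { raise ArgumentError, "unknown intent: #{intent}" }
    self
  end

  def plan
    @steps
  end
end

builder = ReportBuilder.new
builder.want("show me sales by month").want("only include this year")
p builder.plan # => [:group_sales_by_month, :filter_to_current_year]
```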
What other things can we do with programming languages when the system understands your intent, not just exact syntax?
How To Get It
The code for synonllm is available on GitHub. You can find the main module and the example code from the video in the sublayerapp/synonllm_demo repository.
To try it out, you’ll need to clone the repository and have the environment variable GEMINI_API_KEY set to a valid API key. (Optionally, you can change the generator code to use OpenAI or Claude, setting OPENAI_API_KEY or ANTHROPIC_API_KEY respectively.)
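One small, optional convenience (not something the repo requires): a guard at the top of your script makes a missing key obvious right away instead of failing mid-call.

```ruby
# Fail fast if no provider key is exported; the demo defaults to Gemini.
unless ENV["GEMINI_API_KEY"] || ENV["OPENAI_API_KEY"] || ENV["ANTHROPIC_API_KEY"]
  abort "Set GEMINI_API_KEY (or OPENAI_API_KEY / ANTHROPIC_API_KEY) before running the demo."
end
```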
If this experiment sparks any thoughts or questions, or you end up using it, please let me know!