Finding Lemmas Fast: Why We Don't Use LLMs for Everything
Published on March 10, 2026
In the age of generative AI, it’s tempting to throw a Large Language Model (LLM) at every problem. Need a translation? Ask an LLM. Need to summarize a text? Ask an LLM. Need to find the dictionary form (lemma) of a conjugated Russian word? You could ask an LLM—but we don't.
The Need for Speed
When you type a word into the RURussian search bar, or when our system analyzes a sentence to break down its grammatical components, we need to instantly find the "normal form" or lemma of each word. For example, if you encounter the word "собаками" (with the dogs), the system needs to immediately recognize that the lemma is "собака" (dog).
We initially experimented with LLMs and external API lookups (such as Wiktionary's) for this task. However, the latency was unacceptable: waiting even a second or two for a remote service to return a lemma ruins the fluidity of features like autocomplete and instant context analysis. LLMs are simply too slow and computationally expensive for a task that needs to finish in milliseconds.
Our Solution: Pymorphy3
Instead of relying on heavy LLMs, we integrated Pymorphy3, a highly optimized morphological analyzer for the Russian language.
Pymorphy3 uses dictionary-based lookups and advanced heuristics to parse words entirely locally on our servers. When a word is passed through our system, Pymorphy3 instantly calculates the most probable lemmas along with their parts of speech and grammatical tags.
- Near-Zero Latency: Because it runs locally, with no network overhead or heavy neural-network inference, a lemma lookup completes in a fraction of a millisecond.
- High Accuracy: It scores different lemma possibilities based on corpus frequency, ensuring the most likely dictionary form is suggested first.
- Resource Efficient: It frees up our server resources, allowing us to reserve our powerful LLM integrations (like GPT-5) for tasks where they truly shine—like generating natural audio and contextual example sentences.
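Pymorphy3's internals are far more compact (DAWG-encoded dictionaries plus suffix heuristics for unknown words), but the frequency-scoring idea from the list above can be sketched in plain Python. The word table and counts below are invented for illustration, not real corpus data:

```python
# Toy inflected-form -> (lemma, corpus frequency) table.
# Real analyzers store millions of forms; these entries are made up.
FORMS = {
    # "стали" is ambiguous: noun "сталь" (steel) vs verb "стать" (to become)
    "стали": [("сталь", 120), ("стать", 4800)],
    "собаками": [("собака", 950)],
}

def lemmatize(word):
    """Return candidate lemmas with normalized scores, most likely first."""
    candidates = FORMS.get(word, [(word, 0)])  # fall back to the word itself
    total = sum(freq for _, freq in candidates) or 1
    return [(lemma, freq / total)
            for lemma, freq in sorted(candidates, key=lambda c: c[1], reverse=True)]

print(lemmatize("стали")[0][0])  # "стать" wins on (toy) corpus frequency
```

The point of the sketch is the ranking step: when a surface form maps to several dictionary entries, corpus frequency decides which lemma is shown first, with no model inference involved.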
The Right Tool for the Job
Building a responsive language learning platform is all about choosing the right tool for the job. While LLMs power our advanced generative features, classical NLP tools like Pymorphy3 remain the unsung heroes of our backend, ensuring that your learning experience is always snappy, reliable, and uninterrupted.