Show HN: I coded my own JSON translation tool to easily localize my side project

quicklang.app

43 points by jboschpons 12 days ago

Hi HN, I’m Joan, the developer of Quicklang. I made this app to easily translate and keep in sync all the localization JSON files for my side projects. While searching online for a similar tool, I only found enterprise solutions that don’t allow direct editing of JSON files.

Before building Quicklang, I used ChatGPT to translate the changes to my JSON translation files. However, I realized that ChatGPT only lets you input short content for translation (even if you provide a .json file), and each time I had to request translations one language at a time. So I decided to build an app that sends only the changes I’ve made to the OpenAI API and translates them into all the target languages for my side projects.
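
The diffing step described above can be sketched in a few lines: a tool only needs to find the keys whose values are new or changed since the last sync. This is a rough illustration of the idea, not Quicklang's actual code:

```python
def changed_keys(old: dict, new: dict, prefix: str = "") -> dict:
    """Collect keys whose values are new or modified, flattening
    nested objects into dotted paths like 'nav.about'."""
    delta = {}
    for key, value in new.items():
        path = prefix + key
        if isinstance(value, dict):
            previous = old.get(key) if isinstance(old.get(key), dict) else {}
            delta.update(changed_keys(previous, value, path + "."))
        elif old.get(key) != value:
            delta[path] = value
    return delta

old = {"hello": "Hola", "nav": {"home": "Inicio"}}
new = {"hello": "Hola", "nav": {"home": "Inicio", "about": "Sobre nosotros"}}
print(changed_keys(old, new))  # {'nav.about': 'Sobre nosotros'}
```

Only the resulting delta needs to be sent to the translation API, which keeps requests small and cheap.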

Technical details: I used Next.js to build the front end and backend, and I use a custom VPS (EC2 instance) on AWS to handle the translation process. This is because the translation can take several minutes, and Vercel Functions time out after 10 seconds by default (up to 60 seconds on the Hobby plan). Finally, I save the translation files in an S3 bucket.

What’s next? I want to add features like change history, the ability to pass context to the OpenAI API to make translations as accurate as possible, and maybe a public API so developers can use the tool programmatically.

Let me know your thoughts and feedback. It’s been a blast working on this so far, and I think it’s just neat :)

constantcrying 12 days ago

>Perfect for manage project localization

>This credits will never expire.

>So if you sync new o modified content

There are obvious mistakes in the English version of your website, which is totally bizarre to me, since these errors are absent in the German version. My guess is that the AI actually fixed these errors when translating.

I have to say that the German translation is really bad though.

"Habe nach einem Online-JSON-Editor gesucht, aber nur Enterprise-Tools gefunden, die nicht das bieten, was ich brauchte, also...

Ich habe mein eigenes JSON-Übersetzungstool erstellt. "

Dropping the "Ich" at the beginning of the first sentence makes it sound like total slang. And the transition between the paragraphs does not work in German (due to the verb being at a different position compared to English). It sounds extremely clunky.

"Prost!"

This just is not an appropriate translation.

"Hier ist was sie über Quicklang."

This isn't a complete sentence. It misses an essential component.

"Holen Quicklang"

Nonsensical translation.

"lokalisieren Sie Ihre SaaS-, App-, KI-Produkt- oder Website, um Ihnen das weltweite Versenden zu erleichtern."

I don't get what that means. The AI translated shipping as sending. It should have been "um Ihnen den weltweiten Vertrieb zu erleichtern", or something like that.

I usually hate nitpicking on stuff like this and I wouldn't have mentioned it if the product was anything else. But surely you can present your product in a better way.

  • footy 10 days ago

    The Spanish is better than the German but still super clunky. In particular, it jumps between the formal and the informal second person as if there were no difference, and the end result reads like it was written by committee (which I suppose it was) with no one doing any kind of QA.

    This is in spite of OP apparently being Spanish.

  • jaflo 12 days ago

    I agree, the German translation is pretty bad (phrases like "Hier ist, was sie über Quicklang.", "Es ist ein Durcheinander. Sie wissen Bescheid.", and "Du kannst all meine Programmierprojekte auf X" are incomplete or awkward).

    It also doesn't sound like this can handle dynamic phrases like "buy {{count}} items"?
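
    Tools usually handle dynamic phrases by masking placeholders before translation and restoring them afterwards; here is a rough sketch of that idea (not something Quicklang is confirmed to do):

```python
import re

PLACEHOLDER = re.compile(r"\{\{\s*\w+\s*\}\}")

def mask(text: str):
    """Swap {{placeholders}} for numbered tokens that a translation
    engine is unlikely to touch; return the masked text and originals."""
    found = PLACEHOLDER.findall(text)
    for i, ph in enumerate(found):
        text = text.replace(ph, f"__PH{i}__", 1)
    return text, found

def unmask(text: str, found) -> str:
    """Put the original placeholders back after translation."""
    for i, ph in enumerate(found):
        text = text.replace(f"__PH{i}__", ph)
    return text

masked, found = mask("buy {{count}} items")
print(masked)                 # buy __PH0__ items
print(unmask(masked, found))  # buy {{count}} items
```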

    • antaviana 12 days ago

      When you use machine translation, you must be aware that just because the text is no longer in the original language doesn't mean a speaker of the target language can actually understand it.

  • lurking_swe 12 days ago

    this is a perfect example of why AI prompting is important, even if people roll their eyes and tell you that using an LLM is “easy”. Clearly not.

    Pro tip: If you are using an LLM to do translations for you, give it CONTEXT so that the translations are not garbage. Don’t just say “translate x into german”. Explain in sentence or two what it’s translating and who the audience is. Many words are translated differently depending on the context of the conversation.
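
    That advice translates directly into how the request is assembled; a hypothetical sketch (the field names and wording are illustrative, not any particular product's prompt):

```python
import json

def build_prompt(entries: dict, source_lang: str, target_lang: str,
                 context: str, audience: str) -> str:
    """Build a translation request that carries context, instead of a
    bare 'translate x into german'."""
    return "\n".join([
        f"Translate the following UI strings from {source_lang} to {target_lang}.",
        f"Context: {context}",
        f"Audience: {audience}",
        "Keep {{placeholders}} exactly as they are. Reply with JSON only.",
        json.dumps(entries, ensure_ascii=False),
    ])

prompt = build_prompt(
    {"hero.title": "Ship your product worldwide"},
    "English", "German",
    "marketing copy on the landing page of a developer tool",
    "software developers",
)
print(prompt)
```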

    • numpad0 11 days ago

      That tip, by the way, isn't limited to LLMs. Blindfolded human translators given a single word to translate tend not to do a great job either.

      • hju22_-3 11 days ago

        For example, Breath of the Wild has a quest line featuring two NPCs sending love letters via a river, and you can help them get together or something. The only issue is that it's hella creepy in the English translation, while it's much less so in the original Japanese. I don't remember the specifics, but the problem was that the translators weren't given context for what they were translating.

        • numpad0 11 days ago

          It doesn't take examples that specific, or Japanese text, to make the point. "Walk" could refer to a speed, a physical walkway, an undesired mechanical behavior, some sort of activism, and so on. "Connect with $VAR" translates differently depending on what $VAR is: a human, a group of humans, a computer, a web service. The intention of a word or a few words divorced from the associated stimuli is simply indeterminate.

          In GP's defense, I suppose there is value in a butchered translation as a low-budget checkbox checker. LLMs do generate a lot of great lipsums.

ecjhdnc2025 12 days ago

I've built a fairly sophisticated web-based translation tool for an agency's client who demanded it.

I built it to their spec, for their CMS. They didn't use it, because the scheme they wanted turned out to be too much bother, and they couldn't find translation agencies who would bother with it.

The reality is that if you want to make it easy to keep translations up-to-date, you actually have to support all the (confusing, frustrating) translation infrastructure built around .PO files. Because then you have the support of translation agencies, tooling, even Crowdin etc.

Trying to short-circuit this with clever minimal bespoke JSON and ChatGPT is probably a mistake: this is a job where you will ultimately want actual people with actual multilingual ability working for you, and if you don't use the normal tooling you'll find it difficult to attract contributors even with open source.

  • cyanydeez 12 days ago

    Unless, of course, you want to become the next Uber: undercut businesses for decades to capture the market, then steadily raise your prices as you drive people out of work.

  • samuelstros 12 days ago

    > translation infrastructure built around .PO files

    not only .po files :D XML, i18next json, localizable strings (iOS), ... the list goes on.

samuelstros 12 days ago

disclaimer: i'm the founder of https://inlang.com/

monetizing your solution will likely be a dead end.

the value prop of your solution doesn't match the app you built, or what buyers pay for. for example, machine translating translation files is easier for you to build and for developers to use as a cli [0] instead of a web app. there is no value in rendering json in a web app. vscode does a better job at rendering json files.

you could monetize via a web app if you allow non-devs to edit translations. but that's a beast called CAT editor [1], where you need to support all sorts of different file formats. aka, the value of a CAT editor is the file support and ecosystem around it, not the editor itself.

[0] https://inlang.com/m/2qj2w8pu/app-inlang-cli#machine-transla...

[1] https://inlang.com/m/tdozzpar/app-inlang-finkLocalizationEdi...

jvanveen 12 days ago

Interesting! How about using Deepl as a translation backend? I got some good results for translation strings with {{placeholders}} that needed to be ignored in the translations. Its api also has some neat features like formality, glossaries and context(experimental).

Ps. I'm working on a similar opensource tool to speed up the i18n process ( https://codeberg.org/garage44/expressio)

  • jboschpons 12 days ago

    Hey!

    I use DeepL for real-time translation in my side project. The reason I’m not utilizing the DeepL API on my backend and instead opting for the OpenAI API is twofold:

    1. The OpenAI API is more cost-effective than the DeepL API.

    2. I was already using ChatGPT to translate my JSON files, so there’s no change in translation quality for me.

    The primary issue I wanted to resolve is easily keeping all my destination language JSON files in sync.

    Cheers!
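
    Keeping destination files in sync then comes down to writing the translated delta back into each target-language file; a minimal sketch, assuming dotted-path keys like "nav.about" (illustrative, not Quicklang's code):

```python
def merge_translations(target: dict, translated: dict) -> dict:
    """Write translated values into the destination file in place,
    expanding dotted paths such as 'nav.about' into nested objects."""
    for path, value in translated.items():
        node = target
        *parents, leaf = path.split(".")
        for key in parents:
            node = node.setdefault(key, {})
        node[leaf] = value
    return target

english = {"hello": "Hello"}
merge_translations(english, {"nav.about": "About us"})
print(english)  # {'hello': 'Hello', 'nav': {'about': 'About us'}}
```

Running this once per target language after each translation pass keeps every file's key set matching the source file.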

    • jvanveen 12 days ago

      Cool, nice product. Interesting to see that you've managed to get translations working on top of ChatGPT, even for languages like Arabic! I decided not to support target-language edits, because I felt it would make it much more difficult to keep languages in sync when the source string isn't always leading. How do you deal with this properly? I'd love to have an automatic workflow that updates the JSON source file as i18n tags are added/removed from frontend files. For versioning, maybe just take advantage of the project's (git) versioning?

bccdee 12 days ago

There's a much easier way to translate json

    cat english.json | sed 's/"\([^"]*\)"/«\1»/g' > french.json

  • jboschpons 12 days ago

    Hey!

    Good approach, but I want to translate the strings in the JSON file from a source language (Spanish in my case) into any destination language.

    So the JSON may be:

    { "hello": "Hola" }

    If the destination language is "English" the result must be:

    { "hello": "Hello" }

umvi 12 days ago

I'd use this to do a quick and dirty localization, but quality localization is hard and expensive even for skilled humans. For example, see the localization of Super Mario RPG for the SNES, where a bunch of idioms and references to other games and anime went over the localizers' heads and became nonsensical phrases in English ("キイーッウキイーーッ! あの時の赤んぼう!?", a reference to Yoshi's Island, became "That's...my child?", which makes no sense).

LtWorf 12 days ago

Ever heard of .po files?

tracker1 11 days ago

Very cool, I've done something similar in the past. One point of contention: I like YAML more than JSON for this kind of effort, as multi-line text is much easier to deal with. I'd also generate TypeScript definitions from the YAML/JSON for the default values, which has tended to help on the dev side (at least for JS/web projects).
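
Generating those TypeScript definitions is mostly a walk over the message file; a rough sketch that emits an interface from a JSON object (the interface name and structure are illustrative):

```python
def to_interface_lines(obj: dict, indent: int = 2, level: int = 1) -> list:
    """Mirror a message object as TypeScript interface members, so
    lookups against the default locale can be type-checked."""
    pad = " " * (indent * level)
    lines = []
    for key, value in obj.items():
        if isinstance(value, dict):
            lines.append(f"{pad}{key}: {{")
            lines.extend(to_interface_lines(value, indent, level + 1))
            lines.append(f"{pad}}};")
        else:
            lines.append(f"{pad}{key}: string;")
    return lines

messages = {"hello": "Hello", "nav": {"about": "About us"}}
dts = "export interface Messages {\n" + "\n".join(to_interface_lines(messages)) + "\n}"
print(dts)
```

Regenerating this file on every sync means a removed or renamed key fails the TypeScript build instead of silently rendering a missing string.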

strawhatdev 12 days ago

Looks really cool, minor typo on the landing page:

Strunggle with i18n -> Struggle with i18n ^

JanSt 12 days ago

have you thought about automatically generating json language files from pure html? Say I have an app that does not yet use a language file but just code like

<strong>Hello World</strong>

And what about on-the-fly translation? Say I have a backend that returns English, but I need it translated into another language on the fly. It could check whether a translation is available and otherwise generate and store it. The original text key could be a hash of the text, and you'd probably need an in-memory lookup for those hashes.
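
The hash-keyed lookup described above can be sketched like this (the `translate` callable is a stand-in for a real backend call; the scheme, not this code, is what's being proposed):

```python
import hashlib

class TranslationCache:
    """Translate-on-first-sight cache keyed by a hash of the source
    text plus the target language."""
    def __init__(self, translate):
        self.store = {}
        self.translate = translate  # (text, lang) -> translated text

    def get(self, text: str, lang: str) -> str:
        key = (hashlib.sha256(text.encode("utf-8")).hexdigest(), lang)
        if key not in self.store:            # first request: generate and store
            self.store[key] = self.translate(text, lang)
        return self.store[key]               # later requests: served from memory

cache = TranslationCache(lambda text, lang: f"[{lang}] {text}")
print(cache.get("Hello World", "de"))  # translated once
cache.get("Hello World", "de")         # second call hits the cache
print(len(cache.store))                # 1
```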

  • jvanveen 12 days ago

    "you have 3 items in your cart" would in that case create a translation for each time the amount changes, instead of having "you have {{count}} items in your cart. A backend proxy sounds like an interesting idea, but working with an i18n formatted string from the start seems like a more sustainable approach.

itake 11 days ago

plugging my own project:

https://github.com/KevinColemanInc/i18n-translate-go

Works well for i18njs. The issues I ran into were that ChatGPT didn't translate all the keys in a batch (especially for less common languages like Lao) and sometimes output invalid Unicode (see the error in the README).

  • jboschpons 11 days ago

    Hey! I ran into the same issues. To fix the invalid output, you can tell ChatGPT that you want to receive the response as JSON-formatted text.

    • itake 10 days ago

      I think that's what I did: I had it use the functions API to pass the translations, but it still wouldn't output everything for the smaller languages.

      I was a bit lazy, so I just inform the user to re-run it manually instead of automatically handling it.
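
      The manual re-run can be automated by checking which keys came back and re-requesting the rest; a sketch with a stubbed model call (`call_model` stands in for the real API request):

```python
import json

def translate_with_retry(payload: dict, call_model, max_tries: int = 3):
    """Keep requesting until every key is translated or tries run out.
    call_model takes a dict of remaining entries and returns a JSON string."""
    remaining = dict(payload)
    result = {}
    for _ in range(max_tries):
        if not remaining:
            break
        try:
            chunk = json.loads(call_model(remaining))
        except json.JSONDecodeError:
            continue  # invalid output: retry the whole remaining batch
        result.update({k: v for k, v in chunk.items() if k in remaining})
        remaining = {k: v for k, v in payload.items() if k not in result}
    return result, remaining
```

Returning the still-missing keys lets the caller decide whether to warn the user or give up, instead of silently shipping a partial file.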

aleksjess 11 days ago

Ok, so I have my translation, and I am not quite sure what to do now... Do I just copy-paste into my `messages.json`?

pmontra 12 days ago

That could be great for a customer of mine. They have their API documentation on Postman and want to translate it into other languages. Postman doesn't seem to support multiple languages, so we would have to either manually manage multiple versions of the documentation (though managing the examples could be a nightmare) or export as JSON, identify and update the changes, and re-import. This service could automate the middle part of that process.

jboschpons 12 days ago

Hey HN.

I got tired of manually copy/pasting translations from ChatGPT every time I updated my main-language JSON file. So I built my own alternative.

Joan

chairmanmow 12 days ago

If this is for your side project and you're just using machine translation anyway, and this saves you steps, go for it. But when it comes down to an actual product, as a consumer, most people would rather buy one that's been proofread by someone for the language/region they use it in; the quality can only improve. I'm not against bootstrapping translations with AI, but translations are more than strings that serve functions: they are human language that evokes emotion and associations, so proofreading matters to me in something I might have to pay for. Human language is super important, and most of us can really only think and feel in a single one. If my language comes across wrong here, it will undermine my point.

I don't mean to be critical; this sounds like something I could actually use occasionally. I sometimes feel like I'm the only one putting up this common-sense fight: if designers can spend hours crafting something carefully worded and tailored in English, why would it make sense to just auto-translate it, and only discover it's wrong when it surfaces in the market as an awkward in-app experience where the words make so little sense that someone complains? My only gripe is people thinking AI is a total substitute for localization. It's not a silver bullet, but it sure is better than nothing.

maxpr 11 days ago

Disclaimer: I'm the tech co-founder at https://replexica.com.

I like the idea.

At Replexica we've basically built a better + much faster (+ sometimes cheaper) alternative to Lokalise, Phrase, and Crowdin (we help dev teams do AI translations of user interfaces - web, mobile, Apple Vision Pro, etc.). So having seen some things, I must say AI-powered localization is indeed the future, but it's very, very hard to get it right.

For example, it took us a while to perfect the quality. Working with the "industry standard" scores (BLEU, etc.) isn't easy, and the state of the machine translation industry feels very last-century, so you oftentimes have to invent things by studying the latest research.

It's a constant quest to ensure the user gets the best possible result. Not to mention, different LLMs perform differently on different language pairs, which adds an extra challenge to maintaining accuracy while iterating. For example, we had to build an internal regression-testing setup to make sure the quality only improves as we ship.

Nevertheless, good luck. I will be keeping an eye on your progress.

BTW, loving your domain name.

EDIT: typos

zxilly 12 days ago

Honestly, I don't see the reason to pay $10/month just to translate a JSON file. Or is this just another OpenAI API caller startup? "In any case, we must use LLMs?"

  • jboschpons 12 days ago

    Hey!

    Thank you for your feedback! Quicklang is designed as a tool to synchronize and manage your JSON translation files centrally. For me, it’s good to have quick access for localizing my side projects efficiently and maintaining a changelog, among other features.

    While there are several enterprise options available such as DeepL or Lokalise, they don’t quite fit my specific use case. ;)