This post is about a project for the OSD600 course at Seneca College.
It describes the project and chronicles my process and progress in building the tool.
Description
The project's name is code-mage. The goal of this project is to build a CLI (Command Line Interface) tool that translates a file written in one programming language into another, using an LLM (Large Language Model) API.
Release 0.1
Features
- Supported languages: JavaScript, Python, C++, Java
- The default target language is Python
- Supported LLM models: openrouter, groq
- The default LLM model is openrouter (sao10k/l3-euryale-70b)
Usage
poetry run codemage <source_file>
Examples
You can try the tool with the included example files as follows:
poetry run codemage ./example/test.js -t python -o result
poetry run codemage ./example/test.js ./example/sample.js --target c++
poetry run codemage ./example/test.js -m groq -o result -t
- Detailed installation and usage instructions are available on GitHub: CodeMage
Work Progress
Sep 10
I've started my project for OSD600. It is a tool, written in Python, that translates a source file from one programming language into another. The translation is done by a Large Language Model (such as ChatGPT).
So far, I have implemented the part that reads the arguments and options the user enters when running the tool, and that reads the file (I checked that it is read properly by printing it to the console).
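A minimal sketch of what that argument handling might look like with argparse (the option names come from the examples in this post; the exact parser structure is illustrative, not the project's actual code):

```python
import argparse

# Build a parser matching the options described in this post.
parser = argparse.ArgumentParser(prog="translator")
parser.add_argument("source_file", help="file to translate")
parser.add_argument("-t", "--target", default="python", help="target language")
parser.add_argument("-o", "--output", help="output file name")

# Simulate a command-line invocation by passing the arguments directly.
args = parser.parse_args(["test.js", "-t", "python", "-o", "result"])
print(args.source_file, args.target, args.output)  # prints: test.js python result
```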
Currently, to run this tool in a terminal, the user needs to use this command:
python3 translator.py source_file -t python
I personally find it ugly. Instead, what I want is to run it with the following command:
translator source_file -t python -o result
From three hours of research, I learned that I should use Poetry (pyproject.toml) to do this. Alternatively, I could create a script in a bin folder connected to the Python file with a symlink (so that changes are picked up continuously).
How to run it without python3 and .py (generated by ChatGPT)
- Add Shebang: Ensure the script starts with #!/usr/bin/env python3.
- Move Script: Place the script in a directory like ~/bin and rename it to remove .py.
- Make Executable: Use chmod +x to make the script executable.
- Update PATH: Add the directory to your PATH environment variable.
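The steps above can be sketched as a shell session. (This uses a tiny stand-in script rather than the real translator.py, just to demonstrate the mechanism.)

```shell
mkdir -p "$HOME/bin"

# Create a stand-in script whose first line is the shebang.
printf '#!/usr/bin/env python3\nprint("hello from translator")\n' > "$HOME/bin/translator"

# Make it executable and put its directory on PATH.
chmod +x "$HOME/bin/translator"
export PATH="$HOME/bin:$PATH"

# Now it runs without `python3` or the `.py` extension.
translator
```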
However, it wouldn't work for other users, since it is set up locally. As a result, I need to use Poetry.
The way it works is that Poetry creates a virtual environment and generates a script inside it that runs the Python script (the tool), so that I can run the tool without specifying "python3".
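The piece of Poetry that makes this work is a scripts entry in pyproject.toml mapping a command name to a function. A sketch (the module path and function name here are my assumption, not necessarily the project's actual layout):

```toml
# pyproject.toml
[tool.poetry.scripts]
# command name = "package.module:function" to call when the command runs
codemage = "code_mage.main:main"
```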
Since it is new to me and quite complicated, I need to study more about it.
There is a lot to do for release 0.1:
- Using Poetry
- Integrating LLM API into the tool
Sep 11
While I was trying to use Poetry, for some reason things got messed up and didn't work well, so I deleted the whole project and recreated it from scratch, keeping the core logic of the previous code. I also changed the name to code-mage.
I started the new project with Poetry, copied the core code into it, and integrated the LLM API into the tool, so it now translates the source file into a target language.
It works; however, it sometimes adds unnecessary explanations or extra parts. Even though I send a system prompt like "Only display the code without anything else", it still includes some unnecessary content.
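One workaround I might try is post-processing the model's reply to keep only the code, rather than relying on the system prompt alone. A minimal sketch (the helper name and regex are illustrative, not part of the project yet):

```python
import re

def extract_code(reply: str) -> str:
    """Return only the code from an LLM reply, dropping surrounding prose."""
    # If the reply contains a Markdown fenced block, keep just its contents.
    match = re.search(r"```[\w+-]*\n(.*?)```", reply, re.DOTALL)
    if match:
        return match.group(1)
    return reply  # no fence found: assume the reply is already bare code

reply = "Here is the translation:\n```python\nprint('hi')\n```\nHope it helps!"
print(extract_code(reply))  # prints: print('hi')
```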
It only works for some core functions and options, so I still have a lot to do. I can use the tool as follows:
poetry run codemage <file>
However, I still haven't figured out how to run it like:
codemage <file>
I think that in order to do so, I need to build and package the whole project and make it available with pip or pipx. I don't know how to do this yet, so I need to study it over the weekend.
For Lab 1, I am going to review Anh Chien Vu's project, and he will review my code. I had a quick look at his project, and I think there are some things I can help with or suggest.
Sep 13
In today's lab class, I saw a demo by one amazing student. He also implemented his tool in Python, and it was better than mine in every single way. I contacted him on Slack, and I am going to ask him how to package the tool and how to run it without the poetry run command.
Today, I reviewed my code and addressed the issues that other classmates left in my repo. I refactored and groomed my code. There are still many things to improve and fix; I will work on them after this weekend (I currently have too many assignments...).
Sep 20
I received a pull request from Arina Kolodeznikova. I checked her code, and it was really clean and similar to my coding style. But there was one small issue with printing the token usage to stderr, so I asked her to fix it. She then updated the pull request, and I merged it into my repo.
I have almost finished Release 0.1. However, there is one thing that is slightly different from the specification: my tool writes the result to a file by default (the specification says it should go to stdout), and -o or --output <fileName> is for renaming the output file. So I am still deciding whether to change it.
Lastly, I added a new feature, -m, --model, for selecting an LLM API model. Now the user can choose the model; the default is openrouter.
Reflection
This project was very tough for me. Using all those libraries with Python (a programming language I am not very familiar or comfortable with) was not easy, and neither was using Git. However, I learned a lot from this two-week experience. Seeing how great my classmates' programming skills are was frustrating but also motivating. It made me want to be able to build such cool tools, and I am looking forward to improving my tool in the next releases.