Starting an Open Source Project - OSD#2

Bregwin Jogi

If you remember from my previous blog post, I was researching which open-source project to contribute to. Along with that, I was also asked to develop an open-source, Gen AI-based console application that modifies code in some way. The interesting thing is that each of us in the course (OSD600) will be partnering up with one or more students to review each other's code for that application.

In most other courses, checking out another student's code would get us failed for plagiarism, but because this course is specifically about open source development, reviewing and collaborating with others on the project is encouraged. Specifically, we were asked to test our partner's application and look for problems. Any problems we found were to be filed as Issues on the application's GitHub repo.

The project I created is called RefactorCode, a terminal app that improves your code. It removes unreachable or commented-out code, fixes common bugs, adds helpful comments, and splits large functions for modularity. It pretty much tries to help out wherever it can, and it uses the free Gemini API for inference.
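To give a rough idea of what the Gemini integration looks like, here is a minimal sketch in Python. The model name, prompt wording, and helper function are my own illustration rather than the exact code in RefactorCode:

```python
# Rough sketch of a Gemini-based refactoring call (not the exact RefactorCode code).
import google.generativeai as genai

genai.configure(api_key="YOUR_GEMINI_API_KEY")  # free-tier key from Google AI Studio
model = genai.GenerativeModel("gemini-1.5-flash")  # model name is an assumption

def refactor(source_code: str) -> str:
    """Ask the model to clean up a snippet and return the refactored code."""
    prompt = (
        "Refactor the following code: remove unreachable or commented-out code, "
        "fix obvious bugs, add helpful comments, and split large functions.\n\n"
        + source_code
    )
    response = model.generate_content(prompt)
    return response.text
```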

This week, I partnered with Kannav Sethi on his project, DialectMorph, a tool for converting code files written in one language into another. The good thing about testing someone else's code is that you notice things that could be improved in your own. I realized that my README could be clearer about the app's features and updated it. Similarly, when writing setup instructions, we sometimes forget intermediate steps the application needs. This caused some problems on my end while testing, so I created an issue for it. Kannav quickly responded and worked on fixing it, and I was able to continue running the application.

I also made some small suggestions to improve the user experience, such as renaming one of the options to something clearer for the end user, and pointed out a few problems with running the code, which were quickly fixed.

Similarly, Kannav helpfully pointed out some issues in my application. It was great working with someone else, because a second pair of eyes really helps surface issues I hadn't noticed. For example, I didn't realize that the loading spinner kept running even after the code had been refactored and all the processing was complete, so I updated the code to fix that.
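The fix is essentially to tie the spinner's lifetime to the work itself. Here is a rough sketch of that pattern, assuming a spinner library like rich, which may differ from what RefactorCode actually uses:

```python
# Sketch: a context manager guarantees the spinner stops when the work is done,
# so it can't keep spinning after everything has finished.
import time
from rich.console import Console

console = Console()

with console.status("Refactoring your code..."):
    time.sleep(2)  # stand-in for the actual refactoring work
# The spinner stops automatically when the block exits.
console.print("Done!")
```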

Similarly, some sections of the code were throwing errors directly while other areas wrote them to stderr, so I updated the error handling to be more consistent.
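In practice, that consistency comes down to routing every error message through one place. Something along these lines (a sketch, not the exact helper in the app):

```python
# Sketch of a single error-reporting helper so nothing ends up on stdout.
import sys

def report_error(message: str) -> None:
    print(f"Error: {message}", file=sys.stderr)
```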

There is also another issue about printing the model names dynamically instead of hard-coding them. It is a good issue to fix, but I don't have the time for it right now. I am hoping to fix it before releasing version 0.1 of the application.
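When I do get to it, the change will probably look something like this sketch, which assumes the google-generativeai client's model-listing call:

```python
# Sketch: list available Gemini models at runtime instead of hard-coding them.
import google.generativeai as genai

genai.configure(api_key="YOUR_GEMINI_API_KEY")

for m in genai.list_models():
    # Only show models that can actually generate text.
    if "generateContent" in m.supported_generation_methods:
        print(m.name)
```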

Overall, I would say it was a great experience and I learned a lot from it. When reviewing other people's work, I learned to focus on the code itself and to phrase feedback so the developer doesn't feel disheartened by criticism. Doing this peer review, and putting myself in their shoes, helped me understand that.

I am excited for what's to come in upcoming labs. Feel free to check out our work and give some feedback on it!
