When I embarked on creating an AI-powered chef assistant, I wanted to solve a real need instead of building another generic chatbot. The idea was simple yet practical: a web app that helps users prepare meals based on the ingredients they have.
The Concept
Users enter the dish they want to prepare. The app, leveraging AI, asks follow-up questions to refine the dish type. It then lists all required ingredients, allowing users to select which ones they have. If the user doesn't have enough ingredients, the app either suggests a different dish or tells them the minimal additional ingredients needed to make the original dish.
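To make the ingredient step concrete, here is a minimal sketch of the comparison at the heart of the idea; the function name and the example ingredient sets are mine for illustration, not code from the project.

```python
# A minimal sketch of the ingredient check described above.
# The function name and the data shapes are illustrative only.

def missing_ingredients(required: set[str], available: set[str]) -> set[str]:
    """Return the ingredients the user still needs for the chosen dish."""
    return required - available


required = {"spaghetti", "eggs", "pecorino", "guanciale", "black pepper"}
available = {"spaghetti", "eggs", "black pepper"}

missing = missing_ingredients(required, available)
if missing:
    # The app would either suggest these additions or propose a different dish.
    print(f"Still needed: {', '.join(sorted(missing))}")
else:
    print("You have everything you need!")
```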
Starting the Backend
I began by outlining the basic features and setting up the backend. Using Django and PostgreSQL, I created the project, set up the database, and defined the models. The setup process for PostgreSQL was straightforward:
Install PostgreSQL: Download and install the latest version from the official PostgreSQL website.
Create a Database: Use the createdb command or pgAdmin to create a new database for the project.
Connect Django to PostgreSQL: Update Django's settings.py with the database credentials, ensuring the ENGINE is set to 'django.db.backends.postgresql'.
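For reference, a settings.py snippet along these lines is enough to point Django at PostgreSQL. The database name, user, and password below are placeholders, not the project's real credentials.

```python
# settings.py (excerpt) - placeholder credentials for illustration only.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "chef_assistant",
        "USER": "chef_user",
        "PASSWORD": "change-me",
        "HOST": "localhost",
        "PORT": "5432",
    }
}
```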
Choosing the Right AI Model
The real challenge was selecting an AI model to run locally on my PC. I opted for Llama 3.1, the latest model available at the time, and started with the 8B version to match my hardware. After downloading the model (4.7 GB), I tested it with a simple "hi" query. The response time was painfully slow, and my PC became unresponsive.
I then tried stablelm-zephyr, which promised to be lightweight, and hit the same issues, followed by a few other models from Hugging Face; the performance still wasn't there. Finally, I switched to Llama 3.1 on GroqCloud, which returned responses in under a second and saved the day.
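For anyone curious, a call to Llama 3.1 on GroqCloud looks roughly like this with the official groq Python client. The model ID, system prompt, and environment variable here are assumptions for illustration rather than the app's actual code.

```python
import os
from groq import Groq

# Assumes a GroqCloud API key is available in the environment.
client = Groq(api_key=os.environ["GROQ_API_KEY"])

completion = client.chat.completions.create(
    model="llama-3.1-8b-instant",  # illustrative model ID
    messages=[
        {"role": "system", "content": "You are a helpful chef assistant."},
        {"role": "user", "content": "What do I need for spaghetti carbonara?"},
    ],
)
print(completion.choices[0].message.content)
```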
Lessons Learned
Assess Hardware Capabilities: Ensure your hardware can handle the AI model you're planning to use. Overloading your system can lead to severe performance issues.
Start Small: Begin with lighter models and scale up as needed. This helps prevent unnecessary strain on your system.
Cloud Solutions: Sometimes, using cloud-based solutions like GroqCloud can save you time and resources.
Next Steps
The next step involves connecting user inputs to the AI model and storing the responses in the database. The data will be categorized and saved under columns like User Queries, Selected Ingredients, Suggested Recipes, and Missing Ingredients.
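A first pass at the Django models for that data might look like the sketch below; the model name, field names, and use of JSONField are my assumptions rather than the final schema.

```python
# models.py - one possible shape for the data described above.
from django.db import models


class RecipeSession(models.Model):
    user_query = models.TextField()                          # User Queries
    selected_ingredients = models.JSONField(default=list)    # Selected Ingredients
    suggested_recipes = models.JSONField(default=list)       # Suggested Recipes
    missing_ingredients = models.JSONField(default=list)     # Missing Ingredients
    created_at = models.DateTimeField(auto_now_add=True)

    def __str__(self):
        return self.user_query[:50]
```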
What AI challenges have you faced, and how did you overcome them? Share your experiences!