ITAPIA is an intelligent stock investment assistant platform, built around the core philosophies of transparency and explainability (XAI). The project does not just provide recommendations; it empowers users to understand the "why" behind each decision.
- The Problem: The current market is flooded with "black box" investment tools that issue cryptic buy/sell signals, eroding trust and turning investing into a game of chance.
- Our Solution: We built a "glass box". By combining traditional AI/ML models with a powerful Rule Engine, every piece of advice is traced back to the "evidence" and "rules" that triggered it, giving users full control and confidence.
- Hybrid AI Architecture: Combines the strengths of traditional Machine Learning models (Forecasting), Natural Language Processing (NLP), and a symbolic Rule Engine for reasoning.
- Explainable Recommendations (XAI): Every piece of advice on Decisions, Risks, and Opportunities is accompanied by a list of triggered rules as evidence.
- Evolvable Rule Engine: Built on a foundation of Symbolic Expression Trees, ready for the application of genetic algorithms to automatically discover new strategies.
- Personalized Investment Profiles: Allows users to create and experiment with multiple investment "personas," each with unique parameters for risk appetite, goals, and experience.
- Modern Full-Stack System: Fully built with a Backend (Python, FastAPI, Docker) and Frontend (Vue.js, TypeScript, Vuetify), delivering a smooth and professional user experience.
ITAPIA is designed with a microservices architecture, ensuring modularity, scalability, and maintainability.
Dive deeper into our system design in the Architecture Documentation.
Prerequisites:
- Git
- Docker & Docker Compose
- Python (3.11 recommended)
- npm
- OpenSSL
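A quick way to confirm these tools are installed before you start:

```bash
# Verify the prerequisite tools are available
git --version
docker --version
docker-compose --version
python3 --version   # 3.11 recommended
npm --version
openssl version
```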
# Clone the repository
git clone https://github.com/your-username/itapia.git
cd itapia

Before running the project, you need to prepare the following secrets:
a. Get Kaggle API Key:
- Log in to Kaggle.
- Go to your account page (click your avatar -> Account).
- In the "API" section, click "Create New API Token".
- A `kaggle.json` file will be downloaded. Open it; you will need the `username` and `key` values.
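The downloaded token is a small JSON file with exactly those two fields. For example (assuming your browser saved it to `~/Downloads`; the values shown are illustrative):

```bash
# Inspect the downloaded Kaggle token
cat ~/Downloads/kaggle.json
# {"username":"your-kaggle-username","key":"xxxxxxxxxxxxxxxxxxxx"}
```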
b. Get Google OAuth 2.0 Credentials:
- Go to the Google Cloud Console and create a new project.
- Navigate to APIs & Services -> OAuth consent screen, select External, and fill in the required application details. Add the `.../auth/userinfo.email` and `.../auth/userinfo.profile` scopes.
- Go to Credentials, click + CREATE CREDENTIALS -> OAuth client ID.
- Select Web application and configure the following:
  - Authorized JavaScript origins: `http://localhost:3000`
  - Authorized redirect URIs: `http://localhost:8000/api/v1/auth/google/callback`
- After creation, copy the Client ID and Client Secret.
c. Generate JWT Secret Key:
- Open your terminal and run the following command:
openssl rand -hex 32
- Copy the generated random string.
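Once the `.env` files exist (step d below), you could also generate the key and append it in one step. Note that `JWT_SECRET_KEY` is an assumed variable name here; check your `.env.template` for the actual one:

```bash
# Generate a 32-byte hex key and append it to the backend .env
# NOTE: JWT_SECRET_KEY is an assumed name; match the name in .env.template
echo "JWT_SECRET_KEY=$(openssl rand -hex 32)" >> ./backend/.env
```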
d. Configure .env files:
Copy the configuration files from the provided templates and fill in the necessary values.
# Backend
cp ./backend/.env.template ./backend/.env
# Frontend
cp ./frontend/.env.template ./frontend/.env
# Entire
cp ./.env.template ./.env
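The authoritative variable names are defined in the templates themselves; as a rough sketch (the names below are assumptions, not the exact list), the secrets gathered above map in like this:

```bash
# Sketch of backend .env entries -- check .env.template for the real names
# KAGGLE_USERNAME=<username from kaggle.json>
# KAGGLE_KEY=<key from kaggle.json>
# GOOGLE_CLIENT_ID=<Client ID from Google Cloud Console>
# GOOGLE_CLIENT_SECRET=<Client Secret from Google Cloud Console>
# JWT_SECRET_KEY=<output of openssl rand -hex 32>
```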
e. Download required models:
- You need to download the spaCy model before building the Docker images. Model releases are published on the spaCy Models page; in this project, we use `en_core_web_md-3.6.0`.
- Then move the downloaded file into the build context and update the Dockerfile accordingly:
RUN conda run -n itapia conda install -c conda-forge -y ta-lib "spacy=3.6.*" pandas-ta
COPY ./ai_service_quick/requirements.txt /ai-service-quick/requirements.txt
RUN conda run -n itapia pip install --no-cache-dir --upgrade -r /ai-service-quick/requirements.txt
COPY ./path_to_your_wheel/wheel_file.whl /tmp/
RUN conda run -n itapia pip install /tmp/wheel_file.whl
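For example, the model wheel can be fetched from the spacy-models GitHub releases. The URL below follows their standard release layout; verify it matches the version you need before using it:

```bash
# Download the en_core_web_md 3.6.0 wheel from the spaCy models releases
wget https://github.com/explosion/spacy-models/releases/download/en_core_web_md-3.6.0/en_core_web_md-3.6.0-py3-none-any.whl
```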
# Navigate to the backend directory
cd backend
# Build and run all backend services in detached mode
docker-compose -f docker-compose.local.yaml up -d api-gateway

# Navigate to the frontend directory from the root
cd frontend
# Install dependencies
npm install
# Sync schemas if necessary
npm run sync:schemas
# Run the development server
npm run dev

- Frontend Application: http://localhost:3000
- Backend API Docs (Swagger UI): http://localhost:8000/docs
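A quick way to confirm both services came up (each should return HTTP 200):

```bash
# Sanity-check that the frontend and backend are reachable
curl -s -o /dev/null -w "frontend: %{http_code}\n" http://localhost:3000
curl -s -o /dev/null -w "backend docs: %{http_code}\n" http://localhost:8000/docs
```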
After steps 1 and 2 (once credentials and environment are set up), you can use the pre-built Docker images via the default docker-compose file:
docker-compose -f docker-compose.yml up -d api-gateway frontend

We provide some utility commands to help you run this project easily.
- You can rebuild all custom images from code using `utils-cmd/rebuild-all [--tag <tag_name>] [--help]`. If `--tag` is not provided, the images use the default tag `latest`.
- You can push your images to Docker Hub using `utils-cmd/push-docker-hub [--local-tag <tag>] [--hub-tag <tag>] [--user <username>]`. If a tag is omitted, a default is used; see `--help`.
- For data seeding, you can use `utils-cmd/seed-data`. Run it after the DB service (PostgreSQL) is up; it seeds important data such as tickers, sectors, exchanges, and rules.
- You can run all services (backend and frontend) with `utils-cmd/run-all [--local-images] [--help]`. If `--local-images` is set, your local images (built from code) are used; otherwise, the pre-built images are used.
- You can stop everything with `utils-cmd/stop-all`.
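Putting these together, a typical local session might look like this (flags as documented above):

```bash
# Build local images, start everything, seed the DB, and shut down when done
utils-cmd/rebuild-all --tag latest
utils-cmd/run-all --local-images
utils-cmd/seed-data        # run after the Postgres service is up
utils-cmd/stop-all
```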
itapia/
├── backend/                 # Contains all microservices, Docker config, and the .env for the backend
│   ├── api_gateway/         # API Gateway service (see backend/api_gateway/README.md)
│   ├── ai_service_quick/    # Quick AI analysis service
│   ├── data_processing/     # Data processing scripts
│   ├── data_seeds/          # Data seeding utilities
│   ├── evo_worker/          # Evolutionary algorithms worker
│   ├── shared/              # Shared code between services
│   └── .env.template        # Backend environment template
├── frontend/                # The Vue.js SPA, contains its own .env for the frontend
├── docs/                    # Documentation files
├── .gitignore
├── docker-compose.yml       # Docker Compose configuration
└── README.md                # You are here
- ✅ Phase 1: Core MVP (Analysis, Advisor, Rules, Auth)
- ✅ Phase 2: UI/UX Polish & Personalization (UX Polish, Profile Management)
- ✅ Phase 3: Automatic Optimization (`Evo-worker`)
- ▶️ Phase 4: Deep Dive & LLM Integration
- System Architecture: A deep dive into the microservices, data flow, and design decisions.
- API Reference: A detailed list and description of all API endpoints.
- Rule Engine Architecture: An explanation of the Symbolic Expression Tree design.
- API Gateway Documentation: Detailed documentation for the API Gateway service.
To understand the development process, roles, and responsibilities of each component, you can read the documents listed above.
This is a graduate thesis project. Contributions and questions are welcome; please open an Issue to discuss.
If you use this work, please cite it as:
[Le, Minh Triet]. (2025). ITAPIA: An Intelligent and Transparent AI-Powered Personal Investment Assistant.
Graduate Thesis, Hanoi University of Science and Technology, Vietnam.

