Getting Started: A Developer's Guide
Welcome to Labeeb
This guide provides a complete walkthrough for setting up the Labeeb platform for local development. We'll cover the initial platform-wide setup and then dive into the specific steps for each microservice.
1. Prerequisites
Before you begin, make sure you have the following installed:
- Docker & Docker Compose: The primary tools for running the containerized services.
- Git: For cloning the repository.
- A code editor: such as VS Code with the Docker and Remote Development extensions.
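To confirm the tools are on your PATH before continuing, a quick check (the `docker compose` subcommand assumes Compose v2):

```bash
docker --version
docker compose version
git --version
```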
2. Platform-Wide First-Time Setup
This procedure builds and initializes the entire platform from a fresh checkout. You only need to do this once.
- Configure Environment: Create your local `.env` file from the template. This file holds the master configuration for the Docker environment.
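  A minimal sketch of this step, assuming the template is named `.env.example` in the repository root:

  ```bash
  # Copy the template, then review ports, credentials, and hostnames for your machine
  cp .env.example .env
  ```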
- Build & Start Services: Build the Docker images and start all services in the background. The first build will take a few minutes.
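  A sketch using Docker Compose v2 (the exact service names come from the repository's `docker-compose.yml`):

  ```bash
  docker compose build        # first build takes a few minutes
  docker compose up -d        # start everything in the background
  ```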
- Initialize Database: Run the Laravel database migrations to create all the required tables for the API service.
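  Assuming the API container is the compose service named `api` (adjust the name to match your setup):

  ```bash
  docker compose exec api php artisan migrate
  ```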
- Bootstrap & Verify Search: Set up the OpenSearch indices, pipelines, and templates, then run a smoke test to verify the search configuration.
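  The index bootstrap itself is a project-specific script, so only the verification half is sketched here, assuming OpenSearch is exposed on the default port 9200:

  ```bash
  # Confirm the cluster is reachable and list the indices created by the bootstrap step
  curl -s http://localhost:9200/_cluster/health?pretty
  curl -s http://localhost:9200/_cat/indices?v
  ```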
3. Service-Specific Setup & Verification
After the initial setup, you may need to perform service-specific tasks. Here’s how to get started with each core service.
- API Service

  The API service is the central nervous system of the platform, handling data processing, business logic, and client-facing communication.

  1. Configure Environment: Create your local `.env` file for the API service from its template.
  2. Run Migrations: Apply the database migrations for the API (see the sketch below).
  3. Install a new PHP package (example): Add a dependency with Composer when you need one.
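  A command sketch for these steps; the directory name `api/`, the compose service name `api`, and the template name `.env.example` are assumptions that may differ in your checkout:

  ```bash
  # 1. Create the service-level .env from its template (assumed name)
  cp api/.env.example api/.env

  # 2. Run the Laravel migrations inside the running API container
  docker compose exec api php artisan migrate

  # 3. Install a new PHP package with Composer (guzzlehttp/guzzle is just an example)
  docker compose exec api composer require guzzlehttp/guzzle
  ```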
- AI-Box Service

  The AI-Box powers evidence retrieval and handles all ML-powered tasks. It implements hybrid search and fact-checking tools.

  1. Configure Environment: Create your local `.env` file for the AI-Box service from its template.
  2. Build & Start Service: Build the image and start the service (see the sketch below).
  3. Run a smoke test (hybrid search query) to confirm the service is answering requests.
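  A sketch of these steps; the `ai-box` service name, its port, and the search endpoint path are assumptions, so check the service's own README for the real values:

  ```bash
  # 1. Create the service-level .env from its template (assumed name)
  cp ai-box/.env.example ai-box/.env

  # 2. Build and start only this service
  docker compose up -d --build ai-box

  # 3. Smoke test: send a hybrid search query (endpoint, port, and payload are placeholders)
  curl -s -X POST http://localhost:8000/search \
    -H "Content-Type: application/json" \
    -d '{"query": "test claim", "top_k": 5}'
  ```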
- Scraper Service

  The Scraper fetches and normalizes news articles from various external sources. It's profile-driven and can forward articles to the API.

  1. Configure Environment: Create your local `.env` file for the Scraper service from its template.
  2. Build & Start Service: Build the image and start the service.
  3. Set API Ingestion Variables (in `scraper/.env`) so that scraped articles are forwarded to the API.
  4. Trigger a test scrape to confirm the pipeline works end to end (see the sketch below).
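  A sketch of these steps; the variable names, compose service name, and scrape command are illustrative placeholders rather than the project's actual configuration keys:

  ```bash
  # 1. Create the service-level .env from its template (assumed name)
  cp scraper/.env.example scraper/.env

  # 2. Build and start only this service
  docker compose up -d --build scraper

  # 3. In scraper/.env, point ingestion at the API (illustrative variable names)
  #    API_INGEST_URL=http://api:8080/api/articles
  #    API_INGEST_TOKEN=<your token>

  # 4. Trigger a test scrape (command and profile name are placeholders)
  docker compose exec scraper python -m scraper.run --profile example --limit 5
  ```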
4. Daily Developer Workflow
Once the platform is set up, your daily workflow will be much simpler.
- Pull Latest Changes:
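  For example, assuming your local branch tracks the shared default branch:

  ```bash
  git pull
  ```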
- Start Services: Start all services in the background. A rebuild is only needed if a `Dockerfile` or dependencies change.
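  A minimal sketch with Docker Compose v2:

  ```bash
  docker compose up -d            # normal start
  docker compose up -d --build    # only when a Dockerfile or dependencies changed
  ```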
- Run Local Tests: Before committing, run the local test suites for the service you are working on.
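  For example (the service names and test runners are assumptions based on the stack described above):

  ```bash
  # API service (Laravel)
  docker compose exec api php artisan test

  # AI-Box service (assuming a pytest-based suite)
  docker compose exec ai-box pytest
  ```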
- View Logs: Tail the logs for a specific service to debug issues.
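  For example (replace `api` with whichever service you are debugging):

  ```bash
  docker compose logs -f api
  ```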
- Stop Services: When you're done for the day, stop all running services.
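  With Docker Compose this is a single command; add `-v` only if you also want to discard the data volumes:

  ```bash
  docker compose down
  ```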