This project is a comprehensive real-time forex exchange rate processing system that simulates and processes currency exchange data from multiple data provider platforms. The system uses a modular architecture with TCP streaming and REST API data sources, featuring dynamic calculations, real-time monitoring, and enterprise-grade data processing capabilities.
- Multi-Platform Data Ingestion: TCP streaming (PF1) and REST API (PF2) data providers
- Real-time Data Processing: Live forex rate calculations and transformations
- Enterprise Messaging: Kafka-based event streaming architecture
- High-Performance Caching: Redis for raw data storage and fast retrieval
- Data Persistence: PostgreSQL for reliable data storage
- Search & Analytics: OpenSearch/Elasticsearch integration
- Monitoring & Alerting: Email notifications and comprehensive logging
- Scalable Architecture: Microservices-based modular design
┌─────────────────┐ ┌─────────────────┐
│ PF1 (TCP) │ │ PF2 (REST) │
│ Platform │ │ Platform │
└─────────┬───────┘ └─────────┬───────┘
│ │
└──────────┬───────────┘
│
┌────────▼────────┐
│ Coordinator │
│ Service │
└─────────┬───────┘
│
┌────────────┼────────────┐
│ │ │
┌────▼───┐ ┌────▼───┐ ┌────▼───┐
│ Redis │ │ Kafka │ │ Alarm │
│(Raw) │ │(Comp.) │ │Service │
└────────┘ └────┬───┘ └────────┘
│
┌─────────▼─────────┐
│ Kafka Consumer │
└─────────┬─────────┘
│
┌────────────┼────────────┐
│ │ │
┌────▼─────┐ ┌────▼─────┐ ┌───▼────┐
│PostgreSQL│ │OpenSearch│ │Logstash│
│          │ │          │ │        │
└──────────┘ └──────────┘ └────────┘
- Raw Data Collection: Data providers (PF1/PF2) collect forex rates
- Coordination: Coordinator service manages data flow and processing
- Raw Storage: Raw data stored in Redis lists (
raw:<rate>
) - Computation: Dynamic calculations triggered and results published
- Event Streaming: Computed data sent to Kafka topics (
computed:<symbol>
) - Persistence: Kafka consumer stores data in PostgreSQL and OpenSearch
- Monitoring: AlarmService monitors delays and sends email alerts
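To make the Raw Storage and Event Streaming steps concrete, here is a minimal sketch assuming Spring Data Redis and Spring Kafka and reusing the `raw:<rate>` / `computed:<symbol>` naming above. It is an illustration, not the coordinator's actual code; class and method names are hypothetical.

```java
import org.springframework.data.redis.core.StringRedisTemplate;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
public class RateFlowSketch {

    private final StringRedisTemplate redis;
    private final KafkaTemplate<String, String> kafka;

    public RateFlowSketch(StringRedisTemplate redis, KafkaTemplate<String, String> kafka) {
        this.redis = redis;
        this.kafka = kafka;
    }

    // Raw Storage: append an incoming tick to the provider-specific Redis list
    public void storeRaw(String rateName, String rawJson) {
        redis.opsForList().rightPush("raw:" + rateName, rawJson);
    }

    // Event Streaming: publish a computed value to the per-symbol Kafka topic
    public void publishComputed(String symbol, String computedJson) {
        kafka.send("computed:" + symbol, computedJson);
    }
}
```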
| Module | Description | Port | Technology Stack |
|---|---|---|---|
| common | Shared models, DTOs, mappers, and JPA repositories | - | Spring Data JPA |
| platform-tcp | TCP streaming data provider (PF1) | 8081 | Java TCP Sockets |
| platform-rest | REST API + SSE streaming provider (PF2) | 8082 | Spring WebFlux, SSE |
| coordinator | Central orchestration service | 8080 | Spring Boot, Groovy |
| kafka-consumer | Data persistence service | - | Kafka Streams |
| logstash | Log aggregation and processing | - | Logstash Pipeline |
| Service | Description | Port | Purpose |
|---|---|---|---|
| Kafka | Message streaming platform | 9092 | Event-driven architecture |
| Redis | In-memory data store | 6379 | Raw data caching |
| PostgreSQL | Relational database | 5432 | Data persistence |
| OpenSearch | Search and analytics engine | 9200 | Data analytics |
| OpenSearch Dashboards | Visualization platform | 5601 | Data visualization |
Ensure you have the following software installed:
- Java 23: Oracle JDK 23 or OpenJDK 23
- Maven 3.8+: Maven Download
- Docker: Windows | Mac | Linux
- Docker Compose: Installation Guide
- Git: Git Download
- Clone the Repository

  git clone https://github.com/mtgsoftworks/Forex_Project.git
  cd Forex_Project

- Build the Project

  # Build all modules (skip tests for quick setup)
  mvn clean install -DskipTests

- Start Infrastructure Services

  # Start all Docker services
  docker-compose up -d

  # Follow logs (optional)
  docker-compose logs -f

- Start Application Modules

  Option A: Individual Terminal Windows

  # Terminal 1: TCP Platform
  cd platform-tcp
  mvn spring-boot:run

  # Terminal 2: REST Platform
  cd platform-rest
  mvn spring-boot:run

  # Terminal 3: Coordinator
  cd coordinator
  mvn spring-boot:run

  # Terminal 4: Kafka Consumer
  cd kafka-consumer
  mvn spring-boot:run
Check that all services are running:
# Verify Docker services
docker-compose ps
# Test Kafka
docker exec -it kafka kafka-topics --list --bootstrap-server localhost:9092
# Test Redis
docker exec -it redis redis-cli ping
# Test Coordinator API
curl http://localhost:8080/api/manual/pf2/PF2_USDTRY
Each module can be configured via its respective `application.yml` file:
pf2:
rest:
base-url: http://localhost:8082/api/rates/
poll-interval: 1000 # milliseconds
enabled: false # auto-polling enabled/disabled
manual-mode: false # manual-only mode
Configuration Modes:
- `enabled=false, manual-mode=false`: Auto-polling disabled, manual endpoint available
- `enabled=true, manual-mode=false`: Auto-polling every `poll-interval` ms
- `enabled=true, manual-mode=true`: Manual-only mode, no auto-polling (a property-binding sketch follows this list)
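For orientation, the `pf2.rest.*` settings above could be bound in Spring Boot roughly as shown below. The class and field names are hypothetical; the project's actual configuration classes may differ.

```java
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;

// Hypothetical binding of the pf2.rest.* settings shown above.
@Component
@ConfigurationProperties(prefix = "pf2.rest")
public class Pf2RestProperties {

    private String baseUrl = "http://localhost:8082/api/rates/"; // pf2.rest.base-url
    private long pollInterval = 1000;                            // pf2.rest.poll-interval (ms)
    private boolean enabled = false;                             // pf2.rest.enabled
    private boolean manualMode = false;                          // pf2.rest.manual-mode

    public String getBaseUrl() { return baseUrl; }
    public void setBaseUrl(String baseUrl) { this.baseUrl = baseUrl; }

    public long getPollInterval() { return pollInterval; }
    public void setPollInterval(long pollInterval) { this.pollInterval = pollInterval; }

    public boolean isEnabled() { return enabled; }
    public void setEnabled(boolean enabled) { this.enabled = enabled; }

    public boolean isManualMode() { return manualMode; }
    public void setManualMode(boolean manualMode) { this.manualMode = manualMode; }
}
```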
pf1:
tcp:
host: localhost
port: 8081
enabled: false # auto-connect on startup
TCP Modes:
- `enabled=false`: Manual connection via API
- `enabled=true`: Auto-connect on application startup (see the startup sketch below)
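A minimal sketch of the `enabled=true` auto-connect mode, assuming a Spring `ApplicationReadyEvent` hook; apart from the `pf1.tcp.*` property names, everything here is illustrative rather than the coordinator's real code.

```java
import java.io.IOException;
import java.net.Socket;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.event.EventListener;
import org.springframework.stereotype.Component;

// Illustrative only: opens the PF1 TCP connection at startup when pf1.tcp.enabled=true.
@Component
public class Pf1TcpAutoConnector {

    @Value("${pf1.tcp.enabled:false}")
    private boolean enabled;

    @Value("${pf1.tcp.host:localhost}")
    private String host;

    @Value("${pf1.tcp.port:8081}")
    private int port;

    @EventListener(ApplicationReadyEvent.class)
    public void connectIfEnabled() throws IOException {
        if (!enabled) {
            return; // enabled=false: the connection is opened later via the manual API
        }
        Socket socket = new Socket(host, port);
        // hand the socket to the reader logic, e.g. send "subscribe|PF1_USDTRY"
    }
}
```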
| Endpoint | Method | Description |
|---|---|---|
| /api/manual/pf2/{symbol} | GET | Manual PF2 data fetch |
| /api/status | GET | Service health status |
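As a quick smoke test of the endpoints above, the status endpoint can also be called from Java with the JDK's built-in HTTP client; the exact shape of the response body is not documented here, so it is simply printed.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Prints the HTTP status code and raw body returned by the coordinator's health endpoint.
public class CoordinatorStatusCheck {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/api/status"))
                .GET()
                .build();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```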
| Endpoint | Method | Description |
|---|---|---|
| /api/rates/{rateName} | GET | Single forex rate |
| /api/rates/stream/{rateName} | GET | SSE stream |
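The SSE endpoint above can be consumed from Java with Spring WebFlux's `WebClient`. The sketch below treats each event payload as a plain String, since the event schema is not specified in this README.

```java
import org.springframework.web.reactive.function.client.WebClient;
import reactor.core.publisher.Flux;

// Subscribes to the PF2 SSE stream and prints each event's data payload.
public class Pf2SseClient {
    public static void main(String[] args) throws InterruptedException {
        WebClient client = WebClient.create("http://localhost:8082");

        Flux<String> events = client.get()
                .uri("/api/rates/stream/{rateName}", "PF2_USDTRY")
                .retrieve()
                .bodyToFlux(String.class);   // each SSE data element decoded as a String

        events.subscribe(event -> System.out.println("SSE event: " + event));

        Thread.sleep(10_000); // keep the demo alive long enough to receive a few events
    }
}
```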
TCP Protocol Commands:
subscribe|PF1_USDTRY # Subscribe to USD/TRY rates
unsubscribe|PF1_USDTRY # Unsubscribe from rates
- TCP Client Test

  // TCPClient.java
  import java.io.*;
  import java.net.*;

  public class TCPClient {
      public static void main(String[] args) {
          try {
              Socket socket = new Socket("localhost", 8081);
              PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
              BufferedReader in = new BufferedReader(new InputStreamReader(socket.getInputStream()));

              // Subscribe to USD/TRY rates
              out.println("subscribe|PF1_USDTRY");

              // Read incoming messages
              String line;
              while ((line = in.readLine()) != null) {
                  System.out.println("Received: " + line);
              }
              socket.close();
          } catch (Exception e) {
              e.printStackTrace();
          }
      }
  }
- Data Verification Commands

  # Check raw data in Redis
  docker exec -it redis redis-cli LRANGE raw:PF2_USDTRY 0 -1

  # Check computed data in Kafka
  docker exec -it kafka kafka-console-consumer \
    --topic computed:USDTRY \
    --from-beginning \
    --bootstrap-server localhost:9092
# Unit tests
mvn test
# Integration tests
mvn verify -Pintegration-tests
# Docker Compose test profile
docker-compose -f docker-compose.yml -f docker-compose.test.yml up
- URL: http://localhost:5601
- Username: admin
- Password: admin
- Application logs: `logs/` directory
- Logstash pipelines: `logstash/pipeline/*.conf`
- OpenSearch endpoint: http://localhost:9200
Supported Currency Pairs:
- USD/TRY (Turkish Lira)
- EUR/USD (Euro/US Dollar)
- GBP/USD (British Pound/US Dollar)
Kafka Topics (a consumer sketch follows the list):
- `forex_topic`: Raw data stream
- `computed:USDTRY`: Computed USD/TRY rates
- `computed:EURUSD`: Computed EUR/USD rates
- `computed:GBPUSD`: Computed GBP/USD rates
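For reference, a computed topic could be consumed from Java roughly as below. This uses a plain Spring for Apache Kafka listener for brevity; the kafka-consumer module is listed above as using Kafka Streams, so its real implementation will differ.

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

// Illustrative consumer for the computed USD/TRY topic; the group id is arbitrary.
@Component
public class ComputedRateListener {

    @KafkaListener(topics = "computed:USDTRY", groupId = "forex-demo")
    public void onComputedRate(String payload) {
        // In the real module this is where PostgreSQL/OpenSearch persistence happens.
        System.out.println("Computed rate received: " + payload);
    }
}
```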
- Java 23: Core programming language
- Spring Boot 3.4.2: Application framework
- Maven 3.9.3: Dependency management
- Groovy 3.0.9: Dynamic calculations and scripting
- Apache Kafka (Confluent CP 7.x): Event streaming platform
- Redis 7.x: In-memory data store
- PostgreSQL 15-alpine: Relational database
- OpenSearch 2.x: Search and analytics engine
- Docker & Docker Compose 2.4: Containerization
- Logstash 7.x: Log processing pipeline
- OpenSearch Dashboards: Data visualization
- Port Conflict Error

  Error: Port 8080 is already in use

  Solution: Kill the process using the port or change the port in the configuration

- Kafka Connection Error

  Connection refused to kafka:9092

  Solution: Ensure all Docker services are running with `docker-compose ps`

- Redis Connection Error

  JedisConnectionException: Could not connect to redis:16379

  Solution: Verify that the Redis container is running and check the port configuration

- Java Version Error

  UnsupportedClassVersionError: ... requires Java 23

  Solution: Ensure JDK 23 is installed and `JAVA_HOME` is set correctly
# Check all Docker services
docker-compose ps
# Check specific service logs
docker-compose logs [service-name]
# Restart specific service
docker-compose restart [service-name]
This system is designed for:
- Financial Institutions: Real-time forex rate processing
- Trading Platforms: High-frequency trading data feeds
- Risk Management: Currency exposure monitoring
- Academic Research: Forex market analysis and simulation
- Fintech Applications: Currency conversion services
This project is licensed under the MIT License - see the LICENSE file for details.
Mesut Taha Güven (@mtgsoftworks)
- Spring Boot community for excellent framework support
- Apache Kafka for robust messaging capabilities
- Redis community for high-performance caching solutions
- OpenSearch project for powerful search and analytics