Compare commits

22 commits

| Author | SHA1 | Message | Date |
|--------|------|---------|------|
| | 035c21ba23 | V 1.4.0 Complete hardening with the help of Claude | 2026-04-26 12:23:17 +02:00 |
| | 8961b9237c | Barometer trend added as well | 2026-04-24 15:49:03 +02:00 |
| | 3652831bc3 | Mapping of the newly sent values to those in the DB (main.py); display of the last 24h now correct (App.jsx); dynamic Y-range calculation for all 3 (THP); wind gusts displayed as well | 2026-04-24 14:31:06 +02:00 |
| | f271ff455f | Removed the sponsor line | 2026-04-09 17:20:55 +02:00 |
| | d75c60cef9 | Adjusted the version | 2026-04-09 17:11:23 +02:00 |
| | 99553ad4da | Added a table | 2026-04-09 16:55:02 +02:00 |
| | 995a4c64d8 | Start/end date without time | 2026-04-09 09:26:18 +02:00 |
| | 6c45f260c6 | Added range selection | 2026-04-08 09:08:24 +02:00 |
| rxf | d4a5f1b1c9 | Temperature with min/max | 2026-03-30 11:43:06 +02:00 |
| rxf | 267f8198b9 | V 1.2.0 various adjustments so that it resembles the old application. Now quite pleasant to use | 2026-03-24 17:18:25 +01:00 |
| rxf | acd509fef6 | Still not quite right, so still **WIP** | 2026-03-23 22:09:51 +01:00 |
| rxf | c471c0e33a | **WIP** | 2026-03-22 20:09:44 +01:00 |
| rxf | 0b9d21c24c | V1.1.0 Responsive; adjusted footer; improved ability to run on the server; deploy.sh with a for loop | 2026-03-22 18:44:22 +01:00 |
| rxf | b71d92646b | Adjusted docker-compose for the server | 2026-02-10 21:22:25 +01:00 |
| rxf | 4fde7ed46a | New docker-compose for production, plus adjustments | 2026-02-10 20:22:44 +01:00 |
| rxf | f32e472ea3 | HTTP reception works | 2026-02-10 14:06:42 +01:00 |
| rxf | db1e2fd737 | Min/max values below the graph; current values shown at the top as well | 2026-02-09 16:27:44 +01:00 |
| rxf | c03ffe839d | Adjusted the range limits; suppressed a Highcharts warning | 2026-02-08 22:26:10 +01:00 |
| rxf | 7139619d28 | Display of min/max below the graphs | 2026-02-08 22:11:50 +01:00 |
| rxf | 19ea455b55 | Conversion of wind speed to km/h; display of the current value in the chart | 2026-02-08 22:04:58 +01:00 |
| rxf | ea0b8dd8f9 | Charts now half as tall as wide | 2026-02-08 20:08:00 +01:00 |
| rxf | 2fc4bd9db6 | Highcharts looks much better | 2026-02-08 19:44:49 +01:00 |
27 changed files with 5076 additions and 865 deletions

DEPLOY-PRODUCTION.md (new file, 185 lines)

@@ -0,0 +1,185 @@
# Production Deployment Guide
## Prerequisites
- Docker and Docker Compose installed on the server
- Traefik running in the `dockge_default` network
- Domain `wetter.fuerst-stuttgart.de` points to the server
- `.env` file with the database credentials
## 1. Build and push the images
Locally, on the development machine:
```bash
# Build all images and push them to the registry
./push-images.sh
```
This command:
- builds `wetterstation-collector`
- builds `wetterstation-api`
- builds `wetterstation-frontend`
- pushes all images to `docker.citysensor.de`
## 2. Prepare the server
On the production server:
```bash
# Create the project directory
mkdir -p ~/wetterstation
cd ~/wetterstation
# Upload docker-compose.prod.yml
# Create or upload the .env file
```
### Example .env for production:
```env
# Database
DB_NAME=wetterstation
DB_USER=wetterstation_user
DB_PASSWORD=<secure-password>
DB_HOST=postgres
DB_PORT=5432
# Collector
COLLECTOR_PORT=8001
```
## 3. Start the deployment
```bash
# Pull the images from the registry
docker-compose -f docker-compose.prod.yml pull
# Start the services
docker-compose -f docker-compose.prod.yml up -d
# Check the logs
docker-compose -f docker-compose.prod.yml logs -f
```
## 4. Reachability
After a successful start, the weather station is reachable at:
- **Frontend**: https://wetter.fuerst-stuttgart.de/
- **API**: https://wetter.fuerst-stuttgart.de/api/health
- **Collector**: https://wetter.fuerst-stuttgart.de/collector/health
Traefik handles:
- automatic HTTPS (Let's Encrypt)
- path-based routing
- StripPrefix for `/api` and `/collector`
## 5. Deploying updates
When the code changes:
```bash
# Locally: rebuild and push the images
./push-images.sh
# On the server: pull the new images and restart the containers
ssh user@server
cd ~/wetterstation
docker-compose -f docker-compose.prod.yml pull
docker-compose -f docker-compose.prod.yml up -d
```
## 6. Useful commands
```bash
# Check the status
docker-compose -f docker-compose.prod.yml ps
# Logs of individual services
docker-compose -f docker-compose.prod.yml logs -f frontend
docker-compose -f docker-compose.prod.yml logs -f api
docker-compose -f docker-compose.prod.yml logs -f collector
docker-compose -f docker-compose.prod.yml logs -f postgres
# Restart a service
docker-compose -f docker-compose.prod.yml restart api
# Stop all services
docker-compose -f docker-compose.prod.yml down
# Stop the services and delete the volumes (⚠️ deletes data!)
docker-compose -f docker-compose.prod.yml down -v
```
## 7. Database backup
```bash
# Create a backup
docker exec wetterstation_db_prod pg_dump -U wetterstation_user wetterstation > backup_$(date +%Y%m%d_%H%M%S).sql
# Restore a backup
docker exec -i wetterstation_db_prod psql -U wetterstation_user wetterstation < backup.sql
```
## Architecture
```
Internet
   ↓
Traefik (dockge_default)
 ├─→ Frontend (nginx) → API (internal)
 ├─→ API (FastAPI)
 └─→ Collector (FastAPI)
        ↓
PostgreSQL (internal)
```
**Networks**:
- `dockge_default` (external): Traefik network
- `wetterstation_internal`: internal service communication
**Containers**:
- `wetterstation_frontend_prod`: Nginx + React SPA
- `wetterstation_api_prod`: FastAPI (weather data API)
- `wetterstation_collector_prod`: FastAPI (data collection)
- `wetterstation_db_prod`: PostgreSQL 16
## Troubleshooting
### SSL certificate is not issued
Check:
- DNS points to the server: `dig wetter.fuerst-stuttgart.de`
- Traefik is running: `docker ps | grep traefik`
- Ports 80/443 are open: `netstat -tulpn | grep -E ':(80|443)'`
### API not reachable
```bash
# Check whether the container is running
docker ps | grep wetterstation_api_prod
# Check the logs
docker logs wetterstation_api_prod
# Test internally
docker exec wetterstation_api_prod curl localhost:8000/health
```
### Database connection errors
```bash
# Check whether the DB is running
docker ps | grep wetterstation_db_prod
# Check the DB logs
docker logs wetterstation_db_prod
# Test the connection
docker exec wetterstation_db_prod psql -U wetterstation_user -d wetterstation -c "SELECT 1"
```


@@ -1,3 +1,5 @@
# syntax=docker/dockerfile:1
FROM python:3.11-slim
WORKDIR /app
@@ -5,6 +7,7 @@ WORKDIR /app
# Install system dependencies
RUN apt-get update && apt-get install -y \
gcc \
curl \
&& rm -rf /var/lib/apt/lists/*
# Install Python dependencies

api/README.md (new file, 514 lines)

@@ -0,0 +1,514 @@
# Wetterstation API
REST API for retrieving weather data from the PostgreSQL database.
## Overview
The API is built on **FastAPI** and provides endpoints for current weather data, historical time series, statistics, and aggregated data.
- **Version:** 1.0.0
- **Framework:** FastAPI with Uvicorn
- **Database:** PostgreSQL
- **Interactive API documentation:** `/docs` (Swagger UI) or `/redoc` (ReDoc)
## Running the API
### Locally (development)
```bash
cd api
python main.py
```
The API then runs at `http://localhost:8000`.
### Docker (production)
```bash
docker compose up -d
```
## Environment variables
The API requires the following environment variables (defined in `.env`):
```env
DB_HOST=localhost
DB_PORT=5432
DB_NAME=wetterstation
DB_USER=wetterstation_user
DB_PASSWORD=<password>
```
## Endpoints
### 📋 General
#### `GET /`
**Root endpoint with API information**
**Response:**
```json
{
  "message": "Wetterstation API",
  "version": "1.0.0",
  "docs": "/docs"
}
```
---
#### `GET /health`
**Health check - reports API and database status**
**Response:**
```json
{
  "status": "ok",
  "database": "connected",
  "timestamp": "2026-03-23T14:30:00"
}
```
---
### 🌡️ Weather Data
#### `GET /weather/latest`
**Returns the most recent weather data**
**Response model:** `WeatherData`
**Example:**
```json
{
  "id": 123456,
  "datetime": "2026-03-23T14:30:00Z",
  "temperature": 15.5,
  "humidity": 65,
  "pressure": 1013.2,
  "wind_speed": 12.5,
  "wind_gust": 18.7,
  "wind_dir": 225.0,
  "rain": 0.0,
  "rain_rate": 0.0,
  "received_at": "2026-03-23T14:30:05"
}
```
---
#### `GET /weather/current`
**Alias for `/weather/latest` - returns current weather data**
---
#### `GET /weather/history`
**Returns historical weather data for the last X hours**
**Query parameters:**
- `hours` (optional): number of hours back (1-168, default: 24)
- `limit` (optional): maximum number of records (1-10000, default: 1000)
**Example:**
```bash
GET /weather/history?hours=48&limit=500
```
**Response:** array of `WeatherData`
---
#### `GET /weather/range`
**Returns weather data for a specific time range**
**Query parameters:**
- `start` (required): start date (ISO 8601)
- `end` (required): end date (ISO 8601)
- `limit` (optional): maximum number of records (1-50000, default: 10000)
**Example:**
```bash
GET /weather/range?start=2026-03-01T00:00:00Z&end=2026-03-23T23:59:59Z&limit=5000
```
**Response:** array of `WeatherData`
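For `start` and `end`, any ISO 8601 string that FastAPI can parse as a `datetime` works; in Python, `datetime.isoformat()` is a convenient way to build the query string. A small sketch, using the endpoint URL documented above:

```python
from datetime import datetime, timezone
from urllib.parse import urlencode

# Build a /weather/range query from datetime objects.
start = datetime(2026, 3, 1, tzinfo=timezone.utc)
end = datetime(2026, 3, 23, 23, 59, 59, tzinfo=timezone.utc)

query = urlencode({
    "start": start.isoformat(),  # "2026-03-01T00:00:00+00:00"
    "end": end.isoformat(),
    "limit": 5000,
})
url = f"http://localhost:8000/weather/range?{query}"
```

`urlencode` percent-encodes the `:` and `+` characters, which keeps the timezone offset intact in the query string.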
---
#### `GET /weather/temperature`
**Returns only temperature time series (optimized for charts)**
**Query parameters:**
- `hours` (optional): number of hours back (1-168, default: 24)
**Response:**
```json
[
  {
    "datetime": "2026-03-23T14:00:00Z",
    "temperature": 15.3
  },
  {
    "datetime": "2026-03-23T14:05:00Z",
    "temperature": 15.5
  }
]
```
---
#### `GET /weather/wind`
**Returns only wind data (speed, direction, gusts)**
**Query parameters:**
- `hours` (optional): number of hours back (1-168, default: 24)
**Response:**
```json
[
  {
    "datetime": "2026-03-23T14:00:00Z",
    "wind_speed": 12.5,
    "wind_gust": 18.7,
    "wind_dir": 225.0
  }
]
```
---
#### `GET /weather/rain`
**Returns only rain data**
**Query parameters:**
- `hours` (optional): number of hours back (1-168, default: 24)
**Response:**
```json
[
  {
    "datetime": "2026-03-23T14:00:00Z",
    "rain": 0.5,
    "rain_rate": 2.3
  }
]
```
---
### 📊 Statistics
#### `GET /weather/stats`
**Returns aggregated statistics for the given time range**
**Query parameters:**
- `hours` (optional): time range in hours (1-168, default: 24)
**Response model:** `WeatherStats`
**Example:**
```json
{
  "avg_temperature": 15.2,
  "min_temperature": 8.5,
  "max_temperature": 22.1,
  "avg_humidity": 65.3,
  "avg_pressure": 1013.5,
  "avg_wind_speed": 10.2,
  "max_wind_gust": 28.5,
  "total_rain": 3.2,
  "data_points": 288
}
```
---
#### `GET /weather/daily`
**Returns daily statistics for the last X days**
**Query parameters:**
- `days` (optional): number of days back (1-90, default: 7)
**Response:** array of `WeatherStats` with a `date` field
---
### 📈 Aggregated Data
The aggregated endpoints are optimized for long-term visualizations and reduce the data volume by averaging.
#### `GET /weather/hourly-aggregated`
**Returns hourly aggregated weather data (hourly means)**
**Query parameters:**
- `days` (optional): number of days back (1-60, default: 7)
**Response:** array of `WeatherData` (hourly aggregates)
**Use case:** ideal for the 7-day and 30-day views
---
#### `GET /weather/daily-aggregated`
**Returns daily aggregated weather data (daily means)**
**Query parameters:**
- `days` (optional): number of days back (1-730, default: 365)
**Response:** array of `WeatherData` (daily aggregates)
**Special case:** for `days >= 365`, **all available data** is returned automatically (not just the last 365 days).
**Use case:** ideal for the yearly overview (365-day view)
---
#### `GET /weather/rain-daily`
**Returns daily rain totals**
**Query parameters:**
- `days` (optional): number of days back (1-365, default: 30)
**Response:**
```json
[
  {
    "date": "2026-03-23T00:00:00Z",
    "total_rain": 5.2
  },
  {
    "date": "2026-03-22T00:00:00Z",
    "total_rain": 0.0
  }
]
```
**Use case:** ideal for 7-day and 30-day rain charts
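Client-side, the daily sums are easy to reduce further, e.g. to a rainfall total over the whole returned period. A sketch assuming the response shape shown above (the helper name is illustrative):

```python
# Sum the total_rain values of a /weather/rain-daily response.
def total_rain_mm(rows):
    """Total rainfall in mm over all returned days; missing values count as 0."""
    return round(sum(row.get("total_rain") or 0.0 for row in rows), 1)

sample = [
    {"date": "2026-03-23T00:00:00Z", "total_rain": 5.2},
    {"date": "2026-03-22T00:00:00Z", "total_rain": 0.0},
]
print(total_rain_mm(sample))  # 5.2
```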
---
#### `GET /weather/rain-weekly`
**Returns weekly rain totals (week = Mon-Sun)**
**Query parameters:**
- `days` (optional): number of days back (1-730, default: 365)
**Response:**
```json
[
  {
    "week_start": "2026-03-17T00:00:00Z",
    "total_rain": 12.5
  }
]
```
**Special case:** for `days >= 365`, **all available data** is returned automatically.
**Use case:** ideal for the yearly overview (365-day view)
---
## Data models
### WeatherData
```typescript
{
  id: number
  datetime: string (ISO 8601)
  temperature: number | null // °C
  humidity: number | null // %
  pressure: number | null // hPa
  wind_speed: number | null // km/h (converted from mph)
  wind_gust: number | null // km/h (converted from mph)
  wind_dir: number | null // degrees (0-360)
  rain: number | null // mm
  rain_rate: number | null // mm/h
  received_at: string (ISO 8601)
}
```
### WeatherStats
```typescript
{
  avg_temperature: number | null
  min_temperature: number | null
  max_temperature: number | null
  avg_humidity: number | null
  avg_pressure: number | null
  avg_wind_speed: number | null
  max_wind_gust: number | null
  total_rain: number | null
  data_points: number
}
```
### HealthResponse
```typescript
{
  status: string // "ok" | "error"
  database: string // "connected" | "disconnected"
  timestamp: string (ISO 8601)
}
```
---
## Unit conversion
The API automatically converts the following units from the database:
| Value | Database | API output |
|------|-----------|-------------|
| Wind speed | mph | km/h (× 1.60934) |
| Wind gusts | mph | km/h (× 1.60934) |
| Temperature | °C | °C (unchanged) |
| Air pressure | hPa | hPa (unchanged) |
| Rain | mm | mm (unchanged) |
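A minimal sketch of the mph → km/h conversion the table describes, useful when comparing API output against raw database values (the helper name is illustrative):

```python
MPH_TO_KMH = 1.60934  # conversion factor from the table above

def mph_to_kmh(value):
    """Convert a wind value from mph to km/h; None (missing reading) stays None."""
    return None if value is None else round(value * MPH_TO_KMH, 1)

print(mph_to_kmh(10))    # 16.1
print(mph_to_kmh(None))  # None
```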
---
## CORS
The API allows CORS requests from all origins (`allow_origins=["*"]`). In production this should be restricted to specific domains.
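One possible way to restrict origins without rebuilding the image is to read extra origins from the environment. This is only a sketch: the env variable name `CORS_EXTRA_ORIGINS` and the default list are assumptions, not part of API 1.0.0.

```python
import os

# Hypothetical default origins for this deployment (assumed, see note above).
DEFAULT_ORIGINS = [
    "https://wetter.fuerst-stuttgart.de",
    "http://localhost:5173",
]

def allowed_origins(env_value):
    """Merge the defaults with a comma-separated list of extra origins."""
    extra = [o.strip() for o in env_value.split(",") if o.strip()]
    return DEFAULT_ORIGINS + extra

# Pass the result to CORSMiddleware as allow_origins=origins instead of ["*"].
origins = allowed_origins(os.getenv("CORS_EXTRA_ORIGINS", ""))
```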
---
## Error handling
### HTTP status codes
- `200 OK` - successful request
- `400 Bad Request` - invalid parameters
- `404 Not Found` - no data found
- `500 Internal Server Error` - database error
### Error response
```json
{
  "detail": "Keine Daten verfügbar"
}
```
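A client can surface the `detail` field directly. A small parsing sketch for the error shape shown above (the helper name is illustrative):

```python
import json

def extract_error_detail(status, body):
    """Return the API's error message for non-200 responses, else None.

    Error bodies follow the {"detail": "..."} shape shown above.
    """
    if status == 200:
        return None
    try:
        return json.loads(body).get("detail") or f"HTTP {status}"
    except (ValueError, AttributeError):
        return f"HTTP {status}"

print(extract_error_detail(404, '{"detail": "Keine Daten verfügbar"}'))
# Keine Daten verfügbar
```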
---
## Interactive documentation
FastAPI automatically generates interactive API documentation:
- **Swagger UI:** [http://localhost:8000/docs](http://localhost:8000/docs)
- **ReDoc:** [http://localhost:8000/redoc](http://localhost:8000/redoc)
All endpoints can be tried out there directly.
---
## Examples
### cURL
```bash
# Fetch current weather data
curl http://localhost:8000/weather/current
# Last 48 hours
curl "http://localhost:8000/weather/history?hours=48"
# Yearly overview (all available data)
curl "http://localhost:8000/weather/daily-aggregated?days=365"
# Statistics for the last 7 days
curl "http://localhost:8000/weather/stats?hours=168"
```
### JavaScript (Fetch)
```javascript
// Current weather data
const response = await fetch('http://localhost:8000/weather/current')
const data = await response.json()
console.log(`Temperature: ${data.temperature}°C`)
// Daily aggregation for 365 days
const yearData = await fetch('http://localhost:8000/weather/daily-aggregated?days=365')
const year = await yearData.json()
console.log(`${year.length} days available`)
```
### Python (requests)
```python
import requests
# Current data
response = requests.get('http://localhost:8000/weather/current')
data = response.json()
print(f"Temperature: {data['temperature']}°C")
# Statistics
stats = requests.get('http://localhost:8000/weather/stats?hours=24')
print(f"Average temperature: {stats.json()['avg_temperature']}°C")
```
---
## Development
### Install dependencies
```bash
pip install -r requirements.txt
```
### Start the server (development with auto-reload)
```bash
uvicorn main:app --reload --host 0.0.0.0 --port 8000
```
### Logging
The API uses Python's `logging` module. Log level: `INFO`
---
## Deployment
The API is deployed as a Docker container. See `Dockerfile` and `docker-compose.yml` in the project root.
### Build the Docker image
```bash
docker build -t wetterstation-api ./api
```
### Start the container
```bash
docker run -d \
  -p 8000:8000 \
  -e DB_HOST=db \
  -e DB_USER=wetterstation_user \
  -e DB_PASSWORD=<password> \
  wetterstation-api
```
---
## Performance tips
1. **Use the aggregated endpoints** for long-term visualizations (reduces the data volume)
2. **Use the limit parameter** to fetch only as much data as you need
3. **Use the specific endpoints** (`/weather/temperature` instead of `/weather/history` when only temperature is needed)
4. **Implement client-side caching** for historical data
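Tip 4 can be as small as a time-stamped dictionary. A sketch of a TTL cache wrapper for historical responses (names are illustrative, not part of the API):

```python
import time

_cache = {}  # url -> (timestamp, parsed response)

def cached_fetch(url, fetch, ttl_seconds=300):
    """Return a cached response for url, refreshing it once ttl_seconds expire.

    `fetch` is any callable that takes the URL and returns parsed JSON.
    """
    now = time.monotonic()
    hit = _cache.get(url)
    if hit is not None and now - hit[0] < ttl_seconds:
        return hit[1]
    data = fetch(url)
    _cache[url] = (now, data)
    return data
```

Historical and aggregated data change rarely, so even a short TTL avoids re-fetching on every chart render.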
---
## License
See the main project repository.


@@ -1,6 +1,7 @@
from fastapi import FastAPI, HTTPException, Query
from contextlib import asynccontextmanager
from fastapi import FastAPI, HTTPException, Query, Depends
from fastapi.middleware.cors import CORSMiddleware
from pydantic import BaseModel, Field
from pydantic import BaseModel, Field, ConfigDict
from typing import List, Optional
from datetime import datetime, timedelta
import os
@@ -8,6 +9,7 @@ from pathlib import Path
from dotenv import load_dotenv
import psycopg
from psycopg.rows import dict_row
from psycopg_pool import ConnectionPool
import logging
# Configure logging
@@ -28,25 +30,81 @@ DB_NAME = os.getenv('DB_NAME', 'wetterstation')
DB_USER = os.getenv('DB_USER')
DB_PASSWORD = os.getenv('DB_PASSWORD')
# --------------------------------------------------------------------------- #
# Connection Pool + Lifespan
# --------------------------------------------------------------------------- #
DB_POOL_MIN = int(os.getenv("DB_POOL_MIN", 2))
DB_POOL_MAX = int(os.getenv("DB_POOL_MAX", 10))
pool: Optional["ConnectionPool"] = None
def _build_conninfo() -> str:
return (
f"host={DB_HOST} port={DB_PORT} dbname={DB_NAME} "
f"user={DB_USER} password={DB_PASSWORD}"
)
@asynccontextmanager
async def lifespan(app: FastAPI):
global pool
if not DB_USER or not DB_PASSWORD:
raise RuntimeError("DB_USER/DB_PASSWORD nicht gesetzt")
pool = ConnectionPool(
conninfo=_build_conninfo(),
min_size=DB_POOL_MIN,
max_size=DB_POOL_MAX,
timeout=10,
kwargs={"row_factory": dict_row, "autocommit": True},
)
pool.wait()
logger.info("DB-Pool initialisiert (min=%d, max=%d)", DB_POOL_MIN, DB_POOL_MAX)
try:
yield
finally:
if pool is not None:
pool.close()
logger.info("DB-Pool geschlossen")
# Create the FastAPI app
app = FastAPI(
title="Wetterstation API",
description="API zum Auslesen von Wetterdaten",
version="1.0.0"
version="1.1.0",
lifespan=lifespan,
)
# CORS Middleware
# CORS middleware — restricted to known frontend domains.
# Additional origins can be set via the env var CORS_EXTRA_ORIGINS (comma-separated).
_default_origins = [
"https://stwwetter.fuerst-stuttgart.de",
"https://sternwarte-welzheim.de",
"http://localhost:3000",
"http://localhost:5173",
]
_extra = os.getenv("CORS_EXTRA_ORIGINS", "")
_extra_list = [o.strip() for o in _extra.split(",") if o.strip()]
ALLOWED_ORIGINS = _default_origins + _extra_list
app.add_middleware(
CORSMiddleware,
allow_origins=["*"],
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
allow_origins=ALLOWED_ORIGINS,
allow_credentials=False, # read-only API, no cookies/auth needed
allow_methods=["GET", "OPTIONS"], # the API is read-only
allow_headers=["Content-Type"],
max_age=600,
)
logger.info("CORS aktiv fuer: %s", ALLOWED_ORIGINS)
# Pydantic Models
class WeatherData(BaseModel):
model_config = ConfigDict(from_attributes=True)
id: int
datetime: datetime
temperature: Optional[float] = None
@@ -57,11 +115,9 @@ class WeatherData(BaseModel):
wind_dir: Optional[float] = None
rain: Optional[float] = None
rain_rate: Optional[float] = None
bar_trend: Optional[int] = None
received_at: datetime
class Config:
from_attributes = True
class WeatherStats(BaseModel):
avg_temperature: Optional[float] = None
@@ -81,22 +137,17 @@ class HealthResponse(BaseModel):
timestamp: datetime
# Database connection
def get_db_connection():
"""Creates a database connection"""
# Database connection — from the pool, as a FastAPI dependency.
def get_db_conn():
"""Yields a connection from the pool and automatically returns it."""
if pool is None:
raise HTTPException(status_code=503, detail="DB-Pool nicht initialisiert")
try:
conn = psycopg.connect(
host=DB_HOST,
port=DB_PORT,
dbname=DB_NAME,
user=DB_USER,
password=DB_PASSWORD,
row_factory=dict_row
)
return conn
except Exception as e:
logger.error(f"Datenbankverbindungsfehler: {e}")
raise HTTPException(status_code=500, detail="Datenbankverbindung fehlgeschlagen")
with pool.connection() as conn:
yield conn
except psycopg.Error:
logger.exception("DB-Fehler beim Pool-Zugriff")
raise HTTPException(status_code=503, detail="Datenbank nicht erreichbar")
# API Endpoints
@@ -105,7 +156,7 @@ async def root():
"""Root Endpoint"""
return {
"message": "Wetterstation API",
"version": "1.0.0",
"version": "1.1.0",
"docs": "/docs"
}
@@ -113,14 +164,15 @@ async def root():
@app.get("/health", response_model=HealthResponse, tags=["General"])
async def health_check():
"""Health Check Endpoint"""
db_status = "disconnected"
try:
conn = get_db_connection()
with conn.cursor() as cursor:
cursor.execute("SELECT 1")
conn.close()
db_status = "connected"
if pool is not None:
with pool.connection() as conn:
with conn.cursor() as cursor:
cursor.execute("SELECT 1")
db_status = "connected"
except Exception:
db_status = "disconnected"
logger.exception("Health-Check DB-Test fehlgeschlagen")
return {
"status": "ok" if db_status == "connected" else "error",
@@ -130,206 +182,408 @@ async def health_check():
@app.get("/weather/latest", response_model=WeatherData, tags=["Weather Data"])
async def get_latest_weather():
async def get_latest_weather(conn = Depends(get_db_conn)):
"""Returns the most recent weather data"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT * FROM weather_data
ORDER BY datetime DESC
LIMIT 1
""")
result = cursor.fetchone()
if not result:
raise HTTPException(status_code=404, detail="Keine Daten verfügbar")
return dict(result)
finally:
conn.close()
with conn.cursor() as cursor:
cursor.execute("""
SELECT id, datetime, temperature, humidity, pressure,
wind_speed * 1.60934 as wind_speed,
wind_gust * 1.60934 as wind_gust,
wind_dir, rain, rain_rate, bar_trend, received_at
FROM weather_data
ORDER BY datetime DESC
LIMIT 1
""")
result = cursor.fetchone()
if not result:
raise HTTPException(status_code=404, detail="Keine Daten verfügbar")
return dict(result)
@app.get("/weather/current", response_model=WeatherData, tags=["Weather Data"])
async def get_current_weather():
async def get_current_weather(conn = Depends(get_db_conn)):
"""Alias for /weather/latest - returns current weather data"""
return await get_latest_weather()
return await get_latest_weather(conn=conn)
@app.get("/weather/history", response_model=List[WeatherData], tags=["Weather Data"])
async def get_weather_history(
hours: int = Query(24, ge=1, le=168, description="Anzahl Stunden zurück (max 168 = 7 Tage)"),
limit: int = Query(1000, ge=1, le=10000, description="Maximale Anzahl Datensätze")
limit: int = Query(1000, ge=1, le=10000, description="Maximale Anzahl Datensätze"),
conn = Depends(get_db_conn),
):
"""Returns historical weather data for the last X hours"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT * FROM weather_data
WHERE datetime >= NOW() - make_interval(hours => %s)
ORDER BY datetime DESC
LIMIT %s
""", (hours, limit))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
with conn.cursor() as cursor:
cursor.execute("""
SELECT id, datetime, temperature, humidity, pressure,
wind_speed * 1.60934 as wind_speed,
wind_gust * 1.60934 as wind_gust,
wind_dir, rain, rain_rate, bar_trend, received_at
FROM weather_data
WHERE datetime >= NOW() - make_interval(hours => %s)
ORDER BY datetime DESC
LIMIT %s
""", (hours, limit))
results = cursor.fetchall()
return [dict(row) for row in results]
@app.get("/weather/range", response_model=List[WeatherData], tags=["Weather Data"])
async def get_weather_by_date_range(
start: datetime = Query(..., description="Startdatum (ISO 8601)"),
end: datetime = Query(..., description="Enddatum (ISO 8601)"),
limit: int = Query(10000, ge=1, le=50000, description="Maximale Anzahl Datensätze")
limit: int = Query(10000, ge=1, le=50000, description="Maximale Anzahl Datensätze"),
conn = Depends(get_db_conn),
):
"""Returns weather data for a specific time range"""
if start >= end:
raise HTTPException(status_code=400, detail="Startdatum muss vor Enddatum liegen")
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT * FROM weather_data
WHERE datetime BETWEEN %s AND %s
ORDER BY datetime ASC
LIMIT %s
""", (start, end, limit))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
with conn.cursor() as cursor:
cursor.execute("""
SELECT * FROM weather_data
WHERE datetime BETWEEN %s AND %s
ORDER BY datetime ASC
LIMIT %s
""", (start, end, limit))
results = cursor.fetchall()
return [dict(row) for row in results]
@app.get("/weather/stats", response_model=WeatherStats, tags=["Statistics"])
async def get_weather_statistics(
hours: int = Query(24, ge=1, le=168, description="Zeitraum in Stunden für Statistiken")
hours: int = Query(24, ge=1, le=168, description="Zeitraum in Stunden für Statistiken"),
conn = Depends(get_db_conn),
):
"""Returns aggregated statistics for the given time range"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT
AVG(temperature) as avg_temperature,
MIN(temperature) as min_temperature,
MAX(temperature) as max_temperature,
AVG(humidity) as avg_humidity,
AVG(pressure) as avg_pressure,
AVG(wind_speed) as avg_wind_speed,
MAX(wind_gust) as max_wind_gust,
SUM(rain) as total_rain,
COUNT(*) as data_points
FROM weather_data
WHERE datetime >= NOW() - make_interval(hours => %s)
""", (hours,))
result = cursor.fetchone()
if not result or result['data_points'] == 0:
raise HTTPException(status_code=404, detail="Keine Daten für den Zeitraum verfügbar")
return dict(result)
finally:
conn.close()
with conn.cursor() as cursor:
cursor.execute("""
SELECT
AVG(temperature) as avg_temperature,
MIN(temperature) as min_temperature,
MAX(temperature) as max_temperature,
AVG(humidity) as avg_humidity,
AVG(pressure) as avg_pressure,
AVG(wind_speed * 1.60934) as avg_wind_speed,
MAX(wind_gust * 1.60934) as max_wind_gust,
SUM(rain) as total_rain,
COUNT(*) as data_points
FROM weather_data
WHERE datetime >= NOW() - make_interval(hours => %s)
""", (hours,))
result = cursor.fetchone()
if not result or result['data_points'] == 0:
raise HTTPException(status_code=404, detail="Keine Daten für den Zeitraum verfügbar")
return dict(result)
@app.get("/weather/daily", response_model=List[WeatherStats], tags=["Statistics"])
async def get_daily_statistics(
days: int = Query(7, ge=1, le=90, description="Anzahl Tage zurück (max 90)")
days: int = Query(7, ge=1, le=90, description="Anzahl Tage zurück (max 90)"),
conn = Depends(get_db_conn),
):
"""Returns daily statistics for the last X days"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT
DATE(datetime) as date,
AVG(temperature) as avg_temperature,
MIN(temperature) as min_temperature,
MAX(temperature) as max_temperature,
AVG(humidity) as avg_humidity,
AVG(pressure) as avg_pressure,
AVG(wind_speed) as avg_wind_speed,
MAX(wind_gust) as max_wind_gust,
SUM(rain) as total_rain,
COUNT(*) as data_points
FROM weather_data
WHERE datetime >= NOW() - make_interval(days => %s)
GROUP BY DATE(datetime)
ORDER BY date DESC
""", (days,))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
with conn.cursor() as cursor:
cursor.execute("""
SELECT
DATE(datetime) as date,
AVG(temperature) as avg_temperature,
MIN(temperature) as min_temperature,
MAX(temperature) as max_temperature,
AVG(humidity) as avg_humidity,
AVG(pressure) as avg_pressure,
AVG(wind_speed * 1.60934) as avg_wind_speed,
MAX(wind_gust * 1.60934) as max_wind_gust,
SUM(rain) as total_rain,
COUNT(*) as data_points
FROM weather_data
WHERE datetime >= NOW() - make_interval(days => %s)
GROUP BY DATE(datetime)
ORDER BY date DESC
""", (days,))
results = cursor.fetchall()
return [dict(row) for row in results]
@app.get("/weather/temperature", response_model=List[dict], tags=["Weather Data"])
async def get_temperature_data(
hours: int = Query(24, ge=1, le=168, description="Anzahl Stunden zurück")
hours: int = Query(24, ge=1, le=168, description="Anzahl Stunden zurück"),
conn = Depends(get_db_conn),
):
"""Returns only temperature time series (optimized for charts)"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT datetime, temperature
FROM weather_data
WHERE datetime >= NOW() - make_interval(hours => %s)
AND temperature IS NOT NULL
ORDER BY datetime ASC
""", (hours,))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
with conn.cursor() as cursor:
cursor.execute("""
SELECT datetime, temperature
FROM weather_data
WHERE datetime >= NOW() - make_interval(hours => %s)
AND temperature IS NOT NULL
ORDER BY datetime ASC
""", (hours,))
results = cursor.fetchall()
return [dict(row) for row in results]
@app.get("/weather/wind", response_model=List[dict], tags=["Weather Data"])
async def get_wind_data(
hours: int = Query(24, ge=1, le=168, description="Anzahl Stunden zurück")
hours: int = Query(24, ge=1, le=168, description="Anzahl Stunden zurück"),
conn = Depends(get_db_conn),
):
"""Returns only wind data (speed, direction, gusts)"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT datetime, wind_speed, wind_gust, wind_dir
FROM weather_data
WHERE datetime >= NOW() - make_interval(hours => %s)
ORDER BY datetime ASC
""", (hours,))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
with conn.cursor() as cursor:
cursor.execute("""
SELECT datetime,
wind_speed * 1.60934 as wind_speed,
wind_gust * 1.60934 as wind_gust,
wind_dir
FROM weather_data
WHERE datetime >= NOW() - make_interval(hours => %s)
ORDER BY datetime ASC
""", (hours,))
results = cursor.fetchall()
return [dict(row) for row in results]
@app.get("/weather/rain", response_model=List[dict], tags=["Weather Data"])
async def get_rain_data(
hours: int = Query(24, ge=1, le=168, description="Anzahl Stunden zurück")
hours: int = Query(24, ge=1, le=168, description="Anzahl Stunden zurück"),
conn = Depends(get_db_conn),
):
"""Returns only rain data"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT datetime, rain, rain_rate
FROM weather_data
WHERE datetime >= NOW() - make_interval(hours => %s)
with conn.cursor() as cursor:
cursor.execute("""
SELECT datetime, rain, rain_rate
FROM weather_data
WHERE datetime >= NOW() - make_interval(hours => %s)
ORDER BY datetime ASC
""", (hours,))
results = cursor.fetchall()
return [dict(row) for row in results]
@app.get("/weather/hourly-aggregated", response_model=List[WeatherData], tags=["Aggregated Data"])
async def get_hourly_aggregated_data(
days: int = Query(7, ge=1, le=60, description="Anzahl Tage zurück (max 60)"),
conn = Depends(get_db_conn),
):
"""Returns hourly aggregated weather data (hourly means)"""
with conn.cursor() as cursor:
cursor.execute("""
SELECT
0 as id,
date_trunc('hour', datetime) as datetime,
AVG(temperature) as temperature,
ROUND(AVG(humidity)) as humidity,
AVG(pressure) as pressure,
AVG(wind_speed * 1.60934) as wind_speed,
MAX(wind_gust * 1.60934) as wind_gust,
AVG(wind_dir) as wind_dir,
AVG(rain) as rain,
AVG(rain_rate) as rain_rate,
MAX(received_at) as received_at
FROM weather_data
WHERE datetime >= NOW() - make_interval(days => %s)
GROUP BY date_trunc('hour', datetime)
ORDER BY datetime ASC
""", (days,))
results = cursor.fetchall()
return [dict(row) for row in results]
@app.get("/weather/daily-aggregated", response_model=List[dict], tags=["Aggregated Data"])
async def get_daily_aggregated_data(
days: int = Query(365, ge=1, le=730, description="Anzahl Tage zurück (max 730)"),
conn = Depends(get_db_conn),
):
"""Gibt täglich aggregierte Wetterdaten zurück (Tagesmittel mit Min/Max-Temperaturen)"""
with conn.cursor() as cursor:
cursor.execute("""
SELECT
date_trunc('day', datetime) as datetime,
AVG(temperature)::float as temperature,
MIN(temperature)::float as min_temperature,
MAX(temperature)::float as max_temperature,
ROUND(AVG(humidity))::int as humidity,
MIN(humidity)::int as min_humidity,
MAX(humidity)::int as max_humidity,
AVG(pressure)::float as pressure,
MIN(pressure)::float as min_pressure,
MAX(pressure)::float as max_pressure,
AVG(wind_speed * 1.60934)::float as wind_speed,
MAX(wind_gust * 1.60934)::float as wind_gust,
AVG(wind_dir)::float as wind_dir,
SUM(rain)::float as total_rain
FROM weather_data
WHERE datetime >= NOW() - make_interval(days => %s)
GROUP BY date_trunc('day', datetime)
ORDER BY datetime ASC
""", (hours,))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
""", (days,))
results = cursor.fetchall()
return [dict(row) for row in results]
@app.get("/weather/daily-with-minmax", response_model=List[dict], tags=["Aggregated Data"])
async def get_daily_with_minmax_data(
days: int = Query(30, ge=1, le=90, description="Anzahl Tage zurück (max 90)"),
conn = Depends(get_db_conn),
):
"""Gibt täglich aggregierte Wetterdaten mit Min/Max-Temperaturen zurück"""
with conn.cursor() as cursor:
cursor.execute("""
SELECT
date_trunc('day', datetime) as datetime,
AVG(temperature)::float as temperature,
MIN(temperature)::float as min_temperature,
MAX(temperature)::float as max_temperature,
ROUND(AVG(humidity))::int as humidity,
MIN(humidity)::int as min_humidity,
MAX(humidity)::int as max_humidity,
AVG(pressure)::float as pressure,
MIN(pressure)::float as min_pressure,
MAX(pressure)::float as max_pressure,
AVG(wind_speed * 1.60934)::float as wind_speed,
MAX(wind_gust * 1.60934)::float as wind_gust,
AVG(wind_dir)::float as wind_dir,
SUM(rain)::float as total_rain
FROM weather_data
WHERE datetime >= NOW() - make_interval(days => %s)
GROUP BY date_trunc('day', datetime)
ORDER BY datetime ASC
""", (days,))
results = cursor.fetchall()
return [dict(row) for row in results]
@app.get("/weather/rain-daily", response_model=List[dict], tags=["Aggregated Data"])
async def get_daily_rain_data(
days: int = Query(30, ge=1, le=365, description="Anzahl Tage zurück"),
conn = Depends(get_db_conn),
):
"""Gibt tägliche Regensummen zurück"""
with conn.cursor() as cursor:
cursor.execute("""
SELECT
date_trunc('day', datetime) as date,
SUM(rain) as total_rain
FROM weather_data
WHERE datetime >= NOW() - make_interval(days => %s)
GROUP BY date_trunc('day', datetime)
ORDER BY date ASC
""", (days,))
results = cursor.fetchall()
return [dict(row) for row in results]
@app.get("/weather/rain-weekly", response_model=List[dict], tags=["Aggregated Data"])
async def get_weekly_rain_data(
days: int = Query(365, ge=1, le=730, description="Anzahl Tage zurück"),
conn = Depends(get_db_conn),
):
"""Gibt wöchentliche Regensummen zurück (Woche = Mo-So)"""
with conn.cursor() as cursor:
# Bei 365 Tagen: alle verfügbaren Daten zurückgeben
if days >= 365:
cursor.execute("""
SELECT
date_trunc('week', datetime) as week_start,
SUM(rain) as total_rain
FROM weather_data
GROUP BY date_trunc('week', datetime)
ORDER BY week_start ASC
""")
else:
cursor.execute("""
SELECT
date_trunc('week', datetime) as week_start,
SUM(rain) as total_rain
FROM weather_data
WHERE datetime >= NOW() - make_interval(days => %s)
GROUP BY date_trunc('week', datetime)
ORDER BY week_start ASC
""", (days,))
results = cursor.fetchall()
return [dict(row) for row in results]
@app.get("/weather/hourly-aggregated-range", response_model=List[dict], tags=["Aggregated Data"])
async def get_hourly_aggregated_range(
start: datetime = Query(..., description="Startdatum (ISO 8601)"),
end: datetime = Query(..., description="Enddatum (ISO 8601)"),
conn = Depends(get_db_conn),
):
"""Gibt stündlich aggregierte Wetterdaten für einen bestimmten Zeitraum zurück"""
if start >= end:
raise HTTPException(status_code=400, detail="Startdatum muss vor Enddatum liegen")
with conn.cursor() as cursor:
cursor.execute("""
SELECT
date_trunc('hour', datetime) as datetime,
AVG(temperature)::float as temperature,
ROUND(AVG(humidity))::int as humidity,
AVG(pressure)::float as pressure,
AVG(wind_speed * 1.60934)::float as wind_speed,
MAX(wind_gust * 1.60934)::float as wind_gust,
AVG(wind_dir)::float as wind_dir
FROM weather_data
WHERE datetime BETWEEN %s AND %s
GROUP BY date_trunc('hour', datetime)
ORDER BY datetime ASC
""", (start, end))
results = cursor.fetchall()
return [dict(row) for row in results]
@app.get("/weather/daily-aggregated-range", response_model=List[dict], tags=["Aggregated Data"])
async def get_daily_aggregated_range(
start: datetime = Query(..., description="Startdatum (ISO 8601)"),
end: datetime = Query(..., description="Enddatum (ISO 8601)"),
conn = Depends(get_db_conn),
):
"""Gibt täglich aggregierte Wetterdaten mit Min/Max-Temperaturen für einen bestimmten Zeitraum zurück"""
if start >= end:
raise HTTPException(status_code=400, detail="Startdatum muss vor Enddatum liegen")
with conn.cursor() as cursor:
cursor.execute("""
SELECT
date_trunc('day', datetime) as datetime,
AVG(temperature)::float as temperature,
MIN(temperature)::float as min_temperature,
MAX(temperature)::float as max_temperature,
ROUND(AVG(humidity))::int as humidity,
MIN(humidity)::int as min_humidity,
MAX(humidity)::int as max_humidity,
AVG(pressure)::float as pressure,
MIN(pressure)::float as min_pressure,
MAX(pressure)::float as max_pressure,
AVG(wind_speed * 1.60934)::float as wind_speed,
MAX(wind_gust * 1.60934)::float as wind_gust,
AVG(wind_dir)::float as wind_dir,
SUM(rain)::float as total_rain
FROM weather_data
WHERE datetime BETWEEN %s AND %s
GROUP BY date_trunc('day', datetime)
ORDER BY datetime ASC
""", (start, end))
results = cursor.fetchall()
return [dict(row) for row in results]
if __name__ == "__main__":

api/main.py_org Normal file

@@ -0,0 +1,598 @@
from fastapi import FastAPI, HTTPException, Query
from fastapi.middleware.cors import CORSMiddleware
from pydantic import BaseModel, Field, ConfigDict
from typing import List, Optional
from datetime import datetime, timedelta
import os
from pathlib import Path
from dotenv import load_dotenv
import psycopg
from psycopg.rows import dict_row
import logging
# Logging konfigurieren
logging.basicConfig(
level=logging.INFO,
format='%(asctime)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger(__name__)
# Umgebungsvariablen laden
env_path = Path(__file__).parent.parent / '.env'
load_dotenv(dotenv_path=env_path)
# Datenbank-Konfiguration
DB_HOST = os.getenv('DB_HOST', 'localhost')
DB_PORT = int(os.getenv('DB_PORT', 5432))
DB_NAME = os.getenv('DB_NAME', 'wetterstation')
DB_USER = os.getenv('DB_USER')
DB_PASSWORD = os.getenv('DB_PASSWORD')
# FastAPI App erstellen
app = FastAPI(
title="Wetterstation API",
description="API zum Auslesen von Wetterdaten",
version="1.0.0"
)
# CORS Middleware — auf bekannte Frontend-Domains beschraenkt.
# Zusaetzliche Origins koennen via ENV CORS_EXTRA_ORIGINS (Komma-separiert) gesetzt werden.
_default_origins = [
"https://stwwetter.fuerst-stuttgart.de",
"https://sternwarte-welzheim.de",
"http://localhost:3000",
"http://localhost:5173",
]
_extra = os.getenv("CORS_EXTRA_ORIGINS", "")
_extra_list = [o.strip() for o in _extra.split(",") if o.strip()]
ALLOWED_ORIGINS = _default_origins + _extra_list
app.add_middleware(
CORSMiddleware,
allow_origins=ALLOWED_ORIGINS,
allow_credentials=False, # API liest nur, keine Cookies/Auth noetig
allow_methods=["GET", "OPTIONS"], # API ist read-only
allow_headers=["Content-Type"],
max_age=600,
)
logger.info("CORS aktiv fuer: %s", ALLOWED_ORIGINS)
# Pydantic Models
class WeatherData(BaseModel):
model_config = ConfigDict(from_attributes=True)
id: int
datetime: datetime
temperature: Optional[float] = None
humidity: Optional[int] = None
pressure: Optional[float] = None
wind_speed: Optional[float] = None
wind_gust: Optional[float] = None
wind_dir: Optional[float] = None
rain: Optional[float] = None
rain_rate: Optional[float] = None
bar_trend: Optional[int] = None
received_at: datetime
class WeatherStats(BaseModel):
avg_temperature: Optional[float] = None
min_temperature: Optional[float] = None
max_temperature: Optional[float] = None
avg_humidity: Optional[float] = None
avg_pressure: Optional[float] = None
avg_wind_speed: Optional[float] = None
max_wind_gust: Optional[float] = None
total_rain: Optional[float] = None
data_points: int
class HealthResponse(BaseModel):
status: str
database: str
timestamp: datetime
# Datenbankverbindung
def get_db_connection():
"""Erstellt eine Datenbankverbindung"""
try:
conn = psycopg.connect(
host=DB_HOST,
port=DB_PORT,
dbname=DB_NAME,
user=DB_USER,
password=DB_PASSWORD,
row_factory=dict_row
)
return conn
except Exception as e:
logger.error(f"Datenbankverbindungsfehler: {e}")
raise HTTPException(status_code=500, detail="Datenbankverbindung fehlgeschlagen")
# API Endpoints
@app.get("/", tags=["General"])
async def root():
"""Root Endpoint"""
return {
"message": "Wetterstation API",
"version": "1.0.0",
"docs": "/docs"
}
@app.get("/health", response_model=HealthResponse, tags=["General"])
async def health_check():
"""Health Check Endpoint"""
try:
conn = get_db_connection()
with conn.cursor() as cursor:
cursor.execute("SELECT 1")
conn.close()
db_status = "connected"
except Exception:
db_status = "disconnected"
return {
"status": "ok" if db_status == "connected" else "error",
"database": db_status,
"timestamp": datetime.now()
}
@app.get("/weather/latest", response_model=WeatherData, tags=["Weather Data"])
async def get_latest_weather():
"""Gibt die neuesten Wetterdaten zurück"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT id, datetime, temperature, humidity, pressure,
wind_speed * 1.60934 as wind_speed,
wind_gust * 1.60934 as wind_gust,
wind_dir, rain, rain_rate, bar_trend, received_at
FROM weather_data
ORDER BY datetime DESC
LIMIT 1
""")
result = cursor.fetchone()
if not result:
raise HTTPException(status_code=404, detail="Keine Daten verfügbar")
return dict(result)
finally:
conn.close()
@app.get("/weather/current", response_model=WeatherData, tags=["Weather Data"])
async def get_current_weather():
"""Alias für /weather/latest - gibt aktuelle Wetterdaten zurück"""
return await get_latest_weather()
@app.get("/weather/history", response_model=List[WeatherData], tags=["Weather Data"])
async def get_weather_history(
hours: int = Query(24, ge=1, le=168, description="Anzahl Stunden zurück (max 168 = 7 Tage)"),
limit: int = Query(1000, ge=1, le=10000, description="Maximale Anzahl Datensätze")
):
"""Gibt historische Wetterdaten der letzten X Stunden zurück"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT id, datetime, temperature, humidity, pressure,
wind_speed * 1.60934 as wind_speed,
wind_gust * 1.60934 as wind_gust,
wind_dir, rain, rain_rate, bar_trend, received_at
FROM weather_data
WHERE datetime >= NOW() - make_interval(hours => %s)
ORDER BY datetime DESC
LIMIT %s
""", (hours, limit))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
@app.get("/weather/range", response_model=List[WeatherData], tags=["Weather Data"])
async def get_weather_by_date_range(
start: datetime = Query(..., description="Startdatum (ISO 8601)"),
end: datetime = Query(..., description="Enddatum (ISO 8601)"),
limit: int = Query(10000, ge=1, le=50000, description="Maximale Anzahl Datensätze")
):
"""Gibt Wetterdaten für einen bestimmten Zeitraum zurück"""
if start >= end:
raise HTTPException(status_code=400, detail="Startdatum muss vor Enddatum liegen")
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT * FROM weather_data
WHERE datetime BETWEEN %s AND %s
ORDER BY datetime ASC
LIMIT %s
""", (start, end, limit))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
@app.get("/weather/stats", response_model=WeatherStats, tags=["Statistics"])
async def get_weather_statistics(
hours: int = Query(24, ge=1, le=168, description="Zeitraum in Stunden für Statistiken")
):
"""Gibt aggregierte Statistiken für den angegebenen Zeitraum zurück"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT
AVG(temperature) as avg_temperature,
MIN(temperature) as min_temperature,
MAX(temperature) as max_temperature,
AVG(humidity) as avg_humidity,
AVG(pressure) as avg_pressure,
AVG(wind_speed * 1.60934) as avg_wind_speed,
MAX(wind_gust * 1.60934) as max_wind_gust,
SUM(rain) as total_rain,
COUNT(*) as data_points
FROM weather_data
WHERE datetime >= NOW() - make_interval(hours => %s)
""", (hours,))
result = cursor.fetchone()
if not result or result['data_points'] == 0:
raise HTTPException(status_code=404, detail="Keine Daten für den Zeitraum verfügbar")
return dict(result)
finally:
conn.close()
@app.get("/weather/daily", response_model=List[WeatherStats], tags=["Statistics"])
async def get_daily_statistics(
days: int = Query(7, ge=1, le=90, description="Anzahl Tage zurück (max 90)")
):
"""Gibt tägliche Statistiken für die letzten X Tage zurück"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT
DATE(datetime) as date,
AVG(temperature) as avg_temperature,
MIN(temperature) as min_temperature,
MAX(temperature) as max_temperature,
AVG(humidity) as avg_humidity,
AVG(pressure) as avg_pressure,
AVG(wind_speed * 1.60934) as avg_wind_speed,
MAX(wind_gust * 1.60934) as max_wind_gust,
SUM(rain) as total_rain,
COUNT(*) as data_points
FROM weather_data
WHERE datetime >= NOW() - make_interval(days => %s)
GROUP BY DATE(datetime)
ORDER BY date DESC
""", (days,))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
@app.get("/weather/temperature", response_model=List[dict], tags=["Weather Data"])
async def get_temperature_data(
hours: int = Query(24, ge=1, le=168, description="Anzahl Stunden zurück")
):
"""Gibt nur Temperatur-Zeitreihen zurück (optimiert für Diagramme)"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT datetime, temperature
FROM weather_data
WHERE datetime >= NOW() - make_interval(hours => %s)
AND temperature IS NOT NULL
ORDER BY datetime ASC
""", (hours,))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
@app.get("/weather/wind", response_model=List[dict], tags=["Weather Data"])
async def get_wind_data(
hours: int = Query(24, ge=1, le=168, description="Anzahl Stunden zurück")
):
"""Gibt nur Wind-Daten zurück (Geschwindigkeit, Richtung, Böen)"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT datetime,
wind_speed * 1.60934 as wind_speed,
wind_gust * 1.60934 as wind_gust,
wind_dir
FROM weather_data
WHERE datetime >= NOW() - make_interval(hours => %s)
ORDER BY datetime ASC
""", (hours,))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
@app.get("/weather/rain", response_model=List[dict], tags=["Weather Data"])
async def get_rain_data(
hours: int = Query(24, ge=1, le=168, description="Anzahl Stunden zurück")
):
"""Gibt nur Regen-Daten zurück"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT datetime, rain, rain_rate
FROM weather_data
WHERE datetime >= NOW() - make_interval(hours => %s)
ORDER BY datetime ASC
""", (hours,))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
@app.get("/weather/hourly-aggregated", response_model=List[WeatherData], tags=["Aggregated Data"])
async def get_hourly_aggregated_data(
days: int = Query(7, ge=1, le=60, description="Anzahl Tage zurück (max 60)")
):
"""Gibt stündlich aggregierte Wetterdaten zurück (Stundenmittel)"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT
0 as id,
date_trunc('hour', datetime) as datetime,
AVG(temperature) as temperature,
ROUND(AVG(humidity)) as humidity,
AVG(pressure) as pressure,
AVG(wind_speed * 1.60934) as wind_speed,
MAX(wind_gust * 1.60934) as wind_gust,
AVG(wind_dir) as wind_dir,
AVG(rain) as rain,
AVG(rain_rate) as rain_rate,
MAX(received_at) as received_at
FROM weather_data
WHERE datetime >= NOW() - make_interval(days => %s)
GROUP BY date_trunc('hour', datetime)
ORDER BY datetime ASC
""", (days,))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
@app.get("/weather/daily-aggregated", response_model=List[dict], tags=["Aggregated Data"])
async def get_daily_aggregated_data(
days: int = Query(365, ge=1, le=730, description="Anzahl Tage zurück (max 730)")
):
"""Gibt täglich aggregierte Wetterdaten zurück (Tagesmittel mit Min/Max-Temperaturen)"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT
date_trunc('day', datetime) as datetime,
AVG(temperature)::float as temperature,
MIN(temperature)::float as min_temperature,
MAX(temperature)::float as max_temperature,
ROUND(AVG(humidity))::int as humidity,
MIN(humidity)::int as min_humidity,
MAX(humidity)::int as max_humidity,
AVG(pressure)::float as pressure,
MIN(pressure)::float as min_pressure,
MAX(pressure)::float as max_pressure,
AVG(wind_speed * 1.60934)::float as wind_speed,
MAX(wind_gust * 1.60934)::float as wind_gust,
AVG(wind_dir)::float as wind_dir,
SUM(rain)::float as total_rain
FROM weather_data
WHERE datetime >= NOW() - make_interval(days => %s)
GROUP BY date_trunc('day', datetime)
ORDER BY datetime ASC
""", (days,))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
@app.get("/weather/daily-with-minmax", response_model=List[dict], tags=["Aggregated Data"])
async def get_daily_with_minmax_data(
days: int = Query(30, ge=1, le=90, description="Anzahl Tage zurück (max 90)")
):
"""Gibt täglich aggregierte Wetterdaten mit Min/Max-Temperaturen zurück"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT
date_trunc('day', datetime) as datetime,
AVG(temperature)::float as temperature,
MIN(temperature)::float as min_temperature,
MAX(temperature)::float as max_temperature,
ROUND(AVG(humidity))::int as humidity,
MIN(humidity)::int as min_humidity,
MAX(humidity)::int as max_humidity,
AVG(pressure)::float as pressure,
MIN(pressure)::float as min_pressure,
MAX(pressure)::float as max_pressure,
AVG(wind_speed * 1.60934)::float as wind_speed,
MAX(wind_gust * 1.60934)::float as wind_gust,
AVG(wind_dir)::float as wind_dir,
SUM(rain)::float as total_rain
FROM weather_data
WHERE datetime >= NOW() - make_interval(days => %s)
GROUP BY date_trunc('day', datetime)
ORDER BY datetime ASC
""", (days,))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
@app.get("/weather/rain-daily", response_model=List[dict], tags=["Aggregated Data"])
async def get_daily_rain_data(
days: int = Query(30, ge=1, le=365, description="Anzahl Tage zurück")
):
"""Gibt tägliche Regensummen zurück"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT
date_trunc('day', datetime) as date,
SUM(rain) as total_rain
FROM weather_data
WHERE datetime >= NOW() - make_interval(days => %s)
GROUP BY date_trunc('day', datetime)
ORDER BY date ASC
""", (days,))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
@app.get("/weather/rain-weekly", response_model=List[dict], tags=["Aggregated Data"])
async def get_weekly_rain_data(
days: int = Query(365, ge=1, le=730, description="Anzahl Tage zurück")
):
"""Gibt wöchentliche Regensummen zurück (Woche = Mo-So)"""
conn = get_db_connection()
try:
with conn.cursor() as cursor:
# Bei 365 Tagen: alle verfügbaren Daten zurückgeben
if days >= 365:
cursor.execute("""
SELECT
date_trunc('week', datetime) as week_start,
SUM(rain) as total_rain
FROM weather_data
GROUP BY date_trunc('week', datetime)
ORDER BY week_start ASC
""")
else:
cursor.execute("""
SELECT
date_trunc('week', datetime) as week_start,
SUM(rain) as total_rain
FROM weather_data
WHERE datetime >= NOW() - make_interval(days => %s)
GROUP BY date_trunc('week', datetime)
ORDER BY week_start ASC
""", (days,))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
@app.get("/weather/hourly-aggregated-range", response_model=List[dict], tags=["Aggregated Data"])
async def get_hourly_aggregated_range(
start: datetime = Query(..., description="Startdatum (ISO 8601)"),
end: datetime = Query(..., description="Enddatum (ISO 8601)")
):
"""Gibt stündlich aggregierte Wetterdaten für einen bestimmten Zeitraum zurück"""
if start >= end:
raise HTTPException(status_code=400, detail="Startdatum muss vor Enddatum liegen")
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT
date_trunc('hour', datetime) as datetime,
AVG(temperature)::float as temperature,
ROUND(AVG(humidity))::int as humidity,
AVG(pressure)::float as pressure,
AVG(wind_speed * 1.60934)::float as wind_speed,
MAX(wind_gust * 1.60934)::float as wind_gust,
AVG(wind_dir)::float as wind_dir
FROM weather_data
WHERE datetime BETWEEN %s AND %s
GROUP BY date_trunc('hour', datetime)
ORDER BY datetime ASC
""", (start, end))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
@app.get("/weather/daily-aggregated-range", response_model=List[dict], tags=["Aggregated Data"])
async def get_daily_aggregated_range(
start: datetime = Query(..., description="Startdatum (ISO 8601)"),
end: datetime = Query(..., description="Enddatum (ISO 8601)")
):
"""Gibt täglich aggregierte Wetterdaten mit Min/Max-Temperaturen für einen bestimmten Zeitraum zurück"""
if start >= end:
raise HTTPException(status_code=400, detail="Startdatum muss vor Enddatum liegen")
conn = get_db_connection()
try:
with conn.cursor() as cursor:
cursor.execute("""
SELECT
date_trunc('day', datetime) as datetime,
AVG(temperature)::float as temperature,
MIN(temperature)::float as min_temperature,
MAX(temperature)::float as max_temperature,
ROUND(AVG(humidity))::int as humidity,
MIN(humidity)::int as min_humidity,
MAX(humidity)::int as max_humidity,
AVG(pressure)::float as pressure,
MIN(pressure)::float as min_pressure,
MAX(pressure)::float as max_pressure,
AVG(wind_speed * 1.60934)::float as wind_speed,
MAX(wind_gust * 1.60934)::float as wind_gust,
AVG(wind_dir)::float as wind_dir,
SUM(rain)::float as total_rain
FROM weather_data
WHERE datetime BETWEEN %s AND %s
GROUP BY date_trunc('day', datetime)
ORDER BY datetime ASC
""", (start, end))
results = cursor.fetchall()
return [dict(row) for row in results]
finally:
conn.close()
if __name__ == "__main__":
import uvicorn
uvicorn.run(app, host="0.0.0.0", port=8000)


@@ -1,5 +1,6 @@
fastapi==0.115.5
uvicorn[standard]==0.34.0
psycopg[binary]==3.2.3
psycopg_pool==3.2.4
python-dotenv==1.0.1
pydantic==2.10.3
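With the API running and its dependencies installed, the read endpoints can be smoke-tested from the standard library alone; a minimal sketch (the base URL and port are assumptions for a local dev server, and `history_url` is an illustrative helper, not part of the repo):

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

BASE = "http://localhost:8000"  # assumption: uvicorn default from main.py

def history_url(hours: int = 24, limit: int = 1000) -> str:
    """Build the /weather/history query URL with its documented parameters."""
    return f"{BASE}/weather/history?" + urlencode({"hours": hours, "limit": limit})

def fetch_history(hours: int = 24, limit: int = 1000) -> list:
    """Fetch and decode the JSON list of WeatherData rows (needs a running API)."""
    with urlopen(history_url(hours, limit)) as resp:
        return json.loads(resp.read())
```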


@@ -1,3 +1,5 @@
# syntax=docker/dockerfile:1
FROM python:3.13-slim
WORKDIR /app


@@ -1,204 +1,635 @@
# HTTP API that receives weather data via POST and stores in PostgreSQL
import os
import json
import logging
import ssl
import secrets
from contextlib import asynccontextmanager
from datetime import datetime
from pathlib import Path
from typing import Optional
from dotenv import load_dotenv
from fastapi import FastAPI, HTTPException, Request, Header, Depends, status
from fastapi.exceptions import RequestValidationError
from fastapi.responses import JSONResponse
from pydantic import BaseModel, ConfigDict, field_validator
import psycopg
from psycopg_pool import ConnectionPool
from slowapi import Limiter
from slowapi.errors import RateLimitExceeded
from slowapi.util import get_remote_address
import uvicorn
# --------------------------------------------------------------------------- #
# Konfiguration
# --------------------------------------------------------------------------- #
env_path = Path(__file__).parent.parent / ".env"
load_dotenv(dotenv_path=env_path)
COLLECTOR_PORT = int(os.getenv("COLLECTOR_PORT", 8001))
DB_HOST = os.getenv("DB_HOST", "localhost")
DB_PORT = int(os.getenv("DB_PORT", 5432))
DB_NAME = os.getenv("DB_NAME", "wetterstation")
DB_USER = os.getenv("DB_USER")
DB_PASSWORD = os.getenv("DB_PASSWORD")
# Sicherheit
COLLECTOR_API_KEY = os.getenv("COLLECTOR_API_KEY")
ENVIRONMENT = os.getenv("ENVIRONMENT", "production").lower()
IS_DEV = ENVIRONMENT in ("dev", "development", "local")
# Limits
MAX_BODY_BYTES = int(os.getenv("COLLECTOR_MAX_BODY_BYTES", 16 * 1024)) # 16 KiB
RATE_LIMIT = os.getenv("COLLECTOR_RATE_LIMIT", "30/minute")
# Logging — keine Rohdaten auf INFO mehr
logging.basicConfig(
level=logging.DEBUG if IS_DEV else logging.INFO,
format="%(asctime)s - %(levelname)s - %(message)s",
)
logger = logging.getLogger(__name__)
# --------------------------------------------------------------------------- #
# Connection Pool
# --------------------------------------------------------------------------- #
def _build_conninfo() -> str:
return (
f"host={DB_HOST} port={DB_PORT} dbname={DB_NAME} "
f"user={DB_USER} password={DB_PASSWORD}"
)
pool: Optional[ConnectionPool] = None
# --------------------------------------------------------------------------- #
# Rate-Limiter
# --------------------------------------------------------------------------- #
def _limit_key(request: Request) -> str:
"""Rate-Limit-Key: bei API-Key danach, sonst nach IP.
Hinter Traefik nutzt slowapi standardmaessig die Peer-IP, was der
Proxy-IP entspricht. Wenn ein API-Key da ist, bevorzugen wir den.
"""
api_key = request.headers.get("x-api-key")
if api_key:
# nur Praefix einsetzen, damit der volle Key nicht in Logs landet
return f"key:{api_key[:8]}"
fwd = request.headers.get("x-forwarded-for")
if fwd:
return f"ip:{fwd.split(',')[0].strip()}"
return f"ip:{get_remote_address(request)}"
limiter = Limiter(key_func=_limit_key, default_limits=[])
# --------------------------------------------------------------------------- #
# Auth-Dependency
# --------------------------------------------------------------------------- #
async def require_api_key(x_api_key: Optional[str] = Header(default=None)) -> None:
"""Prueft den API-Key timing-safe gegen die Konfiguration."""
if not COLLECTOR_API_KEY:
# Fail-closed: wenn kein Key konfiguriert ist, ist die API gesperrt.
logger.error(
"COLLECTOR_API_KEY ist nicht gesetzt - alle Schreibzugriffe blockiert."
)
raise HTTPException(
status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
detail="Service not configured",
)
if not x_api_key or not secrets.compare_digest(x_api_key, COLLECTOR_API_KEY):
raise HTTPException(
status_code=status.HTTP_401_UNAUTHORIZED,
detail="Invalid or missing API key",
)
# --------------------------------------------------------------------------- #
# Pydantic Models mit Plausibilitaetspruefung
# --------------------------------------------------------------------------- #
class WeatherDataInput(BaseModel):
# extra-Felder verwerfen statt akzeptieren -> kein Pollution
model_config = ConfigDict(extra="ignore")
# Zeitstempel
time: Optional[str] = None
datetime: Optional[str] = None
dateTime: Optional[int] = None
# Aussentemperatur
tempOut: Optional[float] = None # Celsius (neu)
temperature: Optional[float] = None # Celsius
outTemp: Optional[float] = None # Fahrenheit (alt)
# Innentemperatur
tempIn: Optional[float] = None # Celsius
# Aussenfeuchte
humOut: Optional[int] = None
humidity: Optional[int] = None
outHumidity: Optional[float] = None
# Innenfeuchte
humIn: Optional[int] = None
# Luftdruck
pressure: Optional[float] = None # hPa
barometer: Optional[float] = None # inHg
barTrend: Optional[int] = None
# Wind
windAvg: Optional[float] = None
windSpeed: Optional[float] = None
wind_speed: Optional[float] = None
windGust: Optional[float] = None
wind_gust: Optional[float] = None
windDir: Optional[float] = None
wind_dir: Optional[float] = None
# Niederschlag
rain: Optional[float] = None
rainRate: Optional[float] = None
rain_rate: Optional[float] = None
# Vorhersage
forecast: Optional[int] = None
# ---- Validatoren -----------------------------------------------------
@field_validator("tempOut", "temperature", "tempIn")
@classmethod
def _temp_celsius_range(cls, v: Optional[float]) -> Optional[float]:
if v is not None and not (-90.0 <= v <= 70.0):
raise ValueError("temperature out of plausible range (Celsius)")
return v
@field_validator("outTemp")
@classmethod
def _temp_fahrenheit_range(cls, v: Optional[float]) -> Optional[float]:
if v is not None and not (-130.0 <= v <= 160.0):
raise ValueError("outTemp out of plausible range (Fahrenheit)")
return v
@field_validator("humOut", "humidity", "humIn")
@classmethod
def _humidity_int_range(cls, v: Optional[int]) -> Optional[int]:
if v is not None and not (0 <= v <= 100):
raise ValueError("humidity out of range")
return v
@field_validator("outHumidity")
@classmethod
def _humidity_float_range(cls, v: Optional[float]) -> Optional[float]:
if v is not None and not (0.0 <= v <= 100.0):
raise ValueError("outHumidity out of range")
return v
@field_validator("pressure")
@classmethod
def _pressure_hpa_range(cls, v: Optional[float]) -> Optional[float]:
if v is not None and not (800.0 <= v <= 1100.0):
raise ValueError("pressure (hPa) out of plausible range")
return v
@field_validator("barometer")
@classmethod
def _pressure_inhg_range(cls, v: Optional[float]) -> Optional[float]:
if v is not None and not (23.0 <= v <= 32.5):
raise ValueError("barometer (inHg) out of plausible range")
return v
@field_validator("windAvg", "windSpeed", "wind_speed", "windGust", "wind_gust")
@classmethod
def _wind_speed_range(cls, v: Optional[float]) -> Optional[float]:
if v is not None and not (0.0 <= v <= 120.0):
raise ValueError("wind speed out of plausible range")
return v
@field_validator("windDir", "wind_dir")
@classmethod
def _wind_dir_range(cls, v: Optional[float]) -> Optional[float]:
if v is not None and not (0.0 <= v <= 360.0):
raise ValueError("wind_dir out of range")
return v
@field_validator("rain", "rainRate", "rain_rate")
@classmethod
def _rain_range(cls, v: Optional[float]) -> Optional[float]:
if v is not None and not (0.0 <= v <= 1000.0):
raise ValueError("rain value out of plausible range")
return v
# ---- Konvertierungen -------------------------------------------------
def get_datetime_string(self) -> str:
if self.time:
return self.time
if self.datetime:
return self.datetime
if self.dateTime is not None:
# Plausibilitaet: 2000-01-01 .. 2100-01-01
if not (946684800 <= self.dateTime <= 4102444800):
raise ValueError("dateTime timestamp out of plausible range")
# Hinweis: fromtimestamp nutzt die lokale Zeitzone des Prozesses
return datetime.fromtimestamp(self.dateTime).strftime("%Y-%m-%d %H:%M:%S")
raise ValueError("Kein Zeitstempel vorhanden (time, datetime oder dateTime)")
def get_temperature_celsius(self) -> Optional[float]:
if self.tempOut is not None:
return self.tempOut
if self.temperature is not None:
return self.temperature
if self.outTemp is not None:
return (self.outTemp - 32) * 5 / 9
return None
def get_temp_in(self) -> Optional[float]:
return self.tempIn
def get_humidity_int(self) -> Optional[int]:
if self.humOut is not None:
return int(self.humOut)
if self.humidity is not None:
return int(self.humidity)
if self.outHumidity is not None:
return int(self.outHumidity)
return None
def get_humidity_in(self) -> Optional[int]:
return int(self.humIn) if self.humIn is not None else None
def get_pressure_hpa(self) -> Optional[float]:
if self.pressure is not None:
return self.pressure
if self.barometer is not None:
return self.barometer * 33.8639
return None
def get_wind_speed(self) -> Optional[float]:
if self.windAvg is not None:
return self.windAvg
if self.windSpeed is not None:
return self.windSpeed
return self.wind_speed
def get_wind_gust(self) -> Optional[float]:
return self.windGust if self.windGust is not None else self.wind_gust
def get_wind_dir(self) -> Optional[float]:
return self.windDir if self.windDir is not None else self.wind_dir
def get_rain_rate(self) -> Optional[float]:
return self.rainRate if self.rainRate is not None else self.rain_rate
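The getters above normalize the station's mixed units (Fahrenheit vs. Celsius, inHg vs. hPa). A minimal stand-alone sketch of the same two conversions, using the identical factors as `get_temperature_celsius` and `get_pressure_hpa`:

```python
def fahrenheit_to_celsius(f: float) -> float:
    # outTemp arrives in Fahrenheit from the old station format
    return (f - 32) * 5 / 9

def inhg_to_hpa(inhg: float) -> float:
    # 1 inHg = 33.8639 hPa, same factor as get_pressure_hpa()
    return inhg * 33.8639

print(round(fahrenheit_to_celsius(68.0), 1))  # 20.0
print(round(inhg_to_hpa(29.92), 1))           # 1013.2
```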
# --------------------------------------------------------------------------- #
# Datenbank-Setup
# --------------------------------------------------------------------------- #
def setup_database() -> None:
"""Tabelle, fehlende Spalten und Index anlegen (idempotent)."""
assert pool is not None
with pool.connection() as conn:
with conn.cursor() as cursor:
cursor.execute(
"""
CREATE TABLE IF NOT EXISTS weather_data (
id SERIAL PRIMARY KEY,
datetime TIMESTAMPTZ NOT NULL,
temperature FLOAT,
humidity INTEGER,
pressure FLOAT,
wind_speed FLOAT,
wind_gust FLOAT,
wind_dir FLOAT,
rain FLOAT,
rain_rate FLOAT,
received_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
UNIQUE(datetime)
)
"""
)
cursor.execute(
"ALTER TABLE weather_data ADD COLUMN IF NOT EXISTS temp_in FLOAT"
)
cursor.execute(
"ALTER TABLE weather_data ADD COLUMN IF NOT EXISTS humidity_in INTEGER"
)
cursor.execute(
"ALTER TABLE weather_data ADD COLUMN IF NOT EXISTS forecast INTEGER"
)
cursor.execute(
"ALTER TABLE weather_data ADD COLUMN IF NOT EXISTS bar_trend INTEGER"
)
cursor.execute(
"CREATE INDEX IF NOT EXISTS idx_weather_datetime_desc "
"ON weather_data (datetime DESC)"
)
conn.commit()
logger.info("Tabelle weather_data und Index bereit")
# --------------------------------------------------------------------------- #
# FastAPI Lifespan
# --------------------------------------------------------------------------- #
@asynccontextmanager
async def lifespan(app: FastAPI):
global pool
# Pflicht-Variablen pruefen — fail fast
missing = [v for v in ("DB_USER", "DB_PASSWORD") if not os.getenv(v)]
if missing:
raise RuntimeError(
f"Fehlende Umgebungsvariablen: {', '.join(missing)}"
)
if not COLLECTOR_API_KEY:
raise RuntimeError(
"COLLECTOR_API_KEY ist nicht gesetzt. "
"Mindestens 32 Zeichen empfohlen (z.B. via 'openssl rand -hex 32')."
)
if len(COLLECTOR_API_KEY) < 16:
raise RuntimeError(
"COLLECTOR_API_KEY ist zu kurz (Minimum 16 Zeichen)."
)
pool = ConnectionPool(
conninfo=_build_conninfo(),
min_size=1,
max_size=5,
timeout=10,
kwargs={"autocommit": False},
)
pool.wait()
logger.info("Connection Pool initialisiert (min=1, max=5)")
setup_database()
logger.info("Collector laeuft auf Port %d (env=%s)", COLLECTOR_PORT, ENVIRONMENT)
try:
yield
finally:
if pool is not None:
pool.close()
logger.info("Connection Pool geschlossen")
# --------------------------------------------------------------------------- #
# FastAPI App
# --------------------------------------------------------------------------- #
app = FastAPI(
title="Weather Data Collector API",
docs_url="/docs" if IS_DEV else None,
redoc_url=None,
openapi_url="/openapi.json" if IS_DEV else None,
lifespan=lifespan,
)
# Rate-Limiter an die App binden
app.state.limiter = limiter
@app.exception_handler(RateLimitExceeded)
async def _rate_limit_handler(request: Request, exc: RateLimitExceeded):
return JSONResponse(
status_code=status.HTTP_429_TOO_MANY_REQUESTS,
content={"detail": "Too many requests"},
)
@app.exception_handler(RequestValidationError)
async def _validation_handler(request: Request, exc: RequestValidationError):
# Details ins Log, generische Antwort an den Client.
logger.warning("Validation error on %s: %s", request.url.path, exc.errors())
return JSONResponse(
status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
content={"detail": "Validation error"},
)
@app.exception_handler(Exception)
async def _unhandled_handler(request: Request, exc: Exception):
# NIE Stacktraces oder str(exc) an den Client zurueckgeben.
logger.exception("Unhandled error on %s", request.url.path)
return JSONResponse(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
content={"detail": "Internal server error"},
)
# --------------------------------------------------------------------------- #
# Body-Size-Middleware
# --------------------------------------------------------------------------- #
@app.middleware("http")
async def _limit_body_size(request: Request, call_next):
if request.method in ("POST", "PUT", "PATCH"):
cl = request.headers.get("content-length")
if cl is not None:
try:
if int(cl) > MAX_BODY_BYTES:
return JSONResponse(
status_code=status.HTTP_413_REQUEST_ENTITY_TOO_LARGE,
content={"detail": "Payload too large"},
)
except ValueError:
return JSONResponse(
status_code=status.HTTP_400_BAD_REQUEST,
content={"detail": "Invalid Content-Length"},
)
return await call_next(request)
# --------------------------------------------------------------------------- #
# Endpoints
# --------------------------------------------------------------------------- #
@app.get("/")
async def root():
"""Info-Endpunkt (kein Auth noetig)."""
return {
"service": "Weather Data Collector",
"version": "2.0.0",
"endpoint": "POST /weather (X-API-Key required)",
}
@app.get("/health")
async def health_check():
"""Health-Check ohne Auth, aber ohne sensitive Details."""
try:
assert pool is not None
with pool.connection() as conn:
with conn.cursor() as cursor:
cursor.execute("SELECT 1")
return {"status": "healthy"}
except Exception:
logger.exception("Health-Check fehlgeschlagen")
raise HTTPException(
status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
detail="Service unavailable",
)
def _store_weather(data: WeatherDataInput) -> dict:
"""Schreibt einen Datenpunkt; setzt voraus, dass `data` validiert ist."""
assert pool is not None
dt_string = data.get_datetime_string()
values = (
dt_string,
data.get_temperature_celsius(),
data.get_temp_in(),
data.get_humidity_int(),
data.get_humidity_in(),
data.get_pressure_hpa(),
data.barTrend,
data.get_wind_speed(),
data.get_wind_gust(),
data.get_wind_dir(),
data.rain,
data.get_rain_rate(),
data.forecast,
)
with pool.connection() as conn:
with conn.cursor() as cursor:
cursor.execute(
"""
INSERT INTO weather_data
(datetime, temperature, temp_in, humidity, humidity_in,
pressure, bar_trend, wind_speed, wind_gust, wind_dir,
rain, rain_rate, forecast)
VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
ON CONFLICT (datetime) DO UPDATE SET
temperature = EXCLUDED.temperature,
temp_in = EXCLUDED.temp_in,
humidity = EXCLUDED.humidity,
humidity_in = EXCLUDED.humidity_in,
pressure = EXCLUDED.pressure,
bar_trend = EXCLUDED.bar_trend,
wind_speed = EXCLUDED.wind_speed,
wind_gust = EXCLUDED.wind_gust,
wind_dir = EXCLUDED.wind_dir,
rain = EXCLUDED.rain,
rain_rate = EXCLUDED.rain_rate,
forecast = EXCLUDED.forecast
""",
values,
)
conn.commit()
logger.info("Datenpunkt gespeichert fuer %s", dt_string)
return {"status": "success", "datetime": dt_string}
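The `ON CONFLICT (datetime) DO UPDATE` upsert used by `_store_weather` can be exercised in isolation. The sketch below uses SQLite (which supports the same `DO UPDATE` syntax since 3.24) as a stand-in for PostgreSQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE weather_data (datetime TEXT PRIMARY KEY, temperature REAL)")
sql = (
    "INSERT INTO weather_data (datetime, temperature) VALUES (?, ?) "
    "ON CONFLICT (datetime) DO UPDATE SET temperature = excluded.temperature"
)
conn.execute(sql, ("2026-04-24 12:00:00", 3.5))
conn.execute(sql, ("2026-04-24 12:00:00", 4.0))  # gleicher Zeitstempel: Update statt Duplikat
rows = conn.execute("SELECT datetime, temperature FROM weather_data").fetchall()
print(rows)  # [('2026-04-24 12:00:00', 4.0)]
```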
@app.post("/weather", dependencies=[Depends(require_api_key)])
@limiter.limit(RATE_LIMIT)
async def receive_weather_data(request: Request, data: WeatherDataInput):
"""Wetterdaten empfangen und speichern (Auth + Rate-Limit)."""
try:
return _store_weather(data)
except ValueError as e:
# Konvertierungs-Fehler (z.B. fehlender Zeitstempel)
logger.warning("Bad request on /weather: %s", e)
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Invalid input",
)
except psycopg.Error:
logger.exception("DB-Fehler beim Speichern")
raise HTTPException(
status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
detail="Storage unavailable",
)
@app.post("/", dependencies=[Depends(require_api_key)])
@limiter.limit(RATE_LIMIT)
async def root_post(request: Request):
"""Alias fuer POST /weather (Auth + Rate-Limit)."""
body = await request.body()
if len(body) > MAX_BODY_BYTES:
raise HTTPException(
status_code=status.HTTP_413_REQUEST_ENTITY_TOO_LARGE,
detail="Payload too large",
)
try:
data_dict = json.loads(body.decode("utf-8"))
except (UnicodeDecodeError, json.JSONDecodeError):
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Invalid JSON",
)
try:
data = WeatherDataInput(**data_dict)
except Exception:
# Pydantic-Fehler enthalten ggf. Werte aus dem Body — nicht durchreichen.
logger.warning("Validation failed on POST /")
raise HTTPException(
status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
detail="Validation error",
)
try:
return _store_weather(data)
except ValueError:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Invalid input",
)
except psycopg.Error:
logger.exception("DB-Fehler beim Speichern")
raise HTTPException(
status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
detail="Storage unavailable",
)
# Debug-Endpunkt nur in DEV-Modus + nur mit API-Key
if IS_DEV:
@app.post("/debug", dependencies=[Depends(require_api_key)])
async def debug_post(request: Request):
body = await request.body()
if len(body) > MAX_BODY_BYTES:
raise HTTPException(
status_code=status.HTTP_413_REQUEST_ENTITY_TOO_LARGE,
detail="Payload too large",
)
try:
payload = json.loads(body.decode("utf-8"))
except Exception:
raise HTTPException(status_code=400, detail="Invalid JSON")
logger.debug("Debug payload: %s", payload)
return {"status": "logged"}
# --------------------------------------------------------------------------- #
# Entry-Point
# --------------------------------------------------------------------------- #
def main() -> None:
uvicorn.run(
app,
host="0.0.0.0",
port=COLLECTOR_PORT,
# In Produktion liegt Traefik davor und terminiert TLS.
# X-Forwarded-* nur dann auswerten, wenn man dem Proxy vertraut.
proxy_headers=True,
forwarded_allow_ips="*",
)
if __name__ == "__main__":
main()
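A station client posts JSON to the collector with the `X-API-Key` header. A hedged stdlib-only sketch — URL, key, and field values are placeholders, not the deployed configuration:

```python
import json
import urllib.request

payload = {"dateTime": 1767225600, "tempOut": 4.2, "humOut": 87, "pressure": 1013.2}
req = urllib.request.Request(
    "http://localhost:8001/weather",  # Platzhalter-URL
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json", "X-API-Key": "change-me"},
)
# urllib.request.urlopen(req, timeout=10)  # setzt einen laufenden Collector voraus
print(req.get_method())  # POST
```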

collector/main.py_old Normal file

@@ -0,0 +1,364 @@
# HTTP API that receives weather data via POST and stores in PostgreSQL
import os
import json
import logging
from datetime import datetime
from pathlib import Path
from dotenv import load_dotenv
from fastapi import FastAPI, HTTPException, Request
from pydantic import BaseModel
import psycopg2
from psycopg2.extras import RealDictCursor
import uvicorn
# Logging konfigurieren
logging.basicConfig(
level=logging.INFO,
format='%(asctime)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger(__name__)
# Umgebungsvariablen laden - eine Ebene höher
env_path = Path(__file__).parent.parent / '.env'
load_dotenv(dotenv_path=env_path)
# Konfiguration
COLLECTOR_PORT = int(os.getenv('COLLECTOR_PORT', 8001))
DB_HOST = os.getenv('DB_HOST', 'localhost')
DB_PORT = int(os.getenv('DB_PORT', 5432))
DB_NAME = os.getenv('DB_NAME', 'wetterstation')
DB_USER = os.getenv('DB_USER')
DB_PASSWORD = os.getenv('DB_PASSWORD')
# FastAPI App
app = FastAPI(title="Weather Data Collector API")
# Pydantic Models
class WeatherDataInput(BaseModel):
# Zeitstempel: ISO-String (time), datetime-String oder Unix-Timestamp
time: str | None = None
datetime: str | None = None
dateTime: int | None = None
# Außentemperatur (Celsius): tempOut, temperature oder outTemp (Fahrenheit)
tempOut: float | None = None # Celsius (neues Format)
temperature: float | None = None
outTemp: float | None = None # Fahrenheit (altes Format)
# Innentemperatur
tempIn: float | None = None # Celsius
# Außenfeuchte
humOut: int | None = None
humidity: int | None = None
outHumidity: float | None = None
# Innenfeuchte
humIn: int | None = None
# Luftdruck
pressure: float | None = None
barometer: float | None = None # inHg
barTrend: int | None = None # hPa/Stunde
# Wind
windAvg: float | None = None # m/s Durchschnitt (neues Format)
windSpeed: float | None = None
wind_speed: float | None = None
windGust: float | None = None
wind_gust: float | None = None
windDir: float | None = None
wind_dir: float | None = None
# Niederschlag
rain: float | None = None
rainRate: float | None = None
rain_rate: float | None = None
# Vorhersage
forecast: int | None = None
model_config = {"extra": "allow"}
def get_datetime_string(self) -> str:
"""Zeitstempel als String zurückgeben"""
if self.time:
return self.time
elif self.datetime:
return self.datetime
elif self.dateTime:
from datetime import datetime as dt
return dt.fromtimestamp(self.dateTime).strftime('%Y-%m-%d %H:%M:%S')
raise ValueError("Kein Zeitstempel vorhanden (time, datetime oder dateTime)")
def get_temperature_celsius(self) -> float | None:
"""Außentemperatur in Celsius"""
if self.tempOut is not None:
return self.tempOut
elif self.temperature is not None:
return self.temperature
elif self.outTemp is not None:
return (self.outTemp - 32) * 5 / 9
return None
def get_temp_in(self) -> float | None:
"""Innentemperatur in Celsius"""
return self.tempIn
def get_humidity_int(self) -> int | None:
"""Außenfeuchte"""
if self.humOut is not None:
return int(self.humOut)
elif self.humidity is not None:
return int(self.humidity)
elif self.outHumidity is not None:
return int(self.outHumidity)
return None
def get_humidity_in(self) -> int | None:
"""Innenfeuchte"""
return int(self.humIn) if self.humIn is not None else None
def get_pressure_hpa(self) -> float | None:
"""Luftdruck in hPa"""
if self.pressure is not None:
return self.pressure
elif self.barometer is not None:
return self.barometer * 33.8639
return None
def get_wind_speed(self) -> float | None:
"""Durchschnittliche Windgeschwindigkeit"""
if self.windAvg is not None:
return self.windAvg
elif self.windSpeed is not None:
return self.windSpeed
return self.wind_speed
def get_wind_gust(self) -> float | None:
"""Windböe"""
return self.windGust if self.windGust is not None else self.wind_gust
def get_wind_dir(self) -> float | None:
"""Windrichtung"""
return self.windDir if self.windDir is not None else self.wind_dir
def get_rain_rate(self) -> float | None:
"""Regenrate"""
return self.rainRate if self.rainRate is not None else self.rain_rate
# Datenbankverbindung
def get_db_connection():
"""Datenbankverbindung herstellen"""
try:
conn = psycopg2.connect(
host=DB_HOST,
port=DB_PORT,
database=DB_NAME,
user=DB_USER,
password=DB_PASSWORD
)
return conn
except Exception as e:
logger.error(f"Datenbankverbindungsfehler: {e}")
raise HTTPException(status_code=500, detail="Datenbankverbindung fehlgeschlagen")
def setup_database():
"""Tabelle erstellen und fehlende Spalten ergänzen"""
try:
conn = get_db_connection()
with conn.cursor() as cursor:
cursor.execute("""
CREATE TABLE IF NOT EXISTS weather_data (
id SERIAL PRIMARY KEY,
datetime TIMESTAMPTZ NOT NULL,
temperature FLOAT,
humidity INTEGER,
pressure FLOAT,
wind_speed FLOAT,
wind_gust FLOAT,
wind_dir FLOAT,
rain FLOAT,
rain_rate FLOAT,
received_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
UNIQUE(datetime)
)
""")
# Neue Spalten ergänzen (idempotent)
cursor.execute("ALTER TABLE weather_data ADD COLUMN IF NOT EXISTS temp_in FLOAT")
cursor.execute("ALTER TABLE weather_data ADD COLUMN IF NOT EXISTS humidity_in INTEGER")
cursor.execute("ALTER TABLE weather_data ADD COLUMN IF NOT EXISTS forecast INTEGER")
cursor.execute("ALTER TABLE weather_data ADD COLUMN IF NOT EXISTS bar_trend INTEGER")
conn.commit()
logger.info("Tabelle weather_data bereit (inkl. neuer Spalten)")
conn.close()
except Exception as e:
logger.error(f"Fehler bei Datenbanksetup: {e}")
raise
# API Endpoints
@app.on_event("startup")
async def startup_event():
"""Bei Start die Datenbank initialisieren"""
logger.info("Collector API startet...")
setup_database()
logger.info(f"API läuft auf Port {COLLECTOR_PORT}")
@app.get("/")
async def root():
"""Root Endpoint - GET zeigt Info"""
return {
"message": "Weather Data Collector API",
"version": "1.0.0",
"endpoint": "POST /weather or POST /"
}
@app.post("/")
async def root_post(request: Request):
"""Root Endpoint - POST akzeptiert Wetterdaten (Alias für /weather)"""
try:
# Rohen Body lesen
body = await request.body()
body_str = body.decode('utf-8')
logger.info(f"POST auf Root - Raw Body: {body_str}")
# Als JSON parsen
data_dict = json.loads(body_str)
logger.info(f"POST auf Root - Parsed JSON: {data_dict}")
# Zu Pydantic Model konvertieren
data = WeatherDataInput(**data_dict)
return await receive_weather_data(data)
except json.JSONDecodeError as e:
logger.error(f"JSON Parse Error: {e}")
raise HTTPException(status_code=400, detail=f"Invalid JSON: {str(e)}")
except Exception as e:
logger.error(f"Fehler bei Root POST: {e}")
raise HTTPException(status_code=422, detail=f"Validation error: {str(e)}")
@app.post("/debug")
async def debug_post(request: dict):
"""Debug Endpoint - akzeptiert beliebige JSON und loggt sie"""
logger.info(f"Debug: Empfangene Rohdaten: {request}")
return {"status": "logged", "data": request}
@app.get("/health")
async def health_check():
"""Health Check"""
try:
conn = get_db_connection()
with conn.cursor() as cursor:
cursor.execute("SELECT 1")
conn.close()
return {"status": "healthy", "database": "connected"}
except Exception as e:
raise HTTPException(status_code=503, detail=f"Database error: {str(e)}")
@app.post("/weather")
async def receive_weather_data(data: WeatherDataInput):
"""Wetterdaten empfangen und speichern"""
logger.info(f"Empfangene Daten: {data.model_dump()}")
try:
conn = get_db_connection()
try:
# Konvertiere zu den richtigen Werten
dt_string = data.get_datetime_string()
temp_c = data.get_temperature_celsius()
temp_in = data.get_temp_in()
humidity = data.get_humidity_int()
humidity_in = data.get_humidity_in()
pressure = data.get_pressure_hpa()
bar_trend = data.barTrend
wind_speed = data.get_wind_speed()
wind_gust = data.get_wind_gust()
wind_dir = data.get_wind_dir()
rain = data.rain
rain_rate = data.get_rain_rate()
forecast = data.forecast
logger.info(
f"Konvertierte Daten - datetime: {dt_string}, "
f"tempOut: {temp_c}°C, tempIn: {temp_in}°C, "
f"humOut: {humidity}%, humIn: {humidity_in}%, "
f"pressure: {pressure} hPa, barTrend: {bar_trend}"
)
with conn.cursor() as cursor:
cursor.execute("""
INSERT INTO weather_data
(datetime, temperature, temp_in, humidity, humidity_in,
pressure, bar_trend, wind_speed, wind_gust, wind_dir,
rain, rain_rate, forecast)
VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
ON CONFLICT (datetime) DO UPDATE SET
temperature = EXCLUDED.temperature,
temp_in = EXCLUDED.temp_in,
humidity = EXCLUDED.humidity,
humidity_in = EXCLUDED.humidity_in,
pressure = EXCLUDED.pressure,
bar_trend = EXCLUDED.bar_trend,
wind_speed = EXCLUDED.wind_speed,
wind_gust = EXCLUDED.wind_gust,
wind_dir = EXCLUDED.wind_dir,
rain = EXCLUDED.rain,
rain_rate = EXCLUDED.rain_rate,
forecast = EXCLUDED.forecast
""", (
dt_string,
temp_c,
temp_in,
humidity,
humidity_in,
pressure,
bar_trend,
wind_speed,
wind_gust,
wind_dir,
rain,
rain_rate,
forecast
))
conn.commit()
logger.info(f"Daten gespeichert für {dt_string} (UTC)")
return {
"status": "success",
"message": f"Weather data for {dt_string} saved successfully"
}
finally:
conn.close()
except Exception as e:
logger.error(f"Fehler beim Speichern: {e}")
raise HTTPException(status_code=500, detail=f"Database error: {str(e)}")
def main():
"""Hauptfunktion"""
# Prüfen ob alle nötigen Umgebungsvariablen gesetzt sind
required_vars = ['DB_USER', 'DB_PASSWORD']
missing_vars = [var for var in required_vars if not os.getenv(var)]
if missing_vars:
logger.error(f"Fehlende Umgebungsvariablen: {', '.join(missing_vars)}")
logger.error("Bitte .env Datei mit den erforderlichen Werten erstellen")
return
uvicorn.run(app, host="0.0.0.0", port=COLLECTOR_PORT)
if __name__ == "__main__":
main()


@@ -1,3 +1,6 @@
fastapi==0.115.5
uvicorn[standard]==0.34.0
psycopg[binary]==3.2.3
psycopg_pool==3.2.4
python-dotenv==1.0.1
slowapi==0.1.9

BIN data/365.png Normal file (170 KiB)

BIN data/wview-archive.sdb Normal file

deploy.sh Executable file

@@ -0,0 +1,89 @@
#!/bin/bash
# Deploy-Script für wetterstation
# Baut die Docker-Images und lädt sie zu docker.citysensor.de hoch
set -e
# Konfiguration
REGISTRY="docker.citysensor.de"
PROJEKT="wetterstation"
IMAGE_NAME=("${PROJEKT}-frontend" "${PROJEKT}-collector" "${PROJEKT}-api")
TAG="${TAG:-$(date +%Y%m%d%H%M)}" # default Datum
# Build-Datum
BUILD_DATE=$(date +%d.%m.%Y)
echo "=========================================="
echo " Deploy Script"
echo "=========================================="
echo "Registry: ${REGISTRY}"
echo "Images: ${IMAGE_NAME[*]}"
echo "Tag: ${TAG}"
echo "Build-Datum: ${BUILD_DATE}"
echo "=========================================="
echo ""
# 1. Login zur Registry (falls noch nicht eingeloggt)
echo ">>> Login zu ${REGISTRY}..."
docker login "${REGISTRY}"
echo ""
# 2. Multiplatform Builder einrichten (docker-container driver erforderlich)
echo ">>> Richte Multiplatform Builder ein..."
if ! docker buildx inspect multiplatform-builder &>/dev/null; then
docker buildx create --name multiplatform-builder --driver docker-container --bootstrap
fi
docker buildx use multiplatform-builder
echo ""
for image in "${IMAGE_NAME[@]}"; do
# Entferne Projekt-Präfix für Verzeichnisnamen
IMAGE_DIR="${image#${PROJEKT}-}"
FULL_IMAGE="${REGISTRY}/${image}:${TAG}"
echo "=========================================="
echo ">>> Baue ${image}..."
echo ">>> Image: ${FULL_IMAGE}"
echo "=========================================="
# Build-Args vorbereiten (für Frontend Version und Build-Date)
BUILD_ARGS="--build-arg BUILD_DATE=${BUILD_DATE}"
if [ "${IMAGE_DIR}" = "frontend" ]; then
VERSION=$(grep '"version"' "${IMAGE_DIR}/package.json" | head -1 | sed 's/.*"version": "\(.*\)".*/\1/')
BUILD_ARGS="${BUILD_ARGS} --build-arg VERSION=${VERSION}"
fi
# 3. Docker Image bauen und pushen (Multiplatform)
docker buildx build \
--platform linux/amd64,linux/arm64 \
${BUILD_ARGS} \
-t "${FULL_IMAGE}" \
--push \
"./${IMAGE_DIR}"
# 4. Tagge auch als :latest
echo ">>> Tagge ${image} als :latest..."
docker buildx imagetools create \
-t "${REGISTRY}/${image}:latest" \
"${FULL_IMAGE}"
echo "${image} erfolgreich gebaut und gepusht!"
echo ""
done
echo ">>> Alle Builds erfolgreich!"
echo ""
echo "=========================================="
echo "✓ Deploy erfolgreich abgeschlossen!"
echo "=========================================="
echo "Registry: ${REGISTRY}"
echo "Projekt: ${PROJEKT}"
echo "Tag: ${TAG}"
echo ""
echo "Auf dem Server ausführen:"
echo " docker compose -f docker-compose.prod.yml pull"
echo " docker compose -f docker-compose.prod.yml up -d"
echo ""

docker-compose.prod.yml Normal file

@@ -0,0 +1,105 @@
services:
postgres:
image: postgres:16-alpine
container_name: wetterstation_db_prod
restart: unless-stopped
environment:
POSTGRES_DB: ${DB_NAME}
POSTGRES_USER: ${DB_USER}
POSTGRES_PASSWORD: ${DB_PASSWORD}
volumes:
- postgres_data:/var/lib/postgresql/data
networks:
- internal
healthcheck:
test: ["CMD-SHELL", "pg_isready -U ${DB_USER} -d ${DB_NAME}"]
interval: 10s
timeout: 5s
retries: 5
collector:
image: docker.citysensor.de/wetterstation-collector:latest
container_name: wetterstation_collector_prod
restart: unless-stopped
env_file:
- ./.env
environment:
DB_HOST: postgres
depends_on:
postgres:
condition: service_healthy
networks:
- internal
- proxy
labels:
- "traefik.enable=true"
- "traefik.docker.network=dockge_default"
- "traefik.http.routers.wetterstation-collector.rule=Host(`stwwetter.fuerst-stuttgart.de`) && PathPrefix(`/collector`)"
- "traefik.http.routers.wetterstation-collector.entrypoints=https"
- "traefik.http.routers.wetterstation-collector.tls=true"
- "traefik.http.routers.wetterstation-collector.tls.certresolver=letsencrypt"
- "traefik.http.middlewares.wetterstation-collector-stripprefix.stripprefix.prefixes=/collector"
- "traefik.http.routers.wetterstation-collector.middlewares=wetterstation-collector-stripprefix"
- "traefik.http.services.wetterstation-collector.loadbalancer.server.port=8001"
api:
image: docker.citysensor.de/wetterstation-api:latest
container_name: wetterstation_api_prod
restart: unless-stopped
env_file:
- ./.env
environment:
DB_HOST: postgres
depends_on:
postgres:
condition: service_healthy
networks:
- internal
- proxy
healthcheck:
test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
interval: 10s
timeout: 5s
retries: 5
start_period: 10s
labels:
- "traefik.enable=true"
- "traefik.docker.network=dockge_default"
- "traefik.http.routers.wetterstation-api.rule=Host(`stwwetter.fuerst-stuttgart.de`) && PathPrefix(`/api`)"
- "traefik.http.routers.wetterstation-api.entrypoints=https"
- "traefik.http.routers.wetterstation-api.tls=true"
- "traefik.http.routers.wetterstation-api.tls.certresolver=letsencrypt"
- "traefik.http.middlewares.wetterstation-api-stripprefix.stripprefix.prefixes=/api"
- "traefik.http.routers.wetterstation-api.middlewares=wetterstation-api-stripprefix"
- "traefik.http.services.wetterstation-api.loadbalancer.server.port=8000"
frontend:
image: docker.citysensor.de/wetterstation-frontend:latest
container_name: wetterstation_frontend_prod
restart: unless-stopped
depends_on:
api:
condition: service_healthy
networks:
- internal
- proxy
labels:
- "traefik.enable=true"
- "traefik.docker.network=dockge_default"
- "traefik.http.routers.wetterstation.rule=Host(`stwwetter.fuerst-stuttgart.de`)"
- "traefik.http.routers.wetterstation.entrypoints=https"
- "traefik.http.routers.wetterstation.tls=true"
- "traefik.http.routers.wetterstation.tls.certresolver=letsencrypt"
- "traefik.http.services.wetterstation.loadbalancer.server.port=80"
volumes:
postgres_data:
name: wetterstation_postgres_data_prod
networks:
internal:
name: wetterstation_internal
driver: bridge
proxy:
name: dockge_default
external: true


@@ -41,6 +41,8 @@ services:
dockerfile: Dockerfile
container_name: wetterstation_collector
restart: unless-stopped
ports:
- "8001:8001"
env_file:
- ./.env
environment:


@@ -0,0 +1,41 @@
version: '3.8'
services:
postgres:
image: postgres:16-alpine
container_name: wetterstation_db
restart: unless-stopped
environment:
POSTGRES_DB: ${DB_NAME}
POSTGRES_USER: ${DB_USER}
POSTGRES_PASSWORD: ${DB_PASSWORD}
ports:
- "${DB_PORT}:5432"
volumes:
- postgres_data:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U ${DB_USER} -d ${DB_NAME}"]
interval: 10s
timeout: 5s
retries: 5
pgadmin:
image: dpage/pgadmin4:latest
container_name: wetterstation_pgadmin
restart: unless-stopped
environment:
PGADMIN_DEFAULT_EMAIL: ${PGADMIN_EMAIL:-admin@admin.com}
PGADMIN_DEFAULT_PASSWORD: ${PGADMIN_PASSWORD:-admin}
PGADMIN_CONFIG_SERVER_MODE: 'False'
ports:
- "5050:80"
volumes:
- pgadmin_data:/var/lib/pgadmin
depends_on:
- postgres
volumes:
postgres_data:
driver: local
pgadmin_data:
driver: local


@@ -1,5 +1,11 @@
# syntax=docker/dockerfile:1
# Build stage
FROM --platform=$BUILDPLATFORM node:20-alpine AS builder
# Build arguments
ARG BUILD_DATE=unknown
ARG VERSION=unknown
WORKDIR /app
@@ -12,14 +18,18 @@ RUN npm ci
# Copy source code
COPY . .
# Build app with build info
ENV VITE_BUILD_DATE=${BUILD_DATE}
ENV VITE_VERSION=${VERSION}
RUN npm run build
# Production stage
FROM nginx:alpine
WORKDIR /usr/share/nginx/html
# Copy built app from builder
COPY --from=builder /app/dist .
# Copy nginx configuration
COPY nginx.conf /etc/nginx/conf.d/default.conf

View File

@@ -1,36 +1,130 @@
# Nginx configuration for the frontend (container).
# TLS is terminated by Traefik in front; this server only listens on HTTP internally.
# Strip the nginx version from headers and error pages
server_tokens off;
server {
listen 80;
server_name localhost;
root /usr/share/nginx/html;
index index.html;
# Gzip compression
# Body size limit (the frontend does not need large POSTs)
client_max_body_size 1m;
# Docker DNS resolver for dynamic service resolution
resolver 127.0.0.11 valid=30s;
resolver_timeout 5s;
# Gzip
gzip on;
gzip_vary on;
gzip_min_length 1024;
gzip_types text/plain text/css text/xml text/javascript application/x-javascript application/xml+rss application/json;
gzip_types
text/plain
text/css
text/xml
text/javascript
application/x-javascript
application/xml+rss
application/json;
# API proxy
# ----------------------------------------------------------------- #
# Security headers — apply to all responses from this server.
# 'always' ensures they are also sent on 4xx/5xx responses.
# ----------------------------------------------------------------- #
# HSTS: one year, including subdomains. If the domain is not yet fully
# on HTTPS, the value can be reduced to "max-age=300" until it is certain
# that nothing is served over plain HTTP anymore. Leave out preload as
# long as the domain is not meant to be entered in the preload list.
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
# Disable MIME sniffing
add_header X-Content-Type-Options "nosniff" always;
# Clickjacking protection: embedding is restricted via frame-ancestors in the CSP below
# Referrer: full URL only for same-origin requests, only the origin cross-origin
add_header Referrer-Policy "strict-origin-when-cross-origin" always;
# Disable browser APIs the frontend does not need
add_header Permissions-Policy "camera=(), microphone=(), geolocation=(), payment=(), usb=(), magnetometer=(), gyroscope=(), accelerometer=()" always;
# Content Security Policy — strict, no external sources.
# 'unsafe-inline' for style-src is required because Highcharts sets
# inline styles for dynamic charts. script-src stays strict.
# TODO: remove http://test.sternwarte-welzheim.de once the test server
# is switched to HTTPS. Three places: server block + two locations.
add_header Content-Security-Policy "default-src 'self'; script-src 'self'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; font-src 'self' data:; connect-src 'self'; frame-ancestors 'self' https://sternwarte-welzheim.de https://www.sternwarte-welzheim.de https://test.sternwarte-welzheim.de http://test.sternwarte-welzheim.de; base-uri 'self'; form-action 'self'; object-src 'none'" always;
# ----------------------------------------------------------------- #
# API proxy
# ----------------------------------------------------------------- #
location /api/ {
proxy_pass http://api:8000/;
set $upstream_api api:8000;
proxy_pass http://$upstream_api/;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection 'upgrade';
# Standard proxy headers
proxy_set_header Host $host;
proxy_cache_bypass $http_upgrade;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
# Timeouts: better to fail visibly than hang forever
proxy_connect_timeout 5s;
proxy_send_timeout 15s;
proxy_read_timeout 15s;
# If the upstream is down, return 502 immediately instead of retry loops
proxy_next_upstream off;
# Keep the WebSocket/upgrade path in case it is needed again later
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection $http_connection;
}
# Frontend routes
location / {
try_files $uri $uri/ /index.html;
}
# Cache static assets
# ----------------------------------------------------------------- #
# Static assets — long cache lifetime, since filenames contain a hash
# ----------------------------------------------------------------- #
location ~* \.(js|css|png|jpg|jpeg|gif|ico|svg|woff|woff2|ttf|eot)$ {
expires 1y;
add_header Cache-Control "public, immutable";
add_header Cache-Control "public, immutable" always;
# nginx quirk: as soon as a location block contains any add_header,
# ALL add_header directives from the server level are ignored. So all
# security headers are repeated here explicitly.
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
add_header X-Content-Type-Options "nosniff" always;
add_header Referrer-Policy "strict-origin-when-cross-origin" always;
add_header Permissions-Policy "camera=(), microphone=(), geolocation=(), payment=(), usb=(), magnetometer=(), gyroscope=(), accelerometer=()" always;
add_header Content-Security-Policy "default-src 'self'; script-src 'self'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; font-src 'self' data:; connect-src 'self'; frame-ancestors 'self' https://sternwarte-welzheim.de https://www.sternwarte-welzheim.de https://test.sternwarte-welzheim.de http://test.sternwarte-welzheim.de; base-uri 'self'; form-action 'self'; object-src 'none'" always;
}
}
# ----------------------------------------------------------------- #
# Frontend routing (SPA)
# ----------------------------------------------------------------- #
location / {
try_files $uri $uri/ /index.html;
# Do not cache index.html aggressively, otherwise users keep seeing
# stale asset hashes after a deploy
add_header Cache-Control "no-cache" always;
# Security headers repeated explicitly here (nginx quirk, see above)
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
add_header X-Content-Type-Options "nosniff" always;
add_header Referrer-Policy "strict-origin-when-cross-origin" always;
add_header Permissions-Policy "camera=(), microphone=(), geolocation=(), payment=(), usb=(), magnetometer=(), gyroscope=(), accelerometer=()" always;
add_header Content-Security-Policy "default-src 'self'; script-src 'self'; style-src 'self' 'unsafe-inline'; img-src 'self' data:; font-src 'self' data:; connect-src 'self'; frame-ancestors 'self' https://sternwarte-welzheim.de https://www.sternwarte-welzheim.de https://test.sternwarte-welzheim.de http://test.sternwarte-welzheim.de; base-uri 'self'; form-action 'self'; object-src 'none'" always;
}
# Block hidden/dot files (e.g. .env or .git accidentally left in the build)
location ~ /\. {
deny all;
access_log off;
log_not_found off;
}
}

View File

@@ -1,7 +1,7 @@
{
"name": "wetterstation-frontend",
"private": true,
"version": "1.0.0",
"version": "1.4.0",
"type": "module",
"scripts": {
"dev": "vite",
@@ -9,11 +9,10 @@
"preview": "vite preview"
},
"dependencies": {
"chart.js": "^4.4.1",
"chartjs-adapter-date-fns": "^3.0.0",
"date-fns": "^3.3.1",
"highcharts": "^11.4.0",
"highcharts-react-official": "^3.2.1",
"react": "^18.3.1",
"react-chartjs-2": "^5.2.0",
"react-dom": "^18.3.1"
},
"devDependencies": {

View File

@@ -1,24 +1,166 @@
import { useState, useEffect } from 'react'
import { useState, useEffect, useRef } from 'react'
import WeatherDashboard from './components/WeatherDashboard'
import './App.css'
// API base URL: direct to the backend in dev, via the nginx proxy in prod
const API_BASE = import.meta.env.DEV ? 'http://localhost:8000' : '/api'
// 24-hour URL for the "current" display (also needed for longer time ranges)
const CURRENT_URL = `${API_BASE}/weather/history?hours=24&limit=5000`
// JSON fetch helper: returns the parsed JSON or throws on HTTP/network errors.
// The request can be aborted via signal when timeRange changes.
async function fetchJson(url, signal) {
const res = await fetch(url, { signal })
if (!res.ok) throw new Error(`HTTP ${res.status} bei ${url}`)
return res.json()
}
// Determines the URLs for the selected time range.
// Returns: { weatherUrl, rainUrl, needsCurrent }
function buildUrls(timeRange) {
// Custom range
if (typeof timeRange === 'object' && timeRange.type === 'custom') {
const start = encodeURIComponent(timeRange.start)
const end = encodeURIComponent(timeRange.end)
const days = timeRange.days || 1
const path = days >= 7 ? 'daily-aggregated-range' : 'hourly-aggregated-range'
return {
weatherUrl: `${API_BASE}/weather/${path}?start=${start}&end=${end}`,
rainUrl: null, // TODO: implement rain aggregation for custom ranges
needsCurrent: true,
}
}
switch (timeRange) {
case '24h':
return {
weatherUrl: `${API_BASE}/weather/history?hours=24&limit=5000`,
rainUrl: null,
needsCurrent: false, // the main data IS the current 24h data
}
case '7d':
return {
weatherUrl: `${API_BASE}/weather/daily-with-minmax?days=7`,
rainUrl: `${API_BASE}/weather/rain-daily?days=7`,
needsCurrent: true,
}
case '30d':
return {
weatherUrl: `${API_BASE}/weather/daily-with-minmax?days=30`,
rainUrl: `${API_BASE}/weather/rain-daily?days=30`,
needsCurrent: true,
}
case '365d':
return {
weatherUrl: `${API_BASE}/weather/daily-aggregated?days=365`,
rainUrl: `${API_BASE}/weather/rain-weekly?days=365`,
needsCurrent: true,
}
default:
return {
weatherUrl: `${API_BASE}/weather/history?hours=24`,
rainUrl: null,
needsCurrent: false,
}
}
}
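The custom-range branch above picks daily vs. hourly aggregation based on the span length. A minimal Python sketch of that rule (the helper name `aggregation_path` is made up for illustration):

```python
import math
from datetime import datetime

def aggregation_path(start, end):
    # Mirrors buildUrls(): ceil the span to whole days (at least 1);
    # 7 days or more uses the daily endpoint, shorter spans the hourly one.
    days = math.ceil((end - start).total_seconds() / 86400) or 1
    return 'daily-aggregated-range' if days >= 7 else 'hourly-aggregated-range'

print(aggregation_path(datetime(2026, 4, 1), datetime(2026, 4, 3)))   # hourly-aggregated-range
print(aggregation_path(datetime(2026, 4, 1), datetime(2026, 4, 10)))  # daily-aggregated-range
```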
function App() {
const [weatherData, setWeatherData] = useState([])
const [currentWeatherData, setCurrentWeatherData] = useState([])
const [rainData, setRainData] = useState([])
const [loading, setLoading] = useState(true)
const [error, setError] = useState(null)
const [lastUpdate, setLastUpdate] = useState(null)
const [timeRange, setTimeRange] = useState('24h')
const [showTable, setShowTable] = useState(false)
// Initial-load flag: show the spinner only on the very first fetch.
// On later re-fetches (auto-refresh, time-range change) the old data
// stays visible until the new data arrives — less flicker.
const isInitialLoadRef = useRef(true)
const handleTimeRangeChange = (range, customParams) => {
if (range === 'custom' && customParams) {
const start = new Date(customParams.start)
const end = new Date(customParams.end)
const days = Math.ceil((end - start) / (1000 * 60 * 60 * 24))
setTimeRange({ type: 'custom', start: customParams.start, end: customParams.end, days })
} else {
setTimeRange(range)
}
}
useEffect(() => {
// Check whether embedded data is present
if (window.__WEATHER_DATA__) {
// Static data: no fetch needed
if (window.__WEATHER_DATA__ && timeRange === '24h') {
setWeatherData(window.__WEATHER_DATA__)
setCurrentWeatherData(window.__WEATHER_DATA__)
setRainData([])
setLastUpdate(new Date())
setLoading(false)
} else {
setError('Keine Wetterdaten verfügbar')
setLoading(false)
return
}
}, [])
const controller = new AbortController()
const fetchData = async () => {
if (isInitialLoadRef.current) setLoading(true)
const { weatherUrl, rainUrl, needsCurrent } = buildUrls(timeRange)
// Start all three requests in parallel (instead of sequentially as before).
// allSettled, so a failure of rain/current does not block the main data.
const requests = [
fetchJson(weatherUrl, controller.signal), // [0] weather - required
needsCurrent ? fetchJson(CURRENT_URL, controller.signal) : null, // [1] current - optional
rainUrl ? fetchJson(rainUrl, controller.signal) : null, // [2] rain - optional
]
const results = await Promise.allSettled(requests.map(p => p ?? Promise.resolve(null)))
// Ignore AbortError — happens when timeRange changed while the request
// was in flight. The next effect run performs the correct fetch.
const aborted = results.some(
r => r.status === 'rejected' && r.reason?.name === 'AbortError'
)
if (aborted) return
// A main-data error is fatal; without it there is nothing to show.
if (results[0].status === 'rejected') {
setError(results[0].reason?.message || 'Unbekannter Fehler')
setLoading(false)
isInitialLoadRef.current = false
return
}
const weatherResult = results[0].value
const currentResult = results[1].status === 'fulfilled' ? results[1].value : null
const rainResult = results[2].status === 'fulfilled' ? results[2].value : null
setError(null)
setWeatherData(weatherResult)
// With 24h selected, weather and current are the same data
setCurrentWeatherData(needsCurrent ? (currentResult ?? []) : weatherResult)
setRainData(rainResult ?? [])
setLastUpdate(new Date())
setLoading(false)
isInitialLoadRef.current = false
}
fetchData()
// Auto-refresh only for 24h, and only without static data
let interval = null
if (!window.__WEATHER_DATA__ && timeRange === '24h') {
interval = setInterval(fetchData, 5 * 60 * 1000)
}
return () => {
controller.abort()
if (interval) clearInterval(interval)
}
}, [timeRange])
if (loading) {
return (
@@ -40,19 +182,19 @@ function App() {
// Format the current time
const now = new Date()
const dateStr = now.toLocaleDateString('de-DE', {
weekday: 'long',
year: 'numeric',
month: 'long',
day: 'numeric'
const dateStr = now.toLocaleDateString('de-DE', {
weekday: 'long',
year: 'numeric',
month: 'long',
day: 'numeric'
})
const timeStr = now.toLocaleTimeString('de-DE', {
hour: '2-digit',
minute: '2-digit'
const timeStr = now.toLocaleTimeString('de-DE', {
hour: '2-digit',
minute: '2-digit'
})
// TODO: compute sunrise/sunset and moon phase
// Currently a placeholder - needs a library such as 'suncalc'
const sunrise = "06:45"
const sunset = "18:30"
const moonPhase = "abnehmend 50%"
@@ -68,9 +210,17 @@ function App() {
Sonnen-Aufgang: {sunrise} - Untergang: {sunset} &nbsp;&nbsp; Mond-Phase: {moonPhase}
</div>
</header>
<main className="app-main">
<WeatherDashboard data={weatherData} />
<WeatherDashboard
data={weatherData}
currentData={currentWeatherData}
rainData={rainData}
timeRange={timeRange}
onTimeRangeChange={handleTimeRangeChange}
showTable={showTable}
onToggleTable={() => setShowTable(v => !v)}
/>
</main>
</div>
)
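The allSettled strategy in the new App.jsx — the main request is mandatory, the side requests are best-effort — can be illustrated outside React. A Python asyncio sketch with made-up fetchers (`fetch_ok`, `fetch_fail` are illustrative only):

```python
import asyncio

async def fetch_ok(name):
    return f"{name}-data"

async def fetch_fail(name):
    raise RuntimeError(f"{name} unavailable")

async def load():
    # Like Promise.allSettled: gather with return_exceptions=True never raises;
    # each slot is either a result or the exception that occurred.
    weather, rain = await asyncio.gather(
        fetch_ok("weather"), fetch_fail("rain"), return_exceptions=True
    )
    if isinstance(weather, Exception):
        raise weather  # main data is fatal
    return weather, (None if isinstance(rain, Exception) else rain)

print(asyncio.run(load()))  # ('weather-data', None)
```

The failed rain request degrades to `None` instead of taking the whole dashboard down, matching the behavior described in the comments above.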

View File

@@ -1,107 +0,0 @@
import { useState, useEffect } from 'react'
import WeatherDashboard from './components/WeatherDashboard'
import './App.css'
function App() {
const [weatherData, setWeatherData] = useState([])
const [loading, setLoading] = useState(true)
const [error, setError] = useState(null)
const [lastUpdate, setLastUpdate] = useState(null)
const fetchWeatherData = async () => {
try {
const apiUrl = import.meta.env.VITE_API_URL || '/api'
const response = await fetch(`${apiUrl}/weather/history?hours=24`)
if (!response.ok) {
throw new Error('Fehler beim Laden der Daten')
}
const data = await response.json()
setWeatherData(data)
setLastUpdate(new Date())
setError(null)
} catch (err) {
setError(err.message)
console.error('Fehler beim Laden der Wetterdaten:', err)
} finally {
setLoading(false)
}
}
useEffect(() => {
fetchWeatherData()
// Compute the time until the next 5-minute step + 1 minute
const scheduleNextRefresh = () => {
const now = new Date()
const minutes = now.getMinutes()
const seconds = now.getSeconds()
const milliseconds = now.getMilliseconds()
// Next 5-minute step
const nextFiveMinStep = Math.ceil(minutes / 5) * 5
// Plus 1 minute
const targetMinute = (nextFiveMinStep + 1) % 60
let targetTime = new Date(now)
targetTime.setMinutes(targetMinute, 0, 0)
// If the target time is in the past, add one hour
if (targetTime <= now) {
targetTime.setHours(targetTime.getHours() + 1)
}
const timeUntilRefresh = targetTime - now
console.log(`Nächster Refresh: ${targetTime.toLocaleTimeString('de-DE')} (in ${Math.round(timeUntilRefresh / 1000)}s)`)
return setTimeout(() => {
fetchWeatherData()
scheduleNextRefresh()
}, timeUntilRefresh)
}
const timeout = scheduleNextRefresh()
return () => clearTimeout(timeout)
}, [])
if (loading) {
return (
<div className="loading-container">
<div className="loading-spinner"></div>
<p>Lade Wetterdaten...</p>
</div>
)
}
if (error) {
return (
<div className="error-container">
<h2>Fehler beim Laden der Daten</h2>
<p>{error}</p>
<button onClick={fetchWeatherData}>Erneut versuchen</button>
</div>
)
}
return (
<div className="app">
<header className="app-header">
<h1>🌤️ Wetterstation</h1>
{lastUpdate && (
<p className="last-update">
Letzte Aktualisierung: {lastUpdate.toLocaleTimeString('de-DE')}
</p>
)}
</header>
<main className="app-main">
<WeatherDashboard data={weatherData} />
</main>
</div>
)
}
export default App
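The scheduler removed above aligned refreshes to "the next 5-minute step plus one minute", so a fetch lands shortly after the station has written a new 5-minute sample. The same arithmetic, re-expressed as a small Python sketch:

```python
import math
from datetime import datetime, timedelta

def next_refresh(now):
    # Next 5-minute boundary, plus one minute, wrapped at the hour
    target_minute = (math.ceil(now.minute / 5) * 5 + 1) % 60
    target = now.replace(minute=target_minute, second=0, microsecond=0)
    if target <= now:  # wrapped past the hour (or landed in the past)
        target += timedelta(hours=1)
    return target

print(next_refresh(datetime(2026, 4, 26, 12, 7, 30)).strftime("%H:%M"))  # 12:11
print(next_refresh(datetime(2026, 4, 26, 12, 58, 0)).strftime("%H:%M"))  # 13:01
```

The new App.jsx replaces this with a plain 5-minute `setInterval`, trading the sample alignment for simpler cleanup.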

View File

@@ -1,13 +1,57 @@
.dashboard {
width: 100%;
max-width: 795px;
/* max-width: 1900px; */
max-width: 795px;
margin: 0 auto;
}
.time-range-nav {
display: flex;
gap: 0.5rem;
margin-bottom: 1rem;
justify-content: center;
flex-wrap: wrap;
}
.time-range-nav button {
padding: 0.5rem 1.5rem;
background: white;
border: 2px solid #ddd;
border-radius: 8px;
cursor: pointer;
font-size: 0.9rem;
font-weight: 500;
color: #333;
transition: all 0.2s ease;
}
.time-range-nav button:hover {
background: #f5f5f5;
border-color: #0066cc;
}
.time-range-nav button.active {
background: #0066cc;
border-color: #0066cc;
color: white;
}
.time-range-label {
text-align: center;
font-size: 1.1rem;
font-weight: 600;
color: #333;
margin-bottom: 1.5rem;
padding: 0.5rem;
background: #f8f9fa;
border-radius: 8px;
}
.current-values {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(140px, 1fr));
gap: 0.75rem;
margin-bottom: 1.5rem;
margin-bottom: 1rem;
}
.value-card {
@@ -38,20 +82,456 @@
gap: 1rem;
}
.chart-item {
display: flex;
flex-direction: column;
gap: 0.2rem;
}
.chart-container {
background: white;
padding: 1rem;
padding: 0.5rem;
border-radius: 8px;
box-shadow: 0 2px 8px rgba(0, 0, 0, 0.1);
}
.chart-container h3 {
display: flex;
justify-content: space-between;
align-items: center;
margin-bottom: 0.75rem;
color: #333;
font-size: 1rem;
background: #e0e0e0;
padding: 0.5rem;
border-radius: 4px;
margin: -0.5rem -0.5rem 0.5rem -0.5rem;
}
.chart-container h3 .unit {
font-weight: normal;
color: #666;
font-size: 0.9rem;
}
.current-value {
text-align: left;
font-size: 0.9rem;
color: #0066cc;
font-weight: 600;
}
.chart-wrapper {
height: 250px;
width: 100%;
aspect-ratio: 2 / 1;
position: relative;
}
.chart-stats {
/* margin-top: 0.5rem; */
text-align: center;
font-size: 0.7rem;
color: #666;
font-weight: 500;
background: #e0e0e0;
padding: 0.5rem;
border-radius: 4px;
margin: 0.2rem -0.5rem -0.5rem -0.5rem;
}
.bar-trend {
font-size: 0.9em;
cursor: help;
}
.dashboard-footer {
/* margin-top: 2rem; */
padding-top: 1rem;
}
.footer-divider {
border: none;
border-top: 1px solid #ccc;
margin: 0 0 0.5rem 0;
}
.footer-credits {
display: flex;
justify-content: space-between;
align-items: center;
font-size: 0.85rem;
color: #666;
margin-bottom: 0.75rem;
}
.footer-left {
text-align: left;
}
.footer-right {
text-align: right;
}
.footer-sponsor {
text-align: center;
font-size: 0.85rem;
color: #666;
}
.footer-sponsor a {
color: #0066cc;
text-decoration: none;
}
.footer-sponsor a:hover {
text-decoration: underline;
}
.version-line {
display: flex;
justify-content: space-between;
margin-bottom: 0.5rem;
font-size: 0.85rem;
}
.version-short {
display: none;
}
.version-full {
display: inline;
color: #666;
}
.time-range-short {
display: none;
}
.time-range-full {
display: inline;
}
/* Responsive design for narrow screens (smartphones) */
@media (max-width: 768px) {
.charts-grid {
grid-template-columns: 1fr;
}
.current-values {
grid-template-columns: repeat(2, 1fr);
}
.dashboard {
padding: 0 0.5rem;
}
.time-range-nav button {
padding: 0.4rem 1rem;
font-size: 0.85rem;
}
.time-range-short {
display: inline;
}
.time-range-full {
display: none;
}
.time-range-label {
font-size: 1rem;
}
.version-short {
display: inline;
color: #666;
}
.version-full {
display: none;
}
}
/* Modal Styles */
.modal-overlay {
position: fixed;
top: 0;
left: 0;
right: 0;
bottom: 0;
background: rgba(0, 0, 0, 0.5);
display: flex;
justify-content: center;
align-items: center;
z-index: 1000;
}
.modal-content {
background: white;
border-radius: 12px;
padding: 2rem;
max-width: 500px;
width: 90%;
box-shadow: 0 4px 20px rgba(0, 0, 0, 0.3);
}
.modal-content h2 {
margin-top: 0;
margin-bottom: 1.5rem;
color: #333;
font-size: 1.5rem;
}
.modal-form {
display: flex;
flex-direction: column;
gap: 1rem;
}
.form-group {
display: flex;
flex-direction: column;
gap: 0.5rem;
}
.form-group label {
font-weight: 600;
color: #333;
font-size: 0.95rem;
}
.form-group input[type="datetime-local"] {
padding: 0.75rem;
border: 2px solid #ddd;
border-radius: 8px;
font-size: 1rem;
font-family: inherit;
transition: border-color 0.2s ease;
}
.form-group input[type="datetime-local"]:focus {
outline: none;
border-color: #0066cc;
}
.error-message {
padding: 0.75rem;
background: #fee;
border: 1px solid #fcc;
border-radius: 6px;
color: #c00;
font-size: 0.9rem;
}
.modal-info {
padding: 0.75rem;
background: #f0f8ff;
border-radius: 6px;
font-size: 0.85rem;
color: #555;
}
.modal-info p {
margin: 0.25rem 0;
}
.modal-buttons {
display: flex;
gap: 1rem;
margin-top: 1rem;
}
.modal-buttons button {
flex: 1;
padding: 0.75rem 1.5rem;
border: none;
border-radius: 8px;
font-size: 1rem;
font-weight: 600;
cursor: pointer;
transition: all 0.2s ease;
}
.btn-cancel {
background: #f5f5f5;
color: #666;
}
.btn-cancel:hover {
background: #e0e0e0;
}
.btn-apply {
background: #0066cc;
color: white;
}
.btn-apply:hover {
background: #0052a3;
}
@media (max-width: 768px) {
.modal-content {
padding: 1.5rem;
}
.modal-content h2 {
font-size: 1.25rem;
}
.modal-buttons {
flex-direction: column;
}
}
/* ── Table & print ──────────────────────────────────────── */
.table-toggle-btn {
padding: 0.5rem 1.5rem;
background: white;
border: 2px solid #0066cc;
border-radius: 8px;
cursor: pointer;
font-size: 0.9rem;
font-weight: 500;
color: #0066cc;
transition: all 0.2s ease;
}
.table-toggle-btn:hover {
background: #e6f0fa;
}
.table-toggle-btn.active {
background: #0066cc;
border-color: #0066cc;
color: white;
}
.table-view {
margin-bottom: 1.5rem;
}
.table-actions {
display: flex;
justify-content: flex-end;
margin-bottom: 0.75rem;
}
.btn-print {
padding: 0.45rem 1.2rem;
background: white;
border: 2px solid #555;
border-radius: 8px;
cursor: pointer;
font-size: 0.9rem;
font-weight: 500;
color: #333;
transition: all 0.2s ease;
}
.btn-print:hover {
background: #f0f0f0;
}
.weather-table {
width: 100%;
border-collapse: collapse;
font-size: 0.88rem;
background: white;
border-radius: 8px;
overflow: hidden;
box-shadow: 0 2px 8px rgba(0, 0, 0, 0.1);
}
.weather-table th,
.weather-table td {
border-left: 1px solid rgba(255,255,255,0.3);
border-right: 1px solid rgba(255,255,255,0.3);
}
.weather-table td {
border-left: 1px solid #ddd;
border-right: 1px solid #ddd;
}
.weather-table th {
background: #0066cc;
color: white;
padding: 0.6rem 0.5rem;
text-align: center;
font-weight: 600;
white-space: nowrap;
vertical-align: top;
}
.weather-table thead tr:nth-child(2) th {
background: #3388dd;
font-weight: 400;
font-size: 0.82rem;
padding: 0.3rem 0.5rem;
}
.weather-table thead tr:first-child th:first-child {
text-align: left;
}
.weather-table td {
padding: 0.45rem 0.5rem;
text-align: center;
border-bottom: 1px solid #eee;
color: #333;
}
.weather-table td:first-child {
text-align: left;
font-weight: 500;
white-space: nowrap;
}
.weather-table tbody tr:nth-child(even) {
background: #f5f8fd;
}
.weather-table tbody tr:hover {
background: #e6f0fa;
}
/* Print */
@media print {
.no-print,
.time-range-nav,
.time-range-label,
.dashboard-footer,
.modal-overlay {
display: none !important;
}
body {
background: white;
}
.dashboard {
max-width: 100%;
margin: 0;
padding: 0;
}
.weather-table {
box-shadow: none;
font-size: 0.8rem;
}
.weather-table th {
background: #333 !important;
color: white !important;
-webkit-print-color-adjust: exact;
print-color-adjust: exact;
}
.weather-table tbody tr:nth-child(even) {
background: #f0f0f0 !important;
-webkit-print-color-adjust: exact;
print-color-adjust: exact;
}
}

File diff suppressed because it is too large

View File

@@ -3,8 +3,10 @@ import react from '@vitejs/plugin-react'
export default defineConfig({
plugins: [react()],
base: './',
server: {
base: './',
define: {
__BUILD_DATE__: JSON.stringify(process.env.VITE_BUILD_DATE || 'dev'),
__VERSION__: JSON.stringify(process.env.VITE_VERSION || '0.0.0')
},
server: {
host: '0.0.0.0',
port: 3000,
proxy: {

migrate_sqlite_to_postgres.py (new executable file, 219 lines)
View File

@@ -0,0 +1,219 @@
#!/usr/bin/env python3
"""
Migration tool: SQLite (wview) → PostgreSQL (wetterstation)
Migrates weather data from 2025-01-01 until today
"""
import sqlite3
import psycopg
from datetime import datetime, timezone
import os
from pathlib import Path
from dotenv import load_dotenv
import sys
# Load environment variables
env_path = Path(__file__).parent / '.env'
load_dotenv(dotenv_path=env_path)
# Configuration
SQLITE_DB = "data/wview-archive.sdb"
START_DATE = datetime(2025, 1, 1, 0, 0, 0, tzinfo=timezone.utc)
END_DATE = datetime(2026, 2, 8, 0, 0, 0, tzinfo=timezone.utc)
# PostgreSQL configuration
DB_HOST = os.getenv('DB_HOST', 'localhost')
DB_PORT = int(os.getenv('DB_PORT', 5432))
DB_NAME = os.getenv('DB_NAME', 'wetterstation')
DB_USER = os.getenv('DB_USER')
DB_PASSWORD = os.getenv('DB_PASSWORD')
# Should the table be truncated first?
TRUNCATE_TABLE = False # Set to False to keep existing data
# Conversion functions
def fahrenheit_to_celsius(f):
"""Fahrenheit → Celsius"""
if f is None:
return None
return (f - 32) * 5 / 9
def inches_hg_to_hpa(inhg):
"""inches Hg → hPa"""
if inhg is None:
return None
return inhg * 33.8639
def mph_to_kmh(mph):
"""mph → km/h"""
if mph is None:
return None
return mph * 1.60934
def inches_to_mm(inches):
"""inches → mm"""
if inches is None:
return None
return inches * 25.4
def unix_to_datetime(timestamp):
"""Unix timestamp → datetime"""
return datetime.fromtimestamp(timestamp, tz=timezone.utc)
def main():
print("=" * 60)
print("SQLite → PostgreSQL Migration")
print("=" * 60)
print(f"Quelle: {SQLITE_DB}")
print(f"Zeitraum: {START_DATE.date()} bis {END_DATE.date()}")
print(f"Ziel: PostgreSQL ({DB_HOST}:{DB_PORT}/{DB_NAME})")
print("=" * 60)
print()
# Open SQLite
try:
sqlite_conn = sqlite3.connect(SQLITE_DB)
sqlite_cursor = sqlite_conn.cursor()
print("✓ SQLite-Verbindung hergestellt")
except Exception as e:
print(f"✗ Fehler beim Öffnen der SQLite-Datenbank: {e}")
sys.exit(1)
# Open PostgreSQL
try:
pg_conn = psycopg.connect(
host=DB_HOST,
port=DB_PORT,
dbname=DB_NAME,
user=DB_USER,
password=DB_PASSWORD
)
pg_cursor = pg_conn.cursor()
print("✓ PostgreSQL-Verbindung hergestellt")
except Exception as e:
print(f"✗ Fehler beim Verbinden mit PostgreSQL: {e}")
sqlite_conn.close()
sys.exit(1)
# Truncate the table if requested
if TRUNCATE_TABLE:
print("\nLeere PostgreSQL-Tabelle weather_data...")
try:
pg_cursor.execute("TRUNCATE TABLE weather_data RESTART IDENTITY CASCADE")
pg_conn.commit()
print("✓ Tabelle geleert")
except Exception as e:
print(f"✗ Fehler beim Leeren der Tabelle: {e}")
sqlite_conn.close()
pg_conn.close()
sys.exit(1)
# Convert the time range to Unix timestamps
start_ts = int(START_DATE.timestamp())
end_ts = int(END_DATE.timestamp())
# Load data from SQLite
print(f"\nLade Daten aus SQLite (Zeitraum: {start_ts} - {end_ts})...")
sqlite_cursor.execute("""
SELECT
dateTime,
outTemp,
outHumidity,
barometer,
windSpeed,
windGust,
windDir,
rain,
rainRate
FROM archive
WHERE dateTime >= ? AND dateTime <= ?
ORDER BY dateTime ASC
""", (start_ts, end_ts))
rows = sqlite_cursor.fetchall()
print(f"{len(rows)} Datensätze gefunden")
if len(rows) == 0:
print("Keine Daten im angegebenen Zeitraum gefunden.")
sqlite_conn.close()
pg_conn.close()
return
# Run the migration
print("\nMigriere Daten...")
inserted = 0
skipped = 0
errors = 0
for row in rows:
try:
(dateTime, outTemp, outHumidity, barometer,
windSpeed, windGust, windDir, rain, rainRate) = row
# Conversions
dt = unix_to_datetime(dateTime)
temp_c = fahrenheit_to_celsius(outTemp)
humidity = int(outHumidity) if outHumidity is not None else None
pressure_hpa = inches_hg_to_hpa(barometer)
wind_speed_kmh = mph_to_kmh(windSpeed)
wind_gust_kmh = mph_to_kmh(windGust)
rain_mm = inches_to_mm(rain)
rain_rate_mm = inches_to_mm(rainRate)
# Insert into PostgreSQL
pg_cursor.execute("""
INSERT INTO weather_data
(datetime, temperature, humidity, pressure,
wind_speed, wind_gust, wind_dir, rain, rain_rate)
VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)
ON CONFLICT (datetime) DO NOTHING
""", (dt, temp_c, humidity, pressure_hpa,
wind_speed_kmh, wind_gust_kmh, windDir, rain_mm, rain_rate_mm))
if pg_cursor.rowcount > 0:
inserted += 1
if inserted % 1000 == 0:
pg_conn.commit()
print(f" {inserted} Datensätze eingefügt...")
else:
skipped += 1
except Exception as e:
errors += 1
if errors <= 5: # show only the first 5 errors
print(f" Fehler bei Datensatz {dateTime}: {e}")
# Commit the remaining data
pg_conn.commit()
# Summary
print("\n" + "=" * 60)
print("Migration abgeschlossen!")
print("=" * 60)
print(f"Eingefügt: {inserted} Datensätze")
print(f"Übersprungen: {skipped} Datensätze (bereits vorhanden)")
print(f"Fehler: {errors} Datensätze")
print("=" * 60)
# Show the time range of the migrated data
if inserted > 0:
pg_cursor.execute("""
SELECT MIN(datetime), MAX(datetime), COUNT(*)
FROM weather_data
WHERE datetime >= %s AND datetime <= %s
""", (START_DATE, END_DATE))
min_dt, max_dt, count = pg_cursor.fetchone()
print(f"\nDaten in PostgreSQL:")
print(f" Von: {min_dt}")
print(f" Bis: {max_dt}")
print(f" Gesamt: {count} Datensätze")
# Close connections
sqlite_conn.close()
pg_conn.close()
print("\n✓ Fertig!")
if __name__ == "__main__":
main()
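The conversion helpers at the top of the migration tool can be sanity-checked against well-known reference points (freezing point, standard atmospheric pressure). A standalone sketch repeating the factors used above:

```python
# Sanity checks for the unit conversions used by the migration tool.
def fahrenheit_to_celsius(f):
    return None if f is None else (f - 32) * 5 / 9

def inches_hg_to_hpa(inhg):
    return None if inhg is None else inhg * 33.8639

def mph_to_kmh(mph):
    return None if mph is None else mph * 1.60934

def inches_to_mm(inches):
    return None if inches is None else inches * 25.4

print(fahrenheit_to_celsius(32))            # 0.0 (freezing point)
print(round(inches_hg_to_hpa(29.92), 1))    # 1013.2 (standard pressure)
print(inches_to_mm(1))                      # 25.4
```

Passing `None` through unchanged matters here because wview archive rows may have missing sensor values, and the `ON CONFLICT ... DO NOTHING` insert accepts SQL NULLs for them.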

View File

@@ -1,24 +1,38 @@
#!/bin/bash
# Script to build and push the Docker images to the registry
# Script to build and push the Docker images to the registry (multi-platform)
set -e
REGISTRY="docker.citysensor.de"
PROJECT="wetterstation"
PLATFORMS="linux/amd64,linux/arm64"
echo "🔨 Building Docker images..."
docker compose build collector
docker compose build api
docker compose build frontend
echo "🔧 Setting up buildx builder..."
# Create the builder or reuse an existing one
docker buildx create --name multiplatform --use 2>/dev/null || docker buildx use multiplatform
echo ""
echo "📤 Pushing images to ${REGISTRY}..."
docker compose push collector
docker compose push api
docker compose push frontend
echo "🔨 Building and pushing Docker images for ${PLATFORMS}..."
# Build and push all images with buildx
docker buildx build --platform ${PLATFORMS} \
-t ${REGISTRY}/${PROJECT}/collector:latest \
--push \
./collector
docker buildx build --platform ${PLATFORMS} \
-t ${REGISTRY}/${PROJECT}/api:latest \
--push \
./api
docker buildx build --platform ${PLATFORMS} \
-t ${REGISTRY}/${PROJECT}/frontend:latest \
--push \
./frontend
echo ""
echo "✅ Done! Images successfully pushed to ${REGISTRY}"
echo "✅ Done! Multi-platform images successfully pushed to ${REGISTRY}"
echo " Platforms: ${PLATFORMS}"
echo ""
echo "To pull and run on another machine:"
echo " docker compose pull"

View File

@@ -21,6 +21,15 @@ echo ""
# Wait briefly until the API is ready
sleep 3
# Start the collector
echo "📥 Starte Collector auf Port 8001..."
cd "$SCRIPT_DIR"
source .venv/bin/activate
python -m uvicorn collector.main:app --host 0.0.0.0 --port 8001 --reload &
COLLECTOR_PID=$!
echo "Collector gestartet mit PID $COLLECTOR_PID"
echo ""
# Start the frontend
echo "🎨 Starte Frontend auf Port 3000..."
cd "$SCRIPT_DIR/frontend"
@@ -33,13 +42,14 @@ echo "✅ Alle Services gestartet!"
echo ""
echo "📊 API: http://localhost:8000"
echo "📊 API Docs: http://localhost:8000/docs"
echo "📥 Collector: http://localhost:8001"
echo "🌐 Frontend: http://localhost:3000"
echo ""
echo "Drücken Sie Ctrl+C um alle Services zu stoppen..."
echo ""
# Trap to stop all processes
trap "echo ''; echo '🛑 Stoppe Services...'; kill $API_PID $FRONTEND_PID 2>/dev/null; exit 0" INT TERM
trap "echo ''; echo '🛑 Stoppe Services...'; kill $API_PID $COLLECTOR_PID $FRONTEND_PID 2>/dev/null; exit 0" INT TERM
# Wait for termination
wait