
refactor: split app.py into Flask blueprints + fix sudo cat in push

app.py (1142 lines) → 5 blueprints + helpers.py + slim bootstrap:
- blueprints/jobs.py     — local dashboard, job CRUD, restore
- blueprints/destinations.py — SSH destinations + transfer
- blueprints/network.py  — federation, network dashboard, push/pull
- blueprints/settings.py — SMTP settings, internal/databases
- blueprints/api.py      — all /api/v1/* routes (auth via a blueprint-level before_request)
- helpers.py             — read_archive_info, get_ynh_apps (shared)
- app.py                 — bootstrap: config, extensions, register_blueprint, startup
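As a minimal sketch of the blueprint-level auth pattern described above (route bodies and config keys here are illustrative, not the project's actual code): a `before_request` registered on a `Blueprint` only runs for that blueprint's routes, which replaces the old app-wide path-prefix check.

```python
from flask import Blueprint, current_app, jsonify, request

bp = Blueprint("api", __name__, url_prefix="/api/v1")

@bp.before_request
def _check_api_auth():
    # Runs only for routes registered on this blueprint.
    if request.endpoint == "api.health":
        return None  # health check stays unauthenticated
    token = request.headers.get("X-BackupManager-Key", "")
    if token != current_app.config.get("API_TOKEN"):
        return jsonify({"error": "Unauthorized"}), 401
    return None

@bp.route("/health")
def health():
    return jsonify({"status": "ok"})

@bp.route("/jobs")
def jobs():
    return jsonify([])  # placeholder body for the sketch
```

With this split, app.py only needs `app.register_blueprint(bp)` per module.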

Included fix: _do_push_archive used sudo cat (not allowed by sudoers)
→ replaced with sudo rsync + open + sudo rm -rf, like the other endpoints.
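The replacement pattern can be sketched as follows (hypothetical helper name; the `run` parameter is injected only so the sketch is testable without sudo, and the real endpoint streams the archive rather than reading it whole):

```python
import os
import subprocess
import tempfile

def read_root_owned_archive(archive_path, run=subprocess.run):
    """Read a root-owned archive without `sudo cat` (not in sudoers):
    1. sudo rsync the file into a private staging dir,
    2. open() the copy as the app user,
    3. sudo rm -rf the staging dir (the rsync copy stays root-owned)."""
    staging = tempfile.mkdtemp(prefix="bm_push_")
    try:
        run(["sudo", "rsync", archive_path, staging + "/"], check=True)
        copy = os.path.join(staging, os.path.basename(archive_path))
        with open(copy, "rb") as f:
            return f.read()
    finally:
        run(["sudo", "rm", "-rf", staging], check=True)
```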

All templates updated (url_for with blueprint prefix).
CDC v4.3: Phase 3 ✅ complete, UI roadmap §14 added.
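Phase 3D's chunked push (sha256 + resumable upload) can be illustrated with a stdlib-only sketch. The function name is hypothetical and the HTTP transport is abstracted into a callback; only the chunking, resume, and checksum ideas mirror the protocol:

```python
import hashlib

CHUNK_SIZE = 50 * 1024 * 1024  # assumed server-side default chunk size

def push_archive(data, send_chunk, already_sent=frozenset(),
                 chunk_size=CHUNK_SIZE):
    """Send only the chunks the server has not acknowledged yet
    (resume via the upload_id's recorded chunks) and return the
    sha256 digest the server verifies after reassembly."""
    for n, off in enumerate(range(0, len(data), chunk_size)):
        if n not in already_sent:
            send_chunk(n, data[off:off + chunk_size])
    return hashlib.sha256(data).hexdigest()
```

On resume, the caller asks the server which chunk numbers it already has and passes them as `already_sent`, so only the missing chunks travel again.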

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Cédric Hansen 56 minutes ago
Parent
Commit
6960163b25

+ 49 - 9
doc/CDC_backupmanager_ynh.md

@@ -387,7 +387,7 @@ rsync -az -e "ssh -i $key -p $port" archive.tar archive.info.json user@host:/hom
 - [x] Live DB list in the form (mysql/postgresql)
 - [x] Root-owned archive access via sudo (stat/find/tar/rsync)
 
-### Phase 3 — Federation
+### Phase 3 — Federation ✅ (tested on VPS 2026-05-10)
 **Sub-phases:**
 - **3A** — Foundations: DB (RemoteInstance, RemoteRun, Upload) + full REST API
 - **3B** — Remote instances: registration UI + connection test + state sync
@@ -398,18 +398,58 @@ rsync -az -e "ssh -i $key -p $port" archive.tar archive.info.json user@host:/hom
 **Progress:**
 - [x] 3A — DB models RemoteInstance / RemoteRun / Upload
 - [x] 3A — API: /summary, /archives/<name>/info, /archives/<name>/restore (+status), chunked upload
+- [x] 3A — API: /archives/<name>/download, /archives/<name>/info-json-download
 - [x] 3B — Remote instances UI (list, add, edit, delete, test, sync)
 - [x] 3B — federation/client.py (FederationClient + sync_instance)
-- [ ] 3C — Network dashboard (aggregated view)
-- [ ] 3D — Chunked HTTP archive push from the dashboard
-- [ ] 3E — Remote control (run/restore from the dashboard)
+- [x] 3C — Network dashboard (aggregated local + remote view with statuses)
+- [x] 3D — Chunked HTTP archive push (sha256 + resume via upload_id)
+- [x] 3D — Pull a remote job's latest archive + .info.json
+- [x] 3E — Run a job on a remote instance from the network dashboard
+- [x] 3E — API token shown in Settings + instance URL
+
+**Phase 3 technical notes:**
+- sudo rsync creates root-owned files in /tmp → cleanup via sudo rm -rf (sudoers)
+- Pull: fetches archive_name via get_job_runs() so it always gets the latest version
+- Runs stuck "running" are cleaned up hourly by APScheduler (> 6 h → error)
 
 ### Phase 4 — Polish
-- [ ] GFS retention
-- [ ] JSON config export/import
-- [ ] App backup/restore script for YNH
-- [ ] Automated tests
+- [ ] GFS retention (daily N / weekly N / monthly N)
+- [ ] Archive browser (paginated list + size + restore + push + delete) — see §14
+- [ ] JSON config export/import (jobs, destinations, instances)
+- [ ] App backup/restore script for YNH (SQLite + SSH keys)
+- [ ] Automated tests (pytest)
+
+---
+
+## 14. UI roadmap
+
+Proposals ranked by priority (decided 2026-05-10).
+
+### A — Archive browser *(high priority)*
+A `/archives` page is missing: the dashboard only shows the latest run per job, with no exhaustive view of the archives on disk.
+- Paginated table: name, type, date, size, last-run status
+- Filters: by job, by type, by status
+- Per-row actions: Restore · Push to instance · Download · Delete
+- Total size per job in the table footer
+
+### B — Progress indicator for in-progress runs *(medium priority)*
+The `⟳ en cours` ("running") badge pulses but shows nothing more.
+- Lightweight AJAX polling of `/api/v1/jobs/<id>/runs` every 5 s
+- Refresh status + size without a page reload
+- Clickable link to the live log from the dashboard
+
+### C — Navigation overhaul *(low priority)*
+5 flat links in the navbar → group them:
+- "Réseau" + "Instances" → a **Fédération** dropdown
+- Add "Archives" as a top-level entry
+- Red indicator on "Dashboard" when there is at least one recent error
+
+### D — Recent activity on the home page *(low priority)*
+Below the jobs table, a "Recent activity" section:
+- Last 10 runs across all sources
+- Columns: date · job · status · size · duration
+- Each row clickable → job history
 
 ---
 
-*backupmanager_ynh — CDC v4.2 — Progress updated 2026-05-09 | Phase 2 ✅ | Phase 3 in progress (3A+3B done)*
+*backupmanager_ynh — CDC v4.3 — Progress updated 2026-05-10 | Phase 1 ✅ | Phase 2 ✅ | Phase 3 ✅ | Phase 4 in progress*

+ 21 - 1095
sources/app.py

@@ -1,24 +1,9 @@
-import glob
-import hashlib
 import json
 import logging
-import math
 import os
-import shutil
-import subprocess
-import threading
-import uuid
 from datetime import datetime
 
-from flask import (
-    Flask,
-    flash,
-    jsonify,
-    redirect,
-    render_template,
-    request,
-    url_for,
-)
+from flask import Flask
 from werkzeug.middleware.proxy_fix import ProxyFix
 
 # --- Configuration -----------------------------------------------------------
@@ -33,13 +18,9 @@ app.config.from_pyfile(_config_path)
 app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///" + app.config["DB_PATH"]
 app.config["SQLALCHEMY_TRACK_MODIFICATIONS"] = False
 
-# Proxy headers Nginx → Flask (sub-path + HTTPS)
 app.wsgi_app = ProxyFix(app.wsgi_app, x_for=1, x_proto=1, x_host=1, x_prefix=1)
-
-# Jinja2 filter to deserialize JSON in templates
 app.jinja_env.filters["fromjson"] = json.loads
 
-# Logging
 os.makedirs(os.path.dirname(app.config["LOG_PATH"]), exist_ok=True)
 logging.basicConfig(
     filename=app.config["LOG_PATH"],
@@ -49,33 +30,27 @@ logging.basicConfig(
 
 # --- Extensions --------------------------------------------------------------
 
-from db import db, Job, Run, Destination, Setting, RemoteInstance, RemoteRun, Upload
+from db import db, Job
 
 db.init_app(app)
 
-from scheduler import init_scheduler, schedule_job, remove_job
-
-# --- Startup -----------------------------------------------------------------
+from scheduler import init_scheduler, schedule_job
 
-with app.app_context():
-    db.create_all()
-    init_scheduler(app)
-    for _job in Job.query.filter_by(enabled=True).all():
-        schedule_job(_job)
+# --- Blueprints --------------------------------------------------------------
 
-# --- Auth API ----------------------------------------------------------------
+from blueprints.jobs import bp as bp_jobs
+from blueprints.destinations import bp as bp_dest
+from blueprints.network import bp as bp_network
+from blueprints.settings import bp as bp_cfg
+from blueprints.api import bp as bp_api
 
-@app.before_request
-def _check_api_auth():
-    if not request.path.startswith("/api/"):
-        return
-    if request.path == "/api/v1/health":
-        return
-    token = request.headers.get("X-BackupManager-Key", "")
-    if token != app.config["API_TOKEN"]:
-        return jsonify({"error": "Unauthorized"}), 401
+app.register_blueprint(bp_jobs)
+app.register_blueprint(bp_dest)
+app.register_blueprint(bp_network)
+app.register_blueprint(bp_cfg)
+app.register_blueprint(bp_api)
 
-# --- Context processors ------------------------------------------------------
+# --- Context processor -------------------------------------------------------
 
 @app.context_processor
 def _inject_globals():
@@ -84,1059 +59,10 @@ def _inject_globals():
         "now": datetime.utcnow(),
     }
 
-# --- Helpers -----------------------------------------------------------------
-
-def _read_archive_info(archive_name):
-    backup_dir = app.config["YUNOHOST_BACKUP_DIR"]
-    archive_path = os.path.join(backup_dir, archive_name + ".tar")
-    from jobs.utils import sudo_read_backup_info
-    info = sudo_read_backup_info(archive_path)
-    if not info.get("type"):
-        # Native YunoHost archives: determine the type from the Run table
-        run = Run.query.filter_by(archive_name=archive_name).first()
-        if run:
-            job = db.session.get(Job, run.job_id)
-            if job:
-                info["type"] = job.type
-                info["_from_run"] = True
-    return info
-
-
-def _get_ynh_apps():
-    try:
-        result = subprocess.run(
-            ["sudo", "yunohost", "app", "list", "--output-as", "json"],
-            capture_output=True,
-            text=True,
-            timeout=15,
-        )
-        if result.returncode == 0:
-            return json.loads(result.stdout).get("apps", [])
-    except Exception:
-        pass
-    return []
-
-# --- Dashboard routes --------------------------------------------------------
-
-@app.route("/")
-def index():
-    jobs = Job.query.order_by(Job.name).all()
-    last_runs = {
-        j.id: Run.query.filter_by(job_id=j.id).order_by(Run.started_at.desc()).first()
-        for j in jobs
-    }
-    return render_template("dashboard_local.html", jobs=jobs, last_runs=last_runs)
-
-
-@app.route("/jobs/new", methods=["GET", "POST"])
-def job_new():
-    if request.method == "POST":
-        return _save_job(None)
-    return render_template("job_form.html", job=None, ynh_apps=_get_ynh_apps(),
-                           destinations=Destination.query.filter_by(enabled=True).all())
-
-
-@app.route("/jobs/<int:job_id>/edit", methods=["GET", "POST"])
-def job_edit(job_id):
-    job = db.get_or_404(Job, job_id)
-    if request.method == "POST":
-        return _save_job(job)
-    return render_template("job_form.html", job=job, ynh_apps=_get_ynh_apps(),
-                           destinations=Destination.query.filter_by(enabled=True).all())
-
-
-@app.route("/jobs/<int:job_id>/delete", methods=["POST"])
-def job_delete(job_id):
-    job = db.get_or_404(Job, job_id)
-    remove_job(job.id)
-    db.session.delete(job)
-    db.session.commit()
-    flash(f"Job « {job.name} » supprimé.", "success")
-    return redirect(url_for("index"))
-
-
-@app.route("/jobs/<int:job_id>/run", methods=["POST"])
-def job_run_now(job_id):
-    job = db.get_or_404(Job, job_id)
-    from scheduler import _execute_job
-    import threading
-    t = threading.Thread(target=_execute_job, args=(job.id,), daemon=True)
-    t.start()
-    flash(f"Job « {job.name} » lancé manuellement.", "success")
-    return redirect(url_for("index"))
-
-
-@app.route("/jobs/<int:job_id>/history")
-def job_history(job_id):
-    job = db.get_or_404(Job, job_id)
-    runs = Run.query.filter_by(job_id=job_id).order_by(Run.started_at.desc()).limit(100).all()
-    return render_template("job_history.html", job=job, runs=runs)
-
-
-def _do_restore_job(archive_name, archive_type, restore_run_id):
-    """Run the restore in the background and update the Run."""
-    with app.app_context():
-        run = db.session.get(Run, restore_run_id) if restore_run_id else None
-        try:
-            backup_dir = app.config["YUNOHOST_BACKUP_DIR"]
-            if archive_type == "custom_dir":
-                from jobs.custom_dir import restore_custom_dir
-                log = restore_custom_dir(archive_name, backup_dir)
-            elif archive_type in ("mysql", "postgresql"):
-                from jobs.db_dump import restore_db_dump
-                log = restore_db_dump(archive_name, backup_dir)
-            elif archive_type == "ynh_app":
-                result = subprocess.run(
-                    ["sudo", "yunohost", "backup", "restore", archive_name,
-                     "--apps", "--force"],
-                    capture_output=True, text=True, timeout=3600,
-                )
-                log = (result.stdout + result.stderr).strip()
-                if result.returncode != 0:
-                    raise RuntimeError(f"yunohost backup restore a échoué :\n{log}")
-            elif archive_type == "ynh_system":
-                result = subprocess.run(
-                    ["sudo", "yunohost", "backup", "restore", archive_name,
-                     "--system", "--force"],
-                    capture_output=True, text=True, timeout=3600,
-                )
-                log = (result.stdout + result.stderr).strip()
-                if result.returncode != 0:
-                    raise RuntimeError(f"yunohost backup restore a échoué :\n{log}")
-            else:
-                raise NotImplementedError(
-                    f"Restauration non supportée pour le type '{archive_type}'."
-                )
-            if run:
-                run.status = "success"
-                run.finished_at = datetime.utcnow()
-                run.log_text = f"[RESTAURATION]\n{log or 'OK'}"
-                db.session.commit()
-        except Exception as exc:
-            app.logger.error(f"Restauration {archive_name} échouée : {exc}")
-            if run:
-                run.status = "error"
-                run.finished_at = datetime.utcnow()
-                run.log_text = f"[RESTAURATION]\n{exc}"
-                db.session.commit()
-
-
-def _start_restore(archive_name):
-    """Create a restore Run and start the thread. Returns (restore_run_id, archive_type)."""
-    info = _read_archive_info(archive_name)
-    archive_type = info.get("type", "")
-
-    original_run = Run.query.filter_by(archive_name=archive_name).first()
-    restore_run_id = None
-    if original_run:
-        restore_run = Run(
-            job_id=original_run.job_id,
-            started_at=datetime.utcnow(),
-            status="running",
-            archive_name=archive_name,
-            log_text="[RESTAURATION en cours…]",
-        )
-        db.session.add(restore_run)
-        db.session.commit()
-        restore_run_id = restore_run.id
-
-    threading.Thread(
-        target=_do_restore_job,
-        args=(archive_name, archive_type, restore_run_id),
-        daemon=True,
-    ).start()
-    return restore_run_id, archive_type
-
-
-@app.route("/archives/<path:archive_name>/restore", methods=["GET", "POST"])
-def archive_restore(archive_name):
-    info = _read_archive_info(archive_name)
-
-    if request.method == "GET":
-        return render_template("restore_confirm.html", archive_name=archive_name, info=info)
-
-    _start_restore(archive_name)
-    flash(f"Restauration de « {archive_name} » démarrée en arrière-plan.", "success")
-    return redirect(url_for("index"))
-
-
-@app.route("/jobs/<int:job_id>/toggle", methods=["POST"])
-def job_toggle(job_id):
-    job = db.get_or_404(Job, job_id)
-    job.enabled = not job.enabled
-    job.updated_at = datetime.utcnow()
-    db.session.commit()
-    if job.enabled:
-        schedule_job(job)
-        flash(f"Job « {job.name} » activé.", "success")
-    else:
-        remove_job(job.id)
-        flash(f"Job « {job.name} » désactivé.", "info")
-    return redirect(url_for("index"))
-
-
-def _save_job(job):
-    f = request.form
-    job_type = f.get("type", "")
-    name = f.get("name", "").strip()
-
-    if not name:
-        flash("Le nom est requis.", "error")
-        return render_template("job_form.html", job=job, ynh_apps=_get_ynh_apps(),
-                               destinations=Destination.query.filter_by(enabled=True).all())
-
-    cfg = {}
-    if job_type == "ynh_app":
-        cfg = {"app_id": f.get("app_id", ""), "core_only": f.get("core_only") == "1"}
-    elif job_type == "ynh_system":
-        cfg = {}
-    elif job_type in ("mysql", "postgresql"):
-        dbname = f.get("db_database", "").strip()
-        if not dbname:
-            flash("Le nom de la base de données est requis.", "error")
-            return render_template("job_form.html", job=job, ynh_apps=_get_ynh_apps(),
-                                   destinations=Destination.query.filter_by(enabled=True).all())
-        cfg = {"database": dbname}
-    elif job_type == "custom_dir":
-        source_path = f.get("source_path", "").strip().rstrip("/")
-        if not source_path or not source_path.startswith("/"):
-            flash("Le chemin source doit être un chemin absolu (ex: /opt/monapp).", "error")
-            return render_template("job_form.html", job=job, ynh_apps=_get_ynh_apps(),
-                                   destinations=Destination.query.filter_by(enabled=True).all())
-        excludes = [e.strip() for e in f.get("excludes", "").splitlines() if e.strip()]
-        restore_cfg = {}
-        user_name = f.get("restore_user_name", "").strip()
-        if user_name:
-            restore_cfg["system_user"] = {
-                "name": user_name,
-                "home": f.get("restore_user_home", source_path).strip() or source_path,
-                "shell": f.get("restore_user_shell", "/bin/false").strip() or "/bin/false",
-            }
-        service_name = f.get("restore_service_name", "").strip()
-        if service_name:
-            restore_cfg["systemd_service"] = {
-                "name": service_name,
-                "service_file": f.get("restore_service_file", "").strip(),
-            }
-        owner = f.get("restore_perm_owner", "").strip()
-        mode = f.get("restore_perm_mode", "").strip()
-        if owner or mode:
-            restore_cfg["permissions"] = {}
-            if owner:
-                restore_cfg["permissions"]["owner"] = owner
-            if mode:
-                restore_cfg["permissions"]["mode"] = mode
-        post_cmds = [c.strip() for c in f.get("restore_post_cmds", "").splitlines() if c.strip()]
-        if post_cmds:
-            restore_cfg["post_restore_commands"] = post_cmds
-        cfg = {"source_path": source_path, "excludes": excludes, "restore": restore_cfg}
-
-    if job is None:
-        job = Job()
-        db.session.add(job)
-
-    dest_id = f.get("destination_id", "").strip()
-    job.name = name
-    job.type = job_type
-    job.config_json = json.dumps(cfg)
-    job.cron_expr = f.get("cron_expr", "0 3 * * *").strip()
-    job.retention_mode = f.get("retention_mode", "count")
-    job.retention_value = int(f.get("retention_value", 7))
-    job.enabled = f.get("enabled") == "1"
-    job.core_only = cfg.get("core_only", False)
-    job.destination_id = int(dest_id) if dest_id else None
-    job.updated_at = datetime.utcnow()
-
-    db.session.commit()
-
-    if job.enabled:
-        schedule_job(job)
-    else:
-        remove_job(job.id)
-
-    flash(f"Job « {job.name} » enregistré.", "success")
-    return redirect(url_for("index"))
-
-# --- Destinations ------------------------------------------------------------
-
-@app.route("/destinations")
-def destinations_list():
-    destinations = Destination.query.order_by(Destination.name).all()
-    return render_template("destinations.html", destinations=destinations)
-
-
-@app.route("/destinations/new", methods=["GET", "POST"])
-def destination_new():
-    if request.method == "POST":
-        return _save_destination(None)
-    return render_template("destination_form.html", dest=None)
-
-
-@app.route("/destinations/<int:dest_id>/edit", methods=["GET", "POST"])
-def destination_edit(dest_id):
-    dest = db.get_or_404(Destination, dest_id)
-    if request.method == "POST":
-        return _save_destination(dest)
-    pub_key = _get_pub_key(dest)
-    return render_template("destination_form.html", dest=dest, pub_key=pub_key)
-
-
-@app.route("/destinations/<int:dest_id>/delete", methods=["POST"])
-def destination_delete(dest_id):
-    dest = db.get_or_404(Destination, dest_id)
-    db.session.delete(dest)
-    db.session.commit()
-    flash(f"Destination « {dest.name} » supprimée.", "success")
-    return redirect(url_for("destinations_list"))
-
-
-@app.route("/destinations/<int:dest_id>/test", methods=["POST"])
-def destination_test(dest_id):
-    dest = db.get_or_404(Destination, dest_id)
-    from jobs.transfer import test_connection
-    ok, msg = test_connection(dest, app.config["DATA_DIR"])
-    flash(msg, "success" if ok else "error")
-    return redirect(url_for("destinations_list"))
-
-
-@app.route("/archives/<path:archive_name>/transfer", methods=["POST"])
-def archive_transfer(archive_name):
-    dest_id = request.form.get("destination_id", type=int)
-    dest = db.get_or_404(Destination, dest_id)
-
-    def _do_transfer():
-        with app.app_context():
-            try:
-                from jobs.transfer import transfer_archive
-                transfer_archive(archive_name, dest, app.config["YUNOHOST_BACKUP_DIR"],
-                                 app.config["DATA_DIR"])
-                app.logger.info(f"Transfert {archive_name} → {dest.remote_str} OK")
-            except Exception as exc:
-                app.logger.error(f"Transfert {archive_name} échoué : {exc}")
-
-    import threading
-    threading.Thread(target=_do_transfer, daemon=True).start()
-    flash(f"Transfert de « {archive_name} » vers {dest.remote_str} démarré.", "success")
-    return redirect(request.referrer or url_for("index"))
-
-
-def _save_destination(dest):
-    f = request.form
-    name = f.get("name", "").strip()
-    host = f.get("host", "").strip()
-    if not name or not host:
-        flash("Nom et hôte sont requis.", "error")
-        return render_template("destination_form.html", dest=dest)
-
-    is_new = dest is None
-    if is_new:
-        dest = Destination()
-        db.session.add(dest)
-
-    dest.name = name
-    dest.host = host
-    dest.port = int(f.get("port", 22) or 22)
-    dest.user = f.get("user", "root").strip() or "root"
-    dest.remote_path = f.get("remote_path", "/home/yunohost.backup/archives").strip()
-    dest.enabled = f.get("enabled") == "1"
-    db.session.flush()  # get the id if new
-
-    # Generate the SSH key if missing
-    if not dest.key_name:
-        from jobs.transfer import generate_key
-        dest.key_name = generate_key(dest.name, app.config["DATA_DIR"])
-
-    db.session.commit()
-    flash(f"Destination « {dest.name} » enregistrée.", "success")
-    return redirect(url_for("destination_edit", dest_id=dest.id))
-
-
-def _get_pub_key(dest):
-    if not dest.key_name:
-        return None
-    from jobs.transfer import get_public_key
-    return get_public_key(dest.key_name, app.config["DATA_DIR"])
-
-# --- Settings ----------------------------------------------------------------
-
-_SETTING_KEYS = [
-    "smtp_host", "smtp_port", "smtp_user", "smtp_password",
-    "smtp_from", "smtp_to", "smtp_tls", "smtp_ssl",
-    "notify_on_success", "notify_on_error",
-]
-
-
-def _get_setting(key, default=""):
-    s = Setting.query.filter_by(key=key).first()
-    return s.value if s else default
-
-
-@app.route("/settings", methods=["GET", "POST"])
-def settings():
-    if request.method == "POST":
-        action = request.form.get("action")
-
-        if action == "test_smtp":
-            from notifications import send_test_email
-            try:
-                send_test_email(
-                    host=request.form.get("smtp_host", "").strip(),
-                    port=int(request.form.get("smtp_port", 587) or 587),
-                    user=request.form.get("smtp_user", "").strip(),
-                    password=request.form.get("smtp_password", ""),
-                    from_addr=request.form.get("smtp_from", "").strip(),
-                    to_addr=request.form.get("smtp_to", "").strip(),
-                    use_ssl=request.form.get("smtp_ssl") == "1",
-                    use_tls=request.form.get("smtp_tls") == "1",
-                )
-                flash("Email de test envoyé avec succès.", "success")
-            except Exception as exc:
-                flash(f"Échec du test SMTP : {exc}", "error")
-        else:
-            for key in _SETTING_KEYS:
-                if key in ("smtp_tls", "smtp_ssl", "notify_on_success", "notify_on_error"):
-                    value = "1" if request.form.get(key) == "1" else "0"
-                else:
-                    value = request.form.get(key, "").strip()
-                s = Setting.query.filter_by(key=key).first()
-                if s is None:
-                    s = Setting(key=key, value=value)
-                    db.session.add(s)
-                else:
-                    s.value = value
-            db.session.commit()
-            flash("Paramètres enregistrés.", "success")
-
-        return redirect(url_for("settings"))
-
-    cfg = {k: _get_setting(k) for k in _SETTING_KEYS}
-    cfg.setdefault("smtp_port", "587")
-    cfg["smtp_tls"] = cfg.get("smtp_tls") or "1"
-    cfg["smtp_ssl"] = cfg.get("smtp_ssl") or "0"
-    cfg["notify_on_error"] = cfg.get("notify_on_error") or "1"
-    api_token = app.config.get("API_TOKEN", "")
-    instance_url = app.config.get("INSTANCE_URL", "")
-    return render_template("settings.html", cfg=cfg, api_token=api_token,
-                           instance_url=instance_url)
-
-
-# --- Internal routes (used by forms) -----------------------------------------
-
-@app.route("/internal/databases/<db_type>")
-def internal_databases(db_type):
-    """List the databases available for the job form."""
-    databases = []
-    try:
-        if db_type == "mysql":
-            result = subprocess.run(
-                ["sudo", "mysql", "--skip-column-names", "-e", "SHOW DATABASES;"],
-                capture_output=True, text=True, timeout=10,
-            )
-            if result.returncode == 0:
-                exclude = {"information_schema", "performance_schema", "mysql", "sys"}
-                databases = [d.strip() for d in result.stdout.splitlines()
-                             if d.strip() and d.strip() not in exclude]
-        elif db_type == "postgresql":
-            result = subprocess.run(
-                ["sudo", "-u", "postgres", "psql", "-Atc",
-                 "SELECT datname FROM pg_database WHERE datistemplate = false;"],
-                capture_output=True, text=True, timeout=10,
-            )
-            if result.returncode == 0:
-                databases = [d.strip() for d in result.stdout.splitlines() if d.strip()]
-    except Exception:
-        pass
-    return jsonify(databases)
-
-
-# --- API v1 ------------------------------------------------------------------
-
-@app.route("/api/v1/health")
-def api_health():
-    return jsonify({"status": "ok", "instance": app.config.get("INSTANCE_NAME")})
-
-
-@app.route("/api/v1/jobs")
-def api_jobs():
-    jobs = Job.query.all()
-    return jsonify([
-        {
-            "id": j.id,
-            "name": j.name,
-            "type": j.type,
-            "cron_expr": j.cron_expr,
-            "enabled": j.enabled,
-            "retention_mode": j.retention_mode,
-            "retention_value": j.retention_value,
-        }
-        for j in jobs
-    ])
-
-
-@app.route("/api/v1/jobs/<int:job_id>/runs")
-def api_job_runs(job_id):
-    runs = Run.query.filter_by(job_id=job_id).order_by(Run.started_at.desc()).limit(50).all()
-    return jsonify([
-        {
-            "id": r.id,
-            "started_at": r.started_at.isoformat() if r.started_at else None,
-            "finished_at": r.finished_at.isoformat() if r.finished_at else None,
-            "status": r.status,
-            "archive_name": r.archive_name,
-            "size_bytes": r.size_bytes,
-        }
-        for r in runs
-    ])
-
-
-@app.route("/api/v1/jobs/<int:job_id>/run", methods=["POST"])
-def api_job_run(job_id):
-    job = db.get_or_404(Job, job_id)
-    from scheduler import _execute_job
-    import threading
-    threading.Thread(target=_execute_job, args=(job.id,), daemon=True).start()
-    return jsonify({"status": "triggered", "job_id": job_id})
-
-
-@app.route("/api/v1/archives")
-def api_archives():
-    backup_dir = app.config["YUNOHOST_BACKUP_DIR"]
-    archives = []
-    try:
-        from jobs.utils import sudo_listdir, sudo_getsize, sudo_getmtime
-        for fname in sorted(sudo_listdir(backup_dir)):
-            if fname.endswith(".tar"):
-                path = os.path.join(backup_dir, fname)
-                archives.append({
-                    "name": fname[:-4],
-                    "size_bytes": sudo_getsize(path),
-                    "modified_at": datetime.utcfromtimestamp(sudo_getmtime(path)).isoformat(),
-                })
-    except OSError:
-        pass
-    return jsonify(archives)
-
-
-@app.route("/api/v1/archives/<name>", methods=["DELETE"])
-def api_archive_delete(name):
-    backup_dir = app.config["YUNOHOST_BACKUP_DIR"]
-    from jobs.utils import sudo_exists
-    for ext in (".tar", ".info.json"):
-        path = os.path.join(backup_dir, name + ext)
-        if sudo_exists(path):
-            subprocess.run(["sudo", "rm", "-f", path], capture_output=True)
-    return jsonify({"status": "deleted", "name": name})
-
-
-@app.route("/api/v1/archives/<name>/info")
-def api_archive_info(name):
-    return jsonify(_read_archive_info(name))
-
-
-@app.route("/api/v1/archives/<name>/restore", methods=["POST"])
-def api_archive_restore(name):
-    restore_run_id, _ = _start_restore(name)
-    return jsonify({"status": "started", "run_id": restore_run_id})
-
-
-@app.route("/api/v1/archives/<name>/restore/status")
-def api_archive_restore_status(name):
-    run = (Run.query
-           .filter(Run.archive_name == name, Run.log_text.like("[RESTAURATION%"))
-           .order_by(Run.started_at.desc())
-           .first())
-    if not run:
-        return jsonify({"error": "Aucune restauration trouvée pour cette archive."}), 404
-    return jsonify({
-        "status": run.status,
-        "log": run.log_text,
-        "started_at": run.started_at.isoformat() if run.started_at else None,
-        "finished_at": run.finished_at.isoformat() if run.finished_at else None,
-    })
-
-
-@app.route("/api/v1/summary")
-def api_summary():
-    jobs = Job.query.all()
-    result = []
-    for job in jobs:
-        last_run = (Run.query.filter_by(job_id=job.id)
-                    .order_by(Run.started_at.desc()).first())
-        result.append({
-            "id": job.id,
-            "name": job.name,
-            "type": job.type,
-            "cron_expr": job.cron_expr,
-            "enabled": job.enabled,
-            "last_run": {
-                "id": last_run.id,
-                "started_at": last_run.started_at.isoformat() if last_run.started_at else None,
-                "status": last_run.status,
-                "archive_name": last_run.archive_name,
-                "size_bytes": last_run.size_bytes,
-            } if last_run else None,
-        })
-    return jsonify({"instance": app.config.get("INSTANCE_NAME"), "jobs": result})
-
-
-# --- Chunked upload ----------------------------------------------------------
-
-@app.route("/api/v1/archives/upload/start", methods=["POST"])
-def api_upload_start():
-    data = request.get_json(force=True) or {}
-    filename = data.get("filename", "")
-    total_size = int(data.get("total_size", 0))
-    chunk_size = int(data.get("chunk_size", 50 * 1024 * 1024))
-    chunks_total = int(data.get("chunks_total", math.ceil(total_size / chunk_size) if chunk_size else 1))
-    checksum = data.get("checksum", "")
-
-    if not filename:
-        return jsonify({"error": "filename requis"}), 400
-
-    upload_id = str(uuid.uuid4())
-    upload = Upload(
-        upload_id=upload_id,
-        filename=filename,
-        total_size=total_size,
-        chunk_size=chunk_size,
-        chunks_total=chunks_total,
-        chunks_received=0,
-        checksum=checksum,
-        status="pending",
-    )
-    db.session.add(upload)
-    db.session.commit()
-    return jsonify({"upload_id": upload_id, "chunks_total": chunks_total})
-
-
-@app.route("/api/v1/archives/upload/<upload_id>/chunk/<int:n>", methods=["POST"])
-def api_upload_chunk(upload_id, n):
-    upload = db.get_or_404(Upload, upload_id)
-    if upload.status == "complete":
-        return jsonify({"error": "upload déjà terminé"}), 400
-
-    tmp_dir = os.path.join(app.config["DATA_DIR"], "uploads", upload_id)
-    os.makedirs(tmp_dir, exist_ok=True)
-
-    chunk_path = os.path.join(tmp_dir, f"chunk_{n:06d}")
-    with open(chunk_path, "wb") as f:
-        f.write(request.data)
-
-    upload.chunks_received = (upload.chunks_received or 0) + 1
-    upload.status = "in_progress"
-    db.session.commit()
-    return jsonify({"chunk": n, "received": upload.chunks_received})
-
-
-@app.route("/api/v1/archives/upload/<upload_id>/finish", methods=["POST"])
-def api_upload_finish(upload_id):
-    upload = db.get_or_404(Upload, upload_id)
-    tmp_dir = os.path.join(app.config["DATA_DIR"], "uploads", upload_id)
-    backup_dir = app.config["YUNOHOST_BACKUP_DIR"]
-
-    chunk_files = sorted(glob.glob(os.path.join(tmp_dir, "chunk_*")))
-    if not chunk_files:
-        return jsonify({"error": "aucun chunk reçu"}), 400
-
-    tmp_archive = os.path.join(tmp_dir, upload.filename)
-    sha256 = hashlib.sha256()
-    with open(tmp_archive, "wb") as out:
-        for chunk_file in chunk_files:
-            with open(chunk_file, "rb") as f:
-                data = f.read()
-                out.write(data)
-                sha256.update(data)
-
-    if upload.checksum and sha256.hexdigest() != upload.checksum:
-        upload.status = "error"
-        db.session.commit()
-        shutil.rmtree(tmp_dir, ignore_errors=True)
-        return jsonify({"error": "checksum invalide"}), 400
-
-    dest_path = os.path.join(backup_dir, upload.filename)
-    result = subprocess.run(
-        ["sudo", "rsync", tmp_archive, dest_path],
-        capture_output=True, text=True,
-    )
-
-    if result.returncode != 0:
-        upload.status = "error"
-        db.session.commit()
-        shutil.rmtree(tmp_dir, ignore_errors=True)
-        return jsonify({"error": result.stderr.strip()}), 500
-
-    # .info.json optionnel transmis dans le body JSON
-    data = request.get_json(silent=True) or {}
-    info_json_str = data.get("info_json")
-    if info_json_str:
-        archive_base = upload.filename[:-4] if upload.filename.endswith(".tar") else upload.filename
-        tmp_info = os.path.join(tmp_dir, archive_base + ".info.json")
-        with open(tmp_info, "w") as f:
-            f.write(info_json_str)
-        subprocess.run(
-            ["sudo", "rsync", tmp_info,
-             os.path.join(backup_dir, archive_base + ".info.json")],
-            capture_output=True,
-        )
-
-    shutil.rmtree(tmp_dir, ignore_errors=True)
-    upload.status = "complete"
-    db.session.commit()
-    return jsonify({"status": "complete", "filename": upload.filename})
-
-
-@app.route("/api/v1/archives/<name>/info-json-download")
-def api_archive_info_json_download(name):
-    """Téléchargement du .info.json via sudo rsync (pour pull inter-instances)."""
-    from jobs.utils import sudo_exists
-    backup_dir = app.config["YUNOHOST_BACKUP_DIR"]
-    info_path = os.path.join(backup_dir, name + ".info.json")
-    if not sudo_exists(info_path):
-        return jsonify({"error": "info.json introuvable"}), 404
-    tmp_path = f"/tmp/backupmanager_dl_{name}.info.json"
-    content = None
-    try:
-        result = subprocess.run(["sudo", "rsync", info_path, tmp_path],
-                                capture_output=True, text=True)
-        if result.returncode != 0:
-            return jsonify({"error": result.stderr.strip()}), 500
-        with open(tmp_path, "rb") as f:
-            content = f.read()
-    except Exception as exc:
-        return jsonify({"error": str(exc)}), 500
-    finally:
-        subprocess.run(["sudo", "rm", "-rf", tmp_path], capture_output=True)
-    from flask import Response as _R
-    return _R(content, mimetype="application/json")
-
-
-@app.route("/api/v1/archives/<name>/download")
-def api_archive_download(name):
-    """Téléchargement d'une archive via sudo rsync vers /tmp (pour pull inter-instances)."""
-    from flask import Response, stream_with_context
-    from jobs.utils import sudo_exists
-
-    backup_dir = app.config["YUNOHOST_BACKUP_DIR"]
-    archive_path = os.path.join(backup_dir, name + ".tar")
-    if not sudo_exists(archive_path):
-        return jsonify({"error": "archive introuvable"}), 404
-
-    tmp_path = f"/tmp/backupmanager_dl_{name}.tar"
-    try:
-        result = subprocess.run(
-            ["sudo", "rsync", archive_path, tmp_path],
-            capture_output=True, text=True, timeout=3600,
-        )
-        if result.returncode != 0:
-            return jsonify({"error": result.stderr.strip()}), 500
-
-        def stream_and_cleanup():
-            try:
-                with open(tmp_path, "rb") as f:
-                    while True:
-                        chunk = f.read(1024 * 1024)
-                        if not chunk:
-                            break
-                        yield chunk
-            finally:
-                if os.path.exists(tmp_path):
-                    os.unlink(tmp_path)
-
-        return Response(
-            stream_with_context(stream_and_cleanup()),
-            mimetype="application/octet-stream",
-            headers={"Content-Disposition": f'attachment; filename="{name}.tar"'},
-        )
-    except Exception as exc:
-        if os.path.exists(tmp_path):
-            os.unlink(tmp_path)
-        return jsonify({"error": str(exc)}), 500
-
-
-@app.route("/api/v1/archives/upload/<upload_id>", methods=["DELETE"])
-def api_upload_cancel(upload_id):
-    upload = db.get_or_404(Upload, upload_id)
-    tmp_dir = os.path.join(app.config["DATA_DIR"], "uploads", upload_id)
-    shutil.rmtree(tmp_dir, ignore_errors=True)
-    db.session.delete(upload)
-    db.session.commit()
-    return jsonify({"status": "cancelled"})
-
-
-# --- Instances distantes (3B) -------------------------------------------------
-
-@app.route("/remote-instances")
-def remote_instances_list():
-    instances = RemoteInstance.query.order_by(RemoteInstance.name).all()
-    return render_template("remote_instances.html", instances=instances)
-
-
-@app.route("/remote-instances/new", methods=["GET", "POST"])
-def remote_instance_new():
-    if request.method == "POST":
-        return _save_remote_instance(None)
-    return render_template("remote_instance_form.html", inst=None)
-
-
-@app.route("/remote-instances/<int:inst_id>/edit", methods=["GET", "POST"])
-def remote_instance_edit(inst_id):
-    inst = db.get_or_404(RemoteInstance, inst_id)
-    if request.method == "POST":
-        return _save_remote_instance(inst)
-    return render_template("remote_instance_form.html", inst=inst)
-
-
-@app.route("/remote-instances/<int:inst_id>/delete", methods=["POST"])
-def remote_instance_delete(inst_id):
-    inst = db.get_or_404(RemoteInstance, inst_id)
-    db.session.delete(inst)
-    db.session.commit()
-    flash(f"Instance « {inst.name} » supprimée.", "success")
-    return redirect(url_for("remote_instances_list"))
-
-
-@app.route("/remote-instances/<int:inst_id>/test", methods=["POST"])
-def remote_instance_test(inst_id):
-    inst = db.get_or_404(RemoteInstance, inst_id)
-    from federation.client import FederationClient
-    try:
-        data = FederationClient(inst).health()
-        inst.status = "online"
-        inst.last_seen = datetime.utcnow()
-        db.session.commit()
-        flash(f"Instance « {inst.name} » en ligne — {data.get('instance', '?')}.", "success")
-    except Exception as exc:
-        inst.status = "error"
-        db.session.commit()
-        flash(f"Connexion échouée vers « {inst.name} » : {exc}", "error")
-    return redirect(url_for("remote_instances_list"))
-
-
-@app.route("/remote-instances/<int:inst_id>/sync", methods=["POST"])
-def remote_instance_sync(inst_id):
-    inst = db.get_or_404(RemoteInstance, inst_id)
-    from federation.client import sync_instance
-    try:
-        sync_instance(inst)
-        flash(f"Instance « {inst.name} » synchronisée.", "success")
-    except Exception as exc:
-        flash(f"Synchronisation échouée pour « {inst.name} » : {exc}", "error")
-    return redirect(url_for("remote_instances_list"))
-
-
-@app.route("/network")
-def dashboard_network():
-    local_jobs = Job.query.order_by(Job.name).all()
-    local_jobs_data = []
-    for job in local_jobs:
-        run = Run.query.filter_by(job_id=job.id).order_by(Run.started_at.desc()).first()
-        local_jobs_data.append(_JobRow(
-            job_id=job.id, name=job.name, type=job.type,
-            last_run_at=run.started_at if run else None,
-            last_status=run.status if run else None,
-            last_archive_name=run.archive_name if run else None,
-            last_size_bytes=run.size_bytes if run else None,
-        ))
-    instances = RemoteInstance.query.order_by(RemoteInstance.name).all()
-    return render_template("dashboard_network.html",
-                           local_jobs_data=local_jobs_data,
-                           instances=instances,
-                           instances_for_push=instances)
-
-
-@app.route("/network/sync-all", methods=["POST"])
-def network_sync_all():
-    from federation.client import sync_instance
-    instances = RemoteInstance.query.all()
-    errors = []
-    for inst in instances:
-        try:
-            sync_instance(inst)
-        except Exception as exc:
-            errors.append(f"{inst.name}: {exc}")
-    if errors:
-        flash("Synchronisation partielle — " + " | ".join(errors), "error")
-    else:
-        flash(f"{len(instances)} instance(s) synchronisée(s).", "success")
-    return redirect(url_for("dashboard_network"))
-
-
-@app.route("/remote-instances/<int:inst_id>/run-job/<int:job_id>", methods=["POST"])
-def remote_job_run(inst_id, job_id):
-    inst = db.get_or_404(RemoteInstance, inst_id)
-    from federation.client import FederationClient
-    try:
-        FederationClient(inst).run_job(job_id)
-        flash(f"Job déclenché sur « {inst.name} ».", "success")
-    except Exception as exc:
-        flash(f"Impossible de lancer le job sur « {inst.name} » : {exc}", "error")
-    return redirect(url_for("dashboard_network"))
-
-
-@app.route("/archives/<path:archive_name>/push/<int:inst_id>", methods=["POST"])
-def archive_push(archive_name, inst_id):
-    inst = db.get_or_404(RemoteInstance, inst_id)
-    threading.Thread(target=_do_push_archive, args=(archive_name, inst.id), daemon=True).start()
-    flash(f"Envoi de « {archive_name} » vers « {inst.name} » démarré en arrière-plan.", "success")
-    return redirect(request.referrer or url_for("index"))
-
-
-@app.route("/remote-instances/<int:inst_id>/pull-latest/<int:remote_job_id>", methods=["POST"])
-def archive_pull_latest(inst_id, remote_job_id):
-    inst = db.get_or_404(RemoteInstance, inst_id)
-    threading.Thread(target=_do_pull_latest, args=(inst.id, remote_job_id), daemon=True).start()
-    flash(f"Rapatriement depuis « {inst.name} » démarré en arrière-plan.", "success")
-    return redirect(url_for("dashboard_network"))
-
-
-def _do_push_archive(archive_name, inst_id):
-    """Pousse une archive locale vers une instance distante via HTTP chunked."""
-    import hashlib as _hashlib
-    from federation.client import FederationClient
-    from jobs.utils import sudo_exists
-
-    with app.app_context():
-        inst = db.session.get(RemoteInstance, inst_id)
-        backup_dir = app.config["YUNOHOST_BACKUP_DIR"]
-        archive_path = os.path.join(backup_dir, archive_name + ".tar")
-
-        tmp_path = None
-        try:
-            # Copie vers /tmp accessible par l'app
-            tmp_path = f"/tmp/backupmanager_push_{archive_name}.tar"
-            result = subprocess.run(
-                ["sudo", "rsync", archive_path, tmp_path],
-                capture_output=True, text=True,
-            )
-            if result.returncode != 0:
-                raise RuntimeError(f"Copie locale échouée : {result.stderr.strip()}")
-
-            total_size = os.path.getsize(tmp_path)
-            sha256 = _hashlib.sha256()
-            chunk_size = 50 * 1024 * 1024
-            with open(tmp_path, "rb") as f:
-                while True:
-                    data = f.read(65536)
-                    if not data:
-                        break
-                    sha256.update(data)
-            checksum = sha256.hexdigest()
-
-            client = FederationClient(inst)
-            upload_info = client.upload_start(archive_name + ".tar", total_size, checksum, chunk_size)
-            upload_id = upload_info["upload_id"]
-
-            with open(tmp_path, "rb") as f:
-                n = 0
-                while True:
-                    data = f.read(chunk_size)
-                    if not data:
-                        break
-                    client.upload_chunk(upload_id, n, data)
-                    n += 1
-
-            # Finish + transmettre le .info.json si présent
-            info_json_content = None
-            info_path = os.path.join(backup_dir, archive_name + ".info.json")
-            if sudo_exists(info_path):
-                r = subprocess.run(["sudo", "cat", info_path], capture_output=True)
-                if r.returncode == 0:
-                    info_json_content = r.stdout.decode("utf-8", errors="replace")
-
-            client.upload_finish_with_info(upload_id, info_json_content)
-            app.logger.info(f"Push {archive_name} → {inst.name} OK")
-
-        except Exception as exc:
-            app.logger.error(f"Push {archive_name} → {inst.name} échoué : {exc}")
-        finally:
-            if tmp_path and os.path.exists(tmp_path):
-                os.unlink(tmp_path)
-
-
-def _do_pull_latest(inst_id, remote_job_id):
-    """Rapatrie la dernière archive d'un job distant (.tar + .info.json)."""
-    from federation.client import FederationClient, sync_instance
-
-    with app.app_context():
-        inst = db.session.get(RemoteInstance, inst_id)
-        backup_dir = app.config["YUNOHOST_BACKUP_DIR"]
-        try:
-            client = FederationClient(inst)
-
-            # Sync pour obtenir la dernière archive
-            sync_instance(inst)
-            db.session.refresh(inst)
-
-            # Récupère le dernier run de ce job distant
-            runs = client.get_job_runs(remote_job_id)
-            if not runs:
-                raise RuntimeError(f"Aucun run distant pour le job {remote_job_id}")
-            archive_name = runs[0].get("archive_name")
-            if not archive_name:
-                raise RuntimeError("Le dernier run distant n'a pas d'archive.")
-
-            # Télécharge le .tar
-            archive_bytes = client.download_archive(archive_name)
-            tmp_tar = f"/tmp/backupmanager_pull_{archive_name}.tar"
-            with open(tmp_tar, "wb") as f:
-                f.write(archive_bytes)
-            subprocess.run(["sudo", "rsync", tmp_tar,
-                            os.path.join(backup_dir, archive_name + ".tar")], check=True)
-            os.unlink(tmp_tar)
-
-            # Télécharge le .info.json si disponible
-            info_bytes = client.download_info_json(archive_name)
-            if info_bytes:
-                tmp_info = f"/tmp/backupmanager_pull_{archive_name}.info.json"
-                with open(tmp_info, "wb") as f:
-                    f.write(info_bytes)
-                subprocess.run(["sudo", "rsync", tmp_info,
-                                os.path.join(backup_dir, archive_name + ".info.json")],
-                               check=True)
-                os.unlink(tmp_info)
-            else:
-                app.logger.warning(f"Pull {archive_name}: .info.json absent ou inaccessible sur {inst.name}")
-
-            app.logger.info(f"Pull {archive_name} ← {inst.name} OK")
-        except Exception as exc:
-            app.logger.error(f"Pull ← {inst.name} échoué : {exc}")
-
-
-class _JobRow:
-    """DTO pour le dashboard réseau (local et distant)."""
-    def __init__(self, job_id, name, type, last_run_at, last_status,
-                 last_archive_name, last_size_bytes):
-        self.job_id = job_id
-        self.name = name
-        self.type = type
-        self.last_run_at = last_run_at
-        self.last_status = last_status
-        self.last_archive_name = last_archive_name
-        self.last_size_bytes = last_size_bytes
-
-    @property
-    def size_human(self):
-        from db import _size_human
-        return _size_human(self.last_size_bytes)
-
-
-def _save_remote_instance(inst):
-    f = request.form
-    name = f.get("name", "").strip()
-    url = f.get("url", "").strip().rstrip("/")
-    api_key = f.get("api_key", "").strip()
-
-    if not name or not url or not api_key:
-        flash("Nom, URL et token API sont requis.", "error")
-        return render_template("remote_instance_form.html", inst=inst)
-
-    if inst is None:
-        inst = RemoteInstance()
-        db.session.add(inst)
+# --- Démarrage ---------------------------------------------------------------
 
-    inst.name = name
-    inst.url = url
-    inst.api_key = api_key
-    db.session.commit()
-    flash(f"Instance « {inst.name} » enregistrée.", "success")
-    return redirect(url_for("remote_instances_list"))
+with app.app_context():
+    db.create_all()
+    init_scheduler(app)
+    for _job in Job.query.filter_by(enabled=True).all():
+        schedule_job(_job)
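The slim bootstrap this hunk leaves behind only has to wire the five blueprints together. A minimal, self-contained sketch of that wiring (blueprint and route names here are stand-ins, not the real modules; in the actual code each `blueprints/*.py` exposes a `bp` and `app.py` calls `register_blueprint` on each):

```python
from flask import Blueprint, Flask, jsonify

# Stand-in for blueprints/api.py: the bp carries its own url_prefix,
# so routes land under /api/v1/* after registration.
api_bp = Blueprint("api", __name__, url_prefix="/api/v1")

@api_bp.route("/health")
def api_health():
    return jsonify({"status": "ok"})

# Stand-in for blueprints/jobs.py: the local dashboard at /.
jobs_bp = Blueprint("jobs", __name__)

@jobs_bp.route("/")
def index():
    return "dashboard"

def create_app():
    app = Flask(__name__)
    for bp in (jobs_bp, api_bp):
        app.register_blueprint(bp)
    return app
```

With this layout, templates reference endpoints with the blueprint prefix (`url_for("jobs.index")`, `url_for("api.api_health")`), which is exactly why every template had to be updated in this commit.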

+ 0 - 0
sources/blueprints/__init__.py


+ 345 - 0
sources/blueprints/api.py

@@ -0,0 +1,345 @@
+import glob
+import hashlib
+import math
+import os
+import shutil
+import subprocess
+import uuid
+from datetime import datetime
+
+from flask import (
+    Blueprint,
+    Response,
+    current_app,
+    jsonify,
+    request,
+    stream_with_context,
+)
+
+from db import db, Job, Run, Upload
+from helpers import read_archive_info
+
+bp = Blueprint("api", __name__, url_prefix="/api/v1")
+
+
+@bp.before_request
+def _check_api_auth():
+    if request.endpoint == "api.api_health":
+        return
+    token = request.headers.get("X-BackupManager-Key", "")
+    if token != current_app.config["API_TOKEN"]:
+        return jsonify({"error": "Unauthorized"}), 401
+
+
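The `!=` check above is functionally fine; if hardening is ever wanted, the standard library's `hmac.compare_digest` gives a constant-time comparison. A sketch (not part of this commit, the helper name is invented):

```python
import hmac

def token_ok(presented: str, expected: str) -> bool:
    """Compare API tokens in constant time to avoid timing side channels."""
    # compare_digest needs both operands as str or both as bytes;
    # encoding both keeps the comparison well-defined.
    return hmac.compare_digest(presented.encode(), expected.encode())
```

Inside `_check_api_auth` this would replace `token != current_app.config["API_TOKEN"]` with `not token_ok(token, current_app.config["API_TOKEN"])`.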
+# --- Santé / jobs -------------------------------------------------------------
+
+@bp.route("/health")
+def api_health():
+    return jsonify({"status": "ok", "instance": current_app.config.get("INSTANCE_NAME")})
+
+
+@bp.route("/jobs")
+def api_jobs():
+    jobs = Job.query.all()
+    return jsonify([
+        {
+            "id": j.id,
+            "name": j.name,
+            "type": j.type,
+            "cron_expr": j.cron_expr,
+            "enabled": j.enabled,
+            "retention_mode": j.retention_mode,
+            "retention_value": j.retention_value,
+        }
+        for j in jobs
+    ])
+
+
+@bp.route("/jobs/<int:job_id>/runs")
+def api_job_runs(job_id):
+    runs = Run.query.filter_by(job_id=job_id).order_by(Run.started_at.desc()).limit(50).all()
+    return jsonify([
+        {
+            "id": r.id,
+            "started_at": r.started_at.isoformat() if r.started_at else None,
+            "finished_at": r.finished_at.isoformat() if r.finished_at else None,
+            "status": r.status,
+            "archive_name": r.archive_name,
+            "size_bytes": r.size_bytes,
+        }
+        for r in runs
+    ])
+
+
+@bp.route("/jobs/<int:job_id>/run", methods=["POST"])
+def api_job_run(job_id):
+    import threading
+    job = db.get_or_404(Job, job_id)
+    from scheduler import _execute_job
+    threading.Thread(target=_execute_job, args=(job.id,), daemon=True).start()
+    return jsonify({"status": "triggered", "job_id": job_id})
+
+
+# --- Archives -----------------------------------------------------------------
+
+@bp.route("/archives")
+def api_archives():
+    backup_dir = current_app.config["YUNOHOST_BACKUP_DIR"]
+    archives = []
+    try:
+        from jobs.utils import sudo_listdir, sudo_getsize, sudo_getmtime
+        for fname in sorted(sudo_listdir(backup_dir)):
+            if fname.endswith(".tar"):
+                path = os.path.join(backup_dir, fname)
+                archives.append({
+                    "name": fname[:-4],
+                    "size_bytes": sudo_getsize(path),
+                    "modified_at": datetime.utcfromtimestamp(sudo_getmtime(path)).isoformat(),
+                })
+    except OSError:
+        pass
+    return jsonify(archives)
+
+
+@bp.route("/archives/<name>", methods=["DELETE"])
+def api_archive_delete(name):
+    backup_dir = current_app.config["YUNOHOST_BACKUP_DIR"]
+    from jobs.utils import sudo_exists
+    for ext in (".tar", ".info.json"):
+        path = os.path.join(backup_dir, name + ext)
+        if sudo_exists(path):
+            subprocess.run(["sudo", "rm", "-f", path], capture_output=True)
+    return jsonify({"status": "deleted", "name": name})
+
+
+@bp.route("/archives/<name>/info")
+def api_archive_info(name):
+    backup_dir = current_app.config["YUNOHOST_BACKUP_DIR"]
+    return jsonify(read_archive_info(name, backup_dir))
+
+
+@bp.route("/archives/<name>/restore", methods=["POST"])
+def api_archive_restore(name):
+    from blueprints.jobs import _start_restore
+    restore_run_id, _ = _start_restore(name)
+    return jsonify({"status": "started", "run_id": restore_run_id})
+
+
+@bp.route("/archives/<name>/restore/status")
+def api_archive_restore_status(name):
+    run = (Run.query
+           .filter(Run.archive_name == name, Run.log_text.like("[RESTAURATION%"))
+           .order_by(Run.started_at.desc())
+           .first())
+    if not run:
+        return jsonify({"error": "Aucune restauration trouvée pour cette archive."}), 404
+    return jsonify({
+        "status": run.status,
+        "log": run.log_text,
+        "started_at": run.started_at.isoformat() if run.started_at else None,
+        "finished_at": run.finished_at.isoformat() if run.finished_at else None,
+    })
+
+
+@bp.route("/summary")
+def api_summary():
+    jobs = Job.query.all()
+    result = []
+    for job in jobs:
+        last_run = (Run.query.filter_by(job_id=job.id)
+                    .order_by(Run.started_at.desc()).first())
+        result.append({
+            "id": job.id,
+            "name": job.name,
+            "type": job.type,
+            "cron_expr": job.cron_expr,
+            "enabled": job.enabled,
+            "last_run": {
+                "id": last_run.id,
+                "started_at": last_run.started_at.isoformat() if last_run.started_at else None,
+                "status": last_run.status,
+                "archive_name": last_run.archive_name,
+                "size_bytes": last_run.size_bytes,
+            } if last_run else None,
+        })
+    return jsonify({"instance": current_app.config.get("INSTANCE_NAME"), "jobs": result})
+
+
+# --- Téléchargement archives --------------------------------------------------
+
+@bp.route("/archives/<name>/info-json-download")
+def api_archive_info_json_download(name):
+    from jobs.utils import sudo_exists
+    backup_dir = current_app.config["YUNOHOST_BACKUP_DIR"]
+    info_path = os.path.join(backup_dir, name + ".info.json")
+    if not sudo_exists(info_path):
+        return jsonify({"error": "info.json introuvable"}), 404
+    tmp_path = f"/tmp/backupmanager_dl_{name}.info.json"
+    content = None
+    try:
+        result = subprocess.run(["sudo", "rsync", info_path, tmp_path],
+                                capture_output=True, text=True)
+        if result.returncode != 0:
+            return jsonify({"error": result.stderr.strip()}), 500
+        with open(tmp_path, "rb") as f:
+            content = f.read()
+    except Exception as exc:
+        return jsonify({"error": str(exc)}), 500
+    finally:
+        subprocess.run(["sudo", "rm", "-rf", tmp_path], capture_output=True)
+    return Response(content, mimetype="application/json")
+
+
+@bp.route("/archives/<name>/download")
+def api_archive_download(name):
+    from jobs.utils import sudo_exists
+    backup_dir = current_app.config["YUNOHOST_BACKUP_DIR"]
+    archive_path = os.path.join(backup_dir, name + ".tar")
+    if not sudo_exists(archive_path):
+        return jsonify({"error": "archive introuvable"}), 404
+
+    tmp_path = f"/tmp/backupmanager_dl_{name}.tar"
+    try:
+        result = subprocess.run(
+            ["sudo", "rsync", archive_path, tmp_path],
+            capture_output=True, text=True, timeout=3600,
+        )
+        if result.returncode != 0:
+            return jsonify({"error": result.stderr.strip()}), 500
+
+        def stream_and_cleanup():
+            try:
+                with open(tmp_path, "rb") as f:
+                    while True:
+                        chunk = f.read(1024 * 1024)
+                        if not chunk:
+                            break
+                        yield chunk
+            finally:
+                if os.path.exists(tmp_path):
+                    os.unlink(tmp_path)
+
+        return Response(
+            stream_with_context(stream_and_cleanup()),
+            mimetype="application/octet-stream",
+            headers={"Content-Disposition": f'attachment; filename="{name}.tar"'},
+        )
+    except Exception as exc:
+        if os.path.exists(tmp_path):
+            os.unlink(tmp_path)
+        return jsonify({"error": str(exc)}), 500
+
+
+# --- Upload chunked -----------------------------------------------------------
+
+@bp.route("/archives/upload/start", methods=["POST"])
+def api_upload_start():
+    data = request.get_json(force=True) or {}
+    filename = data.get("filename", "")
+    total_size = int(data.get("total_size", 0))
+    chunk_size = int(data.get("chunk_size", 50 * 1024 * 1024))
+    chunks_total = int(data.get("chunks_total",
+                                math.ceil(total_size / chunk_size) if chunk_size else 1))
+    checksum = data.get("checksum", "")
+
+    if not filename:
+        return jsonify({"error": "filename requis"}), 400
+
+    upload_id = str(uuid.uuid4())
+    upload = Upload(
+        upload_id=upload_id,
+        filename=filename,
+        total_size=total_size,
+        chunk_size=chunk_size,
+        chunks_total=chunks_total,
+        chunks_received=0,
+        checksum=checksum,
+        status="pending",
+    )
+    db.session.add(upload)
+    db.session.commit()
+    return jsonify({"upload_id": upload_id, "chunks_total": chunks_total})
+
+
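A client kicking off an upload computes the same chunk count before calling `/archives/upload/start`. A hedged sketch of that payload (the helper name is invented; field names match the route above):

```python
import math

def make_start_payload(filename, total_size, checksum,
                       chunk_size=50 * 1024 * 1024):
    """Build the JSON body api_upload_start expects."""
    # Mirrors the server-side fallback: ceil(total/chunk), or 1 if
    # chunk_size is falsy.
    chunks_total = math.ceil(total_size / chunk_size) if chunk_size else 1
    return {
        "filename": filename,
        "total_size": total_size,
        "chunk_size": chunk_size,
        "chunks_total": chunks_total,
        "checksum": checksum,
    }
```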
+@bp.route("/archives/upload/<upload_id>/chunk/<int:n>", methods=["POST"])
+def api_upload_chunk(upload_id, n):
+    upload = db.get_or_404(Upload, upload_id)
+    if upload.status == "complete":
+        return jsonify({"error": "upload déjà terminé"}), 400
+
+    tmp_dir = os.path.join(current_app.config["DATA_DIR"], "uploads", upload_id)
+    os.makedirs(tmp_dir, exist_ok=True)
+
+    chunk_path = os.path.join(tmp_dir, f"chunk_{n:06d}")
+    with open(chunk_path, "wb") as f:
+        f.write(request.data)
+
+    upload.chunks_received = (upload.chunks_received or 0) + 1
+    upload.status = "in_progress"
+    db.session.commit()
+    return jsonify({"chunk": n, "received": upload.chunks_received})
+
+
+@bp.route("/archives/upload/<upload_id>/finish", methods=["POST"])
+def api_upload_finish(upload_id):
+    upload = db.get_or_404(Upload, upload_id)
+    tmp_dir = os.path.join(current_app.config["DATA_DIR"], "uploads", upload_id)
+    backup_dir = current_app.config["YUNOHOST_BACKUP_DIR"]
+
+    chunk_files = sorted(glob.glob(os.path.join(tmp_dir, "chunk_*")))
+    if not chunk_files:
+        return jsonify({"error": "aucun chunk reçu"}), 400
+
+    tmp_archive = os.path.join(tmp_dir, upload.filename)
+    sha256 = hashlib.sha256()
+    with open(tmp_archive, "wb") as out:
+        for chunk_file in chunk_files:
+            with open(chunk_file, "rb") as f:
+                data = f.read()
+                out.write(data)
+                sha256.update(data)
+
+    if upload.checksum and sha256.hexdigest() != upload.checksum:
+        upload.status = "error"
+        db.session.commit()
+        shutil.rmtree(tmp_dir, ignore_errors=True)
+        return jsonify({"error": "checksum invalide"}), 400
+
+    dest_path = os.path.join(backup_dir, upload.filename)
+    result = subprocess.run(
+        ["sudo", "rsync", tmp_archive, dest_path],
+        capture_output=True, text=True,
+    )
+    if result.returncode != 0:
+        upload.status = "error"
+        db.session.commit()
+        shutil.rmtree(tmp_dir, ignore_errors=True)
+        return jsonify({"error": result.stderr.strip()}), 500
+
+    data = request.get_json(silent=True) or {}
+    info_json_str = data.get("info_json")
+    if info_json_str:
+        archive_base = upload.filename[:-4] if upload.filename.endswith(".tar") else upload.filename
+        tmp_info = os.path.join(tmp_dir, archive_base + ".info.json")
+        with open(tmp_info, "w") as f:
+            f.write(info_json_str)
+        subprocess.run(
+            ["sudo", "rsync", tmp_info,
+             os.path.join(backup_dir, archive_base + ".info.json")],
+            capture_output=True,
+        )
+
+    shutil.rmtree(tmp_dir, ignore_errors=True)
+    upload.status = "complete"
+    db.session.commit()
+    return jsonify({"status": "complete", "filename": upload.filename})
+
+
+@bp.route("/archives/upload/<upload_id>", methods=["DELETE"])
+def api_upload_cancel(upload_id):
+    upload = db.get_or_404(Upload, upload_id)
+    tmp_dir = os.path.join(current_app.config["DATA_DIR"], "uploads", upload_id)
+    shutil.rmtree(tmp_dir, ignore_errors=True)
+    db.session.delete(upload)
+    db.session.commit()
+    return jsonify({"status": "cancelled"})

+ 113 - 0
sources/blueprints/destinations.py

@@ -0,0 +1,113 @@
+import threading
+
+from flask import (
+    Blueprint,
+    current_app,
+    flash,
+    redirect,
+    render_template,
+    request,
+    url_for,
+)
+
+from db import db, Destination
+
+bp = Blueprint("dest", __name__)
+
+
+@bp.route("/destinations")
+def destinations_list():
+    destinations = Destination.query.order_by(Destination.name).all()
+    return render_template("destinations.html", destinations=destinations)
+
+
+@bp.route("/destinations/new", methods=["GET", "POST"])
+def destination_new():
+    if request.method == "POST":
+        return _save_destination(None)
+    return render_template("destination_form.html", dest=None)
+
+
+@bp.route("/destinations/<int:dest_id>/edit", methods=["GET", "POST"])
+def destination_edit(dest_id):
+    dest = db.get_or_404(Destination, dest_id)
+    if request.method == "POST":
+        return _save_destination(dest)
+    pub_key = _get_pub_key(dest)
+    return render_template("destination_form.html", dest=dest, pub_key=pub_key)
+
+
+@bp.route("/destinations/<int:dest_id>/delete", methods=["POST"])
+def destination_delete(dest_id):
+    dest = db.get_or_404(Destination, dest_id)
+    db.session.delete(dest)
+    db.session.commit()
+    flash(f"Destination « {dest.name} » supprimée.", "success")
+    return redirect(url_for("dest.destinations_list"))
+
+
+@bp.route("/destinations/<int:dest_id>/test", methods=["POST"])
+def destination_test(dest_id):
+    dest = db.get_or_404(Destination, dest_id)
+    from jobs.transfer import test_connection
+    ok, msg = test_connection(dest, current_app.config["DATA_DIR"])
+    flash(msg, "success" if ok else "error")
+    return redirect(url_for("dest.destinations_list"))
+
+
+@bp.route("/archives/<path:archive_name>/transfer", methods=["POST"])
+def archive_transfer(archive_name):
+    dest_id = request.form.get("destination_id", type=int)
+    dest = db.get_or_404(Destination, dest_id)
+    app = current_app._get_current_object()
+
+    def _do_transfer():
+        with app.app_context():
+            try:
+                from jobs.transfer import transfer_archive
+                transfer_archive(archive_name, dest, app.config["YUNOHOST_BACKUP_DIR"],
+                                 app.config["DATA_DIR"])
+                app.logger.info(f"Transfert {archive_name} → {dest.remote_str} OK")
+            except Exception as exc:
+                app.logger.error(f"Transfert {archive_name} échoué : {exc}")
+
+    threading.Thread(target=_do_transfer, daemon=True).start()
+    flash(f"Transfert de « {archive_name} » vers {dest.remote_str} démarré.", "success")
+    return redirect(request.referrer or url_for("jobs.index"))
+
+
+def _save_destination(dest):
+    f = request.form
+    name = f.get("name", "").strip()
+    host = f.get("host", "").strip()
+    if not name or not host:
+        flash("Nom et hôte sont requis.", "error")
+        return render_template("destination_form.html", dest=dest)
+
+    is_new = dest is None
+    if is_new:
+        dest = Destination()
+        db.session.add(dest)
+
+    dest.name = name
+    dest.host = host
+    dest.port = int(f.get("port", 22) or 22)
+    dest.user = f.get("user", "root").strip() or "root"
+    dest.remote_path = f.get("remote_path", "/home/yunohost.backup/archives").strip()
+    dest.enabled = f.get("enabled") == "1"
+    db.session.flush()
+
+    if not dest.key_name:
+        from jobs.transfer import generate_key
+        dest.key_name = generate_key(dest.name, current_app.config["DATA_DIR"])
+
+    db.session.commit()
+    flash(f"Destination « {dest.name} » enregistrée.", "success")
+    return redirect(url_for("dest.destination_edit", dest_id=dest.id))
+
+
+def _get_pub_key(dest):
+    if not dest.key_name:
+        return None
+    from jobs.transfer import get_public_key
+    return get_public_key(dest.key_name, current_app.config["DATA_DIR"])

+ 270 - 0
sources/blueprints/jobs.py

@@ -0,0 +1,270 @@
+import json
+import subprocess
+import threading
+from datetime import datetime
+
+from flask import (
+    Blueprint,
+    current_app,
+    flash,
+    redirect,
+    render_template,
+    request,
+    url_for,
+)
+
+from db import db, Job, Run, Destination
+from helpers import read_archive_info, get_ynh_apps
+
+bp = Blueprint("jobs", __name__)
+
+
+# --- Local dashboard ----------------------------------------------------------
+
+@bp.route("/")
+def index():
+    jobs = Job.query.order_by(Job.name).all()
+    last_runs = {
+        j.id: Run.query.filter_by(job_id=j.id).order_by(Run.started_at.desc()).first()
+        for j in jobs
+    }
+    return render_template("dashboard_local.html", jobs=jobs, last_runs=last_runs)
+
+
+# --- CRUD Jobs ----------------------------------------------------------------
+
+@bp.route("/jobs/new", methods=["GET", "POST"])
+def job_new():
+    if request.method == "POST":
+        return _save_job(None)
+    return render_template("job_form.html", job=None, ynh_apps=get_ynh_apps(),
+                           destinations=Destination.query.filter_by(enabled=True).all())
+
+
+@bp.route("/jobs/<int:job_id>/edit", methods=["GET", "POST"])
+def job_edit(job_id):
+    job = db.get_or_404(Job, job_id)
+    if request.method == "POST":
+        return _save_job(job)
+    return render_template("job_form.html", job=job, ynh_apps=get_ynh_apps(),
+                           destinations=Destination.query.filter_by(enabled=True).all())
+
+
+@bp.route("/jobs/<int:job_id>/delete", methods=["POST"])
+def job_delete(job_id):
+    job = db.get_or_404(Job, job_id)
+    from scheduler import remove_job
+    remove_job(job.id)
+    db.session.delete(job)
+    db.session.commit()
+    flash(f"Job « {job.name} » supprimé.", "success")
+    return redirect(url_for("jobs.index"))
+
+
+@bp.route("/jobs/<int:job_id>/run", methods=["POST"])
+def job_run_now(job_id):
+    job = db.get_or_404(Job, job_id)
+    from scheduler import _execute_job
+    threading.Thread(target=_execute_job, args=(job.id,), daemon=True).start()
+    flash(f"Job « {job.name} » lancé manuellement.", "success")
+    return redirect(url_for("jobs.index"))
+
+
+@bp.route("/jobs/<int:job_id>/toggle", methods=["POST"])
+def job_toggle(job_id):
+    job = db.get_or_404(Job, job_id)
+    from scheduler import schedule_job, remove_job
+    job.enabled = not job.enabled
+    job.updated_at = datetime.utcnow()
+    db.session.commit()
+    if job.enabled:
+        schedule_job(job)
+        flash(f"Job « {job.name} » activé.", "success")
+    else:
+        remove_job(job.id)
+        flash(f"Job « {job.name} » désactivé.", "info")
+    return redirect(url_for("jobs.index"))
+
+
+@bp.route("/jobs/<int:job_id>/history")
+def job_history(job_id):
+    job = db.get_or_404(Job, job_id)
+    runs = Run.query.filter_by(job_id=job_id).order_by(Run.started_at.desc()).limit(100).all()
+    return render_template("job_history.html", job=job, runs=runs)
+
+
+# --- Restore ------------------------------------------------------------------
+
+@bp.route("/archives/<path:archive_name>/restore", methods=["GET", "POST"])
+def archive_restore(archive_name):
+    backup_dir = current_app.config["YUNOHOST_BACKUP_DIR"]
+    info = read_archive_info(archive_name, backup_dir)
+
+    if request.method == "GET":
+        return render_template("restore_confirm.html", archive_name=archive_name, info=info)
+
+    _start_restore(archive_name)
+    flash(f"Restauration de « {archive_name} » démarrée en arrière-plan.", "success")
+    return redirect(url_for("jobs.index"))
+
+
+def _start_restore(archive_name):
+    """Create a restore Run and launch the background thread. Returns (restore_run_id, archive_type)."""
+    backup_dir = current_app.config["YUNOHOST_BACKUP_DIR"]
+    info = read_archive_info(archive_name, backup_dir)
+    archive_type = info.get("type", "")
+
+    original_run = Run.query.filter_by(archive_name=archive_name).first()
+    restore_run_id = None
+    if original_run:
+        restore_run = Run(
+            job_id=original_run.job_id,
+            started_at=datetime.utcnow(),
+            status="running",
+            archive_name=archive_name,
+            log_text="[RESTAURATION en cours…]",
+        )
+        db.session.add(restore_run)
+        db.session.commit()
+        restore_run_id = restore_run.id
+
+    app = current_app._get_current_object()
+    threading.Thread(
+        target=_do_restore_job,
+        args=(app, archive_name, archive_type, restore_run_id),
+        daemon=True,
+    ).start()
+    return restore_run_id, archive_type
+
+
+def _do_restore_job(app, archive_name, archive_type, restore_run_id):
+    with app.app_context():
+        run = db.session.get(Run, restore_run_id) if restore_run_id else None
+        try:
+            backup_dir = app.config["YUNOHOST_BACKUP_DIR"]
+            if archive_type == "custom_dir":
+                from jobs.custom_dir import restore_custom_dir
+                log = restore_custom_dir(archive_name, backup_dir)
+            elif archive_type in ("mysql", "postgresql"):
+                from jobs.db_dump import restore_db_dump
+                log = restore_db_dump(archive_name, backup_dir)
+            elif archive_type == "ynh_app":
+                result = subprocess.run(
+                    ["sudo", "yunohost", "backup", "restore", archive_name,
+                     "--apps", "--force"],
+                    capture_output=True, text=True, timeout=3600,
+                )
+                log = (result.stdout + result.stderr).strip()
+                if result.returncode != 0:
+                    raise RuntimeError(f"yunohost backup restore a échoué :\n{log}")
+            elif archive_type == "ynh_system":
+                result = subprocess.run(
+                    ["sudo", "yunohost", "backup", "restore", archive_name,
+                     "--system", "--force"],
+                    capture_output=True, text=True, timeout=3600,
+                )
+                log = (result.stdout + result.stderr).strip()
+                if result.returncode != 0:
+                    raise RuntimeError(f"yunohost backup restore a échoué :\n{log}")
+            else:
+                raise NotImplementedError(
+                    f"Restauration non supportée pour le type '{archive_type}'."
+                )
+            if run:
+                run.status = "success"
+                run.finished_at = datetime.utcnow()
+                run.log_text = f"[RESTAURATION]\n{log or 'OK'}"
+                db.session.commit()
+        except Exception as exc:
+            app.logger.error(f"Restauration {archive_name} échouée : {exc}")
+            if run:
+                run.status = "error"
+                run.finished_at = datetime.utcnow()
+                run.log_text = f"[RESTAURATION]\n{exc}"
+                db.session.commit()
+
+
+# --- Helper save job ----------------------------------------------------------
+
+def _save_job(job):
+    f = request.form
+    job_type = f.get("type", "")
+    name = f.get("name", "").strip()
+
+    if not name:
+        flash("Le nom est requis.", "error")
+        return render_template("job_form.html", job=job, ynh_apps=get_ynh_apps(),
+                               destinations=Destination.query.filter_by(enabled=True).all())
+
+    cfg = {}
+    if job_type == "ynh_app":
+        cfg = {"app_id": f.get("app_id", ""), "core_only": f.get("core_only") == "1"}
+    elif job_type == "ynh_system":
+        cfg = {}
+    elif job_type in ("mysql", "postgresql"):
+        dbname = f.get("db_database", "").strip()
+        if not dbname:
+            flash("Le nom de la base de données est requis.", "error")
+            return render_template("job_form.html", job=job, ynh_apps=get_ynh_apps(),
+                                   destinations=Destination.query.filter_by(enabled=True).all())
+        cfg = {"database": dbname}
+    elif job_type == "custom_dir":
+        source_path = f.get("source_path", "").strip().rstrip("/")
+        if not source_path or not source_path.startswith("/"):
+            flash("Le chemin source doit être un chemin absolu (ex: /opt/monapp).", "error")
+            return render_template("job_form.html", job=job, ynh_apps=get_ynh_apps(),
+                                   destinations=Destination.query.filter_by(enabled=True).all())
+        excludes = [e.strip() for e in f.get("excludes", "").splitlines() if e.strip()]
+        restore_cfg = {}
+        user_name = f.get("restore_user_name", "").strip()
+        if user_name:
+            restore_cfg["system_user"] = {
+                "name": user_name,
+                "home": f.get("restore_user_home", source_path).strip() or source_path,
+                "shell": f.get("restore_user_shell", "/bin/false").strip() or "/bin/false",
+            }
+        service_name = f.get("restore_service_name", "").strip()
+        if service_name:
+            restore_cfg["systemd_service"] = {
+                "name": service_name,
+                "service_file": f.get("restore_service_file", "").strip(),
+            }
+        owner = f.get("restore_perm_owner", "").strip()
+        mode = f.get("restore_perm_mode", "").strip()
+        if owner or mode:
+            restore_cfg["permissions"] = {}
+            if owner:
+                restore_cfg["permissions"]["owner"] = owner
+            if mode:
+                restore_cfg["permissions"]["mode"] = mode
+        post_cmds = [c.strip() for c in f.get("restore_post_cmds", "").splitlines() if c.strip()]
+        if post_cmds:
+            restore_cfg["post_restore_commands"] = post_cmds
+        cfg = {"source_path": source_path, "excludes": excludes, "restore": restore_cfg}
+
+    if job is None:
+        job = Job()
+        db.session.add(job)
+
+    from scheduler import schedule_job, remove_job
+    dest_id = f.get("destination_id", "").strip()
+    job.name = name
+    job.type = job_type
+    job.config_json = json.dumps(cfg)
+    job.cron_expr = f.get("cron_expr", "0 3 * * *").strip()
+    job.retention_mode = f.get("retention_mode", "count")
+    job.retention_value = int(f.get("retention_value", 7) or 7)
+    job.enabled = f.get("enabled") == "1"
+    job.core_only = cfg.get("core_only", False)
+    job.destination_id = int(dest_id) if dest_id else None
+    job.updated_at = datetime.utcnow()
+    db.session.commit()
+
+    if job.enabled:
+        schedule_job(job)
+    else:
+        remove_job(job.id)
+
+    flash(f"Job « {job.name} » enregistré.", "success")
+    return redirect(url_for("jobs.index"))

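Review note on `_save_job` above: `int(f.get("retention_value", 7))` raises `ValueError` when the field is submitted empty, whereas the port parsing in `_save_destination` guards with `or 22`. A defensive sketch (the `parse_int` helper is illustrative, not part of the codebase):

```python
def parse_int(value, default):
    """Coerce a form field to int, falling back on empty or invalid input."""
    try:
        return int(value)
    except (TypeError, ValueError):
        # "" and None both land here, as does non-numeric input.
        return default

# Mirrors the guard used for ports: parse_int(f.get("port"), 22)
```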
+ 312 - 0
sources/blueprints/network.py

@@ -0,0 +1,312 @@
+import os
+import subprocess
+import threading
+
+from flask import (
+    Blueprint,
+    current_app,
+    flash,
+    redirect,
+    render_template,
+    request,
+    url_for,
+)
+
+from db import db, Job, Run, RemoteInstance, RemoteRun
+from db import _size_human
+
+bp = Blueprint("network", __name__)
+
+
+# --- Remote instances ---------------------------------------------------------
+
+@bp.route("/remote-instances")
+def remote_instances_list():
+    instances = RemoteInstance.query.order_by(RemoteInstance.name).all()
+    return render_template("remote_instances.html", instances=instances)
+
+
+@bp.route("/remote-instances/new", methods=["GET", "POST"])
+def remote_instance_new():
+    if request.method == "POST":
+        return _save_remote_instance(None)
+    return render_template("remote_instance_form.html", inst=None)
+
+
+@bp.route("/remote-instances/<int:inst_id>/edit", methods=["GET", "POST"])
+def remote_instance_edit(inst_id):
+    inst = db.get_or_404(RemoteInstance, inst_id)
+    if request.method == "POST":
+        return _save_remote_instance(inst)
+    return render_template("remote_instance_form.html", inst=inst)
+
+
+@bp.route("/remote-instances/<int:inst_id>/delete", methods=["POST"])
+def remote_instance_delete(inst_id):
+    inst = db.get_or_404(RemoteInstance, inst_id)
+    db.session.delete(inst)
+    db.session.commit()
+    flash(f"Instance « {inst.name} » supprimée.", "success")
+    return redirect(url_for("network.remote_instances_list"))
+
+
+@bp.route("/remote-instances/<int:inst_id>/test", methods=["POST"])
+def remote_instance_test(inst_id):
+    inst = db.get_or_404(RemoteInstance, inst_id)
+    from federation.client import FederationClient
+    from datetime import datetime
+    try:
+        data = FederationClient(inst).health()
+        inst.status = "online"
+        inst.last_seen = datetime.utcnow()
+        db.session.commit()
+        flash(f"Instance « {inst.name} » en ligne — {data.get('instance', '?')}.", "success")
+    except Exception as exc:
+        inst.status = "error"
+        db.session.commit()
+        flash(f"Connexion échouée vers « {inst.name} » : {exc}", "error")
+    return redirect(url_for("network.remote_instances_list"))
+
+
+@bp.route("/remote-instances/<int:inst_id>/sync", methods=["POST"])
+def remote_instance_sync(inst_id):
+    inst = db.get_or_404(RemoteInstance, inst_id)
+    from federation.client import sync_instance
+    try:
+        sync_instance(inst)
+        flash(f"Instance « {inst.name} » synchronisée.", "success")
+    except Exception as exc:
+        flash(f"Synchronisation échouée pour « {inst.name} » : {exc}", "error")
+    return redirect(url_for("network.remote_instances_list"))
+
+
+# --- Network dashboard --------------------------------------------------------
+
+@bp.route("/network")
+def dashboard_network():
+    local_jobs = Job.query.order_by(Job.name).all()
+    local_jobs_data = []
+    for job in local_jobs:
+        run = Run.query.filter_by(job_id=job.id).order_by(Run.started_at.desc()).first()
+        local_jobs_data.append(_JobRow(
+            job_id=job.id, name=job.name, type=job.type,
+            last_run_at=run.started_at if run else None,
+            last_status=run.status if run else None,
+            last_archive_name=run.archive_name if run else None,
+            last_size_bytes=run.size_bytes if run else None,
+        ))
+    instances = RemoteInstance.query.order_by(RemoteInstance.name).all()
+    return render_template("dashboard_network.html",
+                           local_jobs_data=local_jobs_data,
+                           instances=instances,
+                           instances_for_push=instances)
+
+
+@bp.route("/network/sync-all", methods=["POST"])
+def network_sync_all():
+    from federation.client import sync_instance
+    instances = RemoteInstance.query.all()
+    errors = []
+    for inst in instances:
+        try:
+            sync_instance(inst)
+        except Exception as exc:
+            errors.append(f"{inst.name}: {exc}")
+    if errors:
+        flash("Synchronisation partielle — " + " | ".join(errors), "error")
+    else:
+        flash(f"{len(instances)} instance(s) synchronisée(s).", "success")
+    return redirect(url_for("network.dashboard_network"))
+
+
+# --- Remote control -----------------------------------------------------------
+
+@bp.route("/remote-instances/<int:inst_id>/run-job/<int:job_id>", methods=["POST"])
+def remote_job_run(inst_id, job_id):
+    inst = db.get_or_404(RemoteInstance, inst_id)
+    from federation.client import FederationClient
+    try:
+        FederationClient(inst).run_job(job_id)
+        flash(f"Job déclenché sur « {inst.name} ».", "success")
+    except Exception as exc:
+        flash(f"Impossible de lancer le job sur « {inst.name} » : {exc}", "error")
+    return redirect(url_for("network.dashboard_network"))
+
+
+# --- Push / Pull archives -----------------------------------------------------
+
+@bp.route("/archives/<path:archive_name>/push/<int:inst_id>", methods=["POST"])
+def archive_push(archive_name, inst_id):
+    inst = db.get_or_404(RemoteInstance, inst_id)
+    app = current_app._get_current_object()
+    threading.Thread(target=_do_push_archive, args=(app, archive_name, inst.id), daemon=True).start()
+    flash(f"Envoi de « {archive_name} » vers « {inst.name} » démarré en arrière-plan.", "success")
+    return redirect(request.referrer or url_for("jobs.index"))
+
+
+@bp.route("/remote-instances/<int:inst_id>/pull-latest/<int:remote_job_id>", methods=["POST"])
+def archive_pull_latest(inst_id, remote_job_id):
+    inst = db.get_or_404(RemoteInstance, inst_id)
+    app = current_app._get_current_object()
+    threading.Thread(target=_do_pull_latest, args=(app, inst.id, remote_job_id), daemon=True).start()
+    flash(f"Rapatriement depuis « {inst.name} » démarré en arrière-plan.", "success")
+    return redirect(url_for("network.dashboard_network"))
+
+
+def _do_push_archive(app, archive_name, inst_id):
+    """Push a local archive to a remote instance via chunked HTTP upload."""
+    import hashlib as _hashlib
+    from federation.client import FederationClient
+    from jobs.utils import sudo_exists
+
+    with app.app_context():
+        inst = db.session.get(RemoteInstance, inst_id)
+        backup_dir = app.config["YUNOHOST_BACKUP_DIR"]
+        archive_path = os.path.join(backup_dir, archive_name + ".tar")
+
+        tmp_path = None
+        try:
+            tmp_path = f"/tmp/backupmanager_push_{archive_name}.tar"
+            result = subprocess.run(
+                ["sudo", "rsync", archive_path, tmp_path],
+                capture_output=True, text=True,
+            )
+            if result.returncode != 0:
+                raise RuntimeError(f"Copie locale échouée : {result.stderr.strip()}")
+
+            total_size = os.path.getsize(tmp_path)
+            sha256 = _hashlib.sha256()
+            chunk_size = 50 * 1024 * 1024
+            with open(tmp_path, "rb") as f:
+                while True:
+                    data = f.read(65536)
+                    if not data:
+                        break
+                    sha256.update(data)
+            checksum = sha256.hexdigest()
+
+            client = FederationClient(inst)
+            upload_info = client.upload_start(archive_name + ".tar", total_size, checksum, chunk_size)
+            upload_id = upload_info["upload_id"]
+
+            with open(tmp_path, "rb") as f:
+                n = 0
+                while True:
+                    data = f.read(chunk_size)
+                    if not data:
+                        break
+                    client.upload_chunk(upload_id, n, data)
+                    n += 1
+
+            # Also ship the .info.json if present (copied to /tmp via sudo rsync)
+            info_json_content = None
+            info_path = os.path.join(backup_dir, archive_name + ".info.json")
+            if sudo_exists(info_path):
+                tmp_info_src = f"/tmp/backupmanager_push_{archive_name}.info.json"
+                r = subprocess.run(["sudo", "rsync", info_path, tmp_info_src],
+                                   capture_output=True)
+                if r.returncode == 0:
+                    try:
+                        with open(tmp_info_src, "r", encoding="utf-8", errors="replace") as fh:
+                            info_json_content = fh.read()
+                    finally:
+                        subprocess.run(["sudo", "rm", "-rf", tmp_info_src],
+                                       capture_output=True)
+
+            client.upload_finish_with_info(upload_id, info_json_content)
+            app.logger.info(f"Push {archive_name} → {inst.name} OK")
+
+        except Exception as exc:
+            app.logger.error(f"Push {archive_name} → {inst.name} échoué : {exc}")
+        finally:
+            if tmp_path and os.path.exists(tmp_path):
+                # The sudo rsync copy may be root-owned: os.unlink() would fail
+                # on it in sticky /tmp, so remove it via sudo like the .info.json copy.
+                subprocess.run(["sudo", "rm", "-rf", tmp_path], capture_output=True)
+
+
+def _do_pull_latest(app, inst_id, remote_job_id):
+    """Pull the latest archive of a remote job (.tar + .info.json)."""
+    from federation.client import FederationClient, sync_instance
+
+    with app.app_context():
+        inst = db.session.get(RemoteInstance, inst_id)
+        backup_dir = app.config["YUNOHOST_BACKUP_DIR"]
+        try:
+            client = FederationClient(inst)
+
+            sync_instance(inst)
+            db.session.refresh(inst)
+
+            runs = client.get_job_runs(remote_job_id)
+            if not runs:
+                raise RuntimeError(f"Aucun run distant pour le job {remote_job_id}")
+            archive_name = runs[0].get("archive_name")
+            if not archive_name:
+                raise RuntimeError("Le dernier run distant n'a pas d'archive.")
+
+            archive_bytes = client.download_archive(archive_name)
+            tmp_tar = f"/tmp/backupmanager_pull_{archive_name}.tar"
+            with open(tmp_tar, "wb") as f:
+                f.write(archive_bytes)
+            subprocess.run(["sudo", "rsync", tmp_tar,
+                            os.path.join(backup_dir, archive_name + ".tar")], check=True)
+            os.unlink(tmp_tar)
+
+            info_bytes = client.download_info_json(archive_name)
+            if info_bytes:
+                tmp_info = f"/tmp/backupmanager_pull_{archive_name}.info.json"
+                with open(tmp_info, "wb") as f:
+                    f.write(info_bytes)
+                subprocess.run(["sudo", "rsync", tmp_info,
+                                os.path.join(backup_dir, archive_name + ".info.json")],
+                               check=True)
+                os.unlink(tmp_info)
+            else:
+                app.logger.warning(
+                    f"Pull {archive_name}: .info.json absent ou inaccessible sur {inst.name}"
+                )
+
+            app.logger.info(f"Pull {archive_name} ← {inst.name} OK")
+        except Exception as exc:
+            app.logger.error(f"Pull ← {inst.name} échoué : {exc}")
+
+
+# --- Helper save instance -----------------------------------------------------
+
+def _save_remote_instance(inst):
+    f = request.form
+    name = f.get("name", "").strip()
+    url = f.get("url", "").strip().rstrip("/")
+    api_key = f.get("api_key", "").strip()
+
+    if not name or not url or not api_key:
+        flash("Nom, URL et token API sont requis.", "error")
+        return render_template("remote_instance_form.html", inst=inst)
+
+    if inst is None:
+        inst = RemoteInstance()
+        db.session.add(inst)
+
+    inst.name = name
+    inst.url = url
+    inst.api_key = api_key
+    db.session.commit()
+    flash(f"Instance « {inst.name} » enregistrée.", "success")
+    return redirect(url_for("network.remote_instances_list"))
+
+
+# --- Network dashboard DTO ----------------------------------------------------
+
+class _JobRow:
+    def __init__(self, job_id, name, type, last_run_at, last_status,
+                 last_archive_name, last_size_bytes):
+        self.job_id = job_id
+        self.name = name
+        self.type = type
+        self.last_run_at = last_run_at
+        self.last_status = last_status
+        self.last_archive_name = last_archive_name
+        self.last_size_bytes = last_size_bytes
+
+    @property
+    def size_human(self):
+        return _size_human(self.last_size_bytes)

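`_do_push_archive` above reads the temp copy twice: once in 64 KiB buffers to compute the SHA-256 announced in `upload_start`, then again in 50 MiB chunks for `upload_chunk`. The two loops can be sketched in isolation (helper names are illustrative, not from the codebase):

```python
import hashlib
import io
import math

def stream_sha256(fileobj, bufsize=65536):
    # Hash without loading the whole archive into memory,
    # as the push path does before announcing the checksum.
    h = hashlib.sha256()
    while True:
        data = fileobj.read(bufsize)
        if not data:
            break
        h.update(data)
    return h.hexdigest()

def chunk_count(total_size, chunk_size=50 * 1024 * 1024):
    # Number of upload_chunk() calls the sender will make.
    return math.ceil(total_size / chunk_size)
```

Streaming the hash keeps memory flat even for multi-GB archives, at the cost of the extra read pass.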
+ 103 - 0
sources/blueprints/settings.py

@@ -0,0 +1,103 @@
+import subprocess
+
+from flask import (
+    Blueprint,
+    current_app,
+    flash,
+    jsonify,
+    redirect,
+    render_template,
+    request,
+    url_for,
+)
+
+from db import db, Setting
+
+bp = Blueprint("cfg", __name__)
+
+_SETTING_KEYS = [
+    "smtp_host", "smtp_port", "smtp_user", "smtp_password",
+    "smtp_from", "smtp_to", "smtp_tls", "smtp_ssl",
+    "notify_on_success", "notify_on_error",
+]
+
+
+def _get_setting(key, default=""):
+    s = Setting.query.filter_by(key=key).first()
+    return s.value if s else default
+
+
+@bp.route("/settings", methods=["GET", "POST"])
+def settings():
+    if request.method == "POST":
+        action = request.form.get("action")
+
+        if action == "test_smtp":
+            from notifications import send_test_email
+            try:
+                send_test_email(
+                    host=request.form.get("smtp_host", "").strip(),
+                    port=int(request.form.get("smtp_port", 587) or 587),
+                    user=request.form.get("smtp_user", "").strip(),
+                    password=request.form.get("smtp_password", ""),
+                    from_addr=request.form.get("smtp_from", "").strip(),
+                    to_addr=request.form.get("smtp_to", "").strip(),
+                    use_ssl=request.form.get("smtp_ssl") == "1",
+                    use_tls=request.form.get("smtp_tls") == "1",
+                )
+                flash("Email de test envoyé avec succès.", "success")
+            except Exception as exc:
+                flash(f"Échec du test SMTP : {exc}", "error")
+        else:
+            for key in _SETTING_KEYS:
+                if key in ("smtp_tls", "smtp_ssl", "notify_on_success", "notify_on_error"):
+                    value = "1" if request.form.get(key) == "1" else "0"
+                else:
+                    value = request.form.get(key, "").strip()
+                s = Setting.query.filter_by(key=key).first()
+                if s is None:
+                    s = Setting(key=key, value=value)
+                    db.session.add(s)
+                else:
+                    s.value = value
+            db.session.commit()
+            flash("Paramètres enregistrés.", "success")
+
+        return redirect(url_for("cfg.settings"))
+
+    cfg = {k: _get_setting(k) for k in _SETTING_KEYS}
+    cfg["smtp_port"] = cfg.get("smtp_port") or "587"
+    cfg["smtp_tls"] = cfg.get("smtp_tls") or "1"
+    cfg["smtp_ssl"] = cfg.get("smtp_ssl") or "0"
+    cfg["notify_on_error"] = cfg.get("notify_on_error") or "1"
+    api_token = current_app.config.get("API_TOKEN", "")
+    instance_url = current_app.config.get("INSTANCE_URL", "")
+    return render_template("settings.html", cfg=cfg, api_token=api_token,
+                           instance_url=instance_url)
+
+
+@bp.route("/internal/databases/<db_type>")
+def internal_databases(db_type):
+    """List the databases available for the job form (mysql/postgresql)."""
+    databases = []
+    try:
+        if db_type == "mysql":
+            result = subprocess.run(
+                ["sudo", "mysql", "--skip-column-names", "-e", "SHOW DATABASES;"],
+                capture_output=True, text=True, timeout=10,
+            )
+            if result.returncode == 0:
+                exclude = {"information_schema", "performance_schema", "mysql", "sys"}
+                databases = [d.strip() for d in result.stdout.splitlines()
+                             if d.strip() and d.strip() not in exclude]
+        elif db_type == "postgresql":
+            result = subprocess.run(
+                ["sudo", "-u", "postgres", "psql", "-Atc",
+                 "SELECT datname FROM pg_database WHERE datistemplate = false;"],
+                capture_output=True, text=True, timeout=10,
+            )
+            if result.returncode == 0:
+                databases = [d.strip() for d in result.stdout.splitlines() if d.strip()]
+    except Exception:
+        pass
+    return jsonify(databases)

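The line filtering in `internal_databases` is identical for both engines apart from the exclusion set; a standalone sketch (`parse_db_list` is illustrative, not part of the codebase):

```python
MYSQL_SYSTEM_DBS = {"information_schema", "performance_schema", "mysql", "sys"}

def parse_db_list(output, exclude=frozenset()):
    # One database name per line, as produced by
    # `mysql --skip-column-names -e "SHOW DATABASES;"` or `psql -Atc ...`.
    return [line.strip() for line in output.splitlines()
            if line.strip() and line.strip() not in exclude]
```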
+ 35 - 0
sources/helpers.py

@@ -0,0 +1,35 @@
+import os
+import subprocess
+
+from db import db, Job, Run
+
+
+def read_archive_info(archive_name, backup_dir):
+    """Read archive metadata (embedded backup_info.json, falling back to the Run table)."""
+    archive_path = os.path.join(backup_dir, archive_name + ".tar")
+    from jobs.utils import sudo_read_backup_info
+    info = sudo_read_backup_info(archive_path)
+    if not info.get("type"):
+        run = Run.query.filter_by(archive_name=archive_name).first()
+        if run:
+            job = db.session.get(Job, run.job_id)
+            if job:
+                info["type"] = job.type
+                info["_from_run"] = True
+    return info
+
+
+def get_ynh_apps():
+    """Return the list of installed YunoHost apps."""
+    try:
+        import json
+        result = subprocess.run(
+            ["sudo", "yunohost", "app", "list", "--output-as", "json"],
+            capture_output=True, text=True, timeout=15,
+        )
+        if result.returncode == 0:
+            return json.loads(result.stdout).get("apps", [])
+    except Exception:
+        pass
+    return []

+ 7 - 7
sources/templates/base.html

@@ -15,16 +15,16 @@
           <path stroke-linecap="round" stroke-linejoin="round" stroke-width="2"
             d="M5 12h14M5 12l4-4m-4 4l4 4M19 12l-4-4m4 4l-4 4"/>
         </svg>
-        <a href="{{ url_for('index') }}" class="text-lg font-bold tracking-tight">Backup Manager</a>
+        <a href="{{ url_for('jobs.index') }}" class="text-lg font-bold tracking-tight">Backup Manager</a>
         <span class="bg-blue-600 text-xs font-medium px-2 py-0.5 rounded">{{ instance_name }}</span>
       </div>
       <div class="flex items-center gap-4 text-sm">
-        <a href="{{ url_for('index') }}" class="text-gray-300 hover:text-white transition">Dashboard</a>
-        <a href="{{ url_for('dashboard_network') }}" class="text-gray-300 hover:text-white transition">Réseau</a>
-        <a href="{{ url_for('remote_instances_list') }}" class="text-gray-300 hover:text-white transition">Instances</a>
-        <a href="{{ url_for('destinations_list') }}" class="text-gray-300 hover:text-white transition">Destinations</a>
-        <a href="{{ url_for('settings') }}" class="text-gray-300 hover:text-white transition">Paramètres</a>
-        <a href="{{ url_for('job_new') }}"
+        <a href="{{ url_for('jobs.index') }}" class="text-gray-300 hover:text-white transition">Dashboard</a>
+        <a href="{{ url_for('network.dashboard_network') }}" class="text-gray-300 hover:text-white transition">Réseau</a>
+        <a href="{{ url_for('network.remote_instances_list') }}" class="text-gray-300 hover:text-white transition">Instances</a>
+        <a href="{{ url_for('dest.destinations_list') }}" class="text-gray-300 hover:text-white transition">Destinations</a>
+        <a href="{{ url_for('cfg.settings') }}" class="text-gray-300 hover:text-white transition">Paramètres</a>
+        <a href="{{ url_for('jobs.job_new') }}"
            class="bg-blue-600 hover:bg-blue-700 text-white px-3 py-1.5 rounded font-medium transition">
           + Nouveau job
         </a>

+ 6 - 6
sources/templates/dashboard_local.html

@@ -47,7 +47,7 @@
   {% if not jobs %}
     <div class="px-6 py-12 text-center text-gray-400">
       <p class="text-lg">Aucun job configuré.</p>
-      <a href="{{ url_for('job_new') }}" class="mt-3 inline-block text-blue-600 hover:underline text-sm">
+      <a href="{{ url_for('jobs.job_new') }}" class="mt-3 inline-block text-blue-600 hover:underline text-sm">
         Créer le premier job →
       </a>
     </div>
@@ -110,29 +110,29 @@
               </td>
               <td class="px-6 py-4 text-right">
                 <div class="flex items-center justify-end gap-2">
-                  <form method="post" action="{{ url_for('job_run_now', job_id=job.id) }}"
+                  <form method="post" action="{{ url_for('jobs.job_run_now', job_id=job.id) }}"
                         onsubmit="return confirm('Lancer « {{ job.name }} » maintenant ?')">
                     <button type="submit"
                       class="bg-blue-50 hover:bg-blue-100 text-blue-700 text-xs font-medium px-2.5 py-1 rounded transition">
                       ▶ Lancer
                     </button>
                   </form>
-                  <a href="{{ url_for('job_history', job_id=job.id) }}"
+                  <a href="{{ url_for('jobs.job_history', job_id=job.id) }}"
                      class="text-gray-400 hover:text-gray-700 text-xs px-2 py-1 rounded hover:bg-gray-100 transition">
                     Historique
                   </a>
-                  <a href="{{ url_for('job_edit', job_id=job.id) }}"
+                  <a href="{{ url_for('jobs.job_edit', job_id=job.id) }}"
                      class="text-gray-400 hover:text-gray-700 text-xs px-2 py-1 rounded hover:bg-gray-100 transition">
                     Éditer
                   </a>
-                  <form method="post" action="{{ url_for('job_toggle', job_id=job.id) }}">
+                  <form method="post" action="{{ url_for('jobs.job_toggle', job_id=job.id) }}">
                     <button type="submit"
                       class="text-gray-400 hover:text-gray-700 text-xs px-2 py-1 rounded hover:bg-gray-100 transition"
                       title="{{ 'Désactiver' if job.enabled else 'Activer' }}">
                       {{ '⏸' if job.enabled else '▶' }}
                     </button>
                   </form>
-                  <form method="post" action="{{ url_for('job_delete', job_id=job.id) }}"
+                  <form method="post" action="{{ url_for('jobs.job_delete', job_id=job.id) }}"
                         onsubmit="return confirm('Supprimer définitivement « {{ job.name }} » et son historique ?')">
                     <button type="submit"
                       class="text-red-300 hover:text-red-600 text-xs px-2 py-1 rounded hover:bg-red-50 transition">

+ 9 - 9
sources/templates/dashboard_network.html

@@ -6,11 +6,11 @@
 <div class="flex items-center justify-between mb-6">
   <h1 class="text-xl font-bold text-gray-900">Vue réseau</h1>
   <div class="flex gap-2">
-    <a href="{{ url_for('index') }}"
+    <a href="{{ url_for('jobs.index') }}"
        class="text-sm text-gray-500 hover:text-gray-700 px-3 py-1.5 rounded border border-gray-200 bg-white transition">
       Vue locale
     </a>
-    <form method="post" action="{{ url_for('network_sync_all') }}">
+    <form method="post" action="{{ url_for('network.network_sync_all') }}">
       <button type="submit"
               class="text-sm bg-blue-600 hover:bg-blue-700 text-white px-3 py-1.5 rounded transition">
         Synchroniser tout
@@ -41,13 +41,13 @@
     </div>
     {% if inst_id %}
     <div class="flex gap-2 shrink-0">
-      <form method="post" action="{{ url_for('remote_instance_sync', inst_id=inst_id) }}">
+      <form method="post" action="{{ url_for('network.remote_instance_sync', inst_id=inst_id) }}">
         <button type="submit"
                 class="text-xs bg-gray-50 hover:bg-gray-100 text-gray-600 px-3 py-1.5 rounded border border-gray-200 transition">
           Sync
         </button>
       </form>
-      <a href="{{ url_for('remote_instance_edit', inst_id=inst_id) }}"
+      <a href="{{ url_for('network.remote_instance_edit', inst_id=inst_id) }}"
          class="text-xs bg-gray-50 hover:bg-gray-100 text-gray-600 px-3 py-1.5 rounded border border-gray-200 transition">
         Éditer
       </a>
@@ -61,7 +61,7 @@
         Aucune donnée — cliquez sur "Sync" pour récupérer l'état de cette instance.
       {% else %}
         Aucun job configuré.
-        <a href="{{ url_for('job_new') }}" class="text-blue-600 hover:underline ml-1">Créer un job →</a>
+        <a href="{{ url_for('jobs.job_new') }}" class="text-blue-600 hover:underline ml-1">Créer un job →</a>
       {% endif %}
     </div>
   {% else %}
@@ -108,7 +108,7 @@
             <div class="flex items-center justify-end gap-2">
               {% if inst_id and row.job_id %}
               <form method="post"
-                    action="{{ url_for('remote_job_run', inst_id=inst_id, job_id=row.job_id) }}"
+                    action="{{ url_for('network.remote_job_run', inst_id=inst_id, job_id=row.job_id) }}"
                     onsubmit="return confirm('Lancer « {{ row.name }} » sur {{ inst_name }} ?')">
                 <button type="submit"
                   class="bg-blue-50 hover:bg-blue-100 text-blue-700 text-xs font-medium px-2.5 py-1 rounded transition">
@@ -117,7 +117,7 @@
               </form>
               {% elif not inst_id and row.job_id %}
               <form method="post"
-                    action="{{ url_for('job_run_now', job_id=row.job_id) }}"
+                    action="{{ url_for('jobs.job_run_now', job_id=row.job_id) }}"
                     onsubmit="return confirm('Lancer « {{ row.name }} » maintenant ?')">
                 <button type="submit"
                   class="bg-blue-50 hover:bg-blue-100 text-blue-700 text-xs font-medium px-2.5 py-1 rounded transition">
@@ -127,7 +127,7 @@
               {% endif %}
               {% if inst_id and row.job_id %}
               <form method="post"
-                    action="{{ url_for('archive_pull_latest', inst_id=inst_id, remote_job_id=row.job_id) }}"
+                    action="{{ url_for('network.archive_pull_latest', inst_id=inst_id, remote_job_id=row.job_id) }}"
                     onsubmit="return confirm('Rapatrier la dernière archive de « {{ row.name }} » depuis {{ inst_name }} ?')">
                 <button type="submit"
                   class="text-gray-400 hover:text-gray-700 text-xs px-2 py-1 rounded hover:bg-gray-100 transition">
@@ -153,7 +153,7 @@
 {% if not instances %}
   <div class="bg-white rounded-xl border border-gray-200 px-6 py-10 text-center text-gray-400">
     <p>Aucune instance distante enregistrée.</p>
-    <a href="{{ url_for('remote_instance_new') }}" class="mt-2 inline-block text-blue-600 hover:underline text-sm">
+    <a href="{{ url_for('network.remote_instance_new') }}" class="mt-2 inline-block text-blue-600 hover:underline text-sm">
       Ajouter une instance →
     </a>
   </div>

+ 2 - 2
sources/templates/destination_form.html

@@ -4,7 +4,7 @@
 {% block content %}
 <div class="max-w-xl">
   <div class="mb-6">
-    <a href="{{ url_for('destinations_list') }}" class="text-gray-400 hover:text-gray-600 text-sm">← Destinations</a>
+    <a href="{{ url_for('dest.destinations_list') }}" class="text-gray-400 hover:text-gray-600 text-sm">← Destinations</a>
   </div>
   <h1 class="text-xl font-bold text-gray-900 mb-6">
     {{ 'Éditer « ' + dest.name + ' »' if dest else 'Nouvelle destination rsync SSH' }}
@@ -92,7 +92,7 @@
               class="bg-blue-600 hover:bg-blue-700 text-white px-5 py-2 rounded-lg font-medium text-sm transition">
         {{ 'Enregistrer' if dest else 'Créer la destination' }}
       </button>
-      <a href="{{ url_for('destinations_list') }}"
+      <a href="{{ url_for('dest.destinations_list') }}"
          class="bg-white hover:bg-gray-50 text-gray-700 border border-gray-300 px-5 py-2 rounded-lg font-medium text-sm transition">
         Annuler
       </a>

+ 5 - 5
sources/templates/destinations.html

@@ -4,7 +4,7 @@
 {% block content %}
 <div class="flex items-center justify-between mb-6">
   <h1 class="text-xl font-bold text-gray-900">Destinations de transfert</h1>
-  <a href="{{ url_for('destination_new') }}"
+  <a href="{{ url_for('dest.destination_new') }}"
      class="bg-blue-600 hover:bg-blue-700 text-white px-4 py-2 rounded-lg text-sm font-medium transition">
     + Nouvelle destination
   </a>
@@ -14,7 +14,7 @@
   <div class="bg-white rounded-xl border border-gray-200 px-6 py-12 text-center text-gray-400">
     <p class="text-lg">Aucune destination configurée.</p>
     <p class="text-sm mt-2">Les archives sont conservées localement uniquement.</p>
-    <a href="{{ url_for('destination_new') }}" class="mt-4 inline-block text-blue-600 hover:underline text-sm">
+    <a href="{{ url_for('dest.destination_new') }}" class="mt-4 inline-block text-blue-600 hover:underline text-sm">
       Configurer une première destination →
     </a>
   </div>
@@ -43,17 +43,17 @@
         </div>
 
         <div class="flex items-center gap-2 shrink-0">
-          <form method="post" action="{{ url_for('destination_test', dest_id=dest.id) }}">
+          <form method="post" action="{{ url_for('dest.destination_test', dest_id=dest.id) }}">
             <button type="submit"
               class="bg-gray-50 hover:bg-gray-100 text-gray-700 text-xs px-3 py-1.5 rounded border border-gray-200 transition">
               Tester
             </button>
           </form>
-          <a href="{{ url_for('destination_edit', dest_id=dest.id) }}"
+          <a href="{{ url_for('dest.destination_edit', dest_id=dest.id) }}"
              class="bg-gray-50 hover:bg-gray-100 text-gray-700 text-xs px-3 py-1.5 rounded border border-gray-200 transition">
             Éditer
           </a>
-          <form method="post" action="{{ url_for('destination_delete', dest_id=dest.id) }}"
+          <form method="post" action="{{ url_for('dest.destination_delete', dest_id=dest.id) }}"
                 onsubmit="return confirm('Supprimer la destination « {{ dest.name }} » ?')">
             <button type="submit"
               class="text-red-300 hover:text-red-600 text-xs px-2 py-1.5 rounded hover:bg-red-50 transition">

+ 4 - 4
sources/templates/job_form.html

@@ -8,7 +8,7 @@
   </h1>
 
   <form method="post"
-        action="{{ url_for('job_edit', job_id=job.id) if job else url_for('job_new') }}"
+        action="{{ url_for('jobs.job_edit', job_id=job.id) if job else url_for('jobs.job_new') }}"
         class="space-y-6">
 
     {# ── Infos générales ── #}
@@ -277,7 +277,7 @@
       {% if not destinations %}
         <p class="text-xs text-gray-400">
           Aucune destination configurée.
-          <a href="{{ url_for('destination_new') }}" class="text-blue-600 hover:underline">En créer une →</a>
+          <a href="{{ url_for('dest.destination_new') }}" class="text-blue-600 hover:underline">En créer une →</a>
         </p>
       {% endif %}
     </div>
@@ -298,7 +298,7 @@
               class="bg-blue-600 hover:bg-blue-700 text-white px-5 py-2 rounded-lg font-medium text-sm transition">
         {{ 'Enregistrer' if job else 'Créer le job' }}
       </button>
-      <a href="{{ url_for('index') }}"
+      <a href="{{ url_for('jobs.index') }}"
          class="bg-white hover:bg-gray-50 text-gray-700 border border-gray-300 px-5 py-2 rounded-lg font-medium text-sm transition">
         Annuler
       </a>
@@ -322,7 +322,7 @@
       populateSelect(sel, dbCache[dbType]);
       return;
     }
-    fetch("{{ url_for('internal_databases', db_type='__TYPE__') }}".replace('__TYPE__', dbType))
+    fetch("{{ url_for('cfg.internal_databases', db_type='__TYPE__') }}".replace('__TYPE__', dbType))
       .then(r => r.json())
       .then(dbs => {
         dbCache[dbType] = dbs;

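The `fetch` line in the hunk above uses a common Flask/Jinja trick: render `url_for()` with a literal placeholder value, then substitute it client-side. A minimal sketch of both sides (the route rule and empty handler body are assumptions mirroring the template, not the project's actual code):

```python
# Server side: the endpoint the template's fetch() targets.
# Blueprint name "cfg" and endpoint "internal_databases" mirror the
# template; the route rule and handler body are assumptions.
from flask import Blueprint, Flask, jsonify

cfg = Blueprint("cfg", __name__)

@cfg.route("/internal/databases/<db_type>")
def internal_databases(db_type):
    return jsonify([])  # real handler would list live mysql/postgresql DBs

app = Flask(__name__)
app.register_blueprint(cfg)

# url_for() happily builds a URL for the sentinel value "__TYPE__";
# the template's JS then does .replace('__TYPE__', dbType) at runtime.
with app.test_request_context():
    from flask import url_for
    url = url_for("cfg.internal_databases", db_type="__TYPE__")
    assert url == "/internal/databases/__TYPE__"
```

This avoids hard-coding the URL in JavaScript: if the route rule changes, the rendered template picks it up automatically.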
+ 3 - 3
sources/templates/job_history.html

@@ -3,7 +3,7 @@
 
 {% block content %}
 <div class="mb-6 flex items-center gap-4">
-  <a href="{{ url_for('index') }}" class="text-gray-400 hover:text-gray-600 text-sm">← Dashboard</a>
+  <a href="{{ url_for('jobs.index') }}" class="text-gray-400 hover:text-gray-600 text-sm">← Dashboard</a>
   <h1 class="text-xl font-bold text-gray-900">{{ job.name }}</h1>
   <span class="bg-gray-100 text-gray-600 text-xs px-2 py-0.5 rounded font-mono">{{ job.type }}</span>
   <span class="text-gray-400 text-sm font-mono">{{ job.cron_expr }}</span>
@@ -86,13 +86,13 @@
               <td class="px-6 py-3">
                 {% if run.status == 'success' and run.archive_name %}
                   <div class="flex flex-col gap-1">
-                    <a href="{{ url_for('archive_restore', archive_name=run.archive_name) }}"
+                    <a href="{{ url_for('jobs.archive_restore', archive_name=run.archive_name) }}"
                        class="text-xs text-orange-600 hover:text-orange-800 hover:underline whitespace-nowrap">
                       ↩ Restaurer
                     </a>
                     {% set destinations = job.destination_id and [job.destination] or [] %}
                     {% if destinations %}
-                      <form method="post" action="{{ url_for('archive_transfer', archive_name=run.archive_name) }}">
+                      <form method="post" action="{{ url_for('dest.archive_transfer', archive_name=run.archive_name) }}">
                         <input type="hidden" name="destination_id" value="{{ job.destination_id }}">
                         <button type="submit" class="text-xs text-blue-600 hover:text-blue-800 hover:underline whitespace-nowrap text-left">
                           ↑ Transférer

+ 2 - 2
sources/templates/remote_instance_form.html

@@ -8,7 +8,7 @@
   </h1>
 
   <form method="post"
-        action="{{ url_for('remote_instance_edit', inst_id=inst.id) if inst else url_for('remote_instance_new') }}"
+        action="{{ url_for('network.remote_instance_edit', inst_id=inst.id) if inst else url_for('network.remote_instance_new') }}"
         class="space-y-6">
 
     <div class="bg-white rounded-xl border border-gray-200 p-6 space-y-4">
@@ -53,7 +53,7 @@
               class="bg-blue-600 hover:bg-blue-700 text-white px-5 py-2 rounded-lg font-medium text-sm transition">
         {{ 'Enregistrer' if inst else 'Ajouter l\'instance' }}
       </button>
-      <a href="{{ url_for('remote_instances_list') }}"
+      <a href="{{ url_for('network.remote_instances_list') }}"
          class="bg-white hover:bg-gray-50 text-gray-700 border border-gray-300 px-5 py-2 rounded-lg font-medium text-sm transition">
         Annuler
       </a>

+ 6 - 6
sources/templates/remote_instances.html

@@ -4,7 +4,7 @@
 {% block content %}
 <div class="flex items-center justify-between mb-6">
   <h1 class="text-xl font-bold text-gray-900">Instances distantes</h1>
-  <a href="{{ url_for('remote_instance_new') }}"
+  <a href="{{ url_for('network.remote_instance_new') }}"
      class="bg-blue-600 hover:bg-blue-700 text-white px-4 py-2 rounded-lg text-sm font-medium transition">
     + Ajouter une instance
   </a>
@@ -14,7 +14,7 @@
   <div class="bg-white rounded-xl border border-gray-200 px-6 py-12 text-center text-gray-400">
     <p class="text-lg">Aucune instance distante configurée.</p>
     <p class="text-sm mt-2">Ajoutez une instance pour voir son état et déclencher des sauvegardes à distance.</p>
-    <a href="{{ url_for('remote_instance_new') }}" class="mt-4 inline-block text-blue-600 hover:underline text-sm">
+    <a href="{{ url_for('network.remote_instance_new') }}" class="mt-4 inline-block text-blue-600 hover:underline text-sm">
       Ajouter une première instance →
     </a>
   </div>
@@ -78,23 +78,23 @@
         </div>
 
         <div class="flex items-center gap-2 shrink-0 flex-wrap justify-end">
-          <form method="post" action="{{ url_for('remote_instance_test', inst_id=inst.id) }}">
+          <form method="post" action="{{ url_for('network.remote_instance_test', inst_id=inst.id) }}">
             <button type="submit"
               class="bg-gray-50 hover:bg-gray-100 text-gray-700 text-xs px-3 py-1.5 rounded border border-gray-200 transition">
               Tester
             </button>
           </form>
-          <form method="post" action="{{ url_for('remote_instance_sync', inst_id=inst.id) }}">
+          <form method="post" action="{{ url_for('network.remote_instance_sync', inst_id=inst.id) }}">
             <button type="submit"
               class="bg-gray-50 hover:bg-gray-100 text-gray-700 text-xs px-3 py-1.5 rounded border border-gray-200 transition">
               Synchroniser
             </button>
           </form>
-          <a href="{{ url_for('remote_instance_edit', inst_id=inst.id) }}"
+          <a href="{{ url_for('network.remote_instance_edit', inst_id=inst.id) }}"
              class="bg-gray-50 hover:bg-gray-100 text-gray-700 text-xs px-3 py-1.5 rounded border border-gray-200 transition">
             Éditer
           </a>
-          <form method="post" action="{{ url_for('remote_instance_delete', inst_id=inst.id) }}"
+          <form method="post" action="{{ url_for('network.remote_instance_delete', inst_id=inst.id) }}"
                 onsubmit="return confirm('Supprimer l\'instance « {{ inst.name }} » ?')">
             <button type="submit"
               class="text-red-300 hover:text-red-600 text-xs px-2 py-1.5 rounded hover:bg-red-50 transition">

+ 2 - 2
sources/templates/restore_confirm.html

@@ -4,7 +4,7 @@
 {% block content %}
 <div class="max-w-2xl">
   <div class="mb-6">
-    <a href="{{ url_for('index') }}" class="text-gray-400 hover:text-gray-600 text-sm">← Dashboard</a>
+    <a href="{{ url_for('jobs.index') }}" class="text-gray-400 hover:text-gray-600 text-sm">← Dashboard</a>
   </div>
 
   <div class="bg-white rounded-xl border border-gray-200 shadow-sm p-6 space-y-5">
@@ -95,7 +95,7 @@
               class="bg-red-600 hover:bg-red-700 text-white px-5 py-2 rounded-lg font-medium text-sm transition">
         Confirmer la restauration
       </button>
-      <a href="{{ url_for('index') }}"
+      <a href="{{ url_for('jobs.index') }}"
          class="bg-white hover:bg-gray-50 text-gray-700 border border-gray-300 px-5 py-2 rounded-lg font-medium text-sm transition">
         Annuler
       </a>
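Every rename above follows the same Flask convention: once a view function moves into a blueprint, its `url_for` endpoint becomes `<blueprint_name>.<function_name>`, and the bare name raises `BuildError`. A minimal sketch of the bootstrap side (blueprint names `jobs`, `dest`, `cfg`, `network` match the templates; the routes and module layout here are illustrative assumptions, not the project's actual files):

```python
# Sketch of the blueprint registration implied by the url_for() renames.
# The blueprint's first argument ("jobs") is what prefixes every endpoint.
from flask import Blueprint, Flask

jobs = Blueprint("jobs", __name__)

@jobs.route("/")
def index():
    return "local dashboard"

@jobs.route("/jobs/new")
def job_new():
    return "new job form"

def create_app():
    app = Flask(__name__)
    app.register_blueprint(jobs)
    # ... register_blueprint(dest), (cfg), (network) in the real app
    return app

app = create_app()

# Endpoints now resolve only under the blueprint-qualified name.
with app.test_request_context():
    from flask import url_for
    assert url_for("jobs.index") == "/"
    assert url_for("jobs.job_new") == "/jobs/new"
```

Note that the blueprint name is independent of both the variable name and the module name, which is why the templates can use short prefixes like `dest.` and `cfg.` for `blueprints/destinations.py` and `blueprints/settings.py`.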