Paragon: ship server-side DBC overlay as SQL so fresh installs can roll class 12

Stock Docker installs fill data/dbc/ from the vanilla 3.3.5a extract
in `ac-wotlk-client-data`, which has no class 12 in ChrClasses.dbc and
no class-12 bit on SkillRaceClassInfo.dbc. CharacterHandler.cpp's
sChrClassesStore.LookupEntry(12) returns null and the create fails
with CHAR_CREATE_FAILED ("Class (12) not found in DBC ...") before the
contributor ever sees the panel. Fixing it required hand-copying the
patched DBCs onto the named volume — undocumented, fragile, and not
portable to native installs.

DBCStores.cpp::LoadDBC merges every <table>_dbc world-DB row on top of
the on-disk DBC store (storage.LoadFromDB after storage.Load). We use
that merge layer to ship Paragon's class-12 deltas as SQL:

- chrclasses_dbc: 1 row defining class 12 (Paragon, power=Mana,
  family=Warrior, expansion=2). Resolves CHAR_CREATE_FAILED.
- skillraceclassinfo_dbc: 235 rows that REPLACE the stock entries with
  the patched ClassMask (class-12 bit OR'd in) so baseline skills
  (defense, weapon skills, etc.) remain available to Paragon characters.
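The shape of the migration can be sketched as follows. The column lists are illustrative assumptions, not the verified schema (the real file mirrors every field of ChrClasses.dbc and SkillRaceClassInfo.dbc), but the two-table structure matches what this commit describes:

```sql
-- SKETCH of the overlay SQL; column names and values are assumed.
-- One row defining class 12 so sChrClassesStore.LookupEntry(12) succeeds.
DELETE FROM `chrclasses_dbc` WHERE `ID` = 12;
INSERT INTO `chrclasses_dbc`
    (`ID`, `DisplayPower`, `Name_Lang_enUS`, `Filename`, `SpellClassSet`, `Required_Expansion`)
VALUES
    (12, 0, 'Paragon', 'PARAGON', 4, 2);  -- power 0 = Mana, family 4 = Warrior

-- Stock rows re-emitted with the class-12 bit OR'd into ClassMask.
-- Class c maps to bit 1 << (c - 1), so class 12 is 0x800.
REPLACE INTO `skillraceclassinfo_dbc`
    (`ID`, `SkillID`, `RaceMask`, `ClassMask`, `Flags`, `MinLevel`)
VALUES
    (1, 95, 0xFFFF, 0x47F | 0x800, 0x10, 0);  -- e.g. skill 95 (Defense)
```

Using REPLACE rather than plain INSERT for the overridden rows keeps re-runs idempotent against the stock entries.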

The new `modules/mod-paragon/data/sql/db-world/updates/2026_05_09_00.sql`
is applied automatically by AC's DBUpdater on every fresh `ac-db-import`
run (Docker) or first worldserver boot (native). End-to-end verified
locally: truncate -> docker compose up ac-db-import -> rows reappear
with hash 33B1A05 recorded in updates table.
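The same verification can be reproduced with a few queries against the world database (table and column names are assumptions consistent with the tables named above; AC's DBUpdater records applied files in the `updates` table):

```sql
-- Expect 1 row: the Paragon class definition survived the re-import.
SELECT COUNT(*) FROM `chrclasses_dbc` WHERE `ID` = 12;
-- Expect 235 rows: the patched SkillRaceClassInfo overlay.
SELECT COUNT(*) FROM `skillraceclassinfo_dbc`;
-- The applied-migration record, including the content hash.
SELECT `name`, `hash` FROM `updates` WHERE `name` = '2026_05_09_00.sql';
```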

The migration is auto-generated by
fractured-tooling/from-workspace-root/_gen_paragon_dbc_overlay_sql.py
(outside this repo per the repo-tidy policy). Re-run it whenever the
DBC bake changes.

CLIENT-PATCHES.md is rewritten so contributors no longer need the
manual DBC sync section as their primary install path. Manual overlay
is preserved as a labelled fallback for tools that read data/dbc/
directly.

Co-authored-by: Cursor <cursoragent@cursor.com>
Docker Build
2026-05-09 12:19:59 -04:00
parent 20a24b7935
commit fae3ff5028
2 changed files with 332 additions and 50 deletions
@@ -53,16 +53,17 @@ worldserver image is older than commit `4d2a80d` (the
the same release tag and rebuild the worldserver image.
 If the **client** shows the Paragon class on the create screen but the
-server replies **Character Creation Failed** when you pick it: the
-worldserver's on-disk DBC set still does not define class **12**
-(`ChrClasses.dbc`). SQL migrations alone cannot fix that when
-`chrclasses_dbc` in MySQL is empty (normal for stock AC — AzerothCore
-loads `.dbc` files from `data/dbc/` and only *merges* optional DB rows
-on top). See **Worldserver DBC sync** below.
+server replies **Character Creation Failed** when you pick it on a
+**very old** server checkout (predating the commit that landed
+`modules/mod-paragon/data/sql/db-world/updates/2026_05_09_00.sql`):
+pull `main` and run `docker compose up -d ac-db-import`. Recent
+Fractured server builds ship the Paragon DBC overlay as a SQL
+migration (see **Server-side Paragon DBC overlay** below); fresh
+checkouts do **not** need the patched DBCs copied into `data/dbc/`.
---
-## Worldserver DBC sync (required for Paragon character create)
+## Server-side Paragon DBC overlay (automatic)
The Fractured **client** learns about Paragon from `patch-enUS-4.MPQ`
(DBC + GlueXML). The **worldserver** never reads your MPQs — it reads
@@ -70,46 +71,68 @@ plain `.dbc` files under its `DataDir` (`.../data/dbc/` by default).
Stock Docker installs populate `data/dbc/` from a vanilla 3.3.5a
extract (`ac-client-data-init` in `docker-compose.yml`). That tree has
-no `ChrClasses` row for id **12**, so `CharacterHandler` rejects the
-create with `CHAR_CREATE_FAILED` and logs:
+no `ChrClasses` row for id **12** and no class-12 bit on
+`SkillRaceClassInfo` rows, which would normally trigger:
 `Class (12) not found in DBC while creating new char ... wrong DBC files or cheater?`
-**Fix:** merge every `.dbc` that ships inside `patch-enUS-4.MPQ` under
-`DBFilesClient/` (or the archive's `dbc/` root — same layout depending
-on tool) into the server's `data/dbc/` directory, then restart
-**only** `ac-worldserver`. Copy the **whole** set from the MPQ, not
-just `ChrClasses.dbc`, so dependent stores (`CharBaseInfo`,
-`SkillRaceClassInfo`, `TalentTab`, `PowerDisplay`, etc.) stay consistent
-with the client patch.
+…and reject the create with `CHAR_CREATE_FAILED`.
-### Extract from the MPQ (any OS)
+To remove that gap, the repo ships
+`modules/mod-paragon/data/sql/db-world/updates/2026_05_09_00.sql`,
+which `INSERT`s the Paragon class-12 deltas into:
-Use Ladik's MPQ Editor, `mpqcli`, or any StormLib tool. You want every
-`*.dbc` that Fractured added or changed in that patch, staged into a
-host folder (example: `./paragon-dbc-extract/`).
+- `chrclasses_dbc` — 1 row defining class 12 ("Paragon", power=Mana,
+  family=Warrior, expansion=2).
+- `skillraceclassinfo_dbc` — 235 rows replacing stock entries with the
+  patched ClassMask (class-12 bit OR'd in) so every baseline skill is
+  available to Paragon characters.
-### Docker: write into the named volume
+`AzerothCore`'s DBC loader (`DBCStores.cpp::LoadDBC` -> `LoadFromDB`)
+merges these rows on top of whatever `data/dbc/` contains at every
+worldserver boot. The DBUpdater in `ac-db-import` (Docker) or the
+worldserver itself (native) applies the migration automatically — so
+the **only** steps a fresh contributor needs are `git clone` and
+`docker compose up -d`.
-Compose uses `${DOCKER_VOL_DATA:-ac-client-data}` (see
-`docker-compose.yml`). Discover the real volume name:
+### Regenerating the migration
-```bash
-docker volume ls | grep -i client-data
-```
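Since every populated `<dbcname>_dbc` table in the world DB is merged the same way, two quick queries (table and column names assumed, as in the sketches above) show what the merge layer will pick up at the next boot:

```sql
-- List every DBC override table the world database defines.
SHOW TABLES LIKE '%\_dbc';
-- Confirm the class-12 row the worldserver will merge over ChrClasses.dbc.
SELECT `ID`, `Name_Lang_enUS` FROM `chrclasses_dbc` WHERE `ID` = 12;
```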
The SQL is auto-generated from the patched DBCs that already live
inside `patch-enUS-4.MPQ`. The bake script lives outside this repo
(per the repo-tidy policy) at:
`fractured-tooling/from-workspace-root/_gen_paragon_dbc_overlay_sql.py`
Re-run it whenever you change the Paragon DBC bake — for example,
adding a new race to the Paragon class mask. It diffs the patched
DBCs against a stock 3.3.5a DBC extract and emits a fresh
`2026_05_09_00.sql` (or successor migration with a new timestamp if
deltas change). Workflow:
```powershell
# Extract the patched DBCs once:
.\tools\mpq\mpqcli.exe extract `
"ChromieCraft_3.3.5a\Data\enUS\patch-enUS-4.MPQ" `
-o "$env:TEMP\paragon-dbc-extract"
# Regenerate the SQL migration:
python fractured-tooling\from-workspace-root\_gen_paragon_dbc_overlay_sql.py
```
-Copy extracted `.dbc` files into the volume's `dbc/` subdirectory.
-Example (Linux / macOS — adjust volume name and host path):
+If the regenerated SQL has new content, commit it as a **new** dated
+migration filename (e.g. `2026_06_01_00.sql`) — never edit a file that
+has already been applied to live databases: AC's DBUpdater will detect
+the hash change and re-run the SQL, which can be fine but is best
+reserved for emergencies.
-```bash
-docker run --rm \
-  -v ac-client-data:/data \
-  -v "$PWD/paragon-dbc-extract:/patch:ro" \
-  alpine sh -c 'cp -f /patch/*.dbc /data/dbc/'
-docker compose restart ac-worldserver
-```
+### Manual DBC overlay (rare, fallback)
-Windows PowerShell (same idea; escape backticks if you wrap lines):
+If you ever need the patched DBCs *on disk* — e.g. for a tool that
+reads `data/dbc/` directly outside the worldserver, or to verify a
+client-vs-server DBC mismatch — extract `patch-enUS-4.MPQ` and copy
+its `DBFilesClient/*.dbc` into `data/dbc/`:
+**Docker:**
```powershell
docker run --rm `
@@ -119,22 +142,11 @@ docker run --rm `
docker compose restart ac-worldserver
```
-If your compose **project** name prefixes the volume (e.g.
-`fractured_ac-client-data`), use that full name from `docker volume ls`.
+**Native:** copy into `<CMAKE_INSTALL_PREFIX>/data/dbc/` and restart.
-### Native install (no Docker)
-Copy the same extracted `.dbc` files into:
-`<CMAKE_INSTALL_PREFIX>/data/dbc/`
-then restart `worldserver`.
-### Verify
-After restart, search startup logs for errors mentioning `ChrClasses` or
-`dbc`. On failure you would still see the `Class (12) not found` line at
-create time — if that disappears, the DBC merge worked.
+This is **not required** for normal operation — the SQL migration
+covers everything `mod-paragon` needs at runtime. Use the manual
+overlay only when you're consciously bypassing the SQL merge layer.
---