Compare commits

54 Commits

Author SHA1 Message Date
Docker Build 8ad6a2aca3 Paragon: cascade guard for class skill lines + panel catalog backfill
The skill-line cascade in Player::learnSkillRewardedSpells re-fires from
_LoadSkills (every login), UpdateSkillsForLevel (every level-up),
UpdateSkillPro (every weapon-skill tick on a training dummy), and
SetSkill (first time a class skill is granted). Each pass re-grants
every SkillLineAbility-tagged class ability on the matching skill line,
which leaks Blood Presence / Death Coil / Death Grip / etc. back into
the spellbook within seconds even after the player intentionally
refunded them via the Character Advancement panel.

Path B fix: a 5-line guard at the top of learnSkillRewardedSpells skips
the cascade for class-category skill lines on CLASS_PARAGON characters.
mod-paragon already calls Player::learnSpell directly for the abilities
the player actually purchased (and their attached passives), so the
panel becomes the sole authority over class abilities. Profession,
weapon, language, and racial cascades stay enabled so recipe auto-learn,
weapon proficiencies, and racial perks still work.
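
A minimal sketch of the guard's decision, with illustrative names (the constant values are assumptions, not the actual AzerothCore enum entries; the real check lives at the top of Player::learnSkillRewardedSpells):

```python
# Illustrative sketch of the Path B guard; constant values are assumptions,
# not the real AzerothCore identifiers.
CLASS_PARAGON = 12
SKILL_CATEGORY_CLASS = 7       # assumed category id for class skill lines

def cascade_enabled(player_class: int, skill_category: int) -> bool:
    """Skip the auto-grant cascade only for class skill lines on Paragon
    characters; profession, weapon, language, and racial lines still fire."""
    return not (player_class == CLASS_PARAGON
                and skill_category == SKILL_CATEGORY_CLASS)
```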

Side effect: passives that previously rode along on the cascade
(Forceful Deflection on Blood Strike, Runic Focus on Icy Touch) must be
force-attached the same way Blood Plague / Frost Fever already are.
Extend kAttached and kFixup in Paragon_Essence.cpp to do that; existing
characters self-heal on next login.

Backfill paragon_spell_ae_cost for 42 spells newly exposed by the panel
after the ClassMask=0 filter was removed from the client catalog
generator (Lava Burst, Hex, Evocation, Kill Shot, Path of Frost,
Horn of Winter, Rune Strike, Raise Ally, Dark Command, etc.). Migration
is INSERT IGNORE so any per-spell tuning on existing rows is preserved.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 23:53:13 -04:00
Docker Build 36ac3dbd1d fix(launcher): force .MPQ extension uppercase on disk for WoW compatibility
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 22:09:01 -05:00
Docker Build 24d1ae71d9 fix(launcher): install release MPQs under Data/enUS (not Data root)
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 22:06:48 -05:00
Docker Build 9cef99f0ff feat(launcher): sync release assets from manifest or attachment list (no fixed exe name)
- Default files[]: resolve the sync list from patch-manifest keys, else discover
  release attachments (excluding launcher artifacts).
- Explicit files[] still overrides; strip deprecated Wow-patched.exe on merge.
- listReleaseAttachmentNames + fetchGiteaReleaseRecord helpers.
- Version 1.0.7; README config docs.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 22:04:48 -05:00
Docker Build f409ffad12 fix(launcher): Gitea http URL; Wine Z: path + Wow.exe case check
- baked-gitea-channel: http:// for brassnet mirror.
- win-game-dir: map Unix /home/... to Z:\ under win32 (Wine folder picker).
- resolveGameDir + saveGameDir + patch paths use it; Wow.exe resolved case-insensitively.
- Version 1.0.6; README checklist for Wine.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 21:54:04 -05:00
Docker Build c1f7eaa153 fix(launcher): clearer fetch errors for Gitea TLS/DNS (fetch failed)
- fetchOrThrow wraps global fetch with TLS/DNS/refused hints + URL (sanitized).
- Use in gitea-release, github paths; fetchToFile already benefits.
- README checklist for sync Wow.exe fetch failed; version 1.0.5.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 21:46:48 -05:00
Docker Build b455db0db8 fix(launcher): drop patch-Z.MPQ from default files and migrate old configs
- default-launcher.json files: only Wow-patched.exe from release.
- config-store: strip deprecated patch-Z.MPQ from merged files; rewrite
  launcher.json on load if user still had that entry.
- Docs/scripts examples updated; version 1.0.4.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 21:36:50 -05:00
Docker Build 1fb284cb5c docs(launcher): clarify userData path wording for Linux config
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 21:33:07 -05:00
Docker Build ebd8d81924 fix(launcher): Linux/macOS packaged config in userData (AppImage EROFS)
AppImage mounts read-only at /tmp/.mount_*; writing launcher.json beside
execPath failed. Use app.getPath('userData') for linux/darwin when packaged.
Bump version to 1.0.3.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 21:32:43 -05:00
Docker Build 362084b829 ci(gitea-sync): validate workflow_dispatch tag; reject release title as ref
- Trim input; fail fast if tag contains whitespace (common mistake: pasting
  release title instead of git tag).
- Multiline GITHUB_OUTPUT for tag value safety.
- README checklist + input description clarify tag vs title.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 21:21:45 -05:00
Docker Build 656cf2d07d Paragon panel: keep cascade passives, strip free actives; DK passive DBC fix
- PanelLearnSpellChain: record every non-chain passive as panel_spell_child;
  only revoke non-passive (Blood Presence, Death Coil, Death Grip, etc.).
- RevokeUnwantedCascadeSpellsForPlayer: skip passive rewards on login sweep.
- RevokeBlockedSpellsForPlayer: migrate legacy passive revoke rows to
  children; walk (parent, revoked) pairs from DB.
- PruneSkillLineCascadeChildrenFromDb: only strip actives wrongly stored as
  children; never strip passives.
- SpellInfoCorrections: set SPELL_ATTR0_PASSIVE on Forceful Deflection (49410)
  and Runic Focus (61455) so IsPassive() matches spellbook behavior.
- PanelUnlearnTalentPurchase: mirror resetTalents (_removeTalentAurasAndSpells,
  _removeTalent, SendTalentsInfoData) so Beast Mastery loss triggers pet reset.
- OnPlayerLogin: run legacy passive attach before scoped cascade sweep.
- Add .paragon recalibrate GM command (RBAC modify): full panel reset + AE/TE
  reconciliation for selected player or self.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 22:20:13 -04:00
Docker Build bfe51f6ad4 docs(launcher): note manual Windows pack for local test
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 20:50:16 -05:00
Docker Build 2a3107a78d feat(launcher): Linux AppImage 1.0.2, Gitea sync + CI, manual pack script
- Add pack:linux (AppImage x64), linux/appImage artifact names in package.json.
- Gitea sync: parallel build-electron-linux, merge Windows+Linux into Gitea upload;
  rename Windows artifact to electron-dist-windows.
- Fractured launcher CI: electron-launcher-windows + electron-launcher-linux jobs.
- scripts/manual-pack-linux.sh for local test builds from current tree.
- Normalize Gitea base_url (prepend https if missing); baked channel uses full URL.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 20:50:06 -05:00
Docker Build 48826e21d6 refactor(launcher): hardcode Gitea channel in lib/baked-gitea-channel.js
- Merge baked base_url/owner/repo/release_tag at load time (no inject script,
  no fractured-release-channel.json, no CI env for pack).
- Fix mergeConfig deep-merge for gitea, patch_manifest, launcher_updates_from_github.
- Remove inject-release-channel.js and fractured-release-channel.json.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 20:15:38 -05:00
Docker Build 15c476c12d ci(gitea-sync): overlay launcher from default branch before pack
Release tags can point at commits older than launcher lib additions; building
only from the tag omitted gitea-release.js etc. Fetch default branch and
checkout tools/fractured-launcher-electron from it before npm ci/pack.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 19:15:27 -05:00
Docker Build 6c4d7244c3 fix(launcher): add missing gitea-release and patch-manifest to repo
These modules were required by main.js / auto-update.js / github.js but never
committed, so packaged builds lacked them and crashed at startup.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 19:08:55 -05:00
Docker Build 9fb80102c8 Paragon: spell unlearn queue + AE/TE reconciliation
Two related additions to mod-paragon:

  * HandleCommit gains a third payload section, " u:<id>,...", carrying
    spell IDs the player wants to refund and unlearn in the same commit
    that pushes its learns / talents through. The protocol stays
    backward-compatible (older clients simply omit the section).
    PanelUnlearnSpellPurchase mirrors
    the per-spell branch of HandleParagonResetAbilities: tracked passive
    children are removed first, then the chain head, then panel_spells /
    panel_spell_children / panel_spell_revoked rows for that purchase
    are dropped, then LookupSpellAECost(head) is refunded into the
    cache. Unlearns are applied before learns inside the commit so the
    refund covers the same-commit spends. Allow-list for the silence
    window now includes chain ranks + panel_spell_children for the
    intentional unlearns so "You have unlearned X" toasts stay visible
    for the targeted spell while cascade dependents stay silenced.
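
The ordering guarantee (unlearn refunds land before same-commit spends) reduces to a toy budget walk; the function name and exception are illustrative only:

```python
# Toy sketch of the commit ordering: unlearns refund first, then learns
# spend, so a refund can fund a purchase made in the same commit.
def apply_commit(balance: int, refunds: list, costs: list) -> int:
    balance += sum(refunds)          # unlearns applied first
    for cost in costs:               # then learns, in order
        if cost > balance:
            raise ValueError("insufficient essence")
        balance -= cost
    return balance
```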

  * ReconcileEssenceForPlayer reads panel_spells + panel_talents and
    sets the cache to ComputeStartingAE/TE(level) - sum-of-spends.
    Self-heals drift in either direction: clamps the cache down when
    the player has more essence than their level + spends allow
    (cheese clamp), and tops up when they have less (admin-tweak /
    crash recovery). Wired into OnPlayerLogin (after LoadCurrencyFromDb,
    before PushCurrency so the first balance the client sees is the
    reconciled one) and OnPlayerLevelChanged (replaces the old
    GrantLevelUpEssence delta -- Reconcile sets the correct absolute
    balance from level + spend, so it subsumes the per-level grant and
    the cheese clamp in one call). Costs come from the same
    paragon_spell_ae_cost / config keys HandleCommit uses so the math
    stays in lockstep across any future cost rebalance.
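
The reconciliation math itself is a single expression; the earn-rate constant and function names below are assumptions, not mod-paragon's real config keys:

```python
# Sketch of the ReconcileEssenceForPlayer math; AE_PER_LEVEL is a
# hypothetical earn rate, not the module's actual config value.
AE_PER_LEVEL = 5

def compute_starting_ae(level: int) -> int:
    return level * AE_PER_LEVEL

def reconcile_ae(level: int, total_spent: int) -> int:
    """Absolute target balance: clamps excess down (cheese clamp) and tops
    up shortfalls (admin tweak / crash recovery) in the same expression."""
    return max(0, compute_starting_ae(level) - total_spent)
```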

Both features ship in patch-enUS-6.MPQ v0.9.16: right-click a learned
spell row to queue an unlearn (header shows +N AE refund preview) and
hit Learn All to apply. The icon picker also got two fixes -- the
leading INV_Misc_QuestionMark is no longer duplicated, and the
selection ring is now a tooltip-border Frame anchored to the cell
bounds (the prior UI-ActionButton-Border texture rendered nearly
invisible at non-native sizes).

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 19:57:47 -04:00
Docker Build 7028258084 feat(launcher): bake Gitea base_url/owner/repo into pack from env or channel file
- inject-release-channel.js merges GITEA_* (or fractured-release-channel.json) into
  default-launcher.json before electron-builder.
- CI passes existing GITEA_BASE_URL/OWNER/REPO secrets into the Windows pack job.
- npm run pack:win/publish:win run the injector; workflows use npm run pack:win.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 17:33:28 -05:00
Docker Build 5966eb0ffc scripts: document default mysql acore/acore for FRACTURED_MYSQL example
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 16:24:37 -05:00
Docker Build 90c8db0b04 scripts: tee vps-paragon-diagnostics output to var/vps-paragon-diagnostics-last.txt
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 15:57:54 -05:00
Docker Build 9240bf1243 scripts: clarify empty spell_dbc samples; add version + rune override probes
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 15:53:57 -05:00
Docker Build 88f8dcb0e7 scripts: extend vps-paragon-diagnostics for rune/RP DBC and binary parity
- Binary sha256 + revision-like strings for dev vs VPS compare
- worldserver.conf Rate.RunicPower and mod_paragon.conf Paragon.* keys
- MySQL: chrclasses_dbc 6/12, spell_dbc sample, spellrunecost join
- FRACTURED_SPELL_IDS override for custom spell spot-checks

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 15:46:35 -05:00
Docker Build 9cb3c79dbe fix(launcher): opt-in GitHub auto-update; clarify Gitea for from_release
- Gate electron-updater GitHub provider on launcher_updates_from_github (default false)
  so GITHUB_TOKEN no longer targets the source repo without latest.yml.
- Improve GitHub releases 404 hint when assets are on Gitea.
- Document in README and default-launcher.json.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 15:38:07 -05:00
Docker Build 75e3b59442 chore(gitea): add bootstrap-gitea-repo.sh for initial README commit
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 15:29:07 -05:00
Docker Build 030c2307c2 scripts: add vps-paragon-diagnostics.sh for native VPS triage
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 15:19:59 -05:00
Docker Build 27d54f15a2 fix(gitea): document and explain HTTP 422 repo is empty on release create
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 14:37:02 -05:00
Docker Build 5e18c2b766 docs(ci): explain Re-run vs Run workflow for Gitea sync (GH_TOKEN error)
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 14:29:21 -05:00
Docker Build 1c85341b1f ci: disable electron-builder GitHub publish; add Gitea sync workflow
- Use --publish never in pack/CI so tagged builds do not require GH_TOKEN.
- Set build.publish to null and align publish:win with local-only packaging.
- Add Gitea release sync workflow and upload script; fetch script from default
  branch so reruns work for tags that predate the script.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 14:24:42 -05:00
Docker Build ef02839ea0 Paragon: Save-Current build, archive retired share codes, reset clears active
Server side of the v0.7.10 Builds drop. Squashes a few footguns from
the original Builds catalog and adds a one-click "save what I have
right now" path the Overview pane can hook directly into.

- HandleBuildSaveCurrent: new C BUILD SAVE_CURRENT verb. Inserts a
  fresh build row, snapshots the live panel into its recipe, sets it
  active. No AE/TE motion, no relearning -- just a named slot for
  whatever the player already has.
- Reset abilities / Reset talents now SetActiveBuildId(0) and re-push
  the catalog. Without this, the next swap silently overwrote the
  active build's saved recipe with the (now empty/partial) post-reset
  state -- effectively erasing the build.
- Delete of the *active* build is now a hard reset (HandleParagonResetAll):
  unlearn everything the panel bought, refund all AE/TE. Deleting a
  non-active slot still just removes the saved recipe row + parked pet.
- Load of the currently-active build is now a "revert to last snapshot"
  instead of a no-op refresh: keeps the saved recipe authoritative,
  parks the pet, resets, re-applies. Useful for discarding pending
  edits.
- After a successful Learn All while a build is active: archive the
  build's previous share_code + recipe into
  character_paragon_build_share_archive* (so codes already posted to
  Discord keep importing the frozen loadout), snapshot the new panel
  into the live build, assign a fresh share_code, push catalog.
- HandleBuildImport now falls back to the archive tables when a code
  isn't in the live catalog -- old shared codes resurrect the recipe
  they pointed at when they were retired.
- Imports never copy pet_number (the parked pet belongs to the source
  player); if the imported recipe contains Tame Beast we hint that the
  importer needs to tame their own pet.
- BuildPanelOwnedSpellsAllowlist now walks SPELL_EFFECT_LEARN_SPELL
  effects on talent rank spells (Mangle, Feral Charge, Mutilate, ...)
  so the login cascade sweep stops revoking talent-granted active
  abilities.

Schema: new mod-paragon migration 2026_05_10_05.sql adds
character_paragon_build_share_archive (+ _spells / _talents).

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 15:12:12 -04:00
Docker Build 377927b878 chore(launcher): Electron-only distro, CI sync with Windows pack
2026-05-10 12:34:43 -05:00
Docker Build a251e56c59 Paragon: Builds QoL -- share codes, unload, remaining AE/TE on hover
- Replace the "favorite" toggle with import-by-share-code: every build
  gets a 6-char realm-unique alphanumeric code on creation; pasting one
  into the BuildsPane share box copies the recipe (name + icon + spells
  + talents) into the importer's catalog as a new build, with a fresh
  share code so the imported copy can be re-shared independently.
- Add C BUILD UNLOAD verb so the client can clear a stale active-build
  pointer without forcing a swap. Wired to a new "Unload (clear active)"
  right-click context menu entry on the active build.
- Per-build tooltip now shows "Remaining if loaded: X AE / Y TE",
  computed server-side as total_earned - recipe_cost. Negative renders
  red so the player sees insufficient-currency cases before clicking
  Load. Suppressed for the active build (HandleBuildLoad short-circuits
  on target == active so the line would be misleading).
- Schema migration 2026_05_10_04.sql: drop is_favorite from
  character_paragon_builds and add share_code CHAR(6) UNIQUE NULL with
  lazy backfill on every PushBuildCatalog (so pre-migration rows pick
  up codes the first time the player opens the panel).
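
The lazy backfill amounts to a retry loop against a uniqueness set; the alphabet and collision handling here are assumptions about the implementation:

```python
import random
import string

ALPHABET = string.ascii_uppercase + string.digits  # assumed code alphabet

def new_share_code(taken: set) -> str:
    """Generate a realm-unique 6-char alphanumeric code, retrying on
    collision, and record it so later calls stay unique."""
    while True:
        code = "".join(random.choices(ALPHABET, k=6))
        if code not in taken:
            taken.add(code)
            return code
```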

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 04:15:11 -04:00
Docker Build 7de018f7eb Paragon: add Builds catalog (saved loadouts with pet park/unpark)
Server-side Character Advancement now stores named, icon-tagged build
recipes (panel-purchased spells + per-spec talent ranks) and atomically
swaps between them by snapshotting the active build, refunding AE/TE
through HandleParagonReset{Talents,Abilities}, and re-spending on the
target recipe. Hunter pets attached to a build are parked to
PET_SAVE_NOT_IN_SLOT (mirroring HandleStableSwapPet) so name, talents,
and exp survive swaps; non-hunter pets (warlock demon, DK ghoul, mage
water elemental) are NOT parked because the engine resummons them from
a fresh template each cast.

New PARAA verbs: Q BUILDS / C BUILD NEW / C BUILD EDIT / C BUILD
DELETE / C BUILD FAVORITE / C BUILD LOAD. The catalog is pushed on
login and after every mutation as a single addon message.

Schema (mod-paragon migration 2026_05_10_03.sql):
- character_paragon_builds (build_id PK, guid, name, icon, is_favorite,
  pet_number, created_at)
- character_paragon_build_spells (build_id, spell_id)
- character_paragon_build_talents (build_id, spec, talent_id, rank)
- character_paragon_active_build (guid PK, build_id)

The talent recipe table is spec-keyed so a build remembers tank/dps
dual-spec layouts independently. Swaps are blocked while in combat.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 02:35:55 -04:00
Docker Build abb25f56d1 Paragon: expand IsClass hooks and addon pet talent reset
Broaden OnPlayerIsClass for CLASS_CONTEXT_ABILITY, pet/charm/equip contexts; add PARAA C RESET PET TALENTS handler. Update CLIENT-PATCHES.md for patch-enUS-5/6 and PARAA.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-10 01:36:54 -04:00
Docker Build 7a92231614 Add scripts/vps-update-server.sh for native VPS git pull and compile
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 21:42:38 -05:00
Docker Build f2952c905a Fractured: strip class-spell reagents at load; Paragon relic ranged slot
- SpellInfoCorrections: zero Reagent/ReagentCount on spells with non-zero
  SpellFamilyName so class abilities no longer require shards, candles,
  etc., while profession crafts (SpellFamilyName 0) keep mats. Matches
  the client Spell.dbc bake in patch-enUS-4.MPQ.
- Paragon_SC: OnPlayerIsClass returns true for CLASS_CONTEXT_EQUIP_RELIC
  for paladin/druid/shaman/warlock/dk so Paragon can equip all relic types
  in the ranged slot.
- CLIENT-PATCHES: document Spell.dbc reagent pass, rune script order, and
  stock ammo slot behavior in patch-enUS-5.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 22:38:32 -04:00
Docker Build 8abd40f217 Paragon: give class 12 intrinsic AP and SP scaling from stats
Stock 3.3.5 hardcodes per-class stat -> AP/SP formulas in
Player::UpdateAttackPowerAndDamage and Unit::SpellBase{Damage,Healing}BonusDone,
so class 12 fell into the default branches and ended up with 0 AP and 0 SP
regardless of STR / AGI / INT / SPI. The character sheet, combat log, and
ability damage all reflected this, and Mental Quickness-style AP->SP plumbing
silently no-oped on Paragon characters.

Add Paragon-specific branches in core (no PlayerScript hooks - those caused
SIGSEGVs when the new mid-list enum entry shifted later hook ordinals and
broke vtable dispatch):

- StatSystem.cpp: melee and ranged AP = level*2 + STR + AGI - 20, mirroring
  the formula the UI patch already advertises in tooltips.
- Unit.cpp:       intrinsic SP    = level*2 + INT + SPI - 20 (clamped >=0),
  added symmetrically to SpellBaseDamageBonusDone and
  SpellBaseHealingBonusDone so the single advertised Spell Power value the
  character sheet renders matches what spells actually use in combat.

Drop the now-unused UnitDefines.h include in Paragon_SC.cpp - it was only
needed by the AP PlayerScript hook that was rolled back in favor of the
core change.
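
The two formulas are simple enough to check in isolation; this is a sketch of the math, not the StatSystem.cpp / Unit.cpp code:

```python
def paragon_attack_power(level: int, strength: int, agility: int) -> int:
    # melee and ranged AP = level*2 + STR + AGI - 20
    return level * 2 + strength + agility - 20

def paragon_spell_power(level: int, intellect: int, spirit: int) -> int:
    # intrinsic SP = level*2 + INT + SPI - 20, clamped at >= 0
    return max(0, level * 2 + intellect + spirit - 20)
```

A level-1 character with starting stats correctly lands at 0 SP rather than a negative value, which is why the clamp matters.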

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 20:39:23 -04:00
Docker Build 34cc87a5f9 mod-paragon: fix panel cascade sweep revoking talent-granted spells
RevokeUnwantedCascadeSpellsForPlayer and RevokeBlockedSpellsForPlayer
built their allowlist only from character_paragon_panel_spells and
panel_spell_children. Many Character Advancement "abilities" (e.g.
Scourge Strike) are panel talents stored in character_paragon_panel_talents,
so learning Death Coil afterward activated DK skill lines and the sweep
removed those spells as false orphans.

Add BuildPanelOwnedSpellsAllowlist to union spell chains, talent rank spell
IDs up to the purchased rank, and passive children. Also keep the prior
fixes: clear stale panel_spell_revoked rows on purchase and skip+delete
revoke entries that now match the allowlist on login.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 17:37:08 -04:00
Docker Build f986fdcddd mod-paragon: let class 12 actually USE class-restricted glyphs / items
Two paths still rejected glyph use on Paragon characters even after the
earlier AllowableClass server bypass:

1. Spell::CheckItems (server) treated cast-from-glyph as a normal
   "equipped item required" cast and called HasItemFitToSpellRequirements,
   which only handles weapon/armor and falls through default for
   ITEM_CLASS_GLYPH -> SPELL_FAILED_EQUIPPED_ITEM_CLASS. Skip that check
   when the cast item itself is the glyph.

2. The 3.3.5 client engine pre-checks ItemTemplate.AllowableClass against
   the player's class locally and refuses the right-click before sending
   CMSG_USE_ITEM, regardless of what the server would do. Bake the
   Paragon class bit (1<<11 = 2048) into AllowableClass for every
   class-restricted item via a mod-paragon SQL migration so the engine's
   pre-check passes for class 12.

Cache caveat: clients that previously inspected an affected item have
the old AllowableClass cached in Cache/<locale>/itemcache.wdb; deleting
the Cache folder forces a re-query. The server also caches item_template
in memory at boot, so this migration only takes effect for clients after
a worldserver restart (or .reload item_template) once the SQL has been
applied -- DBUpdater handles the SQL automatically on the next start.
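
The bit the migration bakes in can be sketched as a per-row mask update; the -1 sentinel for "all classes" is an assumption about how the rows are stored:

```python
PARAGON_BIT = 1 << 11  # 2048, the class-12 mask bit

def bake_paragon_bit(allowable_class: int) -> int:
    """OR class 12 into a class-restricted AllowableClass mask so the
    client's local pre-check passes; -1 (everyone) is left untouched."""
    if allowable_class == -1:
        return allowable_class
    return allowable_class | PARAGON_BIT
```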

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 16:52:37 -04:00
Docker Build a212717c37 docs: note tooltip 'Classes:' patcher in patch-enUS-5 description
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 16:08:19 -04:00
Docker Build 49cb354133 mod-paragon: ignore item AllowableClass for class 12 (gear + glyphs)
Paragon is a classless concept layered on top of WotLK class data: stock
items and class glyphs gate equip / vendor visibility / loot rolls / AH
'usable' filter via ItemTemplate.AllowableClass, which never has the
class-12 bit (0x800). Bypassing the gate at the five enforcement sites
lets Paragon equip any class-restricted item -- including class glyphs,
since EffectApplyGlyph itself has no class check beyond the item gate.
Race / level / proficiency / skill / required-spell checks still apply,
so Paragon can't skip baseline progression.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 16:07:01 -04:00
Docker Build 7298d89c9a mod-paragon: default HasActivePowers on for rage from white hits
Paragon OnPlayerHasActivePowerType only reported POWER_RAGE when
Paragon.MultiResource.HasActivePowers was true. Core melee rage uses
Unit::DealDamage -> HasActivePowerType(POWER_RAGE) before RewardRage;
missing module config (common on fresh clones / Docker without merged
mod_paragon.conf) fell through to GetOption(..., false) and white swings
never generated rage. Match mod_paragon.conf.dist and default the C++
fallback to true so Paragon behaves correctly out of the box. Set
Paragon.MultiResource.HasActivePowers = 0 only for intentional test builds.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 15:54:39 -04:00
Docker Build 3a2ae82593 mod-paragon: combined Arcane Torrent also refunds 15 rage
Extend spell_paragon_arcane_torrent to EnergizeBySpell POWER_RAGE 150 (15
displayed; rage uses the same 10x internal scaling as runic power, see the
`-20` rage decay step in Player::Regenerate). Paragon's combined Arcane
Torrent now refunds mana, rage, energy, and runic power -- whichever pool
the character is using at the moment. ModifyPower no-ops on pools with
MaxPower == 0, so it's safe even before the Paragon picks up rage abilities.
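
The 10x internal scaling mentioned above means the energize amount is authored at ten times the displayed value; a quick sanity check:

```python
SCALE = 10  # rage and runic power store 10x the displayed value

def displayed(internal: int) -> int:
    """Convert an internal power amount to what the UI shows."""
    return internal // SCALE
```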

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 15:51:37 -04:00
Docker Build 16717acdd3 mod-paragon: combined Arcane Torrent that refunds mana, energy, and runic power
Building on the previous fix that hid the rogue and DK Arcane Torrent variants
for Paragon Blood Elves: instead of just dropping the duplicates, turn the
remaining mana variant (28730) into a single combined racial that refunds
whichever resource pool the character is using at the moment.

Add SpellScript spell_paragon_arcane_torrent in modules/mod-paragon/src/
Paragon_SC.cpp. Hooks AfterCast on 28730: when the caster is class 12 the
script EnergizeBySpell's 15 energy and 150 internal runic power (= 15 displayed,
matching stock 25046 / 50613 amounts) on top of the spell's stock mana effect.
ModifyPower no-ops on pools the player has no max for, so it is safe even
before the Paragon picks up energy- or RP-using abilities. Non-Paragon Blood
Elves are untouched and keep learning their stock racial.

Update migration 2026_05_10_03.sql to also register the script binding via
spell_script_names (28730 -> 'spell_paragon_arcane_torrent'). Idempotent
DELETE + INSERT.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 15:44:42 -04:00
Docker Build d96123e661 mod-paragon: single Arcane Torrent for Paragon Blood Elves
Blood Elf racial skill line 756 grants three different Arcane Torrent spell
IDs (28730 mana, 25046 rogue energy, 50613 DK runic power). The blanket
SkillLineAbility overlay in 2026_05_10_02 OR'd class 12 into all three, so
Paragon Blood Elves auto-learned every variant and the spellbook listed three
identical "Arcane Torrent" entries.

Add db-world migration 2026_05_10_03.sql to clear the class-12 bit on the rogue
and DK rows only (SkillLineAbility IDs 13338 and 17510), leaving 28730 as the
sole Paragon-visible racial cast. OnPlayerLogin removes 25046/50613 if still
present so existing characters self-heal without a manual unlearn.

The fractured-tooling DBC overlay generator is updated in the same workspace
to skip those two rows when regenerating SkillLineAbility SQL.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 15:34:02 -04:00
Docker Build 8a0da95ed2 mod-paragon: bypass talent DependsOn check for Paragon class
Player::LearnTalent enforces the column-arrow prereq (talentInfo->DependsOn)
even when called with command=true, so Character Advancement's commit path
was silently dropping any talent whose Talent.dbc row points to an unrelated
sibling -- e.g. Deep Wounds (depends on Improved Heroic Strike), Bloody
Vengeance (depends on Dark Conviction), Expose Weakness (depends on Lethal
Shots). Players spent points in the panel, hit Learn All, and the talent
silently never reached addTalent / OnPlayerLearnTalents -- the snapshot came
back without it and the client repainted the points as "unspent."

The Character Advancement panel gates progression via AE/TE essence cost,
not via the spec-tree column arrows, so the DependsOn rule doesn't apply to
class 12. Skip it for Paragon, mirroring the existing class-mask bypass a
few lines above.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 15:16:30 -04:00
Docker Build 8363b1b6c8 Paragon: ship class-12 SkillLineAbility overlay so proficiency passives auto-learn
Companion to 2026_05_09_00.sql (DBC overlay for chrclasses + srci) and
2026_05_10_01.sql (proficiency skill rows in playercreateinfo_skills).
Those two grant the SKILL (Maces, Shield, Cloth, ...) to Paragon at
character creation; this one opens the SkillLineAbility rows that
CASCADE skill -> passive spell, so when a fresh Paragon is created
AC's `Player::LearnDefaultSkill` actually grants the proficiency
passives:

  Block (107), Parry (3127), Dual Wield (674), Defense, weapon Shoot,
  racial Mace/Sword Specialization, ...

Without this overlay, a class-12 Paragon spawns with the right skill
rows but a near-empty spellbook past the racials and class defaults
that come from playercreateinfo_action.

How it works
------------
AC's DBCStores.cpp::LoadDBC loads each store from the on-disk .dbc
file first, then merges <table>_dbc world-DB rows on top. Our patched
client SkillLineAbility.dbc (in patch-enUS-4.MPQ) OR's the class-12
bit (0x800) into ClassMask on 3,314 rows -- the same rows the server
needs for the cascade to fire on Paragon. Stock Docker installs use
the upstream `ac-wotlk-client-data` image which fills data/dbc/ from
a vanilla 3.3.5a extract, so without this SQL overlay the server
runs against an unmodified SkillLineAbility.dbc and the cascade
never fires.

Generation
----------
Auto-generated end-to-end by
`fractured-tooling/from-workspace-root/_gen_paragon_dbc_overlay_sql.py`,
extended in this commit to handle SkillLineAbility.dbc (14-int
WotLK layout, 56 bytes per record). The script diffs patched vs
stock by ID, keeps only rows whose stock ClassMask did NOT include
the class-12 bit but whose patched ClassMask does, and emits the
3,314 REPLACE INTO rows. Re-running with the same inputs is byte-
stable.
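
The generator's diff rule reduces to a single predicate per row (a sketch; record parsing and row iteration are omitted):

```python
PARAGON_BIT = 0x800  # class-12 bit in SkillLineAbility ClassMask

def needs_overlay(stock_mask: int, patched_mask: int) -> bool:
    """Keep only rows where the patched DBC newly adds the class-12 bit:
    stock lacked 0x800, patched carries it."""
    return not (stock_mask & PARAGON_BIT) and bool(patched_mask & PARAGON_BIT)
```

For example, Block's patched ClassMask 2115 (0x843) against a stock 0x43 qualifies, while a row already carrying the bit in stock does not.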

Verified locally
----------------
- Migration applies twice in a row at exactly 3,314 SQL overlay rows
  (idempotent: DELETE WHERE ID IN (...) before INSERT).
- ac-worldserver restart logs:
    >> Loaded 10219 SkillLineAbility MultiMap Data
  -- the same total as stock (10,219 rows), confirming our overlay
  REPLACES existing rows by ID rather than appending duplicates.
- Spot-checked spell IDs: 107 (Block, ClassMask 2115 = warrior +
  paladin + dk + Paragon), 3127 (Parry, 2063), 674 (Dual Wield,
  2157), 75 (Auto Shoot, 2052) all carry the 0x800 bit.

Existing characters
-------------------
The cascade fires inside Player::Create and Player::LearnDefaultSkill
at character spawn, so existing class-12 characters created before
this migration keep their broken state. Delete and re-roll, or hand-
grant the missing spells via .learn for individual existing chars.

CLIENT-PATCHES.md updated to add the third symptom ("proficiency
skills exist but passive spells don't auto-learn") and document
this migration as the fourth piece of the class-12 bootstrap.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 14:38:55 -04:00
Docker Build 2874119c6d Paragon: ship class-12 weapon/armor proficiencies as SQL migration
Companion to 2026_05_10_00.sql. The spawn-data migration teaches the
worldserver where Paragon characters spawn and what per-level base
stats they have; this one teaches it which weapon/armor skill lines
to grant at first character login.

Without these rows a fresh Paragon character lands in their newbie
zone with no weapon or armor proficiencies (auto-attack greys out
on anything beyond a fist) -- the universal classMask=0 rows in
playercreateinfo_skills only cover Defense, Unarmed, Cloth,
languages, Mounts, and Companion Pets.

Adds 20 rows in playercreateinfo_skills with classMask=2048 (class
12 only) for every weapon and armor proficiency:
  - Weapons: Swords, Axes, Bows, Guns, Maces, 2H Swords, Dual Wield,
             Staves, 2H Maces, 2H Axes, Daggers, Thrown, Crossbows,
             Wands, Polearms, Fist Weapons.
  - Armor:   Plate Mail, Mail, Leather, Shield. (Cloth already
             granted via the classMask=0 universal row.)

Idempotent: DELETE WHERE classMask=2048 then INSERT, so it replays
cleanly on a partially-seeded DB (e.g. one where a contributor hand-
patched these rows before the migration landed).
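The replay-safe DELETE-then-INSERT shape can be demonstrated against an in-memory SQLite table; the table and column names follow the commit, but the skill IDs and the hand-patched row are made up for illustration:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE playercreateinfo_skills (classMask INT, raceMask INT, skill INT, comment TEXT)")
# Simulate a contributor having hand-patched one class-12 row before the migration landed.
con.execute("INSERT INTO playercreateinfo_skills VALUES (2048, 0, 43, 'Swords (hand-patched)')")

MIGRATION = [
    "DELETE FROM playercreateinfo_skills WHERE classMask = 2048",   # wipe any partial seed
    "INSERT INTO playercreateinfo_skills VALUES (2048, 0, 43, 'Swords')",
    "INSERT INTO playercreateinfo_skills VALUES (2048, 0, 44, 'Axes')",
]

def apply(con):
    for stmt in MIGRATION:
        con.execute(stmt)

apply(con)  # first replay: replaces the hand-patched row with the canonical set
apply(con)  # second replay: identical end state, no duplicates
count = con.execute("SELECT COUNT(*) FROM playercreateinfo_skills WHERE classMask = 2048").fetchone()[0]
print(count)
```

Running the migration any number of times converges on the same two canonical rows.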

Verified locally: applies cleanly twice in a row, worldserver restart
now logs `>> Loaded 1391 Player Create Skills` (was 1371 pre-Paragon
= +20 class-12 rows) and a freshly-rolled Draenei Paragon spawns with
the full weapon/armor kit.

CLIENT-PATCHES.md troubleshooting block updated to call out the
"Paragon spawns naked / can't equip anything" failure mode and list
all three migrations in the current rebuild recipe.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 13:38:27 -04:00
Docker Build 56fa2fc7f7 docs(client): note spellbook expansion in patch-enUS-5.MPQ
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 13:24:50 -04:00
Docker Build 5deb9e3255 Paragon: ship class-12 starter spawn data so character creation works on fresh installs
Companion to 2026_05_09_00.sql (DBC overlay). The DBC overlay teaches
the world server that class 12 (Paragon) exists; this migration
teaches it WHERE class-12 characters spawn, what action bar they boot
with, and what per-level base stats Player::InitStatsForLevel uses.

Without these rows, contributors hit:
  - Player::Create -> "invalid race/class pair (R/12) - refusing"
    and the client shows "Error creating character".
  - WorldServer load -> "class-12 Level-L does not have stats data!"
    integrity warnings.

Tables touched (idempotent: DELETE WHERE class=12 then INSERT):
  - playercreateinfo         : 10 rows, every DK-eligible race spawning
                               in their racial newbie zone (Northshire,
                               Valley of Trials, Ammen Vale, ...).
                               NOT Acherus -- Paragon is from-level-1.
  - playercreateinfo_action  : 46 rows, default action bar layout
                               per race (attack 6603, eat 78, racial,
                               etc.).
  - player_class_stats       : 80 rows, per-level base HP/Mana/STR/AGI/
                               STA/INT/SPI. Curve mirrors Warrior to
                               level 60, Paladin-style HP inflation
                               past 60 to keep Paragon competitive
                               in Wrath content.

Tables intentionally untouched: playercreateinfo_item is empty for
class 12 (Paragon ships no per-class starting items, only racial
kit), and the mask-based playercreateinfo_skills/_cast_spell/
_spell_custom rows already cover class 12 via their classMask=0
"all classes" entries.

Verified locally: applies cleanly twice in a row (idempotent),
worldserver restart now logs `>> Loaded 72 Player Create Definitions`
(was 62 pre-Paragon = +10 races for class 12) and creates a Draenei
Paragon without rejection.

CLIENT-PATCHES.md troubleshooting block updated to merge the two
"Character Creation Failed" modes (DBC overlay missing + spawn data
missing) into a single fix recipe. Existing contributors with a
pre-built dbimport image need
`docker compose build ac-db-import ac-worldserver` before this
migration is visible to DBUpdater; fresh clones get it on first
`docker compose up`.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 13:06:39 -04:00
Docker Build ecd8eacb1f chore(conf): revert seed defaults to stock so fresh installs auto-connect
The previous seed pinned auth/realmlist to production values
(`hsrwow.net` + RealmServerPort 47497), which silently bricked every
fresh local install: after auth login the realm hand-off pointed
clients at our public host, where their local credentials don't
exist, and they were dropped within a frame.

Seed now matches stock AzerothCore for solo dev:
- realmlist.address  = 127.0.0.1   (was hsrwow.net)
- RealmServerPort    = 3724        (was 47497)

Production owners apply both overrides post-dbimport via a one-shot
SQL UPDATE + an authserver.conf edit. Documented end-to-end in
contrib/fractured-dev-extras/BUILD-NATIVE.md (new "Production
deployment overrides" section) and the disconnect-after-login
symptom is called out in CLIENT-PATCHES.md.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 12:44:45 -04:00
Docker Build 1811c0ec35 Fix MariaDB dbimport: utf8mb4 collation and dbimport.conf
Use utf8mb4_unicode_ci in base SQL (MariaDB lacks utf8mb4_0900_ai_ci).
Add Updates.ExceptionShutdownDelay to dbimport.conf.dist to match
DBUpdater expectations.
2026-05-09 11:41:57 -05:00
Docker Build fae3ff5028 Paragon: ship server-side DBC overlay as SQL so fresh installs can roll class 12
Stock Docker installs fill data/dbc/ from the vanilla 3.3.5a extract
in `ac-wotlk-client-data`, which has no class 12 in ChrClasses.dbc and
no class-12 bit on SkillRaceClassInfo.dbc. CharacterHandler.cpp's
sChrClassesStore.LookupEntry(12) returns null and the create fails
with CHAR_CREATE_FAILED ("Class (12) not found in DBC ...") before the
contributor ever sees the panel. Fixing it required hand-copying the
patched DBCs onto the named volume — undocumented, fragile, and not
portable to native installs.

DBCStores.cpp::LoadDBC merges every <table>_dbc world-DB row on top of
the on-disk DBC store (storage.LoadFromDB after storage.Load). We use
that merge layer to ship Paragon's class-12 deltas as SQL:

- chrclasses_dbc: 1 row defining class 12 (Paragon, power=Mana,
  family=Warrior, expansion=2). Resolves CHAR_CREATE_FAILED.
- skillraceclassinfo_dbc: 235 rows REPLACEing stock entries with the
  patched ClassMask (class-12 bit OR'd in) so baseline skills (defense,
  weapon skills, etc.) are available to Paragon characters.
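The LoadFromDB-after-Load layering described above amounts to a replace-by-ID merge; a minimal model (dict-based, field names illustrative) shows why the loaded-row total stays unchanged when every overlay row shadows an existing on-disk ID:

```python
def merge_dbc(disk_rows, db_rows):
    """Model of DBCStores-style layering: world-DB rows replace on-disk rows by ID."""
    merged = {r["ID"]: r for r in disk_rows}
    merged.update({r["ID"]: r for r in db_rows})   # same ID -> replace, new ID -> append
    return merged

disk = [{"ID": i, "ClassMask": 67} for i in range(5)]        # pretend on-disk store
overlay = [{"ID": 2, "ClassMask": 67 | 0x800}]               # class-12 bit OR'd in
merged = merge_dbc(disk, overlay)
print(len(merged), merged[2]["ClassMask"])
```

Because the overlay only REPLACEs existing IDs, the merged count equals the stock count, which is why the worldserver's loaded-row totals stay the same after the migration.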

The new `modules/mod-paragon/data/sql/db-world/updates/2026_05_09_00.sql`
is applied automatically by AC's DBUpdater on every fresh `ac-db-import`
run (Docker) or first worldserver boot (native). End-to-end verified
locally: truncate -> docker compose up ac-db-import -> rows reappear
with hash 33B1A05 recorded in the updates table.

The migration is auto-generated by
fractured-tooling/from-workspace-root/_gen_paragon_dbc_overlay_sql.py
(outside this repo per the repo-tidy policy). Re-run it whenever the
DBC bake changes.

CLIENT-PATCHES.md is rewritten so contributors no longer need the
manual DBC sync section as their primary install path. Manual overlay
is preserved as a labelled fallback for tools that read data/dbc/
directly.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 12:19:59 -04:00
Docker Build 20a24b7935 docs(client): document worldserver DBC sync for Paragon character create
Explains why Character Creation Failed occurs when the client has
patch-enUS-4 but Docker/native data/dbc is still vanilla: ChrClasses
row 12 only exists in the patched DBC set. Adds Docker volume copy
steps, native install path, and log verification.

Co-authored-by: Cursor <cursoragent@cursor.com>
2026-05-09 12:00:16 -04:00
Docker Build 526022e2bc Add VPS clone scripts for sparse checkout without Docker
Track vps-clone-without-docker.sh and vps-sparse-checkout-no-docker.sh
so a fresh GitHub clone can run the helper end-to-end. Mark both
executable in git for ./ usage after clone.
2026-05-09 10:41:40 -05:00
63 changed files with 16104 additions and 148 deletions
@@ -0,0 +1,150 @@
```yaml
# When a release is published on this repo (or manual dispatch):
#   1. Builds the Electron launcher from that tag (npm run pack:win).
#   2. Downloads any assets attached to the same release on this repo (patches, Wow exe, …).
#   3. Merges them (launcher files win on name collision) and creates/updates the matching
#      release on Fractured-Distro.
#
# Setup (GitHub → Settings → Secrets and variables → Actions):
#   DISTRO_SYNC_TOKEN — PAT with releases write on Fractured-Distro (see repo README).
#
# Change DISTRO_REPO or the job `if:` if your GitHub slugs differ.
name: Sync release to Fractured-Distro

on:
  release:
    types: [published]
  workflow_dispatch:
    inputs:
      tag:
        description: 'Release tag on this repo (must exist; e.g. v1.0.0)'
        required: true
        type: string

permissions:
  contents: read

env:
  DISTRO_REPO: Dawnforger/Fractured-Distro

jobs:
  meta:
    runs-on: ubuntu-latest
    if: github.repository == 'Dawnforger/Fractured'
    outputs:
      tag: ${{ steps.t.outputs.tag }}
    steps:
      - name: Resolve tag
        id: t
        shell: bash
        run: |
          if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
            echo "tag=${{ inputs.tag }}" >> "$GITHUB_OUTPUT"
          else
            echo "tag=${{ github.event.release.tag_name }}" >> "$GITHUB_OUTPUT"
          fi

  build-electron:
    needs: meta
    if: github.repository == 'Dawnforger/Fractured'
    runs-on: windows-latest
    timeout-minutes: 45
    steps:
      - uses: actions/checkout@v4
        with:
          ref: ${{ needs.meta.outputs.tag }}
      - uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: npm
          cache-dependency-path: tools/fractured-launcher-electron/package-lock.json
      - name: Install and pack (NSIS + portable)
        working-directory: tools/fractured-launcher-electron
        run: |
          npm ci
          npm run pack:win
      - name: Stage launcher files for upload
        shell: pwsh
        run: |
          New-Item -ItemType Directory -Force -Path launcher-publish | Out-Null
          Copy-Item tools/fractured-launcher-electron/dist/*.exe launcher-publish/
          if (Test-Path tools/fractured-launcher-electron/dist/latest.yml) {
            Copy-Item tools/fractured-launcher-electron/dist/latest.yml launcher-publish/
          }
          Get-ChildItem tools/fractured-launcher-electron/dist/*.blockmap -ErrorAction SilentlyContinue |
            Copy-Item -Destination launcher-publish/
      - uses: actions/upload-artifact@v4
        with:
          name: electron-dist
          path: launcher-publish/

  sync-distro:
    needs: [meta, build-electron]
    if: github.repository == 'Dawnforger/Fractured'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: electron-dist
          path: /tmp/electron
      - name: Merge main release assets + Electron build
        env:
          GH_TOKEN: ${{ github.token }}
        run: |
          set -euo pipefail
          TAG="${{ needs.meta.outputs.tag }}"
          mkdir -p combined
          mkdir -p /tmp/from-main
          if gh release download "$TAG" -R "${{ github.repository }}" -D /tmp/from-main 2>/tmp/dl.err; then
            shopt -s nullglob
            for f in /tmp/from-main/*; do
              if [ -f "$f" ]; then
                cp -f "$f" combined/
              fi
            done
            echo "Merged assets from ${{ github.repository }} release $TAG"
          else
            echo "Main release download note (continuing with launcher only):"
            cat /tmp/dl.err || true
          fi
          shopt -s nullglob
          for f in /tmp/electron/*; do
            if [ -f "$f" ]; then
              cp -f "$f" combined/
            fi
          done
          echo "Combined directory:"
          ls -la combined/
      - name: Upload to Fractured-Distro
        env:
          GH_TOKEN: ${{ secrets.DISTRO_SYNC_TOKEN }}
        run: |
          set -euo pipefail
          if [ -z "${GH_TOKEN:-}" ]; then
            echo "Missing secret DISTRO_SYNC_TOKEN (PAT with access to $DISTRO_REPO)."
            exit 1
          fi
          TAG="${{ needs.meta.outputs.tag }}"
          shopt -s nullglob
          files=(combined/*)
          if [ "${#files[@]}" -eq 0 ]; then
            echo "Nothing to upload (Electron pack produced no files?)."
            exit 1
          fi
          SRC_URL="https://github.com/${{ github.repository }}/releases/tag/${TAG}"
          if gh release view "$TAG" -R "$DISTRO_REPO" &>/dev/null; then
            gh release upload "$TAG" -R "$DISTRO_REPO" "${files[@]}" --clobber
            echo "Uploaded (clobber) to $DISTRO_REPO release $TAG"
          else
            gh release create "$TAG" -R "$DISTRO_REPO" \
              --title "Fractured $TAG" \
              --notes "Synced from [$TAG]($SRC_URL) on ${{ github.repository }}. Includes CI-built Electron launcher + release assets." \
              "${files[@]}"
            echo "Created $DISTRO_REPO release $TAG with ${#files[@]} asset(s)."
          fi
```
@@ -0,0 +1,81 @@
```yaml
# Verifies Electron launcher Windows pack and uploads installers for testing.
name: Fractured launcher CI

on:
  workflow_dispatch:
  push:
    branches: [master, main]
    paths:
      - 'tools/fractured-launcher-electron/**'
      - '.github/workflows/fractured-launcher-ci.yml'
  pull_request:
    paths:
      - 'tools/fractured-launcher-electron/**'
      - '.github/workflows/fractured-launcher-ci.yml'

permissions:
  contents: read

concurrency:
  group: fractured-launcher-ci-${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  electron-launcher-windows:
    runs-on: windows-latest
    timeout-minutes: 45
    defaults:
      run:
        working-directory: tools/fractured-launcher-electron
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: npm
          cache-dependency-path: tools/fractured-launcher-electron/package-lock.json
      - name: Install and pack (NSIS + portable)
        run: |
          npm ci
          npm run pack:win
      - uses: actions/upload-artifact@v4
        with:
          name: fractured-launcher-electron-windows-${{ github.run_id }}
          if-no-files-found: warn
          path: |
            tools/fractured-launcher-electron/dist/*.exe
            tools/fractured-launcher-electron/dist/latest.yml
            tools/fractured-launcher-electron/dist/*.blockmap

  electron-launcher-linux:
    runs-on: ubuntu-latest
    timeout-minutes: 45
    defaults:
      run:
        working-directory: tools/fractured-launcher-electron
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: npm
          cache-dependency-path: tools/fractured-launcher-electron/package-lock.json
      - name: Install and pack (AppImage)
        run: |
          npm ci
          npm run pack:linux
      - uses: actions/upload-artifact@v4
        with:
          name: fractured-launcher-electron-linux-${{ github.run_id }}
          if-no-files-found: warn
          path: |
            tools/fractured-launcher-electron/dist/*.AppImage
            tools/fractured-launcher-electron/dist/*.yml
            tools/fractured-launcher-electron/dist/*.blockmap
```
@@ -0,0 +1,241 @@
```yaml
# Primary path for player-facing binaries: every *published* GitHub Release on this repo
# is mirrored to your self-hosted Gitea (same tag). No public GitHub distro repo.
#
# Triggers:
#   - release: published / released → GitHub “Release” (not a raw git tag alone).
#   - workflow_dispatch → Actions → this workflow → “Run workflow” (enter tag).
#
# Troubleshooting: “Re-run failed jobs” on an OLD run replays the *original* workflow
# YAML (e.g. still runs `npm run pack:win` without --publish never). After changing this
# file on default branch, start a *new* run via “Run workflow”, not Re-run on a pre-fix run.
#
# Important: pushing only a git tag does NOT run this — you must create/publish a
# Release on github.com (Releases → Draft/new release → Publish). The workflow
# definition must exist on the repo DEFAULT branch (GitHub runs it from there).
#
# Steps: Windows (NSIS+portable) + Linux (AppImage) in parallel, launcher from DEFAULT BRANCH
# overlay on tag checkout → merge with GitHub release assets → upload all to Gitea.
#
# Secrets: GITEA_BASE_URL, GITEA_TOKEN, GITEA_OWNER, GITEA_REPO
# Optional variable: GITEA_TARGET_REF (see tools/fractured-launcher-electron/README.md)
#
# Job guard: edit `if:` if github.repository is not Dawnforger/Fractured.
name: Sync release to Gitea

on:
  release:
    types: [published, released]
  workflow_dispatch:
    inputs:
      tag:
        description: 'Git tag only (e.g. v0.7.11-paragon-foo). NOT the release title — open the release and copy the tag next to the title.'
        required: true
        type: string

permissions:
  contents: read

concurrency:
  group: gitea-release-sync-${{ github.repository }}-${{ github.event_name == 'workflow_dispatch' && github.event.inputs.tag || github.event.release.tag_name }}
  cancel-in-progress: false

jobs:
  meta:
    runs-on: ubuntu-latest
    if: github.repository == 'Dawnforger/Fractured'
    outputs:
      tag: ${{ steps.t.outputs.tag }}
    steps:
      - name: Resolve tag
        id: t
        shell: bash
        run: |
          set -euo pipefail
          if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
            RAW="${{ github.event.inputs.tag }}"
          else
            RAW="${{ github.event.release.tag_name }}"
          fi
          TAG="$(printf '%s' "$RAW" | sed 's/^[[:space:]]*//;s/[[:space:]]*$//')"
          if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
            if [ -z "$TAG" ]; then
              echo '::error::Tag input is empty. Paste the git tag (e.g. v0.7.11-…).'
              exit 1
            fi
            if printf '%s' "$TAG" | grep -q '[[:space:]]'; then
              echo '::error::Tag contains whitespace — that is usually the **release title**, not the tag. On GitHub → Releases → open the release → copy the **tag** (short ref like v0.7.11-…), not the long title line.'
              exit 1
            fi
          fi
          {
            echo "tag<<__TAG_EOF__"
            echo "$TAG"
            echo "__TAG_EOF__"
          } >> "$GITHUB_OUTPUT"

  build-electron:
    needs: meta
    if: github.repository == 'Dawnforger/Fractured'
    runs-on: windows-latest
    timeout-minutes: 45
    steps:
      - uses: actions/checkout@v4
        with:
          ref: ${{ needs.meta.outputs.tag }}
      # Release tags often point at server/game commits that predate launcher lib fixes.
      # Always pack the launcher from default branch so app.asar includes the full tree.
      - name: Overlay launcher from default branch
        shell: bash
        run: |
          set -euo pipefail
          DB="${{ github.event.repository.default_branch }}"
          git fetch --no-tags --depth=1 origin "+refs/heads/${DB}:refs/remotes/origin/${DB}"
          git checkout "origin/${DB}" -- tools/fractured-launcher-electron
      - uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: npm
          cache-dependency-path: tools/fractured-launcher-electron/package-lock.json
      - name: Install and pack (NSIS + portable)
        working-directory: tools/fractured-launcher-electron
        run: |
          npm ci
          npm run pack:win
      - name: Stage launcher files for upload
        shell: pwsh
        run: |
          New-Item -ItemType Directory -Force -Path launcher-publish | Out-Null
          Copy-Item tools/fractured-launcher-electron/dist/*.exe launcher-publish/
          if (Test-Path tools/fractured-launcher-electron/dist/latest.yml) {
            Copy-Item tools/fractured-launcher-electron/dist/latest.yml launcher-publish/
          }
          Get-ChildItem tools/fractured-launcher-electron/dist/*.blockmap -ErrorAction SilentlyContinue |
            Copy-Item -Destination launcher-publish/
      - uses: actions/upload-artifact@v4
        with:
          name: electron-dist-windows
          path: launcher-publish/

  build-electron-linux:
    needs: meta
    if: github.repository == 'Dawnforger/Fractured'
    runs-on: ubuntu-latest
    timeout-minutes: 45
    steps:
      - uses: actions/checkout@v4
        with:
          ref: ${{ needs.meta.outputs.tag }}
      - name: Overlay launcher from default branch
        shell: bash
        run: |
          set -euo pipefail
          DB="${{ github.event.repository.default_branch }}"
          git fetch --no-tags --depth=1 origin "+refs/heads/${DB}:refs/remotes/origin/${DB}"
          git checkout "origin/${DB}" -- tools/fractured-launcher-electron
      - uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: npm
          cache-dependency-path: tools/fractured-launcher-electron/package-lock.json
      - name: Install and pack (AppImage)
        working-directory: tools/fractured-launcher-electron
        run: |
          npm ci
          npm run pack:linux
      - name: Stage Linux launcher for upload
        shell: bash
        run: |
          set -euo pipefail
          mkdir -p launcher-linux-publish
          shopt -s nullglob
          cp -f tools/fractured-launcher-electron/dist/*.AppImage launcher-linux-publish/ 2>/dev/null || true
          cp -f tools/fractured-launcher-electron/dist/*.yml launcher-linux-publish/ 2>/dev/null || true
          cp -f tools/fractured-launcher-electron/dist/*.blockmap launcher-linux-publish/ 2>/dev/null || true
          ls -la launcher-linux-publish/
          if ! compgen -G "launcher-linux-publish/*.AppImage" > /dev/null; then
            echo "No AppImage under dist/ — electron-builder linux target failed" >&2
            exit 1
          fi
      - uses: actions/upload-artifact@v4
        with:
          name: electron-dist-linux
          path: launcher-linux-publish/

  sync-gitea:
    needs: [meta, build-electron, build-electron-linux]
    if: github.repository == 'Dawnforger/Fractured'
    runs-on: ubuntu-latest
    env:
      GITEA_BASE_URL: ${{ secrets.GITEA_BASE_URL }}
      GITEA_TOKEN: ${{ secrets.GITEA_TOKEN }}
      GITEA_OWNER: ${{ secrets.GITEA_OWNER }}
      GITEA_REPO: ${{ secrets.GITEA_REPO }}
      GITEA_TARGET_REF: ${{ vars.GITEA_TARGET_REF }}
    steps:
      - uses: actions/checkout@v4
        with:
          # Script may not exist on older release tags; always use default branch.
          ref: ${{ github.event.repository.default_branch }}
          sparse-checkout: |
            tools/fractured-launcher-electron/scripts
          sparse-checkout-cone-mode: true
      - uses: actions/download-artifact@v4
        with:
          name: electron-dist-windows
          path: /tmp/electron-win
      - uses: actions/download-artifact@v4
        with:
          name: electron-dist-linux
          path: /tmp/electron-linux
      - name: Merge GitHub release assets + Electron build
        env:
          GH_TOKEN: ${{ github.token }}
        run: |
          set -euo pipefail
          TAG="${{ needs.meta.outputs.tag }}"
          mkdir -p combined
          mkdir -p /tmp/from-main
          if gh release download "$TAG" -R "${{ github.repository }}" -D /tmp/from-main 2>/tmp/dl.err; then
            shopt -s nullglob
            for f in /tmp/from-main/*; do
              if [ -f "$f" ]; then
                cp -f "$f" combined/
              fi
            done
            echo "Merged assets from ${{ github.repository }} release $TAG"
          else
            echo "GitHub release download note (continuing with launcher only):"
            cat /tmp/dl.err || true
          fi
          shopt -s nullglob
          for f in /tmp/electron-win/* /tmp/electron-linux/*; do
            if [ -f "$f" ]; then
              cp -f "$f" combined/
            fi
          done
          ls -la combined/
      - name: Upload to Gitea
        run: |
          set -euo pipefail
          for v in GITEA_BASE_URL GITEA_TOKEN GITEA_OWNER GITEA_REPO; do
            if [ -z "${!v:-}" ]; then
              echo "Missing secret $v — add it under repo Settings → Secrets and variables → Actions." >&2
              exit 1
            fi
          done
          bash tools/fractured-launcher-electron/scripts/upload-release-to-gitea.sh combined "${{ needs.meta.outputs.tag }}"
```
@@ -17,17 +17,53 @@ prerequisites; everything here is just the deltas you need on top of it.
## Fractured client + network defaults
Stock-friendly defaults for fresh local installs. A `git clone` ->
`docker compose up` (or native install) lets a single developer log in
from the same machine without any post-install config tweaks.
- **`authserver.conf` -> `RealmServerPort`** = **3724** (stock WoW). A
patched `Wow.exe` with `set realmlist 127.0.0.1` (no port) reaches
the auth handshake.
- **`realmlist` table -> `port`** is the **world** port (default
**8085**, matches `WorldServerPort` in `worldserver.conf.dist`).
Auth tells the client to handshake to this port for the world hand-off.
- **`realmlist` table -> `address`** defaults to **`127.0.0.1`** in the
base SQL. The auth server hands this address to clients after login,
so 127.0.0.1 means "talk to the world server on the same machine
auth is running on" -- correct for solo dev. **Override on production
deploys**, see *Production deployment overrides* below.
### Production deployment overrides
Production Fractured runs on a remote VPS at `hsrwow.net` with auth
bound to a non-stock port (47497 -- 3724 was unavailable on that host).
Apply the overrides **once per fresh dbimport** on the production box.
```sql
-- Run against acore_auth on the production database after first dbimport:
UPDATE realmlist
   SET address = 'hsrwow.net',
       port = 8085  -- world port; leave at 8085 unless changed
 WHERE id = 1;
```
Edit the production `authserver.conf` (NOT `authserver.conf.dist`)
to bind the auth listener to the production port:
```ini
RealmServerPort = 47497
```
Restart the auth server. Production clients connect with:
```text
set realmlist hsrwow.net:47497
```
(This requires a patched 3.3.5 client that understands `host:port`; for stock clients, use port forwarding from **3724** instead.)
The Fractured-patched 3.3.5 client supports the `host:port` syntax;
stock 3.3.5 clients do not, so any contributor distributing the
client bundle for production must include the patched `Wow.exe` from
the GitHub release.
---
@@ -7,22 +7,35 @@ re-downloaded without bloating `git clone`.
This file is the table of contents and install guide.
**Launcher (Windows):** The maintained client launcher lives in
[`tools/fractured-launcher-electron/`](../../tools/fractured-launcher-electron/)
(see its README for build and config). **Public downloads** for the launcher
and mirrored patch assets are pushed to
[Fractured-Distro releases](https://github.com/Dawnforger/Fractured-Distro/releases)
when a release is published here (workflow **Sync release to Fractured-Distro**).
---
## What ships in a release
| Artifact | Size | Purpose |
|---|---|---|
| `patch-enUS-4.MPQ` | ~5 MB | DBC + GlueXML bake. Adds `CLASS_PARAGON` (id 12), the character-create slot, glue strings, game-table DBCs, and a patched `Spell.dbc`: **(1)** `RuneCostID` zeroed on every rune-cost spell so nonDeath Knight clients still send DK casts (rune costs are shown via `RuneFrame.lua`); **(2)** `Reagent[]` / `ReagentCount[]` zeroed on every spell whose `SpellFamilyName` is non-zero (all class abilities), while profession crafts (`SpellFamilyName == 0`) keep their materials. Both edits mirror server load-time corrections so client preflight and server validation stay aligned. Required for character creation as Paragon to even show up. |
| `patch-enUS-5.MPQ` | ~57 KB | FrameXML overrides. Replaces stock `PlayerFrame.lua` / `RuneFrame.lua` / `ComboFrame.lua` / `UnitFrame.lua` / `SpellBookFrame.lua` + `SpellBookFrame.xml` with Paragon-aware versions: rune simulator, combo-point simulator, server-authoritative resource sync over the `PARAA` addon channel, action-button usability + click guards, an expanded spellbook (higher `MAX_SPELLS`, 24 skill-line tabs instead of stock 8) so all-class spells render, Paragon stat tooltips on the character sheet (including filtering duplicate “attack power from strength” lines so the paper doll matches server AP), a tooltip post-processor that appends ", Paragon" to the "Classes:" line on class-restricted gear / glyphs (the server bypasses `AllowableClass` for class 12, but the engine paints the line red and omits Paragon — the Lua hook recolors it green and adds the name so the player can tell it's wearable), and **PetFrame** re-anchored so the **pet unit frame sits below the rune row** for Paragon (stock layout had runes overlapping the pet portrait). The paper-doll **ammo slot** follows stock visibility rules (shown for hunters / ranged weapons; hidden when `UnitHasRelicSlot` applies). |
| `patch-enUS-6.MPQ` | ~134 KB | The `ParagonAdvancement` addon. Replaces the talent pane (`N` key) for Paragon characters with the Character Advancement panel: per-class spell tabs, talent grid, Overview/Search tabs, AE/TE currency, commit / reset / preview, login-time toast suppression, a **PETS** tab with live hunter pet talent trees (preview learn, no TE/AE cost), a dedicated **Reset Pet Talents** control (server `PARAA` `C RESET PET TALENTS` — instant, no gold, no confirmation; requires matching worldserver), bottom-row **Reset all Abilities / Reset Build / Reset all Talents** disabled while on the PETS tab so those paths cannot dismiss the pet or unlearn Tame Beast, and a **Builds** page (full-pane overlay opened from the bottom-row Builds button) for saving named, icon-tagged loadouts: New Build (+) icon picker reuses `MACRO_ICON_FILENAMES`, right-click for edit/delete, shift-left-click to favorite (favorites bubble to the top), left-click pops a Load Build confirm. Build swaps reset + refund AE/TE, re-spend on the saved recipe, and **park hunter pets** to `PET_SAVE_NOT_IN_SLOT` so their name/talents/exp are preserved across swaps. |
| `Wow.exe` | ~7.5 MB | 3.3.5a (build 12340) client byte-patched to skip the MPQ signature check so custom `patch-enUS-N.MPQ` files load. Diff against stock is a few bytes; everything else is unchanged. |
Server and client work as a pair: the addon talks to `mod-paragon` on the
worldserver via `WHISPER` addon-channel messages with the `PARAA` prefix
(currency push, spell/talent snapshot, commit, combo points, rune
cooldowns, learn-toast silence window, **`C RESET PET TALENTS`**
for hunter pet talent resets from the Character Advancement PETS tab,
and the **build catalog** verbs `Q BUILDS` / `C BUILD NEW` / `C BUILD
EDIT` / `C BUILD DELETE` / `C BUILD FAVORITE` / `C BUILD LOAD` for the
saved-loadout system on the Builds page). Build swaps require the
matching worldserver image because the swap path is server-driven
(snapshot → reset → re-spend → pet park/unpark). Mismatched versions
usually manifest as the panel rendering blank or AE/TE reading 0/0.
---
@@ -52,6 +65,171 @@ worldserver image is older than commit `4d2a80d` (the
`character_paragon_panel_spell_revoked` migration). Pull both ends to
the same release tag and rebuild the worldserver image.
If the **client** shows the Paragon class on the create screen but the
server replies **Character Creation Failed** (sometimes shown as
"Error creating character") when you pick it -- **or** the character
is created but spawns with no weapon / armor proficiencies (auto-attack
greys out, can't equip anything beyond a fist), or with the proficiency
**skills** but no **passive spells** like Block, Parry, Dual Wield --
the worldserver is missing one of four pieces of class-12 data. All
ship as SQL migrations under
`modules/mod-paragon/data/sql/db-world/updates/` and are auto-applied
by AzerothCore's DBUpdater on every `ac-db-import` run, but the SQL
files are baked into the dbimport Docker image at build time -- so a
stale image won't pick up new migrations. Fix:
```bash
git pull origin main
docker compose build ac-db-import ac-worldserver
docker compose up -d ac-db-import
docker compose restart ac-worldserver
```
Existing class-12 characters created before these migrations will
keep their broken state -- the cascade only fires inside
`Player::Create` and `Player::LearnDefaultSkill` at character spawn.
Delete the old Paragon and re-roll after the rebuild.
The four migrations:
- `2026_05_09_00.sql` -- DBC overlay rows for `chrclasses_dbc` and
`skillraceclassinfo_dbc`. Without this the server can't even
resolve class 12 in `sChrClassesStore`. See **Server-side Paragon
DBC overlay** below.
- `2026_05_10_00.sql` -- `playercreateinfo`, `playercreateinfo_action`,
and `player_class_stats` rows for class 12. Without this
`Player::Create` rejects every (race, class=12) pair as an
"invalid race/class pair" and the worldserver prints
`class-N Level-L does not have stats data!` integrity warnings on
load.
- `2026_05_10_01.sql` -- 20 `playercreateinfo_skills` rows
(`classMask = 2048` = class 12) granting every weapon /
armor proficiency at level 1. Without this a Paragon spawns with
only the universal `classMask = 0` skills (Defense, Unarmed,
Cloth, languages, Mounts) -- no Swords, no Mail, no Shield, etc.
- `2026_05_10_02.sql` -- 3,314 `skilllineability_dbc` rows opening
the class-12 bit on every SkillLineAbility row our patched
`SkillLineAbility.dbc` modified. AC reads these rows in
`Player::LearnDefaultSkill` to drive the `skill -> passive spell`
cascade. Without it the proficiency *skills* from `_01.sql` exist
but the *passive spells* (Block, Parry, Dual Wield, Defense,
weapon Shoot, racial Mace/Sword Specialization, etc.) never auto-
learn, so the spellbook past the racials looks empty.
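The `classMask` values above are plain bitmasks: class N occupies bit N-1, so class 12 lands on bit 11 and `2048` (`0x800`) is the Paragon bit. A quick illustrative check (Python sketch, not part of the tooling):

```python
# Class N occupies bit N-1 in classMask columns (playercreateinfo_skills,
# SkillRaceClassInfo.ClassMask, etc.). Class 12 (Paragon) is therefore
# 1 << 11 = 2048 (0x800); 2048 | 1 = 2049 reads as "Warrior + Paragon".
PARAGON_CLASS = 12
PARAGON_BIT = 1 << (PARAGON_CLASS - 1)  # 2048, i.e. 0x800

def includes_paragon(class_mask: int) -> bool:
    """True when the class-12 bit is set in a classMask value."""
    return bool(class_mask & PARAGON_BIT)

print(PARAGON_BIT)             # 2048
print(includes_paragon(2049))  # True  (Warrior | Paragon)
print(includes_paragon(1))     # False (Warrior only)
```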
After the rebuild + restart, `ac-worldserver` should log
`>> Loaded 72 Player Create Definitions` (was 62 pre-Paragon),
`>> Loaded 1391 Player Create Skills` (was 1371),
`>> Loaded 10219 SkillLineAbility MultiMap Data` (unchanged total --
the SQL overlay replaces existing rows by ID, doesn't add new ones),
and character creation succeeds for any DK-eligible race with a full
weapon / armor kit and the matching passive spells.
If the client **logs in** successfully but **disconnects immediately**
when entering the realm: the auth server is handing your client the
wrong world-server address. On a fresh local install the seed defaults
to `127.0.0.1` (since the commit that landed this paragraph). If your
DB was
imported from an older Fractured checkout, the seed may still point at
`hsrwow.net`, which sends the client to our production world server
instead of yours. Fix:
```bash
# Docker:
docker exec ac-database mysql -uroot -ppassword \
-e "UPDATE acore_auth.realmlist SET address='127.0.0.1' WHERE id=1;"
docker compose restart ac-authserver
```
Substitute your public hostname/IP for `127.0.0.1` if remote players
will be connecting. See `BUILD-NATIVE.md` -> *Production deployment
overrides* for the full list of values to set on a production box.
---
## Server-side Paragon DBC overlay (automatic)
The Fractured **client** learns about Paragon from `patch-enUS-4.MPQ`
(DBC + GlueXML). The **worldserver** never reads your MPQs — it reads
plain `.dbc` files under its `DataDir` (`.../data/dbc/` by default).
Stock Docker installs populate `data/dbc/` from a vanilla 3.3.5a
extract (`ac-client-data-init` in `docker-compose.yml`). That tree has
no `ChrClasses` row for id **12** and no class-12 bit on
`SkillRaceClassInfo` rows, which would normally trigger:
`Class (12) not found in DBC while creating new char ... wrong DBC files or cheater?`
…and reject the create with `CHAR_CREATE_FAILED`.
To remove that gap, the repo ships
`modules/mod-paragon/data/sql/db-world/updates/2026_05_09_00.sql`,
which `INSERT`s the Paragon class-12 deltas into:
- `chrclasses_dbc` — 1 row defining class 12 ("Paragon", power=Mana,
family=Warrior, expansion=2).
- `skillraceclassinfo_dbc` — 235 rows replacing stock entries with the
patched ClassMask (class-12 bit OR'd in) so every baseline skill is
available to Paragon characters.
AzerothCore's DBC loader (`DBCStores.cpp::LoadDBC` -> `LoadFromDB`)
merges these rows on top of whatever `data/dbc/` contains at every
worldserver boot. The DBUpdater in `ac-db-import` (Docker) or the
worldserver itself (native) applies the migration automatically — so
the **only** steps a fresh contributor needs are `git clone` and
`docker compose up -d`.
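Conceptually the merge is just "world-DB rows win by ID". A schematic sketch of that idea (illustration only, not AzerothCore's actual `LoadFromDB` code):

```python
# Schematic of the DBC-overlay merge: rows loaded from the world DB's
# <table>_dbc tables replace (or add to) the rows parsed from the
# on-disk .dbc file, keyed by ID. This illustrates the concept; the
# real loader lives in AzerothCore's DBCStores.cpp.
def merge_dbc(disk_rows: dict[int, tuple], db_rows: dict[int, tuple]) -> dict[int, tuple]:
    merged = dict(disk_rows)  # start from the on-disk extract
    merged.update(db_rows)    # SQL overlay rows win by ID
    return merged

# Stock ChrClasses.dbc knows classes 1..11; the overlay contributes 12.
disk = {1: ("Warrior",), 2: ("Paladin",)}
overlay = {12: ("Paragon",)}
print(sorted(merge_dbc(disk, overlay)))  # [1, 2, 12]
```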
### Regenerating the migration
The SQL is auto-generated from the patched DBCs that already live
inside `patch-enUS-4.MPQ`. The bake script lives outside this repo
(per the repo-tidy policy) at:
`fractured-tooling/from-workspace-root/_gen_paragon_dbc_overlay_sql.py`
Re-run it whenever you change the Paragon DBC bake — for example,
adding a new race to the Paragon class mask. It diffs the patched
DBCs against a stock 3.3.5a DBC extract and emits a fresh
`2026_05_09_00.sql` (or successor migration with a new timestamp if
deltas change). Workflow:
```powershell
# Extract the patched DBCs once:
.\tools\mpq\mpqcli.exe extract `
"ChromieCraft_3.3.5a\Data\enUS\patch-enUS-4.MPQ" `
-o "$env:TEMP\paragon-dbc-extract"
# Regenerate the SQL migration:
python fractured-tooling\from-workspace-root\_gen_paragon_dbc_overlay_sql.py
```
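The generator's core loop amounts to: parse both DBC extracts into ID-keyed rows, keep only the rows the patch changed or added, and print those as INSERT values. A minimal sketch of that diff step (hypothetical structure; the real `_gen_paragon_dbc_overlay_sql.py` lives outside this repo):

```python
# Minimal sketch of a DBC-delta pass: compare patched rows against a
# stock extract, both keyed by record ID, and keep only rows the patch
# changed or added -- those become the SQL overlay. Hypothetical
# structure, not the real bake script.
def delta_rows(stock: dict[int, tuple], patched: dict[int, tuple]) -> dict[int, tuple]:
    return {rid: row for rid, row in patched.items()
            if stock.get(rid) != row}

stock   = {57: (6, -1, 128),  58: (56, -1, 16)}
patched = {57: (6, -1, 2176), 58: (56, -1, 16), 99: (7, -1, 2048)}
changed = delta_rows(stock, patched)
print(sorted(changed))  # [57, 99] -- row 58 is untouched, so no SQL for it
```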
If the regenerated SQL has new content, commit it as a **new** dated
migration filename (e.g. `2026_06_01_00.sql`) — never edit a file that
has already been applied to live databases: AC's DBUpdater detects the
hash change and re-runs the SQL, which can be harmless in a pinch but
is best reserved for emergencies.
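The "never edit an applied migration" rule follows from how per-file change tracking generally works: the updater stores a content hash for each applied file and re-runs anything whose hash no longer matches. A schematic sketch (SHA-1 shown for illustration; check DBUpdater's source for the exact bookkeeping):

```python
import hashlib

# Schematic of hash-based migration tracking: a file is re-run whenever
# its stored content hash no longer matches the file on disk.
# Illustrative only -- consult AzerothCore's DBUpdater for the real scheme.
def file_hash(contents: bytes) -> str:
    return hashlib.sha1(contents).hexdigest()

# Hash recorded when the migration was first applied.
applied = {"2026_05_09_00.sql": file_hash(b"INSERT ... v1")}

def needs_rerun(name: str, contents: bytes) -> bool:
    return applied.get(name) != file_hash(contents)

print(needs_rerun("2026_05_09_00.sql", b"INSERT ... v1"))      # False
print(needs_rerun("2026_05_09_00.sql", b"INSERT ... edited"))  # True
```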
### Manual DBC overlay (rare, fallback)
If you ever need the patched DBCs *on disk* — e.g. for a tool that
reads `data/dbc/` directly outside the worldserver, or to verify a
client-vs-server DBC mismatch — extract `patch-enUS-4.MPQ` and copy
its `DBFilesClient/*.dbc` into `data/dbc/`:
**Docker:**
```powershell
docker run --rm `
-v ac-client-data:/data `
-v ${PWD}\paragon-dbc-extract:/patch:ro `
alpine sh -c "cp -f /patch/*.dbc /data/dbc/"
docker compose restart ac-worldserver
```
**Native:** copy into `<CMAKE_INSTALL_PREFIX>/data/dbc/` and restart.
This is **not required** for normal operation — the SQL migration
covers everything `mod-paragon` needs at runtime. Use the manual
overlay only when you're consciously bypassing the SQL merge layer.
---
## Building the patches yourself
@@ -68,7 +246,12 @@ tools\build_paragon_advancement_patch.ps1 -Deploy # -> patch-enUS-6.MPQ
`patch-enUS-4.MPQ` is the DBC + GlueXML bake; the bake scripts live with
the rest of the dev tooling and are not part of this repo by design
(see the repo-tidy policy in `README.txt` next to this file). Typical
order on a maintainer machine:
1. `fractured-tooling/from-workspace-root/_patch_spell_dbc_runes.py` — stage `Spell.dbc` with `RuneCostID` cleared.
2. `fractured-tooling/from-workspace-root/_patch_spell_dbc_reagents.py` — same staged `Spell.dbc`, clear class-spell reagents for client preflight.
3. `fractured-tooling/from-workspace-root/_make_paragon_dbc_patch.py` — rebuild `ChrClasses` / `CharBaseInfo` / game tables, then pack `patch-enUS-4.MPQ`.
The patched `Wow.exe` is a one-time hex-edit of the stock 3.3.5a
client. The diff is publicly documented in the WoW emulation community
@@ -42,15 +42,26 @@ CREATE TABLE `realmlist` (
--
-- Dumping data for table `realmlist`
--
-- Defaults are tuned for fresh local installs: `address` is what the auth
-- server hands clients after login as the WORLD server endpoint. Stock
-- 127.0.0.1 means "the same box auth is running on", so a fresh
-- `git clone` -> `docker compose up` works without any post-install
-- tweaks for a developer hosting on their own machine.
--
-- Production deployments must override `address` after first dbimport,
-- e.g.:
-- UPDATE realmlist SET address = 'your.public.host', port = 8085 WHERE id = 1;
-- See contrib/fractured-dev-extras/BUILD-NATIVE.md for the full deploy
-- checklist (auth/world ports, firewall, public hostnames).
--
-- `port` is the WORLD server port (must match WorldServerPort in
-- worldserver.conf). The auth-server LISTEN port is separately configured
-- via RealmServerPort in authserver.conf (stock default 3724).
LOCK TABLES `realmlist` WRITE;
/*!40000 ALTER TABLE `realmlist` DISABLE KEYS */;
INSERT INTO `realmlist` VALUES
(1,'Fractured WoW','127.0.0.1','127.0.0.1','255.255.255.0',8085,0,0,1,0,0,12340);
/*!40000 ALTER TABLE `realmlist` ENABLE KEYS */;
UNLOCK TABLES;
@@ -24,7 +24,7 @@ CREATE TABLE `world_state` (
`Id` int unsigned NOT NULL COMMENT 'Internal save ID',
`Data` longtext,
PRIMARY KEY (`Id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci COMMENT='WorldState save system';
/*!40101 SET character_set_client = @saved_cs_client */;
--
@@ -27,7 +27,7 @@ CREATE TABLE `player_shapeshift_model` (
`GenderID` tinyint unsigned NOT NULL,
`ModelID` int unsigned NOT NULL,
PRIMARY KEY (`ShapeshiftID`,`RaceID`,`CustomizationID`,`GenderID`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci PACK_KEYS=0;
/*!40101 SET character_set_client = @saved_cs_client */;
--
@@ -25,7 +25,7 @@ CREATE TABLE `player_totem_model` (
`RaceID` tinyint unsigned NOT NULL,
`ModelID` int unsigned NOT NULL,
PRIMARY KEY (`TotemID`,`RaceID`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci PACK_KEYS=0;
/*!40101 SET character_set_client = @saved_cs_client */;
--
@@ -12,6 +12,11 @@ Paragon.StickyComboPoints = 1
# in addition to runes/runic power. Required for the patch-enUS-5.MPQ player
# frame to populate Mana/Rage/Energy bars - otherwise the server treats those
# powers as inactive and never sends max values, leaving the bars empty.
# Also required for core rage generation: Unit::DealDamage only calls
# RewardRage() when the attacker HasActivePowerType(POWER_RAGE); if this is off,
# Paragon white swings never grant rage (users without this line in any loaded
# config used to hit the C++ fallback default of false). Default is on; set 0
# only if you intentionally want a stripped-down Paragon test build.
Paragon.MultiResource.HasActivePowers = 1
# Ability / Talent Essence (AE/TE) — Ascension-inspired currency
@@ -0,0 +1,62 @@
-- mod-paragon Character Advancement: Build catalog (saved loadouts).
-- ----------------------------------------------------------------------------
-- A "build" is a named, icon-tagged loadout of panel-purchased spells and
-- talent ranks. Each Paragon character can save many builds and swap
-- between them via the Builds page in the Character Advancement panel.
--
-- Swap workflow (see HandleBuildLoad in Paragon_Builds.cpp):
-- 1. If a build is currently active, snapshot the player's current
-- panel-purchased spells + per-spec talent ranks into that build's
-- recipe rows (overwriting the stored recipe).
-- 2. If the active build's hunter pet is currently summoned, unsummon
-- it to PET_SAVE_NOT_IN_SLOT and store its `pet_number` on the
-- active build row so it can be restored on swap-back.
-- 3. Reset all panel-bought abilities and talents (refunding AE/TE).
-- 4. Re-buy each spell + talent in the target build's recipe (charging
-- AE/TE; aborts if insufficient AE/TE -- player keeps refunded
-- currency in that case and active becomes NULL).
-- 5. Move the target build's parked pet (if any) back to current.
-- 6. Update active_build pointer.
--
-- Pet ownership: a parked pet sits in `character_pet` with slot=100
-- (PET_SAVE_NOT_IN_SLOT), exactly like the engine's stable-master
-- offload, but tied to the build via `pet_number` instead of any
-- in-game stable slot. Build deletion drops the parked pet rows
-- entirely (PET_SAVE_AS_DELETED equivalent) -- player is warned.
-- ----------------------------------------------------------------------------
CREATE TABLE IF NOT EXISTS `character_paragon_builds` (
`build_id` INT UNSIGNED NOT NULL AUTO_INCREMENT,
`guid` INT UNSIGNED NOT NULL COMMENT 'characters.guid',
`name` VARCHAR(32) NOT NULL,
`icon` VARCHAR(64) NOT NULL DEFAULT 'INV_Misc_QuestionMark',
`is_favorite` TINYINT UNSIGNED NOT NULL DEFAULT 0,
`pet_number` INT UNSIGNED NULL COMMENT 'character_pet.id of parked hunter pet, NULL when no pet bound to this build',
`created_at` TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
PRIMARY KEY (`build_id`),
KEY `idx_guid` (`guid`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci
COMMENT='mod-paragon: saved Character Advancement build catalog';
CREATE TABLE IF NOT EXISTS `character_paragon_build_spells` (
`build_id` INT UNSIGNED NOT NULL,
`spell_id` INT UNSIGNED NOT NULL,
PRIMARY KEY (`build_id`, `spell_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci
COMMENT='mod-paragon: per-build recipe -- panel-purchased spells';
CREATE TABLE IF NOT EXISTS `character_paragon_build_talents` (
`build_id` INT UNSIGNED NOT NULL,
`spec` TINYINT UNSIGNED NOT NULL COMMENT '0 = primary spec, 1 = secondary (dual spec)',
`talent_id` SMALLINT UNSIGNED NOT NULL,
`rank` TINYINT UNSIGNED NOT NULL,
PRIMARY KEY (`build_id`, `spec`, `talent_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci
COMMENT='mod-paragon: per-build recipe -- panel-purchased talent ranks per spec';
CREATE TABLE IF NOT EXISTS `character_paragon_active_build` (
`guid` INT UNSIGNED NOT NULL COMMENT 'characters.guid',
`build_id` INT UNSIGNED NOT NULL COMMENT 'character_paragon_builds.build_id (per-character active pointer)',
PRIMARY KEY (`guid`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci
COMMENT='mod-paragon: pointer to whichever build is currently loaded (one row per Paragon character)';
@@ -0,0 +1,30 @@
-- mod-paragon Character Advancement: Builds catalog schema cleanup.
-- ----------------------------------------------------------------------------
-- Two changes:
-- 1. Drop `is_favorite` -- the favorite flag and shift-click-to-favorite
-- flow are removed. Builds are now ordered solely by build_id ASC.
-- 2. Add `share_code` CHAR(6) -- a random alphanumeric token generated
-- server-side at build creation that uniquely identifies a saved
-- build across the realm. Players exchange codes out-of-band and
-- use the BuildsPane "Load Build!" share box to import a copy of
-- the build (name + icon + spell + talent recipe) into their own
-- catalog. The copy gets a fresh share_code so re-sharing is
-- always traceable to the latest owner; the original isn't touched.
--
-- The column is NULL-tolerant so any rows that pre-date this migration
-- (created under 2026_05_10_03's schema) coexist cleanly. The server
-- backfills NULLs lazily in PushBuildCatalog -- the next time a player
-- opens the BuildsPane on a Paragon character, any of their builds that
-- still have a NULL share_code will get one generated and persisted.
--
-- Charset: 31 unambiguous chars (A-Z minus I/O minus 0/1) gives 31^6 ~=
-- 887M codes; collision retry on insert keeps the probability of a
-- duplicate vanishingly small for any realistic catalog size.
-- ----------------------------------------------------------------------------
ALTER TABLE `character_paragon_builds`
DROP COLUMN `is_favorite`,
ADD COLUMN `share_code` CHAR(6) NULL DEFAULT NULL
COMMENT 'random alphanumeric token for import-by-code; lazily generated'
AFTER `icon`,
ADD UNIQUE INDEX `uk_share_code` (`share_code`);
@@ -0,0 +1,34 @@
-- mod-paragon: preserve superseded share codes as importable snapshots.
-- ----------------------------------------------------------------------------
-- When an active build is updated (Learn All), the live row gets a new
-- share_code and a fresh recipe. Older codes the player posted to Discord
-- must keep working: each retired code is frozen here with its spell/talent
-- recipe so `C BUILD IMPORT <code>` still materializes that exact loadout.
-- ----------------------------------------------------------------------------
CREATE TABLE IF NOT EXISTS `character_paragon_build_share_archive` (
`share_code` CHAR(6) NOT NULL COMMENT 'retired code (same charset as live builds)',
`name` VARCHAR(32) NOT NULL,
`icon` VARCHAR(64) NOT NULL DEFAULT 'INV_Misc_QuestionMark',
`archived_at` TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
PRIMARY KEY (`share_code`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci
COMMENT='mod-paragon: frozen build metadata for retired share codes';
CREATE TABLE IF NOT EXISTS `character_paragon_build_share_archive_spells` (
`share_code` CHAR(6) NOT NULL,
`spell_id` INT UNSIGNED NOT NULL,
PRIMARY KEY (`share_code`, `spell_id`),
KEY `idx_share` (`share_code`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci
COMMENT='mod-paragon: spell recipe rows for an archived share code';
CREATE TABLE IF NOT EXISTS `character_paragon_build_share_archive_talents` (
`share_code` CHAR(6) NOT NULL,
`spec` TINYINT UNSIGNED NOT NULL,
`talent_id` SMALLINT UNSIGNED NOT NULL,
`rank` TINYINT UNSIGNED NOT NULL,
PRIMARY KEY (`share_code`, `spec`, `talent_id`),
KEY `idx_share` (`share_code`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci
COMMENT='mod-paragon: talent recipe rows for an archived share code';
@@ -21,6 +21,7 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(10, 1),
(17, 1),
(53, 1),
(66, 1),
(72, 1),
(75, 1),
(78, 1),
@@ -30,6 +31,7 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(118, 1),
(120, 1),
(122, 1),
(126, 1),
(130, 1),
(131, 1),
(132, 1),
@@ -52,6 +54,7 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(469, 1),
(475, 1),
(498, 1),
(526, 1),
(527, 1),
(528, 1),
(543, 1),
@@ -73,22 +76,28 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(676, 1),
(686, 1),
(687, 1),
(688, 1),
(689, 1),
(691, 1),
(693, 1),
(694, 1),
(697, 1),
(698, 1),
(702, 1),
(703, 1),
(706, 1),
(710, 1),
(712, 1),
(740, 1),
(755, 1),
(759, 1),
(768, 1),
(770, 1),
(772, 1),
(774, 1),
(779, 1),
(781, 1),
(783, 1),
(845, 1),
(853, 1),
(871, 1),
@@ -104,6 +113,7 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(1038, 1),
(1044, 1),
(1064, 1),
(1066, 1),
(1079, 1),
(1082, 1),
(1098, 1),
@@ -176,6 +186,7 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(2812, 1),
(2825, 1),
(2893, 1),
(2894, 1),
(2908, 1),
(2912, 1),
(2944, 1),
@@ -194,6 +205,7 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(3565, 1),
(3566, 1),
(3567, 1),
(3714, 1),
(3738, 1),
(4987, 1),
(5116, 1),
@@ -205,7 +217,9 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(5185, 1),
(5209, 1),
(5211, 1),
(5215, 1);
INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(5217, 1),
(5221, 1),
(5225, 1),
@@ -215,11 +229,10 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(5308, 1),
(5384, 1),
(5484, 1),
(5487, 1),
(5500, 1),
(5502, 1),
(5504, 1),
(5675, 1),
(5676, 1),
(5697, 1),
@@ -244,6 +257,8 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(6770, 1),
(6785, 1),
(6789, 1),
(6795, 1),
(6807, 1),
(6940, 1),
(7294, 1),
(7302, 1),
@@ -283,6 +298,7 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(11418, 1),
(11419, 1),
(11420, 1),
(12051, 1),
(13159, 1),
(13161, 1),
(13163, 1),
@@ -297,6 +313,7 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(16857, 1),
(16914, 1),
(18499, 1),
(19263, 1),
(19740, 1),
(19742, 1),
(19746, 1),
@@ -323,7 +340,6 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(20252, 1),
(20484, 1),
(20736, 1),
(21084, 1),
(21562, 1),
(21849, 1),
(22568, 1),
@@ -331,6 +347,8 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(22812, 1),
(22842, 1),
(23028, 1),
(23161, 1),
(23214, 1),
(23920, 1),
(23922, 1),
(24275, 1),
@@ -349,6 +367,7 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(29722, 1),
(29858, 1),
(29893, 1),
(30449, 1),
(30451, 1),
(30455, 1),
(30482, 1),
@@ -372,12 +391,14 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(33745, 1),
(33763, 1),
(33786, 1),
(33943, 1),
(34026, 1),
(34074, 1),
(34428, 1),
(34433, 1),
(34477, 1),
(34600, 1),
(34767, 1),
(35715, 1),
(35717, 1),
(36936, 1),
@@ -398,7 +419,9 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(47541, 1),
(47568, 1),
(47897, 1),
(48018, 1);
INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(48020, 1),
(48045, 1),
(48263, 1),
@@ -416,14 +439,19 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(49576, 1),
(49998, 1),
(50464, 1),
(50769, 1),
(50842, 1),
(51505, 1),
(51514, 1),
(51722, 1),
(51723, 1),
(51730, 1),
(52127, 1),
(52610, 1),
(53140, 1),
(53142, 1),
(53271, 1),
(53351, 1),
(53407, 1),
(53408, 1),
(53600, 1),
@@ -432,15 +460,23 @@ INSERT INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(54428, 1),
(55342, 1),
(55694, 1),
(56222, 1),
(56641, 1),
(56815, 1),
(57330, 1),
(57755, 1),
(57934, 1),
(57994, 1),
(60192, 1),
(61846, 1),
(61999, 1),
(62078, 1),
(62124, 1),
(62757, 1),
(64382, 1),
(64843, 1),
(64901, 1),
(66842, 1),
(66843, 1),
(66844, 1);
@@ -0,0 +1,270 @@
-- mod-paragon: server-side DBC overlay for class 12 (Paragon).
-- Auto-generated by fractured-tooling/from-workspace-root/
-- _gen_paragon_dbc_overlay_sql.py
--
-- AzerothCore's DBCStores.cpp::LoadDBC merges every <table>_dbc
-- world-DB row on top of the on-disk DBC store at startup
-- (storage.LoadFromDB). We use that to ship Paragon's class-12
-- DBC deltas in SQL form so a stock data/dbc/ tree (e.g. the
-- vanilla `ac-wotlk-client-data` Docker image) still resolves
-- class 12 in sChrClassesStore and class-12 entries in
-- sSkillRaceClassInfoStore.
--
-- Without this migration, fresh installs hit:
-- CHAR_CREATE_FAILED -- "Class (12) not found in DBC ..."
-- the moment a contributor tries to roll a Paragon character.
--
-- This file is regenerated end-to-end from patch-enUS-4.MPQ;
-- do not hand-edit. Update the patched DBC source and rerun
-- the bake script.
-- chrclasses_dbc: classes added or modified by patch-enUS-4.MPQ.
-- AzerothCore merges this on top of the on-disk ChrClasses.dbc
-- so a stock data/dbc tree still gets class 12 at runtime.
DELETE FROM `chrclasses_dbc` WHERE `ID` IN (12);
INSERT INTO `chrclasses_dbc` (`ID`,`Field01`,`DisplayPower`,`PetNameToken`,`Name_Lang_enUS`,`Name_Lang_Mask`,`Name_Female_Lang_Mask`,`Name_Male_Lang_Mask`,`Filename`,`SpellClassSet`,`Flags`,`CinematicSequenceID`,`Required_Expansion`) VALUES
(12, 0, 0, 0, 'Paragon', 0, 0, 0, 'PARAGON', 4, 50, 0, 2);
-- skillraceclassinfo_dbc: rows where patch-enUS-4 OR'd the
-- class-12 bit (0x800) into ClassMask, opening every
-- baseline skill to Paragon. Replaces the stock row by ID so
-- AzerothCore picks the patched mask on the SQL merge pass.
DELETE FROM `skillraceclassinfo_dbc` WHERE `ID` IN (
57,301,107,82,75,140,328,638,872,880,881,885,886,910,117,335,628,629,630,912,126,127,133,134,635,31,39,135,325,636,637,643,644,888,889,914,125,626,884,898,901,58,60,916,59,40,41,68,48,49,44,45,42,43,50,51,131,132,883,913,105,71,70,69,925,54,25,138,139,91,882,85,84,93,88,865,87,441,94,443,92,481,89,442,123,124,624,625,702,908,6,922,33,243,899,241,122,621,622,701,907,970,129,323,631,632,633,634,641,642,142,143,639,640,28,63,282,29,284,65,97,244,940,72,128,878,879,137,144,136,915,55,79,81,76,149,112,111,106,66,26,83,74,73,108,109,110,113,38,35,36,37,61,62,64,24,34,21,906,46,47,52,53,281,104,102,101,27,95,98,96,30,145,146,147,148,151,155,158,159,271,175,178,183,186,270,189,191,193,198,200,265,266,203,204,205,268,269,246,272,330,381,403,445,446,461,501,463,464,521,522,541,544,581,601,741,742,781,841,861,862,866,867,877,934,892,896,897,951,895,900,936,938,939,947
);
INSERT INTO `skillraceclassinfo_dbc` (`ID`,`SkillID`,`RaceMask`,`ClassMask`,`Flags`,`MinLevel`,`SkillTierID`,`SkillCostIndex`) VALUES
(57,6,-1,2176,1040,0,0,0),
(301,8,-1,2176,1040,0,0,0),
(107,26,-1,2049,1040,0,0,0),
(82,38,-1,2056,1040,0,0,0),
(75,39,-1,2056,1040,0,0,0),
(140,43,1115,2049,128,0,0,0),
(328,43,3071,2052,128,0,0,0),
(638,43,164,2049,128,0,0,0),
(872,43,32767,2056,128,0,0,0),
(880,43,1024,2052,128,0,0,0),
(881,43,32767,2432,128,0,0,0),
(885,43,1029,2050,128,0,0,0),
(886,43,512,2050,128,0,0,0),
(910,43,262143,2080,128,0,0,0),
(117,44,166,2052,128,0,0,0),
(335,44,2147483647,2122,128,0,0,0),
(628,44,1544,2052,128,0,0,0),
(629,44,167,2049,128,0,0,0),
(630,44,1112,2049,128,0,0,0),
(912,44,262143,2080,128,0,0,0),
(126,45,650,2052,128,0,0,0),
(127,45,32767,2061,128,0,0,0),
(133,46,36,2052,128,0,0,0),
(134,46,32767,2057,128,0,0,0),
(635,46,1674,2052,128,0,0,0),
(31,50,-1,2052,1040,0,0,0),
(39,51,-1,2052,1040,0,0,0),
(135,54,2147483647,2128,128,0,0,0),
(325,54,-1,2056,128,0,0,0),
(636,54,1133,2049,128,0,0,0),
(637,54,658,2049,128,0,0,0),
(643,54,8,3072,128,0,0,0),
(644,54,32,3072,128,0,0,0),
(888,54,261631,2050,128,0,0,0),
(889,54,512,2050,128,0,0,0),
(914,54,262143,2080,128,0,0,0),
(125,55,262143,2052,128,0,0,0),
(626,55,163839,2049,128,0,0,0),
(884,55,512,2050,128,0,0,0),
(898,55,262143,2080,128,0,0,0),
(901,55,261631,2050,128,0,0,0),
(58,56,-1,2064,1040,0,0,0),
(60,78,-1,2064,1040,0,0,0),
(916,95,524287,2080,640,0,0,0),
(59,96,2047,3072,1168,0,0,0),
(40,98,1101,3583,128,0,0,0),
(41,98,674,3551,160,0,21,0),
(68,101,4,3583,1170,0,0,0),
(48,109,690,3583,128,0,0,0),
(49,109,1101,3551,160,0,21,0),
(44,111,4,3583,128,0,0,0),
(45,111,2043,3551,160,0,21,0),
(42,113,8,3583,128,0,0,0),
(43,113,2039,3551,160,0,21,0),
(50,115,32,3583,128,0,0,0),
(51,115,2015,3551,160,0,21,0),
(131,118,32767,2056,146,1,0,0),
(132,118,32767,2053,146,20,0,0),
(883,118,32767,2112,402,0,0,0),
(913,118,262143,2080,146,0,0,0),
(105,120,2047,2304,1170,0,0,0),
(71,124,32,3583,1170,0,0,0),
(70,125,2,3583,146,0,0,0),
(69,126,8,3583,1170,0,0,0),
(925,129,-1,2080,128,0,63,0),
(54,130,2047,2176,1168,4,0,0),
(25,134,-1,3072,1040,10,0,0),
(138,136,32767,3536,128,0,0,0),
(139,136,32767,2053,128,0,0,0),
(91,137,1535,3551,160,0,21,0),
(882,137,512,3583,128,0,0,0),
(85,138,2047,3583,128,0,0,0),
(84,139,2047,3583,160,0,21,0),
(93,140,2047,3583,128,0,0,0),
(88,141,2047,3583,160,0,21,0),
(865,142,2047,3583,0,0,0,0),
(87,148,1,3551,1170,0,181,0),
(441,148,222,3583,1170,0,182,0),
(94,149,2,3551,1170,0,181,0),
(443,149,509,3583,1170,0,182,0),
(92,150,8,3551,1170,0,181,0),
(481,150,215,3583,1170,0,182,0),
(89,152,4,3551,1170,0,181,0),
(442,152,219,3583,1170,0,182,0),
(123,160,262143,2050,128,0,0,0),
(124,160,-1,3072,128,0,0,0),
(624,160,32,2049,128,0,0,0),
(625,160,262111,2049,128,0,0,0),
(702,160,-1,2112,128,0,0,0),
(908,160,262143,2080,128,0,0,0),
(6,162,2147483647,3551,128,0,0,0),
(922,162,262143,2080,128,0,0,0),
(33,163,-1,2052,1040,0,0,0),
(243,164,2047,3583,160,0,41,0),
(899,165,2047,3583,160,0,41,0),
(241,171,2047,3583,160,0,41,0),
(122,172,163839,2050,128,0,0,0),
(621,172,6,2049,128,0,0,0),
(622,172,1529,2049,128,0,0,0),
(701,172,163839,2112,128,0,0,0),
(907,172,524287,2080,128,0,0,0),
(970,172,163839,2052,128,0,0,0),
(129,173,32767,2312,128,0,0,0),
(323,173,32767,2256,128,0,0,0),
(631,173,520,2052,128,0,0,0),
(632,173,1190,2052,128,0,0,0),
(633,173,216,2049,128,0,0,0),
(634,173,1063,2049,128,0,0,0),
(641,173,32,3072,128,0,0,0),
(642,173,8,3072,128,0,0,0),
(142,176,-1,2056,128,0,0,0),
(143,176,-1,2052,128,0,0,0),
(639,176,128,2049,128,0,0,0),
(640,176,262015,2049,128,0,0,0),
(28,182,2047,3583,160,0,2,0),
(63,184,-1,2050,1040,0,0,0),
(282,185,2047,3583,128,0,61,0),
(29,186,2047,3583,160,0,2,0),
(284,197,2047,3583,160,0,62,0),
(65,198,2047,2050,1168,0,0,0),
(97,199,2047,2112,1168,0,0,0),
(244,202,2047,3583,160,0,41,0),
(940,205,524287,2176,2048,0,0,0),
(72,220,16,3583,1170,0,0,0),
(128,226,32767,2057,128,0,0,0),
(878,226,1024,2052,128,0,0,0),
(879,226,31743,2052,128,0,0,0),
(137,227,2047,3077,128,0,0,0),
(144,228,-1,2448,128,0,0,0),
(136,229,32767,3079,128,20,0,0),
(915,229,262143,2080,128,0,0,0),
(55,237,-1,2176,1040,0,0,0),
(79,238,2047,2056,1168,4,0,0),
(81,239,2047,2056,1168,0,0,0),
(76,241,2047,2056,128,40,0,0),
(149,242,2047,2056,1168,16,0,0),
(112,243,2047,2049,1170,0,0,0),
(111,244,2047,2049,1168,0,0,0),
(106,245,2047,2049,1168,0,0,0),
(66,246,2047,2050,1168,0,0,0),
(26,247,2047,3072,1168,20,0,0),
(83,252,2047,2057,128,0,0,0),
(74,253,-1,2056,1040,0,0,0),
(73,254,2047,2056,1168,10,0,0),
(108,255,2047,2049,1168,0,0,0),
(109,256,-1,2049,1040,0,0,0),
(110,257,-1,2049,1040,0,0,0),
(113,258,2047,2049,1168,10,0,0),
(38,260,2047,2052,128,0,0,0),
(35,262,2047,2052,128,0,0,0),
(36,263,2047,2052,128,0,0,0),
(37,264,2047,2052,128,0,0,0),
(61,267,-1,2050,1040,0,0,0),
(62,268,2047,2050,1170,0,0,0),
(64,269,2047,2050,1168,0,0,0),
(24,272,2047,3072,1168,10,0,0),
(34,273,2047,2052,128,0,0,0),
(21,293,2047,2051,128,40,0,0),
(906,293,262143,2080,128,0,0,0),
(46,313,64,3583,128,0,0,0),
(47,313,1983,3551,160,0,21,0),
(52,315,128,3583,128,0,0,0),
(53,315,1919,3551,160,0,21,0),
(281,333,2047,3583,160,0,62,0),
(104,353,2047,2304,1170,0,0,0),
(102,354,-1,2304,1040,0,0,0),
(101,355,-1,2304,1040,0,0,0),
(27,356,2047,3583,128,0,23,0),
(95,373,-1,2112,1040,0,0,0),
(98,374,262143,2112,1040,0,0,0),
(96,375,262143,2112,1040,0,0,0),
(30,393,2047,3583,160,0,161,0),
(145,413,2047,2116,128,40,0,0),
(146,413,2047,2083,128,0,0,0),
(147,414,2047,3183,128,0,0,0),
(148,415,2047,3583,128,0,0,0),
(151,416,2047,2049,192,0,0,0),
(155,416,2047,2050,192,0,0,1),
(158,416,2047,3136,192,0,0,1),
(159,416,2047,2060,192,0,0,1),
(271,416,2047,2448,192,0,0,2),
(175,418,2047,2049,384,0,0,0),
(178,418,2047,2050,384,0,0,0),
(183,418,2047,3332,384,0,0,1),
(186,418,2047,2192,384,0,0,1),
(270,418,2047,2120,384,0,0,1),
(189,419,2047,2060,640,0,0,2),
(191,419,2047,3072,640,0,0,1),
(193,419,2047,2192,640,0,0,0),
(198,419,2047,2050,640,0,0,1),
(200,419,2047,2049,640,0,0,2),
(265,419,2047,2304,640,0,0,0),
(266,419,2047,2112,640,0,0,1),
(203,420,2047,2061,1152,0,0,2),
(204,420,2047,3074,1152,0,0,1),
(205,420,2047,2320,1152,0,0,0),
(268,420,2047,2176,1152,0,0,0),
(269,420,2047,2112,1152,0,0,1),
(246,433,2047,2115,128,0,0,0),
(272,453,2047,2051,128,0,0,0),
(330,473,4095,3149,130,0,0,0),
(381,493,8,3583,164,0,0,0),
(403,515,2047,3551,128,0,0,0),
(445,533,128,3583,1170,0,181,0),
(446,533,95,3551,1170,0,182,0),
(461,553,64,3551,1170,0,181,0),
(501,553,4,3583,1170,0,182,0),
(463,554,16,3551,1170,0,181,0),
(464,554,207,3583,1170,0,182,0),
(521,573,-1,3072,1040,0,0,0),
(522,574,-1,3072,1040,0,0,0),
(541,593,-1,2304,1040,0,0,0),
(544,594,-1,2050,1040,0,0,0),
(581,613,-1,2064,1040,0,0,0),
(601,633,-1,2056,128,0,0,0),
(741,673,16,3583,128,0,0,0),
(742,673,2031,3551,160,0,21,0),
(781,713,255,3583,1170,0,181,0),
(841,733,128,3583,1170,0,0,0),
(861,753,64,3583,1170,0,0,0),
(862,754,1,3583,1170,0,0,0),
(866,755,2047,3583,160,0,41,0),
(867,756,512,3583,146,0,0,0),
(877,760,1024,3583,146,0,0,0),
(934,762,524287,2080,144,0,223,0),
(892,769,32767,3583,1040,0,0,0),
(896,770,-1,2080,1040,0,0,0),
(897,771,262143,2080,1040,0,0,0),
(951,771,2097151,3583,0,0,0,0),
(895,772,-1,2080,1040,0,0,0),
(900,773,262143,3583,160,0,41,0),
(936,776,262143,2080,128,0,0,0),
(938,777,524287,3583,2,0,0,0),
(939,778,524287,3583,2,0,0,0),
(947,778,2097151,3583,0,0,0,0);
@@ -0,0 +1,179 @@
-- mod-paragon: starter spawn data for class 12 (Paragon).
--
-- Companion to 2026_05_09_00.sql. The DBC overlay teaches the world
-- server that class 12 exists; this migration teaches it WHERE
-- characters of that class spawn, what action bar they boot with,
-- and what per-level base stats to integrity-check against.
--
-- Without these rows, character creation fails inside Player::Create:
--
-- PlayerInfo const* info = sObjectMgr->GetPlayerInfo(race, class);
-- if (!info) {
-- LOG_ERROR("entities.player",
-- "Player::Create: ... invalid race/class pair ({}/{})"
-- " - refusing to do so.", ..., race, class);
-- return false; // -> client sees "Error creating character"
-- }
--
-- and on world load the player_class_stats integrity check trips:
--
-- "Class N Level L does not have stats data!"
--
-- Tables touched:
-- - playercreateinfo : (race, class=12) -> map/zone/x/y/z
-- Race-specific starting zones (Paragon
-- spawns in each race's standard newbie
-- area, NOT Acherus, since it is a
-- from-level-1 class).
-- - playercreateinfo_action : (race, class=12, button) -> action,type
-- Default action bar layout per race.
-- - player_class_stats : (class=12, level 1..80) -> base stats
-- Per-level HP/Mana/STR/AGI/STA/INT/SPI
-- used by Player::InitStatsForLevel.
--
-- Tables intentionally NOT touched here:
-- - playercreateinfo_item : Paragon ships no per-class starting
-- items; gear comes from the racial
-- kit only.
-- - playercreateinfo_skills / _cast_spell / _spell_custom :
-- These are mask-based. Class-12 baseline
-- weapon/defense skills come through
-- classMask=0 ("all classes") rows that
-- already cover Paragon. The DBC overlay
-- in 2026_05_09_00.sql opens
-- SkillRaceClassInfo for class 12.
-- Idempotent: blow away any pre-existing class-12 rows first so this
-- migration can be replayed cleanly on a partially-seeded DB (e.g.
-- after a contributor manually patched their local DB before this
-- migration landed).
DELETE FROM `playercreateinfo` WHERE `class` = 12;
DELETE FROM `playercreateinfo_action` WHERE `class` = 12;
DELETE FROM `player_class_stats` WHERE `Class` = 12;
-- ---------------------------------------------------------------
-- playercreateinfo (10 rows: every DK-eligible race, racial start)
-- ---------------------------------------------------------------
INSERT INTO `playercreateinfo` (`race`, `class`, `map`, `zone`, `position_x`, `position_y`, `position_z`, `orientation`) VALUES
( 1, 12, 0, 12, -8949.95, -132.493, 83.5312, 0 ), -- Human -> Northshire, Elwynn Forest
( 2, 12, 1, 14, -618.518, -4251.67, 38.718, 0 ), -- Orc -> Valley of Trials, Durotar
( 3, 12, 0, 1, -6240.32, 331.033, 382.758, 6.17716 ), -- Dwarf -> Coldridge Valley, Dun Morogh
( 4, 12, 1, 141, 10311.3, 832.463, 1326.41, 5.69632 ), -- Night Elf -> Shadowglen, Teldrassil
( 5, 12, 0, 85, 1676.71, 1678.31, 121.67, 2.70526 ), -- Undead -> Deathknell, Tirisfal
( 6, 12, 1, 215, -2917.58, -257.98, 52.9968, 0 ), -- Tauren -> Camp Narache, Mulgore
( 7, 12, 0, 1, -6240.32, 331.033, 382.758, 0 ), -- Gnome -> Coldridge Valley (shared)
( 8, 12, 1, 14, -618.518, -4251.67, 38.718, 0 ), -- Troll -> Valley of Trials (shared)
(10, 12, 530, 3431, 10349.6, -6357.29, 33.4026, 5.31605 ), -- Blood Elf -> Sunstrider Isle, Eversong
(11, 12, 530, 3526, -3961.64,-13931.2, 100.615, 2.08364 ); -- Draenei -> Ammen Vale, Azuremyst Isle
-- ---------------------------------------------------------------
-- playercreateinfo_action (46 rows)
-- Buttons: 72=Attack(6603), 73=Eat(78), 74=racial, 75=race-extra,
--          82=Skinning(59752), 84=Attack, 96=Attack
-- ---------------------------------------------------------------
INSERT INTO `playercreateinfo_action` (`race`, `class`, `button`, `action`, `type`) VALUES
( 1, 12, 72, 6603, 0), ( 1, 12, 73, 78, 0), ( 1, 12, 82, 59752, 0),
( 1, 12, 84, 6603, 0), ( 1, 12, 96, 6603, 0),
( 2, 12, 72, 6603, 0), ( 2, 12, 73, 78, 0), ( 2, 12, 74, 20572, 0),
( 2, 12, 84, 6603, 0), ( 2, 12, 96, 6603, 0),
( 3, 12, 72, 6603, 0), ( 3, 12, 73, 78, 0), ( 3, 12, 74, 20594, 0),
( 3, 12, 75, 2481, 0), ( 3, 12, 84, 6603, 0), ( 3, 12, 96, 6603, 0),
( 4, 12, 72, 6603, 0), ( 4, 12, 73, 78, 0), ( 4, 12, 74, 58984, 0),
( 4, 12, 84, 6603, 0), ( 4, 12, 96, 6603, 0),
( 5, 12, 72, 6603, 0), ( 5, 12, 73, 78, 0), ( 5, 12, 74, 20577, 0),
( 5, 12, 84, 6603, 0), ( 5, 12, 96, 6603, 0),
( 6, 12, 72, 6603, 0), ( 6, 12, 73, 78, 0), ( 6, 12, 74, 20549, 0),
( 6, 12, 84, 6603, 0), ( 6, 12, 96, 6603, 0),
( 7, 12, 72, 6603, 0), ( 7, 12, 73, 78, 0), ( 7, 12, 84, 6603, 0),
( 7, 12, 96, 6603, 0),
( 8, 12, 72, 6603, 0), ( 8, 12, 73, 78, 0), ( 8, 12, 74, 2764, 0),
( 8, 12, 75, 26297, 0), ( 8, 12, 84, 6603, 0), ( 8, 12, 96, 6603, 0),
(11, 12, 72, 6603, 0), (11, 12, 73, 78, 0), (11, 12, 74, 28880, 0),
(11, 12, 84, 6603, 0), (11, 12, 96, 6603, 0);
-- ---------------------------------------------------------------
-- player_class_stats (80 rows: levels 1..80 per-class base stats)
-- Curve mirrors the Warrior baseline through 60, then switches to a
-- steeper Paladin-derived curve (vehicle-style HP inflation past 60
-- keeps Paragon competitive in Wrath content).
-- ---------------------------------------------------------------
INSERT INTO `player_class_stats` (`Class`, `Level`, `BaseHP`, `BaseMana`, `Strength`, `Agility`, `Stamina`, `Intellect`, `Spirit`) VALUES
(12, 1, 20, 60, 23, 20, 22, 20, 20),
(12, 2, 29, 66, 24, 21, 23, 20, 20),
(12, 3, 38, 73, 25, 21, 24, 20, 21),
(12, 4, 47, 81, 26, 22, 25, 20, 21),
(12, 5, 56, 90, 28, 23, 26, 20, 21),
(12, 6, 65, 100, 29, 24, 27, 21, 21),
(12, 7, 74, 111, 30, 24, 28, 21, 22),
(12, 8, 83, 123, 31, 25, 29, 21, 22),
(12, 9, 92, 136, 32, 26, 30, 21, 22),
(12, 10, 97, 150, 33, 26, 31, 21, 23),
(12, 11, 103, 165, 35, 27, 33, 21, 23),
(12, 12, 109, 182, 36, 28, 34, 21, 23),
(12, 13, 118, 200, 37, 29, 35, 21, 24),
(12, 14, 128, 219, 39, 30, 36, 22, 24),
(12, 15, 139, 239, 40, 30, 37, 22, 24),
(12, 16, 151, 260, 41, 31, 38, 22, 25),
(12, 17, 154, 282, 42, 32, 40, 22, 25),
(12, 18, 168, 305, 44, 33, 41, 22, 25),
(12, 19, 183, 329, 45, 34, 42, 22, 26),
(12, 20, 199, 354, 47, 35, 43, 22, 26),
(12, 21, 206, 380, 48, 35, 45, 23, 26),
(12, 22, 224, 392, 49, 36, 46, 23, 27),
(12, 23, 243, 420, 51, 37, 47, 23, 27),
(12, 24, 253, 449, 52, 38, 49, 23, 28),
(12, 25, 274, 479, 54, 39, 50, 23, 28),
(12, 26, 296, 509, 55, 40, 51, 23, 28),
(12, 27, 309, 524, 57, 41, 53, 23, 29),
(12, 28, 333, 554, 58, 42, 54, 24, 29),
(12, 29, 348, 584, 60, 43, 56, 24, 30),
(12, 30, 374, 614, 62, 44, 57, 24, 30),
(12, 31, 401, 629, 63, 45, 58, 24, 30),
(12, 32, 419, 659, 65, 46, 60, 24, 31),
(12, 33, 448, 689, 66, 47, 61, 24, 31),
(12, 34, 468, 704, 68, 48, 63, 25, 32),
(12, 35, 499, 734, 70, 49, 64, 25, 32),
(12, 36, 521, 749, 72, 50, 66, 25, 33),
(12, 37, 545, 779, 73, 51, 68, 25, 33),
(12, 38, 581, 809, 75, 52, 69, 25, 33),
(12, 39, 609, 824, 77, 53, 71, 26, 34),
(12, 40, 649, 854, 79, 54, 72, 26, 34),
(12, 41, 681, 869, 80, 56, 74, 26, 35),
(12, 42, 715, 899, 82, 57, 76, 26, 35),
(12, 43, 761, 914, 84, 58, 77, 26, 36),
(12, 44, 799, 944, 86, 59, 79, 26, 36),
(12, 45, 839, 959, 88, 60, 81, 27, 37),
(12, 46, 881, 989, 90, 61, 83, 27, 37),
(12, 47, 935, 1004, 92, 63, 84, 27, 38),
(12, 48, 981, 1019, 94, 64, 86, 27, 38),
(12, 49, 1029, 1049, 96, 65, 88, 28, 39),
(12, 50, 1079, 1064, 98, 66, 90, 28, 39),
(12, 51, 1131, 1079, 100, 68, 92, 28, 40),
(12, 52, 1185, 1109, 102, 69, 94, 28, 40),
(12, 53, 1241, 1124, 104, 70, 96, 28, 41),
(12, 54, 1299, 1139, 106, 72, 98, 29, 42),
(12, 55, 1359, 1154, 109, 73, 100, 29, 42),
(12, 56, 1421, 1169, 111, 74, 102, 29, 43),
(12, 57, 1485, 1199, 113, 76, 104, 29, 43),
(12, 58, 1551, 1214, 115, 77, 106, 30, 44),
(12, 59, 1619, 1229, 118, 79, 108, 30, 44),
(12, 60, 1689, 1244, 120, 80, 110, 30, 45),
(12, 61, 1902, 1357, 122, 81, 112, 30, 46),
(12, 62, 2129, 1469, 125, 83, 114, 30, 46),
(12, 63, 2357, 1582, 127, 84, 117, 31, 47),
(12, 64, 2612, 1694, 130, 86, 119, 31, 47),
(12, 65, 2883, 1807, 132, 88, 121, 31, 48),
(12, 66, 3169, 1919, 135, 89, 123, 32, 49),
(12, 67, 3455, 2032, 137, 91, 126, 32, 49),
(12, 68, 3774, 2145, 140, 92, 128, 32, 50),
(12, 69, 4109, 2257, 142, 94, 130, 32, 51),
(12, 70, 4444, 2370, 145, 96, 133, 33, 51),
(12, 71, 4720, 2482, 148, 97, 135, 33, 52),
(12, 72, 5013, 2595, 150, 99, 138, 33, 53),
(12, 73, 5325, 2708, 153, 101, 140, 33, 54),
(12, 74, 5656, 2820, 156, 102, 143, 34, 54),
(12, 75, 6008, 2933, 159, 104, 145, 34, 55),
(12, 76, 6381, 3045, 162, 106, 148, 34, 56),
(12, 77, 6778, 3158, 165, 108, 151, 35, 57),
(12, 78, 7198, 3270, 168, 109, 153, 35, 57),
(12, 79, 7646, 3383, 171, 111, 156, 35, 58),
(12, 80, 8121, 3496, 174, 113, 159, 36, 59);
@@ -0,0 +1,50 @@
-- mod-paragon: starter weapon / armor skills for class 12 (Paragon).
--
-- Companion to 2026_05_10_00.sql. The spawn-data migration teaches
-- Player::Create *that* class 12 exists at a given race; this one
-- teaches it which weapon and armor skill lines to grant on first
-- character login.
--
-- Without these rows a fresh Paragon character lands in their newbie
-- zone with **no** weapon or armor proficiencies (auto-attack greys
-- out the moment they equip anything beyond a fist). The classMask=0
-- "all classes" rows in playercreateinfo_skills only cover Defense,
-- Unarmed, Cloth, the racial / language skills, Mounts and
-- Companion Pets -- which is exactly what bare-fisted, naked
-- characters look like.
--
-- Paragon plays every class, so it is granted every weapon / armor
-- proficiency at level 1. The skill-line rows themselves are still
-- gated by skillraceclassinfo_dbc (handled in 2026_05_09_00.sql),
-- so the client/server agree on what's allowed.
--
-- Idempotent: deletes any pre-existing classMask=2048 rows first
-- (class 12 owns this bitmask on Fractured) so the migration can
-- replay cleanly on a partially-seeded DB.
DELETE FROM `playercreateinfo_skills` WHERE `classMask` = 2048;
INSERT INTO `playercreateinfo_skills`
(`raceMask`, `classMask`, `skill`, `rank`, `comment`) VALUES
-- Weapon proficiencies
(0, 2048, 43, 0, 'Paragon - Swords'),
(0, 2048, 44, 0, 'Paragon - Axes'),
(0, 2048, 45, 0, 'Paragon - Bows'),
(0, 2048, 46, 0, 'Paragon - Guns'),
(0, 2048, 54, 0, 'Paragon - Maces'),
(0, 2048, 55, 0, 'Paragon - Two-Handed Swords'),
(0, 2048, 118, 0, 'Paragon - Dual Wield'),
(0, 2048, 136, 0, 'Paragon - Staves'),
(0, 2048, 160, 0, 'Paragon - Two-Handed Maces'),
(0, 2048, 172, 0, 'Paragon - Two-Handed Axes'),
(0, 2048, 173, 0, 'Paragon - Daggers'),
(0, 2048, 176, 0, 'Paragon - Thrown'),
(0, 2048, 226, 0, 'Paragon - Crossbows'),
(0, 2048, 228, 0, 'Paragon - Wands'),
(0, 2048, 229, 0, 'Paragon - Polearms'),
(0, 2048, 473, 0, 'Paragon - Fist Weapons'),
-- Armor proficiencies (Cloth is in a classMask=0 row already)
(0, 2048, 293, 0, 'Paragon - Plate Mail'),
(0, 2048, 413, 0, 'Paragon - Mail'),
(0, 2048, 414, 0, 'Paragon - Leather'),
(0, 2048, 433, 0, 'Paragon - Shield');
File diff suppressed because one or more lines are too long
@@ -0,0 +1,27 @@
-- mod-paragon: Blood Elf "Arcane Torrent" uses three spell IDs in WotLK
-- (28730 mana/casters, 25046 rogue energy, 50613 death knight runic power),
-- all on racial skill line 756. Migration 2026_05_10_02 OR'd class 12 into
-- every SkillLineAbility delta from patch-enUS-4, so Paragon Blood Elves
-- auto-learned all three and the spellbook showed three identical entries.
--
-- Paragon should learn a single combined Arcane Torrent that refunds mana,
-- energy, rage, AND runic power -- whichever pool the character is using at
-- the moment. We keep spell 28730 as the in-book entry for class 12 and attach
-- the SpellScript spell_paragon_arcane_torrent (modules/mod-paragon/src/
-- Paragon_SC.cpp) so casts by a Paragon also EnergizeBySpell energy, rage,
-- and RP on top of the stock mana effect. Other classes' Blood Elves are
-- unaffected.
--
-- IDs 13338 / 17510 match stock WotLK SkillLineAbility rows for spells 25046
-- / 50613 on skill line 756.
UPDATE `skilllineability_dbc`
SET `ClassMask` = `ClassMask` & ~2048
WHERE `ID` IN (13338, 17510);
-- Bind spell_paragon_arcane_torrent (defined in Paragon_SC.cpp) to spell
-- 28730. AC's `spell_script_names` is the standard mapping: script name on
-- the right, spell id on the left. Idempotent via DELETE + INSERT.
DELETE FROM `spell_script_names`
WHERE `spell_id` = 28730 AND `ScriptName` = 'spell_paragon_arcane_torrent';
INSERT INTO `spell_script_names` (`spell_id`, `ScriptName`) VALUES
(28730, 'spell_paragon_arcane_torrent');
@@ -0,0 +1,30 @@
-- mod-paragon: extend ItemTemplate::AllowableClass to include class 12
-- (Paragon, bit 1<<11 = 2048) for every class-restricted item.
--
-- Server-side, Player::CanUseItem (PlayerStorage.cpp) already short-
-- circuits the AllowableClass check for class 12. That's enough for any
-- code path the server controls (vendor list filter, AH "usable" filter,
-- CanRollForItemInLFG, CanBuyItem). It is NOT enough on the 3.3.5 client:
-- the WoW.exe binary independently pre-checks AllowableClass against the
-- player's class on right-click of a bag item and refuses *locally* with
-- the red "You can't use that item." text in UIErrorsFrame, never sending
-- CMSG_USE_ITEM at all. Server logs stay silent; only client knows it
-- refused.
--
-- Fix: OR class 12's bit into AllowableClass on every class-restricted
-- row so the client engine's pre-check passes for Paragon. Other
-- classes' bits are unchanged, so e.g. a warrior-only item is still
-- warrior-only for everyone except Paragon. Items with AllowableClass
-- == -1 ("all classes") or 0 ("no restriction recorded") already pass
-- the client engine's check and are not touched.
--
-- After applying this migration the *client* still caches item info in
-- Cache/<locale>/itemcache.wdb. Players who already inspected the item
-- before the change must delete that file (or the whole Cache folder)
-- and reconnect to repopulate it from the worldserver, otherwise the
-- stale cached AllowableClass keeps the engine pre-check failing.
UPDATE `item_template`
SET `AllowableClass` = `AllowableClass` | 2048
WHERE `AllowableClass` > 0
AND (`AllowableClass` & 2048) = 0;
@@ -0,0 +1,62 @@
-- mod-paragon: backfill paragon_spell_ae_cost rows for spells newly exposed
-- by the Character Advancement panel after removing the over-aggressive
-- ClassMask=0 filter from tools/_gen_paragon_advancement_spells_lua.py.
--
-- The base file (data/sql/db-world/base/paragon_spell_ae_cost.sql) was
-- regenerated alongside this migration so fresh deployments already have
-- these rows. Existing servers do not re-run base files on content change,
-- so this update inserts the new (spell_id, ae_cost) pairs idempotently.
-- INSERT IGNORE keeps any per-row tuning a server operator may have already
-- applied to spell_ids that happen to overlap.
--
-- New ids include: 51505 Lava Burst / 51514 Hex (Shaman), 12051 Evocation /
-- 30449 Spellsteal (Mage), 1066 Aquatic Form (Druid), 53351 Kill Shot /
-- 19263 Deterrence / 53271 Master's Call (Hunter), 3714 Path of Frost /
-- 57330 Horn of Winter / 56815 Rune Strike / 61999 Raise Ally / 56222
-- Dark Command (DK), plus the other trainer-taught class abilities whose
-- stock SkillLineAbility.dbc rows have ClassMask=0 (the skill line itself
-- pins the class for these rows; ClassMask is redundant on class-specific
-- lines).
INSERT IGNORE INTO `paragon_spell_ae_cost` (`spell_id`, `ae_cost`) VALUES
(66, 1), -- Invisibility (Mage)
(126, 1), -- Eye of Kilrogg (Warlock)
(526, 1), -- Cure Toxins (Shaman)
(688, 1), -- Summon Imp (Warlock)
(691, 1), -- Summon Felhunter (Warlock)
(697, 1), -- Summon Voidwalker (Warlock)
(712, 1), -- Summon Succubus (Warlock)
(768, 1), -- Cat Form (Druid)
(783, 1), -- Travel Form (Druid)
(1066, 1), -- Aquatic Form (Druid)
(2894, 1), -- Fire Resistance Totem (Shaman)
(3714, 1), -- Path of Frost (DK)
(5215, 1), -- Prowl (Druid)
(5487, 1), -- Bear Form (Druid)
(5504, 1), -- Conjure Refreshment (Mage)
(6795, 1), -- Growl (Druid)
(6807, 1), -- Maul (Druid)
(12051, 1), -- Evocation (Mage)
(19263, 1), -- Deterrence (Hunter)
(23161, 1), -- Summon Dreadsteed (Warlock)
(23214, 1), -- Summon Charger (Paladin)
(30449, 1), -- Spellsteal (Mage)
(33943, 1), -- Flight Form (Druid)
(34767, 1), -- Summon Felguard (Warlock)
(48018, 1), -- Demonic Circle: Summon (Warlock)
(50769, 1), -- Revive (Druid)
(51505, 1), -- Lava Burst (Shaman)
(51514, 1), -- Hex (Shaman)
(51730, 1), -- Earthliving Weapon (Shaman)
(52127, 1), -- Water Shield (Shaman)
(52610, 1), -- Savage Roar (Druid)
(53271, 1), -- Master's Call (Hunter)
(53351, 1), -- Kill Shot (Hunter)
(56222, 1), -- Dark Command (DK)
(56815, 1), -- Rune Strike (DK)
(57330, 1), -- Horn of Winter (DK)
(61999, 1), -- Raise Ally (DK)
(64843, 1), -- Divine Hymn (Priest)
(64901, 1), -- Hymn of Hope (Priest)
(66842, 1), -- Call of the Elements (Shaman totem set)
(66843, 1), -- Call of the Ancestors (Shaman totem set)
(66844, 1); -- Call of the Spirits (Shaman totem set)
File diff suppressed because it is too large
+263 -16
@@ -7,13 +7,17 @@
#include "Chat.h"
#include "Config.h"
#include "Creature.h"
#include "CreatureData.h"
#include "GameTime.h"
#include "Log.h"
#include "ObjectGuid.h"
#include "Pet.h"
#include "Player.h"
#include "ScriptMgr.h"
#include "SharedDefines.h"
#include "UnitDefines.h"
#include "SpellScript.h"
#include "SpellScriptLoader.h"
#include "WorldPacket.h"
#include "WorldSession.h"
@@ -35,7 +39,7 @@ public:
{
LOG_INFO("module", "[paragon] Paragon_PlayerScript registered "
"(MultiResource.HasActivePowers={})",
sConfigMgr->GetOption<bool>("Paragon.MultiResource.HasActivePowers", false));
sConfigMgr->GetOption<bool>("Paragon.MultiResource.HasActivePowers", true));
}
[[nodiscard]] Optional<bool> OnPlayerIsClass(Player const* player, Classes unitClass, ClassContext context) override
@@ -43,27 +47,218 @@ public:
if (!player || player->getClass() != CLASS_PARAGON)
return std::nullopt;
// Death Knight rune / runic power ability stack (narrow on purpose).
if (unitClass == CLASS_DEATH_KNIGHT && context == CLASS_CONTEXT_ABILITY)
// ============================================================
// Ability stack -- claim ALL nine vanilla classes.
// ============================================================
// CLASS_CONTEXT_ABILITY is read by every class-specific spell
// gate in core / scripts: DK rune mechanics (Spell.cpp,
// SpellEffects.cpp, spell_dk.cpp, SpellAuraEffects.cpp),
// Warrior Titan's Grip / Bladestorm (Player.cpp 3783, 15432,
// PlayerUpdates.cpp 1547), Paladin Rebuke (Player.cpp 15441),
// Shaman dual-wield bookkeeping (Player.cpp 5028), Hunter pet
// / Hunter's Mark gates (spell_item.cpp 3718), Druid Insect
// Swarm / Wild Growth (SpellAuraEffects.cpp 2153, 2232),
// Priest Spirit of Redemption out-of-bounds check (Unit.cpp
// 14238), Rogue pickpocketing (LootHandler.cpp 86/165/385,
// Vehicle.cpp 80). Paragon learns abilities from every class
// through Character Advancement, so claiming all of them lets
// every gated spell script execute its class-specific branch
// for our players. The only downside: double-pathed scripts
// (e.g. a spell with both warrior and rogue branches) will
// pick whichever branch the script tests first -- acceptable.
if (context == CLASS_CONTEXT_ABILITY)
return true;
// Warrior ability stack: enables warrior-spec ability gates anywhere
// they're checked. None of the currently-traced sites in core/scripts
// gate on (CLASS_WARRIOR, CLASS_CONTEXT_ABILITY), so this is a safe
// forward-compatible claim. Rage generation itself is gated on
// HasActivePowerType(POWER_RAGE) and is wired below.
if (unitClass == CLASS_WARRIOR && context == CLASS_CONTEXT_ABILITY)
return true;
// Reactive melee states: Overpower-on-dodge (warrior), Counterattack window (hunter).
// We intentionally do NOT claim CLASS_ROGUE here: that context skips the generic
// AURA_STATE_DEFENSE update on dodge (Riposte path) in Unit::ProcDamageAndSpellFor.
// ============================================================
// Reactive melee states.
// ============================================================
// Warrior dodge -> AURA_STATE_DEFENSE (Overpower window).
// Hunter parry -> AURA_STATE_HUNTER_PARRY (Counterattack).
// We intentionally do NOT claim CLASS_ROGUE here:
// Unit::ProcDamageAndSpellFor (Unit.cpp 12824) skips the
// generic AURA_STATE_DEFENSE update on dodge for rogues so
// Riposte can take over. Claiming rogue would silently kill
// Overpower for Paragon, and Riposte already works for us via
// the warrior-style state we already grant.
if (context == CLASS_CONTEXT_ABILITY_REACTIVE)
{
if (unitClass == CLASS_WARRIOR || unitClass == CLASS_HUNTER)
return true;
}
// ============================================================
// Pet ownership contexts.
// ============================================================
// CLASS_CONTEXT_PET is read by Pet::AddToWorld,
// Pet::CreateBaseAtCreatureInfo, Pet::InitStatsForLevel (twice -- the
// MAX_PET_TYPE bootstrap branch and the per-class attack-time
// scaling), Pet::IsPermanentPetFor, Player::SummonPet,
// Player::CanResummonPet, Spell::EffectTameCreature,
// SpellEffects.cpp (CreateTamedPet debug effects, Eyes of the
// Beast), spell_generic.cpp 1760 (charm-as-pet conversion),
// and PlayerGossip.cpp's hunter stable check.
//
// The cleanest disambiguation is by the *active pet's* shape:
// HUNTER_PET -> hunter (beast tame)
// SUMMON_PET + DEMON type -> warlock (Imp/VW/Succ/...)
// SUMMON_PET + UNDEAD type -> DK ghoul / Army of Dead
// SUMMON_PET + ELEMENTAL type -> mage water / shaman fire
// For HUNTER specifically the no-pet case is also claimed so
// Tame Beast's EffectTameCreature gate passes during cast.
if (context == CLASS_CONTEXT_PET)
{
Pet const* activePet = const_cast<Player*>(player)->GetPet();
// Hunter beast: claim during taming OR when a HUNTER_PET is
// already active. This is what makes Tame Beast / Call Pet
// / pet stable / Counterattack pet aura feedback work.
if (unitClass == CLASS_HUNTER)
{
if (!activePet || activePet->getPetType() == HUNTER_PET)
return true;
return std::nullopt;
}
// All other classes only claim when an active SUMMON_PET is
// present. We then disambiguate by the creature's type
// because warlock / DK / mage / shaman all use SUMMON_PET.
if (!activePet || activePet->getPetType() != SUMMON_PET)
return std::nullopt;
CreatureTemplate const* tmpl = activePet->GetCreatureTemplate();
if (!tmpl)
return std::nullopt;
switch (unitClass)
{
case CLASS_WARLOCK:
// Drives Master Demonologist / Demonic Knowledge /
// Demonic Pact propagation, last-pet-spell tracking
// (Pet.cpp 112), and IsPermanentPetFor (Pet.cpp
// 2288) so demon pets persist across logins.
if (tmpl->type == CREATURE_TYPE_DEMON)
return true;
break;
case CLASS_DEATH_KNIGHT:
// Risen Ghoul + Army of the Dead. Player.cpp 14354
// and Pet.cpp 243 / 1046 / 2290 read this; without
// it the ghoul is invisible to the owner mid-load
// and ScriptedAI hooks on the ghoul mis-route.
if (tmpl->type == CREATURE_TYPE_UNDEAD)
return true;
break;
case CLASS_MAGE:
// Glyph-of-Eternal-Water permanent Water Elemental
// (entry 510, 37994). Used by Pet.cpp 1047/2292.
if (tmpl->type == CREATURE_TYPE_ELEMENTAL)
return true;
break;
case CLASS_SHAMAN:
// Fire Elemental / Earth Elemental. The base
// engine spawns these as creatures rather than
// proper Pet instances in most code paths, so the
// claim mostly matters for the Pet.cpp 1045 stat
// bootstrap when one is loaded as a SUMMON_PET.
if (tmpl->type == CREATURE_TYPE_ELEMENTAL)
return true;
break;
default:
break;
}
return std::nullopt;
}
// Warlock pet-charm context (Enslave Demon -- Unit.cpp 14828,
// 14894, 15025). Without this claim, charming a demon as a
// Paragon doesn't get the warlock-flavor charm semantics
// (faction-set-on-charm, action-bar layout, charm-break logic).
if (unitClass == CLASS_WARLOCK && context == CLASS_CONTEXT_PET_CHARM)
return true;
// ============================================================
// Equipment contexts.
// ============================================================
// CLASS_CONTEXT_EQUIP_RELIC: PlayerStorage.cpp 224-240 +
// 2475-2493. Routes Librams/Idols/Totems/Misc/Sigils into
// EQUIPMENT_SLOT_RANGED for the matching class. Claim every
// relic-bearing class so a Paragon can drop any of them into
// the ranged slot.
if (context == CLASS_CONTEXT_EQUIP_RELIC)
{
switch (unitClass)
{
case CLASS_PALADIN:
case CLASS_DRUID:
case CLASS_SHAMAN:
case CLASS_WARLOCK:
case CLASS_DEATH_KNIGHT:
return true;
default:
break;
}
}
// CLASS_CONTEXT_EQUIP_ARMOR_CLASS: PlayerStorage.cpp 2326,
// 2330, 2503-2523. At level 40 each class auto-learns its
// top armor proficiency. Paragon should pick up plate (via
// paladin/DK), shields (paladin/warrior/shaman), mail
// (hunter/shaman), and leather (rogue) so the level-40 train
// event grants Paragon full proficiency and we don't have to
// hand-curate it through the Paragon proficiency SQL.
if (context == CLASS_CONTEXT_EQUIP_ARMOR_CLASS)
{
switch (unitClass)
{
case CLASS_PALADIN:
case CLASS_WARRIOR:
case CLASS_DEATH_KNIGHT:
case CLASS_HUNTER:
case CLASS_SHAMAN:
case CLASS_DRUID:
case CLASS_ROGUE:
return true;
default:
break;
}
}
// CLASS_CONTEXT_EQUIP_SHIELDS: PlayerStorage.cpp 2467-2469.
// Lets a Paragon equip shields without a paladin/warrior/
// shaman skill gate.
if (context == CLASS_CONTEXT_EQUIP_SHIELDS)
{
switch (unitClass)
{
case CLASS_PALADIN:
case CLASS_WARRIOR:
case CLASS_SHAMAN:
return true;
default:
break;
}
}
// CLASS_CONTEXT_WEAPON_SWAP: PlayerStorage.cpp 1920, 2838 --
// rogue uses cooldown spell 6123 instead of 6119 on weapon
// swap (Quick Draw / Combat Potency interactions). Claim
// rogue so Paragon picks up the same cooldown spell.
if (context == CLASS_CONTEXT_WEAPON_SWAP && unitClass == CLASS_ROGUE)
return true;
// ============================================================
// Contexts we DELIBERATELY DO NOT claim:
// ============================================================
// CLASS_CONTEXT_STATS -- Paragon has its own STR/AGI->AP and
// INT/SPI->SP curves wired in StatSystem.cpp's CLASS_PARAGON
// branch (level*2 + STR + AGI - 20 etc.). Claiming any
// vanilla class here would override our curves with theirs.
//
// CLASS_CONTEXT_INIT, _TELEPORT, _QUEST, _TAXI, _SKILL,
// _GRAVEYARD, _CLASS_TRAINER, _TALENT_POINT_CALC -- all
// used by DK Ebon Hold / druid Moonglade starting-zone
// scripts. Paragon doesn't go through those zones and we
// don't want our players bound to Acherus or trapped in
// the DK starting quest gates.
return std::nullopt;
}
@@ -75,7 +270,7 @@ public:
if (power == POWER_RUNIC_POWER || power == POWER_RUNE)
return true;
if (sConfigMgr->GetOption<bool>("Paragon.MultiResource.HasActivePowers", false))
if (sConfigMgr->GetOption<bool>("Paragon.MultiResource.HasActivePowers", true))
{
switch (power)
{
@@ -268,7 +463,59 @@ private:
std::unordered_map<ObjectGuid, Paragon_PlayerScript::ParagonRuneSyncState> Paragon_PlayerScript::runeSyncByGuid;
// Arcane Torrent (28730) for Paragon: Blood Elf racial skill line 756 has
// three Arcane Torrent variants in stock WotLK (28730 mana, 25046 rogue
// energy, 50613 DK runic power). For Paragon Blood Elves we keep only 28730
// (see migration 2026_05_10_03.sql) and turn it into a "combined" version:
// the stock spell already silences nearby enemies and energizes mana via its
// own effects; this script adds energy, rage, and runic power energize on
// top when the caster is class 12, so a single button refunds whichever
// resource pool the player is actually using. Non-Paragon casters are
// untouched and keep learning their stock racial variant.
class spell_paragon_arcane_torrent : public SpellScript
{
PrepareSpellScript(spell_paragon_arcane_torrent);
void HandleAfterCast()
{
Unit* caster = GetCaster();
if (!caster || !caster->IsPlayer())
return;
Player* player = caster->ToPlayer();
if (player->getClass() != CLASS_PARAGON)
return;
// Stock energize amounts from spell_dbc:
// 25046 Arcane Torrent (Energy) -> 15 energy
// 50613 Arcane Torrent (Runic Power) -> 15 displayed RP (= 150
// internal; AC stores RP scaled 10x, see Player::SetMaxPower
// POWER_RUNIC_POWER, 1000).
// Rage uses the same 10x internal scaling as runic power (see
// Player.cpp:Regenerate where rage decay is `-20` for "2 rage by
// tick"), so 15 displayed rage = 150 internal.
// ModifyPower no-ops on pools the player has no max for, so this is
// safe even before the Paragon picks up energy/rage/RP abilities.
constexpr int32 kEnergyGain = 15;
constexpr int32 kRageGain = 150;
constexpr int32 kRunicPowerGain = 150;
SpellInfo const* spellInfo = GetSpellInfo();
uint32 const spellId = spellInfo ? spellInfo->Id : 28730u;
caster->EnergizeBySpell(player, spellId, kEnergyGain, POWER_ENERGY);
caster->EnergizeBySpell(player, spellId, kRageGain, POWER_RAGE);
caster->EnergizeBySpell(player, spellId, kRunicPowerGain, POWER_RUNIC_POWER);
}
void Register() override
{
AfterCast += SpellCastFn(spell_paragon_arcane_torrent::HandleAfterCast);
}
};
void AddSC_paragon()
{
new Paragon_PlayerScript();
RegisterSpellScript(spell_paragon_arcane_torrent);
}
+29
@@ -0,0 +1,29 @@
#!/usr/bin/env bash
# Clone Dawnforger/Fractured and omit Docker-only paths. Use when this script is
# already on disk (e.g. scp). Otherwise: git clone … && cd Fractured && bash scripts/vps-sparse-checkout-no-docker.sh
#
# Usage:
# bash scripts/vps-clone-without-docker.sh /path/to/Fractured git@github.com:Dawnforger/Fractured.git
set -euo pipefail
TARGET="${1:?usage: $0 /path/to/Fractured <git-remote-url>}"
REMOTE="${2:?usage: $0 /path/to/Fractured <git-remote-url>}"
if [[ -e "$TARGET" ]]; then
echo "error: $TARGET already exists; remove it or pick another path." >&2
exit 1
fi
mkdir -p "$(dirname "$TARGET")"
git clone "$REMOTE" "$TARGET"
cd "$TARGET"
if [[ ! -f scripts/vps-sparse-checkout-no-docker.sh ]]; then
echo "error: clone missing scripts/vps-sparse-checkout-no-docker.sh — pull latest main." >&2
exit 1
fi
bash scripts/vps-sparse-checkout-no-docker.sh
echo "Done. Next: docs/DEPLOY_LINUX_VPS.md"
+336
@@ -0,0 +1,336 @@
#!/usr/bin/env bash
# Collect VPS evidence for Paragon / DBUpdater / binary staleness triage.
# Run ON the VPS (Linux). Safe: read-only; does not restart services.
#
# Usage (from clone):
# bash scripts/vps-paragon-diagnostics.sh
#
# Optional environment:
# FRACTURED_REPO — absolute path to Fractured git root (default: parent of scripts/)
# FRACTURED_WS_BIN — path to worldserver binary (default: auto-detect)
# FRACTURED_WORLDSERVER_CONF — path to worldserver.conf (default: guess from BIN + common layouts)
# FRACTURED_SYSTEMD_UNITS — space-separated units to try (default: "fractured-world worldserver ac-worldserver")
# FRACTURED_MYSQL — prefix to invoke mysql, e.g. 'mysql -uacore -pacore -h127.0.0.1'
# (default Fractured local DB user/password are often both "acore"; use ~/.my.cnf if you prefer not to pass -p on the command line)
# If unset, SQL blocks are printed for manual copy-paste only.
# FRACTURED_SPELL_IDS — space-separated spell IDs for spell_dbc spot-check (defaults to common DK rune spenders)
# FRACTURED_DIAG_OUTPUT — full log file path (default: <repo>/var/vps-paragon-diagnostics-last.txt)
#
# All output is mirrored to the log file (tee) while still printing to the terminal.
# Default path lives under var/ (gitignored in this repo). Open that file in Cursor,
# scp it down, or: git add -f var/vps-paragon-diagnostics-last.txt if you intend to commit it.
set -u
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO="${FRACTURED_REPO:-$(cd "$SCRIPT_DIR/.." && pwd)}"
DIAG_OUT="${FRACTURED_DIAG_OUTPUT:-$REPO/var/vps-paragon-diagnostics-last.txt}"
mkdir -p "$(dirname "$DIAG_OUT")"
exec > >(tee "$DIAG_OUT") 2>&1
echo "Logging to: $DIAG_OUT"
hr() { printf '\n%s\n' "================================================================================"; }
sub() { printf '\n-- %s\n' "$1"; }
detect_worldserver_bin() {
local bin="" es path u units
if [[ -n "${FRACTURED_WS_BIN:-}" ]]; then
readlink -f "$FRACTURED_WS_BIN" 2>/dev/null && return
echo "$FRACTURED_WS_BIN"
return
fi
units="${FRACTURED_SYSTEMD_UNITS:-fractured-world worldserver ac-worldserver}"
for u in $units; do
if systemctl is-active --quiet "$u" 2>/dev/null || systemctl is-enabled --quiet "$u" 2>/dev/null; then
es=$(systemctl show "$u" -p ExecStart --value 2>/dev/null || true)
if [[ -n "$es" ]]; then
if [[ "$es" == \{*path=* ]]; then
path=$(printf '%s' "$es" | sed -n 's/.*path=\([^;]*\).*/\1/p')
else
path=$(printf '%s' "$es" | awk '{print $1}' | sed 's/^path=//')
fi
if [[ -n "$path" && -x "$path" ]]; then
readlink -f "$path" 2>/dev/null && return
fi
fi
fi
done
local pid
pid=$(pgrep -xo worldserver 2>/dev/null || true)
if [[ -n "$pid" ]]; then
readlink -f "/proc/$pid/exe" 2>/dev/null && return
fi
if command -v worldserver >/dev/null 2>&1; then
readlink -f "$(command -v worldserver)" 2>/dev/null && return
fi
echo ""
}
guess_worldserver_conf() {
local bin="$1"
local d cands=()
[[ -z "$bin" ]] && return
d=$(dirname "$bin")
cands+=("$d/../etc/worldserver.conf")
cands+=("$d/../../etc/worldserver.conf")
cands+=("$HOME/azeroth-server/etc/worldserver.conf")
cands+=("$HOME/env/dist/etc/worldserver.conf")
for f in "${cands[@]}"; do
f=$(readlink -f "$f" 2>/dev/null || true)
if [[ -n "$f" && -f "$f" ]]; then
echo "$f"
return
fi
done
echo ""
}
binary_strings_paths() {
local ws="$1"
[[ -z "$ws" || ! -f "$ws" ]] && return
strings "$ws" 2>/dev/null | grep -iE '/(home|root|opt|srv|var)[^[:space:]]*/(Fractured|fractured|azeroth|AzerothCore|acore)' | sort -u | head -40
}
hr
echo "Fractured Paragon / native VPS diagnostics"
echo "Date (UTC): $(date -u '+%Y-%m-%d %H:%M:%S UTC')"
echo "Repo (expected): $REPO"
sub "1A — worldserver binary"
WS=$(detect_worldserver_bin || true)
if [[ -z "$WS" ]]; then
echo "ERROR: Could not find worldserver. Set FRACTURED_WS_BIN=/full/path/to/worldserver and re-run."
else
echo "Binary: $WS"
if stat -c 'binary mtime: %y' "$WS" 2>/dev/null; then
:
else
stat -f 'binary mtime: %Sm' -t '%Y-%m-%d %H:%M:%S %z' "$WS" 2>/dev/null || stat "$WS"
fi
fi
sub "1B — repo HEAD + Paragon_Essence.cpp mtime"
if [[ -d "$REPO/.git" ]]; then
(cd "$REPO" && git log -1 --format='HEAD commit: %h %ci %s')
else
echo "WARN: not a git repo: $REPO (set FRACTURED_REPO)"
fi
PE="$REPO/modules/mod-paragon/src/Paragon_Essence.cpp"
if [[ -f "$PE" ]]; then
if stat -c 'Paragon_Essence.cpp mtime: %y' "$PE" 2>/dev/null; then
:
else
stat -f 'Paragon_Essence.cpp mtime: %Sm' -t '%Y-%m-%d %H:%M:%S %z' "$PE" 2>/dev/null || stat "$PE"
fi
else
echo "WARN: missing $PE"
fi
sub "1C — strings heuristics (0 can mean stripped binary — use 1A+1B)"
if [[ -n "$WS" && -f "$WS" ]]; then
c1=$(strings "$WS" 2>/dev/null | grep -c 'CLASS_PARAGON' || true)
c2=$(strings "$WS" 2>/dev/null | grep -c 'C BUILD SAVE_CURRENT' || true)
c3=$(strings "$WS" 2>/dev/null | grep -c 'character_paragon_build_share_archive' || true)
echo "CLASS_PARAGON count: $c1"
echo "C BUILD SAVE_CURRENT count: $c2"
echo "character_paragon_build_share_archive count: $c3"
else
echo "(skipped — no binary)"
fi
sub "1D — binary fingerprint (compare sha256 across dev vs VPS)"
if [[ -n "$WS" && -f "$WS" ]]; then
if command -v sha256sum >/dev/null 2>&1; then
sha256sum "$WS"
elif command -v shasum >/dev/null 2>&1; then
shasum -a 256 "$WS"
else
echo "(no sha256sum — install coreutils)"
fi
echo "Embedded revision / version strings (first matches):"
strings "$WS" 2>/dev/null | grep -iE 'azerothcore|revision|git|commit|build.*20[0-9]{2}' | head -25 || echo "(none matched)"
else
echo "(skipped — no binary)"
fi
CONF="${FRACTURED_WORLDSERVER_CONF:-}"
if [[ -z "$CONF" && -n "$WS" ]]; then
CONF=$(guess_worldserver_conf "$WS")
fi
sub "2B — worldserver.conf (updater / source / rates / paragon)"
if [[ -n "$CONF" && -f "$CONF" ]]; then
echo "Using conf: $CONF"
grep -E '^[[:space:]]*(SourceDirectory|Updates\.EnableDatabases|Updates\.AutoSetup)' "$CONF" 2>/dev/null || echo "(no matching lines or unreadable)"
echo "--- Rate.RunicPower (if set) ---"
grep -iE '^Rate\.RunicPower|^[[:space:]]*Rate\.RunicPower' "$CONF" 2>/dev/null || echo "(not set — server uses default)"
echo "--- Paragon.* module options (if any) ---"
grep -iE '^Paragon\.|^[[:space:]]*Paragon\.' "$CONF" 2>/dev/null || echo "(no Paragon.* keys in worldserver.conf — check etc/modules/mod_paragon.conf)"
else
echo "WARN: worldserver.conf not found. Set FRACTURED_WORLDSERVER_CONF=/path/to/worldserver.conf"
fi
if [[ -n "$WS" && -f "$WS" ]]; then
ETCGuess=$(readlink -f "$(dirname "$WS")/../etc" 2>/dev/null || true)
MPC="$ETCGuess/modules/mod_paragon.conf"
if [[ -f "$MPC" ]]; then
sub "2B2 — mod_paragon.conf Paragon.* toggles (non-comment)"
grep -E '^Paragon\.' "$MPC" 2>/dev/null | head -40 || echo "(no uncommented Paragon.* lines)"
fi
fi
sub "2A — path-like strings from binary (candidate source roots)"
if [[ -n "$WS" && -f "$WS" ]]; then
binary_strings_paths "$WS" || true
else
echo "(skipped)"
fi
sub "Resolved source root for 2D"
RESOLVED=""
if [[ -n "$CONF" && -f "$CONF" ]]; then
sd=$(awk -F= '/^[[:space:]]*SourceDirectory[[:space:]]*=/ {
gsub(/^[[:space:]]+|[[:space:]]+$/, "", $2);
gsub(/^["'\'']|["'\'']$/, "", $2);
print $2; exit }' "$CONF" 2>/dev/null || true)
if [[ -n "${sd:-}" ]]; then
RESOLVED="$sd"
fi
fi
if [[ -z "$RESOLVED" ]]; then
RESOLVED="$REPO"
fi
echo "Using RESOLVED=$RESOLVED (from SourceDirectory if set in conf, else FRACTURED_REPO)"
sub "2D — Paragon SQL dirs under RESOLVED"
for subdir in \
"$RESOLVED/modules/mod-paragon/data/sql/db-world/updates/" \
"$RESOLVED/modules/mod-paragon/data/sql/db-characters/updates/"; do
if [[ -d "$subdir" ]]; then
echo "Listing: $subdir"
ls -la "$subdir" 2>/dev/null | tail -15
else
echo "MISSING: $subdir"
fi
done
sub "CMake build dir hints (common Fractured layouts)"
for cand in "$REPO/var/build/obj" "$REPO/build" "$REPO/../build"; do
if [[ -f "$cand/CMakeCache.txt" ]]; then
echo "Found CMakeCache: $cand/CMakeCache.txt"
grep -E '^CMAKE_HOME_DIRECTORY:|^MODULES:|^CMAKE_INSTALL_PREFIX:' "$cand/CMakeCache.txt" 2>/dev/null | head -5
fi
done
sub "DATABASE — updates rows (2026_05_10 / paragon)"
SQL_WORLD=$(cat <<'EOS'
SELECT name, hash, speed FROM updates
WHERE name LIKE '2026_05_10%' OR name LIKE '%paragon%'
ORDER BY name DESC LIMIT 30;
EOS
)
SQL_CHAR="$SQL_WORLD"
if [[ -n "${FRACTURED_MYSQL:-}" ]]; then
echo "--- acore_world ---"
$FRACTURED_MYSQL acore_world -e "$SQL_WORLD" || echo "(mysql failed for acore_world)"
echo "--- acore_characters ---"
$FRACTURED_MYSQL acore_characters -e "$SQL_CHAR" || echo "(mysql failed for acore_characters)"
sub "DATABASE — DBC parity for runes / Paragon (acore_world)"
# Common DK rune spenders (WotLK). Override: export FRACTURED_SPELL_IDS='45477 45462'
SPELL_IDS="${FRACTURED_SPELL_IDS:-45477 45462 49923 55050 56815}"
IDS_CSV=$(echo "$SPELL_IDS" | tr ' ' ',')
echo "--- spell_dbc table size (world DB overrides; 0 rows = all spells from disk DBC only) ---"
$FRACTURED_MYSQL acore_world -e "SELECT COUNT(*) AS spell_dbc_rows FROM spell_dbc;" 2>/dev/null || echo "(spell_dbc missing or no access)"
echo "--- acore_world.version (last core revision written by worldserver) ---"
$FRACTURED_MYSQL acore_world -e "SELECT * FROM version LIMIT 5;" 2>/dev/null || echo "(version table missing?)"
echo "--- chrclasses_dbc class 6 + 12 (DisplayPower: 0=mana, 5=POWER_RUNE in AC) ---"
$FRACTURED_MYSQL acore_world -e "
SELECT ID, DisplayPower, Name_Lang_enUS FROM chrclasses_dbc WHERE ID IN (6,12);
" 2>/dev/null || echo "(query failed — chrclasses_dbc missing?)"
echo "Note: If only ID=12 appears, class 6 (DK) is not overridden in DB — loaded from disk DBC (normal)."
echo "--- spell_dbc: are sample DK spells overridden in DB? ---"
spell_sample_n=$($FRACTURED_MYSQL acore_world -N -B -e \
"SELECT COUNT(*) FROM spell_dbc WHERE ID IN ($IDS_CSV);" 2>/dev/null || echo 0)
echo "Row count in spell_dbc for sample IDs ($SPELL_IDS): ${spell_sample_n:-0}"
if [[ "${spell_sample_n:-0}" == "0" ]]; then
echo "=> 0 means those spells use on-disk Spell.dbc only; the sample block below will be empty (not an error)."
fi
echo "--- spell_dbc sample (PowerType 5 = POWER_RUNE in AC) ---"
$FRACTURED_MYSQL acore_world -e "
SELECT ID, PowerType, ManaCost, RuneCostID FROM spell_dbc WHERE ID IN ($IDS_CSV);
" 2>/dev/null || echo "(query failed — spell_dbc missing or wrong schema)"
echo "--- spellrunecost join for sample IDs (empty if no spell_dbc rows above) ---"
$FRACTURED_MYSQL acore_world -e "
SELECT s.ID AS spell_id, s.PowerType, s.RuneCostID, r.Blood, r.Unholy, r.Frost, r.RunicPower
FROM spell_dbc s
LEFT JOIN spellrunecost_dbc r ON r.ID = s.RuneCostID
WHERE s.ID IN ($IDS_CSV);
" 2>/dev/null || echo "(join failed — check spellrunecost_dbc)"
echo "--- spell_dbc suspicious overrides: RuneCostID>0 but PowerType!=5 (can break rune checks) ---"
$FRACTURED_MYSQL acore_world -e "
SELECT ID, PowerType, ManaCost, RuneCostID FROM spell_dbc
WHERE RuneCostID > 0 AND PowerType <> 5
ORDER BY ID LIMIT 40;
" 2>/dev/null || echo "(query failed)"
echo "Compare counts/IDs to dev: unexpected rows here warrant a DB diff."
echo "--- spell_dbc POWER_RUNE (5) spells with RuneCostID (sample) ---"
$FRACTURED_MYSQL acore_world -e "
SELECT ID, PowerType, RuneCostID FROM spell_dbc
WHERE PowerType = 5 AND RuneCostID > 0
ORDER BY ID LIMIT 15;
" 2>/dev/null || echo "(query failed)"
else
echo "FRACTURED_MYSQL not set — run manually (example: export FRACTURED_MYSQL='mysql -uUSER -hHOST')"
echo "acore_world:"
echo "$SQL_WORLD"
echo "acore_characters:"
echo "$SQL_CHAR"
echo ""
echo "Optional DBC parity (acore_world) — run after connecting:"
echo " SELECT ID, DisplayPower, Name_Lang_enUS FROM chrclasses_dbc WHERE ID IN (6,12);"
echo " SELECT ID, PowerType, ManaCost, RuneCostID FROM spell_dbc WHERE ID IN (45477,45462,49923,55050,56815);"
echo " SELECT s.ID, s.RuneCostID, r.Blood, r.Unholy, r.Frost, r.RunicPower FROM spell_dbc s"
echo " LEFT JOIN spellrunecost_dbc r ON r.ID = s.RuneCostID WHERE s.ID IN (45477,45462,49923,55050,56815);"
fi
sub "mod_paragon.conf vs .dist (install etc)"
ETC=""
if [[ -n "$WS" ]]; then
ETC=$(readlink -f "$(dirname "$WS")/../etc" 2>/dev/null || true)
fi
if [[ -z "$ETC" || ! -d "$ETC" ]]; then
ETC=$(readlink -f "$HOME/azeroth-server/etc" 2>/dev/null || true)
fi
if [[ -n "$ETC" && -d "$ETC/modules" ]]; then
MP="$ETC/modules/mod_paragon.conf"
MPD="$ETC/modules/mod_paragon.conf.dist"
if [[ -f "$MP" && -f "$MPD" ]]; then
diff -u "$MP" "$MPD" 2>/dev/null | head -80 || true
else
echo "ETC=$ETC — mod_paragon.conf or .dist missing (MP=$MP MPD=$MPD)"
fi
else
echo "Could not find install etc/modules (set paths manually for diff)."
fi
hr
echo "DELIVERABLE for maintainer:"
echo "1) Paste 1A-1D (binary mtime, git HEAD, strings, sha256 + revision strings)."
echo "2) Paste DATABASE blocks: updates + DBC parity (chrclasses 12, spell_dbc, spellrunecost join)."
echo "3) Paste 2A path strings + 2D listings (or MISSING lines)."
echo "4) From dev: same 1D sha256 of worldserver OR same SQL block — proves binary/data parity."
echo "5) ONE sentence: exact in-game symptom."
echo "Done."
echo ""
echo "Full transcript: $DIAG_OUT"
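The ExecStart detection above handles both output shapes of `systemctl show UNIT -p ExecStart --value`: the structured `{ path=... ; argv[]=... }` form and a plain command line. A standalone sketch of that parsing (the sample unit outputs are invented):

```shell
# Two invented samples of `systemctl show UNIT -p ExecStart --value` output.
structured='{ path=/opt/fractured/bin/worldserver ; argv[]=/opt/fractured/bin/worldserver ; ignore_errors=no }'
plain='/opt/fractured/bin/worldserver -c /etc/worldserver.conf'

parse_execstart() {
    local es="$1" path
    if [[ "$es" == \{*path=* ]]; then
        # Structured form: everything between "path=" and the next ";",
        # then strip the trailing space systemd prints before the ";".
        path=$(printf '%s' "$es" | sed -n 's/.*path=\([^;]*\).*/\1/p')
        path="${path%"${path##*[![:space:]]}"}"
    else
        # Plain form: first word, minus a possible "path=" prefix.
        path=$(printf '%s' "$es" | awk '{print $1}' | sed 's/^path=//')
    fi
    printf '%s\n' "$path"
}

parse_execstart "$structured"   # prints /opt/fractured/bin/worldserver
parse_execstart "$plain"        # prints /opt/fractured/bin/worldserver
```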
@@ -0,0 +1,40 @@
#!/usr/bin/env bash
# Omit Docker-only paths from the working tree (native VPS / production clones).
# Repository root is the AzerothCore tree (flat layout).
#
# Run from repository root (directory that contains acore.sh and apps/).
#
# Usage:
# bash scripts/vps-sparse-checkout-no-docker.sh
#
# Restore full tree: git sparse-checkout disable
set -euo pipefail
ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")/.." && pwd)"
cd "$ROOT"
if [[ ! -d .git ]]; then
echo "error: run from a git clone (no .git in $ROOT)." >&2
exit 1
fi
git sparse-checkout init --no-cone
cat >.git/info/sparse-checkout <<'EOF'
/*
!/docker-compose.yml
!/docker-compose.override.yml
!/apps/docker/
!/env/docker-focal-build/
!/.devcontainer/
EOF
if git sparse-checkout reapply 2>/dev/null; then
:
else
git read-tree -mu HEAD
fi
echo "Sparse checkout applied (Docker-only paths omitted)."
echo "To restore full tree locally: git sparse-checkout disable"
@@ -0,0 +1,181 @@
#!/usr/bin/env bash
# Fractured / AzerothCore — native VPS rolling update (git + compile).
#
# Run from anywhere; resolves the repository root from this script's location.
# Typical production layout: sources in ~/src/Fractured, install prefix in ~/azeroth-server
# (see docs/DEPLOY_LINUX_VPS.md).
#
# What this does:
# 1. git pull on the current branch (optional; can skip)
# 2. ./acore.sh compiler build — or compiler all for a full clean rebuild
#
# Database migrations from data/sql/updates/ run when you next start worldserver/authserver
# (Updates.* / SourceDirectory in *.conf). This script does not start or stop daemons unless
# you pass --run-after or set FRACTURED_POST_UPDATE_CMD.
#
# Usage:
# bash scripts/vps-update-server.sh
# bash scripts/vps-update-server.sh --full
# bash scripts/vps-update-server.sh --no-pull
# bash scripts/vps-update-server.sh --dry-run
# FRACTURED_POST_UPDATE_CMD='sudo systemctl restart fractured-world' bash scripts/vps-update-server.sh --run-after
# bash scripts/vps-update-server.sh --run-after 'sudo systemctl restart fractured-world'
#
# Environment:
# FRACTURED_GIT_REMOTE — remote name (default: origin)
# FRACTURED_POST_UPDATE_CMD — shell command run after a successful compile (if --run-after is passed without an argument, this is used)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ROOT="$(cd "$SCRIPT_DIR/.." && pwd)"
NO_PULL=0
FULL_BUILD=0
COMPILE_ONLY=0
DRY_RUN=0
DO_RUN_AFTER=0
POST_UPDATE_CMD="${FRACTURED_POST_UPDATE_CMD:-}"
GIT_REMOTE="${FRACTURED_GIT_REMOTE:-origin}"
usage() {
cat <<'EOF'
Fractured VPS update — git pull + compiler (see header in script for full notes).
Usage:
bash scripts/vps-update-server.sh [options]
Options:
--no-pull Skip git pull (only compile current tree).
--full ./acore.sh compiler all (clean + configure + compile).
--compile-only ./acore.sh compiler compile (incremental).
--dry-run Print commands without running them.
--run-after [CMD] Run shell command after successful compile. If CMD is omitted,
uses FRACTURED_POST_UPDATE_CMD from the environment.
Environment:
FRACTURED_GIT_REMOTE Git remote (default: origin).
FRACTURED_POST_UPDATE_CMD Used with bare --run-after.
EOF
}
run() {
if [[ "$DRY_RUN" -eq 1 ]]; then
printf '[dry-run] '
printf '%q ' "$@"
printf '\n'
else
"$@"
fi
}
while [[ $# -gt 0 ]]; do
case "$1" in
-h | --help)
usage
exit 0
;;
--no-pull)
NO_PULL=1
shift
;;
--full)
FULL_BUILD=1
shift
;;
--compile-only)
COMPILE_ONLY=1
shift
;;
--dry-run)
DRY_RUN=1
shift
;;
--run-after)
DO_RUN_AFTER=1
shift
if [[ $# -gt 0 && "$1" != -* ]]; then
POST_UPDATE_CMD="$1"
shift
fi
;;
*)
echo "error: unknown option: $1" >&2
echo "Try: bash scripts/vps-update-server.sh --help" >&2
exit 2
;;
esac
done
if [[ "$FULL_BUILD" -eq 1 && "$COMPILE_ONLY" -eq 1 ]]; then
echo "error: use only one of --full or --compile-only" >&2
exit 2
fi
if [[ ! -d "$ROOT/.git" ]]; then
echo "error: not a git clone: $ROOT" >&2
exit 1
fi
if [[ ! -f "$ROOT/acore.sh" ]]; then
echo "error: acore.sh not found under $ROOT" >&2
exit 1
fi
if [[ ! -f "$ROOT/conf/config.sh" ]]; then
echo "error: missing $ROOT/conf/config.sh — copy conf/dist/config.sh and edit (see DEPLOY_LINUX_VPS.md)." >&2
exit 1
fi
cd "$ROOT"
if [[ "$DO_RUN_AFTER" -eq 1 && -z "${POST_UPDATE_CMD// }" ]]; then
echo "error: --run-after needs a command or FRACTURED_POST_UPDATE_CMD set in the environment." >&2
exit 2
fi
current_branch() {
git symbolic-ref -q --short HEAD || echo HEAD
}
if [[ "$NO_PULL" -eq 0 ]]; then
ref="$(current_branch)"
if [[ "$ref" == "HEAD" ]]; then
echo "error: detached HEAD; checkout a branch or use --no-pull." >&2
exit 1
fi
echo "==> git pull $GIT_REMOTE $ref"
run git pull "$GIT_REMOTE" "$ref"
else
echo "==> skipping git pull (--no-pull)"
fi
echo "==> ensuring acore.sh and JSONPath are executable"
if [[ "$DRY_RUN" -eq 1 ]]; then
run chmod +x acore.sh deps/jsonpath/JSONPath.sh
else
chmod +x acore.sh deps/jsonpath/JSONPath.sh 2>/dev/null || true
fi
if [[ "$FULL_BUILD" -eq 1 ]]; then
echo "==> ./acore.sh compiler all (clean, configure, compile)"
run ./acore.sh compiler all
elif [[ "$COMPILE_ONLY" -eq 1 ]]; then
echo "==> ./acore.sh compiler compile (incremental; build dir must exist)"
run ./acore.sh compiler compile
else
echo "==> ./acore.sh compiler build (configure + compile)"
run ./acore.sh compiler build
fi
if [[ "$DO_RUN_AFTER" -eq 1 ]]; then
echo "==> post-update: $POST_UPDATE_CMD"
if [[ "$DRY_RUN" -eq 1 ]]; then
printf '[dry-run] eval %q\n' "$POST_UPDATE_CMD"
else
# shellcheck disable=SC2086
eval "$POST_UPDATE_CMD"
fi
fi
echo "Done. Restart authserver/worldserver (or your service manager) when ready so new binaries and SQL updates apply."
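In dry-run mode the `run` helper above prints a copy-pasteable command line by `%q`-quoting every argument. A minimal standalone sketch of the same pattern:

```shell
DRY_RUN=1

run() {
    if [ "$DRY_RUN" -eq 1 ]; then
        printf '[dry-run] '
        printf '%q ' "$@"   # shell-quote each argument so the printed line can be re-run verbatim
        printf '\n'
    else
        "$@"                # normal mode: execute as-is
    fi
}

run git pull origin main                 # only echoes the quoted command
run rm -rf "some dir/with spaces"        # the spaced argument comes out quoted, safe to paste back
```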
@@ -53,12 +53,14 @@ MaxPingTime = 30
#
# RealmServerPort
# Description: TCP port the auth server listens on (login handshake).
# Fractured production: match your client realmlist host:port, e.g.
# set realmlist hsrwow.net:47497
# requires RealmServerPort = 47497 and firewall/NAT to this process.
# Default: 3724 (stock WoW); Fractured dist default: 47497
# 3724 is the stock WoW default; clients with `set realmlist <host>`
# (no port) connect here. Production deployments that cannot bind
# 3724 (NAT, conflicting service, etc.) can set this to e.g. 47497
# and have clients use `set realmlist <host>:47497` -- the
# Fractured-patched Wow.exe supports the host:port syntax.
# Default: 3724
RealmServerPort = 47497
RealmServerPort = 3724
#
#
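The client-side counterpart is the single `realmlist.wtf` line the comment describes; the port suffix requires the Fractured-patched `Wow.exe`, and is omitted entirely when the server listens on the stock 3724. A sketch (hostname taken from the old comment):

```
set realmlist hsrwow.net:47497
```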
@@ -669,7 +669,12 @@ bool AuctionHouseUsablePlayerInfo::PlayerCanUseItem(ItemTemplate const* proto) c
return false;
}
if ((proto->AllowableClass & classMask) == 0 || (proto->AllowableRace & raceMask) == 0)
// mod-paragon: class 12 (Paragon) ignores AllowableClass for AH "Usable"
// filter. classMask here is the searching player's mask; PARAGON_BIT 0x800
// = (1 << (12 - 1)). Race restriction still applies.
bool const searcherIsParagon = (classMask & 0x800u) != 0;
if ((!searcherIsParagon && (proto->AllowableClass & classMask) == 0)
|| (proto->AllowableRace & raceMask) == 0)
return false;
if (proto->RequiredSkill != 0)
@@ -10687,7 +10687,12 @@ bool Player::BuyItemFromVendorSlot(ObjectGuid vendorguid, uint32 vendorslot, uin
return false;
}
if (!(pProto->AllowableClass & getClassMask()) && pProto->Bonding == BIND_WHEN_PICKED_UP && !IsGameMaster())
// mod-paragon: class 12 ignores BoP buy-side AllowableClass gate, so
// class-restricted vendor items (e.g. class glyphs) can be purchased.
if (getClass() != CLASS_PARAGON
&& !(pProto->AllowableClass & getClassMask())
&& pProto->Bonding == BIND_WHEN_PICKED_UP
&& !IsGameMaster())
{
SendBuyError(BUY_ERR_CANT_FIND_ITEM, nullptr, item, 0);
return false;
@@ -12012,6 +12017,28 @@ void Player::learnSkillRewardedSpells(uint32 skill_id, uint32 skill_value)
uint32 raceMask = getRaceMask();
uint32 classMask = getClassMask();
// Fractured / Paragon: the Character Advancement panel is the sole
// authority over which class abilities a Paragon owns. The skill-line
// cascade re-fires from _LoadSkills (every login), UpdateSkillsForLevel
// (every level-up), UpdateSkillPro (every weapon-skill tick on a
// training dummy), and SetSkill (first time a class skill is granted).
// Each of those re-grants every SLA-tagged class ability on the
// matching skill line — leaking Blood Presence / Death Coil / Death
// Grip / etc. back into the spellbook within seconds even after the
// player intentionally refunded them via the panel. Skip the cascade
// for class-category skill lines on Paragon characters; mod-paragon
// calls Player::learnSpell directly for the abilities the player
// actually purchased, including their attached passives. Profession,
// weapon, language, and racial skill cascades stay enabled so things
// like recipe auto-learn, weapon proficiencies, and racial perks
// still work.
if (getClass() == CLASS_PARAGON)
{
if (SkillLineEntry const* sl = sSkillLineStore.LookupEntry(skill_id))
if (sl->categoryId == SKILL_CATEGORY_CLASS)
return;
}
// Get all abilities for this skill and sort by MinSkillLineRank (lowest to highest)
auto abilities = GetSkillLineAbilitiesBySkillLine(skill_id);
std::vector<SkillLineAbilityEntry const*> sortedAbilities(abilities.begin(), abilities.end());
@@ -14084,7 +14111,12 @@ void Player::LearnTalent(uint32 talentId, uint32 talentRank, bool command /*= fa
}
// xinef: check if talent deponds on another talent
if (talentInfo->DependsOn > 0)
// mod-paragon: Character Advancement gates talents by AE/TE essence cost,
// not by the column-arrow prereq from Blizzard's spec UI. For class 12
// (Paragon) we skip the DependsOn check so e.g. Deep Wounds, Bloody
// Vengeance and Expose Weakness can be picked without first speccing into
// their unrelated prereq sibling.
if (talentInfo->DependsOn > 0 && getClass() != CLASS_PARAGON)
if (TalentEntry const* depTalentInfo = sTalentStore.LookupEntry(talentInfo->DependsOn))
{
bool hasEnoughRank = false;
@@ -2364,7 +2364,16 @@ InventoryResult Player::CanUseItem(ItemTemplate const* proto) const
return EQUIP_ERR_YOU_CAN_NEVER_USE_THAT_ITEM;
}
if ((proto->AllowableClass & getClassMask()) == 0 || (proto->AllowableRace & getRaceMask()) == 0)
// mod-paragon: class 12 (Paragon) ignores AllowableClass entirely, so any
// class-restricted item (including class glyphs) can be equipped/used.
// Race restriction still applies; proficiency/level/skill checks below
// still gate it sensibly via the standard skill cascade.
if (getClass() != CLASS_PARAGON
&& (proto->AllowableClass & getClassMask()) == 0)
{
return EQUIP_ERR_YOU_CAN_NEVER_USE_THAT_ITEM;
}
if ((proto->AllowableRace & getRaceMask()) == 0)
{
return EQUIP_ERR_YOU_CAN_NEVER_USE_THAT_ITEM;
}
@@ -2430,7 +2439,11 @@ InventoryResult Player::CanRollForItemInLFG(ItemTemplate const* proto, WorldObje
SKILL_FISHING
}; //Copy from function Item::GetSkill()
if ((proto->AllowableClass & getClassMask()) == 0 || (proto->AllowableRace & getRaceMask()) == 0)
// mod-paragon: class 12 ignores AllowableClass for LFG roll eligibility.
if (getClass() != CLASS_PARAGON
&& (proto->AllowableClass & getClassMask()) == 0)
return EQUIP_ERR_YOU_CAN_NEVER_USE_THAT_ITEM;
if ((proto->AllowableRace & getRaceMask()) == 0)
return EQUIP_ERR_YOU_CAN_NEVER_USE_THAT_ITEM;
if (proto->RequiredSpell != 0 && !HasSpell(proto->RequiredSpell))
@@ -385,6 +385,13 @@ void Player::UpdateAttackPowerAndDamage(bool ranged)
break;
}
}
else if (getClass() == CLASS_PARAGON)
{
// Fractured class 12: same hybrid curve as requested for Paragon UI
// (level*2 + AGI + STR - 20). Implemented in core so we do not rely
// on PlayerScript hooks in this hot path.
val2 = level * 2.0f + GetStat(STAT_AGILITY) + GetStat(STAT_STRENGTH) - 20.0f;
}
else
{
val2 = GetStat(STAT_AGILITY) - 10.0f;
@@ -481,6 +488,10 @@ void Player::UpdateAttackPowerAndDamage(bool ranged)
break;
}
}
else if (getClass() == CLASS_PARAGON)
{
val2 = level * 2.0f + GetStat(STAT_STRENGTH) + GetStat(STAT_AGILITY) - 20.0f;
}
else if (IsClass(CLASS_MAGE, CLASS_CONTEXT_STATS) || IsClass(CLASS_PRIEST, CLASS_CONTEXT_STATS) || IsClass(CLASS_WARLOCK, CLASS_CONTEXT_STATS))
{
val2 = GetStat(STAT_STRENGTH) - 10.0f;
@@ -9046,6 +9046,21 @@ int32 Unit::SpellBaseDamageBonusDone(SpellSchoolMask schoolMask)
DoneAdvertisedBenefit += ToPlayer()->GetBaseSpellPowerBonus();
DoneAdvertisedBenefit += ToPlayer()->GetBaseSpellDamageBonus();
// Fractured class 12 (Paragon) intrinsic spell power:
// SP = level*2 + INT + SPI - 20 (clamped at 0)
// Read live from current stats so character-sheet refreshes (via
// UpdateSpellDamageAndHealingBonus) and live spell casts both see the
// up-to-date value with no script hooks or m_baseSpellPower mutation.
if (ToPlayer()->getClass() == CLASS_PARAGON)
{
int32 paragonSP = int32(GetLevel()) * 2
+ int32(GetStat(STAT_INTELLECT))
+ int32(GetStat(STAT_SPIRIT))
- 20;
if (paragonSP > 0)
DoneAdvertisedBenefit += paragonSP;
}
// Damage bonus from stats
AuraEffectList const& mDamageDoneOfStatPercent = GetAuraEffectsByType(SPELL_AURA_MOD_SPELL_DAMAGE_OF_STAT_PERCENT);
for (AuraEffectList::const_iterator i = mDamageDoneOfStatPercent.begin(); i != mDamageDoneOfStatPercent.end(); ++i)
@@ -9803,6 +9818,20 @@ int32 Unit::SpellBaseHealingBonusDone(SpellSchoolMask schoolMask)
AdvertisedBenefit += ToPlayer()->GetBaseSpellPowerBonus();
AdvertisedBenefit += ToPlayer()->GetBaseSpellHealingBonus();
// Fractured class 12 (Paragon) intrinsic spell power: same level*2 +
// INT + SPI - 20 floor as on the damage side (the character sheet
// shows a single Spell Power value, so both sides must add the same
// bonus).
if (ToPlayer()->getClass() == CLASS_PARAGON)
{
int32 paragonSP = int32(GetLevel()) * 2
+ int32(GetStat(STAT_INTELLECT))
+ int32(GetStat(STAT_SPIRIT))
- 20;
if (paragonSP > 0)
AdvertisedBenefit += paragonSP;
}
// Healing bonus from stats
AuraEffectList const& mHealingDoneOfStatPercent = GetAuraEffectsByType(SPELL_AURA_MOD_SPELL_HEALING_OF_STAT_PERCENT);
for (AuraEffectList::const_iterator i = mHealingDoneOfStatPercent.begin(); i != mHealingDoneOfStatPercent.end(); ++i)
@@ -908,7 +908,12 @@ void WorldSession::SendListInventory(ObjectGuid vendorGuid, uint32 vendorEntry)
{
if (ItemTemplate const* itemTemplate = sObjectMgr->GetItemTemplate(item->item))
{
if (!(itemTemplate->AllowableClass & _player->getClassMask()) && itemTemplate->Bonding == BIND_WHEN_PICKED_UP && !_player->IsGameMaster())
// mod-paragon: class 12 sees every BoP class-restricted item
// in vendor lists (class glyphs, class tier sets, ...).
if (_player->getClass() != CLASS_PARAGON
&& !(itemTemplate->AllowableClass & _player->getClassMask())
&& itemTemplate->Bonding == BIND_WHEN_PICKED_UP
&& !_player->IsGameMaster())
{
continue;
}
@@ -7296,9 +7296,17 @@ SpellCastResult Spell::CheckItems(uint32* param1, uint32* param2)
{
// Xinef: this is not true in my opinion, in eg bladestorm will not be canceled after disarm
//if (!HasTriggeredCastFlag(TRIGGERED_IGNORE_EQUIPPED_ITEM_REQUIREMENT))
if (m_caster->IsPlayer() && !m_caster->ToPlayer()->HasItemFitToSpellRequirements(m_spellInfo))
if (m_caster->IsPlayer())
{
// Cast-from-glyph: many glyph on-use spells set EquippedItemClass to ITEM_CLASS_GLYPH.
// HasItemFitToSpellRequirements only implements weapon/armor, so it would always fail here
// even though the glyph item in the bag is the valid spell source.
bool const castFromGlyphScroll = m_CastItem && m_CastItem->GetTemplate() &&
m_CastItem->GetTemplate()->Class == ITEM_CLASS_GLYPH;
if (!castFromGlyphScroll && !m_caster->ToPlayer()->HasItemFitToSpellRequirements(m_spellInfo))
return SPELL_FAILED_EQUIPPED_ITEM_CLASS;
}
}
// do not take reagents for these item casts
if (!(m_CastItem && m_CastItem->GetTemplate()->HasFlag(ITEM_FLAG_NO_REAGENT_COST)))
@@ -5368,6 +5368,56 @@ void SpellMgr::LoadSpellInfoCorrections()
LockEntry* key = const_cast<LockEntry*>(sLockStore.LookupEntry(36)); // 3366 Opening, allows to open without proper key
key->Type[2] = LOCK_KEY_NONE;
// Fractured / Paragon: DK weapon-line "passives" Forceful Deflection and
// Runic Focus ship in 3.3.5a Spell.dbc without SPELL_ATTR0_PASSIVE set.
// SpellInfo::IsPassive() is therefore false, and mod-paragon's panel-learn
// diff treats them as castable actives and revokes them — while true
// actives (Blood Presence, Death Coil, Death Grip, ...) must stay
// stripped. Mark these two passive in-memory so the panel policy matches
// the spellbook UX for every class (stock DK benefits too).
ApplySpellFix({ 49410, 61455 }, [](SpellInfo* spellInfo)
{
spellInfo->Attributes |= SPELL_ATTR0_PASSIVE;
});
// Fractured: strip reagent requirements from every player-class spell at
// load time. Filtered by SpellFamilyName != 0 so that profession spells
// (cooking, alchemy, enchanting, blacksmithing, jewelcrafting, leatherworking,
// tailoring, engineering, inscription, mining, herbalism, skinning, fishing,
// first aid — all SpellFamilyName == SPELLFAMILY_GENERIC == 0) keep their
// mats and only the class abilities that asked for ankhs / candles / soul
// shards / verdant spheres / etc. cast freely. Done here in core spell
// data rather than as a runtime bypass in Spell::CheckItems / TakeReagents
// so the change is data-driven (the in-memory SpellInfo simply has no
// reagents to require). The client-side preflight is mirrored by the
// matching Spell.dbc patch shipped via patch-enUS-4.MPQ
// (fractured-tooling/_patch_spell_dbc_reagents.py).
{
uint32 fixedClassSpells = 0;
for (uint32 spellId = 1; spellId < sSpellMgr->GetSpellInfoStoreSize(); ++spellId)
{
SpellInfo const* info = sSpellMgr->GetSpellInfo(spellId);
if (!info || info->SpellFamilyName == 0)
continue;
bool hadAny = false;
for (uint32 i = 0; i < MAX_SPELL_REAGENTS; ++i)
if (info->Reagent[i] != 0 || info->ReagentCount[i] != 0)
{ hadAny = true; break; }
if (!hadAny)
continue;
SpellInfo* mut = const_cast<SpellInfo*>(info);
for (uint32 i = 0; i < MAX_SPELL_REAGENTS; ++i)
{
mut->Reagent[i] = 0;
mut->ReagentCount[i] = 0;
}
++fixedClassSpells;
}
LOG_INFO("server.loading", ">> Fractured: cleared reagents on {} class spells", fixedClassSpells);
}
LOG_INFO("server.loading", ">> Loading spell dbc data corrections in {} ms", GetMSTimeDiffToNow(oldMSTime));
LOG_INFO("server.loading", " ");
}
@@ -215,6 +215,14 @@ Updates.AllowRehash = 1
# -1 - (Enabled - unlimited)
Updates.CleanDeadRefMaxCount = 3
#
# Updates.ExceptionShutdownDelay
# Description: Time (in milliseconds) to wait before shutting down after a fatal exception (e.g. failed SQL update).
# Default: 10000 - 10 seconds
# 0 - Disabled (immediate shutdown)
Updates.ExceptionShutdownDelay = 10000
###################################################################################################
###################################################################################################
@@ -0,0 +1,5 @@
node_modules/
dist/
launcher.json
.DS_Store
Thumbs.db
@@ -0,0 +1,173 @@
# Fractured Launcher (Electron)
**Windows** and **Linux (AppImage)** launcher with **no extra console window**, a **native Browse folder** dialog, **Gitea or GitHub** release assets plus GitHub repo file sync, **realmlist** management, optional **auth**, a **Play** button, and **auto-update** (via `electron-updater`). This is the **only** supported client launcher in this repo.
## Requirements
- [Node.js](https://nodejs.org/) 20+ (includes npm)
## Run from source
```bash
cd tools/fractured-launcher-electron
npm install
npm start
```
On first run, **`launcher.json`** is created in a platform-specific location:
- **dev** — next to the app in this folder.
- **Windows packaged** — beside the `.exe`.
- **Linux AppImage / macOS packaged** — under Electron **`app.getPath('userData')`** (typically under **`~/.config/`**, folder name taken from the app; the AppImage mount is read-only, so the config cannot live beside the binary).
### Where patches download from
- **Recommended (self-hosted Gitea):** set **`gitea.base_url`**, **`gitea.owner`**, **`gitea.repo`** in `launcher.json` (see **`default-launcher.json`**). Players need **`GITEA_TOKEN`** (or the env name in **`gitea.token_env`**) if the Gitea repo is **private** — same trade-off as any private host (per-player token, SSO proxy, or a read-only deploy token you accept distributing).
- **Fallback:** if **`gitea.base_url`** is empty, **`from_release`** uses the **GitHub** Releases API against **`github.owner` / `github.repo`** (defaults to this **`Fractured`** repo for non-release paths), with optional **`GITHUB_TOKEN`** for private assets.
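Taken together, a minimal `launcher.json` for the recommended self-hosted Gitea path might look like the sketch below. Key names come from the text above; the values and the exact schema are illustrative, and `default-launcher.json` is authoritative:

```json
{
  "gitea": {
    "base_url": "https://git.example.com",
    "owner": "Dawnforger",
    "repo": "Fractured",
    "token_env": "GITEA_TOKEN"
  },
  "github": {
    "owner": "Dawnforger",
    "repo": "Fractured"
  }
}
```

With `gitea.base_url` left empty, the `github` block is used as the fallback described above.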
## Build Windows installers
```bash
npm install
npm run pack:win
```
Produces under **`dist/`**:
| Artifact | Purpose |
|----------|---------|
| `Fractured-Launcher-${version}-Setup.exe` (NSIS) | **Recommended for players** — supports seamless **auto-update** and restart. |
| `Fractured-Launcher-${version}-Windows-Portable.exe` | No installer; players replace the file manually. Auto-update is **less reliable** than NSIS. |
## Build Linux AppImage
```bash
cd tools/fractured-launcher-electron
npm install
npm run pack:linux
```
Produces **`dist/Fractured-Launcher-${version}-Linux-x86_64.AppImage`**. Same **`lib/baked-gitea-channel.js`** and **`default-launcher.json`** as Windows; build on **Linux** (or use the **Fractured launcher CI** / **Sync release to Gitea** workflows, which upload this file to Gitea alongside the Windows installers).
**Quick local test (avoids tag snapshot / CI):**
- **Linux:** from repo root, **`bash tools/fractured-launcher-electron/scripts/manual-pack-linux.sh`** → **`dist/*.AppImage`**.
- **Windows:** on a Windows machine, **`cd tools/fractured-launcher-electron`**, **`npm ci`**, **`npm run pack:win`** → **`dist/*.exe`**.
### Hardcoded Gitea channel (non-token)
**`lib/baked-gitea-channel.js`** exports **`base_url`**, **`owner`**, **`repo`**, **`release_tag`**. Set those strings once in the repo (same values you use for CI upload — not secret). At runtime **`config-store`** merges them into **`gitea.*`** so **`launcher.json`** does not need those fields; **`GITEA_TOKEN`** (or **`gitea.token_env`**) is still only for **private** Gitea. Leave a field **`''`** in the baked file to fall back to **`default-launcher.json`** / user **`launcher.json`** for that key.
**`npm run pack:win`** is plain **electron-builder** — no inject step, no extra JSON beside the app.
## Auto-update behaviour
- **Packaged** builds only (`npm run pack:win` output). In `npm start` dev mode, update checks are skipped (button still explains that).
- **No implicit GitHub feed:** the app no longer derives an update feed from the `repository` field in `package.json`. Without configuration you get a clear “skipped” message instead of a **404** on a private repo.
- **Configured feeds** (first match wins): **`update_feed_url` / `LAUNCHER_UPDATE_URL`** (generic `latest.yml`); or **`gitea`** block filled in + **`GITEA_TOKEN`** when the instance is private (resolves `…/releases/download/{tag}/`); or **`GITHUB_TOKEN`** + **`github.owner` / `github.repo`** for **private** GitHub releases only.
- When a feed is configured, the app checks **~5 seconds** after launch and then **every 6 hours**.
- When a download finishes, a dialog offers **Restart now** (calls `quitAndInstall`) or **Later**.
- **Manual check:** button **Check launcher updates** in the UI.
### Where launcher updates are hosted
**`npm run publish:win`** runs **`electron-builder` with `--publish never`** — artifacts stay in **`dist/`**; CI uploads them to Gitea when you **publish a GitHub release**. For ad-hoc uploads, use **`scripts/upload-release-to-gitea.sh`**. For launcher auto-update, prefer:
- Set **`update_feed_url`** (or **`LAUNCHER_UPDATE_URL`**) to a **generic** HTTPS base URL where **`latest.yml`** and the installer files are hosted (often the same Gitea release attachment URLs pattern your reverse proxy exposes), **or**
- Keep publishing to a GitHub release only for **`latest.yml`** + installers if you accept that small metadata/binary channel there.
**Private GitHub** updater: set **`GH_TOKEN`** / **`GITHUB_TOKEN`** / **`github.token_env`**; `lib/auto-update.js` documents the exact behaviour.
**Generic feed:** optional Bearer token via the same token envs if your static host checks `Authorization`.
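Whatever hosts the generic feed, electron-updater resolves the new version from a `latest.yml` sitting beside the installers. As an illustration of what that file advertises, here is a hypothetical helper (not part of the launcher) that pulls the `version` field out of a `latest.yml` payload:

```javascript
'use strict';
// Hypothetical helper: extract the advertised version from an
// electron-updater generic-feed latest.yml payload. electron-updater
// parses this itself; this only illustrates what the feed serves.
function parseLatestYmlVersion(yml) {
  const m = /^version:\s*(\S+)/m.exec(String(yml || ''));
  return m ? m[1] : null;
}

module.exports = { parseLatestYmlVersion };
```

`latest.yml` also lists each installer file with its sha512 and size, which is why the installers must be reachable under the same base URL as the yml itself.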
### Publishing a new launcher version
1. Bump **`version`** in `package.json` on `main` (or your release branch) and merge.
2. Create a **GitHub release** (tag + attach patches / `Wow.exe` if needed) and click **Publish**. **Sync release to Gitea** then builds **Windows + Linux** launcher artifacts and mirrors everything to Gitea.
3. Local check: **`npm run pack:win`** (on Windows) or **`npm run pack:linux`** / **`scripts/manual-pack-linux.sh`**, then **`scripts/upload-release-to-gitea.sh`** with the same **`GITEA_*`** env vars as CI if you need a manual upload.
## Sync to Gitea (patches + launcher binaries)
CI workflow **Sync release to Gitea** (`.github/workflows/gitea-release-sync.yml`) runs on **every published GitHub release** on this repo:
1. Triggers on **release published** on **`Dawnforger/Fractured`** (or **workflow_dispatch** with a tag).
2. Builds **Windows** (NSIS + portable) and **Linux** (AppImage) in parallel, each using **`tools/fractured-launcher-electron` from the default branch** (overlaid onto the tag checkout), so older release tags never ship a launcher missing new **`lib/*.js`** files.
3. Downloads **all assets** attached to that **GitHub** release (MPQs, patched `Wow.exe`, etc.).
4. Merges with the built launcher artifacts and uploads everything to a **Gitea release** with the **same tag** (existing attachments on that Gitea release are replaced).
**GitHub Actions secrets** (repository → Settings → Secrets and variables → Actions):
| Secret | Example |
|--------|---------|
| **`GITEA_BASE_URL`** | `https://git.yourdomain.com` (no trailing slash) |
| **`GITEA_TOKEN`** | Gitea personal access token with permission to manage releases and attachments on the target repo |
| **`GITEA_OWNER`** | Organization or username on Gitea |
| **`GITEA_REPO`** | Repository name — must already have **at least one commit** (Gitea returns HTTP 422 “repo is empty” for zero-commit repos; push e.g. a README on **`main`** or set **`GITEA_TARGET_REF`** to your default branch) |
**Optional variable** (Settings → Variables): **`GITEA_TARGET_REF`** — default branch/commitish used **only when the workflow must create a new Gitea release** and Gitea needs `target_commitish` (defaults to **`main`** in the upload script if unset).
**Player `launcher.json`:** packaged builds should already include **`gitea.base_url` / `owner` / `repo`** from the bake step above. Players only need to set **`GITEA_TOKEN`** (or your **`token_env`**) if the Gitea repo is **private**. To point at another instance, edit **`gitea`** in **`launcher.json`**:
```json
"gitea": {
"base_url": "https://git.yourdomain.com",
"owner": "myorg",
"repo": "fractured-patches",
"release_tag": "latest",
"token_env": "GITEA_TOKEN"
}
```
**Manual upload:** `bash scripts/upload-release-to-gitea.sh /path/to/files v1.0.0` with the same env vars as CI.
### Sync did not run / Gitea unchanged — checklist
1. **Git tag ≠ GitHub Release** — Only **Releases** (published on the GitHub **Releases** page) trigger this workflow. If your teammate only **`git push --tags`**, create a **Release** from that tag and click **Publish** (or run **Actions → Sync release to Gitea → Run workflow** and enter the tag).
2. **Manual run: tag vs title** — **Run workflow** must receive the **git tag** (e.g. `v0.7.11-paragon-…`), copied from the release page's tag badge. Pasting the **release title** (long line with spaces/parentheses) breaks `git fetch` with `invalid refspec`.
3. **Draft release** — Must click **Publish release**; drafts do not mirror.
4. **Workflow on default branch** — GitHub runs `release` workflows from the **default branch** (e.g. `main`). Ensure `.github/workflows/gitea-release-sync.yml` is merged there.
5. **Repo name guard** — Jobs use `if: github.repository == 'Dawnforger/Fractured'`. Forks or renames must change that line or runs are skipped.
6. **Secrets** — **`GITEA_BASE_URL`**, **`GITEA_TOKEN`**, **`GITEA_OWNER`**, **`GITEA_REPO`** must be set under **Settings → Secrets and variables → Actions**. A failed “Upload to Gitea” step usually prints which is missing.
7. **Actions tab** — Open the latest **Sync release to Gitea** run; a red **build-electron** (old tag without `package-lock.json`, etc.) or **Upload to Gitea** step shows the real error.
8. **HTTP 422 `repo is empty`** — The Gitea repo has **no commits** yet. Push any initial commit (e.g. **Add README** in the Gitea web UI, or `git push` to **`main`**). Optionally set **`GITEA_TARGET_REF`** to match your real default branch if it is not **`main`**. From this repo you can run **`scripts/bootstrap-gitea-repo.sh`** (see script header for `GITEA_*` env or pass the HTTPS/SSH clone URL as the first argument).
9. **`sync Wow.exe: fetch failed`** — Often **HTTPS/TLS** to Gitea; use **`http://…`** in **`lib/baked-gitea-channel.js`** if you only serve plain HTTP, or fix certs / **`NODE_EXTRA_CA_CERTS`**. Ensure **`Wow-patched.exe`** exists on the release (**`release_tag`**: `latest` vs pinned). Errors include the failing URL when possible.
10. **Wine + Windows portable** — If the folder picker returns **`/home/...`**, the launcher maps it to **`Z:\home\...`** (Wine's Unix root). **`Wow.exe`** is matched case-insensitively for Linux-backed folders. Re-save the WoW folder after upgrading if validation still fails.
### Private Gitea token for players
Do **not** embed a shared admin PAT in a shipped `launcher.json`. Prefer read-only tokens scoped to one repo, short-lived tokens, or a small auth service that redirects to signed URLs.
**Release asset names** must match **`files[].source`** when **`from_release`**: true. Use **`release_tag`**: `"latest"` or a pinned tag matching both GitHub and Gitea.
## Patch versions (same filenames, different bytes)
The launcher does **not** read Git commits. For **turn-key** updates when asset names stay fixed (e.g. **`Wow-patched.exe`** — add more **`files`** entries for any extra MPQs you ship):
1. Ship **`patch-manifest.json`** next to those files on **every** release (Gitea/GitHub attachment). It lists a **`version`** label (any string you bump per release, e.g. `v0.9.0-client`) and a **`sha256`** per **`files[].source`** name.
2. With **`patch_manifest.enabled`**: true (default in **`default-launcher.json`**), **Download updates** first fetches the manifest from the same release channel. If the files already on disk match those checksums, the player sees **“already match build … (nothing to download)”** — no redundant downloads.
3. After a real download, the launcher **re-hashes** installed files and compares to the manifest; mismatch → clear error. It also writes **`.fractured/patch-state.json`** under the WoW folder so the UI can show **“Installed client files: …”**.
If **`patch-manifest.json`** is missing on a release, the launcher falls back to **always downloading** all configured files (same as before).
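For reference, a manifest for the example above might look like this. The exact key layout is defined by `scripts/generate-patch-manifest.js`; the shape below is an illustrative assumption — only the `version` label and one `sha256` per `files[].source` name are guaranteed by the description above:

```json
{
  "version": "v0.9.0-client",
  "files": {
    "Wow-patched.exe": {
      "sha256": "<sha256 of Wow-patched.exe>"
    }
  }
}
```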
**Generate the manifest** when you cut a release (paths are your local patch binaries):
```bash
cd /path/to/staging
node tools/fractured-launcher-electron/scripts/generate-patch-manifest.js v0.9.0-client Wow-patched.exe > patch-manifest.json
```
Attach **`patch-manifest.json`** together with the MPQ/exe to the GitHub release (CI sync copies it to Gitea with everything else).
## CI
Workflow **Fractured launcher CI** (`.github/workflows/fractured-launcher-ci.yml`) runs on pushes/PRs under `tools/fractured-launcher-electron/`: **Windows** (`npm run pack:win`) and **Linux** (`npm run pack:linux`) jobs, each **`electron-builder … --publish never`**. **Actions → Fractured launcher CI → Run workflow** runs it manually.
**Sync release to Gitea** (`.github/workflows/gitea-release-sync.yml`) uses the same pack commands. If you see `GH_TOKEN` / `GitHubPublisher` errors in logs, the job is almost certainly an old **Re-run failed jobs** — open **Actions → Sync release to Gitea → Run workflow**, enter the tag, and start a **new** run instead.
## Config
Schema is defined by **`default-launcher.json`** (shipped in the app; first run copies to **`launcher.json`** — beside the **Windows** exe, or under **`userData`** on **Linux/macOS** packaged builds):
- **`game_dir`**: WoW 3.3.5a root (contains `Wow.exe`).
- **`update_feed_url`**: optional generic HTTPS base for launcher auto-update.
- **`launcher_updates_from_github`**: default **`false`**. Only when **`true`** will a **`GITHUB_TOKEN`** (or **`github.token_env`**) enable **electron-updater**'s GitHub provider against **`github.owner` / `github.repo`**. Leave it **`false`** when launcher binaries and **`latest.yml`** live on **Gitea** (use **`gitea`** + token instead) so a stray GitHub token does not produce “No published versions on GitHub”.
- **`gitea`**: **`base_url`**, **`owner`**, **`repo`**, **`release_tag`**, **`token_env`** — when **`base_url`**, **`owner`**, and **`repo`** are set, both **`from_release`** downloads and the **generic** updater feed use **Gitea** (with a token where needed). **Required** for players if your CI mirrors patches/launchers to Gitea only.
- **`github`**: used for **non-release** repo paths (`from_release`: false) and for **GitHub** **`from_release`** when **`gitea.base_url`** is empty.
- **`patch_manifest`**: **`enabled`**, **`source`** (default `patch-manifest.json`), **`from_release`** — checksum-based skip + verify (see above).
- **`files`**: default **`[]`**. **Download updates** resolves what to pull in order: (**1**) non-empty **`files`** if you set explicit **`source`** → **`dest`** pairs; (**2**) else each key in **`patch-manifest.json`** on the release (recommended); (**3**) else release attachments except launcher artifacts (`Fractured-Launcher*`, `*.blockmap`, `latest*.yml`, `.AppImage`, `patch-manifest.json`): **`.MPQ`** → **`Data/enUS/<name>.MPQ`** (extension forced to **`.MPQ`** caps for client compatibility), one **`.exe`** → **`launch.exe`**. Multiple `.exe` attachments require a manifest. Legacy **`Wow-patched.exe`** entries are removed when merging **`launcher.json`**.
- **`realmlist`**, **`auth`**, **`launch`**.
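The three-step resolution above can be sketched as follows. Function and parameter names are ours for illustration; the real logic lives in the launcher's sync code:

```javascript
'use strict';
// Illustrative sketch of the documented download-resolution order:
// 1) explicit cfg.files entries, 2) patch-manifest.json keys,
// 3) release attachments minus launcher artifacts.
const LAUNCHER_ARTIFACT =
  /^Fractured-Launcher|\.blockmap$|^latest.*\.yml$|\.AppImage$|^patch-manifest\.json$/i;

function resolveSources(cfg, manifestKeys, attachmentNames) {
  if (Array.isArray(cfg.files) && cfg.files.length) {
    return cfg.files.map((f) => f.source); // explicit source → dest pairs win
  }
  if (Array.isArray(manifestKeys) && manifestKeys.length) {
    return manifestKeys.slice(); // manifest-driven (recommended)
  }
  return (attachmentNames || []).filter((n) => !LAUNCHER_ARTIFACT.test(n));
}

module.exports = { resolveSources };
```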
@@ -0,0 +1,42 @@
{
"game_dir": "",
"update_feed_url": "",
"launcher_updates_from_github": false,
"gitea": {
"base_url": "",
"owner": "",
"repo": "",
"release_tag": "latest",
"token_env": "GITEA_TOKEN"
},
"github": {
"owner": "Dawnforger",
"repo": "Fractured",
"ref": "main",
"release_tag": "latest",
"token_env": "GITHUB_TOKEN"
},
"patch_manifest": {
"enabled": true,
"source": "patch-manifest.json",
"from_release": true
},
"files": [],
"realmlist": {
"enabled": true,
"line": "set realmlist fracturedwow.ddns.net:47497",
"paths": ["Data/enUS/realmlist.wtf", "Data/enGB/realmlist.wtf"]
},
"auth": {
"enabled": false,
"url": "https://auth.your-realm.example/api/launcher/login",
"method": "POST",
"username_field": "username",
"password_field": "password"
},
"launch": {
"exe": "Wow.exe",
"args": [],
"linux_wrapper": ["wine"]
}
}
@@ -0,0 +1,46 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta http-equiv="Content-Security-Policy" content="default-src 'self'; script-src 'self'; style-src 'self' 'unsafe-inline'" />
<title>Fractured Launcher</title>
<link rel="stylesheet" href="styles.css" />
</head>
<body>
<header>
<h1>Fractured Launcher</h1>
<p class="sub">Point at your 3.3.5a client, download patches, then play.</p>
</header>
<section class="card">
<label class="lbl">World of Warcraft folder (contains <span id="wowExeName">Wow.exe</span>)</label>
<div class="row">
<input type="text" id="gameDir" placeholder="Browse… or paste the folder that contains Wow.exe" />
<button type="button" id="btnBrowse">Browse…</button>
<button type="button" id="btnSaveFolder" class="primary">Save folder</button>
</div>
</section>
<section class="card hidden" id="authCard">
<label class="lbl">Account</label>
<div class="row stack">
<input type="text" id="username" autocomplete="username" placeholder="Username" />
<input type="password" id="password" autocomplete="current-password" placeholder="Password" />
<button type="button" id="btnAuth" class="primary">Sign in</button>
</div>
</section>
<section class="card row-actions">
<button type="button" id="btnCheckLauncher" class="ghost">Check launcher updates</button>
</section>
<section class="card">
<button type="button" id="btnSync" class="primary wide" disabled>Download updates</button>
<button type="button" id="btnPlay" class="success wide hidden" disabled>Play</button>
</section>
<pre id="log" class="log" aria-live="polite"></pre>
<script src="renderer.js"></script>
</body>
</html>
@@ -0,0 +1,148 @@
'use strict';
const { dialog } = require('electron');
const { autoUpdater } = require('electron-updater');
const { useGiteaReleases, getGiteaUpdaterFeedBase } = require('./gitea-release');
/**
* @param {import('electron').App} app
* @param {() => import('electron').BrowserWindow | null} getMainWindow
* @param {{
* updateFeedUrl?: string,
* githubOwner?: string,
* githubRepo?: string,
* githubToken?: string,
* giteaToken?: string,
* allowGithubLauncherUpdates?: boolean,
* config?: object,
* }} opts
*/
async function setupAutoUpdater(app, getMainWindow, opts = {}) {
if (!app.isPackaged) {
return {
checkNow: async () => ({ skipped: true, reason: 'development build' }),
};
}
const ghToken = String(opts.githubToken || '').trim();
const giteaTok = String(opts.giteaToken || '').trim();
const envGeneric = String(process.env.LAUNCHER_UPDATE_URL || '').trim();
const configGeneric = String(opts.updateFeedUrl || '').trim();
let genericUrl = envGeneric || configGeneric;
let genericAuthHeader = '';
if (!genericUrl && opts.config && useGiteaReleases(opts.config)) {
const gfb = await getGiteaUpdaterFeedBase(opts.config);
if (gfb && gfb.url) {
genericUrl = gfb.url;
const t = String(gfb.token || giteaTok || '').trim();
if (t) genericAuthHeader = `token ${t}`;
}
} else if (genericUrl) {
if (giteaTok) genericAuthHeader = `token ${giteaTok}`;
else if (ghToken) genericAuthHeader = `Bearer ${ghToken}`;
}
const owner = String(opts.githubOwner || '').trim();
const repo = String(opts.githubRepo || '').trim();
let feedConfigured = false;
if (genericUrl) {
const base = genericUrl.replace(/\/?$/, '/');
autoUpdater.setFeedURL({
provider: 'generic',
url: base,
});
if (genericAuthHeader) {
autoUpdater.requestHeaders = {
...autoUpdater.requestHeaders,
Authorization: genericAuthHeader,
};
}
feedConfigured = true;
} else if (opts.allowGithubLauncherUpdates && ghToken && owner && repo) {
autoUpdater.setFeedURL({
provider: 'github',
owner,
repo,
private: true,
token: ghToken,
});
feedConfigured = true;
}
if (!feedConfigured) {
const reason =
'No update channel configured. Set launcher.json → update_feed_url (HTTPS folder with latest.yml), ' +
'or fill gitea.base_url/owner/repo (+ GITEA_TOKEN for private), ' +
'or set launcher_updates_from_github to true with GITHUB_TOKEN for private GitHub release feeds.';
return {
checkNow: async () => ({ skipped: true, reason }),
};
}
autoUpdater.autoDownload = true;
autoUpdater.autoInstallOnAppQuit = true;
const send = (msg) => {
const w = getMainWindow();
if (w && !w.isDestroyed()) {
w.webContents.send('launcher:progress', msg);
}
};
autoUpdater.on('checking-for-update', () => send('Checking for launcher updates…'));
autoUpdater.on('update-available', (info) => {
send(`Launcher update available: ${info.version}`);
});
autoUpdater.on('update-not-available', () => {});
autoUpdater.on('error', (err) => {
const m = (err && (err.message || String(err))) || '';
if (/404|releases\.atom|HttpError:\s*404/i.test(m)) {
send(
'Launcher update: 404 (no latest.yml or wrong URL). For Gitea use gitea.* + token, or set update_feed_url. ' +
'For private GitHub set GITHUB_TOKEN.'
);
return;
}
if (m && !/net::ERR|ENOTFOUND|ETIMEDOUT/i.test(m)) {
send(`Launcher update: ${m.slice(0, 400)}`);
}
});
autoUpdater.on('download-progress', (p) => {
const pct = Math.round(p.percent || 0);
send(`Launcher update download: ${pct}%`);
});
autoUpdater.on('update-downloaded', async (info) => {
const win = getMainWindow();
const r = await dialog.showMessageBox(win || undefined, {
type: 'info',
title: 'Launcher update',
message: `Version ${info.version} is ready to install.`,
detail: 'Restart the launcher now to finish. You can finish patching WoW after restart.',
buttons: ['Restart now', 'Later'],
defaultId: 0,
cancelId: 1,
noLink: true,
});
if (r.response === 0) {
autoUpdater.quitAndInstall(false, true);
}
});
const checkNow = async () => {
const r = await autoUpdater.checkForUpdates();
return { ok: true, updateInfo: r && r.updateInfo };
};
const tick = () => {
checkNow().catch(() => {});
};
setTimeout(tick, 5000);
setInterval(tick, 6 * 60 * 60 * 1000);
return { checkNow };
}
module.exports = { setupAutoUpdater };
@@ -0,0 +1,14 @@
'use strict';
/**
* Production Gitea mirror (non-secret). Edit here and ship no inject script,
* no fractured-release-channel.json, no CI env needed for these fields.
* Token stays in env: GITEA_TOKEN or launcher.json gitea.token_env.
*/
module.exports = {
// http:// kept as-is; bare host gets https in gitea-release.js
base_url: 'http://brassnet.ddns.net:33983',
owner: 'Dawnsorrow',
repo: 'Fractured-Distro',
release_tag: 'latest',
};
@@ -0,0 +1,116 @@
'use strict';
const path = require('path');
const fs = require('fs').promises;
const { normalizeWinGameDir } = require('./win-game-dir');
/** Sources no longer shipped; drop from merged files so old launcher.json does not keep fetching them. */
const DEPRECATED_FILE_SOURCES = new Set(['patch-Z.MPQ', 'Wow-patched.exe']);
function mergeFilesList(defaults, user) {
const dep = (e) => DEPRECATED_FILE_SOURCES.has(String(e && e.source ? e.source : '').trim());
if (Array.isArray(user.files) && user.files.length) {
const filtered = user.files.map((e) => ({ ...e })).filter((e) => !dep(e));
if (filtered.length) return filtered;
}
const defList = Array.isArray(defaults.files) ? defaults.files : [];
return defList.map((e) => ({ ...e })).filter((e) => !dep(e));
}
function userFilesContainDeprecated(user) {
const files = user && user.files;
if (!Array.isArray(files)) return false;
return files.some((e) => DEPRECATED_FILE_SOURCES.has(String(e && e.source ? e.source : '').trim()));
}
function mergeConfig(defaults, user) {
return {
...defaults,
...user,
update_feed_url:
user.update_feed_url != null && user.update_feed_url !== ''
? user.update_feed_url
: defaults.update_feed_url,
launcher_updates_from_github:
user.launcher_updates_from_github != null
? user.launcher_updates_from_github
: defaults.launcher_updates_from_github,
github: { ...defaults.github, ...(user.github || {}) },
gitea: { ...defaults.gitea, ...(user.gitea || {}) },
patch_manifest: { ...defaults.patch_manifest, ...(user.patch_manifest || {}) },
launch: { ...defaults.launch, ...(user.launch || {}) },
auth: user.auth != null ? { ...defaults.auth, ...user.auth } : defaults.auth,
realmlist: user.realmlist != null ? { ...defaults.realmlist, ...user.realmlist } : defaults.realmlist,
files: mergeFilesList(defaults, user),
};
}
/** Hardcoded Gitea host/repo (see lib/baked-gitea-channel.js). Non-empty baked values win. */
function applyBakedGitea(cfg) {
let baked;
try {
baked = require('./baked-gitea-channel');
} catch {
return cfg;
}
if (!baked || typeof baked !== 'object') return cfg;
cfg.gitea = { ...(cfg.gitea || {}) };
for (const k of ['base_url', 'owner', 'repo', 'release_tag']) {
const v = baked[k];
if (v != null && String(v).trim() !== '') cfg.gitea[k] = String(v).trim();
}
return cfg;
}
function getConfigPath(app) {
if (process.env.FRACTURED_LAUNCHER_CONFIG) return process.env.FRACTURED_LAUNCHER_CONFIG;
if (app && app.isPackaged) {
// AppImage (and macOS .app) run from a read-only mount — cannot write beside execPath.
if (process.platform === 'linux' || process.platform === 'darwin') {
return path.join(app.getPath('userData'), 'launcher.json');
}
return path.join(path.dirname(process.execPath), 'launcher.json');
}
return path.join(__dirname, '..', 'launcher.json');
}
async function loadConfig(app) {
const p = getConfigPath(app);
const defPath = path.join(__dirname, '..', 'default-launcher.json');
const defaults = JSON.parse(await fs.readFile(defPath, 'utf8'));
try {
const user = JSON.parse(await fs.readFile(p, 'utf8'));
const config = applyBakedGitea(mergeConfig(defaults, user));
if (userFilesContainDeprecated(user)) {
await fs.writeFile(p, JSON.stringify(config, null, 2), 'utf8');
}
return { configPath: p, config };
} catch (e) {
if (e.code === 'ENOENT') {
const initial = applyBakedGitea(mergeConfig(defaults, {}));
await fs.writeFile(p, JSON.stringify(initial, null, 2), 'utf8');
return { configPath: p, config: JSON.parse(JSON.stringify(initial)) };
}
throw e;
}
}
async function saveGameDir(configPath, gameDir) {
const defPath = path.join(__dirname, '..', 'default-launcher.json');
const defaults = JSON.parse(await fs.readFile(defPath, 'utf8'));
const user = JSON.parse(await fs.readFile(configPath, 'utf8'));
user.game_dir = gameDir;
const merged = applyBakedGitea(mergeConfig(defaults, user));
await fs.writeFile(configPath, JSON.stringify(merged, null, 2), 'utf8');
return merged;
}
function resolveGameDir(cfg, configPath) {
const gd = cfg.game_dir;
if (!gd) return '';
const abs = path.isAbsolute(gd) ? path.normalize(gd) : path.normalize(path.join(path.dirname(configPath), gd));
if (process.platform === 'win32') return normalizeWinGameDir(abs);
return abs;
}
module.exports = { getConfigPath, loadConfig, saveGameDir, resolveGameDir, mergeConfig, applyBakedGitea };
@@ -0,0 +1,120 @@
'use strict';
const { downloadBodyToFile, fetchOrThrow } = require('./http-download');
function normalizeGiteaBaseUrl(raw) {
let b = String(raw || '').trim().replace(/\/+$/, '');
if (!b) return '';
if (!/^https?:\/\//i.test(b)) b = `https://${b}`;
return b;
}
function giteaApiBase(cfg) {
const base = normalizeGiteaBaseUrl(cfg.gitea.base_url);
return `${base}/api/v1`;
}
function giteaToken(cfg) {
const name = cfg.gitea && cfg.gitea.token_env;
if (name && process.env[name]) return String(process.env[name]).trim();
return String(process.env.GITEA_TOKEN || '').trim();
}
function giteaHeaders(token, json = false) {
const h = { 'User-Agent': 'Fractured-Launcher-Electron' };
if (json) h.Accept = 'application/json';
if (token) h.Authorization = `token ${token}`;
return h;
}
function useGiteaReleases(cfg) {
const g = cfg.gitea;
if (!g) return false;
return !!(String(g.base_url || '').trim() && String(g.owner || '').trim() && String(g.repo || '').trim());
}
async function fetchGiteaReleaseRecord(cfg) {
const api = giteaApiBase(cfg);
const { owner, repo } = cfg.gitea;
const tag = (cfg.gitea.release_tag || 'latest').trim() || 'latest';
const token = giteaToken(cfg);
let listUrl;
if (tag.toLowerCase() === 'latest') {
listUrl = `${api}/repos/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}/releases/latest`;
} else {
listUrl = `${api}/repos/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}/releases/tags/${encodeURIComponent(tag)}`;
}
const res = await fetchOrThrow(listUrl, { headers: giteaHeaders(token, true) });
const text = await res.text();
if (!res.ok) {
let hint = '';
if (res.status === 404) hint = ' (wrong tag / no release / check base_url owner repo)';
if (res.status === 401 || res.status === 403) hint = ' (set GITEA_TOKEN or gitea.token_env)';
throw new Error(`Gitea release ${res.status}${hint}: ${text.slice(0, 600)}`);
}
return JSON.parse(text);
}
async function listGiteaReleaseAttachmentNames(cfg) {
const rel = await fetchGiteaReleaseRecord(cfg);
const list = rel.attachments || rel.assets || [];
return list.map((x) => x.name).filter(Boolean);
}
async function downloadGiteaReleaseAsset(cfg, assetName, destPath) {
const token = giteaToken(cfg);
const rel = await fetchGiteaReleaseRecord(cfg);
const list = rel.attachments || rel.assets || [];
let downloadUrl = '';
for (const a of list) {
if (a.name !== assetName) continue;
downloadUrl = a.browser_download_url || a.download_url || '';
break;
}
if (!downloadUrl) {
const names = list.map((x) => x.name).filter(Boolean);
throw new Error(`Gitea release asset "${assetName}" not found; attachments: ${names.join(', ') || '(none)'}`);
}
const h = { Accept: 'application/octet-stream' };
if (token) h.Authorization = `token ${token}`;
const dl = await fetchOrThrow(downloadUrl, { headers: h, redirect: 'follow' });
await downloadBodyToFile(dl, destPath);
}
/**
* Base URL for electron-updater generic provider (expects latest.yml under this path).
 * Matches Gitea's pattern: /owner/repo/releases/download/{tag}/latest.yml
*/
async function getGiteaUpdaterFeedBase(cfg) {
if (!useGiteaReleases(cfg)) return null;
const api = giteaApiBase(cfg);
const { owner, repo } = cfg.gitea;
const tag = (cfg.gitea.release_tag || 'latest').trim() || 'latest';
const token = giteaToken(cfg);
let listUrl;
if (tag.toLowerCase() === 'latest') {
listUrl = `${api}/repos/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}/releases/latest`;
} else {
listUrl = `${api}/repos/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}/releases/tags/${encodeURIComponent(tag)}`;
}
const res = await fetchOrThrow(listUrl, { headers: giteaHeaders(token, true) });
if (!res.ok) return null;
const rel = await res.json();
const tagName = rel.tag_name;
if (!tagName || typeof tagName !== 'string') return null;
const root = normalizeGiteaBaseUrl(cfg.gitea.base_url);
const url = `${root}/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}/releases/download/${encodeURIComponent(tagName)}/`;
return { url, token };
}
module.exports = {
downloadGiteaReleaseAsset,
fetchGiteaReleaseRecord,
listGiteaReleaseAttachmentNames,
giteaToken,
useGiteaReleases,
getGiteaUpdaterFeedBase,
};
@@ -0,0 +1,9 @@
'use strict';
function githubToken(cfg) {
const name = cfg.github && cfg.github.token_env;
if (name && process.env[name]) return process.env[name];
return process.env.GITHUB_TOKEN || '';
}
module.exports = { githubToken };
@@ -0,0 +1,141 @@
'use strict';
const path = require('path');
const fs = require('fs').promises;
const { githubToken } = require('./github-token');
const { downloadGiteaReleaseAsset, useGiteaReleases, listGiteaReleaseAttachmentNames } = require('./gitea-release');
const { fetchToFile, downloadBodyToFile, fetchOrThrow } = require('./http-download');
function encodeRepoPath(repoPath) {
let p = String(repoPath || '').replace(/\\/g, '/').replace(/^\/+|\/+$/g, '');
if (!p) return '';
return p.split('/').map((seg) => encodeURIComponent(seg)).join('/');
}
function ghHeaders(token, json = false) {
const h = {
'User-Agent': 'Fractured-Launcher-Electron',
'X-GitHub-Api-Version': '2022-11-28',
};
if (json) h.Accept = 'application/vnd.github+json';
if (token) h.Authorization = `Bearer ${token}`;
return h;
}
async function downloadGitHubRepoFile(cfg, repoPath, destPath) {
const token = githubToken(cfg);
const enc = encodeRepoPath(repoPath);
const ref = cfg.github.ref || 'main';
const { owner, repo } = cfg.github;
if (!token) {
const url = `https://raw.githubusercontent.com/${owner}/${repo}/${ref}/${enc}`;
await fetchToFile(url, {}, destPath);
return;
}
const apiUrl = `https://api.github.com/repos/${owner}/${repo}/contents/${enc}?ref=${encodeURIComponent(ref)}`;
const res = await fetchOrThrow(apiUrl, { headers: ghHeaders(token, true) });
const body = await res.text();
if (!res.ok) {
throw new Error(`GitHub contents API ${res.status}: ${body.slice(0, 800)}`);
}
const meta = JSON.parse(body);
if (meta.type && meta.type !== 'file') {
throw new Error(`not a file: ${repoPath}`);
}
if (meta.download_url) {
const h = { Accept: 'application/octet-stream' };
if (token) {
h.Authorization = `Bearer ${token}`;
h['X-GitHub-Api-Version'] = '2022-11-28';
}
await fetchToFile(meta.download_url, h, destPath);
return;
}
if (meta.content && meta.encoding === 'base64') {
const buf = Buffer.from(String(meta.content).replace(/\n/g, ''), 'base64');
if (!buf.length) throw new Error('empty base64 content');
await fs.mkdir(path.dirname(destPath), { recursive: true });
const tmp = destPath + '.downloading';
await fs.writeFile(tmp, buf);
await fs.rename(tmp, destPath);
return;
}
throw new Error(`unexpected GitHub response for ${repoPath}`);
}
async function fetchGitHubReleaseJson(cfg) {
const token = githubToken(cfg);
const tag = (cfg.github.release_tag || 'latest').trim() || 'latest';
const { owner, repo } = cfg.github;
let listUrl;
if (tag.toLowerCase() === 'latest') {
listUrl = `https://api.github.com/repos/${owner}/${repo}/releases/latest`;
} else {
listUrl = `https://api.github.com/repos/${owner}/${repo}/releases/tags/${encodeURIComponent(tag)}`;
}
const res = await fetchOrThrow(listUrl, { headers: ghHeaders(token, true) });
const text = await res.text();
if (!res.ok) {
let hint = '';
if (res.status === 404) {
hint =
' (wrong tag, private repo without token, or releases live on Gitea — set gitea.base_url, gitea.owner, gitea.repo in launcher.json)';
}
if (res.status === 401 || res.status === 403) hint = ' (set GITHUB_TOKEN or token_env PAT)';
throw new Error(`releases list ${res.status}${hint}: ${text.slice(0, 600)}`);
}
return JSON.parse(text);
}
async function listReleaseAttachmentNames(cfg) {
if (useGiteaReleases(cfg)) {
return listGiteaReleaseAttachmentNames(cfg);
}
const rel = await fetchGitHubReleaseJson(cfg);
const assets = rel.assets || [];
return assets.map((a) => a.name).filter(Boolean);
}
async function downloadReleaseAsset(cfg, assetName, destPath) {
if (useGiteaReleases(cfg)) {
return downloadGiteaReleaseAsset(cfg, assetName, destPath);
}
const token = githubToken(cfg);
const rel = await fetchGitHubReleaseJson(cfg);
const assets = rel.assets || [];
let assetURL = '';
for (const a of assets) {
if (a.name !== assetName) continue;
if (token && a.url) {
assetURL = a.url;
break;
}
if (a.browser_download_url) {
assetURL = a.browser_download_url;
break;
}
assetURL = a.url;
break;
}
if (!assetURL) {
const names = assets.map((x) => x.name);
throw new Error(`release asset "${assetName}" not found; attachments: ${names.join(', ')}`);
}
const h = { Accept: 'application/octet-stream' };
if (token) {
h.Authorization = `Bearer ${token}`;
h['X-GitHub-Api-Version'] = '2022-11-28';
}
const dl = await fetchOrThrow(assetURL, { headers: h, redirect: 'follow' });
await downloadBodyToFile(dl, destPath);
}
module.exports = {
downloadGitHubRepoFile,
downloadReleaseAsset,
encodeRepoPath,
fetchGitHubReleaseJson,
listReleaseAttachmentNames,
};
@@ -0,0 +1,76 @@
'use strict';
const fs = require('fs').promises;
const path = require('path');
const { createWriteStream } = require('fs');
const { pipeline } = require('stream/promises');
const { Readable } = require('stream');
function safeUrlForLog(url) {
try {
const u = new URL(url);
return `${u.origin}${u.pathname}`;
} catch {
return String(url || '').split('?')[0].slice(0, 200);
}
}
function explainFetchFailure(err, url) {
const msg = err && err.message ? err.message : String(err);
const cause = err && err.cause;
const code = cause && cause.code ? cause.code : '';
const combined = `${msg} ${code}`;
const hints = [];
if (/CERT|TLS|SSL|UNABLE_TO_VERIFY|SELF_SIGNED|certificate|unknown ca|unable to verify/i.test(combined)) {
hints.push(
'TLS certificate not trusted — install a valid cert on Gitea, or trust your CA system-wide, or set NODE_EXTRA_CA_CERTS to a .pem bundle (self-signed mirrors)'
);
}
if (/ECONNREFUSED/.test(combined)) hints.push('connection refused (wrong host/port or server down)');
if (/ENOTFOUND|EAI_AGAIN/.test(combined)) hints.push('DNS lookup failed');
if (/ETIMEDOUT|TIMEOUT/i.test(combined)) hints.push('connection timed out');
const hintStr = hints.length ? ` ${hints.join(' ')}` : '';
return new Error(`${msg}${hintStr} (${safeUrlForLog(url)})`);
}
/** Wrap global fetch with clearer errors for TLS/DNS/refused (Electron reports bare "fetch failed"). */
async function fetchOrThrow(url, init) {
try {
return await fetch(url, init);
} catch (e) {
throw explainFetchFailure(e, url);
}
}
async function downloadBodyToFile(res, destPath) {
if (!res.ok) {
const errText = await res.text().catch(() => '');
throw new Error(`HTTP ${res.status}: ${errText.slice(0, 500)}`);
}
if (!res.body) {
throw new Error('download has no body');
}
await fs.mkdir(path.dirname(destPath), { recursive: true });
const tmp = destPath + '.downloading';
let body = res.body;
if (body && typeof body.pipe !== 'function') {
body = Readable.fromWeb(body);
}
await pipeline(body, createWriteStream(tmp));
const st = await fs.stat(tmp);
if (st.size === 0) {
await fs.unlink(tmp).catch(() => {});
throw new Error('empty download');
}
await fs.rename(tmp, destPath);
}
async function fetchToFile(url, headers, destPath) {
const res = await fetchOrThrow(url, {
headers,
redirect: 'follow',
});
await downloadBodyToFile(res, destPath);
}
module.exports = { fetchToFile, downloadBodyToFile, fetchOrThrow, safeUrlForLog };
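`safeUrlForLog` exists so query strings (which may carry tokens) never reach an error message. A standalone copy of the helper, reproduced here for illustration outside the diff:

```javascript
// Copy of safeUrlForLog: keep origin + path, drop query/fragment.
// Falls back to a truncated, query-stripped string for unparsable input.
function safeUrlForLog(url) {
  try {
    const u = new URL(url);
    return `${u.origin}${u.pathname}`;
  } catch {
    return String(url || '').split('?')[0].slice(0, 200);
  }
}
```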
@@ -0,0 +1,150 @@
'use strict';
const fs = require('fs').promises;
const path = require('path');
const os = require('os');
const { createHash } = require('node:crypto');
const { downloadReleaseAsset, downloadGitHubRepoFile } = require('./github');
async function sha256File(absPath) {
const buf = await fs.readFile(absPath);
return createHash('sha256').update(buf).digest('hex');
}
function stateDir(gameDir) {
return path.join(gameDir, '.fractured');
}
function statePath(gameDir) {
return path.join(stateDir(gameDir), 'patch-state.json');
}
async function readPatchState(gameDir) {
if (!gameDir) return null;
try {
const t = await fs.readFile(statePath(gameDir), 'utf8');
return JSON.parse(t);
} catch {
return null;
}
}
async function writePatchState(gameDir, manifestVersion, fileShas) {
const p = statePath(gameDir);
await fs.mkdir(path.dirname(p), { recursive: true });
const body = {
client_build: manifestVersion,
updated_at: new Date().toISOString(),
files: fileShas,
};
const tmp = p + '.tmp';
await fs.writeFile(tmp, JSON.stringify(body, null, 2), 'utf8');
await fs.rename(tmp, p);
}
function validateManifest(m) {
if (!m || m.version == null || String(m.version).trim() === '') return false;
if (!m.files || typeof m.files !== 'object') return false;
return true;
}
/**
* Download and parse patch-manifest.json (or custom name). Returns null on any failure.
*/
async function loadManifest(cfg) {
const pm = cfg.patch_manifest;
if (!pm || !pm.enabled || !String(pm.source || '').trim()) return null;
const tmp = path.join(os.tmpdir(), `fr-patch-manifest-${Date.now()}-${Math.random().toString(36).slice(2)}.json`);
try {
if (pm.from_release) {
await downloadReleaseAsset(cfg, String(pm.source).trim(), tmp);
} else {
await downloadGitHubRepoFile(cfg, String(pm.source).trim(), tmp);
}
const raw = await fs.readFile(tmp, 'utf8');
await fs.unlink(tmp).catch(() => {});
return JSON.parse(raw);
} catch {
await fs.unlink(tmp).catch(() => {});
return null;
}
}
/**
* True if every from_release file on disk matches manifest sha256.
*/
async function patchesMatchManifest(cfg, manifest, onStatus) {
if (!validateManifest(manifest)) return false;
const { buildResolvedReleaseFiles } = require('./release-sync');
const entries = await buildResolvedReleaseFiles(cfg, manifest);
const gameDir = cfg.game_dir;
for (const entry of entries) {
if (!entry.from_release) continue;
const spec = manifest.files[entry.source];
if (!spec || !spec.sha256) return false;
const parts = String(entry.dest).replace(/\\/g, '/').split('/').filter(Boolean);
const destAbs = path.join(gameDir, ...parts);
let disk;
try {
disk = await sha256File(destAbs);
} catch {
return false;
}
if (disk.toLowerCase() !== String(spec.sha256).trim().toLowerCase()) return false;
}
if (onStatus) {
onStatus(`Client files already match build ${manifest.version} (nothing to download).`);
}
return true;
}
async function verifyInstalledAgainstManifest(cfg, manifest) {
if (!validateManifest(manifest)) return;
const { buildResolvedReleaseFiles } = require('./release-sync');
const entries = await buildResolvedReleaseFiles(cfg, manifest);
for (const entry of entries) {
if (!entry.from_release) continue;
const spec = manifest.files[entry.source];
if (!spec || !spec.sha256) {
throw new Error(
`patch-manifest.json is missing a sha256 for "${entry.source}" — regenerate the manifest for this release.`
);
}
const parts = String(entry.dest).replace(/\\/g, '/').split('/').filter(Boolean);
const destAbs = path.join(cfg.game_dir, ...parts);
const disk = await sha256File(destAbs);
if (disk.toLowerCase() !== String(spec.sha256).trim().toLowerCase()) {
throw new Error(
`${entry.source}: checksum mismatch after install (expected ${spec.sha256.slice(0, 12)}…, got ${disk.slice(0, 12)}…). Try syncing again.`
);
}
}
}
async function recordPatchState(cfg, manifest) {
if (!validateManifest(manifest)) return;
const { buildResolvedReleaseFiles } = require('./release-sync');
const entries = await buildResolvedReleaseFiles(cfg, manifest);
const shas = {};
for (const entry of entries) {
if (!entry.from_release) continue;
const parts = String(entry.dest).replace(/\\/g, '/').split('/').filter(Boolean);
const destAbs = path.join(cfg.game_dir, ...parts);
try {
shas[entry.source] = await sha256File(destAbs);
} catch {
/* skip */
}
}
await writePatchState(cfg.game_dir, String(manifest.version), shas);
}
module.exports = {
loadManifest,
validateManifest,
patchesMatchManifest,
verifyInstalledAgainstManifest,
recordPatchState,
readPatchState,
statePath,
};
@@ -0,0 +1,151 @@
'use strict';
const path = require('path');
const fs = require('fs').promises;
const fsSync = require('fs');
const { downloadGitHubRepoFile, downloadReleaseAsset } = require('./github');
const { normalizeWinGameDir } = require('./win-game-dir');
const { loadManifest } = require('./patch-manifest');
const { buildResolvedReleaseFiles } = require('./release-sync');
function pad2(n) {
return String(n).padStart(2, '0');
}
function backupSuffix() {
const d = new Date();
return `${d.getFullYear()}${pad2(d.getMonth() + 1)}${pad2(d.getDate())}-${pad2(d.getHours())}${pad2(d.getMinutes())}${pad2(d.getSeconds())}`;
}
function wowExePath(cfg) {
const gd = normalizeWinGameDir(cfg.game_dir || '');
const exe = (cfg.launch && cfg.launch.exe) || 'Wow.exe';
const parts = exe.replace(/\\/g, '/').split('/').filter(Boolean);
const primary = path.join(gd, ...parts);
if (process.platform === 'win32' && gd && fsSync.existsSync(primary)) return primary;
if (process.platform === 'win32' && gd) {
try {
const base = path.basename(primary);
const dir = path.dirname(primary);
const names = fsSync.readdirSync(dir);
const hit = names.find((n) => n.toLowerCase() === base.toLowerCase());
if (hit) {
const alt = path.join(dir, hit);
if (fsSync.statSync(alt).isFile()) return alt;
}
} catch (_) {
/* ignore */
}
}
return primary;
}
function wowInstallValid(cfg) {
if (!cfg.game_dir) return false;
const p = wowExePath(cfg);
return fsSync.existsSync(p) && fsSync.statSync(p).isFile();
}
/** WoW expects patch MPQ names with a literal .MPQ extension (case-sensitive clients). */
function normalizeMpqDestinationPath(absPath) {
const s = String(absPath || '');
return /\.mpq$/i.test(s) ? s.replace(/\.mpq$/i, '.MPQ') : s;
}
async function installFile(cfg, entry) {
const parts = String(entry.dest).replace(/\\/g, '/').split('/').filter(Boolean);
const root = normalizeWinGameDir(cfg.game_dir || '');
const destAbs = normalizeMpqDestinationPath(path.join(root, ...parts));
if (entry.backup) {
try {
const st = await fs.stat(destAbs);
if (st.isFile()) {
const bak = `${destAbs}.bak-${backupSuffix()}`;
await fs.rename(destAbs, bak);
}
} catch (e) {
if (e.code !== 'ENOENT') throw e;
}
} else {
try {
await fs.unlink(destAbs);
} catch (e) {
if (e.code !== 'ENOENT') throw e;
}
}
const tmp = destAbs + '.new';
if (entry.from_release) {
await downloadReleaseAsset(cfg, entry.source, tmp);
} else {
await downloadGitHubRepoFile(cfg, entry.source, tmp);
}
await fs.rename(tmp, destAbs);
}
async function applyRealmlist(cfg) {
if (!cfg.realmlist || !cfg.realmlist.enabled) return;
let line = String(cfg.realmlist.line || '').trim();
if (!line) throw new Error('realmlist.line empty');
if (!line.toLowerCase().startsWith('set realmlist ')) {
line = `set realmlist ${line}`;
}
const content = line + '\n';
let paths = cfg.realmlist.paths;
if (!paths || !paths.length) paths = ['Data/enUS/realmlist.wtf'];
for (const rel of paths) {
const r = String(rel).trim().replace(/\\/g, '/');
if (!r) continue;
const segs = r.split('/').filter(Boolean);
const abs = path.join(normalizeWinGameDir(cfg.game_dir || ''), ...segs);
await fs.mkdir(path.dirname(abs), { recursive: true });
await fs.writeFile(abs, content, 'utf8');
}
}
async function applyPatches(cfg, onStatus) {
let manifest = null;
if (cfg.patch_manifest && cfg.patch_manifest.enabled) {
manifest = await loadManifest(cfg);
}
const entries = await buildResolvedReleaseFiles(cfg, manifest);
for (const f of entries) {
if (onStatus) onStatus(`Updating ${f.dest}`);
try {
await installFile(cfg, f);
} catch (e) {
throw new Error(`sync ${f.dest}: ${e.message || e}`);
}
}
if (cfg.realmlist && cfg.realmlist.enabled) {
if (onStatus) onStatus('Applying realmlist …');
await applyRealmlist(cfg);
}
if (onStatus) onStatus('All patches applied.');
}
async function doAuth(cfg, username, password) {
if (!cfg.auth || !cfg.auth.enabled) return;
const u = String(username || '').trim();
const p = String(password || '');
if (!u || !p) throw new Error('username and password required');
const body = {
[cfg.auth.username_field || 'username']: u,
[cfg.auth.password_field || 'password']: p,
};
const res = await fetch(cfg.auth.url, {
method: cfg.auth.method || 'POST',
headers: { 'Content-Type': 'application/json', Accept: 'application/json' },
body: JSON.stringify(body),
});
const t = await res.text();
if (res.status < 200 || res.status >= 300) {
throw new Error(`login failed ${res.status}: ${t.slice(0, 400)}`);
}
}
module.exports = {
applyPatches,
applyRealmlist,
wowExePath,
wowInstallValid,
doAuth,
};
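The 3.3.5a client only picks up patch archives whose extension is a literal uppercase `.MPQ` on case-sensitive setups, which is what `normalizeMpqDestinationPath` enforces. A standalone copy for illustration:

```javascript
// Copy of normalizeMpqDestinationPath: force a literal .MPQ extension,
// leave non-MPQ paths untouched.
function normalizeMpqDestinationPath(absPath) {
  const s = String(absPath || '');
  return /\.mpq$/i.test(s) ? s.replace(/\.mpq$/i, '.MPQ') : s;
}
```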
@@ -0,0 +1,128 @@
'use strict';
const path = require('path');
const { listReleaseAttachmentNames } = require('./github');
/** Legacy launcher.json rows — ignored when merging explicit files. */
const DEPRECATED_SOURCES = new Set(['patch-Z.MPQ', 'Wow-patched.exe']);
function filterExplicitFiles(files) {
if (!Array.isArray(files)) return [];
return files
.filter((e) => e && String(e.source || '').trim())
.filter((e) => !DEPRECATED_SOURCES.has(String(e.source).trim()))
.map((e) => ({
source: String(e.source).trim(),
dest: String(e.dest || '').trim(),
backup: e.backup !== false,
from_release: e.from_release !== false,
}))
.filter((e) => e.dest);
}
function manifestLooksUsable(m) {
return !!(m && m.files && typeof m.files === 'object' && Object.keys(m.files).length > 0);
}
/** Launcher / updater attachments — never copied into the WoW folder. */
function isExcludedFromGameSync(fileName) {
const n = String(fileName || '');
const lower = n.toLowerCase();
if (lower === 'patch-manifest.json') return true;
if (/^fractured-launcher/i.test(n)) return true;
if (/\.blockmap$/i.test(n)) return true;
if (/^latest.*\.ya?ml$/i.test(n)) return true;
if (lower.includes('builder-debug')) return true;
if (/\.appimage$/i.test(n)) return true;
return false;
}
function mpqDestFromSource(source) {
const base = path.basename(String(source || ''));
const stem = base.replace(/\.mpq$/i, '');
return `Data/enUS/${stem}.MPQ`;
}
function destForReleaseSource(source, cfg) {
const base = path.basename(String(source || ''));
if (/\.mpq$/i.test(base)) return mpqDestFromSource(source);
if (/\.exe$/i.test(base)) return (cfg.launch && cfg.launch.exe) || 'Wow.exe';
return base;
}
/**
* Explicit `files` in config wins. Otherwise use patch-manifest keys if present,
* else discover attachments on the release (excluding launcher artifacts).
*/
async function buildResolvedReleaseFiles(cfg, manifestMaybeNull) {
const explicit = filterExplicitFiles(cfg.files);
if (explicit.length) return explicit;
const manifest = manifestMaybeNull;
if (manifestLooksUsable(manifest)) {
const keys = Object.keys(manifest.files).filter((k) => k && !isExcludedFromGameSync(k));
if (!keys.length) {
throw new Error('patch-manifest.json has no file entries — add files or attach assets to the release.');
}
return keys.map((source) => ({
source,
dest: destForReleaseSource(source, cfg),
backup: true,
from_release: true,
}));
}
const names = await listReleaseAttachmentNames(cfg);
const game = names.filter((n) => n && !isExcludedFromGameSync(n));
if (!game.length) {
throw new Error(
'No patch files on this release (after excluding launcher installers). ' +
'Attach MPQ/exe assets or ship patch-manifest.json listing filenames.'
);
}
const exes = game.filter((n) => /\.exe$/i.test(n));
const mpqs = game.filter((n) => /\.mpq$/i.test(n));
const rest = game.filter((n) => !/\.(exe|mpq)$/i.test(n));
if (exes.length > 1) {
throw new Error(
`Release has multiple .exe files (${exes.join(', ')}). ` +
'Remove extras or publish patch-manifest.json with the exact filenames to install.'
);
}
const out = [];
for (const n of mpqs) {
out.push({
source: n,
dest: mpqDestFromSource(n),
backup: true,
from_release: true,
});
}
if (exes.length === 1) {
out.push({
source: exes[0],
dest: (cfg.launch && cfg.launch.exe) || 'Wow.exe',
backup: true,
from_release: true,
});
}
for (const n of rest) {
out.push({
source: n,
dest: path.basename(n),
backup: true,
from_release: true,
});
}
return out;
}
module.exports = {
buildResolvedReleaseFiles,
filterExplicitFiles,
isExcludedFromGameSync,
DEPRECATED_SOURCES,
};
@@ -0,0 +1,21 @@
'use strict';
const path = require('path');
/**
* Under Wine, the folder picker often returns a Unix absolute path (/home/...).
* Windows Node does not resolve that to the WoW install; map to Wine's Z: drive
* (Z: == / on typical Wine prefixes).
*/
function normalizeWinGameDir(gameDir) {
if (process.platform !== 'win32') return String(gameDir || '').trim();
let s = String(gameDir || '').trim();
if (!s) return s;
s = s.replace(/\//g, path.win32.sep);
if (s.startsWith('\\\\')) return path.normalize(s);
if (/^[A-Za-z]:/.test(s)) return path.normalize(s);
if (s.startsWith(path.win32.sep)) return path.win32.normalize(`Z:${s}`);
return path.normalize(s);
}
module.exports = { normalizeWinGameDir };
@@ -0,0 +1,158 @@
'use strict';
const { app, BrowserWindow, ipcMain, dialog, Menu } = require('electron');
const path = require('path');
const { spawn } = require('child_process');
const { loadConfig, saveGameDir, resolveGameDir } = require('./lib/config-store');
const { normalizeWinGameDir } = require('./lib/win-game-dir');
const { applyPatches, wowExePath, wowInstallValid, doAuth } = require('./lib/patch');
const { readPatchState } = require('./lib/patch-manifest');
const { setupAutoUpdater } = require('./lib/auto-update');
let mainWindow;
let autoUpdateApi = {
checkNow: async () => ({ skipped: true, reason: 'not initialized' }),
};
function createWindow() {
mainWindow = new BrowserWindow({
width: 720,
height: 640,
show: false,
autoHideMenuBar: true,
webPreferences: {
preload: path.join(__dirname, 'preload.cjs'),
contextIsolation: true,
nodeIntegration: false,
sandbox: false,
},
});
Menu.setApplicationMenu(null);
mainWindow.loadFile(path.join(__dirname, 'index.html'));
mainWindow.once('ready-to-show', () => mainWindow.show());
}
function sendProgress(msg) {
if (mainWindow && !mainWindow.isDestroyed()) {
mainWindow.webContents.send('launcher:progress', msg);
}
}
async function readMergedConfig() {
const { configPath, config } = await loadConfig(app);
const gameDir = resolveGameDir(config, configPath);
const merged = { ...config, game_dir: gameDir };
return { configPath, config: merged };
}
app.whenReady().then(async () => {
createWindow();
const { config } = await loadConfig(app);
const ghEnv = config.github && config.github.token_env;
const githubToken =
(ghEnv && String(process.env[ghEnv] || '').trim()) ||
String(process.env.GH_TOKEN || process.env.GITHUB_TOKEN || '').trim();
const giteaEnv = config.gitea && config.gitea.token_env;
const giteaToken =
(giteaEnv && String(process.env[giteaEnv] || '').trim()) ||
String(process.env.GITEA_TOKEN || '').trim();
const updateFeedUrl = String(process.env.LAUNCHER_UPDATE_URL || config.update_feed_url || '').trim();
autoUpdateApi = await setupAutoUpdater(app, () => mainWindow, {
updateFeedUrl,
config,
githubOwner: config.github && config.github.owner,
githubRepo: config.github && config.github.repo,
githubToken,
giteaToken,
allowGithubLauncherUpdates: config.launcher_updates_from_github === true,
});
app.on('activate', () => {
if (BrowserWindow.getAllWindows().length === 0) createWindow();
});
});
app.on('window-all-closed', () => {
if (process.platform !== 'darwin') app.quit();
});
ipcMain.handle('launcher:load', async () => {
const { configPath, config } = await readMergedConfig();
let clientBuild = '';
if (wowInstallValid(config)) {
const st = await readPatchState(config.game_dir);
if (st && st.client_build) clientBuild = String(st.client_build);
}
return {
configPath,
gameDir: config.game_dir || '',
authEnabled: !!(config.auth && config.auth.enabled),
wowExe: (config.launch && config.launch.exe) || 'Wow.exe',
wowOk: wowInstallValid(config),
clientBuild,
};
});
ipcMain.handle('launcher:saveGameDir', async (_e, dir) => {
const trimmed = String(dir || '').trim();
if (!trimmed) throw new Error('folder path is empty');
const { configPath } = await loadConfig(app);
const norm =
process.platform === 'win32' ? normalizeWinGameDir(path.normalize(trimmed)) : path.normalize(trimmed);
const probe = { ...(await readMergedConfig()).config, game_dir: norm };
if (!wowInstallValid(probe)) {
throw new Error(`That folder does not contain ${(probe.launch && probe.launch.exe) || 'Wow.exe'}`);
}
const c = await saveGameDir(configPath, norm);
const merged = { ...c, game_dir: resolveGameDir(c, configPath) };
return { ok: true, gameDir: merged.game_dir, wowOk: wowInstallValid(merged) };
});
ipcMain.handle('launcher:pickFolder', async (_e, startDir) => {
const win = BrowserWindow.getFocusedWindow() || mainWindow;
const r = await dialog.showOpenDialog(win, {
title: 'Select World of Warcraft 3.3.5a folder',
properties: ['openDirectory', 'createDirectory'],
defaultPath: startDir && String(startDir).trim() ? String(startDir).trim() : undefined,
});
if (r.canceled || !r.filePaths || !r.filePaths[0]) return { canceled: true, path: '' };
return { canceled: false, path: r.filePaths[0] };
});
ipcMain.handle('launcher:auth', async (_e, { user, pass }) => {
const { config } = await readMergedConfig();
await doAuth(config, user, pass);
return { ok: true };
});
ipcMain.handle('launcher:sync', async () => {
const { config } = await readMergedConfig();
if (!wowInstallValid(config)) {
throw new Error('Set a valid WoW folder (must contain Wow.exe) first.');
}
await applyPatches(config, sendProgress);
return { ok: true };
});
ipcMain.handle('launcher:checkUpdates', async () => {
try {
return await autoUpdateApi.checkNow();
} catch (e) {
const msg = e && (e.message || String(e));
return { ok: false, error: msg };
}
});
ipcMain.handle('launcher:play', async () => {
const { config } = await readMergedConfig();
const exe = wowExePath(config);
const args = (config.launch && config.launch.args) || [];
const child = spawn(exe, args, {
cwd: config.game_dir,
detached: true,
stdio: 'ignore',
windowsHide: true,
shell: false,
});
child.unref();
return { ok: true };
});
File diff suppressed because it is too large
@@ -0,0 +1,78 @@
{
"name": "fractured-launcher-electron",
"version": "1.0.9",
"description": "Fractured WoW launcher (Electron) — no console window, native folder picker, auto-update",
"main": "main.js",
"repository": {
"type": "git",
"url": "https://github.com/Dawnforger/Fractured.git"
},
"scripts": {
"start": "electron .",
"pack:win": "electron-builder --win nsis portable --x64 --publish never",
"pack:linux": "electron-builder --linux AppImage --x64 --publish never",
"publish:win": "electron-builder --win nsis portable --x64 --publish always"
},
"author": "",
"license": "GPL-3.0",
"devDependencies": {
"electron": "^33.2.1",
"electron-builder": "^25.1.8"
},
"dependencies": {
"electron-updater": "^6.3.9"
},
"build": {
"appId": "net.fractured.launcher",
"productName": "Fractured Launcher",
"directories": {
"output": "dist"
},
"publish": null,
"files": [
"main.js",
"preload.cjs",
"index.html",
"renderer.js",
"styles.css",
"default-launcher.json",
"lib/win-game-dir.js",
"lib/baked-gitea-channel.js",
"lib/gitea-release.js",
"lib/patch-manifest.js",
"lib/**/*"
],
"win": {
"target": [
{
"target": "nsis",
"arch": ["x64"]
},
{
"target": "portable",
"arch": ["x64"]
}
]
},
"nsis": {
"oneClick": false,
"allowToChangeInstallationDirectory": true,
"artifactName": "Fractured-Launcher-${version}-Setup.${ext}"
},
"portable": {
"artifactName": "Fractured-Launcher-${version}-Windows-Portable.${ext}"
},
"linux": {
"target": [
{
"target": "AppImage",
"arch": ["x64"]
}
],
"category": "Game"
},
"appImage": {
"artifactName": "Fractured-Launcher-${version}-Linux-x86_64.${ext}"
}
}
}
@@ -0,0 +1,16 @@
'use strict';
const { contextBridge, ipcRenderer } = require('electron');
contextBridge.exposeInMainWorld('launcher', {
load: () => ipcRenderer.invoke('launcher:load'),
saveGameDir: (dir) => ipcRenderer.invoke('launcher:saveGameDir', dir),
pickFolder: (startDir) => ipcRenderer.invoke('launcher:pickFolder', startDir),
auth: (user, pass) => ipcRenderer.invoke('launcher:auth', { user, pass }),
sync: () => ipcRenderer.invoke('launcher:sync'),
checkUpdates: () => ipcRenderer.invoke('launcher:checkUpdates'),
play: () => ipcRenderer.invoke('launcher:play'),
onProgress: (cb) => {
ipcRenderer.on('launcher:progress', (_e, msg) => cb(msg));
},
});
@@ -0,0 +1,130 @@
'use strict';
const logEl = document.getElementById('log');
const gameDirEl = document.getElementById('gameDir');
const btnBrowse = document.getElementById('btnBrowse');
const btnSave = document.getElementById('btnSaveFolder');
const btnSync = document.getElementById('btnSync');
const btnPlay = document.getElementById('btnPlay');
const btnCheckLauncher = document.getElementById('btnCheckLauncher');
const authCard = document.getElementById('authCard');
const btnAuth = document.getElementById('btnAuth');
const wowExeName = document.getElementById('wowExeName');
function log(msg) {
logEl.textContent += (logEl.textContent ? '\n' : '') + msg;
logEl.scrollTop = logEl.scrollHeight;
}
function setError(e) {
const m = e && (e.message || String(e));
log('Error: ' + m);
}
let authEnabled = false;
let signedIn = false;
async function refresh() {
try {
const s = await window.launcher.load();
authEnabled = s.authEnabled;
signedIn = !s.authEnabled;
wowExeName.textContent = s.wowExe || 'Wow.exe';
gameDirEl.value = s.gameDir || '';
authCard.classList.toggle('hidden', !authEnabled);
btnSync.disabled = !s.wowOk || (authEnabled && !signedIn);
btnPlay.classList.add('hidden');
btnPlay.disabled = true;
logEl.textContent = '';
if (!s.gameDir) log('Choose your WoW installation folder.');
else if (!s.wowOk) log('Folder does not look valid yet — pick the directory that contains ' + (s.wowExe || 'Wow.exe') + ', then Save folder.');
else if (authEnabled && !signedIn) log('Sign in, then download updates.');
else log('Ready — tap Download updates to sync from GitHub.');
} catch (e) {
setError(e);
}
}
window.launcher.onProgress((msg) => log(msg));
btnBrowse.addEventListener('click', async () => {
try {
const start = gameDirEl.value.trim();
const r = await window.launcher.pickFolder(start);
if (!r.canceled && r.path) {
gameDirEl.value = r.path;
log('Selected: ' + r.path);
}
} catch (e) {
setError(e);
}
});
btnSave.addEventListener('click', async () => {
try {
const dir = gameDirEl.value.trim();
if (!dir) {
log('Pick a folder with Browse… first.');
return;
}
const r = await window.launcher.saveGameDir(dir);
gameDirEl.value = r.gameDir;
btnSync.disabled = !r.wowOk || (authEnabled && !signedIn);
log('Saved installation folder.');
} catch (e) {
setError(e);
}
});
btnAuth.addEventListener('click', async () => {
try {
const u = document.getElementById('username').value;
const p = document.getElementById('password').value;
await window.launcher.auth(u, p);
signedIn = true;
log('Signed in.');
    const s = await window.launcher.load();
    btnSync.disabled = !s.wowOk;
} catch (e) {
setError(e);
}
});
btnSync.addEventListener('click', async () => {
btnSync.disabled = true;
log('—');
try {
await window.launcher.sync();
btnPlay.classList.remove('hidden');
btnPlay.disabled = false;
log('Done. You can launch the game.');
} catch (e) {
setError(e);
} finally {
const s = await window.launcher.load().catch(() => null);
btnSync.disabled = !s || !s.wowOk || (authEnabled && !signedIn);
}
});
btnPlay.addEventListener('click', async () => {
try {
await window.launcher.play();
window.close();
} catch (e) {
setError(e);
}
});
btnCheckLauncher.addEventListener('click', async () => {
try {
log('Checking for launcher updates…');
const r = await window.launcher.checkUpdates();
if (r && r.skipped) log('Launcher auto-update: ' + (r.reason || 'skipped (use a packaged build).'));
else if (r && r.ok === false && r.error) setError(new Error(r.error));
} catch (e) {
setError(e);
}
});
refresh();
@@ -0,0 +1,50 @@
#!/usr/bin/env bash
# Push a one-file README so the Gitea repo is non-empty (fixes HTTP 422 "repo is empty"
# when CI creates a release). Safe to re-run only if the repo still has no commits;
# if it already has history, skip or use the Gitea web UI instead.
#
# Usage:
# export GITEA_BASE_URL=https://git.example.com
# export GITEA_OWNER=myorg
# export GITEA_REPO=fractured-patches
# ./bootstrap-gitea-repo.sh
#
# Or pass an explicit clone URL (HTTPS or SSH):
# ./bootstrap-gitea-repo.sh https://git.example.com/myorg/fractured-patches.git
#
set -euo pipefail
BRANCH="${GITEA_TARGET_REF:-main}"
if [ "${1:-}" != "" ]; then
URL="$1"
else
: "${GITEA_BASE_URL:?Set GITEA_BASE_URL or pass clone URL as first argument}"
: "${GITEA_OWNER:?Set GITEA_OWNER or pass clone URL as first argument}"
: "${GITEA_REPO:?Set GITEA_REPO or pass clone URL as first argument}"
BASE="${GITEA_BASE_URL%/}"
URL="${BASE}/${GITEA_OWNER}/${GITEA_REPO}.git"
fi
TMP=$(mktemp -d)
trap 'rm -rf "$TMP"' EXIT
cd "$TMP"
git init -q
git checkout -q -b "$BRANCH"
cat >README.md <<'EOF'
# Fractured release mirror
Release assets (launcher builds, patches, `patch-manifest.json`, etc.) are uploaded here by **GitHub Actions** (“Sync release to Gitea”) from the main Fractured repository.
This initial commit exists because **Gitea requires at least one commit** in the repository before releases can be created.
EOF
git add README.md
git commit -q -m "chore: initial commit (required for Gitea releases)"
git remote add origin "$URL"
git push -u origin "$BRANCH"
echo "Pushed initial README to $URL (branch $BRANCH)."
@@ -0,0 +1,32 @@
#!/usr/bin/env node
/**
* Build patch-manifest.json for a release (same names as files[].source in launcher.json).
*
* Usage (from a folder containing the patch binaries; list every files[].source name):
* node generate-patch-manifest.js v0.9.0-client Wow-patched.exe
*
* Prints JSON to stdout; redirect to a file:
* node generate-patch-manifest.js v0.9.0-client Wow-patched.exe > patch-manifest.json
*/
'use strict';
const fs = require('fs');
const path = require('path');
const crypto = require('crypto');
const version = process.argv[2];
const names = process.argv.slice(3);
if (!version || names.length === 0) {
console.error('Usage: generate-patch-manifest.js <version-label> <file1> [file2 ...]');
console.error(' Example: generate-patch-manifest.js v0.9.0-client Wow-patched.exe');
process.exit(1);
}
const out = { version, files: {} };
for (const f of names) {
const base = path.basename(f);
const buf = fs.readFileSync(f);
const sha256 = crypto.createHash('sha256').update(buf).digest('hex');
out.files[base] = { sha256 };
}
process.stdout.write(`${JSON.stringify(out, null, 2)}\n`);
@@ -0,0 +1,11 @@
#!/usr/bin/env bash
# Local Linux AppImage build (uses current tree — no tag snapshot). Run from repo root or this dir.
set -euo pipefail
ROOT="$(cd "$(dirname "$0")/.." && pwd)"
cd "$ROOT"
echo "==> npm ci"
npm ci
echo "==> npm run pack:linux (AppImage x64)"
npm run pack:linux
echo "==> dist/:"
ls -la dist/
@@ -0,0 +1,73 @@
#!/usr/bin/env bash
# Upload local files to a GitHub release on the public distro repo (default: Dawnforger/Fractured-Distro).
#
# Usage (from repo root or this directory):
# export GH_TOKEN=ghp_... # PAT with repo/releases on the distro repo
# ./tools/fractured-launcher-electron/scripts/publish-to-distro.sh v1.0.0 Wow-patched.exe
#
# Optional:
#   DISTRO_REPO=YourOrg/Fractured-Distro   # if your GitHub slug differs
# SRC_TAG=v1.0.0 ./publish-to-distro.sh v1.0.0 # copy all assets from SOURCE_REPO release SRC_TAG
#
set -euo pipefail
DISTRO_REPO="${DISTRO_REPO:-Dawnforger/Fractured-Distro}"
SOURCE_REPO="${SOURCE_REPO:-Dawnforger/Fractured}"
if ! command -v gh >/dev/null 2>&1; then
echo "Install GitHub CLI: https://cli.github.com/"
exit 1
fi
if [ -z "${GH_TOKEN:-}" ]; then
echo "Set GH_TOKEN to a PAT with releases write access to ${DISTRO_REPO}."
exit 1
fi
if [ "$#" -lt 1 ]; then
echo "Usage: $0 <release-tag> [files...]"
echo " or: SRC_TAG=v1.0.0 $0 <release-tag> # copies all assets from ${SOURCE_REPO} release SRC_TAG"
exit 1
fi
TAG="$1"
shift
if [ "$#" -eq 0 ] && [ -z "${SRC_TAG:-}" ]; then
echo "After the tag, list files to upload, or set SRC_TAG=... to copy all assets from ${SOURCE_REPO}."
exit 1
fi
tmpdir=$(mktemp -d)
cleanup() { rm -rf "$tmpdir"; }
trap cleanup EXIT
if [ "$#" -eq 0 ] && [ -n "${SRC_TAG:-}" ]; then
echo "Downloading assets from ${SOURCE_REPO}@${SRC_TAG}"
gh release download "$SRC_TAG" -R "$SOURCE_REPO" -D "$tmpdir"
else
for f in "$@"; do
if [ ! -f "$f" ]; then
echo "Not a file: $f"
exit 1
fi
cp -a "$f" "$tmpdir/"
done
fi
shopt -s nullglob
files=("$tmpdir"/*)
if [ "${#files[@]}" -eq 0 ]; then
echo "No files to upload."
exit 1
fi
if gh release view "$TAG" -R "$DISTRO_REPO" &>/dev/null; then
gh release upload "$TAG" -R "$DISTRO_REPO" "${files[@]}" --clobber
echo "Uploaded to https://github.com/${DISTRO_REPO}/releases/tag/${TAG}"
else
gh release create "$TAG" -R "$DISTRO_REPO" \
--title "Fractured ${TAG}" \
--notes "Published from ${SOURCE_REPO} (local script)." \
"${files[@]}"
echo "Created https://github.com/${DISTRO_REPO}/releases/tag/${TAG}"
fi
@@ -0,0 +1,91 @@
#!/usr/bin/env bash
# Upload all files in a directory as attachments on a Gitea release (create release if missing).
#
# Usage:
# export GITEA_BASE_URL=https://git.example.com
# export GITEA_TOKEN=gta_...
# export GITEA_OWNER=myorg
# export GITEA_REPO=fractured-patches
# export GITEA_TARGET_REF=main # optional, used when creating a new release (tag must not exist yet)
# ./upload-release-to-gitea.sh /path/to/combined v1.0.0
#
set -euo pipefail
COMBINED_DIR="${1:?first arg: directory of files to attach}"
TAG="${2:?second arg: release tag (e.g. v1.0.0)}"
: "${GITEA_BASE_URL:?Set GITEA_BASE_URL (no trailing slash required)}"
: "${GITEA_TOKEN:?Set GITEA_TOKEN}"
: "${GITEA_OWNER:?Set GITEA_OWNER}"
: "${GITEA_REPO:?Set GITEA_REPO}"
BASE="${GITEA_BASE_URL%/}"
API="$BASE/api/v1"
TARGET="${GITEA_TARGET_REF:-main}"
AUTH_H=(-H "Authorization: token ${GITEA_TOKEN}" -H "Accept: application/json")
# URL-encode the tag so names containing '/' or '+' survive in the API path.
TAG_ENC=$(python3 -c "import urllib.parse,sys; print(urllib.parse.quote(sys.argv[1], safe=''))" "$TAG")
REL_JSON=$(mktemp)
trap 'rm -f "$REL_JSON"' EXIT
code=$(curl -sS -o "$REL_JSON" -w "%{http_code}" "${AUTH_H[@]}" \
"$API/repos/${GITEA_OWNER}/${GITEA_REPO}/releases/tags/${TAG_ENC}")
if [ "$code" = "200" ]; then
rel_id=$(jq -r '.id' "$REL_JSON")
elif [ "$code" = "404" ]; then
body=$(jq -n \
--arg tag "$TAG" \
--arg name "Fractured $TAG" \
--arg body "Synced from GitHub Actions (Fractured)." \
--arg target "$TARGET" \
'{tag_name:$tag,name:$name,body:$body,draft:false,prerelease:false,target_commitish:$target}')
code=$(curl -sS -o "$REL_JSON" -w "%{http_code}" -X POST "${AUTH_H[@]}" \
-H "Content-Type: application/json" \
-d "$body" \
"$API/repos/${GITEA_OWNER}/${GITEA_REPO}/releases")
if [ "$code" != "201" ] && [ "$code" != "200" ]; then
echo "Gitea create release failed HTTP $code:" >&2
cat "$REL_JSON" >&2
if [ "$code" = "422" ] && jq -e '.message == "repo is empty"' "$REL_JSON" >/dev/null 2>&1; then
echo >&2
echo "Gitea does not allow releases on a repo with zero commits. Fix: push at least one commit" >&2
echo "to ${GITEA_OWNER}/${GITEA_REPO} (e.g. add README.md on branch ${TARGET} via web UI or git push)," >&2
echo "or set Actions variable GITEA_TARGET_REF to an existing default branch name." >&2
fi
exit 1
fi
rel_id=$(jq -r '.id' "$REL_JSON")
else
echo "Gitea GET release by tag failed HTTP $code:" >&2
cat "$REL_JSON" >&2
exit 1
fi
if [ -z "$rel_id" ] || [ "$rel_id" = "null" ]; then
echo "Could not resolve Gitea release id" >&2
exit 1
fi
# Delete any existing attachments first so re-runs replace assets instead of duplicating them.
while read -r aid; do
  if [ -z "$aid" ] || [ "$aid" = "null" ]; then continue; fi
curl -fsS -X DELETE "${AUTH_H[@]}" \
"$API/repos/${GITEA_OWNER}/${GITEA_REPO}/releases/${rel_id}/assets/${aid}" || true
done < <(jq -r '(.attachments // .assets // [])[] | .id' "$REL_JSON")
shopt -s nullglob
files=("$COMBINED_DIR"/*)
if [ "${#files[@]}" -eq 0 ]; then
echo "No files in $COMBINED_DIR" >&2
exit 1
fi
for f in "${files[@]}"; do
[ -f "$f" ] || continue
echo "Uploading $(basename "$f")"
curl -fsS -X POST "${AUTH_H[@]}" \
-F "attachment=@${f}" \
"$API/repos/${GITEA_OWNER}/${GITEA_REPO}/releases/${rel_id}/assets"
done
echo "Gitea release $TAG (id=$rel_id) updated with ${#files[@]} file(s)."
@@ -0,0 +1,126 @@
* {
box-sizing: border-box;
}
body {
margin: 0;
  font-family: system-ui, "Segoe UI", Roboto, sans-serif;
background: #121018;
color: #e8e4f0;
padding: 20px 24px 28px;
min-height: 100vh;
}
header h1 {
margin: 0 0 6px;
font-size: 1.35rem;
font-weight: 600;
}
.sub {
margin: 0 0 18px;
color: #9a92b0;
font-size: 0.9rem;
}
.card {
background: #1c1828;
border: 1px solid #2e2840;
border-radius: 10px;
padding: 14px 16px;
margin-bottom: 14px;
}
.lbl {
display: block;
font-size: 0.8rem;
color: #b8b0d0;
margin-bottom: 8px;
}
.row {
display: flex;
gap: 8px;
align-items: center;
flex-wrap: wrap;
}
.row.stack {
flex-direction: column;
align-items: stretch;
}
#gameDir {
flex: 1;
min-width: 200px;
padding: 10px 12px;
border-radius: 8px;
border: 1px solid #3d3558;
background: #0e0c14;
color: #f0ecff;
font-size: 0.85rem;
}
input[type='text'],
input[type='password'] {
padding: 10px 12px;
border-radius: 8px;
border: 1px solid #3d3558;
background: #0e0c14;
color: #f0ecff;
font-size: 0.9rem;
}
button {
padding: 10px 16px;
border-radius: 8px;
border: 1px solid #4a4268;
background: #2a243c;
color: #e8e4f0;
cursor: pointer;
font-size: 0.88rem;
}
button:hover:not(:disabled) {
background: #352d4c;
}
button:disabled {
opacity: 0.45;
cursor: not-allowed;
}
button.primary {
background: #4c3d8a;
border-color: #5c4d9a;
}
button.primary:hover:not(:disabled) {
background: #5a4a9e;
}
button.success {
background: #1d6b45;
border-color: #2a8a5a;
margin-top: 10px;
}
button.success:hover:not(:disabled) {
background: #258055;
}
button.ghost {
background: transparent;
border-color: #4a4268;
color: #b0a8d0;
font-size: 0.82rem;
}
button.ghost:hover:not(:disabled) {
background: #241f34;
}
.row-actions {
padding: 10px 16px;
}
button.wide {
width: 100%;
}
.log {
margin: 12px 0 0;
padding: 12px;
background: #0a090e;
border-radius: 8px;
border: 1px solid #2a2438;
min-height: 120px;
max-height: 200px;
overflow: auto;
font-size: 0.78rem;
color: #c4bdd8;
white-space: pre-wrap;
word-break: break-word;
}
.hidden {
display: none !important;
}