Fluxed Up: The Flux NSFW Checkpoint on CivitAI
Fluxed Up is the most-downloaded Flux NSFW checkpoint on CivitAI, and "fluxed up 7.1" is what most search results still point at. This page reconciles the version situation: v7.1 is still there, v10.x is current. It also maps each precision tier to the VRAM band it fits and walks through the local setup recipe — VAE, CLIP-L, T5-XXL, sampler, and scheduler — so you can pick a file and run it the same hour.
The Models
1. Fluxed Up (Flux NSFW Checkpoint)
Top pick. Most-downloaded Flux NSFW checkpoint on CivitAI. 100k+ downloads, 10M+ generations, 545+ reviews. Five format tiers, from the 22.17 GB BF16 down to the ~6.81 GB Q4 GGUF.
Architecture: Flux.1 D (12B fine-tune) · VRAM: 12 GB+ FP16, 8 GB FP8, ~6 GB Q4 GGUF · Best for: Photorealistic female NSFW, run locally
View on CivitAI →
2. Persephone 2.0
~1.1M downloads, 182 reviews. The pick if Fluxed Up's female-subject bias is too narrow.
Architecture: Flux.1 D (merge) · VRAM: 12 GB+ FP16, 8 GB+ FP8, 6 GB+ Q4 GGUF · Best for: NSFW + SFW dual use
View on CivitAI →
3. aidmaNSFWunlock LoRA
1,380+ reviews. Trigger word aidmaNSFWunlock at strength 0.5–1.0. Runs on top of plain Flux dev — no checkpoint swap.
Architecture: Flux LoRA · VRAM: Minimal overhead · Best for: Add NSFW to existing Flux dev
View on CivitAI →
4. Flux Unchained (SCG)
681 reviews, ~5k training images. Last meaningful update Aug 2024.
Architecture: Flux.1 D · VRAM: 12 GB+ · Best for: Stable, archived checkpoint
View on CivitAI →
Version Map
The CivitAI version dropdown is long — 30+ files across 14 months. Here's what's actually current versus what older blog posts and search results point at.
| Version | Released | Status | Why you'd pick it |
|---|---|---|---|
| 10.2_BF16 | May 7, 2026 | Early Access (paid) | Latest. Free release follows. |
| 10.0_BF16 | recent | Latest free BF16 | Default for 16 GB+ cards |
| 9.0_BF16 / 9.0_FP8 / 9.0_FP16 | — | Free, multi-format | Good FP8 option for 8–12 GB |
| 8.1 / 8.0_BF16 | Mar 2026 | Free | The "March 2026" reference release |
| 7.1_FP16 / 7.1_Q8_GGUF / 7.1_Q4_GGUF | Feb 2026 | Free | What most "Fluxed Up 7.1" results land on |
| 7.0 / 6.x / 5.1 / 5.0 / 4.x / 3.x / 2.x / 1.0 | older | Free | Archived family |
If you searched "fluxed up 7.1 civitai": that release is still downloadable, from the same version dropdown. v10.0 is the newer free pick if you want the updated training. v7.1 is the recommended floor if you want a Q4 or Q8 GGUF, since that family is the one packaged with both quantizations.
The creator's version cadence is roughly monthly, with each new BF16 starting in paid Early Access for a few weeks before flipping to free.
Pick Your File for Your VRAM
Fluxed Up ships in five precision tiers across versions. Match to your card:
| Your VRAM | Recommended file | On-disk size | Notes |
|---|---|---|---|
| 6–8 GB | 7.1_Q4_GGUF or 5.1_Q4_K_S | ~6–7 GB | Use the GGUF T5 encoder too |
| 8–12 GB | 7.1_Q8_GGUF, or 9.0_FP8 / 5.1_FP8 | ~12–13 GB | FP8 if your version has it |
| 12–16 GB | 9.0_FP16 / 7.1_FP16 | ~22 GB | Full Flux dev floor |
| 16–24 GB+ | 10.0_BF16 / 9.0_BF16 | 22.17 GB | Full precision, room for LoRAs |
The full BF16 file is 22.17 GB (current 10.2_BF16 listing). The Q4 GGUF floor matches the standard flux1-dev-Q4_K_S.gguf size of ~6.81 GB, which is the same physical footprint as base Flux dev — Fluxed Up's fine-tune doesn't change quantization size.
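The table's decision logic is simple enough to sketch. A minimal helper, assuming one reasonable reading of the overlapping VRAM bands; the return values name the tiers from the table, not exact download filenames:

```python
def pick_fluxed_up_file(vram_gb: int) -> str:
    """Map a card's VRAM to a Fluxed Up file tier (mirrors the table above).

    Band edges are one reading of the overlapping ranges; names are
    the tiers from the table, not exact CivitAI download filenames.
    """
    if vram_gb < 8:
        return "7.1_Q4_GGUF"   # ~6-7 GB on disk; pair with a GGUF T5 encoder
    if vram_gb < 12:
        return "9.0_FP8"       # ~12-13 GB; 7.1_Q8_GGUF also fits this band
    if vram_gb < 16:
        return "9.0_FP16"      # ~22 GB; the full Flux dev floor
    return "10.0_BF16"         # 22.17 GB full precision, headroom for LoRAs

print(pick_fluxed_up_file(8))  # -> 9.0_FP8
```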
GGUF needs the ComfyUI-GGUF custom node pack (install via ComfyUI Manager → "Install Missing Custom Nodes") and the matching Unet Loader (GGUF) and DualCLIPLoader (GGUF) workflow nodes.
Files You Need Alongside the Checkpoint
Fluxed Up does not bake in VAE, CLIP, or T5 — the model card says this directly. Download these once and reuse them across every Flux checkpoint:
- clip_l.safetensors — CLIP-L text encoder. From comfyanonymous/flux_text_encoders.
- t5xxl_fp16.safetensors (or a GGUF T5 if you're tight on VRAM) — T5-XXL text encoder. Same HF repo, or city96/t5-v1_1-xxl-encoder-gguf for the quantized versions.
- ae.safetensors — Flux VAE. Mirrored at Comfy-Org/HiDream-I1_ComfyUI.
ComfyUI placement:
- Checkpoint → ComfyUI\models\diffusion_models\ (GGUF) or ComfyUI\models\checkpoints\ (SafeTensor).
- CLIP / T5 → ComfyUI\models\clip\.
- VAE → ComfyUI\models\vae\.
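The routing above can be sketched as a small helper that builds the destination path under a standard ComfyUI tree. The filename-based rules here are an illustration-only assumption, not part of ComfyUI itself:

```python
from pathlib import Path

def comfyui_dest(comfy_root: str, filename: str) -> Path:
    """Route a downloaded file to the ComfyUI models subfolder listed above.

    Assumption: files are recognized by name/extension; adjust the rules
    if your checkpoint or encoder filenames differ.
    """
    models = Path(comfy_root) / "models"
    if filename.endswith(".gguf"):
        sub = "diffusion_models"   # GGUF files load via Unet Loader (GGUF)
    elif filename in ("clip_l.safetensors", "t5xxl_fp16.safetensors"):
        sub = "clip"               # CLIP-L and T5-XXL text encoders
    elif filename == "ae.safetensors":
        sub = "vae"                # Flux VAE
    else:
        sub = "checkpoints"        # full .safetensors checkpoints
    return models / sub / filename

print(comfyui_dest("ComfyUI", "ae.safetensors").as_posix())
# -> ComfyUI/models/vae/ae.safetensors
```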
Sampler and Steps
The model card pins the sampler and scheduler:
- Sampler: dpmpp_2m (DPM++ 2M).
- Scheduler: beta.
- Steps: not pinned. 20 is the broad Flux dev default; the creator defers to the per-image workflows in the sample posts.
These match the recommended Flux dev sampler stack, so you don't need to reconfigure anything if you're already running another Flux fine-tune.
ComfyUI or Forge
Both work. The model card explicitly addresses Forge: "If you had problems with GGUF-versions in Forge, it should now work. Please re-download the model to get an updated version". So Forge is a supported target alongside ComfyUI, and any older GGUF download issues have been fixed in recent versions.
You have three reasonable ways to actually run it:
- ComfyUI — most flexibility, native Flux + GGUF support via the ComfyUI-GGUF nodes. The creator links a beginner workflow at civitai.com/articles/17080.
- Forge — easier UI, fast, native Flux support, GGUF-compatible. Best fit if you don't want to touch nodes.
- LocalForge AI — Forge pre-configured with Flux working out of the box. Skip the VAE/CLIP/T5 download dance.
If you already have a Flux dev workflow running in either frontend, swap the checkpoint loader to point at the Fluxed Up file and you're done.
License Caveat
Fluxed Up inherits the FLUX.1 [dev] Non-Commercial License v2.0 from Black Forest Labs. That covers personal use, hobby projects, and R&D — but not revenue-generating use, paid end-user APIs, or training other models for commercial purposes. Full terms: bfl.ai/legal/non-commercial-license-terms.
If you need commercial rights, you have two options:
- BFL Self-Hosted Commercial License — a paid license from Black Forest Labs that lifts the NC restriction on Flux dev derivatives. See bfl.ai/legal/self-hosted-commercial-license-terms.
- Switch to CHROMA — a Flux Schnell fork licensed Apache 2.0, no NC restriction. Less polished for NSFW than Fluxed Up, but it's the only commercial-friendly Flux fork.
Alternatives in the Same Lane
Fluxed Up is the default pick, but it's not the only one. Three alternatives worth knowing about:
- Persephone 2.0 — handles NSFW and SFW from one checkpoint. ~1.1M downloads, 182 reviews. The pick if Fluxed Up's heavy female-subject + nude bias is too narrow.
- aidmaNSFWunlock LoRA — 1,380+ reviews. Adds NSFW capability to plain Flux dev without swapping checkpoints. Trigger word aidmaNSFWunlock at strength 0.5–1.0. The "add-on" answer to Fluxed Up's "replace your checkpoint" answer.
- Flux Unchained — 681 reviews, trained on ~5k explicit images. Last meaningful update Aug 2024. Solid frozen alternative if you want a stable, archived checkpoint.
For the full ranked comparison see Best Flux NSFW Models on CivitAI or the Flux NSFW Checkpoints rankings.
What to Do Next
- Need download links for VAE/CLIP/T5 in one place? Flux NSFW Models Download List — every file with size and direct link.
- Setting it up in ComfyUI? Flux NSFW in ComfyUI — node wiring and FP8/GGUF paths.
- Want a LoRA add-on instead of a checkpoint swap? aidmaNSFWunlock deep dive — strength settings, stack order, trigger word.
Verdict
Fluxed Up is the default Flux NSFW checkpoint — most downloads, most generations, most active maintenance. It's also the only one in the lane with five precision tiers, spanning 24 GB workstation cards down to 6 GB consumer GPUs. v7.1 still works and is still downloadable; v10.0 is the current free pick. Grab the format that matches your VRAM, install the standard Flux VAE + CLIP-L + T5-XXL files alongside, set sampler to dpmpp_2m with the beta scheduler, and you're running. Only watch the license — non-commercial only unless you switch to CHROMA or buy BFL's commercial license.