DIPPER XXL Paraphraser

An 11B-parameter discourse paraphraser (a T5-XXL fine-tune) with controllable lexical and order diversity. Runs on ZeroGPU (H200); the first request may take a moment while the model loads.

Controls: Lexical diversity (0–100) · Order diversity (0–100)

Tips

  • Context prefix — paste the paragraph before the one you're paraphrasing. The model conditions on it to maintain coherence. Leave blank for standalone text.
  • For multi-chunk inputs, each chunk automatically uses the previous output as context.
  • Keep inputs to a few paragraphs — very long texts will hit the 3-minute GPU timeout.
  • Order diversity only has a noticeable effect at 60+; start with Lexical 60 / Order 0 for clean rewording.
  • Built from the NeurIPS 2023 paper: Paraphrasing evades detectors of AI-generated text, but retrieval is an effective defense (arXiv)
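The context-prefix and chunk-chaining behavior described above can be sketched in a few lines. This is a minimal illustration, not the Space's actual code: the "lexical = L, order = O … <sent> … </sent>" input format follows the DIPPER authors' reference implementation, but the helper names (build_prompt, paraphrase_chunks) and the inversion of the diversity knobs into control codes are assumptions for this sketch.

```python
def build_prompt(text, context="", lex_diversity=60, order_diversity=0):
    """Build a DIPPER-style input string (format assumed from the
    authors' reference code). Diversity knobs run 0-100 in steps of 20
    and are inverted into control codes: higher diversity, lower code."""
    assert lex_diversity in range(0, 101, 20), "lexical diversity: multiple of 20"
    assert order_diversity in range(0, 101, 20), "order diversity: multiple of 20"
    lex_code = 100 - lex_diversity
    order_code = 100 - order_diversity
    prompt = f"lexical = {lex_code}, order = {order_code}"
    if context:
        # The model conditions on the preceding paragraph for coherence.
        prompt += f" {context}"
    prompt += f" <sent> {text} </sent>"
    return prompt

def paraphrase_chunks(chunks, paraphrase_fn, **knobs):
    """Paraphrase chunks in order, feeding each chunk's output back in
    as the context prefix for the next chunk (as the Tips describe)."""
    outputs = []
    context = ""
    for chunk in chunks:
        out = paraphrase_fn(build_prompt(chunk, context, **knobs))
        outputs.append(out)
        context = out  # previous output becomes the next chunk's context
    return " ".join(outputs)
```

Here paraphrase_fn stands in for a call to the 11B model's generate step; any callable that maps a prompt string to a paraphrase will do for testing the chaining logic.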