Internal Approval Request (Ringi) Auto-Draft Generation: Structured Output Format Design + Prompt Version Comparison
Internal approval request (ringi) scenarios are a strong fit for Dify, but the prerequisite is not "writing a paragraph that looks like a report"; it is having the system output a draft that closely matches the organization's actual internal approval format.
There are not many public articles on this topic, so it is better positioned as a structured application case for “enterprise document auto-generation.” The focus is not on business background, but on output format design and prompt version iteration.
Two important clues can be found in public sources. First, Japanese enterprises really do operate on an "approval request first, system second" basis, and many AI adoption efforts themselves get stuck in the ringi and approval process. Second, Dify already has sufficiently mature public practices in enterprise document generation, draft generation, and Human in the Loop, which can support a use case of "auto-draft generation + manual confirmation." In other words, while this topic lacks a directly matching standard answer, it does not lack combinable public evidence.
1. Design Premises Confirmed from Public Sources
1. Approval Documents Are First and Foremost an “Approval Interface,” Not a “Writing Task”
Public articles about building local AI environments in manufacturing companies, while not Dify configuration tutorials, make one point very clear: what truly blocks organizations is not the model, but the ringi, budget, compliance, and accountability requirements. In other words, the core objective of an approval document is to be understandable, comparable, and approvable by the organization.
2. Enterprise Document Generation Without Structure Is Nearly Impossible to Connect to Approval Flows
Multiple Dify enterprise practice articles repeatedly emphasize one issue: if AI-generated content is just natural language long text, it is very difficult to connect to forms, work orders, approvals, or archiving systems. Therefore, this type of scenario must prioritize JSON / fixed fields / template output.
3. Approval Drafts Are Best Suited for “Auto-Generate First, Then Manual Confirmation”
Public HITL articles suggest that approval documents are not suited to one-step, fully automated submission. A more reliable approach is: AI handles organizing and completing the draft structure, while humans confirm key fields, budgets, benefits, and risk descriptions.
2. Why Approval Draft Generation Must Be Structured
Approval documents differ from regular articles — they typically have fixed fields:
- Purpose of application
- Background and current situation
- Budget amount
- Expected benefits
- Risks and countermeasures
- Items requiring approval
If the output is not structured, business departments will quickly feel that “it looks like a lot was written, but it cannot be directly submitted.”
Therefore, this scenario recommends prioritizing JSON or fixed Markdown template output, which can then be received by backend or form systems.
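To make the benefit concrete, here is a minimal sketch of how a fixed JSON output can be consumed by a downstream form system. The field names and the `form/...` keys are illustrative assumptions, not a Dify-defined schema.

```python
import json

# Hypothetical raw output from the Dify workflow: a JSON string with fixed fields.
raw_output = """{
  "title": "Introduction of an automated invoice OCR tool",
  "department": "Accounting",
  "budget_jpy": null,
  "expected_benefits": "Reduce manual entry time by roughly 30 hours per month"
}"""

draft = json.loads(raw_output)

# A form system can map fields directly instead of parsing free text.
form_payload = {
    "form/title": draft["title"],
    "form/department": draft["department"],
    # A null budget stays visible as "to be supplemented" rather than a guessed number.
    "form/budget": draft["budget_jpy"] if draft["budget_jpy"] is not None else "TBD",
}
print(form_payload)
```

With free-text output, every one of these mappings would require fragile text parsing; with fixed fields, missing data is detectable instead of hidden inside fluent prose.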
3. Recommended Output Fields
A common ringi draft structure can be designed as:
- Application title
- Requesting department
- Applicant
- Background issue
- Purpose
- Solution overview
- Cost estimate
- Expected benefits
- Risk points
- Decision items requiring approval
- Attachment list
If the enterprise approval process is highly standardized, additional fields can be added:
- Return on investment period
- Alternative solution comparison
- Whether personal information / compliance review is involved
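The field list above can be expressed as an empty JSON template that the model is asked to fill. This is a sketch under assumptions: the English field names are illustrative, and unset values are kept as `null` so downstream systems can detect "to be supplemented" items.

```python
import json

# Hypothetical field set mirroring the list above; names are illustrative,
# not a Dify-defined schema.
RINGI_FIELDS = [
    "title", "department", "applicant",
    "background", "purpose", "solution_overview",
    "cost_estimate", "expected_benefits", "risk_points",
    "decision_items", "attachments",
]

# Optional fields for highly standardized approval processes.
OPTIONAL_FIELDS = ["roi_period", "alternatives_compared", "compliance_review_needed"]

# An empty draft template for the model to fill; None/null marks unfilled fields.
template = {field: None for field in RINGI_FIELDS}
print(json.dumps(template, indent=2))
```

Keeping the template as data (rather than prose instructions only) also lets the same field list drive both the prompt and the output validation.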
4. Prompt Version Comparison Approach
Version 1: Pure Generation Type
Let the model write a ringi draft directly based on input.
Typical problems:
- Fluent style, but unstable fields
- Easily misses items
- Cannot be directly submitted to forms
Version 2: Template-Constrained Type
Explicitly require output by field, with definitions for each field.
Results improve noticeably: at least structural completeness is ensured.
Version 3: Structured + Missing Item Alert Type
In addition to field output, also require:
- Missing information must be flagged
- Insufficient content must list items to be supplemented
- Sensitive fields such as amounts, dates, and benefits must not be guessed
This is the version more suitable for enterprise deployment, because the biggest risk with approval documents is a draft that looks complete while the data inside it is actually fabricated.
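The "missing item alert" behavior of version 3 can also be enforced on the receiving side. Below is a minimal sketch of a post-generation check; the field names, the `provided_inputs` mechanism, and the alert wording are all illustrative assumptions.

```python
# Required fields a ringi draft must contain (illustrative names).
REQUIRED = ["title", "department", "background", "purpose",
            "cost_estimate", "expected_benefits", "risk_points", "decision_items"]

# Sensitive fields that must come from the input, never from model guesses.
SENSITIVE = {"cost_estimate", "expected_benefits"}

def check_draft(draft: dict, provided_inputs: set) -> list:
    """Return human-readable alerts for a generated draft."""
    alerts = []
    for field in REQUIRED:
        value = draft.get(field)
        if value in (None, "", "to be supplemented"):
            alerts.append(f"missing: {field} (to be supplemented)")
        elif field in SENSITIVE and field not in provided_inputs:
            # The model produced a value the user never supplied:
            # treat it as potentially fabricated and send it back for confirmation.
            alerts.append(f"unverified: {field} was not in the input; confirm before submission")
    return alerts

draft = {"title": "New CRM rollout", "department": "Sales",
         "background": "Current tool reaches end of life", "purpose": "Replace CRM",
         "cost_estimate": "JPY 2,000,000", "expected_benefits": None,
         "risk_points": "Migration downtime", "decision_items": "Approve vendor contract"}
print(check_draft(draft, provided_inputs={"title", "department", "background"}))
```

In this example the check flags both a missing benefit description and a cost estimate that never appeared in the input, which is exactly the "looks complete but fabricated" failure mode version 3 is designed to catch.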
5. Recommended Prompt Skeleton
The system prompt should emphasize:
- You are an approval draft organizing assistant, not an approval decision-maker
- You can only generate based on input information
- When data is missing, you must mark it as “to be supplemented”
- Output must follow a fixed field order
- Do not fabricate ROI, budgets, or timelines
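The points above can be combined into a concrete system prompt. This is a hedged sketch, not an official Dify template; the exact wording and field order are assumptions you would adapt to your own format.

```python
# Illustrative system prompt implementing the skeleton above.
SYSTEM_PROMPT = """\
You are an assistant that organizes internal approval (ringi) drafts.
You are NOT an approval decision-maker and must not recommend approval or rejection.

Rules:
1. Generate content ONLY from the information provided in the input.
2. If a required field has no supporting input, output exactly "to be supplemented".
3. Output fields in this fixed order: title, department, applicant, background,
   purpose, solution_overview, cost_estimate, expected_benefits, risk_points,
   decision_items, attachments.
4. Never invent ROI figures, budget amounts, or timelines.

Return the result as a single JSON object with exactly those keys.
"""
print(SYSTEM_PROMPT)
```

Note how every bullet in the skeleton maps to a numbered rule; keeping that mapping explicit makes prompt iterations easier to diff between versions.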
6. How to Organize Input Information
To make drafts more stable, the input side should not just provide a single free-text block. It should at least be broken into:
- Requirement background
- Application item
- Budget ceiling
- Desired launch date
- Related departments
- Attachment summary
This is equivalent to doing a lightweight form structuring first, then letting Dify handle organizing the expression.
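A lightweight sketch of that form structuring step is shown below: the six input fields are collected from a simple form and assembled into the user message. The field keys and labels are illustrative assumptions, and missing inputs are passed through explicitly so the model marks them instead of guessing.

```python
def build_user_message(inputs: dict) -> str:
    """Assemble structured form inputs into the user message for the draft step."""
    labels = {
        "background": "Requirement background",
        "item": "Application item",
        "budget_ceiling": "Budget ceiling",
        "launch_date": "Desired launch date",
        "departments": "Related departments",
        "attachments": "Attachment summary",
    }
    lines = []
    for key, label in labels.items():
        value = inputs.get(key)
        # Surface missing inputs explicitly rather than silently omitting them,
        # which would tempt the model to fill the gap with a guess.
        lines.append(f"{label}: {value if value else '(not provided)'}")
    return "\n".join(lines)

msg = build_user_message({"background": "Manual invoice entry is error-prone",
                          "item": "OCR tool license",
                          "budget_ceiling": "JPY 500,000"})
print(msg)
```

Splitting the input this way is what makes rule 2 of the prompt ("to be supplemented" for missing data) actually enforceable: the model can see which slots were empty.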
7. Review Nodes Worth Adding
Although the title says "auto-draft generation," this type of scenario still warrants a manual confirmation step, especially when any of the following apply:
- Budget field is empty
- Involves personal information, legal, or procurement matters
- Expected benefits description is too vague
- Requires senior management approval
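These trigger conditions can be implemented as a simple gate in front of submission. The sketch below assumes the draft is already a dict; the keyword list, the length threshold for "too vague," and the `requires_senior_approval` flag are all illustrative assumptions.

```python
def needs_human_review(draft: dict) -> bool:
    """Return True if any review-trigger condition from the list above applies."""
    # 1. Budget field is empty.
    if not draft.get("cost_estimate"):
        return True
    # 2. Involves personal information, legal, or procurement matters.
    sensitive_topics = {"personal information", "legal", "procurement"}
    text = " ".join(str(v) for v in draft.values() if v).lower()
    if any(topic in text for topic in sensitive_topics):
        return True
    # 3. Expected benefits description is too vague (crude length heuristic).
    benefits = draft.get("expected_benefits") or ""
    if len(benefits) < 20:
        return True
    # 4. Flagged as requiring senior management approval.
    if draft.get("requires_senior_approval"):
        return True
    return False

draft = {"cost_estimate": "JPY 300,000",
         "expected_benefits": "Save about 10 hours of manual work per week",
         "requires_senior_approval": False}
print(needs_human_review(draft))
```

In practice this gate would route the draft to a Dify HITL confirmation node rather than rejecting it outright, keeping the human in the loop only where the risk is real.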
8. Conclusion
Internal approval request auto-draft generation is not about who writes more like a human, but about who more closely matches the organization’s actual approval format. What truly determines usability is often structural design rather than eloquence.
If you later obtain internal ringi templates, approval field screenshots, or prompt iteration records, those are the materials with which this article is most worth expanding.
Public Source References
note.com
- Can’t Get Approval Requests Through. So I Decided to Secretly Build an AI Environment on a Spare PC | https://note.com/kusanone_dx/n/nd2952fe10842
- [Cost Reduction] Achieving Business Efficiency with Dify: 10 Use Cases You Can Apply Tomorrow | https://note.com/sone_ai/n/n65c4c48417ac
zenn.dev / Official Documentation / Other Public Pages
- Human-in-the-Loop Use Cases: Specific Operational Patterns in Dify … | https://zenn.dev/nocodesolutions/articles/62a03c6770b824
- Applying Human-in-the-Loop Concepts in Dify to Prevent AI Runaway … | https://zenn.dev/nocodesolutions/articles/df0d883c7d1f79
Verified Information from Public Sources for This Article
- In the Japanese enterprise context, ringi and approval processes themselves are practical barriers to AI adoption, making draft generation use cases very reasonable
- Dify already has publicly verifiable HITL design patterns suitable for manual confirmation after approval draft generation
- Structured output is more suitable than natural language long text for entering approval systems, form systems, or archiving workflows