Can I use my own OpenAI or Anthropic API Key or fine-tuned model?
“Advanced users may wish to deploy their own trained models to achieve optimal results in specific domains, or prefer to utilize their own API quota directly.”
Root Cause Analysis
Enterprise BYOK Support
The enterprise private deployment edition supports configuring your organization's own LLM API key (for example, an Azure OpenAI endpoint), so enterprises can take advantage of existing discounted rates or dedicated channels.
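For illustration only, here is a minimal sketch of what routing a translation request through your own Azure OpenAI deployment could look like with the OpenAI Python SDK; the endpoint, deployment name, and environment variable names are placeholders, not the product's actual configuration interface.

```python
# Sketch: pointing a client at an organization's own Azure OpenAI deployment (BYOK).
# Endpoint, deployment name, and environment variables below are placeholders.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],         # the organization's own key
    api_version="2024-02-01",                           # example API version
    azure_endpoint="https://your-org.openai.azure.com", # placeholder endpoint
)

response = client.chat.completions.create(
    model="your-gpt4o-deployment",  # Azure deployment name, not a public model ID
    messages=[{"role": "user", "content": "Translate to French: Hello, world."}],
)
print(response.choices[0].message.content)
```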
Fine-tuned Model Integration
If you have a fine-tuned model ID with OpenAI, you can integrate it through our enterprise console. The document parsing engine manages formatting, while your dedicated model generates the translation content.
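To illustrate the underlying API call, the sketch below addresses a fine-tuned model by its OpenAI model ID through the Python SDK; the model ID, prompt, and key handling are placeholder examples, and the actual integration is configured in the enterprise console rather than in code you write yourself.

```python
# Sketch: calling a fine-tuned OpenAI model by its model ID.
# The ID below is a placeholder; fine-tuned IDs follow the "ft:<base>:<org>::<suffix>" pattern.
import os

from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

response = client.chat.completions.create(
    model="ft:gpt-4o-mini-2024-07-18:your-org::abc123",  # placeholder fine-tuned model ID
    messages=[
        {"role": "system", "content": "Translate the user's text into German, preserving domain terminology."},
        {"role": "user", "content": "The quarterly report is due on Friday."},
    ],
)
print(response.choices[0].message.content)
```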
Hybrid Cloud Architecture
The hybrid architecture supports routing strategies such as 'sensitive data processed by private models, general data processed by public models,' enabling flexible control over cost and security.
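A hypothetical sketch of such a routing policy follows; the two clients, the deployment names, and the is_sensitive() check are illustrative assumptions, not the product's built-in routing logic.

```python
# Sketch of a hybrid routing policy: sensitive documents go to a private deployment,
# everything else goes to a public API. Clients, model names, and the is_sensitive()
# check are illustrative placeholders.
import os

from openai import AzureOpenAI, OpenAI

private_client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
    azure_endpoint="https://your-org.openai.azure.com",  # private/dedicated channel
)
public_client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])


def is_sensitive(text: str) -> bool:
    """Placeholder check; real policies might use document tags, metadata, or DLP rules."""
    return "confidential" in text.lower()


def translate(text: str) -> str:
    if is_sensitive(text):
        client, model = private_client, "your-private-deployment"  # placeholder deployment name
    else:
        client, model = public_client, "gpt-4o-mini"
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": f"Translate to English: {text}"}],
    )
    return response.choices[0].message.content
```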
Final Solution Summary
The platform delivers the architectural flexibility to accommodate the technology stack requirements of organizations ranging from startups to multinational enterprises.