
Plugins

Plugins are an upcoming feature of talentbot-enterprise-llmops-alops. You will be able to incorporate plugins into your App orchestration and access AI applications with plugin capabilities through an API or WebApp. talentbot-enterprise-llmops-alops is compatible with the ChatGPT Plugins standard and also provides some native plugins.

Based on WebApp Template

If you are developing a new product from scratch or are still in the prototype design phase, you can quickly launch an AI site using talentbot-enterprise-llmops-alops. At the same time, we want developers to be fully free to build different forms of front-end applications. To that end, we provide:

  • SDKs for quickly accessing the talentbot-enterprise-llmops-alops API in various languages (a minimal calling sketch follows this list)

  • WebApp Templates that provide development scaffolding for each type of application
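
As a minimal illustration of calling the talentbot-enterprise-llmops-alops API from your own front end, the sketch below sends a chat message over plain HTTP. The base URL, endpoint path, and payload fields are assumptions for illustration only; check your App's API reference (or the SDK for your language) for the exact contract.

```ts
// Minimal sketch of calling the App API over HTTP.
// API_BASE_URL, the endpoint path, and the payload shape are assumptions;
// consult your App's API reference for the exact contract.
const API_BASE_URL = 'https://your-talentbot-host/v1' // assumed base URL
const API_KEY = process.env.APP_API_KEY ?? ''          // the App's API key

async function sendChatMessage(query: string, user: string): Promise<unknown> {
  const res = await fetch(`${API_BASE_URL}/chat-messages`, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      inputs: {},                // variables defined in prompt orchestration
      query,                     // the end user's message
      user,                      // a stable identifier for the end user
      response_mode: 'blocking', // or 'streaming', depending on the App
    }),
  })
  if (!res.ok) throw new Error(`API request failed: ${res.status}`)
  return res.json()
}

// Example usage:
// sendChatMessage('Hello!', 'end-user-42').then(console.log)
```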

The WebApp Templates are open source under the MIT license. You are free to modify and deploy them to gain the full capabilities of talentbot-enterprise-llmops-alops, or to use them as reference code for implementing your own App.

You can find these Templates on GitHub.

The fastest way to use the WebApp Template is to click "Use this template" on GitHub, which is equivalent to forking a new repository. Then you need to configure the talentbot-enterprise-llmops-alops App ID and API Key, like this:

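A hypothetical sketch of that configuration (the exact file and variable names depend on the template you chose, so follow its README):

```ts
// config/index.ts (sketch) -- names are illustrative; see the template's README.
// Many templates read these values from environment variables, e.g. a .env.local file.
export const APP_ID = process.env.NEXT_PUBLIC_APP_ID ?? 'your-app-id'
export const API_KEY = process.env.NEXT_PUBLIC_APP_KEY ?? 'your-api-key'
export const API_URL = process.env.NEXT_PUBLIC_API_URL ?? '' // empty means the default API endpoint
```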

More configuration options are available in config/index.ts:

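These extra options typically control how the WebApp presents itself. A hedged sketch of what this can look like (field names are illustrative and vary between templates):

```ts
// config/index.ts (continued sketch) -- field names are illustrative.
export const APP_INFO = {
  title: 'My AI App',       // title shown in the WebApp header
  description: 'Demo app',  // short description shown on the start page
  copyright: 'My Company',  // footer copyright line
  privacy_policy: '',       // link to your privacy policy, if any
  default_language: 'en',   // default UI language
}
```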

Each WebApp Template provides a README file containing deployment instructions. Usually, WebApp Templates contain a lightweight backend service to ensure that developers' API keys are not directly exposed to users.
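
As a rough sketch of how such a backend layer keeps keys server-side (assuming a Next.js-style API route; the actual templates differ in the details), the browser calls a local route, and only that route talks to the upstream API with the secret key:

```ts
// pages/api/chat.ts (illustrative sketch, not the templates' actual code).
// The browser calls this route; only the server process holds API_KEY.
import type { NextApiRequest, NextApiResponse } from 'next'

const API_BASE_URL = process.env.API_URL ?? 'https://your-talentbot-host/v1' // assumed
const API_KEY = process.env.APP_API_KEY ?? '' // never shipped to the browser

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  if (req.method !== 'POST') {
    res.status(405).end()
    return
  }
  const upstream = await fetch(`${API_BASE_URL}/chat-messages`, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(req.body),
  })
  res.status(upstream.status).json(await upstream.json())
}
```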

These WebApp Templates can help you quickly build prototypes of AI applications and use all the capabilities of talentbot-enterprise-llmops-alops. If you develop your own applications or new templates based on them, feel free to share them with us.

Model Configuration

talentbot-enterprise-llmops-alops supports major model providers such as OpenAI's GPT series. The model providers currently supported are:

  • OpenAI

  • Azure OpenAI Service

  • Anthropic

  • Hugging Face Hub (Coming soon)

Based on technology developments and user needs, we will continue adding support for more LLM providers over time.

Trial Hosted Models

We provide trial quotas for different models for talentbot-enterprise-llmops-alops cloud service users. Please set up your own model provider before the trial quota runs out; otherwise, normal use of your application may be affected.

  • OpenAI hosted model trial: We provide 500 free call credits for you to try out the GPT-3.5-turbo, GPT-3.5-turbo-16k, and text-davinci-003 models.

  • Anthropic Claude hosted model trial: We provide 1000 free call credits for you to try out the Claude-instant-1 and Claude-2 models.

LLM Provider Configuration

Choosing an LLM provider and configuring API credentials are prerequisites for using talentbot-enterprise-llmops-alops. You can configure API keys under Account -> Settings -> Model Providers. The models available for selection in prompt orchestration may vary slightly depending on the provider configuration.

If you plan to use GPT models, you can find the API key in the OpenAI dashboard.

talentbot-enterprise-llmops-alops uses AES-256 and other encryption methods to securely store the API keys you provide. Each tenant uses an independent key pair for encryption to ensure your API keys are not leaked.
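
The exact implementation is internal, but the general idea of per-tenant hybrid encryption can be sketched as follows (illustrative only, using Node's built-in crypto; this is not the product's actual code): a random AES-256 data key encrypts the API key, and that data key is then wrapped with the tenant's public key.

```ts
// Illustrative sketch of per-tenant hybrid encryption (not the actual implementation).
import { createCipheriv, generateKeyPairSync, publicEncrypt, randomBytes } from 'crypto'

// Each tenant gets its own key pair; only the tenant's private key can unwrap data keys.
const { publicKey } = generateKeyPairSync('rsa', { modulusLength: 2048 })

function encryptApiKey(apiKey: string) {
  const dataKey = randomBytes(32) // random AES-256 data key
  const iv = randomBytes(12)
  const cipher = createCipheriv('aes-256-gcm', dataKey, iv)
  const ciphertext = Buffer.concat([cipher.update(apiKey, 'utf8'), cipher.final()])
  return {
    ciphertext,                                    // encrypted API key
    iv,
    authTag: cipher.getAuthTag(),                  // integrity tag for AES-GCM
    wrappedKey: publicEncrypt(publicKey, dataKey), // data key wrapped with the tenant's public key
  }
}
```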

If LLM provider keys are not configured properly, the following capabilities will be unavailable:

  • Prompt orchestration

  • Adding new documents or re-embedding in datasets

  • Dialogue in WebApp and API

Connect to OpenAI

You can get your API key from OpenAI. The models your key has access to depend on OpenAI's access policies.

talentbot-enterprise-llmops-alops currently supports:

  • GPT-4

  • GPT-3.5-turbo

  • GPT-3.5-turbo-16k: Context length up to 16k tokens, approx. 20 pages of text.

  • text-davinci-003: For text generation applications only.

Connect to Anthropic

You can apply for your API key from Anthropic. Anthropic models support up to 100k tokens of context input and can generate long outputs of thousands of tokens. The Anthropic models currently supported are:

  • Claude-instant: A lighter model capable of a range of tasks including casual conversation, text analysis, summarization and document understanding.

  • Claude-2: A model with sophisticated reasoning and coding capabilities.

Note:

  • Since Anthropic does not provide embedding models, you must also bind an OpenAI or Azure OpenAI key in order to use Anthropic. Otherwise, capabilities that rely on an LLM for anything other than conversation will be unavailable in talentbot-enterprise-llmops-alops.

  • Currently, talentbot-enterprise-llmops-alops only supports extra-long context input in the conversational pre-prompt field or the message input box. For example, if you have a long PDF, you need to copy and paste its text into the pre-prompt field or the message input box.
