
Prompt


Last updated 13 days ago

A prompt is a structured input to a language model that instructs the model how to handle user inputs and variables.

Prompt components create prompt templates with custom fields and dynamic variables, providing your model with structured, repeatable prompts.

Prompts combine natural language with variables enclosed in curly braces.
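Conceptually, this is the same substitution that Python string formatting performs. A minimal sketch (the template text here is illustrative, not a BroxiAI default):

```python
# A prompt template: natural language plus {variable} placeholders.
template = "You are a helpful {role}. Answer the question: {question}"

# Filling the variables yields the final prompt sent to the model.
prompt = template.format(role="travel guide", question="Where should I go in spring?")
print(prompt)
# You are a helpful travel guide. Answer the question: Where should I go in spring?
```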

Use a prompt component in a flow

An example of modifying a prompt can be found in the Quickstart, where a basic chatbot flow is extended to include a full vector RAG pipeline.

The default prompt in the Prompt component is "Answer the user as if you were a GenAI expert, enthusiastic about helping them get started building something fresh."

This prompt creates a "personality" for your LLM's chat interactions, but it doesn't include variables that you may find useful when templating prompts.

To modify the prompt template, click the Template field in the Prompt component. For example, adding a {context} variable gives the model access to embedded vector data so it can return better answers:

```
Given the context
{context}
Answer the question
{user_question}
```

When variables are added to a prompt template, new fields are automatically created in the component. These fields can be connected to other components to receive text input, automating prompting, or to pass instructions onward. An example of prompts controlling agent behavior is available in the sequential tasks agent starter flow.
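The variable-to-field mechanism can be imitated in plain Python. This is an illustrative sketch, not BroxiAI's internal implementation; the regex and the field dictionary are assumptions:

```python
import re

def extract_variables(template: str) -> list[str]:
    """Find {variable} placeholders, in order of first appearance."""
    seen = []
    for name in re.findall(r"\{(\w+)\}", template):
        if name not in seen:
            seen.append(name)
    return seen

template = "Given the context\n{context}\nAnswer the question\n{user_question}"

# Each detected variable becomes an input field on the component.
fields = {name: "" for name in extract_variables(template)}
print(list(fields))  # ['context', 'user_question']

# Connected components supply values; the template is then rendered.
fields["context"] = "BroxiAI builds LLM workflows."
fields["user_question"] = "What does BroxiAI do?"
print(template.format(**fields))
```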

Inputs

| Name | Display Name | Info |
| --- | --- | --- |
| template | Template | Create a prompt template with dynamic variables. |

Outputs

| Name | Display Name | Info |
| --- | --- | --- |
| prompt | Prompt Message | The built prompt message returned by the build_prompt method. |

Langchain Hub Prompt Template

This component fetches prompts from the Langchain Hub.

When a prompt is loaded, the component generates input fields for custom variables. For example, the default prompt "efriis/my-first-prompt" generates fields for profession and question.

Inputs

| Name | Display Name | Info |
| --- | --- | --- |
| langchain_api_key | Your LangChain API Key | The LangChain API Key to use. |
| langchain_hub_prompt | LangChain Hub Prompt | The LangChain Hub prompt to use. |

Outputs

| Name | Display Name | Info |
| --- | --- | --- |
| prompt | Build Prompt | The built prompt message returned by the build_prompt method. |
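To illustrate how a loaded Langchain Hub prompt generates input fields, here is a hedged sketch. The template text below is a hypothetical stand-in for whatever "efriis/my-first-prompt" actually contains, since fetching the real prompt requires a LangChain API key:

```python
import re

# Hypothetical text of a fetched hub prompt; the real content lives on the
# Langchain Hub and is not reproduced here.
hub_prompt = "You are a {profession}. Answer this question: {question}"

# The component scans the loaded prompt and creates one field per variable.
fields = sorted(set(re.findall(r"\{(\w+)\}", hub_prompt)))
print(fields)  # ['profession', 'question']
```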