The LangSmith playground enables you to control various settings for your prompt. These include the model configuration, tool settings, and prompt formatting.
Model configurations are the set of parameters against which your prompt is run. For example, they include the provider, model, and temperature, among others. The LangSmith playground allows you to save and manage your model configurations, making it easy to reuse preferred settings across multiple prompts and sessions.
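For reference, a saved model configuration bundles settings along these lines. The fields and values below are purely illustrative; the exact set of parameters depends on the provider and model you choose:

```json
{
  "provider": "openai",
  "model": "gpt-4o-mini",
  "temperature": 0.2,
  "max_tokens": 1024,
  "top_p": 1
}
```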
To save a configuration, enter a name and an optional description, then confirm.
Your configuration is now saved and accessible to anyone in your workspace. All saved configurations are available in the Model configuration dropdown.
Once you have created a saved configuration, you can set it as your default so that any new prompt you create automatically uses it. To do so, click the Set as default button next to the dropdown.
Tools enable your LLM to perform tasks such as searching the web, looking up information, and more. In this section, you can manage how your LLM accesses and uses the tools you have defined in your prompt. Learn more about tools here.
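As a rough illustration, a tool definition typically consists of a name, a description, and a JSON Schema describing its arguments. The tool below (`get_weather`) is a hypothetical example, not one provided by LangSmith:

```json
{
  "name": "get_weather",
  "description": "Return the current weather for a given city.",
  "parameters": {
    "type": "object",
    "properties": {
      "city": {
        "type": "string",
        "description": "City name, e.g. 'Paris'"
      }
    },
    "required": ["city"]
  }
}
```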
The Extra Parameters field allows you to pass additional model parameters that aren’t directly supported in the LangSmith interface. This is particularly useful in two scenarios:
When model providers release new parameters that haven’t yet been integrated into the LangSmith interface. You can specify these parameters in JSON format to use them right away, as shown in the example below.
When troubleshooting parameter-related errors in the playground. If you receive an error about unnecessary parameters (more common when using LangChainJS for run tracing), you can use this field to remove the parameters causing the error.
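For instance, to forward a provider parameter that the interface does not yet expose, you might enter something like the following in the Extra Parameters field. The parameter names and values here are purely illustrative; use whatever your model provider documents:

```json
{
  "top_k": 40,
  "repetition_penalty": 1.1
}
```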