OpenAI launches ‘custom instructions’ for ChatGPT so users don’t have to repeat themselves in every prompt
OpenAI announced the beta launch of “custom instructions” for ChatGPT on July 20. The much-requested feature allows users to set a standing preface of instructions that the artificial intelligence (AI) chatbot considers before responding to queries.
According to a company blog post, the feature works across prompts and sessions and includes support for plugins. As is typically the case, OpenAI is launching the new feature in beta, citing the increased potential for unexpected outputs:
“Especially during the beta period, ChatGPT won’t always interpret custom instructions perfectly — at times it might overlook instructions, or apply them when not intended.”
This feature represents a significant step in the company’s efforts to develop ChatGPT in a way that maintains safety guardrails while still allowing it to “effectively reflect the diverse contexts and unique needs of each person.”
Custom instructions are currently available in beta for ChatGPT Plus subscribers outside of the United Kingdom and the European Union. The feature will expand to all users in “the coming weeks.”
Introducing Custom instructions! This feature lets you give ChatGPT any custom requests or context which you’d like applied to every conversation. Custom instructions are currently available to Plus users, and we plan to roll out to all users soon! https://t.co/fVIM9GeYk2
— OpenAI (@OpenAI) July 20, 2023
The custom instructions feature could be a game changer for users who execute complex prompts. In the crypto world, it could save innumerable work hours by letting users enter their query parameters once and have them apply across multiple prompts.
Traders could, for example, set out market conditions via custom instructions at the start of the trading day and spare themselves from re-explaining their portfolio position at the beginning of each prompt.
It could also be a useful tool for those who wish to constrain the chatbot’s responses for legal and localization purposes, such as a crypto trader or AI developer who needs information framed in the context of General Data Protection Regulation compliance.
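Custom instructions are a ChatGPT interface feature rather than an API one, but the “set it once, apply it everywhere” idea mirrors how developers already pin a standing system message to every request. The sketch below is illustrative only, assuming the openai Python package (pre-1.0 interface), an API key in the OPENAI_API_KEY environment variable, and a hypothetical trader-focused instruction; none of it comes from OpenAI’s announcement.

```python
# Minimal sketch: approximating a "custom instruction" by sending the same
# standing system message with every API call, so the user never repeats context.
# The instruction text and model choice below are illustrative assumptions.
import openai


STANDING_INSTRUCTION = (
    "You are assisting a crypto trader. Assume a portfolio of 60% BTC and 40% ETH, "
    "keep answers under 150 words, and note any GDPR considerations where relevant."
)


def ask(question: str) -> str:
    # The system message plays the role of a custom instruction: it is sent
    # alongside each user prompt instead of being retyped in every query.
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": STANDING_INSTRUCTION},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message["content"]


print(ask("Given today's funding rates, should I rebalance?"))
```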
However, as The Verge recently reported, experts believe that increasing the complexity of queries raises the odds that ChatGPT will output incorrect information.
Related: Can AI tools like ChatGPT replace dedicated crypto trading bots?