How to prompt on GPT-o1




OpenAI’s latest model family, GPT-o1, promises to be more powerful and better at reasoning than previous models. 

Using GPT-o1 is slightly different from prompting GPT-4 or even GPT-4o. Because the model has stronger reasoning capabilities, some standard prompt engineering methods won’t work as well. Earlier models needed more guidance, and people took advantage of longer context windows to give the models more instructions.

According to OpenAI’s API documentation, the o1 models “perform best with straightforward prompts.” Techniques like few-shot prompting, however, “may not enhance performance and can sometimes hinder it.”

OpenAI advised users of o1 to keep four things in mind when prompting the new models:

  • Keep prompts simple and direct, and do not over-guide the model, because it understands instructions well
  • Avoid chain-of-thought prompts, since o1 models already reason internally
  • Use delimiters like triple quotation marks, XML tags and section titles so the model is clear about which sections it is interpreting
  • Limit additional context in retrieval augmented generation (RAG), because OpenAI said adding more context or documents to RAG tasks could overcomplicate the model’s response
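As a rough sketch of the advice above, the snippet below builds a single, direct prompt that uses XML-style delimiters to separate the instruction from the input text. The model name “o1-preview” and the surrounding API call are assumptions for illustration, not details from OpenAI’s guidance:

```python
# Sketch: a simple, delimiter-structured prompt in the style OpenAI
# recommends for o1-series models. No step-by-step scaffolding is added;
# the model is left to reason on its own.

def build_o1_prompt(document: str, question: str) -> str:
    """Combine a short, direct instruction with clearly delimited input."""
    return (
        f"{question}\n\n"
        "<document>\n"
        f"{document}\n"
        "</document>"
    )

prompt = build_o1_prompt(
    document="Q3 revenue rose 12% year over year, driven by cloud services.",
    question="Summarize the key financial trend in one sentence.",
)

# With the OpenAI Python SDK, this would be sent as a single user message,
# e.g.: client.chat.completions.create(
#           model="o1-preview",  # assumed model name
#           messages=[{"role": "user", "content": prompt}])
print(prompt)
```

The delimiters make it unambiguous which part of the prompt is the instruction and which part is the material to be analyzed.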

OpenAI’s advice for o1 differs sharply from the suggestions it gave to users of its previous models. Previously, the company suggested being incredibly specific, including details and giving models step-by-step instructions; GPT-o1, by contrast, does better “thinking” on its own about how to solve queries.
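To make that contrast concrete, here is a hypothetical side-by-side: the kind of step-by-step scaffolding that helped earlier GPT models, versus the leaner prompt the o1 guidance favors. Both prompts are invented examples, not taken from OpenAI’s documentation:

```python
# Hypothetical comparison of prompting styles; both prompts are invented
# for illustration and are not from OpenAI's documentation.

# Older style for GPT-4-era models: explicit steps and reasoning guidance.
gpt4_style = (
    "You are an expert analyst. Think step by step.\n"
    "Step 1: List the assumptions in the argument below.\n"
    "Step 2: Check each assumption for supporting evidence.\n"
    "Step 3: State whether the conclusion follows.\n\n"
    "Argument: Remote work increases productivity because commutes disappear."
)

# o1 style: one direct request; the model plans the steps itself.
o1_style = (
    "Evaluate whether this argument's conclusion follows from its premises:\n"
    "Remote work increases productivity because commutes disappear."
)

print(len(gpt4_style), len(o1_style))
```

The o1-style prompt states the task once and omits the reasoning scaffold entirely, which is the shift the new guidance describes.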

Ethan Mollick, a professor at the Wharton School of the University of Pennsylvania, said in his One Useful Thing blog that his experience as an early user of o1 showed it works better on tasks that require planning, where the model works out how to solve problems on its own.

Prompt engineering and making it easier to guide models 

Prompt engineering, of course, became a method for people to drill down on specifics and get the responses they want from an AI model. It has become not just an important skill but also a rising job category.

Other AI developers released tools to make it easier to craft prompts when designing AI applications. Google launched Prompt Poet, built with the help of Character.ai, which integrates external data sources to make responses more relevant. 

GPT-o1 is still new, and people are still figuring out exactly how to use it (myself included; I have yet to figure out my first prompt). However, some social media users predict that people will have to change how they approach prompting ChatGPT.
