Yes, OpenAI enforces rate limits on the ChatGPT API. As of March 1, 2023, free trial users get 20 requests per minute (RPM) and 40,000 tokens per minute (TPM). Pay-as-you-go users get 60 RPM and 60,000 TPM for their first 48 hours, and 3,500 RPM and 90,000 TPM after that. These numbers are subject to change, so it's recommended to check OpenAI's rate limit documentation for the most recent figures. Also bear in mind that, code-wise, each call to the chat completions endpoint counts as a single request against the RPM limit, no matter how many messages it contains.
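
When a limit is hit, the API returns a rate limit error, and the usual remedy is to retry with exponential backoff. Below is a minimal sketch of that pattern, assuming the pre-1.0 `openai` Python SDK (contemporary with the limits above), the `gpt-3.5-turbo` model, and a placeholder API key; treat it as an illustration rather than a definitive implementation.

```python
import time
import openai  # pre-1.0 openai Python SDK

openai.api_key = "YOUR_API_KEY"  # placeholder; use your own key

def chat_with_backoff(messages, max_retries=5):
    """Call the chat completions endpoint, retrying with exponential
    backoff when a rate limit (RPM or TPM) error is returned."""
    delay = 1.0
    for attempt in range(max_retries):
        try:
            return openai.ChatCompletion.create(
                model="gpt-3.5-turbo",
                messages=messages,
            )
        except openai.error.RateLimitError:
            # Back off before retrying; doubling the wait each time
            # gives the per-minute quota room to recover.
            time.sleep(delay)
            delay *= 2
    raise RuntimeError("Rate limit retries exhausted")

# One call counts as one request against the RPM limit, while
# prompt + completion tokens count toward the TPM limit.
response = chat_with_backoff([{"role": "user", "content": "Hello!"}])
print(response["choices"][0]["message"]["content"])
```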