Supported models not working?

For this simple code:

```
import os
from openai import OpenAI
llm = OpenAI(base_url="https://api.perplexity.ai", api_key=os.getenv("PERPLEXITY_API_KEY"))
resp = llm.chat.completions.create(model="llama3.1-sonar-large-128k-online", messages=[{"role": "user", "content": "say hello"}])
```

I am getting the following error:

```
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Invalid model 'llama3.1-sonar-large-128k-online'. Permitted models can be found in the documentation at https://docs.perplexity.ai/docs/model-cards.", 'type': 'invalid_model', 'code': 400}}
```

I got the same error for the small and huge models as well.

Hey @anjor, thanks for posting! I replied on our Discord as well: the correct model name is `llama-3.1-sonar-large-128k-online`.

Yes, thank you! For those without sharp vision, there's a dash between `llama` and `3.1`.
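
For anyone landing here later, a minimal sketch of the corrected call, assuming the same client setup and a valid `PERPLEXITY_API_KEY` in the environment:

```
import os
from openai import OpenAI

# Perplexity exposes an OpenAI-compatible endpoint; note the dash after "llama" in the model name.
llm = OpenAI(base_url="https://api.perplexity.ai", api_key=os.getenv("PERPLEXITY_API_KEY"))
resp = llm.chat.completions.create(
    model="llama-3.1-sonar-large-128k-online",  # corrected model name
    messages=[{"role": "user", "content": "say hello"}],
)
print(resp.choices[0].message.content)
```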