I am working with a student on the Chatbot Tutorial, and the instructions talk about getting an OpenAI API key. Is this still necessary? I saw the post about MIT App Inventor using Amazon's model by default, and I'm not sure if I should go through the steps to get an OpenAI key. Can you give us guidance?
Sorry, this is a bit confusing.
By default, if you set the ChatBot provider to "chatgpt" and do not specify a model (aka "default"), we route the request to the Amazon/Anthropic Sonnet-4.5 model. We pay the bill and enforce a quota.
If you set up your own OpenAI API key, obtained directly from platform.openai.com, then we route you to OpenAI with gpt-4o-mini as the model (the current default). In that case we do not enforce a quota, because you are now paying the bill.
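To make this more concrete, here's a rough sketch of the routing in Python-style pseudocode. This is just my paraphrase of the behavior described above, not our actual server code, and the names are made up for illustration:

```python
# Rough illustration of how ChatBot requests are routed today.
# Not the real App Inventor server code -- names are invented for this sketch.

AMAZON_DEFAULT = "Anthropic Sonnet-4.5 via Amazon"   # MIT pays, quota enforced
OPENAI_DEFAULT = "gpt-4o-mini via OpenAI"            # you pay, no quota

def route_request(provider="chatgpt", model="default", api_key=None):
    """Decide where a ChatBot request goes under the current defaults."""
    if api_key:
        # You supplied your own OpenAI key: the request goes straight to
        # OpenAI and we do not enforce a quota.
        return OPENAI_DEFAULT
    # No key: the "chatgpt"/"default" combination is routed to the
    # Amazon-hosted Anthropic model, and we enforce a quota.
    return AMAZON_DEFAULT

print(route_request())                    # -> Anthropic Sonnet-4.5 via Amazon
print(route_request(api_key="sk-..."))    # -> gpt-4o-mini via OpenAI
```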
The confusing part is the switch of provider from OpenAI to Amazon when no API key is provided. This happened because we (aka me!) failed to include a "default" option in the provider selection field, so the default is "chatgpt" (which should really be "OpenAI", since that is the actual provider).
I'm working on a fix so that in the future the default will be "default" and if you specify a provider explicitly, we will always use that provider.
My advice for now is to just use the default (aka Sonnet-4.5 pretending to be ChatGPT) unless you run into quota issues.
Btw, the reason for the default switch is that Amazon granted MIT a significant amount of free "credits", so for now it doesn't cost us cash!
-Jeff