Use own OpenAI API key
Sam Kinred
Just adding on to this, I think the ability to choose between powerful models and standard models via BYOK would be a big selling point.
Jay Song
This is a bit trickier than you might imagine. We don't just send a transcript and ask "Summarize this" in a single API call. It's a layered approach to API calling, and there are (and will be) mixed structures that use multiple LLMs. So I'm not saying "No," more like "Not yet," since we're still in the early stages. I try to avoid adding constraints to the development process, but this is probably how it will end up working!
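To make the "layered" point concrete, here is a toy sketch, not the product's actual pipeline: one model first extracts structured facts from the transcript, then a second model writes the summary from those facts. It assumes the OpenAI Python SDK; the model names, prompts, and function name are placeholders.

```python
# Toy sketch of a layered, multi-model summarization flow (hypothetical).
from openai import OpenAI

client = OpenAI(api_key="sk-...")  # a BYOK setup would supply the user's key here

def layered_summary(transcript: str) -> str:
    # Layer 1: a cheaper model pulls structured facts out of the transcript.
    extraction = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[
            {"role": "system", "content": "Extract speakers, decisions, and action items as bullet points."},
            {"role": "user", "content": transcript},
        ],
    ).choices[0].message.content

    # Layer 2: a stronger model turns those facts into a readable summary.
    summary = client.chat.completions.create(
        model="gpt-4o",  # placeholder model choice
        messages=[
            {"role": "system", "content": "Write a concise meeting summary from these notes."},
            {"role": "user", "content": extraction},
        ],
    ).choices[0].message.content

    return summary
```

Once several layers like this exist, possibly across different models or providers, a single user-supplied key no longer maps cleanly onto every call, which is why BYOK is harder than it first looks.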
SBehar
Jay Song Makes sense! This could be a parallel process, independent of your core structured outputs, that lets users "chat" with their call's transcript using their own OpenAI key (rough sketch below).
Either way, security is a concern when full transcripts containing potentially sensitive info are jumping across multiple APIs/servers, some of which may not be secure and some of which may use confidential or proprietary data to train models.
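A minimal sketch of that "parallel chat with your own key" idea, assuming the OpenAI Python SDK; this is not the product's API, and the function name and model are placeholders. The user's key only powers this side channel and never has to touch the core pipeline.

```python
# Hypothetical side-channel: answer questions about a transcript with the
# user's own OpenAI key, separate from the core structured outputs.
from openai import OpenAI

def chat_with_transcript(user_api_key: str, transcript: str, question: str) -> str:
    client = OpenAI(api_key=user_api_key)  # user-supplied key, used only here
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model choice
        messages=[
            {"role": "system", "content": "Answer questions using only this transcript:\n" + transcript},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

# Example: answer = chat_with_transcript(key, transcript, "What did we decide about pricing?")
```

Keeping this path separate also narrows the security question: the transcript goes only to the provider the user explicitly chose with their own key.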