The Short Version
In several research projects, colleagues and I have relied on commercial tools for conversational AI agents. They're good, and in many ways impressive. But they aren't really built for research: probing logic, follow-up strategies, semi-structured and unstructured interview protocols, participant identifiers, or a pricing model that works for an unfunded PhD project.
That's why I started OASIS. It's not trying to compete with commercial platforms. It's trying to be an open-source alternative for transparent research: a survey interview platform that runs entirely inside your Docker environment, talks to whichever AI providers you choose, and treats research methodology as a first-class concern rather than an afterthought.
To be clear: the ambition here is not to match enterprise platforms with dedicated engineering teams and substantial funding behind them. They will always move faster and offer more polish. But it is important that open, transparent alternatives exist for research. Tools you can self-host, inspect, and adapt to your study design without a procurement process. That is what OASIS is for.
Read the longer version
The gap isn't capability; LLMs can already conduct nuanced, adaptive dialogues. The gap is tooling. You can subscribe to an enterprise platform that handles everything, but then your data lives on someone else's servers, your prompts are proprietary, and switching providers means starting from scratch.
Alternatively, you can stitch together WebSockets, STT engines, LLM APIs, and TTS services yourself, which is fine if you have a software engineering team, but most research groups don't.
OASIS sits in between. It handles the engineering (audio pipelines, real-time transcription, session management, model routing) so researchers can focus on what they're actually good at: designing interview protocols, crafting questions, and interpreting data.
Importantly, OASIS can run in your own Docker environment. If your deployment calls external AI APIs (OpenAI, Google, etc.), that's your choice of provider. OASIS itself never phones home and never sees your data. For full isolation, you can point it at self-hosted models on GCP, Azure, or Scaleway.
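As an illustration, a self-hosted deployment might look something like the sketch below. The file layout and environment variable names here are hypothetical assumptions, not OASIS's actual configuration; the point is that the provider endpoint is just a setting you control.

```shell
# Hypothetical deployment sketch — the env var names and compose setup
# are illustrative assumptions, not the real OASIS configuration.

# Point the platform at a provider of your choice; swap the base URL
# for a self-hosted endpoint to keep all traffic inside your network.
cat > .env <<'EOF'
LLM_BASE_URL=https://your-self-hosted-endpoint/v1
LLM_API_KEY=changeme
STT_BASE_URL=https://your-self-hosted-endpoint/stt
EOF

docker compose --env-file .env up -d
```

Under a setup like this, switching from a commercial API to a self-hosted model is an edit to one file rather than a platform migration.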