SQAI Suite engages with third-party subprocessors and affiliates to help us provide services to our customers. Some of these subprocessors store Customer Content (content you submit to your SQAI Suite account) and assist SQAI Suite with processing it:
Core Infrastructure Subprocessors
| Name | Nature of processing | Location |
| --- | --- | --- |
| Amazon Web Services, Inc. (AWS) | Cloud service provider | Regionally per your requirements, default EU |
| | Vector database used to facilitate vector search features | Regionally per your requirements, default EU |
| | Monitoring and testing of chatbot logic and prompts (via LangSmith) | Regionally, default EU |
Other Subprocessors
| Name | Nature of processing | Location |
| --- | --- | --- |
| | Customer service platform used for technical support chat management | EU |
| Microsoft | Storage and email provider (O365) | EU |
AI Functionality Subprocessors
SQAI Suite uses Large Language Models (LLMs), such as those listed below, to enhance certain features (e.g., conversational assistance, test suggestions).
By default, no personal data or identifiable information is sent to these services. However, if users manually input personal data into these tools, that data may be transmitted to the LLM provider.
In such cases, the user is responsible for ensuring that no sensitive or personal data is shared. SQAI Suite applies safeguards to help prevent such input but cannot control manual actions by users. These providers are not considered subprocessors under our data processing framework unless personal data is shared contrary to our design.
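As a purely illustrative sketch (not a description of SQAI Suite's actual implementation), a safeguard of this kind typically screens user-supplied text for obvious personal data before it is forwarded to an LLM provider; the function names and patterns below are hypothetical examples:

```python
import re

# Illustrative only: a simple pre-submission screen for obvious personal data.
# Patterns and function names are hypothetical, not SQAI Suite's actual code.
PII_PATTERNS = [
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),        # email addresses
    re.compile(r"\b\d{3}[- ]?\d{2}[- ]?\d{4}\b"),  # SSN-like number formats
    re.compile(r"\+?\d[\d\s().-]{7,}\d"),          # phone-number-like strings
]

def contains_personal_data(text: str) -> bool:
    """Return True if the text matches any of the simple personal-data patterns."""
    return any(pattern.search(text) for pattern in PII_PATTERNS)

def forward_to_llm_provider(user_input: str, send) -> str:
    """Send input to an external LLM provider only if the screening passes."""
    if contains_personal_data(user_input):
        raise ValueError("Input appears to contain personal data and was not forwarded.")
    return send(user_input)
```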
For transparency, these LLM providers are listed below:
| Provider Name | Nature of processing |
| --- | --- |
| Anthropic | Artificial intelligence provider |
| Mistral | Artificial intelligence provider |
| OpenAI | Artificial intelligence provider |
| Meta | Artificial intelligence provider |
| Deepseek | Artificial intelligence provider |
| | Artificial intelligence provider |
| Cohere | Artificial intelligence provider |