Alteryx Artificial Intelligence
Frequently Asked Questions
Alteryx products may include AI features. We may also give you the ability to connect to AI provided by another vendor, such as OpenAI or Google Gemini. For example, our Alteryx Designer software includes a variety of GenAI tools that allow you to connect to your own LLM provider using your own API key. Our cloud products, such as Auto Insights, may include AI features such as Playbooks and Magic Documents, which connect to Microsoft Azure Cognitive Services. For more information, please see the documentation (https://help.alteryx.com) for the specific product you’re using or are interested in using.
For Alteryx AI Tools Fact Sheets and FAQs – please visit this page.
When you contact our customer support, you will be talking to a human. In our cloud products, such as Alteryx Designer Cloud, we offer an AI chatbot named “Fin.” Fin can be used to get quick answers to common questions. If at any time you would rather speak with a human, you can ask to be connected to a customer service representative.
As AI features are introduced into our products, we will provide relevant information directly to users through the user interface within our products. We will also include information about these features in our product release notes and in updates to our product documentation. To see our product release notes and documentation, please go to https://help.alteryx.com/.
To see detailed information about our AI features, visit the Alteryx AI Tools Fact Sheets and FAQs at this page.
When we modify an AI feature in one of our products, we will provide information about that change in our product release notes. You can see these release notes at https://help.alteryx.com/.
To see detailed information about our AI features, visit the Alteryx AI Tools Fact Sheets and FAQs at this page.
We do not train our AI models on customer data or personal data collected from users of our software.
As between you (the customer) and Alteryx, you own any original output generated from the AI features provided by Alteryx to you. Ownership of non-original output remains unchanged. For instance, if an output includes a quote from a published research article, any copyright associated with that quote remains with the author. You should also be aware that ownership rights may be affected by the content of inputs you use to produce the output. Similar inputs, whether used by you or by someone else, may produce similar, non-original outputs, particularly when the LLM temperature setting (i.e., the setting that determines how creative the output will be) is low. You should always verify the originality of your outputs if you plan to claim any type of ownership rights.
If the AI feature you are using connects you to a third party hosted LLM, such as OpenAI’s ChatGPT or Google Gemini, your prompts will be processed by those LLMs. Please refer to the third party’s terms of use associated with those LLMs.
For products such as Playbooks and Magic Documents in Auto Insights, the AI model is not customized, but our products are designed to help you apply the power of the LLM to gain valuable insights and better understand your data. For Alteryx Copilot, we will customize that AI feature with Alteryx data to provide you with insights and assistance to build better workflows more efficiently.
Your prompt will be sent to the third party LLM in order to generate a response from the LLM. Please refer to the data handling information provided by that third party.
For some services, we let you use your own API key to connect to services like OpenAI or Google Gemini. Examples include the GenAI tools in Designer. For services like Magic Documents and Playbooks in Auto Insights, we connect to OpenAI through Microsoft Azure Cognitive Services. For Alteryx Copilot, we currently use Google Gemini. In those instances, OpenAI and Google are responsible for training their LLMs. Your data used by Magic Documents and Playbooks in Auto Insights is not used to improve Azure OpenAI models. For more information, go to https://learn.microsoft.com/en-us/legal/cognitive-services/openai/data-privacy.
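As a simplified illustration of what a “bring your own API key” connection involves (this is a sketch, not how Alteryx tools are implemented; the endpoint, model name, and field names are assumptions based on OpenAI-compatible chat APIs):

```python
import json
import urllib.request

def build_chat_request(prompt, api_key, model="gpt-4o-mini",
                       endpoint="https://api.openai.com/v1/chat/completions"):
    """Build (but do not send) an HTTPS request for an
    OpenAI-compatible chat completions endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,  # lower values give more deterministic output
    }
    return urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Your own key authenticates the call; the LLM vendor, not
            # Alteryx, processes the prompt under its own terms.
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# The request is only constructed here; actually sending it would
# require a valid key from your LLM provider.
req = build_chat_request("Summarize Q3 sales by region.", api_key="sk-...")
```

Because the key and the request both belong to you, the data-handling terms that apply are those of your chosen LLM provider.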
For Auto Insights, our AI features currently use Microsoft Azure Cognitive Services. Alteryx Copilot currently uses Google Gemini. For our Designer software, we make available tools that allow you to connect to various LLMs through an API key you provide. You may also design your own connectors using the functionality available in Designer to connect to other AI-based services. Please visit the product documentation (https://help.alteryx.com) to learn more about the LLMs in use or the other options available to you.
We use third party LLMs (OpenAI through Microsoft Azure Cognitive Services or Google Gemini).
We use third party LLMs, such as Microsoft Cognitive Services (using OpenAI) and Google Gemini. As such, the extent and timing of any updates to these LLMs will be up to the third party providing the LLM.
We develop our AI features to be as accurate as possible. Accuracy depends on many factors, such as the quality of the input or prompt used to create the AI output, the LLM chosen, the temperature settings when using the AI, and other factors. As a result, accuracy can never be assured. Regardless of what AI you choose or where you get it from, you should always verify the accuracy of any output. We recommend building processes and procedures to consistently verify AI output.
Alteryx conducts comprehensive risk assessments and audits of AI technologies to mitigate potential biases or inaccuracies, incorporating feedback loops for ongoing improvement.
Whenever a third party LLM is used, such as OpenAI, you should investigate the third party’s practices to address potential bias. As for Alteryx, we are committed to making AI technologies that are free from unfair discrimination or bias. Techniques such as diverse dataset training, algorithmic fairness checks, and continuous model evaluation will be employed to address bias and discrimination. Please visit our Alteryx Responsible AI Principles website at https://www.alteryx.com/trust/ai-principles for more information about our commitment to responsible AI practices.
Prompt injection is an attack in which malicious text, entered directly as a prompt or embedded in external data the AI is instructed to retrieve, gives the attacker the ability to perform unauthorized actions. Prompt injection might let the attacker make the AI say anything they want, or cause the AI to ignore its earlier instructions in favor of later ones. At Alteryx, we have several safeguard mechanisms in place to prevent such scenarios. These include stringent input validation procedures and the use of AI models that can detect and filter out potentially harmful instructions.
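To make the idea of input validation concrete, here is a deliberately minimal sketch of a deny-list check applied before a prompt reaches an LLM. This is illustrative only and is not Alteryx’s actual safeguard; the patterns and function names are assumptions, and real systems combine many signals (including model-based detection) rather than a single keyword filter:

```python
import re

# Illustrative deny-list of phrases commonly seen in prompt-injection
# attempts. A production system would use far more robust detection.
SUSPICIOUS_PATTERNS = [
    r"ignore (all )?(previous|prior|earlier) instructions",
    r"disregard (the )?system prompt",
    r"you are now",
]

def looks_like_injection(user_input: str) -> bool:
    """Return True if the input matches a known injection phrase."""
    lowered = user_input.lower()
    return any(re.search(pattern, lowered) for pattern in SUSPICIOUS_PATTERNS)

def sanitize_prompt(user_input: str) -> str:
    """Reject suspicious input before it is sent to the LLM."""
    if looks_like_injection(user_input):
        raise ValueError("Input rejected: possible prompt injection")
    return user_input
```

The design choice to validate before the LLM call matters: once injected text reaches the model, it competes directly with the system’s own instructions.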
The AI Fact Sheets describe how Alteryx integrates responsible AI practices across its suite of products. Our detailed AI Fact Sheets provide a thorough overview of our AI features, including our transparency, data handling, and accountability measures. We also include additional FAQs specific to each AI feature to address common questions about data usage, encryption methods, and user controls.
It is always your choice whether to use an AI feature. In Designer Desktop, it is your choice whether to download and use our GenAI tools or Alteryx Copilot. In Designer Cloud, our AI chatbot assistant “Fin” will be available to assist you, but you can always ask to speak to a human. In Auto Insights, your administrator has the option to turn off or on the availability of Magic Documents or Playbooks.
For more information about a specific AI feature, please visit the AI Fact Sheets at this page.