Talk of economic uncertainty has flooded newsrooms and boardrooms throughout 2023, but in an unlikely twist, disruption may be the cure to businesses’ economic woes. Generative AI has captured the world’s attention seemingly overnight. Much of the recent hype has been driven by large language models (LLMs), such as OpenAI’s ChatGPT, GPT-3 and GPT-4, which have impressively generated human-like language and can perform a variety of tasks, from content summarization to code generation.
Businesses across industries have already begun to innovate with LLMs — and data analytics is no exception. As data sets become increasingly complex and unstructured, data analysts and other knowledge workers are constantly seeking more effective ways to extract insights and make confident decisions. In an interview, Peter Martinez, Sr. Product Marketing Manager at Alteryx, shared his insights on how generative artificial intelligence will forever change the way data workers find insights.
Q: What is the biggest reason organizations should consider using generative AI in their analytics today?
A: A key theme that keeps coming up in the current economy, at every level of business, is doing more with less. Analytics professionals are critical thinkers, creative problem solvers, and active listeners who help build solutions for end users. That said, just like in many other professions, the analytics lifecycle includes plenty of repetitive or mundane tasks that still require human oversight. With the dawn of generative AI, there’s a huge opportunity to automate those tasks so that human operators can spend more time on strategic, innately human activities: picking up new projects, spending more time understanding the business, or generally providing more value to their organizations.
Q: How do you expect generative AI to alter the data analytics landscape?
A: There are valid concerns among professionals that, without the proper guardrails in place, generative AI can negatively impact analytics by inadvertently generating incorrect outputs or fabricating responses, often referred to as hallucinations. But there is a massive opportunity to apply generative AI while implementing appropriate guardrails to prevent those scenarios from occurring. By applying generative AI to the right use cases, and with the right vendors and technology, those concerns can be alleviated. For example, at Alteryx, one of our capabilities relies on generative AI to summarize trusted analytics and explain analytics workflows.
So, in terms of how this is going to change the analytics landscape, the best analogy I can give is the idea of a force multiplier. Generative AI can extend the power of one analyst into maybe three or four, not by being a carbon copy of that analyst, but by removing those mundane, repetitive tasks, like copying and pasting and governance documentation, from analysts’ day-to-day lives. Generative AI can complete these tasks in a fraction of the time they have historically taken. Organizations that don’t take advantage of gen AI are simply going to fall behind on operational efficiency measures. They’re not going to see as much return on their resources as the companies investing in generative AI.
Also, keep in mind that with the help of LLMs, end users can absorb information much faster. They no longer have to go in and click through different data stories, manage documents, read them, ingest them, and decide which ones are most important. They can get a synthesized summary of those key points in a three-paragraph email.
Q: Why is Alteryx incorporating generative AI into its platform?
A: Our core mission is Analytics for All, and there are two ways generative AI aligns with that mission. The first is democratizing analytics by enabling collaboration so that technical and non-technical users can perform analytics. For a long time, working with data required expertise in programming languages and statistical tools like SQL, Python, SPSS, or SAS. Alteryx came along and flipped the script. We provided drag-and-drop tools on a visual canvas and let technical and non-technical users work together and build analytics workflows. We built a bridge for collaboration.
One of the reasons we’ve been able to enable Analytics for All so effectively for years is because our products are so easy to use. Harnessing natural language to perform technical tasks is the next logical step. There’s nothing easier than speaking, reading, or writing in your native language.
The second part comes back to our conversation about eliminating mundane or repetitive tasks through automation. When you look at the sheer volume of work that analysts and other professionals have to do, there’s a huge opportunity for generative AI to automate those repetitive tasks. Freeing up workers’ time and helping decision-makers make intelligent, confident decisions at every level is absolutely in line with our mission of Analytics for All.
Q: Can customers trust insights from large language models, and should they be concerned about their analytics being a “black box?”
A: That’s a valid concern for customers to think through, but it starts with vendor selection. Who is your vendor of choice? Make sure to ask your vendor questions about their deployments, data architecture, and how their instances are set up.
Next, consider the use cases you’re going to apply it to. Again, Alteryx isn’t relying on generative AI to develop trusted analytics. The analytics are still developed or guided by human operators. However, we are relying on generative AI to summarize those trusted analytics in an easily consumable and consistent manner. We’re simply expediting pieces of the analytics lifecycle using generative AI.
Finally, keep in mind the incredible amount of work involved in harnessing the power of gen AI to make its outputs reliable. Providing reliable, secure, trustworthy generative AI capabilities isn’t as simple as connecting to an LLM and creating a UI (user interface). There’s a ton of work that has to be done by the product and engineering teams to fine-tune each use case for accurate results. That’s part of the value Alteryx provides.
Our product team works behind the scenes to help eliminate and control for hallucinations and inaccuracies. Back to your question, the black box is a valid concern, but at the end of the day, this is why Alteryx is a trusted brand to do business with in this space — because we’re making those investments in the products and features we’re shipping. We stand behind the trustworthiness and accuracy of the outputs.
Q: How can users start using generative AI in their analytics and decision-making?
A: This is a great question and one that’s really timely. Microsoft just did a study and found that the average knowledge worker spends about 16 hours a week reporting on, summarizing, or answering questions about their work through emails, status reports, meetings, and so on. That’s two full workdays every week allocated to non-critical activity. These time-consuming tasks are hindering data workers and organizations everywhere.
My advice is to find a trusted vendor and see how they’re using generative AI in their platform. Alteryx AiDIN offers several features that accelerate and augment the analytics lifecycle. One is Magic Documents, which leverages generative AI to summarize your insights and synthesize them into an email. The value here is that you can communicate insights to your end stakeholders faster because you’re using natural language, and they can consume those insights faster and more efficiently.
The Workflow Summary Tool, another Alteryx AiDIN feature, looks at a workflow and summarizes what it does. I used to be a consultant, and we had to produce documentation and follow data governance best practices for our clients so they knew what was happening to their data. Imagine the time required for governance and documentation when you have hundreds or thousands of workflows running. If you can automate that governance by simply asking an LLM to summarize a workflow, you free up those workers to focus on bigger things in the data governance world, like metadata standards and data cataloging.
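The workflow-summarization idea described above can be sketched in a few lines of Python. The workflow structure, names, and prompt shape below are illustrative assumptions, not Alteryx’s actual internal format; the resulting prompt would then be sent to an LLM of your choice, and its response would become the governance documentation.

```python
# Hypothetical sketch: building an LLM prompt that documents an analytics
# workflow. The (tool, configuration) tuples are an assumed representation,
# not Alteryx's real workflow schema.

def build_summary_prompt(workflow_name, steps):
    """Turn a workflow's ordered tool list into a documentation prompt."""
    step_lines = "\n".join(
        f"{i}. {tool}: {config}" for i, (tool, config) in enumerate(steps, 1)
    )
    return (
        f"Summarize what the analytics workflow '{workflow_name}' does, "
        f"in plain language suitable for governance documentation.\n"
        f"Steps:\n{step_lines}"
    )

# Example workflow: read, filter, aggregate, write.
workflow = [
    ("Input Data", "read customers.csv"),
    ("Filter", "keep rows where region == 'EMEA'"),
    ("Summarize", "group by segment, sum revenue"),
    ("Output Data", "write emea_revenue.xlsx"),
]

prompt = build_summary_prompt("EMEA Revenue Rollup", workflow)
# `prompt` would be passed to an LLM API call; the model's reply is the
# human-readable summary that gets attached to the workflow's documentation.
```

Looping a builder like this over hundreds of stored workflows is what turns one-off summarization into the automated governance documentation described above.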
Remember, data workers spend enormous amounts of time on mundane, repetitive yet necessary tasks. If you can use generative AI to automate even a quarter of that, getting analysts out of copying and pasting, out of writing emails, and back into kicking off new projects, finding new insights, and investigating new trends, they’re providing far more value. They’re scaling themselves in a way that wasn’t possible before, and the ROI will keep compounding.