Intermountain Healthcare - Breaking Barriers - Inspire 2017

With approximately 250 analysts across their system, Intermountain Healthcare overcame many concerns and barriers as they worked to implement Alteryx. From security to workflow and data sharing, data governance, data architecture, and predictive modeling, Lynsie Daley, James Selfridge and team successfully broke down these barriers to move their Alteryx implementation forward. As a result, their team has delved into many core functionalities of Alteryx: data preparation, In-Database, macros, spatial mapping for population coverage and drive times, blending multiple data sources with their EDW, and time-series forecasting. Join to hear Intermountain's story from investigation to implementation and roll-out; incredible use cases they have tackled; and how the Alteryx platform has rapidly become one of their most powerful tools, as they continue to discover how to deepen analytical insights and improve process efficiency.

Video Transcription

Lynsie Daley:
Welcome. Thank you for joining us for our presentation. James and I are really excited to walk you through our implementation story, from our implementation to our roll-out, and then to share with you some of the use cases that we've discovered as we've learned more about Alteryx.

With that, we'll just jump right in and quickly go over the agenda. We'll talk a little bit about who Intermountain Healthcare is and how our analytics department is structured. Then, we'll go through three different areas where we saw barriers as we were trying to implement Alteryx. We'll go through investigation of the tool, what we saw while we were investigating Alteryx, actually implementing it and coming up with a governance for the tool within Intermountain and then actually rolling it out to all of our analysts. Then, we will from there just show you some of the really awesome use cases that we've been working on.

Intermountain Healthcare is a not-for-profit health system and we are based in Utah and Idaho. We've got 21 hospitals, many clinics and also an insurance arm. You can read through some of our stats there. Our mission is helping people live the healthiest lives possible and our vision is to become a model healthcare system. You can imagine the analytics plays into that in a huge way. It's really exciting to be part of the group at Intermountain and to be able to make a difference and help make that vision come true.

Along with those stats, we had 30,000 births last year and 150,000 surgical procedures. We do a lot. Then, specifically for analytics, we have an enterprise analytics team that is made up of about 250 people. Those analysts are spread out through our organization. Some of them work more regionally, some of them are at an enterprise level and some of them are just in a facility, but they all have a dotted-line relationship up to this analytics team. Right now, we're working on trying to unify and become one analytics team, and trying to get rid of some of the disparate data and the repetition of analysis that's already been done, and that kind of thing. Also big for us right now is that we're in the middle of switching over to a new electronic medical records system. We're moving over to Cerner.

We've had to retool all of our reports to pull the data from that new system and that's been a pretty daunting task. That is what's been going on with analytics at Intermountain and I will let James take it from here and start talking about our investigation phase.

James Selfridge:
To give you an idea of how long we've been looking at Alteryx and starting up with it: around the end of 2015 is when we started our relationship with Alteryx and got the pilot license, and everything that we're going to describe to you occurred from that time-frame forward. That's an idea of where we're at. To begin with, we said let's start investigating Alteryx. We needed a little bit more help in terms of blending our data, cleaning our data and spending less time doing those menial tasks that we all know about, basically letting the data analysts actually analyze their data and do what they want to do.

The first barrier is what I call our Grandma moment. We didn't really understand what we could necessarily do with Alteryx. We basically had this roadblock, this barrier, in saying what the heck can we do with Alteryx. Essentially, we wanted to see what we could do with it. What kind of use cases can we come up with for this? What kind of questions will we get from leadership? Essentially, we saw that it made the cut out of the dozens of vendors that we examined for data prep and cleansing capabilities, and that's great. They were always able to answer our questions, but what exactly can we do with this new tool? How are analysts going to use this? That's where Lynsie's going to tell us a little bit more about how we were able to overcome this Grandma moment, essentially.

Lynsie Daley:
Yeah, taking advantage of training that Alteryx offered was a huge, huge part of how we were able to overcome this barrier. Before we could see the wins and the awesome use cases, we really had to dig in and train and learn the tool in detail. Once we did that, the use cases and the wins just kept rolling in. That was a really important piece for us.

The other really important thing is that we collected a lot of user feedback. We had a page up on Confluence, which is just a shared site where all of our analysts could basically give us their feedback: what they'd seen that they liked about the tool, any concerns or areas where they were not happy, any ROI or ideas that they were having about the tool. Basically, we tried to collect as much feedback from as many users as we could as quickly as possible, because that feedback in and of itself helped us to justify continuing to look at Alteryx and to purchase it in the future.

James will now talk about some of the early wins that we had.

James Selfridge:
Of all the places to get an early win, HR was the last place we expected. I apologize if any of you are in HR. Essentially, what we were going through with HR at the time, the same time we were looking at Alteryx, is that they were looking at an off-the-shelf product, a BI tool, to basically gather their metrics and display them, much like you would see with Tableau, QlikView, etc., because HR metrics are basically standard across the board, across many industries.

What one of the analysts, Shelly, in HR decided to do is this: she saw that we were looking at Alteryx and its data-blending capabilities, and she said, what if we can build this in house? We're going to investigate that product, but what if we could build this in house? Get all the data together and build a Tableau dashboard? Here's the workflow she built. It produced a Tableau extract, she built one of the most elegant yet complicated Tableau dashboards I've ever seen, and it fulfilled all the requirements that we had for that off-the-shelf product. We saved tons of money right off the bat and got that early win from HR, again, of all places.

At that point, we said, you know what? It's worth investigating even further. Let's actually try and implement this. That's where we come to our implementation phase. We felt like we'd gotten enough ROI from our pilot. Now, let's actually roll this out. Let's implement it and put it into our existing stack. Here's where stress and indecision come in. If any of you have been involved with an implementation of any given software, you know the kind of questions that come up. You know the kind of stress, indecision, all the issues that arise. Basically, you start asking yourself, what does the fully functioning product look like? How does it fit in with our existing BI stack? We use a lot of Tableau at Intermountain Healthcare. We have an Oracle-based EDW. We used a lot of SQL Developer. We used Excel. Like a lot of other organizations, how does Alteryx fit in with that?

Essentially, we also wanted to understand the payment structure. Would individual departments pay for individual licenses, or could we justify an enterprise license? There were a lot of questions going around and we didn't really have the answer to any of those yet while we were starting to implement. Then, we have this quote. This is one of the best, yet worst, quotes I've heard about Alteryx. I'll do my best to do some voice acting with this. There's a couple of ways that you can read this. First is the "Holy crap, that's a lot of power." That's great. Alteryx empowers analysts and that's what we want to do. That's their vision as I interpret it and that's what we wanted to do at Intermountain Healthcare. We wanted to empower the analysts to do what they do best. That's great. We get a lot of power into their hands so they can do what they need to do.

For all of you nerds out there, with great power comes great responsibility. Let's read this in a different way. "Holy crap. That's a lot of power." Our chief data officer had a lot of hesitation: when we saw what Alteryx could do, there was a lot of concern surrounding that. We realized there was going to be some more resistance to this, not just implementation resistance, but a lot of other issues. That's something that Lynsie's going to explain a little bit further.

Lynsie Daley:
We are at our second barrier. As you can imagine, most of those concerns and questions revolved around data governance and security. Like James is saying, the tool is extremely powerful and that's great, but it also lends itself to abuse, or at our organization, to writing ETLs, which is something that analysts don't do traditionally. We had to figure all of that out. Along with that, we really wanted to try and keep our one source of truth going. A lot of people had concerns about having disparate data sources on everybody's machine: everybody's manipulating data with Alteryx, keeping all of these different data sources on their machines and reporting things in different ways. That can be really useful as well, but it has to be done in the right way.

Then also, managing security and credentials. When we share workflows, how safe are our credentials? Are people seeing our passwords and user names? Are we able to keep those secure? What's the security like when we publish from Alteryx to Tableau? Are we staying secure in that way? All of those questions had to be answered, and that honestly took up the majority of the time that we were trying to work on our implementation. Just this piece alone took months and months to figure out.

In the end, what we settled on was basically a production workflow, and what that is composed of is a publication guide, along with a publishing checklist. We broke this barrier with this. This is just a way to provide an extra gut check. We've got this publishing guide, which I'll talk about a little bit more in a minute, but our analysts review that before they publish to production. This is not limiting analysts with their experimentation or being creative. We don't have any boundaries with that. We're just saying that when they're getting to the production stage and they're going to publish on our production server, at that point, they go through this gut check and have somebody review their workflow. That way, all the requirements are satisfied before they publish.

To do that, we have this checklist and we have a committee. Somebody from that committee will review the workflow, and we track that with JIRA, which is a project management tool. Once the reviewer has gone through all the requirements, either the analyst has to go back and fix some quality issues, or the requirements are satisfied and at that point, they can publish.

This is just a sample of what our publishing guide looks like. It's just about 10 pages and it's basically what to expect if you want to publish on the production server. We expect our analysts to review this as they're thinking about publishing on production and to generally be aware of these guidelines that we've come up with. Once they've reviewed that and they're ready, they'll go through this checklist. This addresses things like data persistence, using the predictive tools, making sure that you understand those tools or have consulted with somebody who does, and data sources. Once that checklist is satisfied, then they're good to go and they can publish.

This also is just what we've imagined as our production workflow. You start out developing your workflow on your machine and then from there, you go to our sandbox server, where you can test your workflow. You can basically do anything on that server. While your workflow is on that server, which we try to keep to a maximum of 60 days, the publishing committee can look at the workflow, review everything and make sure that it satisfies the requirements. At that point, it moves to the production server, where it then would go out to a Tableau extract or hopefully, soon in the future, an EDW table or an analytic application.

James will take it from here and talk a little bit about how we rolled out the tool.

James Selfridge:
We feel like we've got all the governance put in place. We feel like we've addressed most of the questions that need to be addressed in terms of implementing. Let's start rolling this out. Let's grow the number of licenses that we have and put it in the hands of the analysts and see what we can do.

First, who here has heard of the Gartner Hype Cycle? Okay. A few of you. I'm not sure if Gartner first came up with it, but essentially what this explains is what they call a technology trigger. Think of artificial intelligence, self-driving cars, some new concept that gets introduced. Over time, it's going to get hype. There's going to be a lot of expectations about this new technology. Self-driving cars are going to reduce deaths to essentially zero. AI is going to solve world hunger, cure cancer, all of these things. We've got tons of hype about this new technology.

After a while, once we get more comfortable with that technology, we're going to go into this "trough of disillusionment". Essentially, we'll understand there's a lot more limitations to this technology and then finally, we'll get into the real use cases and understand how that technology can be applied and essentially go into the long term use of it.

We went through that. We're still going through it with Alteryx. For us, we had that technology trigger and at this point when we were trying to roll out, the point at which we were on this Gartner Hype Cycle, right there. That was fun. We were really excited. We said we're going to roll this out. We know we can do a ton with it. This is what we're going to do and then, we went into this disillusionment phase.

We realized our organization wasn't quite ready for this yet. I love this slide from Gartner. Our culture had to really embrace this change. A lot of people started asking questions. Well, why do we need to deal with this new technology? We already have existing technology that can do the exact same thing. How do we deal with this new development? Basically, we had culture shock and we're still dealing with it.

To get an idea, how many of you are in the healthcare industry? Insurance providers? Okay, so keeping that in mind, of all of you who are in the healthcare industry and know about the Affordable Care Act and the new legislation that's being introduced, how many of you think that your organizations could survive long term without needing to change? That's the kind of thing that we were talking about with people.

If you're not willing to embrace this kind of change, this very, very disruptive change, you won't survive. That's just a fact of life. It's a fact of business. If you aren't willing to change, you will not survive as an organization, especially in healthcare. That's one of the points that we tried to drive home: we need to be willing to embrace change. We need to be willing to accept certain forms of failure. You need to fail fast, figure it out and move forward. Here are some of the sources of culture shock that we went through. Analysts writing to the EDW? A lot of organizations use Alteryx as their data architecture platform. That's great. It's a lot of return on investment for them. That's fantastic.

One of the sales associates we had with our EDW said that's the closest thing I've seen to an actual EDW anywhere. When we talk about our EDW, it's extremely important to us. We've invested thousands and thousands of hours and millions and millions of dollars in it. It really is important to us, so when we talk about analysts writing to our EDW, that's something very serious that we take into consideration, because we have dozens of data architects that have built it up over time, over decades. That's how long we've had our EDW.

Some things that we've also talked about. What about acceptable production sources? Going back to the EDW, if you say the word Alteryx and ETL in the same sentence at Intermountain Healthcare, you're going to get slapped in the face. You can't do that. We had multiple occasions where people said don't you dare say ETL and Alteryx because it just doesn't happen. Over time, maybe, but that's a long ways away. That's a big culture shock for us.

What about master data, advanced analytics, security needs? There's still a ton of questions surrounding those and we're still trying to figure them out. We know we can do a lot with Alteryx and that's great. We're still in that process. Lynsie's actually going to go into one of these topics a little bit further.

Lynsie Daley:
This type of culture shock deserves its very own mention. We've got our analysts split into two different groups: technical data analysts and statistical data analysts. Those statistical data analysts are just a little bit more savvy with the different methods and statistics. As you can imagine, there was some heartburn over using Alteryx. Our technical analysts really had no problem. They're really excited. They finally have a tool where they can do predictive analytics, and it's easy and there's no learning curve, so they're great with it.

Our statistical data analysts are the people that really had heartburn, because they didn't understand why they would drop R and pick up Alteryx. They already know how to code in R. What's so special about using Alteryx? It's almost like dragging and dropping just wasn't anything that they were interested in. They wanted to code.

I think that we're still in the process of trying to get some of those analysts to look at this tool because with the R interaction and Alteryx, the possibilities are limitless. As a person with a statistical background myself, I see a lot of potential with the predictive tools. That's one of the pieces of culture shock that we've had.

The next win is our internal Alteryx User Group. This has been huge for us. We created a group. We call it All-Da-Tryx. We have a steering committee and an analyst-facing knowledge-sharing group, and we meet monthly. All 255 of those analysts are invited to this meeting. Every month, we're getting Alteryx up in front of these analysts and we're teaching about it.

Along with that, we have taken advantage of more Alteryx resources, and I think James wanted to especially call out Chad here.

James Selfridge:
Chad is in the back. Everyone look at Chad. Stare at him, and then if you could all clap at him. I mean, he's done a wonderful job for Intermountain. He has been instrumental in our process in going through this journey. We have absolutely needed him. He's been absolutely necessary for us.

Lynsie Daley:
Yes, and most recently we've started office hours with Chad. Chad has set up a WebEx, and bimonthly now, we can call in. Any analyst that's working on a workflow and has an issue or a question can call in, we have a discussion, and Chad's there to help with any questions. If Chad doesn't know the answer, somebody from Intermountain might know the answer, or Chad will help us find the person who does. That's been great.

Of course, the email support is great, too. With this internal user group, we have seen our usage of Alteryx just go up exponentially since that started a few months ago. I think it's really, really important to have this community inside the organization where users of Alteryx know they can come when they have questions or when they need help with a workflow or even when they've just figured out something really cool and they want to share it. That's been instrumental in breaking down our barriers.

James Selfridge:
Now comes Oprah day. Everybody gets Alteryx. This was the day that we had been looking for, for a long, long time. Essentially, what we said was, we've got all the governance in place. We felt like we'd dealt with the culture shock. We've got some training. We got a lot of things in place. Let's start handing out the licenses and get it into everybody's hands. Data analysts, BI developers, data architects, marketing analysts, financial analysts, actuarial analysts. Anyone who had a title where we thought maybe they could use Alteryx, we gave it to them and said go. Develop whatever you want. Learn however you want. Essentially, do what you can do and find that value.

We loved that day. It was fantastic. Here's where we currently are. Again, going back to when we started, it was about September 2015. Almost two years later, in terms of production, the single thing we can do with Alteryx is produce a TDE on our Tableau server. People will laugh and say, are you serious? It's been two years. That's where we're at. That's our comfort level with Alteryx in terms of production. We've already seen the return on investment from it.

Slowly but surely, we're going to expand that scope and that's great, but that's all we can do right now. Now, we're starting to talk about Microsoft Access. It's kind of one of those hush-hush things: don't talk about how you're really using Microsoft Access in your normal workflows. What about emailing? Can we take advantage of that feature? Sandbox tables in our EDW. Can we use those as production data sources? How does that integrate? What about analytic apps? There's a lot of power behind those in using self-service with our customers. That's great, but we need to answer some questions about those first before we put them into production.

Then, finally macros and security. How do we deal with all those questions? For those of you who are more visually inclined, like me, basically we think we can do a lot with Alteryx and here's where we're at currently. We know there's a lot of room to grow. There's a lot of production capabilities that we can put into place. We're just not there yet and it's going to take some more time. Slowly but surely, we're going to get there and be able to use all these awesome features that we keep hearing about at the conference.

With that, let's get to some of the use cases, the actual fun part.

Lynsie Daley:
Yes, so this is the exciting part. Who doesn't like to share the cool work that they've done? We've got a few really neat things that we've been able to do with Alteryx that we'll go through really quickly here.

The first use case is a cash flow time-series dashboard. The problem is that I had a statistical data analyst come to me with some R script and they had completed a time-series analysis for the treasury department that predicted how much cash would be on hand in the various accounts in the future. The problem was that the customer was looking for something more visual, something that they could share with lots of people, and something that they could refresh on demand. R by itself doesn't really lend itself to that.

We started talking and the solution ended up being using Alteryx in addition to Tableau and the R interface. If you've used the R interface in Tableau, you know that it's not just a matter of plugging in R code into a calculated field and getting the results. It's a mess. You have to conform to all of the nitpicky things with the R script. Probably 75% of the script that was handed to me was data preparation. My idea was why don't I build out a simple workflow in Alteryx that would complete that same data preparation and push that data into Tableau and from there, I can just do the few lines for the time-series analysis in Tableau with the R interface?

That's what I ended up doing and it worked out really smoothly. Just as an example, here's some of the R code that I was given. All of that that you see there is just various data cleansing processes. They're changing data types, they're doing some date math, and it goes on and on from there. Here's just an example of the workflow that I created. Very simple, just some formulas to recreate what was happening in that R script. Now, I've got this workflow that's pushing to a TDE file that I can open up in Tableau, and I can refresh this workflow on a schedule and then it updates my Tableau dashboard. It works really great.
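To give a flavor of the kind of cleansing that moved out of that R script, here's a minimal pandas sketch. The column names and values are invented for illustration, not Intermountain's actual data or workflow; it just shows the same pattern of changing data types and doing some date math before the data reaches the forecast:

```python
import pandas as pd

# Hypothetical raw cash-flow extract; columns and values are illustrative only.
raw = pd.DataFrame({
    "account_id": ["A-100", "A-100", "B-200"],
    "post_date": ["01/03/2017", "01/10/2017", "01/03/2017"],
    "balance": ["1,250.75", "980.00", "3,410.10"],
})

# Change data types: parse dates and strip thousands separators.
clean = raw.assign(
    post_date=pd.to_datetime(raw["post_date"], format="%m/%d/%Y"),
    balance=raw["balance"].str.replace(",", "").astype(float),
)

# Some date math: bucket each posting into its Monday-based week.
clean["week_start"] = clean["post_date"] - pd.to_timedelta(
    clean["post_date"].dt.dayofweek, unit="D"
)

# Roll up to one row per account per week, the shape a weekly
# time-series forecast would expect.
weekly = clean.groupby(["account_id", "week_start"], as_index=False)["balance"].sum()
```

In the talk, this prep lived in an Alteryx workflow feeding a TDE; the pandas version is just an analogue of the same steps.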

There's just a snapshot of some of the R code in Tableau that's completing that time-series analysis. It's just an ARIMA model. I really just had to do three or four calculated fields like this once I brought that data in from Alteryx. This is the dashboard that we came out with. The top is just very visually showing that trend, and then when you get to the blue color, we're looking at forecasted data. In there, there's a parameter where the user can control the number of periods that they want to forecast into the future. Then, we've got the confidence limits with it as well.

The bottom is really just showing accuracy. The actual versus the predicted values in that left-hand side. Then, looking at it more numerically in the middle, and then on the very end of that bottom, you can click on a week in that middle chart and then drill down to a certain account and see all of the balances.

That ended up working really well. The treasury department was really happy because now they've got this really quick visual interactive dashboard that they can use to look at their cash on hand in those accounts in the future. There's no more having to rerun the R code, export the images and all of that, so this was a really big win.

The next use case is one that I really enjoyed doing. This is predicting high opiate users with low back pain. Opiate use has become a pretty big deal, and at Intermountain, we were wondering if we could try and predict those patients who would be at risk for using a high amount of opiates. We tried this specifically with our low back pain, chronic pain patients. I used the Alteryx predictive tools to do this.

I brought the data in and prepped it, and then I was able to build four different models really, really quickly and compare those with the model comparison tool that you can find in the Predictive District on the Gallery. It's really great because then I can just look at the results, figure out which model is the best and go from there. We'll get into that just a little bit.

This is my workflow. I'm just reading the data in, filtering to the patients that I want to look at and then creating test and train sets. I'm building four different models: a boosted model, a decision tree model, a random forest model and a logistic regression with step-wise selection. Then, I'm unioning all those results together into the model comparison tool, and then we can look at the output from there.
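A rough Python analogue of that model-building step, using scikit-learn on synthetic data. Plain logistic regression stands in for the stepwise-selected version (scikit-learn doesn't ship stepwise selection), and none of this is the real patient data; it just mirrors the train/test split, the four model families, and the accuracy comparison:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the opiate-risk data set.
X, y = make_classification(n_samples=600, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# The same four model families built in the Alteryx workflow.
models = {
    "boosted": GradientBoostingClassifier(random_state=0),
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "random_forest": RandomForestClassifier(random_state=0),
    "logistic": LogisticRegression(max_iter=1000),
}

# "Union the results" and compare overall accuracy on the held-out set.
accuracy = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    accuracy[name] = accuracy_score(y_te, model.predict(X_te))
```

In Alteryx, the model comparison tool does this side-by-side scoring for you; the loop above is the hand-rolled equivalent.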

Those are my four models. The first column is showing the overall accuracy. You can see that the models are fairly accurate but the most accurate model there is the logistic regression. If you look at the individual accuracies, if you look at patients who are predicted to not be high opiate users, it looks pretty accurate but then when you look at the ones that are predicted as high opiate users, which is what we care about, it doesn't look so good. The logistic regression is still the best model.

Also output are these two charts: the true positive rate against the rate of positive predictions, and then the area under the curve. Both of those show the logistic model to be the best model out of the four. Those are the results from that logistic model. From there, I've picked out a few of the most significant variables: whether the patient has a medication management agreement signed with their physician, whether they're also taking a non-opiate analgesic, like ibuprofen, and whether they're also taking benzodiazepines, like Valium.
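Those two comparison measures can be sketched with scikit-learn's metrics. The labels and scores below are invented to mirror the pattern in the talk: AUC picks out the better-separating model, while per-class accuracy at a cutoff shows that recall on the high-user class, the class we actually care about, is the number to watch:

```python
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

# Hypothetical test-set labels (1 = high opiate user) and model scores.
y_true = np.array([0, 0, 0, 0, 1, 1, 1, 0, 1, 0])
logit_scores = np.array([0.1, 0.2, 0.15, 0.3, 0.8, 0.7, 0.6, 0.4, 0.55, 0.25])
forest_scores = np.array([0.2, 0.3, 0.25, 0.45, 0.6, 0.5, 0.55, 0.5, 0.4, 0.3])

# Area under the ROC curve: one of the two comparison charts.
auc_logit = roc_auc_score(y_true, logit_scores)
auc_forest = roc_auc_score(y_true, forest_scores)

# Per-class accuracy at a 0.5 cutoff: overall accuracy can look fine
# while recall on the high-user class lags behind.
preds = (forest_scores >= 0.5).astype(int)
tn, fp, fn, tp = confusion_matrix(y_true, preds).ravel()
recall_high_users = tp / (tp + fn)
```

With these made-up scores the logistic-style model separates the classes perfectly (AUC of 1.0), the forest-style model less so, which is the same story the model comparison tool's charts tell.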

Those correspond with what we've seen before. From this, this is just very preliminary work, but we were able to later start some work on risk stratification and putting these chronic pain patients into buckets, depending on how much of a risk they are with opiates and using high amounts of them.

That project showed a lot of value, as well. James has a couple of really cool use cases, as well.

James Selfridge:
We have an existing license with MapInfo, but it's actually pretty expensive. What we wanted to figure out with Alteryx, since we saw that it could do some spatial analysis, is whether we could replicate what we do in MapInfo currently and bring it over to Alteryx. Let's just do something simple and produce a static map. Well, lo and behold, we can. Essentially, we bring in some patient encounters, do some patient origin mapping on that, map some of our clinics, do some drive-time analysis as well, which we couldn't do, or couldn't easily do, in MapInfo, and produce a static report, which was great to see.

Essentially, we saw some of the reporting capabilities currently and what you can do now. For those of you who have been in Alteryx for a while, how many of you actually use the reporting capabilities in there? Very few. Of those few, how many of you find it to be an absolute joy? You love it. Yeah, right? Okay.

We saw a new version. It's coming out. We're excited. We want to use more of the reporting capabilities and do more with it, rather than that maybe 80's, 90's interface. It's going to be a lot better. Essentially, this is what we can produce right now, and we can do the drive-time analysis. We can show patient origin. We can label. We can produce all these tables, and now from this, we have some ongoing discussions of, okay, where is our optimal clinic placement? We found out there was a coverage gap that we didn't know existed before, so there are talks starting about where we should place these service lines for patients who are basically underserved within our system. How can we get our clinics closer to them?
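The coverage-gap idea can be approximated in plain Python. This sketch uses straight-line (haversine) distance as a crude stand-in for the drive-time analysis in Alteryx, with made-up clinic and patient coordinates; a patient is flagged when no clinic falls within a 20-mile radius:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two (lat, lon) points."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical clinic and patient coordinates (roughly the Salt Lake area).
clinics = {"downtown": (40.7608, -111.8910), "south": (40.5622, -111.9297)}
patients = [(40.76, -111.89), (40.25, -111.65), (41.22, -111.97)]

# Flag patients whose nearest clinic is more than 20 miles away,
# a straight-line proxy for "outside the drive-time polygon".
gaps = []
for lat, lon in patients:
    nearest = min(haversine_miles(lat, lon, clat, clon)
                  for clat, clon in clinics.values())
    if nearest > 20:
        gaps.append((lat, lon))
```

Real drive-time polygons follow the road network, which is why the Alteryx drive-time tool is the better instrument; the radius check just illustrates the shape of the question.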

In the future, now we're talking about site selection optimization. What about the Experian data? The demographic data and population projections? There's a lot of interesting ideas going around about how we can better utilize Alteryx to do more of our spatial analysis and not just produce static reports. We want to take it a step further, something that we couldn't do in MapInfo before.

That was one of our awesome use cases that we could find with Alteryx. The next one is AYASDI. How many of you have heard of AYASDI? They're relatively new. Okay, good. Essentially, what AYASDI believes is that there's shape to your data. They do a lot of topological modeling. We actually started a pilot with AYASDI around the same time that we started our pilot with Alteryx. The specific product that we go into is AYASDI Care... It's specifically meant for healthcare providers. What it will do is produce these care process models. Say, for example, I'm diagnosed with bronchitis. This will say that the best possible outcome for this particular patient means that the provider should follow these steps in order, with these supplies, etc., etc., and basically reduce outcome variation with these individual care process models.

What we were trying to do is basically pilot this tool and say, okay, how can we produce these care process models? How can we validate the ones that we already have? We were stuck because it doesn't include any data preparation capabilities, and it needs the data to be in a very specific format to ingest. Essentially, we were stuck. We were paying for this really expensive product but we couldn't use it, so of course, Alteryx comes in and we figure this out. Do you remember that data analyst from HR? The superstar? Shelly? Of course, she comes in again. She makes this awesome workflow, combines all the data that we have in our EDW with all the data sources that we need, produces the three files that AYASDI needs, and parametrizes it so other data analysts can use it to validate their own care process models and even produce others.

Now, we get to see the benefits of AYASDI. It basically enabled us to use this tool that we were stuck on. It was awesome to see that we could actually put this into practice with another tool as well that we just weren't seeing before.

Now, we'll go into basically summarizing some of the benefits that we've seen so far.

Lynsie Daley:
We thought it was really important to specifically point out some of the benefits that Intermountain has seen from this tool over the last couple of years. The first one is innovation. Alteryx has freed up time for our analysts and allowed them to be creative and innovative. That's a big deal because analysts are busy people and sometimes don't have a lot of time to do things like that. It's really allowed us to push the boundaries and find new ways to do things.

Predictive analytics. We've talked about that. Alteryx removes the learning curve. We've had a lot of analysts that don't have that extensive statistical background be able to start using these predictive tools and that's a big deal to us. We're trying to get more into predictive modeling, and Alteryx is a great tool for those who don't have that background.

Automation is probably the biggest benefit for us. We have taken so many manual processes that were taking hours and hours and put them into Alteryx and saved just countless hours.

The last one is speed. We write a lot of SQL at Intermountain, and we have a lot of complex queries that are really sub-query heavy. A lot of them weren't returning data at all. We got the idea to break those queries apart in Alteryx and then try to run them. We're getting results within minutes instead of not getting results at all. That's been a big benefit to us as well.
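The decomposition idea Lynsie describes, materializing each stage of a nested query rather than running it as one deep statement, can be sketched roughly as follows. The table, columns, and data here are made up for illustration, and SQLite in Python stands in for their warehouse and for Alteryx:

```python
import sqlite3

# Hypothetical schema, not Intermountain's actual data warehouse.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE encounters (patient_id INT, cost REAL);
INSERT INTO encounters VALUES (1, 100), (1, 300), (2, 50), (3, 900);
""")

# Sub-query-heavy form: everything nested in one statement.
nested = con.execute("""
    SELECT COUNT(*) FROM (
        SELECT patient_id, SUM(cost) AS total
        FROM encounters GROUP BY patient_id
    ) WHERE total > (SELECT AVG(cost) FROM encounters)
""").fetchone()[0]

# Staged form, like an Alteryx workflow: materialize each step,
# then run simple queries against the intermediate results.
con.execute("""
    CREATE TEMP TABLE per_patient AS
    SELECT patient_id, SUM(cost) AS total
    FROM encounters GROUP BY patient_id
""")
avg_cost = con.execute("SELECT AVG(cost) FROM encounters").fetchone()[0]
staged = con.execute(
    "SELECT COUNT(*) FROM per_patient WHERE total > ?", (avg_cost,)
).fetchone()[0]

assert nested == staged  # same answer, simpler individual steps
```

Each staged step is trivial for the engine to plan and execute, which is one plausible reason breaking the queries apart returned results where the monolithic versions stalled.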

Here we are at the summary. Just to recap some of the barriers that we saw in our implementation: figuring out the tool in general, dealing with all of the data governance and security issues, and the cultural resistance. As we were overcoming those barriers, we also saw a lot of wins. We were able to get that first big use case that really opened the eyes of our leadership and showed them that Alteryx was valuable. Coming up with our publishing guideline checklist made a huge difference in the comfort level that our leaders had with us publishing the workflows we were working on into production.

Then, our internal user group has been a big deal to us as well. Just quickly, some tips for implementation success if you're looking at doing this within your organization. Take advantage of all of the training and support offered by Alteryx. They offer it for a reason and it's very, very helpful. Gather as much feedback as possible from early users: get as many people as you can in front of the tool, banging on it, figuring out what they can do, and collect that information. That's the important part. Address culture shock. Every organization is going to have it, but it's going to be different everywhere you go. Look for the unique culture shock in your organization and address that.

Finally, have an internal user group for Alteryx. It really is a great way to form a community of people who are using something in common and share the knowledge between people and figure out problems.

With that, we can take questions for a few minutes, but just a reminder to complete the session survey on the app. That would be great. I think we can go ahead and we probably have a few minutes for questions.

James Selfridge:
Sorry, you have to wait for the microphones.

Crowd question:
Hi. I wanted to hear a little more about that review process that you set up. I'm curious to know who the reviewers are, what their titles are, and whether you feel like it causes a bottleneck in the process of trying to get analysts to that point of publishing. Also, what is the turnaround time of the review process? Is it hours, days, weeks? Just general stuff like that.

Lynsie Daley:
That's a really great question. The people who make up that group who are reviewing the workflows are just other data analysts. We have what's called an Alteryx steering committee. I think there's about seven or eight data analysts on that group that represent different areas of analytics within Intermountain. Right now, we just have a process set up in JIRA where the analysts who created the workflow will go in and sign up for a review, and anybody from that committee who's available will pick up that ticket and then start looking at that workflow. Truthfully, we haven't had a huge opportunity to test it out yet so I can't say for sure if it's going to be a bottleneck in the process.

We're hoping that it's not. We're hoping that with the number of people we have to review, it will be a quick, maybe one-day turnaround. Does that answer your question?

James Selfridge:
I'll add to that. I've gone through that process. It's not like tearing your nails off or anything like that. It's bureaucracy, and bureaucracy is bureaucracy. I get it. It hasn't been bad.

Crowd question:
It seems like since it's a peer process, it might not create the bottleneck it would if it went through security or IT or anything like that. [inaudible 00:41:44].

Lynsie Daley:
Yeah, definitely having it as a peer review process has its benefits.

James Selfridge:
Yeah, much better. Other questions?

Crowd question:
The first question is a simple one. What is the sample size for predicting opiate abuse? How big was the sample size for the learning part, for training the model and testing the model?

Lynsie Daley:
There were probably about 2,000 in the total data set, and I think I split it 70/30: 70 for training and 30 for validation.
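The 70/30 split Lynsie describes can be sketched in a few lines. The record count of roughly 2,000 comes from her answer; the data itself, the seed, and a plain random holdout split (rather than whatever Alteryx's predictive tools do internally) are assumptions for illustration:

```python
import random

# Stand-in records; the real data is their chronic pain registry.
records = list(range(2000))
random.seed(0)        # arbitrary seed, just for reproducibility
random.shuffle(records)

cut = int(len(records) * 0.7)   # 70% for training
train, validate = records[:cut], records[cut:]

assert len(train) == 1400 and len(validate) == 600
```

Shuffling before cutting matters: slicing an unshuffled registry could put, say, all of one facility's patients in the validation set.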

Crowd question:
Okay, thank you. Does that 2,000 represent a year? What is the size relative to your patient volume?

Lynsie Daley:
It is one year.

Crowd question:
One year.

Lynsie Daley:
We have a chronic pain registry in our data warehouse and it's a rolling 12 months. The number of opiate prescriptions that the patient had is the number that they had in the past 12 months to date. Does that make sense?

Crowd question:
Yes. The other question I had was more general overall. Did you run into questions about data validity or data accuracy, and how did you deal with that, if you have any experience to share about that? Skeptics will question: how good is it? How do you answer that question?

James Selfridge:
Can I take that one?

Lynsie Daley:
Yes, definitely.

James Selfridge:
When we ran into those questions of "is your data valid, are you going through the process you should be going through," we always asked a different question: well, what's the current process? Regardless of what Alteryx does, let's take Alteryx out of the equation. Are they doing the same thing in SQL Developer? Are they doing the same thing in Toad? Are they doing the same thing in Excel?

Basically, what we're saying is it doesn't matter whether they're doing it differently or not. Alteryx is just bringing that question painfully to light: how are we dealing with our data governance? The sad truth was, we didn't know. We had no clue. There were all these, I don't want to call them rogue processes, but essentially they were processes that we didn't know about and they were very manual. They involved a lot of emails, spreadsheets on network locations, Access databases, things like that. Basically, what we're saying is let's use Alteryx with that publishing governance and bring those processes to light to help us learn where we are lacking as an organization.

Essentially, we just had to deal with that question head-on. People were trying to associate Alteryx with this "oh no" moment, but we said take a step back. It really doesn't matter which tool you do it in.

Another question here? Oh yeah, that guy.

Crowd question:
Thank you for the presentation. It's really good information. I saw you mentioned you have a pilot server with a production server and that 60-day period where you want to try things out, or something like that. First of all, how is the infrastructure set up, if you know, and how are you doing that part? Does the review need to happen before they even go to pilot, or can they go and try it out? How do you manage that?

Lynsie Daley:
As far as the infrastructure, I'm definitely not an expert on that. We've got a couple of administrators that handle that for us. With our test server, that's open to anybody and anything, so we don't require a review before anything's published there. We encourage our analysts to use that server to experiment and try out their workflows. It's only when they are going to go up to that production server that we require the publishing review.

James Selfridge:
We've got a couple of virtual machines set up for the servers. It's self-managed. All that content is self-managed by the analysts.

A question here? I know we're kind of out of time. I'm happy to hang around if you guys are.

Crowd question:
Sure. I can come up afterwards. Quick question on your demonstrations. You showed an HR solution, a treasury solution, a clinical pathways solution, and then a narcotics one. Did you develop those yourselves, or did you have to work with those subject matter experts and assist them in developing those products? What if they had their own analysts? How did you deal with that?

Lynsie Daley:
Do you want to take that?

James Selfridge:
With the HR use case and the AYASDI use case, those were developed by the specific analysts who had expertise in those areas. The spatial use case was developed by me, having that subject matter expertise, and the ones that Lynsie showed were developed by her. Basically, it's the analyst who had the expertise in the subject area who built it out. They built out the foundation and then, after meeting with the business owners, refined it to what it needed to be.

Any other questions? Ah man, all right.

Crowd question:
[inaudible 00:47:11].

Lynsie Daley:
We're happy to.

Crowd question:
I just want to say a great, great thank you to James and Lynsie for coming in and presenting.


