ShopAtHome - Doing More with Less - Inspire 2017

With over 15,000,000 active visitors and 5,000 affiliated merchants, ShopAtHome.com generates huge volumes of data across 3 platforms. With just over 50 employees - 4 of whom oversee data and analytics - managing the business is no trivial task. For years, they relied on operational data stores and Excel spreadsheets to make 'data-driven decisions'. This fairly reactive approach worked for years, but with market changes, a proactive approach was needed to ensure sustainability. EDW projects made self-service analytics a reality, but the process was slow and expensive, and it made data quality issues and data governance unwieldy. With Alteryx, ShopAtHome is meeting its day-to-day needs and has accelerated its long-term roadmap. Join this session to learn how ShopAtHome has leveraged Alteryx to:

  • Improve source data quality with the predictive tools
  • Significantly reduce data product 'time-to-market' with the data blending and publishing tools
  • Expand its data ecosystem using 3rd party connectors
  • Make more proactive data decisions to remain competitive in the affiliate marketing space


Video Transcription


Matt Simcox:
Thanks, everybody, for coming out. It's late in the day, for sure, I know. Lots to absorb, so hopefully we'll be pretty quick, but I'll give you some good education and insight into what we've been doing with Alteryx. As Beth mentioned, my name is Matt Simcox, Director of BI for ShopAtHome. My friend here Zach is a senior BI analyst, an all-around good guy, and an Alteryx guru at that.

What are we here for today? I want to talk about how we've leveraged Alteryx to fundamentally change the data culture at ShopAtHome. It was fairly, kind of, cancerous, for lack of a better word, for a long time. BI was almost a silo, a bottleneck, if you will, and that made a lot of different things difficult. As a result of that, you'd have your marketers, your product owners, even your executives out there firing up SQL Server Management Studio, writing queries poorly, and answering their own questions their own way.

Commonly, we would have two different flavors of questions. Number one, “Hey BI, I'd like this one answer to this one question that I'll probably forget about tomorrow, and I'd like it today.” Or, ”Hey BI, I ran my own query and it's not the same as what you have on your dashboard, so clearly I'm right.” That's a tough environment to deal with. That's the way things were at ShopAtHome.

Now, a little bit more about the story. Quick agenda: I'll tell you a little bit about ShopAtHome. I'm going to guess maybe not everybody here has heard of us. No, we're not ShopAtHome on the QVC network, we're ShopAtHome.com. I'll give you a little bit of detail around the business problems we were facing; I kind of already alluded to them. But what are those business problems and how did we approach them? At which point Zach will deep dive into some of those use cases. I can't do justice to everything that we've done in the one year that we've had Alteryx in 35 minutes, so hopefully the five use cases that we provide will give you some insight into what we've done and the challenges that we faced.

Zach will then deep dive into really our crown jewel. The first project we had, of course we dove right into predictive, but it's around data quality: basically predicting data problems immediately when they happen, as opposed to waiting weeks if not months and only then realizing we've missed a revenue forecast, etc. With what time we have remaining, we'll be happy to answer any questions.

A little bit about ShopAtHome. Way back in 1986, believe it or not, we were founded in the bedroom of our founders, Marc and Claudia Braunstein, as a catalog of catalogs. Who remembers those? Fast forward a little bit, we converted that catalog model fully to a dot-com model in 2003. We then shifted that model to more of a cash back model. What do I mean by that? We're in what's commonly referred to as the affiliate marketing space, which basically means we coordinate traffic to the likes of Amazon, eBay, Best Buy, etc. in exchange for a commission.

Then what happens is basically we hand part of that back to the customer. We're paying you... all of you, hopefully, one day, to shop. We converted to that model in 2006 and began a real core focus on data centricity. In 2012, we began development on our data warehouse. In 2014, we purchased Tableau and established it as our analytics system of record. In 2016, we formally introduced self-service analytics, getting out of the SQL game and more into dragging pills around.

We also bought a little software application called Alteryx. Then in 2017, we fundamentally changed the way we do business. Again, less of a focus on revenue, more of a focus on margin, more of a focus on the details, and in order to do that you have to have a really firm grasp on your data. A few other tidbits: about 15 million active visitors, about 5,000 affiliated merchants, of which we've got Amazon, eBay, etc. We've got about 50 employees and a pretty small BI team to do all that, so that's why we're doing more with less.

Let's talk a little bit about the business cases. There are really three key focuses here. Number one, source data quality. Number two, cycle time. Again, I kind of alluded to being a silo and just waiting days upon days, weeks upon weeks for that one answer. Then just the breadth of our data, so we needed to do better by way of integrating other data sources to make the insights that we could glean much better and more focused on that margin bottom line.

The first one, source logging data quality. You're going to see a theme here. It's all about effort, metrics, and the bottom line. Pre-Alteryx, we were highly reactive. As I mentioned earlier, it was commonplace for us to miss on revenue and then wonder why, and have to go dig into those details and discover, oh gosh, our website has been mis-logging for a month, or whatever the case might be. As a byproduct of that, obviously, our confidence in our KPIs wasn't where it needed to be by any stretch of the imagination. The natural byproduct of that is lost revenue opportunity. Nobody wants that if they're running a business. Post-Alteryx deployment, we're very proactive, which is what Zach is going to dive into.

It leverages the predictive tools, time series forecasting, and it tells us immediately if we've got a problem, whether it be something systemic or through interaction with our customers. Hackers are out there, I think we all know that at this point in time. As a byproduct of having these tools in place now, today we've got much more confidence in and alignment to our KPIs. We've got optimized revenue conversion. We're losing out on very little revenue.

From a time-to-market standpoint, we're talking about being a bottleneck. That's not a good place to be if you're BI, trust me. From an effort standpoint, BI was writing stored procedures to handle one little problem. It would go serve a dashboard or whatever the case might be, and we'd move on. That made our backlog tremendously large, and people don't want to wait. What do they do? They fire up SQL Server. They'll write a query, pretty poorly most of the time, and answer their own questions. Not a good place. Then from a dollars and cents standpoint, simply put, we were under-equipped from a decision-making standpoint and the bottom line was directly impacted as a byproduct. Post-Alteryx deployment, from a cycle time standpoint, we're dragging and dropping, we're not writing code, and that's the best case scenario to be sure.

Alteryx and Tableau in conjunction have helped us get there, which is the second point. Because we've got Alteryx in place, we're now leveraging Tableau truly from a self-service standpoint. Folks can go drag pills around, create their own reports, and we know that the answers are right because we govern the platform. We're doing what we should be doing. One of our problems was, we'd have our product developers, who are supposed to be directly impacting the bottom line with features of our applications that will drive revenue, doing ETL. They're not supposed to be doing ETL. We've gotten out of that business for sure. We have common truth. Again, as a byproduct of having Tableau in place as a self-service platform, our margin is emerging positive, and that's a great place to be.

Third-party data integration, the third core theme here. From an effort standpoint, as I mentioned, we're doing all the ETL. We govern our data, we move our data around. Another core subject was bringing data in manually from cloud applications. We use Salesforce Marketing Cloud for our CRM channel, and it was commonplace, a day and a half a week I think is what Zach was estimating, to just bring that data in manually, run it up against our internal data and make decisions from it. You're two, three days behind at that point.

I kind of already touched on this, but we had very little real-time visibility into our email channels and others. Different kind of subject, but from a data monetization standpoint, we just weren't effectively leveraging that capability by any stretch, and we've now got a scalable mechanism in place that Zach will dive into a little bit.

Let's deep dive into a couple of use cases. You're going to see some common themes, but what I want to do is reiterate the business requirements, and Zach will go into some imagery that will show you the workflows. The first one is, as I kind of alluded to before, the predictive aggregated logging monitor. This is the mechanism by which we identify logging problems and know they're statistically significant. Our business requirements: hey, let's identify data quality issues when they happen, not two weeks later, not a month later, whatever it is. Let's then prioritize those high-visibility anomalies. Maybe an 80% confidence interval isn't what we want to look at; in this case we look only at 99%.

Now we want to alert the business and arm them with the right data to act on when they need to act on it. I think you can see how the proactivity is much more there than ever before. Zach, why don't you go ahead and dive into the workflow a little bit.

Zach Bloese:
Sweet. All right. Time for the fun stuff. No offense.

Matt Simcox:
None taken.

Zach Bloese:
He's only my boss. What I'm going to do here, just for this one: we're going to do a bigger deep dive into this particular workflow later and go into a lot more detail. For now I'm just going to hit the highlights; it's a bit of a teaser. This was our very first project, and Matt alluded to this a little bit earlier. The very first project that we did, of course it was predictive and a batch macro, because why start out with just simple data blending when you can do a batch macro and a predictive analysis?

Matt Simcox:
Zach loves me.

Zach Bloese:
What we have here is data going into a batch macro. It's going into the predictive tools, which is time series forecasting. Then we're moving on further and we're landing with the business owner. Again, I'll go into a little bit more detail on this specific flow here in a second, but that's just a high-level overview of what we're doing, that's what we're going for.

Matt Simcox:
Again, ensuring source data quality is critical. Garbage in, garbage out; I think we've all heard that one before, right? The second use case is very similar, source data issues, actually self-imposed source data issues unfortunately. I talked about those 5,000 merchants that we're affiliated with. The problem was, primary keys do not exist for those merchants. Let's say Best Buy changed their network, or we changed the Best Buy sales representative; Best Buy is no longer Best Buy, it's "Best Buy 2," let's say. That doesn't work, because we can't do a year over year analysis of that merchant or any other similar type of dimensional analysis.

What we did was leverage fuzzy matching, which Zach will dive into a little bit. Our business requirements basically were to create that true primary merchant key, or merchant primary keys, excuse me, and establish an automated process to keep that up to date. The administrative tools are what they are; changing them wasn't going to be prioritized, again, because it didn't directly affect the bottom line. But in a short period of time, which Zach will dive into, we've essentially solved that problem.

Zach Bloese:
Data cleansing with fuzzy matching. I don't know how many of you have done fuzzy matching, or if you've attended any sessions today that covered fuzzy matching. Anyone that was in the Alaska Airlines session, this one looks a lot simpler, right? Theirs was amazingly complex because it was a very complex problem. But this one is pretty straightforward, because we're doing a pretty simple match.

But what we're doing is querying our merchant data in-database, just from the table, doing a little bit of cleansing, and then streaming it out so that we can start using the Alteryx tools. What we're doing here is a fuzzy match based on Soundex keys, and when we do the fuzzy match on Soundex keys, that creates our match groups. Once we have our match groups, we can tie that group ID to a merchant ID. Then we can tie the merchant keys that are created off of this to merchants. From here to here we're doing what Matt just described: we're giving each merchant ID its own parent ID so that we can roll everything up from there, and we're able to do good analysis where we don't have to try to mash two different data sets together.

From there, as we go through the flow, we're just making sure that we're only looking at merchants that have more than one parent ID. If you don't have more than one, we're not going to shoot you through the rest of this flow. Once we have that, we're really just joining back into the workflow to get the rest of the data, and we're actually writing it to our database so that it's an automated process to clean up our data and we don't have to do this manually all the time.
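For readers who want to see the shape of that outside Alteryx, here is a minimal sketch of the Soundex-and-group idea in Python. The merchant names, the pandas DataFrame, and the rule of taking the lowest merchant ID as the parent are all illustrative assumptions, not ShopAtHome's actual logic.

# Minimal sketch: group merchant name variants by Soundex key and assign a parent ID.
# The DataFrame contents and the "lowest ID wins" parent rule are illustrative.
import re
import pandas as pd

def soundex(name: str) -> str:
    """Classic 4-character Soundex code for the first word of a merchant name."""
    word = re.sub(r"[^A-Za-z]", "", name.split()[0]).upper() if name.strip() else ""
    if not word:
        return "0000"
    codes = {**dict.fromkeys("BFPV", "1"), **dict.fromkeys("CGJKQSXZ", "2"),
             **dict.fromkeys("DT", "3"), "L": "4",
             **dict.fromkeys("MN", "5"), "R": "6"}
    out = word[0]
    prev = codes.get(word[0], "")
    for c in word[1:]:
        d = codes.get(c, "")
        if d and d != prev:
            out += d
        if c not in "HW":          # H and W do not reset the previous code
            prev = d
    return (out + "000")[:4]

merchants = pd.DataFrame({
    "merchant_id":   [101, 102, 103, 104],
    "merchant_name": ["Best Buy", "Best Buy 2", "Amazon", "Amazon.com"],
})

# Build match groups from the Soundex key, then pick one ID per group as the parent.
merchants["match_key"] = merchants["merchant_name"].map(soundex)
merchants["parent_id"] = merchants.groupby("match_key")["merchant_id"].transform("min")

# Only groups with more than one member need to be written back for cleanup.
dupes = merchants[merchants.groupby("match_key")["merchant_id"].transform("count") > 1]
print(dupes[["merchant_id", "merchant_name", "parent_id"]])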

Matt Simcox:
Great. Use case number three. I kind of alluded to this before, the Salesforce Marketing Cloud and how that's basically our CRM system of record. I mentioned we had about a day and a half a week in manual data blending. That job doesn't sound fun to me, I don't know about you guys. The simple business requirement was, hey, let's automate that whole thing and let's make that data resident in our data warehouse and eventually in our cube and so forth. Zach will touch on some of the pieces of data that we're bringing in and how they all make their way into our analytics platform.

Zach Bloese:
All right. Poorly zooming. There are three steps to this current data integration that we're doing. A lot of that work is done on our CRM vendor's side, so that's prepping all of the data on their side and getting it up to an FTP. What we do within Alteryx is hit that FTP and download all the data that we want, then we write it all out into CSVs. We have click data, opens, sends, your typical kind of email data that we would want. We're parsing and doing some regex where need be, because with emails you have query strings, and the things in query strings that you want tend to be right in the middle, where it's really annoying to get to them.

We do have regex in there, and then maybe just some data type transformations in some of the other ones. But really that's about as simple as it gets, and then we're writing it to our ODS so that we can blend it with internal data. As Matt alluded to, the CRM team was doing this, except they were going to the CRM sites and just copying it down into Excel spreadsheets, then running queries that may or may not have been right, [inaudible 00:17:26] into an Excel spreadsheet, and it was a ridiculous process that we were able to just get rid of completely. Let me close that out for you.
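As a rough sketch of that pattern using Python's standard library: pull the email-event extracts down from an FTP and pick one value out of each link's query string. The host, credentials, file names, and the "cid" parameter are placeholders, and the real workflow uses Alteryx's download and parsing tools (regex included) rather than this code.

# Sketch of the CRM pull: download send/open/click extracts from the vendor's FTP,
# then pull the value we care about out of each URL's query string.
# Host, credentials, file names, and the 'cid' parameter are placeholders.
import csv
import io
from ftplib import FTP
from urllib.parse import urlparse, parse_qs

FILES = ["sends.csv", "opens.csv", "clicks.csv"]   # typical email event extracts

def download(ftp: FTP, filename: str) -> list:
    """Retrieve one CSV from the FTP server and return its rows as dicts."""
    buf = io.BytesIO()
    ftp.retrbinary(f"RETR {filename}", buf.write)
    text = buf.getvalue().decode("utf-8", errors="replace")
    return list(csv.DictReader(io.StringIO(text)))

def campaign_id(url: str) -> str:
    """Pull a single parameter out of the link's query string."""
    qs = parse_qs(urlparse(url).query)
    return qs.get("cid", [""])[0]

with FTP("ftp.example-crm-vendor.com") as ftp:        # placeholder host
    ftp.login(user="shopathome", passwd="********")   # placeholder credentials
    for name in FILES:
        rows = download(ftp, name)
        for row in rows:
            row["campaign_id"] = campaign_id(row.get("link_url", ""))
        # From here the rows would land in the ODS for blending with internal data.
        print(name, len(rows), "rows")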

Matt Simcox:
Use case number four is really around two things. Monetizing data: data is valuable, guys, and it's a low-overhead game. Data monetization is a low-overhead game if you do it right, which we think we did here. We needed a scalable and reusable process to onboard new partners, whether inbound or outbound; blend all of that data with our internal data, basically make it valuable; and then share that back out with other data partners. Again, it had to be very scalable and reusable. Zach will touch on how we did that.

Zach Bloese:
All right. This is a two-step process. Some of the work that goes into this you can't see here; we're using the command line interface to run some batch files just so we can move files and import them and things of that nature, so you don't see that here, but I'll walk through the actual flow. We connect to AWS to pull in new data from our partners, and that's this input right here. The files tend to be additive, so we always join it up to our last import so that we can match up the data from this and this and make sure that we're only getting new records. As you can see here, we're getting new data only. We're doing the timestamp and cleansing symbols out, so this is just purely pulling data from our data partners, and then we're writing it to our database for step two.
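A minimal sketch of that "new records only" step, assuming the latest pull and the last import are two flat files keyed on a record ID; the file names, the key column, and the currency-cleanup column are made up for illustration and are not ShopAtHome's actual schema.

# Sketch of the incremental step: the partner files are additive, so join the new
# pull against the last import and keep only records we have not seen before.
# File names, the 'record_id' key, and the 'amount' column are illustrative.
import pandas as pd

new_pull    = pd.read_csv("partner_feed_today.csv")     # latest file pulled down from AWS
last_import = pd.read_csv("partner_feed_last.csv")      # what we loaded last time

# Left anti-join on the key: rows in the new pull with no match in the last import.
merged = new_pull.merge(last_import[["record_id"]], on="record_id",
                        how="left", indicator=True)
new_records = merged[merged["_merge"] == "left_only"].drop(columns="_merge")

# Stamp the load time and cleanse a symbol or two before landing it for step two.
new_records["loaded_at"] = pd.Timestamp.now()
new_records["amount"] = (new_records["amount"].astype(str)
                         .str.replace(r"[$,]", "", regex=True).astype(float))

new_records.to_csv("partner_feed_new_records.csv", index=False)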

Step two is down here. This is an in-database query, but we're taking that table we created up here and we're joining it with other internal data, just different types of system logging and traffic on our site, things of that nature. Once we do that, we blend it all out, we [inaudible 01:19:45] timestamps, clean up the timestamps, and then we write another file out and send it up to a different data partner via FTP.

Once we do that, we have another batch file in the post-command, and that will archive the file that we just sent up so they're not all just sitting out there. The nice part about this is it's not a crazy process once you figure out the command line interface, and it's very repeatable. So where our development team would have had to do this every time we onboarded a different data partner, we're now able to do it. We just switch out FTP connections and things of that nature, so we're talking a couple of hours, maybe, to onboard a new data partner, which is huge. Definitely a big time saver.
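Here is a small sketch of that tail end in Python: push the blended export to the partner's FTP, then archive the local copy. Host, credentials, and paths are placeholders, and ShopAtHome actually handles the archiving with a post-command batch file rather than inside the workflow.

# Sketch of step two's tail end: send the blended output up to a data partner's FTP,
# then archive the local file so exports don't pile up. Paths, host, and credentials
# are placeholders; the real process archives via a post-command batch file.
import shutil
from ftplib import FTP
from pathlib import Path

outfile = Path("exports/partner_export_2017-06-05.csv")   # produced by the blend step
archive_dir = Path("exports/archive")
archive_dir.mkdir(parents=True, exist_ok=True)

with FTP("ftp.example-partner.com") as ftp:                 # placeholder host
    ftp.login(user="shopathome", passwd="********")         # placeholder credentials
    with outfile.open("rb") as fh:
        ftp.storbinary(f"STOR {outfile.name}", fh)

# Archive the file we just sent so the export directory stays clean.
shutil.move(str(outfile), str(archive_dir / outfile.name))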

Matt Simcox:
The fifth use case is just getting stuff into our analytics platform. Both Zach and I touched on before how common practice for us, and when I say us, I mean BI, was: we've got a couple of questions we need to answer, so we're going to go write a stored procedure. That's going to take a long time: 1,500, 1,800, 2,000 lines of code, going and grabbing things from all over the place, fairly complex code at that. But that's not how we want to operate anymore, as I'm sure you can understand. We needed to create multi-dimensional datasets for marketing that included all of our core events: visits, out clicks (that's when a customer leaves the site and goes to Best Buy, goes to Amazon, whatever the case might be), and so on.

We needed a mechanism for visualizing that multi-dimensional conversion, that journey, that funnel. We also needed a better way of enhancing that going forward. Zach is going to touch on how we're doing this and how it's also feeding downstream processes such as Analysis Services development, things of that nature.

Zach Bloese:
This one was pretty basic, but it was a big deal, at least for me. When Matt was talking earlier about how people would write bad code and we wouldn't have the right answers, he was talking about me. Well, it's true. This process, as he kind of alluded to, is hitting our main tables. We have visit data, we have out click data, we have purchase data, we have customer data, and the marketing teams always want that in one spot. It's like, I want to see the entire flow. I want the funnel.

Before Alteryx, we had Tableau, which is a great tool. I love Tableau, but we had over a billion rows of data. Tableau doesn't like billions of rows of data; it gets really slow really fast, and people had been complaining. It's a whole thing. No one likes to deal with it. This flow is doing something very simple: it's taking our visits data, our out clicks data, our purchase data and our customer data, and it's blending it into one data source. This is all in-DB for the most part, until we stream everything out. But we're doing our formulas, we're doing our calculations, time calculations, things of that nature, and then we're joining it all together into one data source.

Once we have it in one data source, we publish it up to Tableau Server, and then we connect to it and build dashboards off of that data source in Tableau, which is really fast and has saved a lot of time. Previous to this workflow, this was a stored procedure that was about 1,800 lines of code, which is a lot to go through, definitely. Any time we'd have to make a change, you'd have to read through 1,800 lines of code to figure out exactly what that stored procedure was doing, whereas it's very easy, if we need to make a change, to figure out what this workflow is doing. It's very simplistic. It's easy to change things, to add things, whereas with a stored procedure, not so much, because you could touch one line of code and everything breaks very quickly.
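A toy version of that blend is shown below, assuming the four event sets are flat files with the columns noted in the comments; those names are illustrative, and the real workflow does this with in-database tools and publishes the result straight to Tableau Server.

# Toy version of the funnel blend: visits -> out clicks -> purchases, enriched with
# customer attributes, landed as one wide dataset. Table and column names are
# illustrative; the real workflow uses in-database tools and a Tableau publish step.
import pandas as pd

visits    = pd.read_parquet("visits.parquet")      # visit_id, customer_id, visit_ts, channel
outclicks = pd.read_parquet("outclicks.parquet")   # visit_id, merchant_id, click_ts
purchases = pd.read_parquet("purchases.parquet")   # visit_id, merchant_id, order_total, commission
customers = pd.read_parquet("customers.parquet")   # customer_id, signup_date, segment

funnel = (visits
          .merge(outclicks, on="visit_id", how="left")
          .merge(purchases, on=["visit_id", "merchant_id"], how="left")
          .merge(customers, on="customer_id", how="left"))

# A couple of the time calculations mentioned in the talk, then off to the publish step.
funnel["minutes_to_click"] = (
    (funnel["click_ts"] - funnel["visit_ts"]).dt.total_seconds() / 60)
funnel["converted"] = funnel["order_total"].notna()

funnel.to_parquet("marketing_funnel.parquet", index=False)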

Matt Simcox:
Cool. Hopefully that gave you some gauge of what I was describing at the beginning, which is that the data culture, quite simply, at ShopAtHome wasn't great. We've shifted our paradigm, to be sure. This demonstration, the crown jewel of our Alteryx platform, is really what makes it all possible. It all flows from here: making sure your source data quality is good is critical. If the data coming in the door isn't accurate, isn't telling you what you need to know, you're basically just flipping a coin. Zach is going to do a pretty deep dive into this workflow, and then we'll probably open up for some questions after that.

Zach Bloese:
All right. As promised, we'll do a little bit of a deep dive into this predictive logging monitor, just to show exactly what we're doing, how we're using it, how it's working, things of that nature. Everyone has their key tables, whether it's logging health or anything of that nature; every company has different ones. We had tables that we knew we wanted to monitor, and we had metrics within those tables that we knew were important to the health of our business.

We were able to... There we go. This text input field is just where we're entering the metrics that we want. I have a grouping field and an aggregator field. I now realize that aggregator isn't a real word, but we can-

Matt Simcox:
It is now.

Zach Bloese:
It is now. We can make it happen. These are the... For this flow, I want to count out clicks from our site, and I want to group it by this, and I want to further group it by this column, or aggregate it. That's fun. This feeds into our batch query macro. The batch macro... How many of you are familiar with batch macros, or macros in general? A handful of you. Pretty good. All right.

What this is doing is we're writing our query and we're saying, “Okay, I want to count our out clicks, aggregated by this, aggregated by that, by date.” It's a really basic query, but we want to do it over and over again for each of our groups and our aggregations so that we get this long list of everything that we want to measure for the last 18 months, so that we can then feed that into a predictive model. That's what this is doing. We're running the basic query, and then all of these up here are control parameters and update tools; they're updating the values that we want to change.

We're updating the name of the dimension or the name of the aggregator for each of the ones that we saw in that first text input, and then we're updating the values of the aggregators and groupings that we saw from the text input. Then it loops through all the metrics that we want. Add as many as you want, time permitting, and it spits them out into the macro output, which goes back into our flow.
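Translated very roughly out of the batch macro and into a plain loop, the idea looks like this. The event table, the column names, and the two metric definitions are assumptions, and the real macro parameterizes a SQL query with control parameters rather than using pandas.

# Rough translation of the batch macro: one metric definition per row of the "text
# input", and the same 18-month daily rollup run once per definition.
import pandas as pd

events = pd.read_parquet("outclick_events.parquet")   # event_date, merchant_id, channel, ...

# The equivalent of the text input tool: what to count and how to group it.
METRICS = [
    {"name": "outclicks_by_merchant", "group_by": "merchant_id"},
    {"name": "outclicks_by_channel",  "group_by": "channel"},
]

cutoff = events["event_date"].max() - pd.DateOffset(months=18)
recent = events[events["event_date"] >= cutoff]

frames = []
for metric in METRICS:                      # the "loop through all the metrics you want"
    daily = (recent
             .groupby([pd.Grouper(key="event_date", freq="D"), metric["group_by"]])
             .size()
             .reset_index(name="actual"))
    daily["metric"] = metric["name"]
    daily = daily.rename(columns={metric["group_by"]: "dimension_value"})
    frames.append(daily)

history = pd.concat(frames, ignore_index=True)   # feeds the time series forecast step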

Once it goes back into our flow... This is just me limiting my data at this point. As Matt mentioned, we have 5,000 merchants. I only wanted to do the top 200, because once we get into the forecast, if the 5,000th merchant shows up in this, it's not going to have enough data to create a good predictive forecast, so it's just a giant waste of time and the model will take a very long time to run with no added benefit. I just limit it to the top 200 merchants so that we can trim it down a little bit.

From there we get into the actual forecasting. We have all of our data coming from the macro, so that's 18 months' worth of data, and we're feeding it into the time series forecast model. We messed around with this a little bit; you can do an ARIMA, you can do an ETS. We've done both. We decided that ETS was a little bit better for us just because of the nature of the industry that we're in: obviously, we have seasonality. For shopping, Q4 is huge, and November's even bigger than October and December.

ETS, it's an acronym, takes care of seasonality, so that's kind of the reason we went with ETS, though the ARIMA was also producing some fairly decent results for us. It's more of a see-which-one's-working-better-for-you type of thing. We do use a 99% confidence interval. Again, we messed around with different confidence intervals, but the end goal of this was to alert people, to alert business owners, if there was something wrong with one of their metrics. I don't want to be the boy who cried wolf: if there is nothing wrong, I don't want to alert them. You can't get away from that entirely, but I want to get away from it as much as possible.

We run it at a 99% confidence interval. We join our data back together, so this produces our forecast and then this is our historical data. Pretty simple from here on out. We flag any record that falls outside of our ARIMA or ETS model, in this case the 99% confidence interval, on the low end or on the high end. We create the forecast-to-actual variance just so we can have a percentage to show people. Not for today, though; that's labeled fairly well. This runs at two o'clock in the morning, so there's a lot of data that's not in there yet, and every little thing would fire, which would not be that much fun.
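To make the mechanics concrete, here is one way to mimic that check outside Alteryx for a single metric, using the ETS model in statsmodels: fit on history, take a 99% prediction interval for yesterday, and flag the actual if it lands outside. The weekly seasonality, the column names, and the library itself are assumptions; Alteryx's time series tools wrap R, not statsmodels, and this is only an illustrative stand-in.

# Illustrative stand-in for the check: fit an ETS model to one metric's daily history,
# take a 99% prediction interval for yesterday, and flag the actual if it falls outside.
# Weekly seasonality and the file/column names are assumptions.
import pandas as pd
from statsmodels.tsa.exponential_smoothing.ets import ETSModel

daily = pd.read_csv("outclicks_daily.csv", parse_dates=["event_date"])
series = daily.set_index("event_date")["actual"].asfreq("D")

train, actual = series.iloc[:-1], series.iloc[-1]      # hold out yesterday

model = ETSModel(train, error="add", trend="add",
                 seasonal="add", seasonal_periods=7)
fit = model.fit(disp=False)

pred = fit.get_prediction(start=len(train), end=len(train))   # one step ahead
band = pred.summary_frame(alpha=0.01)                          # 99% prediction interval
low, high = band["pi_lower"].iloc[0], band["pi_upper"].iloc[0]

flagged = not (low <= actual <= high)
variance = (actual - band["mean"].iloc[0]) / band["mean"].iloc[0]
print(f"actual={actual:.0f} interval=({low:.0f}, {high:.0f}) "
      f"flagged={flagged} variance={variance:+.1%}")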

From there we just go into reporting. We use the reporting tools to create charts that will visually display how the trends have looked over the last 18 months, what the actual number was for yesterday, and where that falls above or below the forecast, so that people can really get a visual sense of what the data looks like. That was really important because, like I said, you can't one hundred percent predict something that's wrong. If you throw in a visualization, it really helps them decipher it: no, that's expected, that data point is fine, or, oh my God, what did we do?

We build the chart in, we create a table for the email, and then we email it out to the business owners. It's a fairly simple flow, as far as it's not huge, but definitely a huge win for our company because it allows us to be very proactive. The teams can monitor their KPIs very closely to determine whether anything broke in a previous release or if there's anything they need to be alerted to.
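A bare-bones version of that alerting tail end might look like the following: render a trend chart, drop the flagged rows into an HTML table, and send both by email. The SMTP host, addresses, and file names are placeholders, and the real workflow uses Alteryx's reporting and email tools rather than this code.

# Bare-bones version of the alert email: a trend chart plus an HTML table of the
# flagged metrics, sent to the business owners. Host, addresses, and file names
# are placeholders.
import smtplib
from email.message import EmailMessage

import matplotlib
matplotlib.use("Agg")                       # render without a display
import matplotlib.pyplot as plt
import pandas as pd

history = pd.read_csv("outclicks_daily.csv", parse_dates=["event_date"])
flagged = pd.read_csv("flagged_metrics.csv")      # output of the forecast check

fig, ax = plt.subplots(figsize=(8, 3))
ax.plot(history["event_date"], history["actual"])
ax.set_title("Out clicks, last 18 months")
fig.savefig("trend.png", bbox_inches="tight")

msg = EmailMessage()
msg["Subject"] = "Logging monitor: metrics outside the 99% interval"
msg["From"] = "bi@shopathome.example"              # placeholder addresses
msg["To"] = "business-owners@shopathome.example"
msg.set_content("One or more metrics fell outside their forecast interval.")
msg.add_alternative(flagged.to_html(index=False), subtype="html")
with open("trend.png", "rb") as fh:
    msg.add_attachment(fh.read(), maintype="image", subtype="png",
                       filename="trend.png")

with smtplib.SMTP("smtp.example.com") as smtp:     # placeholder SMTP host
    smtp.send_message(msg)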

I'm going to jump back in real quickly. This is the output that goes out to our business owners if anything is alerted. You can see we have our dimension, the dimension value if it's aggregated at all, then the low-end forecast, the high-end forecast, what the actual number was, and the forecast-to-actual variance. The blue line is what the actual number was, and then you have the trend for the last 18 months, so you can visually see that blue line and think, maybe I'm okay with that. I can see why it fell outside of the variance, because we were so high up here but it's come down to here. You can really quickly see if it's something that you need to worry about. I think we're good there. Turn it over to Matt.

Matt Simcox:
Thank you, Zach. I always love looking at that workflow. What has it all done? You don't just build workflows for your health. Well, maybe some of us do. Around those core themes: source logging data quality, making sure it's not a garbage-in-garbage-out type of situation; ensuring our processes and our time to market are as lean as possible; and making sure we've got the greatest breadth of data possible and we're leveraging it from a monetization standpoint. What does our ROI look like in terms of effort, in terms of confidence in our metrics and our KPIs, and ultimately the bottom line?

Source logging data quality: we are resolving problems in hours as opposed to weeks or months, and that's a pretty hefty ROI. From a metric standpoint, we have confidence in our data. That's critical. From a dollars and cents standpoint, we reasonably estimate, simply from those logging monitors, that we have not lost out on at least $500,000 in revenue opportunities since they've been implemented, and that's half a year. From the time-to-market standpoint, no more code. SQL is a three-letter word. Self-service ensures one truth: because we're not writing SQL, we have confidence in what's coming out of the platform, and Alteryx supports that process. Our margin has been emerging positive for about eight months, in excess of $700,000.

From a third-party data standpoint, we're not doing that day-and-a-half-a-week CRM process anymore. Obviously, a pretty substantial effort ROI there. We've got a scalable data partnership process. We're effectively monetizing our data, which, again, is a big deal in today's digital marketing industry. We've got a wider universe of data, so bringing all of that CRM data in allows us to really have a better view into the funnel. Just from a data monetization standpoint, we reasonably estimate we've increased that capability per partner by about $500,000 a month. That's return on investment for us.

Back to the original thing, doing more with less. The original slide had the cool little guys hammering on the keyboard, I don't know if anybody saw that, but I think that was pretty nifty. But we've shifted the paradigm. No longer are we a bottleneck, we're a luxury. Folks love to come to us, and we're not reactive, we're proactive. We don't have a garbage-in-garbage-out scenario anymore. We're making the right decisions. We're answering questions we didn't know we had, instead of one little question with one specific answer that someone is going to forget about tomorrow.

We're answering those types of questions in bunches. In order to do this, we've had to control our source data, we've had to find efficiencies in product delivery, and we've had to expand our data universe, so we can make the right decisions in an ever-changing marketplace. No longer are folks coming to us and saying, “Hey BI, you're wrong,” or, “Hey BI, I want one answer to one analytics-related question, please.” Now people are coming to us and saying, "Hey BI, did you know that...?" And insert your insight there.

So that's all we have for today. We've got a little time for questions. Thank you again for coming. Questions?

Crowd question:
Do you [inaudible 00:37:37] directly to... The data source goes directly to the Tableau source. Can we find a way more visual [inaudible 00:37:46] dashboard connecting directly to that data from the cloud rather than [inaudible 00:37:53] Tableau should be admitted to work with or the database?

Zach Bloese:
Yes. The question was, do we find it easier to publish directly to Tableau server from the workflow versus publishing-

Crowd question:
Is it faster?

Zach Bloese:
Or is it faster than doing it the other way. I've found that it is. Even creating a Tableau data extract, that takes a long time. It really does speed things up. It goes up to Tableau Server really fast, and the performance I've had from taking it all through an Alteryx flow and then publishing that data to Tableau Server, it's been really speedy for me with tons of data, so I've been very happy with that. I'd say it works. It's been better for me.

Matt Simcox:
It's [inaudible 00:38:54] to quantify, but I'll agree for sure.

Zach Bloese:
Can't speak to all.

Crowd question:
Your user is also to your dashboard performing faster than the [inaudible 00:39:02] dashboard?

Zach Bloese:
No one has complained to me.

Crowd question:
And that's good?

Zach Bloese:
Yeah. They're usually not shy about things like that.

Matt Simcox:
Any other questions? Yes.

Crowd question:
Hey, I have a question about [inaudible 00:39:15]. Do you still use it [inaudible 00:39:19]?

Matt Simcox:
The question is whether we've fully moved out of the stored procedure environment. A little bit of a caveat: really, the intent there was to say stored procedures that do ETL. We're not fully out of the woods, but we've vastly reduced that inventory. That's a maintenance nightmare, which we alluded to. We've got a couple of things that are still out there that we rely on to operate, but I'd say we've reduced it by 80%.

Zach Bloese:
I'll say that I haven't written a stored procedure in over a year. Everything that I do is all Alteryx-based now. Again, because of my bad code.

Matt Simcox:
That's true. Any other questions?

Crowd question:
[inaudible 00:40:08] help in that situation?

Matt Simcox:
Phone numbers per customer? Does fuzzy matching help identify phone numbers? Is that-

Crowd question:
For example [inaudible 00:40:26] phone numbers and all I have is different phone numbers. Would it help you use different phone numbers and [inaudible 00:40:36]

Matt Simcox:
Sounds like an interesting use case. The question basically is, if you've got a customer phone number, can you use fuzzy matching to tie out the customer with a phone number? I'd say that would be pretty challenging, but it's not a use case that we've encountered. We're a year in, and like Zach was alluding to, some of the fuzzy matching that you all saw in the Alaska session was probably a little bit more complex than what we're doing, but it works. We definitely want to expand our capability a little bit.

Zach Bloese:
That's a tough one. Anyone feel free to correct me if I'm mistaken, but I don't think fuzzy matching is your go to for that.

Crowd question:
[inaudible 00:41:21] email campaigns. Does that [inaudible 00:41:26]

Zach Bloese:
Yes. The question is whether we do a lot of outbound email campaigns, and one hundred percent we do. We have a lot of flows that are just feeding different data sets up to our email vendor's site, and then the email campaigns go out from there. But a lot of requests come through our team, like, "Hey, we have this idea and we want to send an email to this specific group of people." Before, that was a real pain, but now I just have the Alteryx flow, I write it directly to the FTP, and then on the CRM side it grabs it from the FTP and away they go.

Crowd question:
Do you all do the [inaudible 00:42:07]

Matt Simcox:
I don't think we do anymore now.

Zach Bloese:
No.

Matt Simcox:
We've gotten out of the paper business.

Crowd question:
[inaudible 00:42:17] question. What makes it interesting is that you say that you've got four analysts network in your chain of the journey to on board [inaudible 00:42:24] across the line in terms of training them, having to build [inaudible 00:42:30]

Matt Simcox:
Can you touch on that one? [inaudible 00:42:35]

Zach Bloese:
I was one of the ones who was onboarded.

Matt Simcox:
The question was, what was the onboarding process for this fairly lean team? I'm assuming you're speaking in particular about Alteryx and Tableau and so forth. It was easy. It really was. Alteryx is pretty straightforward. That's one of the great benefits. I think, Zach, you might have-

Zach Bloese:
We came to Inspire last year, of course that helped.

Matt Simcox:
The online resources, that's true not only for Alteryx but for Tableau, are great. Those have really been the main mechanisms, I think. Zach?

Zach Bloese:
Yeah. It hasn't been too hard. I think there's always going to be that one person in your company who is going to be with Excel till they die. They are not going to leave it no matter how much you show them how cool the tools you're using are and how much time they can save; they're going to be in Excel and there's nothing you can do about it. But for most normal-minded people, I think it's a pretty easy sell.

Crowd question:
[inaudible 00:43:46]

Zach Bloese:
What was that? Sorry.

Crowd question:
Were you using predictive modeling before you used Alteryx?

Zach Bloese:
I wasn't doing predictive modeling before I used Alteryx, so that's definitely been a new learning experience for me and something that Alteryx gave me the capability of learning that new skill. Yeah.

Matt Simcox:
At the back.

Crowd question:
[inaudible 00:44:12]

Matt Simcox:
The question is, did we start off with Alteryx Server? The answer is no. We have the scheduler to automate the workflows, but we're not fully leveraging the gallery and all of that other functionality. Hope to one day, for sure.

Crowd question:
[inaudible 00:44:36] how did you pick that up?

Zach Bloese:
The question was, how did I pick up predictive analytics since I didn't have any experience going into it? That's a great question. I did a lot of training on Alteryx's site through the webinars; they have a lot of good training on predictive, and throughout the year they have live training, so you can find ones that match what you're looking for. I definitely abused Alteryx customer service. I would connect a data set, plug it in, and see what the data sets looked like and what the predictive models looked like, and then I'd be like, "Okay, I have no idea what that means." Then I would talk to customer service and they'd walk me through exactly what I was doing and how the data should look when it comes out. It's definitely been a learning experience, but there are so many resources out there, within Alteryx and outside of Alteryx, for predictive modeling.

Matt Simcox:
A funny story from yesterday, or two days ago, at the airport. I ran into a couple of Alteryx folks who happened to be on the customer service team, and when I introduced myself they said, “Oh, that's Matt Simcox.” We definitely leveraged customer service and they do great.

Zach Bloese:
Great stuff.

Matt Simcox:
Any other questions? Thanks a lot again.

Zach Bloese:
Cool. Take a session survey if you get a chance.
