Marketo - Driving Enterprise Engagement with Alteryx - Inspire 2017

For many, "data prep and analysis" is the phrase that comes to mind when thinking of Alteryx, but at Marketo, it's "governance and scalability." Marketo combines Alteryx and cloud computing to drive a fully automated solution that saves IT and lines of business both time and resources. In this session, you'll learn how Alteryx enables Marketo to deliver an enterprise-class data architecture that drives business engagement and dashboard adoption across the entire company. What's more, you'll hear about Marketo's "Alteryx in the Box" deployments that drive new ways of thinking across the company and accelerate great ideas.

Video Transcription

Tim Chandler:
Thanks very much. Before I jump in and start showing the presentation, I just want to get a show of hands and kind of know who I'm talking to. Who here has done any large deployments with Informatica? Okay, one. OBIEE deployments? Okay, interesting. So when I started off in this world, I started off in that environment, and it's very structural. I mean you've got large teams, and it's slow. I can see the nods from a couple of people who use them.

It's not to say that they're not for some people, but with what I wanted to get done in an environment, I was really looking around to try and find something that was going to happen a lot faster, and do the same thing. When I look at what has to be accomplished in this environment when we start looking at data, I look at it from the lens of these five things. Or, a different way of putting it: when it's 5:00 on a Friday, I don't want to be bothered. I want to be able to go home. I want to enjoy the weekend. So scalability and governance, I picked those as kind of the top two, and that's what I'll talk about in this session, but really all five of these things are super important. And really, I think the thing that you're going to learn, that I'm going to talk about today, is that this is not about a group that's trying to go around IT and do some cool things with data, and get some great results in Tableau or whatever that end presentation is.

This is about the company embracing Alteryx, using it as its data pipeline, and embedding it in the business. So these things are super important to us, and you'll see how we've really taken Alteryx, through the processes and the people that we have, and tried to drive into this area.

These are the four basic areas that I'm really going to talk about today. How we delivered a corporate data pipeline. It's something that you don't hear about in the Alteryx area, but I think when you see what we're doing here, there's a compelling reason that most companies should be heading in this direction. Maybe not all the way to it, but definitely in this direction. How we made data models drive dashboard adoption. That's supposed to be data, not date. You know, the way I look at this whole approach of getting dashboard adoption is not to make everyone experts in Tableau or QlikView or whatever you want to use; it's to be able to really create fantastic data models that are agile and quick, so that when someone in the business is going to create a dashboard, whatever that application is, they have this great data that's going to make it easy for them, and they can get something done quickly, and they trust it. So that's where that's coming from.

About six months ago we started to do something called Alteryx in a box, and there are a number of reasons why we went this way. Security wanted us to go in this direction originally, but then we started to look at the opportunities and how we could manage what we're doing in the company, and it made a lot of sense. And then there are a few highlights I wanted to share with you of what I wish I knew a few years ago, because I think they really helped drive a lot of what we're trying to do today.

So how we delivered a data pipeline. First of all, let me tell you, a data pipeline is really just a way of getting data in a heartbeat, you know, that drum beat that's going to happen all the time. It's day and night, 24/7, just pumping data into where it needs to be. You'll notice I don't talk about master data management here. I don't talk about a single source of truth too much, because I really think that with the models and what you can do with Alteryx, you can avoid a lot of those huge, costly projects. I'll get into that a bit, but basically the way that we look at moving data is: you go find the data. You go find where it is, so that's a lot of APIs. That's a lot of ODBC connections. That's looking to see where that data is and being able to get it by hook or by crook. Then blending that data.

One of the first things that you normally run into if you're looking at an environment is you realize that user IDs are one more step: you have to go out and blend data to get what you really need to push out into your business environment. Then doing the calculations, naming and enriching data. This is where you want to try and create the calculations across, not just the department, but the whole company, so people start to standardize and use common names, common fields to categorize and do calculations. It's very powerful. Think about it: if you've got 25 people creating dashboards, you can do one of two things. You can let them do the calculations and then fight to try and pull them back in, or you can listen to what they're doing, create those calculations, put them in one repository, and let them come get it. There's a lot less work in doing it the latter way.
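The single-repository idea can be sketched in a few lines. This is a minimal illustration, not Marketo's actual implementation: the field names and the two calculations are hypothetical examples of company-wide definitions that every record gets enriched with before publishing.

```python
# A sketch of a shared calculation repository: named, company-agreed
# calculated fields defined once and applied to every record before
# publishing. The metric definitions below are hypothetical.

COMMON_CALCULATIONS = {
    # One agreed-upon definition per metric, used by every dashboard.
    "gross_margin_pct": lambda r: round(100 * (r["amount"] - r["cost"]) / r["amount"], 2),
    "is_enterprise": lambda r: r["employee_count"] >= 1000,
}

def enrich(record: dict) -> dict:
    """Return a copy of the record with all standard calculated fields added."""
    out = dict(record)
    for name, calc in COMMON_CALCULATIONS.items():
        out[name] = calc(out)
    return out

opp = {"amount": 50000.0, "cost": 12500.0, "employee_count": 2400}
enriched = enrich(opp)
```

Because every dashboard author pulls from the same enriched output, nobody has to re-derive "gross margin" in their own tool, which is the point made above.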

The last area over here is where I hope to really be focusing in the months ahead, and that's predictive and prescriptive analytics: being able to come up with meaningful data, meaningful insights for the business about what needs to happen next. For the most part, though, we're focused on those first three areas right now, as we're just pulling and getting the company moving in this direction.

So connecting to data. I don't know if you've been in many of the other sessions, or you have... Who here has used APIs to pull data in? Oh, okay, cool. So you probably know then from... For those of you who haven't done it, sometimes it can be fairly straightforward. You can usually use Postman to prototype and get some information going, and then the next step is taking that [inaudible 00:06:43], putting it into the download tool, and then starting to pull out the data.

There are a lot of really cool things you can do in this environment. It's worth having a resource or someone on your team that is familiar with or knows how to do this, because pulling this type of stuff out is really where a lot of your work is going to begin, and there are all too many conversations with the business that always start, "Can you get such-and-such data?" If you don't have the ability to work with APIs, or know someone who does, you're going to be really limited down this path.

So Salesforce Workbench is another great resource. If you don't know about it, there's this free utility that comes with their package that basically allows you to create queries and test them, and you can actually go in and look and see what tables are there, what fields are in there, what the field types are. It gives you a big jump ahead, so when you get into pulling data in with Alteryx, you know exactly what to expect and how to move down this path.

I know in the past some of the tools inside Alteryx don't always pull up all the fields, depending on the back-end permissions and all that, so Workbench is a way of really trying to make sure you know exactly what you should be getting. Security, both access and storage: you know, security makes the headlines here and there in big ways, and as we've gone down this path, it's been super important to make sure that with everything we do, we're pulling in the security team and showing them, this is what we're going to do, this is the data we're looking at. We're really trying to make sure that every step we take using Alteryx is going to be a very secure environment.

We've got the blessing of our security team. They know and see how things are going, and you'll see a chart in a few minutes about how we're actually pulling this information through. When I think of connecting to data, those are the three things. So, probably one of the more important workflows, or databases, that goes into our pipeline... We have several workflows that are pushing data through, but one of the key ones is the opportunity. If you haven't guessed by now, most of the stuff that I do, and the team does, is around operational data: bookings, financial information, partner information, all the opportunity kind of information that's flowing through. So there are about 12 workflows that actually go through this up here, and the reason... We've broken them into components to make them more manageable.

So I could have written one huge workflow, but if and when an error does happen, with either reading in data, publishing into Tableau, storing it in S3, or doing a calculation that might be incorrect, it's a lot easier to maintain the code in Alteryx when you divide it up and you actually have purpose-built modules or workflows that make something happen. So there are 12 of them. We've broken this environment down into 12 workflows. You can see one of the... This was a much earlier version of how we pulled stuff in through here, and as we talk to people in the business, they'll say something like, "Oh, I need to categorize data in these ways, and I need to have these types of definitions." You create yourself a whole workflow that will then feed back into the major workflows of data up there.

So you're not having to pull up more widgets and more workflows than you need to. You're focusing on just what needs to change. It makes it a lot more manageable. Now this little routine up here, as it runs through, is using a runner tool, and then it's throwing results into a table, and the final version of this actually had an email going out to me, so I could see exactly how long each of these processes took and what was happening.

Another thing that we do, it's not up here, is we integrated this with Slack. Amy actually came up with this idea about, what, six months ago. You know, inside each one of these workflows you have the ability to send an alert, and one of those alerts is an email, and you can say, if this has successfully completed, send an email to the Slack channel. And you can go to the Slack channel, and you can actually see, as your jobs are finishing in your scheduler, they're just popping up, and you can see success, success, success. You've got other channels where failures you're not expecting will pop up. It's an easy way to monitor, and it's also a way of showing your business what data's being refreshed and when it's being refreshed. They can just monitor a channel if they want, as opposed to asking you when the data has been refreshed.
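The routing logic behind that kind of alert can be sketched as a small function that builds a Slack message per workflow run. The channel names and webhook URL here are placeholders, not Marketo's actual setup; as described above, the alert can also simply be an email sent to a Slack channel's email address.

```python
# A sketch of per-workflow Slack alerting: successes and failures are
# routed to different channels so failures stand out. Channel names are
# hypothetical.

def slack_payload(workflow: str, succeeded: bool, minutes: float) -> dict:
    """Build the message payload for one finished workflow run."""
    status = "success" if succeeded else "FAILURE"
    return {
        "channel": "#alteryx-jobs" if succeeded else "#alteryx-failures",
        "text": f"{status}: {workflow} finished in {minutes:.1f} min",
    }

# Delivering it is one HTTP call to a Slack incoming webhook, e.g.:
#   requests.post("https://hooks.slack.com/services/...", json=payload)
payload = slack_payload("opportunity_refresh", True, 12.5)
```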

So this changed a little in the formatting, but scaling Alteryx workflows. I talked about compartmentalizing workflows. What I think of when I think of a large workflow... Let me jump down here to number two. It's this four-step kind of approach: input, blending, enrichment, and publishing, or a five-step with cleaning. If you look at it from that point of view, and you start to compartmentalize these things, either through colors or through workbooks, then when you come back to these workflows months later and go, "What the heck is this?", this gives you an area to focus on and a better understanding of where to make things happen.

So it may be very enticing sometimes, when someone says, "Oh, can you add that variable in?", to say, "Oh, I'll just throw it in just before the publishing." No, you want to try and put it back where it makes sense, and document who asked for it, when they asked for it, and why it's in there, so you've got this history and log right in your workflow of what's going on.

The very first one I kind of skipped at the beginning. One thing that I never really focused much on in the first part of working with Alteryx is trying to eliminate all the errors. I didn't want an error, but when it came to warnings and notices, I generally didn't pay a lot of attention. Basically, if I got what I wanted at the end, I was good. But as time goes on, I think that especially if you're going to have many, many workflows in an environment, you really want to drive to zero errors across your full workflow.

And here's why. Your workflow is going to most likely live for some time, maybe two, three, four months, and something's going to change. Salesforce is going to change their fields. The performance is going to change, maybe. There are all these parameters out there that you can't control, and when they change, you need to know about it. Even if it doesn't break your system, you need to be aware that something's changed, just so you can hedge your bets and make sure that in another two or three months it doesn't have a larger impact on what you're looking at.

I think it's a best practice, really, to try and drive zero errors, zero warnings into your workflows, and to understand, when those errors do start to pop up, what's really happening. So, macros, number three. A lot of the time, the only time we really create macros is when we have to do some repetitive task, and a macro just makes sense. But there are also other ways you should be looking at macros, and one of them is if you've created some piece of code and you feel that, "Hey, this does something very purpose-built." You may want to take that and make it a macro, and then communicate to the other people in your company that, "Hey, this macro does something really interesting. It'll save you some time, use this." The CReW macros, a lot of those, I think, are good examples of that. If you find a good macro out there, advertise it within your group. Make sure they understand it, and leverage and use that macro as much as possible.

So I talked about the last one here, Slack channels and emailing those results. We've also done something else, and I think I've got some pictures coming up here in a moment: we've implemented some dashboards. We've done dashboards of Alteryx and what's happening in Alteryx, and we've also done dashboards of Tableau. We use Tableau in the company, and we're monitoring the usage of people coming in and pounding the system. We're also managing the data that they touch, and we're also managing the status of those workflows pushing that data.

So we see kind of an end-to-end view of how data is being created, pushed, published, used, and viewed across the company, and it gives us a good idea of what's happening. Now, there's a lot going on. We've got over 90 workflows updating data, and it's impossible for us to look at every one of those workflows every day, but that's where the visualization comes in: being able to look at some basic baselines over time, and then as soon as you start to see errors pop up, you know there's something to look at in more detail.
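The baseline idea can be sketched as a simple anomaly check: rather than eyeballing 90+ workflows daily, flag only the days whose error count sits well above that workflow's own recent history. The window size and threshold below are arbitrary illustrative choices, not the team's actual tuning.

```python
# A sketch of baseline monitoring: flag days whose error count exceeds the
# trailing mean plus a few standard deviations of the previous `window` days.
import statistics

def anomalous_days(error_counts: list, window: int = 7, sigmas: float = 2.0) -> list:
    """Return indices of days that break above their trailing baseline."""
    flagged = []
    for i in range(window, len(error_counts)):
        baseline = error_counts[i - window:i]
        threshold = statistics.mean(baseline) + sigmas * statistics.pstdev(baseline)
        if error_counts[i] > threshold:
            flagged.append(i)
    return flagged

# Ten quiet days for one workflow, then a spike on day 10:
counts = [1, 0, 1, 2, 1, 1, 0, 1, 1, 0, 9]
spikes = anomalous_days(counts)
```

A visualization tool does the same thing by eye; the point is that only deviations from the baseline demand a closer look.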

So this is our architecture and how we actually pull data across many different places, and it's grown over time. Is this... Yeah, okay. So when we started off, this was just Salesforce, and it was being pulled into an Alteryx server, and then it was being pushed to Tableau. That was the very simplest version of this, and that started off almost a year-and-a-half ago. Then over time we started to look at how we could pull other data sources through API or ODBC connectors, and started to pull that information in.

Let me talk about some of the details in here that I think are interesting. One of the things we knew right off the bat is that a lot of people have data on their own PC that doesn't exist anywhere else, and we needed access to that. Without access to that, this whole thing kind of falls apart. So we originally looked at Google Sheets, and there are some really nice macros out there that pull that data in, but we stopped doing that. We did not find that that was the way we wanted to go, for security reasons. What we started to use here is Box Sync. It's a file sharing service; the server that actually runs Alteryx Server has Box Sync on it, and it has a lot of directories that are shared with the different businesses across the company. When we meet with someone and they say, "Oh, I've got this Excel spreadsheet, and I want it to update into the opportunity table, and I want to be able to view things in a different way," we can do that.

We pull their Excel spreadsheet in and blend it into this environment. It works really well. A little education: don't change the format. You've probably got to have a little bit of error checking to make sure they're not renaming the file or doing something like that, so there's a little bit of work. It's not just reading an Excel file at that stage. But that really opened things up quite a bit. So we had Salesforce, we had local data, and then all of a sudden we started to get into integration.
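That "little bit of error checking" on business-owned spreadsheets can be sketched as a header guard: before blending, verify the expected columns are still there so a renamed or reshaped file fails loudly instead of corrupting the pipeline. The required column names here are hypothetical.

```python
# A sketch of a pre-blend guard for business-owned Excel/CSV files: verify
# the expected layout hasn't drifted. Column names are illustrative only.

REQUIRED_COLUMNS = ["Opportunity ID", "Region", "Adjusted Amount"]

def missing_headers(header_row: list) -> list:
    """Return required columns absent from the file's header row
    (case-insensitive); an empty list means the file is safe to blend."""
    present = {h.strip().lower() for h in header_row}
    return [c for c in REQUIRED_COLUMNS if c.lower() not in present]

ok = missing_headers(["Opportunity ID", "Region", "Adjusted Amount", "Notes"])
broken = missing_headers(["Opp ID", "Region", "Adjusted Amount"])
```

In Alteryx terms this is a Test or Message tool on the field list right after the Input tool; the sketch just shows the check itself.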

So with [Oanda 00:19:39], this is a financial software API that pulls in the exchange rates. We were actually using this to not only enrich the dashboards through here, but it was also coming back in and updating tables in Salesforce. So now we're not just into data presentation, we're also into actually integrating tools and processes with Alteryx. This just continued more and more: Intacct, Clarizen, Coupa, Workday. Some of these are pretty easy to integrate. Others were not. And the one word of advice I can give is, if you're starting to do this, always look to see if there's an ODBC connector. If there's an ODBC connector, that should mean there's a really easy way to get at that data.

If there's not, then they're probably going to present you a REST API, and that REST API environment takes some time to figure out, or you go hire a contractor or someone to pull that data in. Just having a REST API doesn't mean that you've done the job. A REST API is really an environment with a lot of commands, and depending on what you want to pull out of that environment, you have to really customize it.
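One concrete reason a REST API is more work than an ODBC connector: most APIs page their results, so the client has to loop until the server signals the end. Here's a minimal sketch of that loop; `fetch_page` stands in for the real HTTP call (e.g. `requests.get` against the vendor's endpoint with its particular offset/cursor parameters).

```python
# A sketch of offset-based REST pagination: keep requesting pages until a
# short (or empty) page signals there is no more data.

def fetch_all(fetch_page, page_size: int = 100) -> list:
    """Accumulate every record by requesting successive pages."""
    records, offset = [], 0
    while True:
        page = fetch_page(offset=offset, limit=page_size)
        records.extend(page)
        if len(page) < page_size:  # short page -> server is out of data
            return records
        offset += page_size

# Simulate a 250-record endpoint with an in-memory stand-in:
data = list(range(250))
def fake_fetch(offset, limit):
    return data[offset:offset + limit]

everything = fetch_all(fake_fetch)
```

Real APIs vary (offset vs. cursor vs. next-page URL), which is exactly the customization the talk warns about.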

So as we bring information into this environment, a lot of the data blending happens at this stage. Over time, we started to take copies of what was in here and store them in Amazon's S3. Very easy to implement. As we started to pull information in through here, some of the things we were capturing really added a lot of value, so it wasn't just a technical thing. For instance, we take a snapshot every night at 11:00 of about 35 key fields that change in our pipeline, and we store them over here.

So now we can create dashboards. They go back almost a year-and-a-half now. They can show you the velocity, the change, sales reps changing their target rates. You can see exactly all the kinds of things that typically get lost in Salesforce. Then, as we push stuff into Tableau, into this environment, the one thing that we try to do as much as we possibly can is blend the data and come up with as few databases as possible to present over here.
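The value of those nightly snapshots is that any two of them can be diffed to recover changes (rep reassignments, amount moves, stage changes) that the source system overwrites in place. A minimal sketch of that diff, with illustrative field names:

```python
# A sketch of snapshot diffing: compare two nightly copies keyed by
# opportunity ID and report, per ID, the fields whose values changed.

def snapshot_changes(yesterday: dict, today: dict) -> dict:
    """Return {opp_id: {field: (old, new)}} for every changed field."""
    changes = {}
    for opp_id, new_row in today.items():
        old_row = yesterday.get(opp_id, {})
        diff = {f: (old_row.get(f), v)
                for f, v in new_row.items() if old_row.get(f) != v}
        if diff:
            changes[opp_id] = diff
    return changes

yesterday = {"006A": {"stage": "Prospecting", "amount": 40000}}
today = {"006A": {"stage": "Negotiation", "amount": 40000}}
moved = snapshot_changes(yesterday, today)
```

Aggregating these diffs over a year and a half of dated snapshots is what powers the velocity and change dashboards described above.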

When we look at opportunity, there are two or three versions of opportunity, but that's typically done for security reasons. Is there a question? Yeah.

Crowd question:
[inaudible 00:22:34].

Tim Chandler:
You can ask that.

Crowd question:
[inaudible 00:22:39].

Tim Chandler:
It's all in the cloud, yeah. This is all in the cloud. These are all vendors that have cloud services, and then this is all on AWS. Yeah.

Crowd question:
[inaudible 00:23:02].

Tim Chandler:

Crowd question:
[inaudible 00:23:08].

Tim Chandler:
So the question is, what are the pros and cons of using Google Sheets versus Box? Okay, so at Marketo we use Box as a deployed standard throughout the whole company. Enough said? Okay. Yeah. One thing: it's a lot cleaner and easier to share and read native Windows files in Alteryx than it is to read things through an API in Google Sheets.

Crowd question:
One more question related to her question. I understand that Box [inaudible 00:23:55] Marketo. You also have Microsoft [inaudible 00:23:58]. I was just wondering if you guys have had a chance to test the [inaudible 00:24:15].

Tim Chandler:
So the question is, do we have Office 365? And the answer is yes. Have we looked into using the APIs to pull in and to coordinate and share information in there? We've talked about it; we haven't done anything though. Yeah. So there is the opportunity with APIs to go in and do interesting things in the Office landscape, or Office world. Any other questions about this? I mean, this is something that's kind of grown over time. It's worked out to be very robust, from the point of view that as things flow through here, they get provisioned. The time of refreshes may be three hours, may be ten minutes, may be a day. It's whatever the business is really dictating, and if something does break in this environment, it just means that those refreshes are delayed as opposed to not being available.

Yeah, question back there.

Crowd question:
[inaudible 00:25:25] slowly changing dimensions [inaudible 00:25:31].

Tim Chandler:
Yeah. So the question is, are we doing full data refreshes or are we doing incremental loads? It depends on the data. With things like Salesforce, it's just easier to do a full refresh. We've talked to the business, and things change in our systems going back several months, so it's just easier to pull all of that in. One of the things we do is we actually pull Salesforce in twice a day, and we store it in S3, so if anyone else wants to hit and pound on Salesforce and pull data out of it, we encourage them strongly to go to S3 and pull that data. It's actually faster, and the company's all looking at the same data, so they're not asking, "Why is my number different than their number?"

When we get into other packages like Intacct, that's a huge amount of data, and Amy set that up to be incremental. Right? Yes. A lot of this other stuff is pretty small, so we just go in and pull it all back out. Okay, yup.
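The incremental pattern for the bigger sources can be sketched with a watermark: remember the latest modification timestamp already loaded, and on each run keep only records modified after it. This is a generic illustration, not Intacct's actual API; the ISO-8601 timestamp strings compare correctly as plain strings.

```python
# A sketch of watermark-based incremental loading: filter the source to
# records newer than the last loaded timestamp, then advance the watermark.

def incremental_load(records: list, watermark: str):
    """Return (fresh_records, new_watermark)."""
    fresh = [r for r in records if r["modified_at"] > watermark]
    new_watermark = max((r["modified_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

source = [
    {"id": 1, "modified_at": "2017-06-01T08:00:00Z"},
    {"id": 2, "modified_at": "2017-06-02T09:30:00Z"},
    {"id": 3, "modified_at": "2017-06-03T11:15:00Z"},
]
fresh, mark = incremental_load(source, "2017-06-01T23:59:59Z")
```

For small sources a full refresh is simpler, as the answer above says; the watermark only pays off when the table is too big to re-pull every run.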

Crowd question:
[inaudible 00:26:53].

Tim Chandler:
So they're sitting right here. Amy, take a bow. She's in finance. Chris is also in finance, sitting right next to her, and then we've got [Ambica 00:27:08], who cut her teeth right away on doing APIs. Her very first workflow was doing APIs, I couldn't believe it. And then also [Pernecia 00:27:19]. I don't know where Pernecia is. Where's Pernecia? Oh, there she is, way at the back. She's just started, and she's doing a really cool project right now where we're using Alteryx to add and remove Tableau users as they get terminated or join the company. So she's got a workflow that goes into HR Workday, sees the status of everyone, and it uses the APIs in Tableau, and it goes in and associates them to the right security groups, associates them into Tableau, provisions it all. Saves her an hour a day. It's nice. Shout out to Pernecia there.
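The core of that provisioning workflow is a set comparison: active employees from the HR system versus current Tableau users. Here's a minimal sketch of just that comparison; the actual adds and removes would then go through Tableau's REST API, which isn't shown, and the usernames are made up.

```python
# A sketch of HR-driven user sync: compute who to add to and remove from
# Tableau by diffing the two sets of usernames.

def sync_plan(active_employees: set, tableau_users: set):
    """Return (to_add, to_remove) as sorted lists of usernames."""
    to_add = sorted(active_employees - tableau_users)
    to_remove = sorted(tableau_users - active_employees)
    return to_add, to_remove

active = {"amy", "chris", "pernecia"}   # from the HR system
current = {"amy", "chris", "former_employee"}  # from Tableau
to_add, to_remove = sync_plan(active, current)
```

Running this on a schedule is what turns an hour of manual provisioning a day into an automated job.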

Crowd question:
[inaudible 00:28:07].

Tim Chandler:
No. So we don't have all the data. We have hire, fire, department, name; we do not have salary. We do not have bonus information. We don't have any of that, and we don't want it. Generally, when I talk to people asking for this, it's almost the same conversation every time I have one of these requests. It's like, I need all your data. HR says it a little differently, but I need all your data, and they're like, "What are you going to do with it?" It's like, "I have no idea." But most of the time, once they understand that you want all that data, and once you get the data, you can use Alteryx to filter it, to parse it out, and do whatever you want. Then you're kind of in control of where you want to go and what you want to see.

The last thing I want is to be given only part of the data, and then have to go back a month later and ask for another part. All of this is 128-bit SSL encrypted, and AWS is locked down through ports, and there's also a document from Alteryx, and a document from Tableau, that actually go through and talk a lot about the settings of their applications and the settings inside this environment. So we've followed those, we've worked with security. We feel very confident, and I hope after I say this I don't find Marketo in the newspaper tomorrow.

So, some shiny objects here. Some cool stuff. We actually worked with Alteryx about seven, eight months ago. The name of the engineer escapes me, but he provided us with a whole set of dashboards. These are public domain, I'm sure you guys can get them. And they give you detail of how the gallery is being used, how the server's being used, what's being provisioned, what's new, how long jobs are running. This is just kind of a teaser sample of what's happening, but there is so much information you can pull out through these tabs with Alteryx. It's really cool, and it can give you insight into how your server can be tuned better to get more out of it.

So that was just about our data pipeline. Let me talk a little bit about how we made data models drive dashboard adoption. Remember, the whole idea behind this is that if there are 20 of you that are going to create dashboards, it's better that they come from one data source, and that for each of you, if you have different field requirements, I try to make sure you get the fields you need quickly. That's the mindset this is coming from. So over a year ago we just started working with different departments. It was almost like spinning plates. You talk to finance, talk to people there, see what they want. Understand what's happening, and they tell you the needs, you get that. Then they might go... They might be at quarter end, and you can't talk to them for a week or so.

You're off talking to sales ops. You're talking to them, and so you're literally talking to all these people, and all the time you're trying to keep as few data repositories as possible, focused and available to them. They don't need to know that you're trying to create just one data repository. Everyone should just know that they're going to a single place to get that information. That really cuts down on probably one of the biggest problems in dashboards, and that's reconciliation of data.

So now if I was at a big company, like companies I've worked at before, we'd be talking a lot about data stewards, but I never bring up this term inside Marketo. I really just look for people that are in Excel hell or have opportunities to automate things. People that are using the data in that business, and those are the people that ultimately care about making sure the data quality is right and that they're defining things in the right way. Those are the traits of the people I look for in the business to go talk to, and when I find these things, then things start to happen.

Training. I've got a really different view on training than most people. I believe that people can be really good at Tableau or QlikView or Domo or whatever package you want if the data model and the data presented to them are in really good shape. You give them really good data, they don't need to become experts, and they can get a lot more done. You don't need to spend a lot of time on training. So I remember bringing in trainers and they'd say, "Oh, we're going to teach them about level of detail. We're going to teach them how to do calculations, how to do data blending." I'd say, "No. Do not teach them any of that. I am going to provide them with the data that permits them to create the dashboard they need without having to learn this stuff."

There have been some people that have gone off and learned this, and that's great, I'm not going to stop that. But the emphasis here is: create great data. So, company-wide ratification. This is probably more of a people exercise than a data exercise. Creating common calculated fields: there's a process that Chris and I, and about five other people in the company, are going through right now to really ratify the categories of how we look at finance and sales ops and other things in the company.

Common standard fields: when people ask for a field, the name is important. It should be indicative of what it's actually going to be. And corporate data standards: making sure finance... Finance has the ultimate vote, in many cases, as to what that field is going to be defined as. So the adoption process, as I go through it with each of these people, is pretty much the same. I look at business requirements. I keep looking to see how I can provision data, and validate that data with them. This is a real tough one sometimes. Then, as you continue, you start to push dashboards. You may loop back and provision more and more data, but eventually you get into production. This is kind of the cycle, or the path, of getting into production with every one of these business units out there.

The challenge, and this is the big difference between doing OBIEE and Informatica, in my point of view, is that with Alteryx you can do this much faster. I can sit down with someone in the business, understand what they want, go in, pull a data field, rename it, publish it into Tableau. In some cases I can do that in a couple of hours, sometimes one or two days, but in other corporate environments, doing this type of stuff can take weeks or months. So, monitoring this... This is actually a snapshot of Tableau usage, so the first one I showed you was Alteryx, and it had a whole bunch of Alteryx dashboards in there that show what's going on with what's being created in Alteryx.

We've got another set of dashboards, which we've tuned quite a bit, that actually show what's going on here. So this one is just one of probably about 20 dashboards that we look at in Tableau. This one's showing the performance. What we did in Tableau to make things as simple as possible is we just created groups for all the different security areas that needed to happen, and then when people get added to the system we just add them to different groups inside Tableau based on their org ID. And this is a table that anyone can look at to see who is associated with which groups, and they can make requests through IT to update that if they so choose.

So that's the way we manage the security and the allocation of dashboards, and Alteryx is powering this in the back end. So one of the things I'm going to talk about here, and it was mentioned earlier, is Alteryx in a box. As we started to grow and head down this path, we're dealing with really sensitive data. This is something where you need to make sure that you've got the right security set up, the right versions set up. You need to make sure that you control the environment at certain levels. So what we did is we used AWS, Amazon Web Services, and EC2, their way of provisioning servers, to actually create a series of instances, or computers, and we used this to deploy Alteryx into.

So in other words, it's a dedicated box that we've installed Alteryx on, and then we've handed that over to Chris and [Tambika 00:37:52] and Amy and the team, so everyone is using something in the cloud. And then we control the network security. We control the access, the ports. That's completely secured in that vein. So we know where the data is and how it's traveling, and it's not on someone's desktop.

So that made things really good from the security point of view, but we also found that the performance, except maybe when you're at an event like this, is usually better than it is on your own desktop. So this Alteryx in the box has made this thing more manageable. One of the things that I think is really interesting, and Amy brought this up many months ago, is that when someone in the business was like, "Oh, I want Alteryx," we're like, "Okay, cool, but the first thing you've got to tell us is: what is your business objective? What are you going to do with the data? What is the data problem you're trying to solve?" If they could answer that question, we would proceed. If they can't, we don't want to go down this path, because this is not about us just filling up licenses and having people tinker with it. We really want every person who gets Alteryx to actually be focused on a business problem and try to solve it.

So all the licenses that we have, and all the Alteryx-in-a-box deployments we have out there, are focused on people that are looking at it and making things happen. You know, it's not just a couple of people creating this data pipeline now. As we see more and more ideas come from more and more people, they're adding more and more ways of pulling data through Alteryx, and that's the ultimate thing, pulling in more data.

So I talked a little bit about this. Got ahead of myself, I guess, here. AWS, EC2 managed, and a lot of the benefits. Macro sharing, governance, all this kind of stuff. It just makes it easier to scale out. So as we add more users to this environment, it's just replicating what we have. It's not worrying about setting up different environments. So some highlights, what I wish I knew a few years ago. Kind of listed them out here. When I wrote this I kind of felt like it was a boring thing to say, but it's so true that documentation is so important. You know, you create a workflow, it works beautifully, and you're just like, "Wow, this is just awesome." You want to go talk to someone about it, but no, you should stay there and you should go back and document every one of the stages of what's actually happening, because you are going to come back to this workflow six months from now, and you're going to look at it and go, "I don't know what's going on with this thing." So documentation, I really look at it as something kind of boring but necessary. Like brushing your teeth.

ODBC, I talked about this earlier. ODBC is better than REST APIs. If you can get your hands on an ODBC connection instead of a REST API, get the ODBC every time. The only time you're going to have to use REST APIs, if an ODBC exists, is if you actually want to do something like delete a user in an account, or something of that nature.

AWS S3. Originally, when I started getting into this whole architecture, I thought we were going to be putting a lot of stuff into Redshift. I stopped doing that. The performance was just terrible for this type of work. This is a lot about bulk loads and bulk reads, and that's not what a relational database like Redshift is designed for. S3 is great for this.

Alteryx can automate things like Tableau versioning. There's a lot of things that Alteryx can actually automate, and it's not just data, but I talked about the users. There are also other aspects you can automate out there that kind of boggle the mind as you go down this path. Also, Alteryx can modify its own workflows. So one of the things that I just kind of learned, and we're just starting to get into this, is that Alteryx workflows are XML software containers. They're just XML. So you can read a workflow into Alteryx, and you can start to parse it out, and search for things, and filter things, and look for things, and then you take that concept and you go, "Okay, if you've got a hundred workflows, then why wouldn't you just use Alteryx to read them in and categorize which macros you're using, which versions of things you're using, how many tools you're using, how long these things are taking to run?"
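The idea above, that a workflow is just XML you can parse, can be sketched outside of Alteryx too. Here is a minimal Python sketch that reads a workflow document and tallies which tools it uses. The sample snippet and element names (Node, GuiSettings, and the Plugin attribute) reflect the common .yxmd layout, but the exact schema varies by Alteryx version, so treat this as an illustration rather than a definitive parser.

```python
# Sketch: read an Alteryx workflow (.yxmd) as plain XML and count tool usage.
# The sample XML below is a hypothetical, trimmed-down .yxmd; real files
# contain more elements (Properties, Connections, etc.).
import xml.etree.ElementTree as ET
from collections import Counter

sample_yxmd = """
<AlteryxDocument yxmdVer="10.6">
  <Nodes>
    <Node ToolID="1">
      <GuiSettings Plugin="AlteryxBasePluginsGui.DbFileInput.DbFileInput"/>
    </Node>
    <Node ToolID="2">
      <GuiSettings Plugin="AlteryxBasePluginsGui.Filter.Filter"/>
    </Node>
    <Node ToolID="3">
      <GuiSettings Plugin="AlteryxBasePluginsGui.DbFileOutput.DbFileOutput"/>
    </Node>
  </Nodes>
</AlteryxDocument>
"""

def tool_usage(xml_text):
    """Return a Counter of tool names found in the workflow XML."""
    root = ET.fromstring(xml_text)
    plugins = [
        gui.get("Plugin").rsplit(".", 1)[-1]   # keep only the short tool name
        for gui in root.iter("GuiSettings")
        if gui.get("Plugin")
    ]
    return Counter(plugins)

print(tool_usage(sample_yxmd))
```

Pointing the same function at a directory of a hundred workflows is then just a loop over files, which is exactly the categorization exercise described above.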

I mean, there's a lot of interesting things you can kind of look into your code and analyze, and then pop out, and then see how you can better manage your environments. So I think this is something that we'll probably get more into. The logs that are generated by Alteryx are, again, something else you can pull out, analyze, and throw into Tableau or whatever tool you're using. But again, you can create a lot of self-monitoring systems out there.
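The log-mining idea above can be sketched in a few lines. The log format in this example is hypothetical; real Alteryx engine logs are laid out differently, so the regular expression would need adjusting to match your environment. The point is just that run durations per workflow are easy to extract and feed into a dashboard.

```python
# Sketch: pull per-workflow run times out of (hypothetical) engine log lines,
# so they can be charted in Tableau or any other tool.
import re
from statistics import mean

log_lines = [
    "2017-06-05 02:00:13 Finished running daily_pipeline.yxmd in 92.4 seconds",
    "2017-06-05 03:00:41 Finished running daily_pipeline.yxmd in 101.7 seconds",
    "2017-06-05 04:15:02 Finished running user_sync.yxmd in 12.3 seconds",
]

pattern = re.compile(r"Finished running (?P<wf>\S+\.yxmd) in (?P<secs>[\d.]+) seconds")

def runtimes(lines):
    """Group run durations (in seconds) by workflow name."""
    stats = {}
    for line in lines:
        m = pattern.search(line)
        if m:
            stats.setdefault(m.group("wf"), []).append(float(m.group("secs")))
    return stats

for wf, secs in runtimes(log_lines).items():
    print(f"{wf}: {len(secs)} runs, avg {mean(secs):.1f}s")
```

From there, trending the averages over time is how you get the self-monitoring described above.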

So what's happened at Marketo? What's kind of interesting... So this is from January to now, and this just shows the increased number of server hits and the increased users over time. We have an event here where we launched a page and turned it on for single sign-on for the whole company, so that's why this popped up. The really cool thing is... Actually, to divert your attention over here, you can see these little bumps are every time we've launched a new data set to the company, and each of these colors is a different department in the company.

So we're seeing adoption and usage across the company grow, not just by the number of departments, but by the number of pages and the dashboards that are viewed, and I attribute this completely to Alteryx. I don't give the credit to the dashboard. I mean, any dashboard out there we could've chosen and we'd probably get similar results, but the data is the important thing. You've got to have nice, clean, reliable data. You've got to tell the users about it, and as they start to adopt it, like I think this was legal right here, all of a sudden they start to see easy ways to get stuff out, and that's... How do you make it easier for them? So, any questions?

I think we're actually out of time.

Speaker 2:
Oh, okay.

So, Tim, thank you for that fantastic presentation. I know we had a number of questions throughout the presentation, but I strongly urge you to connect with Tim if you have any additional questions or feedback that you'd like to share with him. Thank you.
