Data Engineering Template:

Prepare Event Log Data to Publish into a Data Warehouse

Clean and structure transformations from data stored in JSON format

Event Log Analysis flow: the flow view of this template

Transformations:
unnest (JSON parsing), split on patterns, rename, replace, filtering (delete, keep), aggregation (group by, count)

This example flow shows you how to parse and clean application log files or events stored in JSON format. The flow relies mainly on cleansing and structuring transformations.
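To make these cleansing steps concrete, here is a minimal pandas sketch of the same kinds of operations (split on a pattern, rename, replace, filter). The column names, values, and code are illustrative assumptions, not the template itself; in Designer Cloud and Dataprep these steps are authored as recipe steps rather than code.

```python
# Illustrative only: hypothetical columns and values, not the template's schema.
import pandas as pd

events = pd.DataFrame({
    "event": ["login|web", "purchase|mobile", "login|web"],
    "ts": ["2023-01-01 10:00", "2023-01-01 10:05", "not a timestamp"],
})

# Split on a pattern: break the combined field into event type and channel.
events[["event_type", "channel"]] = events["event"].str.split("|", expand=True)

# Rename and replace: tidy column names and standardize values.
events = events.rename(columns={"ts": "event_time"})
events["channel"] = events["channel"].replace({"web": "desktop_web"})

# Filtering (delete/keep): drop rows whose timestamp fails to parse.
events["event_time"] = pd.to_datetime(events["event_time"], errors="coerce")
events = events[events["event_time"].notna()]
print(events)
```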

This flow comes with detailed annotations for each step in the recipes as well as flow-level descriptions for all the recipes. Once you understand the logic, you can customize this flow to jump-start your own development or share it with others in your workspace.

A sample dataset is loaded automatically for you when you use this template. It contains various event attributes, all nested under a single JSON object. This is a very common format for many applications and tools, as well as for data returned from API calls.
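For illustration only, an event in that shape might look roughly like the record below, together with one common way to unnest it in pandas. The field names here are hypothetical and are not the template's actual schema.

```python
# For illustration only: hypothetical field names, not the template's schema.
import pandas as pd

raw_events = [
    {"data": {"user_id": "u123", "event_type": "login",
              "context": {"device": "mobile", "app_version": "2.4.1"}}},
    {"data": {"user_id": "u456", "event_type": "purchase",
              "context": {"device": "web", "app_version": "2.4.1"}}},
]

# Unnest: json_normalize flattens the nested object into dotted column names
# such as data.user_id and data.context.device.
flat = pd.json_normalize(raw_events)
print(flat.columns.tolist())
```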

There are several different recipes that structure and cleanse the event data, and a final aggregation example to show how you can track events per user.
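As a rough sketch of that final aggregation idea, the snippet below counts events per user, assuming an already-flattened table with hypothetical user_id and event_type columns.

```python
# Assumes a flattened table with hypothetical user_id/event_type columns.
import pandas as pd

flat = pd.DataFrame({
    "user_id": ["u123", "u123", "u456"],
    "event_type": ["login", "purchase", "login"],
})

# Group by user and count events, mirroring the flow's group-by/count step.
events_per_user = (
    flat.groupby("user_id", as_index=False)
        .agg(event_count=("event_type", "count"))
)
print(events_per_user)
```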

For more information, please check out this detailed guide in Trifacta's Help Center.

New user?

If your data is mostly on Google Cloud Platform, please use Dataprep. Otherwise, choose Designer Cloud.

Use in Designer Cloud
Use in Dataprep