How to get started with Daton setup

A step-by-step procedure to configure integrations on Daton

Sign-up

Step 1: Go to https://daton.sarasanalytics.com/login and sign up with a valid email address. All fields are mandatory, and the Company Name must be unique.

Step 2: A verification link will be sent to the registered email address. You must complete the verification before logging in to Daton.

Subscribe

Step 3: After signing in for the first time, you will be redirected to the ‘Subscriptions’ page. Click the ‘Free Trial’ button to try Daton for free. The free trial is enabled for the registered email address for 14 days, and you can subscribe to a monthly plan at any time.

Integrate Data Sources

Step 4: Go to the Connectors menu tab to find all available sources. You can filter the sources by category.

Step 5: Click any data source to initiate the integration setup. You will have to provide the following details:

  • Integration Name: Tables in the warehouse will be created using this name.

  • Frequency: Specifies how often data must be replicated from the source to the destination. You can select a frequency from 1 hour to 24 hours.

  • History: Specifies how much historical data is replicated in the first load. All subsequent loads are incremental from the previous load. You can select a history from 0 years to 10 years (see the sketch after this list).
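
To make the three parameters above concrete, here is a minimal, purely illustrative sketch of how such a source-integration configuration could be represented. The field names and validation are hypothetical and are not Daton's API; Daton collects these values through its UI.

  # Hypothetical sketch of a source integration configuration (not Daton's actual schema).
  integration_config = {
      "integration_name": "shopify_us_store",  # warehouse tables are created using this name
      "frequency_hours": 6,                    # replicate from source to destination every 6 hours (1-24)
      "history_years": 2,                      # replicate the last 2 years of data in the first load (0-10)
  }

  def validate_config(cfg: dict) -> None:
      """Sanity-check the ranges described in the bullets above."""
      assert 1 <= cfg["frequency_hours"] <= 24, "frequency must be between 1 and 24 hours"
      assert 0 <= cfg["history_years"] <= 10, "history must be between 0 and 10 years"

  validate_config(integration_config)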

Step 6: Provide the details for authentication and authorization. This screen changes depending on the authentication parameters and mechanism of each data source. Authentication mechanisms include OAuth (sign in with the username and password of the SaaS application) and Basic (static API keys and secrets, or database connection details).
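
As a rough illustration of the difference between the two mechanisms (not Daton's implementation), an OAuth source is called with an access token obtained by signing in to the application, while a Basic source is called with static keys. The endpoint and header names below are hypothetical placeholders.

  import requests  # third-party HTTP client, used only to illustrate the two call styles

  # Basic-style authentication: a static API key and secret are sent with every request.
  # The endpoint and header names are hypothetical.
  basic_response = requests.get(
      "https://api.example-source.com/v1/orders",
      headers={"X-Api-Key": "MY_API_KEY", "X-Api-Secret": "MY_API_SECRET"},
  )

  # OAuth-style authentication: a bearer token obtained after signing in to the
  # SaaS application is sent instead of static credentials.
  oauth_response = requests.get(
      "https://api.example-source.com/v1/orders",
      headers={"Authorization": "Bearer MY_ACCESS_TOKEN"},
  )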

Step 7: Select the accounts or schemas for the integration. Database sources may have several DB schemas to choose from, and you can select one or multiple schemas for an integration. Similarly, for some SaaS applications such as Facebook Ads, a business can have multiple Ad accounts within the same Facebook business account, and you can select multiple Ad account IDs.

Step 8: Select the required tables. APIs from third-party applications are represented as tables in Daton. You can select multiple tables for the integration.

Step 9: Next, select the fields for each table. For database sources only, select a replication field; incremental loads are based on this field. For example, if a primary key is selected as the replication field, then each extract processes only the rows after the last extracted primary key value.
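
The following self-contained sketch (not Daton's code) shows the general idea behind incremental extraction on a replication field: each run pulls only the rows whose replication-field value is greater than the last value already extracted. The table and column names are hypothetical.

  import sqlite3  # standard-library database, used only to keep the sketch self-contained

  # Hypothetical source table with an integer primary key used as the replication field.
  conn = sqlite3.connect(":memory:")
  conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
  conn.executemany(
      "INSERT INTO orders (id, amount) VALUES (?, ?)",
      [(1, 10.0), (2, 25.5), (3, 7.25)],
  )

  def extract_incremental(connection, last_extracted_id):
      """Pull only rows added after the previous extract, ordered by the replication field."""
      cursor = connection.execute(
          "SELECT id, amount FROM orders WHERE id > ? ORDER BY id",
          (last_extracted_id,),
      )
      return cursor.fetchall()

  first_load = extract_incremental(conn, 0)                 # full history on the first run
  next_load = extract_incremental(conn, first_load[-1][0])  # empty until new rows appear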

Step 10: A configured integration remains in the Pending state until data is extracted, after which it moves to the Active state. You can view the status on the My Integrations page.

Step 11: You can pause, re-activate, or delete integrations from the My Integrations page.

Step 12: Additionally, you can view your usage on the Dashboards page and see process logs for all integrations on the Logs page.

Integrate Data Warehouse

Step 13: Integrate a data warehouse by selecting your choice from the list supported by Daton. Refer to the warehouse setup tutorials for Google BigQuery, Amazon Redshift, and Oracle Autonomous Data Warehouse respectively.

E.g. Google BigQuery

  • Follow this link to set up an account and a project on Google Cloud Platform

  • Go to the Connectors page and click on the Data Warehouses tab

Step 14: Provide a name for the warehouse integration. Datasets will be created with this name in the warehouse.

Step 15: Sign in with the Google account that owns the BigQuery warehouse. You can select a project within this account in the subsequent screens.

Step 16: Select a project and submit the warehouse integration. Data processing is initiated only after the warehouse is configured successfully.

Step 17: Allow some time for the data to flow; you can verify the integration status on the My Integrations page. Initial data loads take several hours to a few days to replicate the data completely, depending on the data source. Replication speed depends on the number of historical years of data selected as well as the API quotas enforced by the data source platforms.
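
Once the initial load completes, one additional way to confirm that data landed, outside the Daton UI, is to query the warehouse directly. The sketch below uses the google-cloud-bigquery client against a BigQuery destination; the project, dataset, and table names are placeholders for whatever you configured in the steps above.

  from google.cloud import bigquery  # pip install google-cloud-bigquery

  # Placeholder identifiers: use your own GCP project and the dataset created
  # with the warehouse integration name from Step 14.
  client = bigquery.Client(project="my-gcp-project")

  # List the tables that have been created in the dataset.
  for table in client.list_tables("my_warehouse_integration"):
      print(table.table_id)

  # Spot-check the row count of one replicated table (table name is hypothetical).
  query = "SELECT COUNT(*) AS row_count FROM `my-gcp-project.my_warehouse_integration.orders`"
  for row in client.query(query).result():
      print(f"orders rows: {row.row_count}")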

Step 18: Continue with other data source integrations and enjoy the data
