# How to get started with Daton setup

### Sign-up

**Step 1:**Go to <https://daton.sarasanalytics.com/login> and sign-up with valid email address. All fields are mandatory, and Company Name must be unique

![](https://15515196-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LbJ1eMq5M51cIpU163R%2F-LtiWHzKUv5Z-lO5Nvhz%2F-LtiY1BYKr6AljJlFfMX%2Fsign-up-2.jpg?alt=media\&token=0cb86a36-8715-459a-85a9-7bdc096e5dfb)

![Login Page](https://15515196-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LbJ1eMq5M51cIpU163R%2F-LtiWHzKUv5Z-lO5Nvhz%2F-LtiXn9MfOJnNzFZbfeX%2FSign-up.jpg?alt=media\&token=e7683dfc-3576-4d83-b76d-e2398594ccbf)

**Step 2:** A verification link will be sent to the registered email address. You must verify the address successfully before logging in to Daton

### Subscribe

**Step 3:** After signing in for the first time, you will be redirected to the ‘Subscriptions’ page. Click the ‘Free Trial’ button to try Daton for free. The free trial is enabled for the registered email address for 14 days, and users can subscribe to a monthly plan at any time

![Subscription Plans](https://15515196-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LbJ1eMq5M51cIpU163R%2F-LtiY7lVKnBcyllTo_mw%2F-Lti_GjrmLML3dHypimj%2FSubscriptions.jpg?alt=media\&token=81506381-dbcc-4a0b-8b02-50ad7f97b7a1)

### Integrate Data Sources

**Step 4:** Go to the Connectors menu tab to find all available sources. Users can filter the sources by category

![Connectors](https://15515196-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LbJ1eMq5M51cIpU163R%2F-Lti_Ncj1upNctwdS9Nt%2F-LtidRtBfkRY62rWWOws%2FSource-list.jpg?alt=media\&token=7236e80a-d4c7-454e-999a-ea4c543a489a)

**Step 5:**Click on any of the data sources to initiate integration setup. You will have to provide the following details -&#x20;

* Integration Name: Tables in the warehouse are created using this name
* Frequency: Specifies how often data is replicated from source to destination. You can select a frequency from 1 hour to 24 hours
* History: Specifies how much historical data is replicated in the first load; all subsequent loads are incremental from the previous load. You can select a history from 0 to 10 years

![Integration Setup](https://15515196-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LbJ1eMq5M51cIpU163R%2F-Lti_Ncj1upNctwdS9Nt%2F-LtieBQM6Bm4aWw4MyQX%2FIntegration%20setup.jpg?alt=media\&token=cf54afae-6f91-4a91-8bc8-11821308094b)

**Step 6:** Provide details for authentication and authorization. This screen varies depending on the authentication parameters and mechanism of each data source. Authentication mechanisms include OAuth (signing in with your SaaS application credentials) and Basic (static API keys and secrets, or database connection details)

![Integration Setup](https://15515196-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LbJ1eMq5M51cIpU163R%2F-Lti_Ncj1upNctwdS9Nt%2F-LtiesG8msQzDm4TPM36%2FIntegration%20setup%202.jpg?alt=media\&token=2bae5cd5-4d04-4bd2-aed8-fcc3efdbb2fb)

**Step 7:** Select accounts or schemas for the integration. Database sources may have several DB schemas to choose from, and users can select one or more schemas per integration. Similarly, for some SaaS applications such as Facebook Ads, businesses can have multiple Ad accounts within the same Facebook business account, and users can select multiple Ad account IDs

![Account Selection](https://15515196-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LbJ1eMq5M51cIpU163R%2F-Lti_Ncj1upNctwdS9Nt%2F-LtifPAey5S1hNBdAbDc%2FSelect%20Accounts.jpg?alt=media\&token=a7f603e0-423c-4761-8976-a096339338f9)

**Step 8:** Select the required tables. APIs from third-party applications are represented as tables in Daton. Users can select multiple tables for the selected integration

![Table Selection](https://15515196-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LbJ1eMq5M51cIpU163R%2F-Lti_Ncj1upNctwdS9Nt%2F-LtifdNYxEuT1Nd3I3IZ%2FSelect%20Tables.jpg?alt=media\&token=f97741da-bd7e-486d-994b-b87020679254)
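To illustrate the idea that an API endpoint maps to a table, here is a hedged sketch of flattening a JSON API response into table rows. The payload shape and field names are invented for illustration and do not reflect any specific connector:

```python
# Illustrative sketch: a third-party API response (JSON) flattened into
# flat dicts, i.e. table rows, the way a connector represents an
# endpoint as a table. The payload shape here is invented.

api_response = {
    "orders": [
        {"id": 101, "total": 49.99, "customer": {"name": "Ann"}},
        {"id": 102, "total": 12.50, "customer": {"name": "Bob"}},
    ]
}

def to_rows(payload):
    """Flatten each order object into one flat table row."""
    rows = []
    for order in payload["orders"]:
        rows.append({
            "id": order["id"],
            "total": order["total"],
            # Nested objects become flattened columns.
            "customer_name": order["customer"]["name"],
        })
    return rows

rows = to_rows(api_response)
```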

**Step 9:** Next, select the fields for each table. For database sources, also select a replication field; data is incremented based on this field. For example, if a primary key is selected as the replication field, data extracts are processed incrementally from the last extracted primary key.

![Field Selection](https://15515196-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LbJ1eMq5M51cIpU163R%2F-Lti_Ncj1upNctwdS9Nt%2F-LtiixIKjg-7mkCVyciN%2FSelect%20fields.jpg?alt=media\&token=dd527174-58e2-43c3-bbda-104ff3dba035)
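Conceptually, incremental replication keyed on a replication field works like the following sketch. The query template, table, and column names are hypothetical — Daton's internal implementation may differ:

```python
# Conceptual sketch of incremental extraction keyed on a replication field.
# Table/column names and the bookmark handling are illustrative only.

def incremental_query(table, replication_field, last_value):
    """Build a query that fetches only rows newer than the last extracted key."""
    if last_value is None:
        # First load: pull everything (in practice bounded by the History setting).
        return f"SELECT * FROM {table} ORDER BY {replication_field}"
    # Subsequent loads resume from the last extracted value.
    return (f"SELECT * FROM {table} "
            f"WHERE {replication_field} > {last_value} "
            f"ORDER BY {replication_field}")

first_load = incremental_query("orders", "order_id", last_value=None)
next_load = incremental_query("orders", "order_id", last_value=1000)
```

Each run stores the highest `order_id` it saw, so the next run only fetches rows added since.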

**Step 10:** Configured integrations remain in a pending state until data is extracted, after which they move to an active state. Users can view the status on the My Integrations page

![My Integrations Page](https://15515196-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LbJ1eMq5M51cIpU163R%2F-Lti_Ncj1upNctwdS9Nt%2F-Ltijan2RvOsw1G3gKMz%2Fintegration%20list.jpg?alt=media\&token=7107d575-2446-4470-b46a-a16b14f7386a)

**Step 11:** Users&#x20;would be able to pause, re-activate or delete the integrations from the My Integrations page

**Step 12:** Additionally, users can view their usage on the Dashboards page and see process logs for all integrations on the Logs page

![Additional Monitoring Pages](https://15515196-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LbJ1eMq5M51cIpU163R%2F-Lti_Ncj1upNctwdS9Nt%2F-LtikTR9ZKNSu5UrDFxa%2Fdashboards.jpg?alt=media\&token=51067fab-1719-428a-ad69-3b403c870c18)

### Integrate Data Warehouse

**Step 13:** Integrate a data warehouse by selecting one from Daton's supported list. Refer to the warehouse setup tutorials for [Google BigQuery](https://docs.sarasanalytics.com/getting-started/essentials/how-to-setup-google-bigquery-project), Amazon Redshift, and Oracle Autonomous Data Warehouse

E.g., Google BigQuery:

* Follow this [**link**](https://docs.sarasanalytics.com/getting-started/essentials/how-to-setup-google-bigquery-project) to set up an account and project on [Google Cloud Platform](https://console.cloud.google.com)
* Go to the Connectors page and click the Data Warehouses tab

![Select a Data Warehouse](https://15515196-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LbJ1eMq5M51cIpU163R%2F-Lti_Ncj1upNctwdS9Nt%2F-LtilKzFF2sAFKqmkHX8%2Fwarehouse%20tab.jpg?alt=media\&token=d3051252-54b0-4075-9eb7-048717f6ebb1)

**Step 14:** Provide a name for the warehouse integration. Datasets will be created with this name in the warehouse

![Data Warehouse Name](https://15515196-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LbJ1eMq5M51cIpU163R%2F-Lti_Ncj1upNctwdS9Nt%2F-LtiltRgvPPScUHQE0Oi%2Fwarehouse%20name.jpg?alt=media\&token=90f1719b-a928-45a9-8c61-b6c36c1cbf67)

**Step 15:** Sign in with the Google account that owns the BigQuery warehouse. You can select a project within this account on the subsequent screens

![Warehouse Login](https://15515196-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LbJ1eMq5M51cIpU163R%2F-Lti_Ncj1upNctwdS9Nt%2F-LtimIRY6AFcLgynFDql%2Fwarehouse%20login.jpg?alt=media\&token=178322a3-060c-42a1-b305-d3c0b001ec74)

**Step 16:** Select a project and submit the warehouse integration. Data processing is initiated only after the warehouse is configured successfully

![Select warehouse project](https://15515196-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LbJ1eMq5M51cIpU163R%2F-Lti_Ncj1upNctwdS9Nt%2F-Ltimce08orit2s8kMBj%2Fselect%20project.jpg?alt=media\&token=cfd33d06-60a5-4945-a3b2-0ab1f523b8a0)

**Step 17:** Allow some time for data to flow; you can verify the integration status on the My Integrations page. Initial loads take several hours to a few days to fully replicate the data, depending on the data source. Replication speed depends on the number of historical years selected as well as the API quotas enforced by the source platforms

**Step 18:** Continue with other data source integrations and enjoy the data
