Are you looking for a simple way to copy data from Zoho to Azure Data Lake? You're in the right place! With Azure Data Factory (ADF), you can automate the process of copying data from Zoho’s API to Azure Data Lake Storage Gen2, making it easier to store and analyze your data.
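Under the hood, the pipeline simply calls Zoho's REST API and lands the response in the lake. As a rough illustration of that flow, here is a minimal Python sketch; the OAuth token, Zoho module, storage account, container, and file path are placeholders rather than values from the post:

```python
import json

import requests
from azure.storage.filedatalake import DataLakeServiceClient

# Placeholder values: replace with your own Zoho OAuth token and storage details.
ZOHO_TOKEN = "<zoho-oauth-token>"
LAKE_URL = "https://<storageaccount>.dfs.core.windows.net"
LAKE_KEY = "<storage-account-key>"

# Pull records from a Zoho CRM module (Leads is just an example) via its REST API.
resp = requests.get(
    "https://www.zohoapis.com/crm/v2/Leads",
    headers={"Authorization": f"Zoho-oauthtoken {ZOHO_TOKEN}"},
)
resp.raise_for_status()

# Land the raw JSON response in Azure Data Lake Storage Gen2.
lake = DataLakeServiceClient(account_url=LAKE_URL, credential=LAKE_KEY)
file_client = lake.get_file_system_client("raw").get_file_client("zoho/leads.json")
file_client.upload_data(json.dumps(resp.json()), overwrite=True)
```

In the actual pipeline, an ADF Copy Activity performs the equivalent request and write for you on a schedule.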
How to Load Parquet Files from Azure Data Lake to Data Warehouse
By following these steps, you’ll be able to extract, transform, and load (ETL) your Parquet data into a structured data warehouse environment, enabling better analytics and reporting.
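As one concrete example of the load step, when the warehouse is a Synapse dedicated SQL pool the ingestion often reduces to a COPY INTO statement over the Parquet folder. Here is a minimal Python sketch of that idea; the server, database, table, and storage paths are assumed placeholders, not details from the post:

```python
import pyodbc

# Placeholder connection details for a Synapse dedicated SQL pool.
conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<workspace>.sql.azuresynapse.net;"
    "Database=<sqlpool>;Uid=<user>;Pwd=<password>;"
)

# COPY INTO ingests the Parquet files directly from Data Lake Storage Gen2.
copy_sql = """
COPY INTO dbo.SalesStaging
FROM 'https://<storageaccount>.dfs.core.windows.net/curated/sales/*.parquet'
WITH (FILE_TYPE = 'PARQUET')
"""

with pyodbc.connect(conn_str) as conn:
    conn.execute(copy_sql)
    conn.commit()
```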
How to Copy Data from JSON to Parquet in Azure Data Lake
In this step-by-step guide, we’ll go through the exact process of creating Linked Services, defining datasets, and setting up a Copy Activity to seamlessly transfer your JSON data to Parquet format.
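At its core the pipeline is a single Copy Activity whose source dataset points at the JSON files and whose sink dataset points at a Parquet location. A rough sketch of what that activity definition can look like in ADF's JSON view, with the dataset names JsonSourceDataset and ParquetSinkDataset as assumed placeholders:

```python
import json

# Rough sketch of a Copy Activity definition: read JSON, write Parquet.
# "JsonSourceDataset" and "ParquetSinkDataset" are placeholder dataset names.
copy_activity = {
    "name": "CopyJsonToParquet",
    "type": "Copy",
    "inputs": [{"referenceName": "JsonSourceDataset", "type": "DatasetReference"}],
    "outputs": [{"referenceName": "ParquetSinkDataset", "type": "DatasetReference"}],
    "typeProperties": {
        "source": {"type": "JsonSource"},
        "sink": {"type": "ParquetSink"},
    },
}

print(json.dumps(copy_activity, indent=2))
```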
How to Copy Multiple Files from a SharePoint Folder to Data Lake Using Azure Data Factory
With Azure Data Factory, copying files from SharePoint to Data Lake becomes a breeze. Using its intuitive interface, you can set up connections to your SharePoint environment and to Data Lake Storage Gen2, then define datasets representing the source (SharePoint) and the sink (Data Lake) that specify exactly where data is read from and where it is written.
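In ADF this typically means a Web activity that obtains an access token followed by a Copy Activity per file. To show the underlying API flow the pipeline performs, here is a hedged Python sketch using the Microsoft Graph API; the site ID, folder name, token, and storage details are all placeholders:

```python
import requests
from azure.storage.filedatalake import DataLakeServiceClient

GRAPH_TOKEN = "<aad-access-token>"   # placeholder token with read access to the site
SITE_ID = "<sharepoint-site-id>"     # placeholder SharePoint site ID
FOLDER = "Reports"                   # placeholder folder to copy from

headers = {"Authorization": f"Bearer {GRAPH_TOKEN}"}

# List every file in the SharePoint folder via Microsoft Graph.
items = requests.get(
    f"https://graph.microsoft.com/v1.0/sites/{SITE_ID}/drive/root:/{FOLDER}:/children",
    headers=headers,
).json()["value"]

# Write each file into Data Lake Storage Gen2.
lake = DataLakeServiceClient(
    account_url="https://<storageaccount>.dfs.core.windows.net",
    credential="<storage-account-key>",
)
fs = lake.get_file_system_client("raw")
for item in items:
    content = requests.get(item["@microsoft.graph.downloadUrl"]).content
    fs.get_file_client(f"sharepoint/{item['name']}").upload_data(content, overwrite=True)
```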
How to Refresh the Azure Analysis Service Model Using Azure Data Factory
This blog post will help you build an automated, effortless way to keep your Azure Analysis Services model up to date using Azure Data Factory, so your reports always reflect the latest data.
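ADF can trigger the refresh with a Web activity calling the Analysis Services REST API. The sketch below shows that POST request in Python; the region, server name, model name, and token are placeholders:

```python
import requests

# Placeholders: region, server, and model of your Azure Analysis Services instance.
REGION = "westus"
SERVER = "<aas-server-name>"
MODEL = "<model-name>"
AAD_TOKEN = "<aad-access-token>"   # token issued for the *.asazure.windows.net resource

# The same POST an ADF Web activity would issue to start an asynchronous refresh.
resp = requests.post(
    f"https://{REGION}.asazure.windows.net/servers/{SERVER}/models/{MODEL}/refreshes",
    headers={"Authorization": f"Bearer {AAD_TOKEN}"},
    json={"Type": "Full", "CommitMode": "transactional"},
)
resp.raise_for_status()
print("Refresh accepted, status URL:", resp.headers.get("Location"))
```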
How to Send Emails to Users Using SparkPost in Azure Data Factory
A step-by-step guide on how to send emails to multiple recipients using SparkPost in Azure Data Factory.
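The core of the approach is a single call to SparkPost's transmissions API, which an ADF Web activity can issue. A minimal Python sketch of that request, with the API key, sender address, and recipient list as placeholders:

```python
import requests

SPARKPOST_API_KEY = "<sparkpost-api-key>"                 # placeholder
recipients = ["user1@example.com", "user2@example.com"]   # placeholder recipients

# The same POST an ADF Web activity would send to SparkPost's transmissions API.
resp = requests.post(
    "https://api.sparkpost.com/api/v1/transmissions",
    headers={"Authorization": SPARKPOST_API_KEY, "Content-Type": "application/json"},
    json={
        "recipients": [{"address": {"email": r}} for r in recipients],
        "content": {
            "from": "pipeline@yourdomain.com",
            "subject": "ADF pipeline notification",
            "text": "The pipeline run has completed.",
        },
    },
)
resp.raise_for_status()
```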
How to Pause and Resume a Dedicated SQL Pool (SQL DW) Using Azure Data Factory
Azure Dedicated SQL pool (SQL DW) has many benefits, such as a massively parallel processing (MPP) architecture that scales compute and storage independently for high-performance analytics. Its big drawback is cost: Microsoft bills a Dedicated SQL pool by the hour, so you pay whenever it is running, even when nobody is using it for development or analysis.
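One common remedy is to pause the pool outside working hours and resume it on demand, which comes down to two calls against the Azure management REST API (the same calls an ADF Web activity with a managed identity can make). A minimal Python sketch, with the subscription, resource group, server, and pool names as placeholders and an api-version that may differ in your environment:

```python
import requests
from azure.identity import DefaultAzureCredential

# Placeholders for your subscription, resource group, logical server, and SQL pool.
SUB, RG, SERVER, POOL = "<subscription-id>", "<resource-group>", "<server-name>", "<sql-pool>"

# Acquire a management-plane token (an ADF Web activity can do the same via its managed identity).
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

def set_pool_state(action: str) -> None:
    """Pause or resume the dedicated SQL pool; action is 'pause' or 'resume'."""
    url = (
        f"https://management.azure.com/subscriptions/{SUB}/resourceGroups/{RG}"
        f"/providers/Microsoft.Sql/servers/{SERVER}/databases/{POOL}/{action}"
        "?api-version=2021-11-01"
    )
    resp = requests.post(url, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()

set_pool_state("pause")   # stop paying for compute when the pool is idle
```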
How to Copy Files from SharePoint to Data Lake Using Azure Data Factory
Copying files from SharePoint to Data Lake (or any other target location) is one task you cannot ignore as a data engineer; sooner or later someone will ask you to do it. So how can you achieve it? Suppose we need to copy an Excel file from SharePoint to Data Lake Storage Gen2. You need... Continue Reading →