In a world full of dashboards, reports, and visualizations, simply showing data is not enough. If you want your audience to understand, remember, and act on the insights, you need to go beyond the graph. You need to tell a story. Welcome to the world of storytelling with data, where raw numbers are transformed into compelling, clear, and meaningful narratives that spark decisions.
The Role of Data Governance in Business Growth
Data governance is a growth enabler, not just a compliance formality. When data is governed effectively, organizations move faster, collaborate better, and scale with confidence.
How to Extract HubSpot Data to Azure Data Lake Using Azure Data Factory
In today’s data-driven world, integrating marketing and sales data from platforms like HubSpot into a centralized storage system such as Azure Data Lake is critical for advanced analytics and reporting. Azure Data Factory (ADF) makes it simple to automate this process without writing any code. In this guide, we'll walk through the exact steps needed to pull data from HubSpot and store it in Azure Data Lake Storage Gen2 in JSON format.
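Under the hood, ADF's REST-based connectors page through HubSpot's CRM v3 endpoints, which return a `paging.next.after` cursor. The sketch below shows that paging loop in plain Python; the `fetch` callable is injected so no real HTTP is made, and the endpoint path follows HubSpot's public CRM API while everything else is a placeholder.

```python
# Sketch of cursor-based paging against HubSpot's CRM v3 API, the pattern an
# ADF pipeline reproduces when pulling objects like contacts or deals.
# `fetch(path, params)` is an injected stand-in for a real HTTP GET.

def fetch_all_objects(fetch, object_type: str = "contacts", limit: int = 100):
    """Collect every record by following the `paging.next.after` cursor."""
    results, after = [], None
    while True:
        params = {"limit": limit}
        if after:
            params["after"] = after
        page = fetch(f"/crm/v3/objects/{object_type}", params)
        results.extend(page.get("results", []))
        after = page.get("paging", {}).get("next", {}).get("after")
        if not after:  # no cursor means we have reached the last page
            return results
```

In ADF itself, the same loop is configured declaratively via the connector's pagination rules rather than written by hand.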
How to Load Parquet Files from Azure Data Lake to Data Warehouse
By following these steps, you’ll be able to extract, transform, and load (ETL) your Parquet data into a structured data warehouse environment, enabling better analytics and reporting.
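For the load step, a Synapse dedicated SQL pool can ingest Parquet directly from ADLS Gen2 with a `COPY INTO` statement. The helper below renders such a statement as a string; the storage account, container path, and table name are placeholders, and authentication options (e.g. managed identity) are omitted for brevity.

```python
# Minimal sketch: render the T-SQL COPY INTO statement a dedicated SQL pool
# uses to bulk-load Parquet files from a Data Lake path. Names are illustrative.

def copy_into_sql(table: str, storage_url: str) -> str:
    """Render a COPY INTO statement for the Parquet files at `storage_url`."""
    return (
        f"COPY INTO {table}\n"
        f"FROM '{storage_url}'\n"
        "WITH (FILE_TYPE = 'PARQUET')"
    )

stmt = copy_into_sql(
    "dbo.Sales",
    "https://mydatalake.blob.core.windows.net/raw/sales/*.parquet",
)
```

In a pipeline, the same statement typically runs inside an ADF Script activity or the Copy activity's built-in COPY command option.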
How to Copy Multiple Files from a SharePoint Folder to Data Lake Using Azure Data Factory
With Azure Data Factory, copying files from SharePoint to Data Lake becomes a breeze. By leveraging Azure Data Factory's intuitive interface and robust features, you can easily set up connections to your SharePoint environment and Data Lake Storage Gen2. This allows you to define datasets representing the source (SharePoint) and the sink (Data Lake), specifying the exact locations from which data will be extracted and where it will be stored.
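Before copying multiple files, the pipeline first has to enumerate them. SharePoint exposes a REST endpoint for listing a folder's files, sketched below; the site URL and folder path are placeholders, while the `GetFolderByServerRelativeUrl(...)/Files` shape follows SharePoint's documented REST API.

```python
# Sketch: build the SharePoint REST URL that lists every file in a folder,
# which an ADF Web activity can call before iterating the copies in a ForEach.
# Site and folder values below are hypothetical examples.

def folder_files_url(site_url: str, server_relative_folder: str) -> str:
    """Build the REST URL that enumerates files in a SharePoint folder."""
    return (
        f"{site_url.rstrip('/')}/_api/web/"
        f"GetFolderByServerRelativeUrl('{server_relative_folder}')/Files"
    )

url = folder_files_url(
    "https://contoso.sharepoint.com/sites/data",
    "/sites/data/Shared Documents/exports",
)
```

The JSON response's file names then feed a ForEach activity that copies each file to the Data Lake sink.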
How to Refresh the Azure Analysis Service Model Using Azure Data Factory
This blog post will help you build an automated, effortless way to keep your Azure Analysis Services model up to date using Azure Data Factory, ensuring accurate insights through seamless, scheduled refreshes.
How to Send Emails to Users Using SparkPost in Azure Data Factory
A step-by-step guide on how to send emails to multiple recipients using SparkPost in Azure Data Factory.
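The core of the approach is the JSON body an ADF Web activity POSTs to SparkPost's `/api/v1/transmissions` endpoint. The field names (`recipients`, `content`, `address`) follow SparkPost's transmissions API; the sender address, subject, and recipients below are placeholders.

```python
# Sketch: assemble a SparkPost transmission payload for multiple recipients,
# the same JSON an ADF Web activity would POST with an Authorization header.

def build_transmission(sender: str, subject: str, text: str,
                       recipients: list[str]) -> dict:
    """Assemble a SparkPost transmissions payload for several recipients."""
    return {
        "recipients": [{"address": {"email": r}} for r in recipients],
        "content": {"from": sender, "subject": subject, "text": text},
    }

payload = build_transmission(
    "noreply@example.com",
    "Pipeline finished",
    "Your ADF pipeline completed successfully.",
    ["a@example.com", "b@example.com"],
)
```

Each entry in `recipients` gets its own `address` object, which is how SparkPost fans a single transmission out to many inboxes.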
How to Start Photography
Photography is an art form that takes time and practice to master. Be patient with yourself and don't get discouraged by initial challenges. Keep pushing yourself to improve, and the results will follow.
How to Pause and Resume a Dedicated SQL Pool (SQL DW) Using Azure Data Factory
Azure Dedicated SQL pool (SQL DW) offers many benefits, such as a massively parallel processing (MPP) architecture that scales compute and storage resources independently for high-performance analytics. One big drawback, however, is cost: Microsoft bills a Dedicated SQL pool by the hour, so you pay whenever the pool is online, even if nobody is using it for development or analysis.
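Pausing and resuming boil down to two POST calls against the Azure management REST API, which is exactly what an ADF Web activity issues. The helper below builds those URLs; the subscription, resource group, server, and pool names are placeholder examples, and the `api-version` shown is an assumption that may need updating.

```python
# Sketch: build the Azure management REST URLs for the 'pause' and 'resume'
# actions on a Dedicated SQL pool (a database on a logical SQL server).
# An ADF Web activity POSTs to this URL with a managed-identity token.

MANAGEMENT_BASE = "https://management.azure.com"

def pool_action_url(subscription_id: str, resource_group: str,
                    server: str, pool: str, action: str,
                    api_version: str = "2021-11-01") -> str:
    """Build the POST URL for the 'pause' or 'resume' action on a SQL DW."""
    if action not in ("pause", "resume"):
        raise ValueError("action must be 'pause' or 'resume'")
    return (
        f"{MANAGEMENT_BASE}/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Sql/servers/{server}"
        f"/databases/{pool}/{action}?api-version={api_version}"
    )

pause_url = pool_action_url("0000-sub", "rg-analytics", "sqlsrv01", "dw01", "pause")
```

Scheduling a "pause" call in the evening and a "resume" call each morning is a simple way to stop paying for idle compute.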
19 Best Practices of Power BI
Data models should be filtered and normalized. This is particularly important if datasets are sourced from high data volume repositories such as enterprise data warehouses.
How to handle duplicate records while inserting data in Databricks
Have you ever faced a challenge where records keep getting duplicated when you insert new data into an existing table in Databricks? If yes, then this blog is for you. Let's start with a simple use case: inserting Parquet data from a folder in the Data Lake into a Delta table using Databricks.
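The usual fix in Databricks is a `MERGE INTO target USING source ON ...` statement against the Delta table, which updates matched rows instead of inserting them again. The sketch below illustrates that upsert logic with plain Python dicts standing in for the Delta table, so the key-matching behavior is visible without a Spark cluster.

```python
# Illustration of the upsert semantics a Delta MERGE performs: source rows
# whose key already exists in the target update in place rather than creating
# duplicates; unmatched rows are inserted. Plain dicts model the tables here.

def merge_upsert(target: list[dict], source: list[dict], key: str) -> list[dict]:
    """Upsert `source` rows into `target`, matching on `key` (no duplicates)."""
    by_key = {row[key]: dict(row) for row in target}
    for row in source:
        by_key[row[key]] = dict(row)  # matched -> update, not matched -> insert
    return list(by_key.values())

existing = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
incoming = [{"id": 2, "name": "b2"}, {"id": 3, "name": "c"}]
merged = merge_upsert(existing, incoming, "id")
```

The same run-twice safety (idempotency) is what makes MERGE preferable to a bare `INSERT INTO` when a pipeline may reprocess the same files.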
How to Copy Files from SharePoint to Data Lake Using Azure Data Factory
Copying files from SharePoint to the Data Lake, or to any other target location, is a task you cannot ignore as a Data Engineer; sooner or later, someone will ask you to do it. So how can you achieve that? Suppose we need to copy an Excel file from SharePoint to Data Lake Storage Gen2.