DP-500 : How to successfully pass the exam?

January 27, 2023

Analytics Microsoft Azure


Are you looking to earn the Microsoft certification DP-500: Designing and Implementing Enterprise-Scale Analytics Solutions Using Microsoft Azure and Microsoft Power BI? If so, you're not alone! This certification is highly sought after by professionals looking to advance their careers in data analytics. In this article, we'll share some expert tips to help you prepare for and pass this important certification exam.

First, let’s start by looking at what this certification covers. The DP-500 certification is geared towards professionals who are responsible for designing and implementing large-scale analytics solutions using Microsoft Azure Synapse Analytics and Microsoft Power BI. This includes tasks such as designing data pipelines, managing data storage, and creating dashboards and reports for business users.

To prepare for the DP-500 exam, it’s important to have a strong understanding of the following topics:

Microsoft Azure: This includes knowledge of Azure data storage options (such as Azure SQL Database and Azure Data Lake), as well as Azure data processing and analytics tools. You’ll also need to be familiar with Microsoft Purview.

Microsoft Power BI: This includes knowledge of Power BI desktop and online, as well as how to design and publish reports and dashboards using Power BI. You’ll also need to be familiar with Power BI data modeling and visualization techniques.

Data management and data governance: You’ll need to understand how to manage data at scale, including tasks such as data cleansing, data transformation, and data security.

Data visualization: You'll need to be able to design data visualizations that effectively communicate insights to business users.

Some advice from one of our consultants

It is understandable to feel anxious or unsure about your chances of success on the DP-500 exam, especially if you have no previous experience with Azure Synapse Analytics and Microsoft Purview. Prior to preparing for the exam, I had never used either tool. These are important technologies that are covered on the exam, so you may need to spend additional time studying and gaining familiarity with them in order to be fully prepared.

It is important to note that four weeks of study is a reasonable amount of time to prepare for the exam, as long as you use your study time effectively and focus on the most important exam objectives.

So, what can you do to prepare for the DP-500 exam? Here are a few tips:


Use Microsoft’s official certification training materials: These materials are designed specifically to help you prepare for the DP-500 exam and are a great place to start.

Take online courses: There are many online courses available that can help you deepen your understanding of the topics covered on the DP-500 exam. One website that you might find helpful is Datamozart. This website offers a range of courses and resources for data professionals, including those preparing for the DP-500 exam.

Watch YouTube videos: There are many YouTube channels that offer helpful content for those preparing for the DP-500 exam. One channel that you might find particularly useful is Azure Synapse Analytics. This channel offers a range of videos on topics related to Azure Synapse Analytics, which is a key tool covered on the DP-500 exam.

Get insights from experts: Consider reaching out to experts in the field for advice on how to prepare for the DP-500 exam. Two Data Platform MVPs, Andy Cutler and Nikola Ilic, are known for their great explanations and insights on data platform topics. You might find it helpful to follow their blogs or watch their videos for additional guidance on preparing for the DP-500 exam.

Practice with sample questions: The quality and reliability of sample questions can vary greatly. Some may not accurately reflect the content or difficulty level of the actual exam, so using them as your sole source of preparation may not be sufficient. ExamTopics is a website that provides information and resources for various IT certification exams. When I studied for the exam, the site did not yet contain any DP-500 practice questions, but you can now find sample questions there; they will probably help you a lot.

Gain hands-on experience: There's no substitute for real-world experience when it comes to preparing for the DP-500 exam. Try working on projects using Azure and Power BI to get a feel for how these tools work in practice.

I wish you the best of luck as you prepare for the DP-500 exam. Remember to stay focused, stay motivated, and keep up with your studies. With hard work and dedication, you can succeed on the exam and achieve your certification goals.


Power BI and QlikView Comparison

September 28, 2022

Business Intelligence


When we talk about Business Intelligence and Data Visualization, there are three leaders on the market today: Power BI, Qlik (QlikView & Qlik Sense) and Tableau.

Power BI was developed by Microsoft in 2010 as the Crescent project. The first version was released in 2011; Microsoft later renamed it Power BI and released it under that name in 2015. Today, Power BI stands as a leader among Business Intelligence tools on the market.

QlikView was first called QuikView when it launched in 1994 (long before Microsoft's entry) and was renamed in 1996. Its main function was originally to collect in-depth analysis of data from different systems. It eventually evolved into Business Intelligence software, and it now enjoys top marks from several software review sites.

In this article, we provide an overall comparison between Power BI and QlikView across different criteria, based on field feedback (migrations from QlikView to Power BI, for example) and documentation.

Data Sources

Regarding connectivity, both tools allow extensive access to different types of data sources, whether located on-premises or in the cloud. However, QlikView has the particularity of requiring connectors to be downloaded and installed before they can be used. Overall, in both cases, connecting to data sources is not a major problem.

Filter, Slicer and Selection

One of QlikView's strengths is certainly its associative experience and its filter management. You can make a selection and apply a filter directly with a simple click on the values available in any visual. This is called cross-filtering: all the visuals adapt to the interaction you have with another visual.

This feature is also available in Power BI, but unfortunately it only works within one tab. If you want your selection to persist across tabs, you have to use slicers and synchronize them. For QlikView users, this intermediate step can feel less intuitive and more costly in terms of navigation.

Unlike query-based BI tools, QlikView uses in-memory associative technology: when users select a data point, no query is triggered. Instead, all other fields are instantly filtered and grouped based on the selection. Selected values appear in green, data related to the selection appears in white, and unrelated data appears in gray. This gives users an intuitive, user-friendly way to browse data and find information related to their activities.

Power BI, on the other hand, does not directly allow displaying data that is not linked to the current selection.

Data Processing and Transformation 

Both tools offer numerous features to process and transform data, but QlikView's, thanks to its scripting language, are considered more advanced and offer more possibilities during development. However, the level of knowledge required to master the QlikView language and deliver your first models is higher.

On the other hand, with its intuitive interface, Power BI is easy to use for less experienced people, especially those without programming skills.

User Interface

Power BI is very intuitive, with drag & drop everywhere. A novice user can create visuals and dashboards very easily.

For its part, QlikView allows a more advanced level of customization than Power BI, but it is much less intuitive and more complex, especially for a new user.

If the investment is made to master the full potential of QlikView, however, its highly customizable setup and wide range of features can be a key advantage.

Price 

Power BI pricing is simple. The desktop version is free, while Power BI Pro costs less than $10 per user per month. The latest offering, Power BI Premium, provides capacity-based pricing that helps optimize costs.

For QlikView, fees are not so simple. The QlikView website offers two editions, Enterprise and Personal. While the Personal version, for use on a personal computer, is free, the price of the Enterprise version is only available after contacting the sales team. According to anecdotal experience, no solution beats the cost-effectiveness of Power BI, and QlikView is estimated to be two to three times more expensive.

Conclusion

In conclusion, Power BI and QlikView are two giants of data processing and visualization. In most cases, either tool will cover all needs in terms of data exploration and analysis.

However, QlikView appears to be more complex and requires a steeper learning curve, though it offers more customization options.
Power BI, for its part, offers an easy-to-use interface that is familiar to users who know the Microsoft environment. It is updated much more often than QlikView, with many interesting features regularly added to its catalog. Its very active community should also be taken into consideration, as it can be a valuable source of support.

In addition, Power BI turns out to be much less expensive to acquire than QlikView, an important argument for companies today. It is certainly for these reasons that Microsoft leads the BI market, ahead of QlikView.
However, Qlik has not given up, and in recent years has spotlighted its new flagship product, Qlik Sense, which seems to adopt many of Power BI's qualities.


PowerBI CICD with Azure DevOps (Introduction & Implementation) (1/3)

September 14, 2022

Business Intelligence


Introduction

The purpose of this document is to explain how to implement Power BI CI/CD in Azure DevOps. It is for those who are tired of publishing reports by hand to different environments.

To implement the solution, we used the "Power BI actions" extension, which you can find here.

This document will walk you through the steps we took to implement the solution.

The extension

The Power BI Actions extension is based on the Power BI REST API created by Microsoft, which you can find here. The extension allows you to automate several tasks, such as uploading/importing a Power BI dashboard, creating a workspace, or updating a report connection. To perform these tasks, the extension needs a connection to the Power BI service.

Implementation

To perform the following steps, you must have sufficient authorization. If you do not have sufficient authorization, you may need to contact someone who does.

Creation of the PowerBI service connection

  1. Sign in to Azure Portal
  2. Select Azure Active Directory and then App registrations
  3. Click on New registration
Figure 1 Creation of PowerBI service connection first step
Figure 2 Creation of PowerBI service connection second step

On the next page, copy the application (client) ID for later use.

Then we need to create a client secret:

  1. Go to certificates & secrets
  2. Click on New client secret
  3. Add a description
  4. Click on Add
Figure 3 Creation of client secrets

Now, we need to give some permissions to the app:

  1. Go to API permissions
  2. Click on Add a permission
  3. Choose Power BI Service
  4. Select Application permissions
  5. Check Tenant.Read.All and Tenant.ReadWrite.All
  6. Click on Add permissions
Figure 4 Add permissions

Now the app has been created and it's ready to be used in Power BI.
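To see what the extension does under the hood, here is a minimal Python sketch of the client-credentials token request that this app registration enables against the Power BI REST API. The helper names and the placeholder IDs are illustrative and not part of the extension; only the standard library is used.

```python
"""Sketch: acquire an Azure AD token for the Power BI REST API using the
app registration above (client-credentials flow). IDs are placeholders."""
import json
from urllib import request, parse

# Default scope covering the Power BI REST API (v2.0 token endpoint)
POWER_BI_SCOPE = "https://analysis.windows.net/powerbi/api/.default"

def build_token_request(tenant_id: str, client_id: str, client_secret: str):
    """Build the (url, form_body) pair for the OAuth2 token endpoint."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": POWER_BI_SCOPE,
    }).encode()
    return url, body

def fetch_token(tenant_id: str, client_id: str, client_secret: str) -> str:
    """POST the request and return the bearer token (needs network access)."""
    url, body = build_token_request(tenant_id, client_id, client_secret)
    with request.urlopen(request.Request(url, data=body)) as resp:
        return json.load(resp)["access_token"]
```

The returned token is then sent as an `Authorization: Bearer …` header on Power BI REST API calls, which is essentially what the Power BI Actions tasks do for you.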


Free BI Solution Architecture

April 18, 2022

Business Intelligence

Read in 3 minutes

Nowadays, a wide number of companies understand the need to be data-oriented and to exploit the data they generate to extend their success.

However, some companies still have no idea of the power of data and may be reluctant to invest considerable amounts of money in data management systems with no guarantee of benefits. For these companies, a fully free, open-source data management stack can be a nice way to demonstrate the added value of business intelligence systems. We built a complete free, open-source data management system, and we're going to show you how easy it can be.

Data Integration

First of all, we use Talend Open Studio for Data Integration (TOS) as our ETL tool to gather data from different sources (databases, APIs, flat files, …) and to integrate this data into our data warehouse.

Figure 1: Example of sourcing job

TOS is used both to get the data and to integrate it into data marts. The sourcing and storing flows are distinct and executed independently. To ease support, every part of the process can be executed on its own.

Figure 2: example of storing metajob

Database

A database is necessary to store all the data produced by the gathering and integration processes built as Talend jobs. We chose PostgreSQL, both for its extended SQL language and because Pentaho's repository database can itself be hosted in a PostgreSQL DB, which makes it a good fit for the rest of the stack.

Collaboration

As the BI solution grows day after day, the BI team working on it grows too. To work together on the same solution, we need a collaboration tool that lets us save our work in a shared repository. The tool we use is Git. Git lets us version multiple types of files, from documentation to ETL jobs to report definition files, so that we are always working on the latest version of each file without having to ask the team every time.

Orchestration

It's important to have jobs that gather and process data, and meta-jobs to combine them. It's also important to have a way to schedule these meta-jobs, in the right order, at a certain frequency. This is called orchestration, and the tool we use is GoCD.

GoCD allows us to schedule our meta-jobs, built in Talend, at a certain time of day.

Figure 3: defining the scheduling

Basically, GoCD is organized around pipelines. A pipeline is composed of several stages that are executed one after the other.

Figure 4: list of stages

Our pipeline is linked to a specific Git repository, and each stage takes a specific job from this repository, based on variables defined beforehand, and executes it in a specific environment.

Figure 5: link to GIT repository

Figure 6: task's content

Figure 7: stage's variables

Figure 8: pipeline’s variables

Exploitation

Finally, we exploit our data using some of Hitachi Vantara's Pentaho solutions. Basic reports are built with Pentaho Report Designer (PRD), a so-called "What You See Is What You Get" tool. The report data is fed, for example, by custom SQL queries used as data sources.

Figure 9: PRD user interface

The reports can then be generated from the Pentaho User Console (which manages users and reports), or scheduled on a fixed time basis and sent by email.

Figure 10: example of report

We also use Pentaho Community Dashboard Editor (CDE) to create dashboards. These dashboards can be accessed through the Pentaho User Console or embedded in a web page.

The last Pentaho solution we use is Mondrian, which helps us create multidimensional cubes. Those cubes can then act as data sources for CDE dashboards, PRD reports, or Excel sheets, for instance.

Conclusion

In conclusion, to build a free BI Solution Architecture, we used the following stack:

  • Talend Open Studio for Data Integration
  • PostgreSQL Database
  • Git Versioning
  • GoCD (Go Continuous Delivery)
  • Pentaho Report Designer
  • Pentaho Community Dashboard Editor
  • Pentaho Mondrian


Purge ODI Logs through Database

May 29, 2020

Business Intelligence Data Integration

Read in 3 minutes

If you generate a lot of logs in ODI, purging through the ODI built-in mechanism can be very slow. It is a lot faster to do it through the database, but you have to respect the foreign keys. Here is a sample PL/SQL script to do so.

Here is a simple script with one parameter, the number of days of logs you want to keep: it retrieves the session numbers to purge and then deletes from the log tables following their dependencies.
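The deletion order described above can be sketched as follows. This is a hypothetical Python helper (not the original PL/SQL script) that builds the DELETE statements children-first; the `SNP_*` table names and the `SESS_END` column are illustrative, so check the actual log tables and foreign keys of your own ODI repository version before running anything.

```python
# Hypothetical sketch of the purge logic: collect sessions older than
# N days, then delete from the child log tables before the parent
# SNP_SESSION table so foreign keys are respected. Names are examples only.

CHILD_TABLES = [            # children first, parent last
    "SNP_SESS_TASK_LOG",
    "SNP_STEP_LOG",
    "SNP_SESS_STEP",
    "SNP_SESS_TASK",
]
PARENT_TABLE = "SNP_SESSION"

def purge_statements(days_to_keep: int) -> list[str]:
    """Build one DELETE per table, children before parent."""
    cond = (f"SESS_NO IN (SELECT SESS_NO FROM {PARENT_TABLE} "
            f"WHERE SESS_END < SYSDATE - {days_to_keep})")
    stmts = [f"DELETE FROM {t} WHERE {cond}" for t in CHILD_TABLES]
    stmts.append(f"DELETE FROM {PARENT_TABLE} "
                 f"WHERE SESS_END < SYSDATE - {days_to_keep}")
    return stmts
```

Executed in this order (and committed once at the end), the deletes never orphan a child row, which is why this is faster and safer than purging session by session through the ODI UI.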


Clean ODI Scenario with Groovy

September 27, 2019

Business Intelligence Data Integration

Read in 1 minute

You may generate a lot of scenarios when developing an ODI project. When promoting, or committing to Git, you are usually only interested in the last functional scenario.

With all the past versions stored in Git or promoted, you may want to clear all the old scenarios. If so, this Groovy script may help you. It deletes all scenarios but the last version (sorted by version name, so take care if your version names do not sort naturally).

The script takes two parameters: the project code and a regex pattern for the package names, which may help you target specific scenarios.

//Imports core
import oracle.odi.core.persistence.transaction.ITransactionDefinition;
import oracle.odi.core.persistence.transaction.support.DefaultTransactionDefinition;
import oracle.odi.core.persistence.transaction.ITransactionManager;
import oracle.odi.core.persistence.transaction.ITransactionStatus;

//Imports odi Objects
import oracle.odi.domain.project.OdiPackage;
import oracle.odi.domain.runtime.scenario.OdiScenario;
import oracle.odi.domain.project.finder.IOdiPackageFinder;
import oracle.odi.domain.runtime.scenario.finder.IOdiScenarioFinder;


// Parameters -- TO FILL --
String sourceProjectCode = 'MY_PROJECT_CODE';
String sourcePackageRegexPattern = '.*';  // Java regex: '.*' matches every package ('*' alone is invalid)


println "    Start Scenarios Deletion";
println "-------------------------------------";

//Setup Transaction
ITransactionDefinition txnDef = new DefaultTransactionDefinition();
ITransactionManager tm = odiInstance.getTransactionManager();
ITransactionStatus txnStatus = tm.getTransaction(txnDef);

int scenarioDeletedCounter = 0;

try {
  //Init Scenario Finder
  IOdiScenarioFinder odiScenarioFinder = (IOdiScenarioFinder)odiInstance.getTransactionalEntityManager().getFinder(OdiScenario.class);
  //Loop through all packages in the target project/folders
  for (OdiPackage odiPackageItem : ((IOdiPackageFinder)odiInstance.getTransactionalEntityManager().getFinder(OdiPackage.class)).findByProject(sourceProjectCode)){
    // Only process packages matching the pattern
    if (!odiPackageItem.getName().matches(sourcePackageRegexPattern)) {
      continue;
    }
    println "Deleting scenarios for Package " + odiPackageItem.getName();
    
    def odiScenCollection = odiScenarioFinder.findBySourcePackage(odiPackageItem.getInternalId());
    def maxOdiScen = odiScenCollection.max{it.getVersion()};
    if (maxOdiScen != null) {
      for (OdiScenario odiscen : odiScenCollection ) {
        if (odiscen != maxOdiScen){
          println "Deleting scenario "+ odiscen.getName() + " " + odiscen.getVersion();
          odiInstance.getTransactionalEntityManager().remove(odiscen);
          scenarioDeletedCounter ++;
        }
      }
    }
 }   
// Commit transaction
tm.commit(txnStatus);


println "---------------------------------------------------";
println "     " + scenarioDeletedCounter + " scenarios deleted successfully";
println "---------------------------------------------------";

}
catch (Exception e)
{
  // Roll back the transaction and print the exception
  tm.rollback(txnStatus);
  println "---------------------ERROR-------------------------";
  println(e);
  println "---------------------------------------------------";
  println "     FAILURE : Scenarios Deletion failed";
  println "---------------------------------------------------";
}


Human and Machine Learning

March 17, 2019

Business Intelligence Event

Read in 2 minutes

I had the opportunity to attend the 2019 Gartner Data & Analytics Summit in London. Here is a wrap-up of some notes I took during the sessions.

A few years ago, AI was a subject of fear for the future. Now it's a fact: Machine Learning is part of the present. We are no longer in a Humans vs. Machines contest; the goal is to free human resources for higher-end tasks. Humans and Machines…

Do you still have a problem with terms like Artificial Intelligence and Machine Learning? No worries, just replace them with "Augmented":
Augmented Analytics, Augmented Data Management, Augmented Data Integration…

2019 will be Augmented. Not Human versus Machine but Human and Machine Learning at the service of a better Data World.

The new tools will let you operate as you used to but, in the background, will run Machine Learning algorithms to suggest new visualizations, unexpected facts and correlations, and to save you from repetitive tasks…

  • All your integration flows share a common pattern? Your augmented tool will detect it and propose creating a new template automatically.
  • You select a set of analytics? Your augmented tool will propose a suitable visualization.
  • You want to prepare a dataset? Your augmented analytics will automatically suggest formatting corrections and data mappings, and learn from your choices.

If you plan to buy a new tool this year, be sure this is part of the roadmap.

Any other trends for 2019?
Many other trends were presented by Gartner; here are a couple of recurring ones from the sessions:

  • NLP. Natural Language Processing: new tools should be able to accept natural language as input (which allows voice input from Alexa, Cortana…).
  • DataOps. No one will deny that data is a domain where requirements evolve quickly, which makes it a prime area for agile development methods. DataOps is a specialized version of DevOps practices, and it fits perfectly in an augmented world where most repetitive tasks should be automated.

On a non-technical side :

  • Data Literacy. Being a good technician is not enough if you work in the data world. You need to understand data and how it is, and can be, presented. Your ability to communicate around data is as important as your ability to manage it. This is what data literacy covers. Some training exists on the web; a must for any consultant.

And many more you can find on Gartner web site or at future events.

Enjoy 2019 with machines.
