
MSBI Interview Questions

1. What is MSBI?

Microsoft Business Intelligence (MSBI) is designed to empower business users through BI tools.

Users can create their own reports and analyses (examination and evaluation of relevant information) and easily share and collaborate with their colleagues.

MSBI is designed with IT manageability and cost-effectiveness in mind.

2. Core Elements of SSIS Designer:

1. Control Flow

2. Data Flow

3. Parameters

4. Event Handlers

5. Package Explorer

3. What is the difference between Flat files and Raw files?

Flat Files:

* A flat file stores data in a plain text file.

* Each line of the text file holds one record, with fields separated by delimiters such as commas or tabs.

* Flat files are data files that contain records with no structured relationships.

Raw Files:

* A raw file is a collection of unprocessed data in a binary format.

* This means the file has not been altered, compressed, or manipulated in any way by the computer.

* Raw files are often used as data files by software programs that load and process the data.

4. Explain Derived Column Transformation with an example
  • The Derived Column transformation creates new column values by applying expressions to transformation input columns.
  • An expression can contain any combination of variables, functions, operators, and columns from the transformation input.
  • The result can be added as a new column or inserted into an existing column as a replacement value.
  • The Derived Column transformation can define multiple derived columns, and any variable or input column can appear in multiple expressions.
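For example, a Derived Column could build a full name and compute an age using SSIS expression syntax (the input column names here are hypothetical, and the comments are for illustration only):

```
-- hypothetical input columns: FirstName, LastName, BirthDate
FullName : TRIM(FirstName) + " " + TRIM(LastName)   -- added as a new column
Age      : DATEDIFF("yy", BirthDate, GETDATE())     -- added as a new column
```

Each expression is entered in the Derived Column editor, where you also choose whether it adds a new column or replaces an existing one.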
5. Explain Fuzzy Lookup Transformation with an example.
  • The Fuzzy Lookup transformation differs from the Lookup transformation in its use of fuzzy matching.
  • The Lookup transformation uses an equi-join to locate matching records in the reference table.
  • It returns either an exact match or nothing from the reference table.
  • The Fuzzy Lookup transformation uses fuzzy matching to return one or more close matches from the reference table.
  • In short, a normal Lookup performs exact matches, while a Fuzzy Lookup performs similarity (close) matches.
  • Hence this transformation adds additional columns:

 _Similarity: indicates how similar the input row is to the matched lookup row.

 _Similarity_<column name>: indicates how similar each individual column in the row is to its matched value.

 _Confidence: indicates how confident the system is that the identified match is correct.

  • It is generally used for string operations.
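The plain Lookup is equivalent to the equi-join below, so a misspelled input such as 'Jhon Smith' returns no match; a Fuzzy Lookup against the same reference table could still return 'John Smith' with a high _Similarity score (the table and column names here are hypothetical):

```sql
-- Exact (equi-join) lookup: only identical values match.
SELECT s.CustomerName, r.CustomerKey
FROM   StagingCustomers AS s
JOIN   dbo.DimCustomer  AS r
       ON r.CustomerName = s.CustomerName;
```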
6. Difference between Execute T-SQL Task and Execute SQL Task

In SSIS, the Execute T-SQL Statement task is similar to the Execute SQL task. Let us see the difference between the two.

 Execute T-SQL Statement Task:

Pros: takes less memory, faster performance.

Cons: output into a variable is not supported; only the ADO.NET connection type is supported.

 Execute SQL Task:

Pros: supports output into variables and multiple connection types; parameterized queries are possible.

Cons: takes more memory; slower performance compared to the T-SQL task.
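For example, an Execute SQL Task can run a parameterized statement and map the single-row result to a package variable, neither of which the Execute T-SQL Statement task supports (the object and variable names are hypothetical):

```sql
-- SQLStatement of an Execute SQL Task over an OLE DB connection:
-- the ? placeholder is mapped to an input parameter, and the
-- result set is mapped to a variable such as User::MaxOrderDate.
SELECT MAX(OrderDate) AS MaxOrderDate
FROM   dbo.Orders
WHERE  CustomerID = ?;
```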

7. Explain Precedence Constraints with an example

 A task will only execute if the condition set by the precedence constraint preceding the task is met. By using these constraints, the package can take different execution paths depending on the success or failure of other tasks.

 Success – Workflow will proceed when the preceding container executes successfully. Indicated in control flow by a solid green line.

Failure – Workflow will proceed when the preceding container’s execution results in a failure. Indicated in control flow by a solid red line.

Completion – Workflow will proceed when the preceding container’s execution completes, regardless of success or failure. Indicated in control flow by a solid blue line.

Expression/Constraint with logical AND – workflow will proceed when specified expression and constraints evaluate to true. Indicated in control flow by a solid color line along with a small ‘fx’ icon next to it.  Color of line depends on logical constraint chosen (e.g. success = green, completion = blue).
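As an example, an Expression and Constraint can gate the next task on both the outcome of the previous task and a package variable (the variable name is hypothetical):

```
Evaluation operation : Expression and Constraint
Value                : Success
Expression           : @[User::RowCount] > 0
```

The connected task then runs only when the preceding task succeeds and the expression evaluates to true.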

8. Difference between Control Flow and Data Flow?

Control flow is for designing the flow of the package; data flow is for the ETL process. Data flow is a subset of control flow.

There is only one control flow, while multiple data flows can exist. A data flow cannot work without a control flow.

All process-based tasks are part of the control flow, while ETL-related tasks are part of the data flow, which is again a subset of the control flow.

9. What is a Checkpoint?

Checkpoint is a property in SSIS that enables a package to restart from the point of failure. When we set the property to true, the package creates a checkpoint file that stores information about package execution and is used to restart the package from the point of failure. If the package runs successfully, the checkpoint file is deleted and then re-created the next time the package runs.
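Enabling checkpoints involves three package-level properties, plus FailPackageOnFailure on the tasks that should act as restart points (the file path below is only an example):

```
CheckpointFileName : C:\SSIS\Checkpoints\LoadSales.chk
CheckpointUsage    : IfExists   -- restart from the file if it exists
SaveCheckpoints    : True
```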

10. Explain Bulk Insert task with an example

 The Bulk Insert task can transfer data only from a text file into a SQL Server table or view.

 If the destination table or view already contains data, the new data is appended to the existing data when the Bulk Insert task runs.

 We cannot perform any ETL operations during the data transfer.

 The destination table should already be present in SQL Server.
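Conceptually, the task issues a T-SQL BULK INSERT similar to the following (the file path and table name are hypothetical):

```sql
-- The destination table must already exist; rows are appended
-- and no transformations are applied during the transfer.
BULK INSERT dbo.SalesStaging
FROM 'C:\Data\sales.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2   -- skip the header row
);
```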

11. Explain Import Column and Export Column Transformations

 Import Column:

 * The Import Column transformation is used to load binary data (photos, documents, media, etc.) from the file system into a data flow.

 * From the data flow it can then be loaded into a table, moved to a different location, or modified as needed.

 * It is the reverse of the Export Column transformation, which pushes binary data out of a data flow and into the file system.

 Export Column:

 * The Export Column transformation reads data in a data flow and inserts the data into a file.

 * For example, if the data flow contains product information, such as a picture of each product, you could use the Export Column transformation to save the images to files.

 * You specify the data columns and the columns that contain the paths of the files to which to write the data.

 * You also specify whether the data-insertion operation appends to or truncates existing files.

12. What is the difference between OLTP and OLAP?

 OLTP:

 1. Database systems designed for transaction purposes.

 2. Data is frequently modified (volatile data).

 3. More users.

 4. Normalized tables are maintained.

 5. A large number of indexes is not recommended.

 6. Operation oriented.

 7. OLTP systems maintain current data (less than one year).

 8. Small amount of data.

 OLAP:

 1. Data warehouse (DWH) systems designed for analysis and reporting purposes.

 2. Data modifications are infrequent (non-volatile data).

 3. Fewer users (the data analyst team).

 4. Denormalized tables are maintained.

 5. A large number of indexes is recommended.

 6. Subject oriented.

 7. A DWH maintains historical data (1 to 10 years of data).

 8. Huge data (very large database).

13. What is the size of the Cube in your last Project?

 The answer to this question varies from project to project and mainly depends on how big your database is and how complex the database design is. Generally, for a database with a transaction table of 50 crore (500 million) records, the cube size will be around 100 GB. So it is safer to go with 100 GB as the answer to this question.

 14. What is the size of the database in your last Project?

 The database size will be 600 to 800 GB, for which the cube comes to around 100 GB. So go with 800 GB for this question.

 15. How frequently do you process the cube?

 You have to be very careful here.

 The frequency of processing the cube depends on how frequently you get new data.

 For example: every day at 10 PM IST, since we get new data either daily or weekly.

 16. What is the toughest challenge you faced in your Project?

There are a couple of areas where we faced difficulty:

 * Working on relationships between measure groups and dimensions.

 * Working on complex calculations.

 * Performance tuning.

 * Working on KPI creation.

 * Working on partitions.

 17. How many dimensions were in your last cube?

Around 5 dimension tables, with about 50 dimension attributes.

 18. How many measure groups were in your last cube?

10 to 20 in total.

19. What was the Schema of your last cube?

Snowflake schema.

20. Why not a STAR Schema?

My database design does not support a star schema, since the dimension tables are normalized into multiple related tables.

21. What are the different relationships that you used in your cube?

We have 6 types of relationship:

a. No Relationship

b. Regular Relationship

c. Referenced Relationship

d. Fact Relationship

e. Many-to-Many Relationship

f. Data Mining Relationship
 22. How many Resources worked on the same Cube in your Project?

 Only two: one in the morning shift and another in the evening shift.

 23. How much time does it take to Process the Cube?

This is a very important question. It again depends on the size of the database, the complexity of the database, and your server settings. For a database with 50 crore transaction records, it generally takes about 3 hours.

 24. How many Calculations did you do in Your Project?

 If you answer "more than 100", you may get caught out unless you are very good at MDX. The best answer for you is "I worked on about 50 calculations".

Get more questions and answers from OnlineITGuru trainers after completing MSBI online training.
