Question No.111

DRAG DROP

You are the administrator for a Data Quality Server. You are adding a user who must have permission to:

- Edit and execute a project
- View the activity monitoring data

This user must not be able to:

- Perform any kind of knowledge management
- Create or change a knowledge base
- Terminate an activity or perform administrative duties

You need to develop a Transact-SQL (T-SQL) script to meet these requirements. What should you do?

To answer, drag the appropriate code segment or segments to the correct location or locations in the answer area.


Correct Answer:

(Answer image not reproduced.)
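For context, Data Quality Services defines three roles on the DQS_MAIN database: dqs_administrator, dqs_kb_editor, and dqs_kb_operator. The dqs_kb_operator role can edit and execute a project and view activity monitoring data, but cannot perform knowledge management, create or change a knowledge base, terminate an activity, or perform administrative duties, which matches the requirements above. A minimal T-SQL sketch along those lines follows; the login name is a placeholder, and the exact code segments in the original drag-and-drop answer may differ.

  USE [DQS_MAIN];
  GO
  -- Create a database user in the DQS main catalog for the login (placeholder name).
  CREATE USER [CONTOSO\DqsProjectUser] FOR LOGIN [CONTOSO\DqsProjectUser];
  GO
  -- dqs_kb_operator grants project edit/execute and activity-monitoring access only.
  ALTER ROLE [dqs_kb_operator] ADD MEMBER [CONTOSO\DqsProjectUser];
  GO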

Question No.112

You administer a Microsoft SQL Server 2016 server that has SQL Server Integration Services (SSIS) installed.

You plan to deploy new SSIS packages to the server. The SSIS packages use the Project Deployment Model together with parameters and Integration Services environment variables.

You need to configure the SQL Server environment to support these packages. What should you do?

  A. Create SSIS configuration files for the packages.

  B. Create an Integration Services catalog.

  C. Install Data Quality Services.

  D. Install Master Data Services.

Correct Answer: B

Explanation:

http://msdn.microsoft.com/en-us/library/hh479588.aspx
http://msdn.microsoft.com/en-us/library/hh213290.aspx
http://msdn.microsoft.com/en-us/library/hh213373.aspx

Question No.113

HOTSPOT

You are developing a SQL Server Integration Services (SSIS) package. The package loads a customer dimension table by using a data flow task.

Changes to the customer attributes must be tracked over time.

You need to produce a checksum value to identify the rows that have changed since the last Extract, Transform and Load (ETL) process execution, using the least amount of development effort.

Which transformation should you use?

To answer, select the appropriate transformation in the answer area.


Correct Answer:

(Answer image not reproduced.)
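Whichever transformation the answer image identifies, the underlying technique is to compute a single hash over the tracked attribute columns and compare it with the value stored during the previous load. A hedged T-SQL illustration of that checksum comparison is shown below; the table and column names are made up for the example, and the exam answer itself is an SSIS data flow transformation, not this query.

  -- Compare a per-row hash over the tracked attributes with the hash stored
  -- during the previous ETL run to find changed customers.
  SELECT src.CustomerID
  FROM dbo.StagingCustomer AS src
  JOIN dbo.DimCustomer AS dim
      ON dim.CustomerID = src.CustomerID
  WHERE HASHBYTES('SHA2_256',
            CONCAT(src.CustomerName, '|', src.City, '|', src.Segment))
        <> dim.RowChecksum;   -- RowChecksum was populated the same way on the last load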

Question No.114

DRAG DROP

You are editing a SQL Server Integration Services (SSIS) package that uses checkpoints. The package performs the following steps:

  1. Download a sales transaction file by using FTP.

  2. Truncate a staging table.

  3. Load the contents of the file to the staging table.

  4. Merge the data with another data source for loading to a data warehouse.

(Control flow diagram not reproduced.)

The checkpoints currently work so that, if any of the four steps fails, the package restarts from the failed step the next time it executes.

You need to modify the package to ensure that if either the Truncate Staging Table or the Load Sales to Staging task fails, the package will always restart from the Truncate Staging Table task the next time the package runs.

Which three steps should you perform in sequence?

To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.


Correct Answer:

(Answer image not reproduced.)

Question No.115

You are designing an extract, transform, load (ETL) process for loading data from a SQL Server database into a large fact table in a data warehouse each day with the prior day's sales data.

The ETL process for the fact table must meet the following requirements:

- Load new data in the shortest possible time.
- Remove data that is more than 36 months old.
- Ensure that data loads correctly.
- Minimize record locking.
- Minimize impact on the transaction log.

You need to design an ETL process that meets the requirements. What should you do? (More than one answer choice may achieve the goal. Select the BEST answer.)

  A. Partition the destination fact table by date. Insert new data directly into the fact table and delete old data directly from the fact table.

  B. Partition the destination fact table by date. Use partition switching and staging tables both to remove old data and to load new data.

  C. Partition the destination fact table by customer. Use partition switching both to remove old data and to load new data into each partition.

  D. Partition the destination fact table by date. Use partition switching and a staging table to remove old data. Insert new data directly into the fact table.

Correct Answer: B
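Partition switching moves a whole partition between tables as a metadata-only operation, so both the purge of old data and the load of new data avoid row-by-row deletes and inserts, heavy locking, and large transaction log activity. A rough T-SQL sketch of answer B follows; the table names, partition numbers, and the daily-load predicate are assumptions for illustration, and the staging tables must match the fact table's structure and sit on the same filegroup as the target partition.

  -- Remove data older than 36 months: switch the oldest partition out to an
  -- empty staging table, then truncate the staging table.
  ALTER TABLE dbo.FactSales SWITCH PARTITION 1 TO dbo.FactSales_StageOld;
  TRUNCATE TABLE dbo.FactSales_StageOld;

  -- Load the prior day's data: bulk insert into an empty staging table whose
  -- check constraint matches the target partition range, then switch it in.
  INSERT INTO dbo.FactSales_StageNew WITH (TABLOCK)
  SELECT *
  FROM SourceDB.dbo.DailySales
  WHERE SalesDate = CAST(DATEADD(DAY, -1, SYSDATETIME()) AS date);

  ALTER TABLE dbo.FactSales_StageNew SWITCH TO dbo.FactSales PARTITION 37;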

Question No.116

You are a database developer for a Microsoft SQL Server 2016 database. You are designing a table that will store customer data from different sources. The table will include a column that contains the CustomerID from the source system and a column that contains the SourceID. A sample of this data is shown in the following table.

You need to ensure that the table has no duplicate CustomerID within a SourceID. You also need to ensure that the data in the table is stored in the order of SourceID and then CustomerID.

Which Transact-SQL statement should you use?

  A. CREATE TABLE Customer
     (SourceID int NOT NULL IDENTITY,
     CustomerID int NOT NULL IDENTITY,
     CustomerName varchar(255) NOT NULL);

  B. CREATE TABLE Customer
     (SourceID int NOT NULL,
     CustomerID int NOT NULL PRIMARY KEY CLUSTERED,
     CustomerName varchar(255) NOT NULL);

  C. CREATE TABLE Customer
     (SourceID int NOT NULL PRIMARY KEY CLUSTERED,
     CustomerID int NOT NULL UNIQUE,
     CustomerName varchar(255) NOT NULL);

  D. CREATE TABLE Customer
     (SourceID int NOT NULL,
     CustomerID int NOT NULL,
     CustomerName varchar(255) NOT NULL,
     CONSTRAINT PK_Customer PRIMARY KEY CLUSTERED (SourceID, CustomerID));

Correct Answer: D
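A quick way to see why D fits: the composite clustered primary key orders rows by SourceID and then CustomerID, and it rejects a duplicate CustomerID within the same SourceID while still allowing the same CustomerID under a different SourceID. The sample values below are illustrative only.

  INSERT INTO Customer (SourceID, CustomerID, CustomerName) VALUES (1, 100, 'Contoso');
  INSERT INTO Customer (SourceID, CustomerID, CustomerName) VALUES (2, 100, 'Fabrikam');   -- allowed: different SourceID
  INSERT INTO Customer (SourceID, CustomerID, CustomerName) VALUES (1, 100, 'Duplicate');  -- fails: violates PK_Customer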

Question No.117

You administer a SQL Server Integration Services (SSIS) solution in the SSIS catalog. A SQL Server Agent job is used to execute a package daily with the basic logging level.

Recently, the package execution failed because of a primary key violation when the package inserted data into the destination table.

You need to identify all previous times that the package execution failed because of a primary key violation.

What should you do?

  A. Use an event handler for OnError for the package.

  B. Use an event handler for OnError for each data flow task.

  C. Use an event handler for OnTaskFailed for the package.

  D. View the job history for the SQL Server Agent job.

  E. View the All Messages subsection of the All Executions report for the package.

  F. Store the System::SourceID variable in the custom log table.

  G. Store the System::ServerExecutionID variable in the custom log table.

  H. Store the System::ExecutionInstanceGUID variable in the custom log table.

  I. Enable the SSIS log provider for SQL Server for OnError in the package control flow.

  J. Enable the SSIS log provider for SQL Server for OnTaskFailed in the package control flow.

  K. Deploy the project by using dtutil.exe with the /COPY DTS option.

  L. Deploy the project by using dtutil.exe with the /COPY SQL option.

  M. Deploy the .ispac file by using the Integration Services Deployment Wizard.

  N. Create a SQL Server Agent job to execute the SSISDB.catalog.validate_project stored procedure.

  O. Create a SQL Server Agent job to execute the SSISDB.catalog.validate_package stored procedure.

  P. Create a SQL Server Agent job to execute the SSISDB.catalog.create_execution and SSISDB.catalog.start_execution stored procedures.

  Q. Create a table to store error information. Create an error output on each data flow destination that writes OnError event text to the table.

  R. Create a table to store error information. Create an error output on each data flow destination that writes OnTaskFailed event text to the table.

Correct Answer: E
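The error detail that the All Executions report surfaces is persisted in the SSISDB catalog views, so earlier failures can also be found with a query. A hedged sketch follows; the LIKE filter text is an assumption, and OnError messages are captured even at the basic logging level.

  -- List executions whose OnError messages mention a primary key violation.
  SELECT e.execution_id,
         e.package_name,
         m.message_time,
         m.message
  FROM SSISDB.catalog.executions AS e
  JOIN SSISDB.catalog.event_messages AS m
      ON m.operation_id = e.execution_id
  WHERE m.event_name = N'OnError'
    AND m.message LIKE N'%PRIMARY KEY constraint%'
  ORDER BY m.message_time DESC;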

Question No.118

A SQL Server Integration Services (SSIS) package imports daily transactions from several files into a SQL Server table named Transaction. Each file corresponds to a different store and is imported in parallel with the other files. The data flow tasks use OLE DB destinations in fast load data access mode.

The number of daily transactions per store can be very large and is growing. The Transaction table does not have any indexes.

You need to minimize the package execution time. What should you do?

  A. Partition the table by day and store.

  B. Create a clustered index on the Transaction table.

  C. Run the package in Performance mode.

  D. Increase the value of the Rows per Batch property.

Correct Answer: D

Explanation:

Data Access Mode: this setting provides the 'fast load' option, which internally uses a BULK INSERT statement to upload data into the destination table instead of a simple INSERT statement per row, as is the case with the other options.

BULK INSERT parameters include ROWS_PER_BATCH = rows_per_batch, which indicates the approximate number of rows of data in the data file. By default, all the data in the data file is sent to the server as a single transaction, and the number of rows in the batch is unknown to the query optimizer. If you specify ROWS_PER_BATCH (with a value > 0), the server uses this value to optimize the bulk-import operation. The value specified for ROWS_PER_BATCH should be approximately the same as the actual number of rows.
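For comparison, the same hint is available on a plain T-SQL BULK INSERT statement. A small sketch follows; the file path, delimiters, and row estimate are placeholders.

  -- Bulk-load one store's daily file, telling the server roughly how many rows to expect.
  BULK INSERT dbo.[Transaction]
  FROM 'D:\Feeds\Store001_Sales.csv'
  WITH (
      FIELDTERMINATOR = ',',
      ROWTERMINATOR = '\n',
      ROWS_PER_BATCH = 500000   -- approximate row count in the file; helps the optimizer plan the bulk import
  );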

Question No.119

You are designing a data warehouse with two fact tables. The first table contains sales per month and the second table contains orders per day.

Referential integrity must be enforced declaratively.

You need to design a solution that can join a single time dimension to both fact tables. What should you do?

  A. Join the two fact tables.

  B. Merge the fact tables.

  C. Create a time dimension that can join to both fact tables at their respective granularity.

  D. Create a surrogate key for the time dimension.

Correct Answer: C

Explanation:

http://msdn.microsoft.com/en-us/library/ms174537.aspx
http://technet.microsoft.com/en-us/library/ms174832.aspx
http://msdn.microsoft.com/en-us/library/ms174884.aspx
http://decipherinfosys.wordpress.com/2007/02/01/surrogate-keys-vs-natural-keys-for-primary-key/
http://www.agiledata.org/essays/keys.html
http://www.databasejournal.com/features/mssql/article.php/3922066/SQL-Server-Natural-Key-Verses-Surrogate-Key.htm
http://www.jamesserra.com/archive/2016/01/surrogate-keys/
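A minimal sketch of answer C, with declarative (foreign key) referential integrity from both fact tables to a single date dimension; the table names and the convention that the monthly fact references the first day of its month are assumptions for illustration.

  CREATE TABLE dbo.DimDate (
      DateKey       int  NOT NULL PRIMARY KEY,   -- e.g. 20240131
      CalendarDate  date NOT NULL,
      CalendarMonth int  NOT NULL                -- e.g. 202401
  );

  -- Day-grain fact: orders per day.
  CREATE TABLE dbo.FactOrders (
      OrderDateKey int   NOT NULL REFERENCES dbo.DimDate (DateKey),
      OrderAmount  money NOT NULL
  );

  -- Month-grain fact: sales per month, keyed to the first day of the month.
  CREATE TABLE dbo.FactSalesMonthly (
      MonthStartDateKey int   NOT NULL REFERENCES dbo.DimDate (DateKey),
      SalesAmount       money NOT NULL
  );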

Question No.120

You are using a SQL Server Integration Services (SSIS) project that is stored in the SSIS catalog. An Environment has been defined in the SSIS catalog.

You need to add the Environment to the project. Which stored procedure should you use?

  A. catalog.create_environment_variable

  B. catalog.create_environment_reference

  C. catalog.set_execution_parameter_value

  D. catalog.set_environment_variable_value

Correct Answer: B

Explanation:

Environments (such as Test and Production) are associated with projects by creating references to those environments in the projects.
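A hedged example of calling the stored procedure in answer B; the folder, project, and environment names are placeholders.

  DECLARE @reference_id bigint;

  -- Add a reference from the DailyLoad project to the Production environment
  -- in the same catalog folder ('R' = relative reference, 'A' = absolute).
  EXEC SSISDB.catalog.create_environment_reference
      @folder_name      = N'ETL',
      @project_name     = N'DailyLoad',
      @environment_name = N'Production',
      @reference_type   = 'R',
      @reference_id     = @reference_id OUTPUT;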
