Quick and Effective Microsoft 70-450 Exam Preparation Options – Braindump2go new released 70-450 Exam Dumps Questions! Microsoft Official 70-450 relevant practice tests are available for Instant downloading at Braindump2go! PDF and VCE Formats, easy to use and install! 100% Success Achievement Guaranteed!
Vendor: Microsoft
Exam Code: 70-450
Exam Name: PRO: Designing, Optimizing and Maintaining a Database Administrative Solution Using Microsoft SQL Server 2008
Keywords: 70-450 Exam Dumps,70-450 Practice Tests,70-450 Practice Exams,70-450 Exam Questions,70-450 PDF,70-450 VCE Free,70-450 Book,70-450 E-Book,70-450 Study Guide,70-450 Braindump,70-450 Prep Guide
Microsoft 70-450 Dumps VCE and 70-450 Dumps PDF Download: http://www.braindump2go.com/70-450.html
QUESTION 61
You are a professional level SQL Server 2008 Database Administrator.
It is reported by the customers that the server performance degraded due to a newly implemented process. Dynamic Management Views are utilized to confirm that no long-running queries exist.
The operating system performance data should be correlated with the actual query execution trace, and the least administrative effort should be utilized.
Which action will you perform to finish the task?
A. To finish the task, Data Collector should be utilized.
B. To finish the task, the SQLdiag.exe utility should be utilized.
C. To finish the task, SQL Server Profiler and the tracerpt.exe utility should be utilized.
D. To finish the task, SQL Server Profiler and System Monitor should be utilized.
Answer: D
Explanation:
SQL Server Profiler displays data about a large number of SQL Server events.
Whereas Windows System Monitor graphically displays information about the server internals. You can merge the two sets of information and walk through a scenario viewing both perspectives using SQL Server Profiler.
To set up the dual-perspective experience, you need to simultaneously capture server performance using both Performance Monitor's counter logs and SQL Server Profiler.
The steps to do this are listed below:
1. Configure System Monitor with the exact counters you want to view later. Be sure to get the scale and everything just right. Set up the Counter Log to the exact same configuration.
2. Configure Profiler with the right set of trace events. They must include the start and end time data columns so that Profiler can integrate the two logs later. Save the trace definition and close Profiler.
3. Manually start the counter log. Run the saved Profiler trace script to start the server-side trace.
4. When the test is complete, stop both the counter log and the server-side trace.
Having to stop the trace manually is a drawback of this approach.
5. Open Profiler and open the saved trace file.
6. Use the File > Import Performance Data menu command to import the counter log.
You have the option of selecting only the important counters from the performance monitor.
There will be performance issues if you select too many counters.
http://www.sql-server-performance.com/faq/How_to_Integrate_Performance_Monitor_and_SQL_Profiler_p1.aspx
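The server-side trace mentioned in step 3 can be scripted with the sp_trace_* stored procedures (Profiler's File > Export > Script Trace Definition produces such a script). A minimal sketch, with an illustrative file path; the event and column IDs shown are the documented values for SQL:BatchCompleted, TextData, StartTime, and EndTime:

```sql
-- Create a server-side trace that writes to a file (path is hypothetical)
DECLARE @TraceID int;
DECLARE @maxfilesize bigint;
SET @maxfilesize = 100;
EXEC sp_trace_create @TraceID OUTPUT, 0, N'C:\Traces\PerfTrace', @maxfilesize;

-- Event 12 = SQL:BatchCompleted; columns 1 = TextData, 14 = StartTime, 15 = EndTime
DECLARE @on bit;
SET @on = 1;
EXEC sp_trace_setevent @TraceID, 12, 1,  @on;
EXEC sp_trace_setevent @TraceID, 12, 14, @on;
EXEC sp_trace_setevent @TraceID, 12, 15, @on;

-- Start the trace (status 1 = start, 0 = stop, 2 = close and delete the definition)
EXEC sp_trace_setstatus @TraceID, 1;
```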
QUESTION 62
You are a professional level SQL Server 2008 Database Administrator.
There is a database in the instance, and the day-to-day business of your company requires the database. When reports are executed, slow response time is experienced by users.
A performance monitoring strategy will be implemented by you so as to have three aspects of data captured and stored:
– Blocking and deadlock information
– Executed Transact-SQL statements
– Query activity and counters for disk, CPU, and memory.
You are required to utilize the least amount of administrative effort to implement the monitoring process.
Which action will you perform to finish the task?
A. To finish the task, the client-side profiler trace should be utilized.
B. To finish the task, the dynamic management views should be utilized.
C. To finish the task, the data collector should be utilized.
D. To finish the task, the System Monitor counter log trace should be utilized.
Answer: C
Explanation:
SQL Server 2008 provides a data collector that you can use to obtain and save data that is gathered from several sources. The data collector enables you to use data collection containers, which enable you to determine the scope and frequency of data collection on a SQL Server system.
The data collector provides predefined collector types that you can use for data collection.
The collector types provide the actual mechanism for collecting data and uploading it to the management data warehouse.
For this release of the data collector, the predefined collector types include the Generic T-SQL Query, Generic SQL Trace, Performance Counters, and Query Activity collector types.
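The predefined system collection sets live in msdb and can be controlled with the syscollector stored procedures; a minimal sketch, assuming the management data warehouse has already been configured:

```sql
USE msdb;

-- List the predefined collection sets and whether they are running
SELECT collection_set_id, name, is_running
FROM dbo.syscollector_collection_sets;

-- Start a collection set by its id (id values vary; look them up with the query above)
EXEC dbo.sp_syscollector_start_collection_set @collection_set_id = 1;
```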
QUESTION 63
You are a professional level SQL Server 2008 Database Administrator.
The computer on which the instance runs has the following three features:
64-GB RAM, four quad-core processors, and several independent physical RAID volumes.
A transactional database will be implemented on the instance.
In addition, the database will have a high volume of INSERT, UPDATE, and DELETE activity, including the creation of new tables.
You need to maximize disk bandwidth and decrease the contention in the storage allocation structures so as to have the performance of the database optimized.
Which action will you perform to finish the task?
A. To finish the task, database and log files should be placed on the same volume.
B. To finish the task, the affinity mask option should be configured properly.
C. To finish the task, multiple data files should be create for the database.
D. To finish the task, the affinity I/O mask option should be configured properly.
Answer: C
Explanation:
If your database is very large and very busy, multiple files can be used to increase performance. Here is one example of how you might use multiple files. Let's say you have a single table with 10 million rows that is heavily queried. If the table is in a single file, such as a single database file, then SQL Server would only use one thread to perform a read of the rows in the table. But if the table were divided into three physical files, then SQL Server would use three threads (one per physical file) to read the table, which potentially could be faster. In addition, if each file were on its own separate physical disk or disk array, the performance gain would be even greater.
Essentially, the more files that a large table is divided into, the greater the potential performance.
Of course, there is a point where the additional threads aren't of much use once you max out the server's I/O. But up until you max out the I/O, additional threads (and files) should increase performance.
http://www.sql-server-performance.com/tips/filegroups_p1.aspx
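Spreading the database across multiple data files on independent volumes can be sketched as follows; the database name, file names, sizes, and paths are all illustrative:

```sql
-- Add a second data file on an independent RAID volume
ALTER DATABASE SalesDB
ADD FILE (
    NAME = SalesDB_Data2,
    FILENAME = N'E:\SQLData\SalesDB_Data2.ndf',
    SIZE = 10GB,
    FILEGROWTH = 1GB
);

-- Add a third data file on yet another volume
ALTER DATABASE SalesDB
ADD FILE (
    NAME = SalesDB_Data3,
    FILENAME = N'F:\SQLData\SalesDB_Data3.ndf',
    SIZE = 10GB,
    FILEGROWTH = 1GB
);
```

With multiple files in a filegroup, SQL Server allocates extents round-robin across the files, which spreads both I/O and allocation-structure (GAM/SGAM/PFS) contention.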
QUESTION 64
You are a professional level SQL Server 2008 Database Administrator.
Log shipping should be implemented for several databases on three SQL Server instances.
The logs are migrated to a fourth SQL Server instance. A manual failover will be implemented.
You need to ensure that the database applications utilize the secondary server after failover.
Since you are the technical support, you are required to confirm that the latest data should be available to users.
Which actions should you perform to achieve the goal? (Choose more than one)
A. To achieve the goal, you should utilize the WITH RECOVERY option on the last log to apply
any unapplied transaction log backups in sequence to each secondary database.
B. To achieve the goal, you should redirect client computers to the secondary instance.
C. To achieve the goal, you should replicate all log shipping network shares to the secondary
instance.
D. To achieve the goal, you should utilize the WITH NORECOVERY option to back up the tail
of the transaction log of primary databases.
E. To achieve the goal, you should back up all databases on the secondary instance.
Answer: ABD
Explanation:
Log shipping consists of three operations:
1. Back up the transaction log at the primary server instance.
2. Copy the transaction log file to the secondary server instance.
3. Restore the log backup on the secondary server instance.
The log can be shipped to multiple secondary server instances.
In such cases, operations 2 and 3 are duplicated for each secondary server instance.
A log shipping configuration does not automatically fail over from the primary server to the secondary server. If the primary database becomes unavailable, any of the secondary databases can be brought online manually.
To make the target the new source database you have to:
1. Back up the tail of the transaction log in order to capture the latest transactions and put the primary database into a restoring state:
BACKUP LOG [AdventureWorks]
TO DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\Backup\Adv.trn'
WITH NO_TRUNCATE,
NOFORMAT,
NOINIT,
NAME = N'AdventureWorks-Transaction Log Backup',
SKIP,
NOREWIND,
NOUNLOAD,
NORECOVERY,
STATS = 10
2. After having copied all the transaction log backups, restore them in order and, for the latest, use the WITH RECOVERY option.
3. Redirect all the clients to the new source database.
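The restore portion of the failover (step 2) can be sketched as follows on the secondary instance; the backup file names are illustrative:

```sql
-- Apply any remaining log backups in sequence, leaving the database restoring
RESTORE LOG AdventureWorks
FROM DISK = N'C:\LogShip\Adv_log1.trn'
WITH NORECOVERY;

-- Apply the final backup (the tail of the log) WITH RECOVERY
-- to roll forward the last transactions and bring the database online
RESTORE LOG AdventureWorks
FROM DISK = N'C:\LogShip\Adv_tail.trn'
WITH RECOVERY;
```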
QUESTION 65
You are a professional level SQL Server 2008 Database Administrator.
After a regular test, you find that performance degradation is experienced by an instance for three reasons:
excessive CPU usage, server process paging, and deadlocks.
A monitoring solution should be implemented to provide data, monitor and troubleshoot performance issues and detailed deadlock information should be contained in the provided data.
You should utilize the least amount of administrative effort to finish the task.
Which tool will you utilize to finish the task?
A. To finish the task, you should utilize Resource Governor.
B. To finish the task, you should utilize Database Engine Tuning Advisor.
C. To finish the task, you should utilize Extended Events.
D. To finish the task, you should utilize Performance Monitor (SYSMON).
Answer: C
Explanation:
Introducing SQL Server Extended Events
SQL Server Extended Events (Extended Events) is a general event-handling system for server systems. The Extended Events infrastructure supports the correlation of data from SQL Server, and under certain conditions, the correlation of data from the operating system and database applications. In the latter case, Extended Events output must be directed to Event Tracing for Windows (ETW) in order to correlate the event data with operating system or application event data. All applications have execution points that are useful both inside and outside an application.
Inside the application, asynchronous processing may be enqueued using information that is gathered during the initial execution of a task. Outside the application, execution points provide monitoring utilities with information about the behavioral and performance characteristics of the monitored application.
Extended Events supports using event data outside a process. This data is typically used by:
– Tracing tools, such as SQL Trace and System Monitor.
– Logging tools, such as the Windows event log or the SQL Server error log.
– Users administering a product or developing applications on a product.
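A deadlock-focused event session can be sketched as follows; the session name is arbitrary, and note that in SQL Server 2008 the built-in system_health session already captures xml_deadlock_report events:

```sql
-- Capture deadlock graphs into an in-memory ring buffer
CREATE EVENT SESSION DeadlockMonitor ON SERVER
ADD EVENT sqlserver.xml_deadlock_report
ADD TARGET package0.ring_buffer;

ALTER EVENT SESSION DeadlockMonitor ON SERVER STATE = START;

-- Later, read the captured deadlock XML from the target
SELECT CAST(t.target_data AS xml) AS deadlock_data
FROM sys.dm_xe_session_targets AS t
JOIN sys.dm_xe_sessions AS s
  ON s.address = t.event_session_address
WHERE s.name = N'DeadlockMonitor';
```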
QUESTION 66
You are a professional level SQL Server 2008 Database Administrator.
There are 30 branch offices in DoubleSecurity Insurance, and in the branch offices, customer data are stored in SQL Server 2008 databases.
Customer data stored across multiple database instances must remain security compliant.
You intend to utilize the Policy-Based Management feature to design a strategy for custom policies, which are in XML format.
The requirements listed below should be satisfied.
The company distributes custom policies to all instances.
In addition, the company enforces the policies on all instances. A strategy should be thought out and the minimum amount of administrative effort should be utilized.
Which action should you perform to finish the task?
A. To finish the task, the Distributed File System Replication service should be utilized.
B. To finish the task, a configuration server should be utilized.
C. To finish the task, the policies should be distributed by utilizing Group Policy Objects.
D. To finish the task, the policies should be distributed by utilizing the Active Directory directory
service.
Answer: B
Explanation:
Configuration Server or Central Management Server
In SQL Server 2008, you can designate an instance of SQL Server as a Central Management Server. Central Management Servers store a list of instances of SQL Server that is organized into one or more Central Management Server groups. Actions that are taken by using a Central Management Server group act on all servers in the server group. This includes connecting to servers by using Object Explorer and executing Transact-SQL statements and Policy-Based Management policies on multiple servers at the same time. All Central Management Servers and subordinate servers must be registered by using Windows Authentication. Versions of SQL Server that are earlier than SQL Server 2008 cannot be designated as a Central Management Server.
QUESTION 67
You are a professional level SQL Server 2008 Database Administrator.
All data changes are implemented through stored procedures, and only the INSERT, UPDATE,
or DELETE statements are utilized by the procedures.
A backup strategy should be implemented.
The business requirements listed below should be satisfied:
– Point-in-time recovery for failure is supported by the backup strategy at any time of day.
– The least amount of disk space should be utilized by the transaction log.
Which action should you perform to finish the task?
A. To finish the task, hourly database snapshots should be utilized.
B. To finish the task, the full-recovery model along with transaction log backups should be
utilized.
C. To finish the task, the full-recovery model along with differential backups should be utilized.
D. To finish the task, the simple-recovery model along with differential backups should be
utilized.
Answer: B
Explanation:
Description
– Requires log backups.
– No work is lost due to a lost or damaged data file.
– Can recover to an arbitrary point in time (for example, prior to application or user error).
Work loss exposure
– Normally none.
– If the tail of the log is damaged, changes since the most recent log backup must be redone.
Recover to point in time
– Can recover to a specific point in time, assuming that your backups are complete up to that point in time.
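The strategy in answer B can be sketched as follows; the database name, paths, and STOPAT time are illustrative. Frequent log backups are what keep the transaction log small, because each log backup truncates the inactive portion of the log:

```sql
-- Ensure the database uses the full recovery model
ALTER DATABASE SalesDB SET RECOVERY FULL;

-- Periodic full backup plus frequent transaction log backups
BACKUP DATABASE SalesDB TO DISK = N'E:\Backup\SalesDB_full.bak';
BACKUP LOG SalesDB TO DISK = N'E:\Backup\SalesDB_log1.trn';

-- Point-in-time recovery: restore the full backup without recovery,
-- then roll the log forward to the desired moment
RESTORE DATABASE SalesDB
FROM DISK = N'E:\Backup\SalesDB_full.bak'
WITH NORECOVERY, REPLACE;

RESTORE LOG SalesDB
FROM DISK = N'E:\Backup\SalesDB_log1.trn'
WITH STOPAT = N'2009-06-01T14:30:00', RECOVERY;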
QUESTION 68
You are a professional level SQL Server 2008 Database Administrator.
You are experienced in managing databases in an enterprise-level organization, optimizing and sustaining the database life cycle.
In the company, your job is to implement solutions on security, troubleshooting, deployment and optimization. A SQL Server 2008 infrastructure is managed by you.
A database is utilized by the instance, and the database is utilized by a Web-based application. 15,000 transactions are processed by the application every minute.
A column is contained by a table in the database, and the column is utilized only by the application. Sensitive data is stored in this column.
The sensitive data should be stored with the highest security level.
In addition, the least amount of memory space and processor time should be utilized.
From the following four encryption types, which one should you utilize?
A. Asymmetric key encryption should be utilized.
B. Certificate-based encryption should be utilized.
C. Symmetric key encryption should be utilized.
D. Transparent data encryption should be utilized.
Answer: C
Explanation:
At the root of the encryption tree is the Windows Data Protection API (DPAPI), which secures the key hierarchy at the machine level and is used to protect the service master key (SMK) for the database server instance. The SMK protects the database master key (DMK), which is stored at the user database level and which in turn protects certificates and asymmetric keys.
These in turn protect symmetric keys, which protect the data. TDE uses a similar hierarchy down to the certificate. The primary difference is that when you use TDE, the DMK and certificate must be stored in the master database rather than in the user database. A new key, used only for TDE and referred to as the database encryption key (DEK), is created and stored in the user database. This hierarchy enables the server to automatically open keys and decrypt data in both cell-level and database-level encryption. The important distinction is that when cell-level encryption is used, all keys from the DMK down can be protected by a password instead of by another key.
This breaks the decryption chain and forces the user to input a password to access data.
In TDE, the entire chain from DPAPI down to the DEK must be maintained so that the server can automatically provide access to files protected by TDE. In both cell-level encryption and TDE, encryption and decryption through these keys is provided by the Windows Cryptographic API (CAPI). A symmetric key uses the same key to encrypt and decrypt data, so it consumes less space and processor time than an asymmetric key, which requires both a private and a public key.
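Cell-level encryption with a symmetric key can be sketched as follows; all object names, the table, and the password are hypothetical:

```sql
-- Key hierarchy: DMK -> certificate -> symmetric key
CREATE MASTER KEY ENCRYPTION BY PASSWORD = N'Str0ng!Passw0rd';
CREATE CERTIFICATE CardCert WITH SUBJECT = N'Protects the card-number key';
CREATE SYMMETRIC KEY CardKey
    WITH ALGORITHM = AES_256
    ENCRYPTION BY CERTIFICATE CardCert;

-- Encrypt the sensitive column
OPEN SYMMETRIC KEY CardKey DECRYPTION BY CERTIFICATE CardCert;
UPDATE dbo.Customers
SET CardNumberEnc = EncryptByKey(Key_GUID(N'CardKey'), CardNumber);

-- Decrypt when the application reads the data
SELECT CONVERT(nvarchar(25), DecryptByKey(CardNumberEnc)) AS CardNumber
FROM dbo.Customers;
CLOSE SYMMETRIC KEY CardKey;
```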
QUESTION 69
You are a professional level SQL Server 2008 Database Administrator.
A new database application is hosted by the instance.
The security requirements should be designed for the application.
A unique login to the SQL Server 2008 server is assigned to each application user.
The application database includes stored procedures that execute stored procedures in the MSDB database; those MSDB stored procedures schedule SQL Server Agent jobs.
Since you are the technical support, you are required to confirm that the stored procedures in the MSDB database should be executed by utilizing the security context of the application user.
Which action should you perform?
A. Each user should be added to the public role in the MSDB database.
B. Each user should be added to the db_dtsltduser database role in the MSDB database.
C. The MSDB database should be set to utilize the TRUSTWORTHY option, and then each
user should be added to the MSDB database.
D. The new database should be set to utilize the TRUSTWORTHY option, and then each
user should be added to the MSDB database.
Answer: D
Explanation:
The TRUSTWORTHY database property is used to indicate whether the instance of SQL Server trusts the database and the contents within it. By default, this setting is OFF, but it can be set to ON by using the ALTER DATABASE statement. For example:
ALTER DATABASE AdventureWorks2008R2 SET TRUSTWORTHY ON;
By default msdb has the option TRUSTWORTHY set to True.
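Answer D can be sketched as follows; the database name and login names are hypothetical:

```sql
-- Mark the application database as trusted so cross-database calls made by
-- its stored procedures carry the caller's security context into msdb
ALTER DATABASE AppDB SET TRUSTWORTHY ON;

-- Map each application login to a user in msdb
USE msdb;
CREATE USER [CONTOSO\AppUser1] FOR LOGIN [CONTOSO\AppUser1];
```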
QUESTION 70
You are a professional level SQL Server 2008 Database Administrator.
A maintenance strategy should be designed for a mission-critical database, and a large table named Orders is contained by the database.
Index maintenance operations are contained in the design plan.
When you design the strategy, the facts listed below should be taken into consideration.
– First, the users continuously access the Orders table in the database.
– Secondly, a column of the xml data type is contained by the Orders table.
– Thirdly, the new rows are regularly added to the Orders table.
– Fourthly, the average fragmentation for the clustered index of the Orders table is no more than 2 percent.
A strategy should be designed to have the performance of the queries on the table optimized.
Which action will you perform?
A. The clustered index of the Orders table should be dropped.
B. The clustered index of the Orders table offline should be rebuilt once a month.
C. The clustered index of the Orders table should be excluded from scheduled reorganizing or rebuilding operations.
D. The clustered index of the Orders table should be reorganized by reducing the fill factor.
Answer: C
Explanation:
Because the users continuously access the database, the clustered index cannot be made unavailable: the leaf pages of the clustered index contain the table data, so dropping or rebuilding it offline would block access.
Furthermore, the clustered index never has fragmentation of more than 2 percent, which means it does not need to be reorganized or rebuilt.
With this, you can be sure that answers A, B, and D are wrong.
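Fragmentation can be checked before deciding whether any maintenance is needed; a sketch assuming the dbo.Orders table from the question:

```sql
-- Report average fragmentation for each index on dbo.Orders
SELECT i.name AS index_name,
       s.index_id,
       s.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID(N'dbo.Orders'),
                                    NULL, NULL, 'LIMITED') AS s
JOIN sys.indexes AS i
  ON i.object_id = s.object_id AND i.index_id = s.index_id;
```

A common rule of thumb is to reorganize between roughly 5 and 30 percent fragmentation and rebuild above that; at 2 percent, neither is warranted.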