
SQL Server Log File Viewer


Method 1:

With SQL Server 2005 the rules have changed significantly.  Management Studio ships with the Log File Viewer, which gives DBAs and developers insight into the SQL Server 2005 logs and the Windows Event Logs (Application, Security and System) in a single interface.  Each of the needed logs can be selected, with the ability to filter and export the data as needed.  In this tip we will answer common questions for those of you that are new to the tool, corresponding to its primary functionality.
How do I access the Log File Viewer?
  1. Launch SQL Server Management Studio.
  2. Navigate to your SQL Server 2005 instance.
  3. Expand the instance, then expand the Management folder followed by the SQL Server Logs folder.
  4. Double click on any of the log files.
  5. The default Log File Viewer interface will be loaded.
What are the available data sources for the Log File Viewer?
  • Database Mail
  • SQL Server Agent Logs
  • SQL Server Error Logs
  • Windows Event Logs (Application, Security and System)
How can I filter the data?
Just click on the 'Filter' button, which will load the filter interface.  There it is possible to fine-tune the data being reviewed.  At the highest level, the filtering can be based on the start and end date/time stamp; at the lowest level, you can look for an exact error message.
How do I search for an error message?
Just click on the 'Search' button, which will load the search interface, where you can search the results for a specific error message.
How can I export data?
Click the 'Export' button, which will save the current contents of the Log File Viewer to a *.log, *.csv or *.txt file.  This file can be reviewed later by clicking the 'Load Log' icon and viewing its contents in the Log File Viewer.
Does the Log File Viewer work with a SQL Server 2000 instance that I have registered with Management Studio (SQL Server 2005)?
Yes, the compatible SQL Server 2000 logs, such as SQL Server Agent Logs, SQL Server Error Logs and Windows Event Logs, can be viewed. You can consider the Log File Viewer one of those valuable features that Management Studio (SQL Server 2005) offers over Enterprise Manager (SQL Server 2000), so start using it today!
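For those who prefer T-SQL over the graphical tool, similar error log data can be queried with the undocumented xp_readerrorlog procedure. Since the procedure is undocumented, the parameter meanings below reflect common usage rather than an official reference, so treat this as a sketch:

```sql
-- Undocumented procedure; parameters as commonly understood:
--   1st: log number (0 = current log, 1 = .1 archive, and so on)
--   2nd: log type (1 = SQL Server error log, 2 = SQL Server Agent log)
--   3rd, 4th: optional search strings used to filter the output
EXEC master.dbo.xp_readerrorlog 0, 1, N'error', NULL;
```

This returns only the error log entries containing the string 'error', roughly mirroring what the Log File Viewer's Filter and Search buttons do.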

Method 2:
SQL Server transaction logs contain records describing changes made to a database. They store enough information to recover the database to a specific point in time, or to replay or undo a change. But how do you see what’s in them, find a specific transaction, see what has happened, and revert changes such as recovering accidentally deleted records?
Seeing what is stored in an online transaction log, or in a transaction log backup, is not so simple.
Opening LDF and TRN files in a binary editor shows unintelligible records, so these clearly cannot be read directly. For instance, this is an excerpt from an LDF file:
Opening LDF and TRN files in a binary editor

Use fn_dblog

fn_dblog is an undocumented SQL Server function that reads the active portion of an online transaction log.
Let’s look at the steps you have to take and the way the results are presented:
  1. Run the fn_dblog function
  2. SELECT * FROM sys.fn_dblog (NULL, NULL);
    Results set returned by fn_dblog function

    As the function returns 129 columns, returning only the specific ones you need is recommended, as well as narrowing down the results to a specific transaction type, if applicable
  3. From the results set returned by fn_dblog, find the transactions you want to see
  4. To see transactions for inserted rows, run:
    SELECT [Current LSN],
           Operation,
           Context,
           [Transaction ID],
           [Begin Time]
      FROM sys.fn_dblog (NULL, NULL)
     WHERE Operation = 'LOP_INSERT_ROWS';
    Transactions for inserted rows
    To see transactions for deleted records, run:
    SELECT [Begin Time],
           [RowLog Contents 1],
           [Transaction Name],
           Operation
      FROM sys.fn_dblog (NULL, NULL)
     WHERE Operation = 'LOP_DELETE_ROWS';
    Transactions for deleted rows

  5. Find the column that stores the inserted or deleted value – check the RowLog Contents 0, RowLog Contents 1, RowLog Contents 2, RowLog Contents 3, RowLog Contents 4, Description, and Log Record columns
  6. Row data is stored in different columns for different operation types. To see exactly what you need using the fn_dblog function, you have to know the column content for each transaction type. As there’s no official documentation for this function, this is not so easy
    The inserted and deleted rows are displayed as hexadecimal values. To break them into fields you have to know the row format used, understand the status bits, know the total number of columns, and so on
  7. Convert the binary data into table data, taking into account each table column's data type. Note that the conversion mechanisms differ for different data types
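As a hypothetical illustration of steps 5–7, suppose a row was deleted from a table whose only column is CHAR(5). In a data row, the fixed-length column data starts at byte 5 (after the two status bytes and the two-byte offset to the column-count field), so the deleted value could be carved out of RowLog Contents 0 like this (the table layout and the byte offsets are assumptions for this single-column example; real tables need the full row-format analysis described above):

```sql
SELECT [Current LSN],
       [Transaction ID],
       -- bytes 5-9 of the row image hold the CHAR(5) column in this layout
       CAST(SUBSTRING([RowLog Contents 0], 5, 5) AS CHAR(5)) AS DeletedValue
  FROM sys.fn_dblog (NULL, NULL)
 WHERE Operation = 'LOP_DELETE_ROWS';
```

Variable-length columns, NULL bitmaps, and multi-column rows all shift these offsets, which is exactly why step 6 warns that this is not so easy.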
fn_dblog is a great, powerful, and free function, but it does have a few limitations: reading log records for object structure changes is complex, as it usually involves reconstructing the state of several system tables; only the active portion of an online transaction log is read; and there’s no UPDATE/BLOB reconstruction
As the UPDATE operation is minimally logged, with no old or new values – just what was changed in the record (e.g. SQL Server may log that “G” was changed to “F”, when actually the value “GLOAT” was changed into “FLOAT”) – you have to manually reconstruct the state prior to the update, which involves reconstructing all the intermediary states between the row’s original insertion into the page and the update you are trying to reconstruct
When a BLOB is deleted, its value is not written to the transaction log, so just reading the DELETE log record cannot bring the BLOB back. Only if there is an INSERT log record for the deleted BLOB, and you manage to pair these two, will you be able to recover a deleted BLOB from a transaction log using fn_dblog

Use fn_dump_dblog

To read transaction log backups – native or natively compressed – even without the online database, use the fn_dump_dblog function. Again, this function is undocumented
  1. Run the fn_dump_dblog function on a specific transaction log backup. Note that you have to supply all of the function's parameters – the first five explicitly and DEFAULT for the remaining 63
  2. SELECT *
    FROM fn_dump_dblog
    (NULL,NULL,N'DISK',1,N'E:\ApexSQL\backups\AdventureWorks2012_05222013.trn', 
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT, 
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT, 
    DEFAULT);
    fn_dump_dblog function output
    The same as with fn_dblog, 129 columns are returned, so returning only the specific ones is recommended
    SELECT [Current LSN], 
           Operation, 
           Context, 
           [Transaction ID], 
           [Transaction Name],
           Description
    FROM fn_dump_dblog
    (NULL,NULL,N'DISK',1,N'E:\ApexSQL\backups\AdventureWorks2012_05222013.trn', 
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT, 
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,
    DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT,DEFAULT, 
    DEFAULT);
    Again, you have to decipher hex values to get the information you’re looking for
    Returning specific columns using fn_dump_dblog function
    And you are back to square one, as with fn_dblog – you need to reconstruct all row values manually, reconstruct entire state chains for UPDATE operations and BLOB values, and so on
    If you don’t want to actually extract transactions from the transaction log backup, but rather to restore the database to a point in time before a specific operation occurred, you can:
  3. Determine the LSN for this transaction
  4. Convert the LSN into the format used in the WITH STOPBEFOREMARK = '<mark_name>' clause, e.g. 00000070:00000011:0001 should be transformed into 112000000001700001 (each hex part converted to decimal – 0x70 = 112, 0x11 = 17, 0x0001 = 1 – with the second part padded to 10 digits and the third to 5)
  5. Restore the full log backup chain until you reach the time when the transaction occurred. Use the WITH STOPBEFOREMARK = '<mark_name>' clause to specify the referenced transaction LSN
    RESTORE LOG AdventureWorks2012
    FROM
        DISK = N'E:\ApexSQL\backups\AW2012_05232013.trn'
    WITH
        STOPBEFOREMARK = 'lsn:112000000001700001',
        NORECOVERY;
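The LSN conversion in step 4 can be done in T-SQL itself – each colon-separated hex part is converted to decimal, with the second and third parts zero-padded to 10 and 5 digits. A small sketch for the example LSN above:

```sql
-- Convert LSN 00000070:00000011:0001 into the decimal STOPBEFOREMARK format
DECLARE @vlf  INT = CONVERT(INT, 0x00000070),  -- 112
        @blk  INT = CONVERT(INT, 0x00000011),  -- 17
        @slot INT = CONVERT(INT, 0x0001);      -- 1
SELECT 'lsn:' + CAST(@vlf AS VARCHAR(10))
     + RIGHT('0000000000' + CAST(@blk AS VARCHAR(10)), 10)
     + RIGHT('00000' + CAST(@slot AS VARCHAR(5)), 5) AS StopBeforeMark;
-- returns lsn:112000000001700001
```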

Use DBCC PAGE

Another useful, but again undocumented, command is DBCC PAGE. Use it to read the contents of database online files – MDF and LDF. The syntax is:
DBCC PAGE ( {'dbname' | dbid}, filenum, pagenum [, printopt={0|1|2|3} ])
To dump the first page in the AdventureWorks2012 database online transaction log file, use:
SELECT FILE_ID ('AdventureWorks2012_Log') AS 'File ID' 
-- to determine Log file ID = 2
DBCC PAGE (AdventureWorks2012, 2, 0, 2)
You’ll get
DBCC execution completed. If DBCC printed error messages, 
contact your system administrator.
By default, the output is not displayed. If you want the output in SQL Server Management Studio, turn on trace flag 3604 first:
DBCC TRACEON (3604, -1)
And then re-run
DBCC PAGE (AdventureWorks2012, 2, 0, 2)
You’ll get a bunch of errors and a bad page header, but you can ignore all that. At the end you’ll get a glorious hexadecimal dump of the online LDF file:
Hexadecimal output from the online LDF file
This is not the friendliest presentation of your database data, and is basically no different from viewing it in a hex editor (just less comfortable), though at least you get access to the online data