It’s been a little while since I’ve released any new commands, so I decided to throw one together real quick. Fellow MVP Matthew McDermott was working on a scripted install and asked if I had anything to set the path for the ULS logs. Unfortunately I didn’t, but a quick disassembly of the Central Administration page showed that the code to do this was really simple, so I decided to throw a command together: gl-tracelog.
The code necessary to set the trace log properties is really simple – you just get the SPDiagnosticsService object via the static Local property, set the necessary properties, and call Update():
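A minimal sketch of that call sequence follows. The property names (LogLocation, LogsToKeep, LogCutInterval) are my reading of the SPDiagnosticsService API – verify them against your version of the SDK:

```csharp
using Microsoft.SharePoint.Administration;

// Get the farm-local diagnostics service and update the trace log settings.
// Property names are assumptions based on the SPDiagnosticsService API.
SPDiagnosticsService diagSvc = SPDiagnosticsService.Local;
diagSvc.LogLocation = @"c:\moss\logs"; // path must exist on every server in the farm
diagSvc.LogsToKeep = 100;              // number of log files to maintain (0-1024)
diagSvc.LogCutInterval = 30;           // minutes to write to a single file (0-1440)
diagSvc.Update();
```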
The syntax for the command is shown below:
c:\>stsadm -help gl-tracelog

stsadm -o gl-tracelog

Sets the log file location (note that the location must exist on each server), the maximum number of log files to maintain, and how long to capture events to a single file.

Parameters:
        [-logdirectory <log file location>]
        [-logfilecount <number of log files to create (0-1024)>]
        [-logfileminutes <number of minutes to use a log file (0-1440)>]
Here’s an example of how to use the command to set each property:
stsadm -o gl-tracelog -logdirectory c:\moss\logs -logfilecount 100 -logfileminutes 30
One question someone might have is why you would want to change the location of the log files. There are two reasons: performance and disk capacity. It’s often a best practice to keep your C drive dedicated to the operating system and application installs and to create another drive for your log files (ULS logs and IIS logs, for instance). Doing this allows each drive to perform with less contention and lets log files grow without the risk of bringing down the operating system (let your C drive fill up and see how long your SharePoint server continues to run – eventually you won’t even be able to log in to the machine). Joel Oleson has a good post on this that I’d highly recommend everyone read: SharePoint Disk Allocation and Disk I/O.