Server 2003: Extremely slow logoff (Script) times?

Majestic

Hi all..

I'm managing a company's network with about 35 workstations and one server running Microsoft Server 2003.

I have about six workstations that seem to be taking an incredibly long time (10+ minutes) to finish running the logoff script.

The network is fully gigabit (Workstations and Server), and I have verified that they are running at the proper speed.

I'm still going through the profiles, but from what I can see, some of the profiles that are taking a VERY long time to log off (running the logoff script) have anywhere from 20 to 35 GB of data on the server.

The logoff script is basically an xcopy batch file with the /S /E /Y /D flags, so it copies all the directories and subdirectories, checks the date on every file, and if the file is newer on the client side it copies it to the server's /profiles/ directory.

I'm backing up favourites, the desktop, My Documents, and all email (Outlook, Outlook Express, etc.).
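
For reference, here's a minimal sketch of what a logoff script like that might look like — the server name, share, and exact folder list are placeholders, not my actual script:

@echo off
rem Hypothetical example only; SERVER and the profiles share are placeholders.
rem /S /E  copy directories and subdirectories, including empty ones
rem /Y     overwrite existing files without prompting
rem /D     copy only files whose source date is newer than the destination
rem /I     assume the destination is a directory if it doesn't exist yet
xcopy "%USERPROFILE%\Favorites" "\\SERVER\profiles\%USERNAME%\Favorites" /S /E /Y /D /I
xcopy "%USERPROFILE%\Desktop" "\\SERVER\profiles\%USERNAME%\Desktop" /S /E /Y /D /I
xcopy "%USERPROFILE%\My Documents" "\\SERVER\profiles\%USERNAME%\My Documents" /S /E /Y /D /I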

I have had profiles in the past that were fairly large but the workstations logged off quite quickly.

I'd like to know if there is a faster, more efficient way to do this, and/or what could be causing such a long logoff time.

I'm going to try installing UPHClean on the workstations with the issue ( http://support.microsoft.com/kb/837115 )

If anybody has any suggestions I would really appreciate it, as I'm not sure where to go from here. Should I just erase the profiles on the server and then re-sync the workstations to it?

It seems to me that on such a fast connection it should not be taking this kind of time.

Yes, I did take into consideration whether a lot of other workstations were logging off at the same time; I have tested the machines individually (after hours) and it still takes a very long time for the workstation(s) in question to log off.

Any help would be appreciated!

Thanks

Majestic
 
If anybody has any suggestions I would really appreciate it, as I'm not sure where to go from here. Should I just erase the profiles on the server and then re-sync the workstations to it?

If you have everything backed up, why not do that? I mean, it could be any number of things, but that sounds like a good solution to me.

P.S. BTW, just one server for that many workstations? You might want to get a second one, though that doesn't seem to be what the issue is.
 
Wow... I have a bit of an update, and I'm feeling rather "sheepish" here. One of the workstations that has been taking a very long time to log off had 320 GB on its desktop (!!). I have selected those folders and turned off folder sync. This brings me to a new question, then:

What is an appropriate amount of storage space to allow in A) a profile, and B) on the desktop itself?

I'm going to implement quotas. We're running Windows Server 2003 R2, which does have quota management built in.
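
For anyone searching later, a rough sketch of how that can be done from the command line with the File Server Resource Manager's dirquota tool on 2003 R2 — the path and limit here are assumptions, so check dirquota /? for the exact syntax on your install:

rem Hypothetical example; D:\Profiles and the 2GB limit are placeholders.
dirquota quota add /path:D:\Profiles\someuser /limit:2GB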

I'd like to hear what limits people are typically setting.

Appreciated in advance!

Majestic
 
Why are you allowing any data to be stored locally at all? Is there a reason?

Are you running roaming profiles or local ones?
 
Why not just use folder redirection and redirect their docs etc. to the server (or another server), and back that up nightly instead of using scripts to back up data? As far as profiles go, it really depends on how much storage you have available. I limit my users to 1 GB, but I have over 1,000 users. One server should be plenty for only 35 clients. Also remember that even though you have gigabit switches, that doesn't mean you are getting gigabit speeds on every port. All switches have a maximum throughput; typically, the higher-end the switch, the better the throughput.
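
As a rough sketch of the nightly-backup idea (robocopy ships with the Windows Server 2003 Resource Kit Tools; the paths and backup server name here are assumptions):

rem Hypothetical nightly job, run from Scheduled Tasks on the server.
rem /E    copy subdirectories, including empty ones
rem /ZB   restartable mode, falling back to backup mode on access errors
rem /R:1 /W:1  retry once, waiting 1 second between retries
robocopy D:\RedirectedDocs \\BACKUPSRV\nightly\Docs /E /ZB /R:1 /W:1 /LOG:C:\logs\nightly.log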
 
Why are you allowing any data to be stored locally at all? Is there a reason?

Are you running roaming profiles or local ones?

Well, this is a company that uses a lot of CAD and does quite a bit of rendering. If we had everything go to the server it would fill up VERY quickly.

There are no roaming profiles, only local.

Majestic
 
Why not just use folder redirection and redirect their docs etc. to the server (or another server), and back that up nightly instead of using scripts to back up data? As far as profiles go, it really depends on how much storage you have available. I limit my users to 1 GB, but I have over 1,000 users. One server should be plenty for only 35 clients. Also remember that even though you have gigabit switches, that doesn't mean you are getting gigabit speeds on every port. All switches have a maximum throughput; typically, the higher-end the switch, the better the throughput.

The documents are already redirected. The problem is the constantly growing graphics files on the local machines. The server gets backed up to both a NAS and 1 TB drives that are swapped every other day, but everything requires constant cleanup.

You're right about the switches, and I'm going to do a speed test / network analysis to find out the true speed. To that end, what is a good tool for that?

Thanks

Majestic
 
You may want to google "throughput" for your brand/model of switch. This will give you a baseline of what throughput you should expect to see. If it's a managed switch, there should be a way to look at the interface itself and see the input/output rates, what kind of queuing mechanism is in place (for example: FIFO), and any kind of QoS policies.

Here is a link you may find useful.

http://testlab.pcausa.com/ttcp/classic/pcattcp.htm
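
For what it's worth, a minimal sketch of how a TTCP-style test with that tool typically runs (the IP address is a placeholder; check the tool's docs for the exact options):

rem On the server (receiver):
PCATTCP -r
rem On the workstation (transmitter), pointed at the server's IP:
PCATTCP -t 192.168.1.10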

I agree with a central location to store user data. My users know that if they store something locally, it doesn't get backed up. That's why they have file shares and home directories. You may want to look at SAN storage. Not too bad to set up, and you can expand volumes on the fly.
 