
Friday, March 30, 2012

JOB With SSIS Step Fails If Package Contains A Script Task

I have a Job Step defined to execute an SSIS Package. This SSIS package contains a Script Task. The Job fails with the message "Package execution failed. The step failed."

I am logging events in the package, and when the package gets to the Script Task the log reports "The script files failed to load". If I disable the Script Task from the package it executes fine.

Curiously, the package runs successfully with the Script Task enabled using dtexecui and dtexec from the command line.

Only if I include the Package in a job step with the Script Task enabled does it fail.

Any help would be appreciated.

|||

I've no real idea about this, Steve, except to say: have you got the script code pre-compiled? If so, try it without (and vice versa).

-Jamie

|||

Sorry for not posting this sooner, Jamie... Yes, setting the Script Task "PrecompileScriptIntoBinaryCode" Property to True resolved the issue.

-Steve

|||

Hey - did you ever get this to work? I'm having the same issue - it's an SSIS package that ran on one box and I moved to another box (64-bit, if that makes a difference) - any info is greatly appreciated - thanks.

|||

Juantana,

Yes, set your Script Task "PrecompileScriptIntoBinaryCode" Property=True and then open and close your script. Save, redeploy and it should work. Let me know if you have any questions.

-Steve

|||

Hi,

In spite of setting the PrecompileScriptIntoBinaryCode Property = True, it is not working. Do you have any suggestions?

Thanks.

|||

After setting Precompile to True, you need to open and close the script editor(s) to actually precompile the script in the task. Then re-deploy the package to the target system.

|||

Also remove all the break points in the code; having break points prevents recompilation. I guess this issue will be fixed by SP2.

|||

Just an FYI: I had all my scripts set to PrecompileScriptIntoBinaryCode=True, however when developing in 32-bit and deploying to 64-bit I would seemingly randomly get the "script failed to load" error. I just open the script, compile it again and save, and that usually does the trick.

|||

Also, if the problem still persists and you keep getting the same error, I suspect you are not taking the .dtsx file from the bin folder to execute. The file created by the designer (Business Intelligence Development Studio) is only useful for debug and development mode. After building the solution, one should use the .dtsx created in the bin folder. This solution worked for me.

Thanks

Mohit

|||

NOTE: I got this same error message with the Precompile option set to True.

The problem was that a variable name the script used was not passed in. Go figure.

|||

I have about 30 .dtsx packages whose scripts I would like to recompile to solve this issue. I would like to avoid opening each package, then opening each script in the package and doing the save to recompile.

Is there a way to recompile from the command line?

Thanks!

|||

Previously I said "randomly" but I think I have it figured out when this error occurs:

If I am working on a package, open a different package, copy a script object and paste it into the one I am working on, it will not work in 64-bit without a recompile (but will work in 32-bit).

|||

Chris Honcoop wrote:

Previously I said "randomly" but I think I have it figured out when this error occurs:

If I am working on a package, open a different package, copy a script object and paste it into the one I am working on, it will not work in 64-bit without a recompile (but will work in 32-bit).

Sounds like it might be a bug. Could you submit it at http://connect.microsoft.com?

-Jamie

|||

I too have a similar doubt to Paulino's. I've 50+ packages with 60+ Script Tasks. Is there any way I can compile them using a command at the command prompt? Any help would be highly appreciated.

My Regards
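There is no documented command-line switch for recompiling Script Tasks, but a small script can at least narrow the work to the packages that actually need attention. A minimal Python sketch follows; note the "BinaryItem" tag is an assumption about how SSIS 2005 stores precompiled script binaries in the .dtsx XML, so verify it against a package you know is precompiled before relying on the results:

```python
# Sketch: flag .dtsx packages whose XML contains no embedded precompiled
# code, so only those need to be reopened and recompiled in the designer.
# ASSUMPTION: the "BinaryItem" tag is a guess at the SSIS 2005 package
# XML layout; check it against a known-good precompiled package first.
import glob
import xml.etree.ElementTree as ET

def missing_precompiled_code(dtsx_path, binary_tag="BinaryItem"):
    """True if no element tag in the package XML ends with binary_tag."""
    root = ET.parse(dtsx_path).getroot()
    return not any(el.tag.endswith(binary_tag) for el in root.iter())

for path in glob.glob(r"C:\SSIS\*.dtsx"):
    if missing_precompiled_code(path):
        print("needs recompile:", path)
```

The folder path above is only a placeholder; point the glob at wherever your deployed packages live.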

Job to Execute SSIS package fails

This question has been asked earlier in this forum, but I still haven't found the resolution to my problem.

I am trying to run the DTS package from my filesystem.

Command Line:
/FILE "C:\SSIS\IS\bin\Package1.dtsx" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF

Executed as user: FILESERVER\SQLServiceQA. The command line parameters are invalid. The step failed.

Can anyone please let me know what's the problem with it?

I am not using any Script Task in the package.

Thanks.

Does it run if you execute the same command line as the same user using DTExec? If so, you should probably post to the Agent forum. If not, what errors do you get when running it from DTExec?

Thanks,
Matt
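Matt's suggestion above can be scripted: a small wrapper that runs a command line and captures the exit code and combined output makes it easy to compare an interactive DTExec run with what the Agent job reports for the same arguments. A hedged Python sketch (the dtexec path in the usage note is an assumption; substitute the one on your server):

```python
# Sketch: run a command line and capture its exit code plus combined
# stdout/stderr, so an interactive DTExec run can be compared with
# what the Agent job reports for the same arguments.
import subprocess

def run_capture(args):
    """Run args; return (exit_code, combined stdout+stderr text)."""
    proc = subprocess.run(args, capture_output=True, text=True)
    return proc.returncode, proc.stdout + proc.stderr
```

For example, `run_capture([r"C:\Program Files\Microsoft SQL Server\90\DTS\Binn\dtexec.exe", "/FILE", r"C:\SSIS\IS\bin\Package1.dtsx"])` should surface the actual parameter error in the captured text rather than the Agent's generic "The step failed."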

Wednesday, March 28, 2012

Job succeeds manually but fails if scheduled

My client has a number of jobs that are run overnight. We've set them
up to email me when they're completed. Every morning I get in to a
bunch of emails like this:

<quote>
JOB RUN:'Tech Pubs Email Notification' was run on 18/03/2006 at
00:00:00
DURATION:0 hours, 0 minutes, 0 seconds
STATUS: Succeeded
MESSAGES:The job succeeded. The Job was invoked by Schedule 10 (Send
Mail). The last step to run was step 1 (Send Mail).
</quote>
However, the most important job - the database backup - fails every
time.

<quote>
JOB RUN:'DB Backup Job for DB Maintenance Plan 'DB Maintenance Plan1''
was run on 20/03/2006 at 18:00:00
DURATION:0 hours, 0 minutes, 2 seconds
STATUS: Failed
MESSAGES:The job failed. The Job was invoked by Schedule 7 (Schedule
1). The last step to run was step 1 (Step 1).
</quote>
What's strange is that the job runs successfully if you kick it off
manually (in EM: right-click and "Start Job")!!! Does anyone have any
idea of why that might be? Where to look for diagnostic information?

TIA
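On the question of where to look for diagnostic information: the Agent's job history usually carries a fuller message than the notification email. The sketch below builds a T-SQL query over the msdb system tables (msdb.dbo.sysjobs and msdb.dbo.sysjobhistory are standard SQL Server tables); run the generated text in your query tool of choice:

```python
# Sketch: build a T-SQL query over the msdb system tables that lists
# per-step history rows for a named job; the message column usually
# holds more detail than the job-completion email does.
def job_history_query(job_name):
    safe = job_name.replace("'", "''")  # escape embedded single quotes
    return (
        "SELECT h.run_date, h.run_time, h.step_id, h.step_name, "
        "h.run_status, h.message\n"
        "FROM msdb.dbo.sysjobhistory h\n"
        "JOIN msdb.dbo.sysjobs j ON j.job_id = h.job_id\n"
        f"WHERE j.name = '{safe}'\n"
        "ORDER BY h.instance_id DESC"
    )

print(job_history_query(
    "DB Backup Job for DB Maintenance Plan 'DB Maintenance Plan1'"))
```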

Edward

teddysnips@.hotmail.com wrote:
> My client has a number of jobs that are run overnight. We've set them
> up to email me when they're completed. Every morning I get in to a
> bunch of emails like this:
> <quote>
> JOB RUN:'Tech Pubs Email Notification' was run on 18/03/2006 at
> 00:00:00
> DURATION:0 hours, 0 minutes, 0 seconds
> STATUS: Succeeded
> MESSAGES:The job succeeded. The Job was invoked by Schedule 10 (Send
> Mail). The last step to run was step 1 (Send Mail).
> </quote>
> However, the most important job - the database backup - fails every
> time.
> <quote>
> JOB RUN:'DB Backup Job for DB Maintenance Plan 'DB Maintenance Plan1''
> was run on 20/03/2006 at 18:00:00
> DURATION:0 hours, 0 minutes, 2 seconds
> STATUS: Failed
> MESSAGES:The job failed. The Job was invoked by Schedule 7 (Schedule
> 1). The last step to run was step 1 (Step 1).
> </quote>
> What's strange is that the job runs successfully if you kick it off
> manually (in EM: right-click and "Start Job")!!! Does anyone have any
> idea of why that might be? Where to look for diagnostic information?
> TIA
> Edward

Edward,

Have you got the right permissions?

This might sound a bit obvious but it's usually the case. You might not
be the owner of the package and the system will only schedule the job
to run if the permissions are correct.

Bryan|||Bryan wrote:
> teddysnips@.hotmail.com wrote:
[...]
>
> Edward,
> Have you got the right permissions?
> This might sound a bit obvious but its usually the case. You might not
> be the owner of the package and the system will only schedule the job
> to run if the permissions are correct.
> Bryan

I would have assumed so, but I'll check. Thanks for the suggestion.

Edward|||Bryan wrote:
[...]
> Edward,
> Have you got the right permissions?
> This might sound a bit obvious but its usually the case. You might not
> be the owner of the package and the system will only schedule the job
> to run if the permissions are correct.

I checked on this today. All the jobs that run correctly have exactly
the same owners/permissions as the job that fails. I can't find any
attribute of the job (apart from what it actually does, obviously) that
distinguishes it from any of the other, successful jobs.

One other thing of which I was not aware. Apparently this job had been
scheduled successfully up until about a week ago, when it began
failing.

Any further thoughts/ideas?

TIA

Edward|||Hi Edward,

To be honest I don't (I hate admitting defeat though...)

I have encountered this problem in the past especially when picking up
from any previous owners of packages.
One solution I have done is to copy the code from one package and dump
it into a package that is being allowed to run but with a different
owner.
This usually confirmed to me that the problem must be an Owner /
Permission issue.

Apart from that I'm at a loss to recommend any other course of action.|||(teddysnips@.hotmail.com) writes:
> I checked on this today. All the jobs that run correctly have exactly
> the same owners/permissions as the job that fails. I can't find any
> attribute of the job (apart from what it actually does, obviously) that
> distinguishes it from any of the other, successful jobs.
> One other thing of which I was not aware. Apparently this job had been
> scheduled successfully up until about a week ago, when it began
> failing.
> Any further thoughts/ideas?

Check View History for the job. Don't miss to check View Step History.

Unfortunately, jobs that are set up from a maintenance plan do not seem to
write very much useful information, so I'm not really expecting this to
give you anything. (But check nevertheless.) I think the maintenance plan has
its own log somewhere, but I don't remember where - or if there was anything
useful in it.

Maybe you should scrap the plan, and set up the jobs without it.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx|||<teddysnips@.hotmail.com> wrote in message
news:1142932717.825816.151430@.z34g2000cwc.googlegroups.com...
> My client has a number of jobs that are run overnight. We've set them
> up to email me when they're completed. Every morning I get in to a
> bunch of emails like this:
> <quote>
> JOB RUN: 'Tech Pubs Email Notification' was run on 18/03/2006 at
> 00:00:00
> DURATION: 0 hours, 0 minutes, 0 seconds
> STATUS: Succeeded
> MESSAGES: The job succeeded. The Job was invoked by Schedule 10 (Send
> Mail). The last step to run was step 1 (Send Mail).
> </quote>
> However, the most important job - the database backup - fails every
> time.
> <quote>
> JOB RUN: 'DB Backup Job for DB Maintenance Plan 'DB Maintenance Plan1''
> was run on 20/03/2006 at 18:00:00
> DURATION: 0 hours, 0 minutes, 2 seconds
> STATUS: Failed
> MESSAGES: The job failed. The Job was invoked by Schedule 7 (Schedule
> 1). The last step to run was step 1 (Step 1).
> </quote>
> What's strange is that the job runs successfully if you kick it off
> manually (in EM: right-click and "Start Job")!!! Does anyone have any
> idea of why that might be? Where to look for diagnostic information?
> TIA
> Edward

If it isn't the owner of the job step (sa would be good if sa owns the
database), then check for proper disk
space in the backup destination.
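As a quick way to act on the disk-space suggestion above, free space per drive can be checked from T-SQL. A hedged sketch using xp_fixeddrives, which is undocumented but has long shipped with SQL Server:

```sql
-- Hedged sketch: xp_fixeddrives lists each fixed drive and its free
-- space in MB - a quick way to rule out a full backup destination.
-- It is undocumented, so treat the output as informational only.
EXEC master..xp_fixeddrives
```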

Job step fails with Error 7399 and 7312

We have a job that intermittently fails on the same step with the same error.
Identical jobs that run before this one always succeed. The step uses the
following T-SQL to update a table on a linked server:
DBCC TRACEON (7300, 3604)
INSERT INTO <linkedservername>.ods.dbo.[!ODSReplicationTimes]
([Date], Imports, Div)
VALUES (dbo.DateOnly(DATEADD(hh, 7, GETDATE())), GETDATE(), 'wbd')
if the job fails, it is exactly the default timeout of 10 minutes after the
start time, and results in this error:
Executed as user: <username>. OLE DB provider 'SQLOLEDB' reported an error.
[SQLSTATE 42000] (Error 7399) [SQLSTATE 01000] (Error 7312) OLE DB error
trace [OLE/DB Provider 'SQLOLEDB' IRowsetChange::InsertRow returned
0x80004005: ]. [SQLSTATE 01000] (Error 7300). The step failed.
on failure, this step proceeds to a similar step which calls a sproc:
DBCC TRACEON (7300, 3604)
CheckFutureStatus 'JobName Finish','!Check Status WBD',240,240,3,'wbd'
yielding a similar result:
Executed as user: <username>. OLE DB provider 'SQLOLEDB' reported an error.
[SQLSTATE 42000] (Error 7399) [SQLSTATE 01000] (Error 7312) OLE DB error
trace [OLE/DB Provider 'SQLOLEDB' IRowsetChange::SetData returned 0x80004005:
]. [SQLSTATE 01000] (Error 7300). The step failed.
I've set the step to retry 3 times/5 minutes to no avail. The traceon
doesn't give any additional information.
Any ideas?
Hello
Did you try to perform a SQL Profiler trace on the server that you have
configured as the linked server to see what happens on that server. I'm
assuming that the query should finish within the 10 minute timeout period.
Thank you for using Microsoft newsgroups.
Sincerely
Pankaj Agarwal
Microsoft Corporation
This posting is provided AS IS with no warranties, and confers no rights.
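The 10-minute failure point mentioned above matches the server-wide 'remote query timeout' default of 600 seconds. As a hedged diagnostic sketch (not a fix for the underlying slowness), raising or disabling that timeout can confirm whether the errors are timeout-related:

```sql
-- Hedged sketch: 'remote query timeout' defaults to 600 seconds
-- (10 minutes). Setting it to 0 disables the timeout; use a finite
-- value in production once the cause of the slow insert is known.
EXEC sp_configure 'remote query timeout', 0
RECONFIGURE
```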
|||What Events/Data would you want to see. anything in particular for this
situation?
"Pankaj Agarwal [MSFT]" wrote:

> Hello
> Did you try to perform a SQL Profiler trace on the server that you have
> configured as the linked server to see what happens on that server. I'm
> assuming that the query should finish within the 10 minute timeout period.
> Thank you for using Microsoft newsgroups.
> Sincerely
> Pankaj Agarwal
> Microsoft Corporation
> This posting is provided AS IS with no warranties, and confers no rights.
>
|||I think it would be beneficial to capture the Execution Statistics, Stored
Procedure and TSQL related events. Also capture SP Recompile, Database File
Autogrow. This would give you at least a starting point to see what event
is the query stuck on. I would also download the blocker script from the
following KB article and run that to see if there is any blocking when the
query does not complete during the 10 min timeout interval.
271509 INF: How to Monitor SQL Server 2000 Blocking
http://support.microsoft.com/?id=271509
Thank you for using Microsoft newsgroups.
Sincerely
Pankaj Agarwal
Microsoft Corporation
This posting is provided AS IS with no warranties, and confers no rights.
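Alongside the full blocker script in KB 271509, a minimal blocking check for the SQL Server 2000 era can be sketched as follows; run it while the job is stuck inside its 10-minute window:

```sql
-- Hedged sketch: lists sessions that are currently blocked and which
-- spid is blocking them, using the SQL Server 2000-era sysprocesses.
SELECT spid, blocked, waittime, lastwaittype, cmd, hostname
FROM master..sysprocesses
WHERE blocked <> 0
```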


Job step failing with "ConnectionRead (WrapperRead())." error

We have a job step which executes a stored proc that fails intermittently
with this error:
Executed as user: domain\user. ConnectionRead (WrapperRead()). [SQLSTATE
01000] (Message 258) General network error. Check your network
documentation. [SQLSTATE 08S01] (Error 11). The step failed.
This stored proc executes just fine through QA.
Could it be that the SP does execute but that the step is being incorrectly
reported as failed?
Thank you
-- alan cranfield, DBA|||use same ANSI settings in job like in QA
"Cranfield" wrote:

> We have a job step which executes a stored proc that fails intermittently
> with this error:
> Executed as user: domain\user. ConnectionRead (WrapperRead()). [SQLSTATE
> 01000] (Message 258) General network error. Check your network
> documentation. [SQLSTATE 08S01] (Error 11). The step failed.
> This stored proc executes just fine through QA.
> Could it be that the SP does execute but that the step is being incorrectly
> reported as failed?
> Thank you
> --
> -- alan cranfield, DBA|||I tried your suggestion. Job step still fails.
This must be a bug. I will raise a case with PSS.
thanks for the reply.
-- cranfield, DBA
"Aleksandar Grbic" wrote:
> use same ANSI settings in job like in QA
>
> "Cranfield" wrote:
>
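The ANSI-settings suggestion above can be tried by prefixing the job step's T-SQL with explicit SET options matching Query Analyzer's defaults. A hedged sketch; the procedure name is a placeholder, since the actual proc is not named in the thread:

```sql
-- Hedged sketch: Query Analyzer's typical default connection settings,
-- set explicitly at the top of the T-SQL job step so the Agent session
-- matches QA. <procedure_name> is a placeholder for the real proc.
SET ANSI_NULLS ON
SET ANSI_WARNINGS ON
SET QUOTED_IDENTIFIER ON
SET CONCAT_NULL_YIELDS_NULL ON
EXEC dbo.<procedure_name>
```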

Monday, March 26, 2012

Job Scheduler fails and Managing Tempdb

Hello,
I have two issues, hoping someone can help.
Issue 1. I have various DTS packages that copy data from Progress
Database to Sql Data Warehouse. The ODBC Connection is stable and
packages have been auto scheduled by creating a job that is managed by
the SQL Agent service to run daily at night. The packages work fine
only in two cases, when running manually by using the DTSRun.exe
command line utility and when the job is manually run in the SQL Agent
service. Auto run is always showing a failed status; I'm now using the
Windows scheduler and my packages are now always a success.
My question is why is SQL Agent not reliable? I'm 100% sure that there
is absolutely nothing wrong with the packages as they run error free
when run manually.
Issue 2. My data warehouse is 28 G in size this makes the tempdb to
grow bigger everyday. To downsize it I stop and start the sql service
manually, can you please assist me with a batch command to auto start
and stop the service?
Thanking you in advance
Babalwa
1) This most likely is a permissions issue. Check that the account that SQL
Agent is running under has the appropriate permissions to execute everything
in your DTS package. It can be running under the LocalSystem account, which
doesn't have any permissions outside the local computer.
2) It's not much use shrinking tempdb, as it will grow again when you use
the server. And when tempdb auto grows that will only slow down your server.
You can check however if there are any transaction that stay open in tempdb
for a long time, cause extra space to be used, and prevent reuse of the
transaction log. Use DBCC OPENTRAN ('tempdb') to see if you have any old
transactions.
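The open-transaction check above, together with a couple of other SQL Server 2000 commands, gives a quick picture of tempdb usage before deciding whether shrinking is worth anything. A hedged sketch:

```sql
-- Hedged sketch: quick tempdb health checks on SQL Server 2000.
DBCC OPENTRAN ('tempdb')   -- oldest active transaction, if any
DBCC SQLPERF (LOGSPACE)    -- log size and percent used, per database
USE tempdb
EXEC sp_spaceused          -- data space allocated and used in tempdb
```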
Otherwise the size of tempdb is something you have to live with, with your
current setup. Check if you can either change your applications and
databases, so tempdb is needed less, or you might have to get more harddisk
space.
Jacco Schalkwijk
SQL Server MVP
"Babalwa Magwentshu" <babalwa@.hotmail.com> wrote in message
news:5999368f.0407280128.75bd9322@.posting.google.com...
> Hello,
> I have two issues, hoping someone can help.
> Issue 1. I have various DTS packages that copy data from Progress
> Database to Sql Data Warehouse. The ODBC Connection is stable and
> packages have been auto scheduled by creating a job that is managed by
> the SQL Agent service to run daily at night. The packages work fine
> only in two cases, when running manually by using the DTSRun.exe
> command line utility and when the job is manually run in the SQL Agent
> service. Auto run is alwayz showing a status failer, I'm now using
> windows scheduler and my packs are now always a success.
> My question is why is SQL Agent not reliable? I'm 100% sure that there
> is absolutely nothing wrong with the packets as they run error free
> when manually ran.
> Issue 2. My data warehouse is 28 G in size this makes the tempdb to
> grow bigger everyday. To downsize it I stop and start the sql service
> manually, can you please assist me with a batch command to auto start
> and stop the service?
> Thanking you in advance
> Babalwa
|||Thank you for your response Jacco,
Let me briefly expand on the issue, I am the domain administrator and
have designed the data warehouse packets and it's schedules under my
administrative password. I have checked the permissions, security
won't be the issue. Correct me if I'm wrong, this is how I understand
the security concept - for packages created under a Microsoft Windows
NT 4.0 or Microsoft Windows 2000 account, the job runs under the
security context of the account that started SQL Server Agent.
Thank you again,
babalwa@.hotmail.com (Babalwa Magwentshu) wrote in message news:<5999368f.0407280128.75bd9322@.posting.google.com>...
> Hello,
> I have two issues, hoping someone can help.
> Issue 1. I have various DTS packages that copy data from Progress
> Database to Sql Data Warehouse. The ODBC Connection is stable and
> packages have been auto scheduled by creating a job that is managed by
> the SQL Agent service to run daily at night. The packages work fine
> only in two cases, when running manually by using the DTSRun.exe
> command line utility and when the job is manually run in the SQL Agent
> service. Auto run is alwayz showing a status failer, I'm now using
> windows scheduler and my packs are now always a success.
> My question is why is SQL Agent not reliable? I'm 100% sure that there
> is absolutely nothing wrong with the packets as they run error free
> when manually ran.
> Issue 2. My data warehouse is 28 G in size this makes the tempdb to
> grow bigger everyday. To downsize it I stop and start the sql service
> manually, can you please assist me with a batch command to auto start
> and stop the service?
> Thanking you in advance
> Babalwa
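On the original question of restarting the service from a batch file, a hedged sketch for a default (unnamed) instance follows; service names differ for named instances, and a nightly restart is generally not recommended for the reasons given above:

```bat
REM Hedged sketch: stop and restart SQL Server and its Agent.
REM Assumes a default instance; for a named instance the services
REM are MSSQL$InstanceName and SQLAgent$InstanceName.
net stop SQLSERVERAGENT /y
net stop MSSQLSERVER /y
net start MSSQLSERVER
net start SQLSERVERAGENT
```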


Friday, March 23, 2012

Job schedule success/failure

I have a job that has 4 steps, all of which have to run. They are all T-SQL
scripts. How can I get the job to report failure if step 2 fails but step 4
doesn't?|||Stephanie wrote:
> I have a job that has 4 steps, all of which have to run. They are
> all T-SQL scripts. How can I get the job to report failure if step 2
> fails but step 4 doesn't?
Are you saying you want the job to continue should one of the steps fail
or are you saying you want all the steps to run regardless of success
and have the job report failure if any of the steps fail. If the latter,
then how will you resolve running the failed step?
David Gugick
Quest Software
www.imceda.com
www.quest.com|||I want all steps to run regardless of the status of a particular step but I
want failure of the job as a whole to be based on failure of any one of the
steps. If a step fails, I will worry about that later. But I'd like all
steps to run no matter what and I'd like to get an alert if any of the steps
fail as opposed to just the last step.
"David Gugick" wrote:

> Stephanie wrote:
> Are you saying you want the job to continue should one of the steps fail
> or are you saying you want all the steps to run regardless of success
> and have the job report failure if any of the steps fail. If the latter,
> then how will you resolve running the failed step?
> --
> David Gugick
> Quest Software
> www.imceda.com
> www.quest.com
>|||Stephanie wrote:
> I want all steps to run regardless of the status of a particular step
> but I want failure of the job as a whole to be based on failure of
> any one of the steps. If a step fails, I will worry about that
> later. But I'd like all steps to run no matter what and I'd like to
> get an alert if any of the steps fail as opposed to just the last
> step.
You can check the "Append output to step history" for each step. In the
morning, examine the job history and show step detail. If there were any
failed steps, you'll see them there. Make sure each step is created to
proceed to the next step on failure or success.
select
j.name,
h.*
From
msdb..sysjobs j inner join
msdb..sysjobhistory h
On
j.job_id = h.job_id
and
j.name = N'<job_name>'
and
h.sql_message_id > 0
Order By
j.name,
h.run_date,
h.run_time
-- OR
Exec msdb..sp_help_jobhistory @job_name = N'<job_name>', @run_status = 0
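The "proceed to the next step on failure" setup mentioned earlier can also be scripted rather than set through the Enterprise Manager dialog. A hedged sketch; job name and step number are placeholders:

```sql
-- Hedged sketch: on_fail_action = 3 means "go to the next step", so the
-- job keeps running even when this step fails. Leave the final step at
-- on_fail_action = 2 ("quit reporting failure") so a last-step failure
-- still fails the job; mid-job failures must be found in step history.
EXEC msdb..sp_update_jobstep
    @job_name = N'<job_name>',
    @step_id = 2,
    @on_fail_action = 3
```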
David Gugick
Quest Software
www.imceda.com
www.quest.com

Wednesday, March 21, 2012

Job owned by a non-sysadmin fails to run

Hi,
The preblem has been already discussed but none of answers help me.
I have a SQL Server 2000 SP4. Users that are not sysadmins create jobs for
SQL Server Agent. These jobs consist of a single CmdExec step.
As advised in many posts I created a Proxy SQL Server Agent account
(sqlproxy) but this did not help me, the jobs still fail to run. This
account is a windows account. I made this account belong to the sysadmins
role of SQL Server.
Both SQL Server and SQL Server Agent run under a special account
(sqlservice).
I added the account sqlservice to Administrators as advised in the article
http://support.microsoft.com/kb/833559 and even added to Administrators the
account sqlproxy, although the article states I did not have to.
The message I get in EventLog is like below:
-- message start
SQL Server Scheduled Job '<job name>' (0xEFC686299E5B9249957CC5FCF5C782C4) -
Status: Failed - Invoked on: 2006-12-18 12:06:06 - Message: The job failed.
The Job was invoked by User <domain>\<user>. The last step to run was step
1 (<step name> ).
-- message end
The command of the CmdExec step runs fine if I login as user as well as
sqlproxy.
I set the output file in advanced properties of the CmdExec step but did not
see in that file anything. Looks like the job does not start at all.
When I add the user account to the group Administrators the job runs
successfully, but this is definitely not an option.
I would appreciate any help as I run out of ideas already.|||Try changing the owner of the job to SA or another sysadmin.
Ivan Gerken wrote:
> Hi,
> The preblem has been already discussed but none of answers help me.
> I have a SQL Server 2000 SP4. Users that are not sysadmins create jobs for
> SQL Server Agent. These jobs consist of a single CmdExec step.
> As advised in many posts I created a Proxy SQL Server Agent account
> (sqlproxy) but this did not help me, the jobs still fail to run. This
> account is a windows account. I made this account belong to the sysadmins
> role of SQL Server.
> Both SQL Server and SQL Server Agent run under a special account
> (sqlservice).
> I added the account sqlservice to Administrators as advised in the article
> http://support.microsoft.com/kb/833559 and even added to Administrators the
> account sqlproxy, although the article states I did not have to.
> The message I get in EventLog is like below:
> -- message start
> SQL Server Scheduled Job '<job name>' (0xEFC686299E5B9249957CC5FCF5C782C4) -
> Status: Failed - Invoked on: 2006-12-18 12:06:06 - Message: The job failed.
> The Job was invoked by User <domain>\<user>. The last step to run was step
> 1 (<step name>).
> -- message end
> The command of the CmdExec step runs fine if I login as user as well as
> sqlproxy.
> I set the output file in advanced properties of the CmdExec step but did not
> see in that file anything. Looks like the job does not start at all.
> When I add user acocunt to the group Administrators the job runs
> successfully, but this is definitely not an option.
> I would appreciate any help as I run out of ideas already.|||This quick solution is not an option.
Jobs are created from an external application (Business Desk of Commerce
Server 2002). A created job is owned by the user logged in to the
application (using win integrated authentication).
I definitely don't want to give users administrative privileges.
"PSPDBA" <DissendiumDBA@.gmail.com> wrote in message
news:1166704870.838450.230270@.f1g2000cwa.googlegroups.com...
> Try changing the owner of the job to SA or another sysadmin.
>|||For debugging, try temporarily changing the job owner to 'sa' as PSPDBA
suggested. If the job still fails, then the problem is related to running
as a job rather than a security issue.
Exactly what does the CmdExec step do? Are mapped drives accessed?
Hope this helps.
Dan Guzman
SQL Server MVP
"Ivan Gerken" <testivan@.waterproof.nl> wrote in message
news:uE$3GeRJHHA.420@.TK2MSFTNGP06.phx.gbl...
> This quick solution is not an option.
> Jobs are created from an external application (Business Desk of Commerce
> Server 2002). A created job is owned by the user logged in to the
> application (using win integrated authentication).
> I definitely don't want to give users administrative privileges.
> "PSPDBA" <DissendiumDBA@.gmail.com> wrote in message
> news:1166704870.838450.230270@.f1g2000cwa.googlegroups.com...
>|||Looks like I have a problem with CmdExec jobs in general.
I changed the step command to "dir c:\temp" and it ran fine when owned by an
admin but failed when owned by a user. In case of being owned by a user even
the output file was not created. The folder c:\temp has "full control"
permission granted to everyone.
"Dan Guzman" <guzmanda@.nospam-online.sbcglobal.net> wrote in message
news:42FF0C9C-4AFC-42AF-9464-0931A1DDE13B@.microsoft.com...
> I'm starting to run out of ideas. Do you have any CmdExec job steps that
> successfully run as non-sysadmin users or is it just dmlrun.exe that has
> the problem?
> --
> Hope this helps.
> Dan Guzman
> SQL Server MVP|||Let's make sure I have the relevant details right since so much has been
discussed in this thread:
- SQL Server service and SQL Server Agent service run under the same
account
- The account is a member of the local administrators group
- xp_cmdshell runs fine when invoked by non-sysadmins
- CmdExec jobs fail for jobs owned by non-sysadmins
What I find strange is that xp_cmdshell works but CmdExec doesn't. I can
see how this might be the case if you used different service accounts and
the SQL Agent service account lacked the advanced user rights (e.g. 'act as
part of the operating system' and 'replace a process-level token') that are
needed to switch security context to the proxy account.
Can you double-check to ensure the same service account is used for SQL
Server and SQL Server Agent services? If you have made changes to service
account security, have you since restarted the service? In some cases, a
server restart is needed for security changes to fully take effect.
Happy Holidays
Dan Guzman
SQL Server MVP
"Ivan Gerken" <testivan@.waterproof.nl> wrote in message
news:%239rDNPAKHHA.2236@.TK2MSFTNGP02.phx.gbl...
> Looks like I have a problem with CmdExec jobs in general.
> I changed the step command to "dir c:\temp" and it ran fine when owned by
> an admin but failed when owned by a user. In case of being owned by a user
> even the output file was not created. The folder c:\temp has "full
> control" permission granted to everyone.
> "Dan Guzman" <guzmanda@.nospam-online.sbcglobal.net> wrote in message
> news:42FF0C9C-4AFC-42AF-9464-0931A1DDE13B@.microsoft.com...
>|||- SQL Server service and SQL Server Agent service run under the same
account
Yes, referred to earlier as sqlservice. However, the services MSSEARCH,
MSSQLServerADHelper, MSSQLServerOLAPService run under Local System (I think
it hardly matters but just in case).
- The account is a member of the local administrators group
Yes, plus OLAP Administrators and Users.
- xp_cmdshell runs fine when invoked by non-sysadmins
Yes. User account is a member of Users and Remote Desktop Users.
- CmdExec jobs fail for jobs owned by non-sysadmins
Yes, even after restarting both MSSQLSERVER and SQLSERVERAGENT.
"Dan Guzman" <guzmanda@.nospam-online.sbcglobal.net> wrote in message
news:A7AC10BD-AE8F-4C96-ADE3-1F1603A38D9C@.microsoft.com...
> Let's make sure I have the relevant details right, since so much has been
> discussed in this thread:
> - SQL Server service and SQL Server Agent service run under the same
> account
> - The account is a member of the local administrators group
> - xp_cmdshell runs fine when invoked by non-sysadmins
> - CmdExec jobs fail for jobs owned by non-sysadmins
> What I find strange is that xp_cmdshell works but CmdExec doesn't. I can
> see how this might be the case if you used different service accounts and
> the SQL Agent service account lacked the advanced user rights (e.g. 'act
> as part of the operating system' and 'replace a process-level token') that
> are needed to switch security context to the proxy account.
> Can you double-check to ensure the same service account is used for SQL
> Server and SQL Server Agent services? If you have made changes to service
> account security, have you since restarted the service? In some cases, a
> server restart is needed for security changes to fully take
> effect.
> --
> Happy Holidays
> Dan Guzman
> SQL Server MVP|||> Yes, even after restarting both MSSQLSERVER and SQLSERVERAGENT.
Have you restarted the server since you added the sqlservice account to the
local Administrator's group? Although not normally required, I've seen
occasions where a restart was needed to pick up the group membership change.
BTW, are there any related messages in the SQL Agent log files?
Hope this helps.
Dan Guzman
SQL Server MVP
"Ivan Gerken" <testivan@.waterproof.nl> wrote in message
news:uVs$ZSQKHHA.2232@.TK2MSFTNGP02.phx.gbl...
>- SQL Server service and SQL Server Agent service run under the same
>account
> Yes, referred to earlier as sqlservice. However, the services MSSEARCH,
> MSSQLServerADHelper, MSSQLServerOLAPService run under Local System (I
> think it hardly matters but just in case).
> - The account is a member of the local administrators group
> Yes, plus OLAP Administrators and Users.
> - xp_cmdshell runs fine when invoked by non-sysadmins
> Yes. User account is a member of Users and Remote Desktop Users.
> - CmdExec jobs fail for jobs owned by non-sysadmins
> Yes, even after restarting both MSSQLSERVER and SQLSERVERAGENT.
>
> "Dan Guzman" <guzmanda@.nospam-online.sbcglobal.net> wrote in message
> news:A7AC10BD-AE8F-4C96-ADE3-1F1603A38D9C@.microsoft.com...
>|||Dan,
Many thanks for a great hint! I checked SQL Agent log and found there the
following message:
[136] Job <job name> reported: Warning: cannot write logfile
c:\temp\dmout.txt. Error 5 : Access is denied
Then, after I cleared the "Output file" box, the job executed
successfully. So the problem seems to be solved.
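If clearing the box by hand is awkward, the same change can be scripted. This is a minimal sketch, assuming a hypothetical job name and that the failing step is step 1:

```sql
-- Clear the "Output file" setting on the step so the job no longer
-- tries to write c:\temp\dmout.txt.
EXEC msdb.dbo.sp_update_jobstep
    @job_name = N'MyCmdExecJob',   -- hypothetical job name
    @step_id = 1,
    @output_file_name = N'';       -- empty string clears the setting
```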
However, I find this error very odd because full control is granted to
everyone on c:\temp.
Many thanks for your help!
"Dan Guzman" <guzmanda@.nospam-online.sbcglobal.net> wrote in message
news:C7F5CB7B-970A-4EFF-885C-772ED21ECF0A@.microsoft.com...
> Have you restarted the server since you added the sqlservice account to
> the local Administrator's group? Although not normally required, I've
> seen occasions where a restart was needed to pickup the group membership
> change.
> BTW, are there any related messages in the SQL Agent log files?
> --
> Hope this helps.
> Dan Guzman
> SQL Server MVP
> "Ivan Gerken" <testivan@.waterproof.nl> wrote in message
> news:uVs$ZSQKHHA.2232@.TK2MSFTNGP02.phx.gbl...
>|||I'm glad you were able to get it sorted out. I'm sorry I didn't suggest
checking the log earlier.

> However, I find this error very odd because full control is granted to
> everyone on c:\temp
Error 5 can be caused by the file being in use by another process or by
the read-only attribute being set.
Hope this helps.
Dan Guzman
SQL Server MVP
"Ivan Gerken" <testivan@.waterproof.nl> wrote in message
news:eLhoAacKHHA.3424@.TK2MSFTNGP02.phx.gbl...
> Dan,
> Many thanks for a great hint! I checked SQL Agent log and found there the
> following message:
> [136] Job <job name> reported: Warning: cannot write logfile
> c:\temp\dmout.txt. Error 5 : Access is denied
> Then, after I have cleared the "Output file" box the job executed
> successfully. So the problem seems to be solved.
> However, I find this error very odd because full control is granted to
> everyone on c:\temp
> Many thanks for your help!
>
> "Dan Guzman" <guzmanda@.nospam-online.sbcglobal.net> wrote in message
> news:C7F5CB7B-970A-4EFF-885C-772ED21ECF0A@.microsoft.com...
>

Monday, March 12, 2012

Job Failure

Dear All

A quick question:

I run a set of scheduled jobs. The jobs run Stored Procedures.
However, if the sproc fails, the job quits and moves on to the next
one. However, the sproc should carry on. E.g., if it finds bad records,
EXPORT to file, GO TO NEXT RECORD (BUT DON'T QUIT THE SPROC). The job
scheduler does not allow this, therefore, the sproc does not get a
chance to finish.

Is there a way to ensure a sproc can finish before moving on to the
next step?

Thanks

Simon

aaronss@.the-mdu.com (Simon) wrote in message news:<f526ea06.0402050300.499b8b81@.posting.google.com>...
> Dear All
> A quick question:
> I run a set of scheduled jobs. The jobs run Stored Procedures.
> However, if the sproc fails, the job quits and moves on to the next
> one. However, the sproc should carry on. E.g, IF find bad records,
> EXPORT to file, GO TO NEXT RECORD (BUT DON'T QUIT THE SPROC). The job
> scheduler does not allow this, therefore, the sproc does not get a
> chance to finish.
> Is there a way to ensure a sproc can finish before moving on to the
> next step?
> Thanks
> Simon

I'm not sure I understand you exactly - I guess the issue is not
really the scheduling, but rather how to handle an error in your
stored procedure? If the procedure exits because of an error, then
control goes back to the scheduled job. You could set a number of
retry attempts for that job step, but it would probably be better to
handle or prevent the error in your procedure. There are some useful
articles here:

http://www.sommarskog.se/index.html

If this isn't helpful, perhaps you could post (some of) your procedure
code to show where it fails, along with the error message, and someone
may be able to suggest how to handle the error condition.

Simon
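
One way to keep a procedure moving past bad records on SQL Server 2000 (which predates TRY/CATCH) is an @@ERROR check inside a cursor loop. This is a sketch only, and all object names here are hypothetical:

```sql
-- Sketch: process rows one at a time; on error, log the bad record and
-- continue with the next one instead of aborting the procedure.
CREATE PROCEDURE dbo.ProcessRecords
AS
BEGIN
    DECLARE @id int, @err int;

    DECLARE rec_cursor CURSOR LOCAL FAST_FORWARD FOR
        SELECT id FROM dbo.SourceTable;      -- hypothetical source table

    OPEN rec_cursor;
    FETCH NEXT FROM rec_cursor INTO @id;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Per-row work; replace with the real processing.
        EXEC dbo.ProcessOneRecord @id;       -- hypothetical helper proc
        SET @err = @@ERROR;

        IF @err <> 0
            -- Record the failure and carry on rather than RETURN.
            INSERT INTO dbo.BadRecords (id, error_code)   -- hypothetical log table
            VALUES (@id, @err);

        FETCH NEXT FROM rec_cursor INTO @id;
    END

    CLOSE rec_cursor;
    DEALLOCATE rec_cursor;
END
```

Note that this only handles statement-level errors: some severe errors abort the whole batch regardless of @@ERROR checks, so it is worth testing against the actual failure you see.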