Friday, March 30, 2012

Jobs

I have created several jobs, each containing several steps. With the current schedule they run very frequently, and more than 100 jobs end up running almost simultaneously. I want to limit the number of jobs running at a time - say 10 jobs at a time, then the next set of 10, and so on.
How can I do it? Help me out.
Regards
Sunil
You can adapt the schedules, or you can create a wrapper job. This wrapper job
can start the other jobs, one per step, using the sp_start_job system stored procedure.
--
Dejan Sarka, SQL Server MVP
Please reply only to the newsgroups.
"Sunil" <anonymous@.discussions.microsoft.com> wrote in message
news:9250860D-15DE-408D-A5A8-3CA7FAD56CC4@.microsoft.com...
> I have created several jobs, each containing several steps. Now
with the schedule they run very frequently and more than 100 jobs are running
almost simultaneously, so I want to limit the number of jobs running at a
time - say 10 jobs at a time, then the next set of 10, and so on.
>
> How can I do it? Help me out
> Regards,
> Sunil
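For illustration, here is a minimal sketch of the wrapper-job idea Dejan describes, with purely hypothetical job names; each sp_start_job call would live in its own step of the wrapper job. Keep in mind that sp_start_job only requests a start and returns immediately, so a wrapper that has to wait for one batch of ten to finish before starting the next would also need to poll job status (for example with msdb.dbo.sp_help_job) between steps.

-- Wrapper job, step 1 (job names are hypothetical)
EXEC msdb.dbo.sp_start_job @job_name = N'Load job 01' ;
EXEC msdb.dbo.sp_start_job @job_name = N'Load job 02' ;
-- ... and so on up to N'Load job 10'
-- Step 2 of the wrapper would start the next set of ten, and so on.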

JOBS

Hi All,
I am currently running a job that backs up a database using
the following code
BACKUP DATABASE [Leads] TO DISK = N'\\mymachine\d$\backup\leads.bak' WITH INIT , NOUNLOAD ,
NAME = N'Leads backup2', SKIP , STATS = 10, noFORMAT
this most of the time works fine, but every now and again it
completes fine but keeps the database locked open. Is
there any way to fix this?
Thanks, Phil
What exactly do you mean by "Locked Open"?
--
Andrew J. Kelly SQL MVP
"Phil" <harlequintp@.blazemail.com> wrote in message
news:051801c3fabf$38fedc30$a101280a@.phx.gbl...
> Hi All,
> I am currently running a job that backs up a database using
> the following code
> BACKUP DATABASE [Leads] TO DISK =
> N'\\mymachine\d$\backup\leads.bak' WITH INIT , NOUNLOAD ,
> NAME = N'Leads backup2', SKIP , STATS = 10, noFORMAT
> this most of the time works fine but every now and again it
> completes fine but keeps the database locked open, is
> there any way to fix this?
> Thanks Phil|||Sorry, I should have been a little clearer; what I mean is
although the job says that it has completed, if you look
at the locks section in the SQL Enterprise Window, you
can still see a lock on the database showing the code
featured below.
Thanks again for your help!
Phil
>--Original Message--
>What exactly do you mean by "Locked Open"?
>--
>Andrew J. Kelly SQL MVP
>
>"Phil" <harlequintp@.blazemail.com> wrote in message
>news:051801c3fabf$38fedc30$a101280a@.phx.gbl...
>|||You have to be even clearer than that <g>. What type of lock does it have?
Are these locks causing problems? It is normal for any connection to a db
to have at least a shared lock on the db. You also have to refresh EM in
order for most of the windows to show up to date information.
--
Andrew J. Kelly SQL MVP
<anonymous@.discussions.microsoft.com> wrote in message
news:0bd801c3fb1f$b4c51a40$a401280a@.phx.gbl...
> Sorry, I should have been a little clearer; what I mean is
> although the job says that it has completed, if you look
> at the locks section in the SQL Enterprise Window, you
> can still see a lock on the database showing the code
> featured below.
> Thanks again for your help!
> Phil


Jobs

Hi,
Using Enterprise Manager, when I create a job, the Job Properties dialog box
has a Schedules tab. In the Schedules tab I have an option to create a New Alert.
I am trying to understand why I would need to create an alert in Job
Schedules.
Any help would be appreciated,
Ali
Hi,
By setting an alert on a job, you can send an e-mail or pager message to get
the status of your job.
Thanks
Hari
MCDBA
"A.M" <IHateSpam@.sapm123.com> wrote in message
news:eZuJTct3DHA.4060@.TK2MSFTNGP11.phx.gbl...
> Hi,
> Using Enterprise Manager, when I create a job, the Job Properties dialog
box
> has a Schedules tab. In the Schedules tab I have an option to create a New Alert.
> I am trying to understand why I would need to create an alert in Job
> Schedules.
> Any help would be appreciated,
> Ali
>
|||Hi Ali,
Thank you for using MSDN Newsgroup!
I would like to follow up on this issue and see if you still have questions
about this issue. Should you have any questions, please feel free to post
here. Looking forward to your reply!
Best regards
Baisong Wei
Microsoft Online Support
----
Get Secure! - www.microsoft.com/security
This posting is provided "as is" with no warranties and confers no rights.
Please reply to newsgroups only. Thanks.
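For reference, the "New Alert" option on that tab creates an alert whose response is to run the job; roughly the same thing can be done in T-SQL with the documented msdb procedure sp_add_alert (the alert name and severity below are only examples):

EXEC msdb.dbo.sp_add_alert
    @name = N'Run my job on severity 17 errors',   -- hypothetical alert name
    @severity = 17,                                -- fire on any severity-17 error
    @job_name = N'MyJob' ;                         -- hypothetical job to run in response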

Jobs

Good morning,
What can I do to allow developers to view the jobs I created
without giving them administrator permissions?
Thanks.
Hi
Doesn't a member of the db_owner fixed database role have access to the jobs?
"Matthew Z" <MatthewZ@.discussions.microsoft.com> wrote in message
news:4315658A-3079-41F7-B8E9-45225924CDAF@.microsoft.com...
> Good morning,
> What can I do to allow developers to view the jobs I created
> without giving them administrator permissions?
> Thanks.|||Thanks, but I don't want them to be a member of db_owner.
"Uri Dimant" wrote:

> Hi
> Doesn't a member of the db_owner fixed database role have access to the jobs?
>
>
> "Matthew Z" <MatthewZ@.discussions.microsoft.com> wrote in message
> news:4315658A-3079-41F7-B8E9-45225924CDAF@.microsoft.com...
>
>|||There are no special roles for job management in 2000. There are in 2005.
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
"Matthew Z" <MatthewZ@.discussions.microsoft.com> wrote in message
news:4315658A-3079-41F7-B8E9-45225924CDAF@.microsoft.com...
> Good morning,
> What can I do to allow developers to view the jobs I created
> without giving them administrator permissions?
> Thanks.|||There is a role in msdb called TargetServersRole, which allows members to view
the jobs in EM.
Just keep in mind this is undocumented and subject to change. For instance,
permissions this role had were changed with SP3. So if you make use of it,
test in dev before slapping SP4 on in production (if/when it comes out).
I got above info from following discussion at sqlservercentral:
http://www.sqlservercentral.com/forums/shwmessage.aspx?forumid=5&messageid=108953
"Matthew Z" wrote:

> Good morning,
> What I can do to allow developers have access to view the jobs I created
> with out giving them administrator permission?
> Thanks.
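As a concrete example of the SQL Server 2005 roles Tibor mentions, adding a developer's login to the msdb role SQLAgentReaderRole lets them view all jobs and job history without being sysadmin; a minimal sketch with a hypothetical login name:

USE msdb ;
CREATE USER [DOMAIN\Developer] FOR LOGIN [DOMAIN\Developer] ;      -- hypothetical login
EXEC sp_addrolemember N'SQLAgentReaderRole', N'DOMAIN\Developer' ;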

JOBS

Thanks for the help on this. When you go to the Locks/Object
section of Enterprise Manager and click the drop-down, you
see the said database in there. When you look at the
processes on the left-hand screen, you have a ProcessID of
15, locktype = DB, Mode = U, Status = Grant, Owner = Sess.
When you click for the properties of the process ID, you
get the backup statement previously discussed, even though
the job that runs the script has finished and completed
successfully. I'm afraid I can't go into any more detail
than that.
Sorry, Phil
Did you try refreshing the screen? You can right-click on the node above
the locks and choose "Refresh". See if that helps. Also run sp_lock in QA
and see if those locks still show up.
Andrew J. Kelly SQL MVP
"Phil" <anonymous@.discussions.microsoft.com> wrote in message
news:168201c3fbb8$02a5a430$a101280a@.phx.gbl...
> Thanks for the help on this. When you go to the Locks/Object
> section of Enterprise Manager and click the drop-down, you
> see the said database in there. When you look at the
> processes on the left-hand screen, you have a ProcessID of
> 15, locktype = DB, Mode = U, Status = Grant, Owner = Sess.
> When you click for the properties of the process ID, you
> get the backup statement previously discussed, even though
> the job that runs the script has finished and completed
> successfully. I'm afraid I can't go into any more detail
> than that.
> Sorry, Phil
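For anyone following along, a quick way to double-check this from Query Analyzer instead of EM's cached view is sketched below; the SPID 15 is just the one from Phil's example, and sp_lock, sysprocesses and DBCC INPUTBUFFER are all standard in SQL Server 2000.

-- List the locks held by the suspect connection (15 is the SPID from the example above)
EXEC sp_lock 15 ;
-- See whether that connection is still open and what it last ran
SELECT spid, status, cmd, last_batch, program_name
FROM master.dbo.sysprocesses
WHERE spid = 15 ;
DBCC INPUTBUFFER (15) ;   -- shows the last statement the connection submitted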


Jobbing!

I want to retrieve all those records from a column (that stores date
values) where the date is 1 + today's date. For e.g. today is
20/09/2005; I want all those records where the date is 21/09/2005.
I want to send a reminder mail to all those records retrieved that they
have to make the payment by tomorrow at the latest, which is the due date.
That's the reason why I am fetching all those records where the date is
1 + today's date. I am implementing this by creating a job & scheduling
it to run every day at one particular time. This is the code:
---
DECLARE
@getduedate varchar(20),
@msg varchar(3000),
@email varchar(100),
@person varchar(50)
SET @getduedate=(SELECT DDate FROM MyTable WHERE
DDate=CONVERT(char(20),GETDATE()+1,1))
IF (@getduedate<>'')
BEGIN
SET @email=(SELECT EMail FROM MyTable WHERE
DDate=CONVERT(char(20),GETDATE()+1,1))
SET @person=(SELECT Person FROM MyTable WHERE
DDate=CONVERT(char(20),GETDATE()+1,1))
SET @msg='To ' + @person + ','
SET @msg=@msg + 'Your payment is due for tomorrow.'
EXEC master.dbo.xp_sendmail
@recipients=@email,
@subject='Payment Due Date Reminder!',
@message=@msg
END
---
But the above generates the "Subquery returned more than 1 value" error
when more than 1 record matches the criteria. How do I resolve this?
Thanks,
Arpan
You have to do that in a loop if it contains more than one row
(untested)
DECLARE
@getduedate varchar(20),
@msg varchar(3000),
@email varchar(100),
@person varchar(50),
@RowCount int,
@I INT
SET @I = 0
CREATE TABLE #Mails
(
Counter INT identity(1,1),
DDate varchar(200),
EMail varchar(200),
Person varchar(200)
)
INSERT INTO #Mails(DDate,EMail,Person)
SELECT DDate,EMail,Person FROM MyTable WHERE
DDate=CONVERT(char(20),GETDATE()+1,1)
SET @Rowcount = @@Rowcount
WHILE @I < @RowCount
BEGIN
Select @getduedate= DDate,
@email = EMail,
@person = Person,
@msg = 'To ' + @person + ',' + 'Your payment is
due for tomorrow.'
FROM #Mails
Where Counter = @I
EXEC master.dbo.xp_sendmail
@recipients=@email,
@subject='Payment Due Date Reminder!',
@message=@msg
END
HTH, Jens Suessmeyer.|||Sorry, should be:
WHILE @I <= @RowCount|||Thanks, Jens, for your help, although a couple of minor issues gave me a
big headache :-)
There was no code to increment the variable @I at the end of the WHILE
loop, which is why it created an infinite loop!
Secondly, the temp table #Mails is created with an identity column
that starts at 1 & increments by 1 for subsequent records, but you
initialized @I to 0, which is why the job wasn't succeeding:
xp_sendmail wasn't getting any value for the mandatory @email parameter
when @I=0!
Anyways, thanks a lot once again for your help. I really appreciate the
efforts & time you have put in to help me out.
BTW, isn't there any other approach other than what you have shown (no
cursors......please)?
Regards,
Arpan
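For completeness, here is the loop sketch with the two fixes Arpan describes applied: the counter starts at 1 to match the identity column and is incremented at the end of each pass. The MyTable columns are still the hypothetical ones from the thread, and because xp_sendmail sends one message per call, avoiding iteration entirely would mean concatenating every address into a single @recipients string instead.

DECLARE @msg varchar(3000), @email varchar(100), @person varchar(50),
        @RowCount int, @I int
CREATE TABLE #Mails
( Counter int IDENTITY(1,1), EMail varchar(200), Person varchar(200) )
INSERT INTO #Mails (EMail, Person)
SELECT EMail, Person
FROM MyTable
WHERE DDate = CONVERT(char(20), GETDATE() + 1, 1)
SET @RowCount = @@ROWCOUNT
SET @I = 1                                    -- identity values start at 1
WHILE @I <= @RowCount
BEGIN
    SELECT @email  = EMail,
           @person = Person,
           @msg    = 'To ' + Person + ', your payment is due for tomorrow.'
    FROM #Mails
    WHERE Counter = @I
    EXEC master.dbo.xp_sendmail
         @recipients = @email,
         @subject    = 'Payment Due Date Reminder!',
         @message    = @msg
    SET @I = @I + 1                           -- the increment that was missing
END
DROP TABLE #Mails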

job_id not found

Hi
I am getting a weird message when I click on the Snapshot Agent:
Error 14262, The specified @job_id (###) does not exist.
Any idea what could have caused this and what I can do to resolve it?
Thanks
P
Has someone deleted the job using the management folder in EM? Have a look
in the jobs subfolder and see if the snapshot job has been removed or not.
If it has, then the easiest solution is to set up the publication once
again.
Cheers,
Paul Ibison SQL Server MVP, www.replicationanswers.com
(recommended sql server 2000 replication book:
http://www.nwsu.com/0974973602p.html)
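Before setting the publication up again, one quick check is whether the agent job the snapshot agent points at still exists in msdb; a minimal sketch (the LIKE pattern is only illustrative):

SELECT job_id, name
FROM msdb.dbo.sysjobs
WHERE name LIKE N'%snapshot%' ;   -- or compare job_id with the one quoted in error 14262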

Job: Export to File: no headers/trailers

Hi,
I tried to export a query to a file with a job.
What I found in my file was this:
"
Job 'ExportToSiclid' : Step 1, 'ExportFile' : Began Executing 2004-06-22
13:16:07
tdprNFoy tdprNCCD CallDate TEL Choice
tdprNSocFin Agent
-- -- -- -- -- --
-- --
4240001034 9100 2004-06-22-13.13.48.71700 0478385115 Informatie 110
Van den Hel, Heidi
4276454891 9100 2004-06-22-13.13.57.23300 0 FIDI 104
Tanghe, Wendy
(2 rows(s) affected)
"
What I actually expected and wanted was this:
"
42400010349100 2004-06-22-13.13.48.717000478385115 Informatie110
Van den Hel, Heidi
42764548919100 2004-06-22-13.13.57.233000 FIDI 104
Tanghe, Wendy
"
So without the header, without the trailer, without the names of the
columns, and without a space between the columns.
Is this possible or not, and if so: how?
Thanks a lot in advance,
Pieter
Hi
This looks like you are starting a command prompt and running osql/isql?
You may want to look at adding SET NOCOUNT ON to the SQL and using the -h flag
to osql/isql. Alternatively try the
DTS Export wizard or BCP utility. More information on all of these can be
found in Books Online.
John
"DraguVaso" <pietercoucke@.hotmail.com> wrote in message
news:OqhPeoEWEHA.208@.TK2MSFTNGP10.phx.gbl...
> Hi,
> I tried to export a query to a file with a job.
> What I found in my file was this:
> "
> Job 'ExportToSiclid' : Step 1, 'ExportFile' : Began Executing 2004-06-22
> 13:16:07
> tdprNFoy tdprNCCD CallDate TEL Choice
> tdprNSocFin Agent
> -- -- -- -- -- -
--
> -- --
> 4240001034 9100 2004-06-22-13.13.48.71700 0478385115 Informatie
110
> Van den Hel, Heidi
> 4276454891 9100 2004-06-22-13.13.57.23300 0 FIDI
104
> Tanghe, Wendy
> (2 rows(s) affected)
> "
> What I actually expected and wanted was this:
> "
> 42400010349100 2004-06-22-13.13.48.717000478385115 Informatie110
> Van den Hel, Heidi
> 42764548919100 2004-06-22-13.13.57.233000 FIDI 104
> Tanghe, Wendy
> "
> So without the header, without the trailer, without the names of the
> columns, and without a space between the columns.
> Is this possible or not, and if so: how?
> Thanks a lot in advance,
> Pieter
>|||Thanks,
And how do you put a DTS or BCP in a job so it is executed every day
automatically?
"John Bell" <jbellnewsposts@.hotmail.com> wrote in message
news:uvJvr2EWEHA.556@.tk2msftngp13.phx.gbl...
> Hi
> This looks like you are starting a command prompt and running osql/isql? You
> may want to look at adding SET NOCOUNT ON to the SQL and using the -h flag
> to osql/isql. Alternatively try the
> DTS Export wizard or BCP utility. More information on all of these can be
> found in Books Online.
> John
> "DraguVaso" <pietercoucke@.hotmail.com> wrote in message
> news:OqhPeoEWEHA.208@.TK2MSFTNGP10.phx.gbl...
> > Hi,
> >
> > I tryed to Export a query to a Fiel with a Job.
> >
> > What I found in my file was this:
> >
> > "
> > Job 'ExportToSiclid' : Step 1, 'ExportFile' : Began Executing 2004-06-22
> > 13:16:07
> >
> > tdprNFoy tdprNCCD CallDate TEL Choice
> > tdprNSocFin Agent
> -- -- -- -- -- -
> --
> > -- --
> > 4240001034 9100 2004-06-22-13.13.48.71700 0478385115 Informatie
> 110
> > Van den Hel, Heidi
> > 4276454891 9100 2004-06-22-13.13.57.23300 0 FIDI
> 104
> > Tanghe, Wendy
> >
> > (2 rows(s) affected)
> > "
> >
> > What I actually expected and wanted was this:
> > "
> > 42400010349100 2004-06-22-13.13.48.717000478385115 Informatie110
> > Van den Hel, Heidi
> > 42764548919100 2004-06-22-13.13.57.233000 FIDI 104
> > Tanghe, Wendy
> > "
> >
> > So without the header, without the trailer, without the names of the
> > columns, and without a space between the columns.
> >
> > Is this possible or not, and if so: how?
> >
> > Thanks a lot in advance,
> >
> > Pieter
> >
> >
>|||Hi
If you run the DTS export wizard that will give you the option to schedule
the DTS job. This can also be changed in the created job. If you are already
running osql from your batch job, you can change the command to run BCP
instead.
John
"DraguVaso" <pietercoucke@.hotmail.com> wrote in message
news:eygYb4FWEHA.1048@.tk2msftngp13.phx.gbl...
> Thanks,
> And how do you put a DTS or BCP in a job so it is executed every day
> automatically?
> "John Bell" <jbellnewsposts@.hotmail.com> wrote in message
> news:uvJvr2EWEHA.556@.tk2msftngp13.phx.gbl...
> > Hi
> >
> > This looks like you are starting a command prompt and running osql/isql?
> You
> > may want to look at adding SET NOCOUNT ON to the SQL and using the -h
flag
> > to osql/isql. Alternatively try the
> > DTS Export wizard or BCP utility. Mor information on all of these can be
> > found in Books Online.
> >
> > John
> >
> > "DraguVaso" <pietercoucke@.hotmail.com> wrote in message
> > news:OqhPeoEWEHA.208@.TK2MSFTNGP10.phx.gbl...
> > > Hi,
> > >
> > > I tryed to Export a query to a Fiel with a Job.
> > >
> > > What I found in my file was this:
> > >
> > > "
> > > Job 'ExportToSiclid' : Step 1, 'ExportFile' : Began Executing
2004-06-22
> > > 13:16:07
> > >
> > > tdprNFoy tdprNCCD CallDate TEL Choice
> > > tdprNSocFin Agent
> >
> -- -- -- -- -- -
> > --
> > > -- --
> > > 4240001034 9100 2004-06-22-13.13.48.71700 0478385115
Informatie
> > 110
> > > Van den Hel, Heidi
> > > 4276454891 9100 2004-06-22-13.13.57.23300 0 FIDI
> > 104
> > > Tanghe, Wendy
> > >
> > > (2 rows(s) affected)
> > > "
> > >
> > > What I actually expected and wanted was this:
> > > "
> > > 42400010349100 2004-06-22-13.13.48.717000478385115
Informatie110
> > > Van den Hel, Heidi
> > > 42764548919100 2004-06-22-13.13.57.233000 FIDI
104
> > > Tanghe, Wendy
> > > "
> > >
> > > So without the header, without the trailer, without the names of the
> > > columns, and without a space between the columns.
> > >
> > > Is this possible or not, and if so: how?
> > >
> > > Thanks a lot in advance,
> > >
> > > Pieter
> > >
> > >
> >
> >
>|||Thanks!!
"John Bell" <jbellnewsposts@.hotmail.com> wrote in message
news:OCSgETGWEHA.2908@.TK2MSFTNGP10.phx.gbl...
> Hi
> If you run the DTS export wizard that will give you the option to schedule
> the DTS job. This can also be changed in the created job. If you are already
> running osql from your batch job, you can change the command to run BCP
> instead.
> John
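Pulling John's suggestions together, below is a rough sketch of a daily job whose single CmdExec step runs osql with SET NOCOUNT ON (to drop the "(n rows affected)" trailer) and -h-1 (to drop the column headers). The server name, query and output path are placeholders, and the same step could run bcp ... queryout instead.

EXEC msdb.dbo.sp_add_job @job_name = N'Export to Siclid' ;
EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'Export to Siclid',
     @step_name = N'ExportFile',
     @subsystem = N'CMDEXEC',
     @command   = N'osql -S MYSERVER -E -h-1 -w 8000 -o "D:\export\siclid.txt" -Q "SET NOCOUNT ON; SELECT tdprNFoy, tdprNCCD, CallDate FROM MyDB.dbo.MyTable"' ;
EXEC msdb.dbo.sp_add_jobschedule
     @job_name = N'Export to Siclid',
     @name     = N'Daily',
     @freq_type = 4, @freq_interval = 1,      -- every day
     @active_start_time = 060000 ;            -- 06:00
EXEC msdb.dbo.sp_add_jobserver @job_name = N'Export to Siclid', @server_name = N'(local)' ;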
> "DraguVaso" <pietercoucke@.hotmail.com> wrote in message
> news:eygYb4FWEHA.1048@.tk2msftngp13.phx.gbl...
> > Thanks,
> >
> > And how do you put a DTS or BCP in a job so it is executed every day
> > automaticly?
> >
> > "John Bell" <jbellnewsposts@.hotmail.com> wrote in message
> > news:uvJvr2EWEHA.556@.tk2msftngp13.phx.gbl...
> > > Hi
> > >
> > > This looks like you are starting a command prompt and running
osql/isql?
> > You
> > > may want to look at adding SET NOCOUNT ON to the SQL and using the -h
> flag
> > > to osql/isql. Alternatively try the
> > > DTS Export wizard or BCP utility. Mor information on all of these can
be
> > > found in Books Online.
> > >
> > > John
> > >
> > > "DraguVaso" <pietercoucke@.hotmail.com> wrote in message
> > > news:OqhPeoEWEHA.208@.TK2MSFTNGP10.phx.gbl...
> > > > Hi,
> > > >
> > > > I tryed to Export a query to a Fiel with a Job.
> > > >
> > > > What I found in my file was this:
> > > >
> > > > "
> > > > Job 'ExportToSiclid' : Step 1, 'ExportFile' : Began Executing
> 2004-06-22
> > > > 13:16:07
> > > >
> > > > tdprNFoy tdprNCCD CallDate TEL Choice
> > > > tdprNSocFin Agent
> > >
> >
> -- -- -- -- -- -
> > > --
> > > > -- --
> > > > 4240001034 9100 2004-06-22-13.13.48.71700 0478385115
> Informatie
> > > 110
> > > > Van den Hel, Heidi
> > > > 4276454891 9100 2004-06-22-13.13.57.23300 0 FIDI
> > > 104
> > > > Tanghe, Wendy
> > > >
> > > > (2 rows(s) affected)
> > > > "
> > > >
> > > > What I actually expected and wanted was this:
> > > > "
> > > > 42400010349100 2004-06-22-13.13.48.717000478385115
> Informatie110
> > > > Van den Hel, Heidi
> > > > 42764548919100 2004-06-22-13.13.57.233000 FIDI
> 104
> > > > Tanghe, Wendy
> > > > "
> > > >
> > > > So without the header, without the trailer, without the names of the
> > > > columns, and without a space between the columns.
> > > >
> > > > Is this possible or not, and if so: how?
> > > >
> > > > Thanks a lot in advance,
> > > >
> > > > Pieter
> > > >
> > > >
> > >
> > >
> >
> >
>

Job/Scheduled DTS/...: Rename a File

Hi,
I need to import a file with a scheduled DTS, and after I have imported it I
should rename the file. Is there any way to do this with a DTS or a job?
Thanks a lot,
Pieter
Dragu,
have a look at this link. It has examples of different uses of the
filesystemobject which you'll need to call in VBScript:
http://www.sqldts.com/default.aspx?292
HTH,
Paul Ibison|||Pieter,
Execute a batch file either using DTS or sql job.
--
Dinesh
SQL Server MVP
--
--
SQL Server FAQ at
http://www.tkdinesh.com
"DraguVaso" <pietercoucke@.hotmail.com> wrote in message
news:#ntqwtPWEHA.1760@.TK2MSFTNGP10.phx.gbl...
> Hi,
> I need to import a file with a scheduled DTS, and after I have imported it I
> should rename the file. Is there any way to do this with a DTS or a job?
> Thanks a lot,
> Pieter
>|||Great! Thanks! How do I execute such a script in my Job?
"Paul Ibison" <Paul.Ibison@.Pygmalion.Com> wrote in message
news:eJj$21PWEHA.4056@.TK2MSFTNGP11.phx.gbl...
> Dragu,
> have a look at this link. It has examples of different uses of the
> filesystemobject which you'll need to call in VBScript:
> http://www.sqldts.com/default.aspx?292
> HTH,
> Paul Ibison
>|||Dragu,
these scripts can be copied and pasted into an activeX script task.
Regards,
Paul Ibison|||Thanks! :-)
"Paul Ibison" <Paul.Ibison@.Pygmalion.Com> wrote in message
news:O88QAeQWEHA.1356@.TK2MSFTNGP09.phx.gbl...
> Dragu,
> these scripts can be copied and pasted into an activeX script task.
> Regards,
> Paul Ibison
>
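An alternative to the ActiveX script, if xp_cmdshell is available, is a plain T-SQL job step (or Execute SQL task) that shells out to rename or move the file after the import step has succeeded; the paths below are placeholders.

EXEC master.dbo.xp_cmdshell 'ren "D:\import\leads.txt" "leads_imported.txt"' ;
-- or move it out of the pickup folder altogether:
EXEC master.dbo.xp_cmdshell 'move "D:\import\leads.txt" "D:\import\processed\leads.txt"' ;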


Job/DTS Scheduled Execution

Hi all,
Just a quick question...
I have 2 DTS's - one which imports a small amount of data (about 60 rows
max), and another which imports about 20,000 rows (and will increase over
time).
I had a job running the smaller import every two minutes (as it reports real
time data) - I've added the other import to this but it would seem that the
20,000 rows will take more than 2 minutes.
I was wondering if anyone could advise as to what is likely to happen when
the job takes more than 2 minutes to run - I'm "guessing" that 2 minutes
after it started, it'll start again, and start over writing data thats only
just been written with the last import (and round and round it goes) - but
because of the nature of this, wouldn't this cause the server to overload a
bit, ie, it would be constantly starting new executions of the job whilst
the previous execution was running - or does SQL have anything inbuilt to
prevent this.
I'm thinking that my best bet would be to split these apart - have the
smaller import in one job running every two minutes, and the other in a
separate job that runs, perhaps every 5 (will have to time it I guess) ...
Any info on this would be appreciated.
Regards
Rob
Agent will not start a job if that job is currently executing. I.e., you will not have several
instances of the same job executing at the same time. Does that answer your
question?
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
Blog: http://solidqualitylearning.com/blogs/tibor/
"Rob Meade" <ku.shn.tsews.thbu@.edaem.bor> wrote in message
news:uDRBiAOHGHA.3700@.TK2MSFTNGP15.phx.gbl...
> Hi all,
> Just a quick question...
> I have 2 DTS's - one which imports a small amount of data (about 60 rows max), and another which
> imports about 20,000 rows (and will increase over time).
> I had a job running the smaller import every two minutes (as it reports real time data) - I've
> added the other import to this but it would seem that the 20,000 rows will take more than 2
> minutes.
> I was wondering if anyone could advise as to what is likely to happen when the job takes more than
> 2 minutes to run - I'm "guessing" that 2 minutes after it started, it'll start again, and start
> over writing data thats only just been written with the last import (and round and round it
> goes) - but because of the nature of this, wouldn't this cause the server to overload a bit, ie,
> it would be constantly starting new executions of the job whilst the previous execution was
> running - or does SQL have anything inbuilt to prevent this.
> I'm thinking that my best bet would be to split these apart - have the smaller import in one job
> running every two minutes, and the other in a separate job that runs, perhaps every 5 (will have
> to time it I guess) ...
> Any info on this would be appreciated.
> Regards
> Rob
>|||Hi Rob,
What about calling the second DTS from the first one (as the last task, of
course)? That way you would keep just one job.
"Rob Meade" wrote:

> Hi all,
> Just a quick question...
> I have 2 DTS's - one which imports a small amount of data (about 60 rows
> max), and another which imports about 20,000 rows (and will increase over
> time).
> I had a job running the smaller import every two minutes (as it reports real
> time data) - I've added the other import to this but it would seem that the
> 20,000 rows will take more than 2 minutes.
> I was wondering if anyone could advise as to what is likely to happen when
> the job takes more than 2 minutes to run - I'm "guessing" that 2 minutes
> after it started, it'll start again, and start over writing data thats only
> just been written with the last import (and round and round it goes) - but
> because of the nature of this, wouldn't this cause the server to overload a
> bit, ie, it would be constantly starting new executions of the job whilst
> the previous execution was running - or does SQL have anything inbuilt to
> prevent this.
> I'm thinking that my best bet would be to split these apart - have the
> smaller import in one job running every two minutes, and the other in a
> separate job that runs, perhaps every 5 (will have to time it I guess) ...
> Any info on this would be appreciated.
> Regards
> Rob
>
>|||"Tibor Karaszi" wrote ...

> Agent will not start a job if that job is currently executing. I.e., you
> will not have several instances of the same job executing at the same
> time. Does that answer your question?
Hi Tibor,
Yes it does - thank you :o)
Rob|||"Enric" wrote ...

> What about calling the second DTS from the first one (as the last task, of
> course)? That way you would keep just one job.
I could do - but of course then I'd have an even longer wait for the first
lot of info...
Cheers though.
Rob
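If it helps, a quick way to see which jobs Agent currently considers to be executing (which is what prevents a second, overlapping start) is the documented sp_help_job filter below:

-- List only jobs that are currently executing
EXEC msdb.dbo.sp_help_job @execution_status = 1 ;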

Job with stored proc succeeds with sqlstate 01000

I have a stored procedure that I am executing through the sql server
job scheduler, it executes properly but after each line in the log file
there is a sqlstate message.
procedure name p_document_purge [SQLSTATE 01000]
Archiving records [SQLSTATE 01000]
51 rows archived [SQLSTATE 01000]
Purging duplicate records [SQLSTATE 01000]
51 rows purged [SQLSTATE 01000]
Purge completed successfully [SQLSTATE 01000]
Does anyone know what might cause this? The @.@.ERROR during the
procedure is always 0 or it would rollback the entire transaction. Also
If I run the proc from query analyzer I do not get any negative
feedback regarding it. I tried looking in the documentation, but this
is listed as a general error which doesn't really help.
Thanks
Bill
Print statements in the stored procedure can result in the
sqlstate message.
-Sue
.
On 23 Feb 2006 06:30:49 -0800, william_dudek@.yahoo.com
wrote:

>I have a stored procedure that I am executing through the sql server
>job scheduler, it executes properly but after each line in the log file
>there is a sqlstate message.
>procedure name p_document_purge [SQLSTATE 01000]
>Archiving records [SQLSTATE 01000]
>51 rows archived [SQLSTATE 01000]
>Purging duplicate records [SQLSTATE 01000]
>51 rows purged [SQLSTATE 01000]
>Purge completed successfully [SQLSTATE 01000]
>Does anyone know what might cause this? The @.@.ERROR during the
>procedure is always 0 or it would rollback the entire transaction. Also
>If I run the proc from query analyzer I do not get any negative
>feedback regarding it. I tried looking in the documentation, but this
>is listed as a general error which doesn't really help.
>Thanks
>Bill
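To illustrate Sue's point: SQLSTATE 01000 is the generic "informational" state, so every PRINT (or severity-0 RAISERROR) in the procedure shows up in the Agent job log tagged that way; it is not an error. A minimal sketch with a hypothetical procedure:

CREATE PROCEDURE p_demo_purge
AS
BEGIN
    PRINT 'Archiving records'                          -- logged by Agent as ... [SQLSTATE 01000]
    RAISERROR ('51 rows archived', 0, 1) WITH NOWAIT   -- severity 0 is informational, logged the same way
END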


Job with steps problem

Hi all,
I have Windows 2003 r2 Server x64, sql 2005 sp1 x64 with sql 2000 latest sp
on one machine.
In SQL2005 I have created 2 maintenance plans - one for 2005 and second for
2000
Also created 3rd job (first 2 were generated by maintenance plans) with the
following steps:
1. cmd to move files from A to B
2. SQL SSIS package \Maintenance Plans\2005
3. SQL SSIS package \Maintenance Plans\2000
When I execute that job it goes to step one and moves those files; that's OK.
But steps 2 and 3 are not executed correctly. I mean they are executed
(Successful), but the maintenance plan is not executing. After all I'm
getting a successful result but no backups are made
Tried to execute those maintenance plans one by one. 2000 - successful but
nothing happens, the same is for 2005
When I execute the job associated - all is ok.
How could I add defined JOB to the Job Steps, to make it all run?
Thank you in advance
Alex
I don't think it is a good idea to call a job from another job's step.
I mean, you have to be certain that it is really a business requirement.
Take a look at the sp_start_job system stored procedure in BOL
"Guzun, Alex" <a> wrote in message
news:OT%23pG$gvHHA.536@.TK2MSFTNGP06.phx.gbl...
> Hi all,
> I have Windows 2003 r2 Server x64, sql 2005 sp1 x64 with sql 2000 latest
> sp on one machine.
> In SQL2005 I have created 2 maintenance plans - one for 2005 and second
> for 2000
> Also created 3rd job (first 2 were generated by maintenance plans) with
> the following steps:
> 1. cmd to move files from A to B
> 2. SQL SSIS package \Maintenance Plans\2005
> 3. SQL SSIS package \Maintenance Plans\2000
> When I execute that job it goes to step one and moves those files; that's OK.
> But steps 2 and 3 are not executed correctly. I mean they are executed
> (Successful), but the maintenance plan is not executing. After all I'm
> getting a successful result but no backups are made
> Tried to execute those maintenance plans one by one. 2000 - successful but
> nothing happens, the same is for 2005
> When I execute the job associated - all is ok.
> How could I add defined JOB to the Job Steps, to make it all run?
> Thank you in advance
|||Hm... I don't know what BOL is, or how to use the sp_start_job system stored
procedure.
I'm just a guy using the GUI for SQL.
I need those jobs to run as 1, 2, 3:
the 2nd will wait for the 1st to end, and so on.
Please help me if you can.
Thanks in advance
"Uri Dimant" <urid@.iscar.co.il> wrote in message
news:#jzRXGhvHHA.2288@.TK2MSFTNGP05.phx.gbl...
> Alex
> I don't think it is a good idea to call a job from another job's step.
> I mean, you have to be certain that it is really a business requirement.
> Take a look at the sp_start_job system stored procedure in BOL
>
> "Guzun, Alex" <a> wrote in message
> news:OT%23pG$gvHHA.536@.TK2MSFTNGP06.phx.gbl...
>
>
|||Alex
BOL - Books Online, which is part of the SQL Server Client Tools installation.
Can you create three jobs and check the status by using sp_help_job?
"Guzun, Alex" <a> wrote in message
news:%23tGvRWjvHHA.1168@.TK2MSFTNGP02.phx.gbl...
> Hm... I don't know what BOL is, or how to use the sp_start_job system stored
> procedure.
> I'm just a guy using the GUI for SQL.
> I need those jobs to run as 1, 2, 3:
> the 2nd will wait for the 1st to end, and so on.
> Please help me if you can.
> Thanks in advance
> "Uri Dimant" <urid@.iscar.co.il> wrote in message
> news:#jzRXGhvHHA.2288@.TK2MSFTNGP05.phx.gbl...
|||Sorry
I have 2 jobs for 2 maintenance plans.
I also have a 3rd job with a cmd script to move files from the DAY folder to the WEEK folder,
and every day I'll tape the DAY folder.
Now I want them to run one by one; I don't want to calculate time intervals
for each one.
Thanks
"Uri Dimant" <urid@.iscar.co.il> wrote in message
news:#DnGyajvHHA.1164@.TK2MSFTNGP02.phx.gbl...
> Alex
> BOL - Books Online, which is part of the SQL Server Client Tools installation.
> Can you create three jobs and check the status by using
> sp_help_job?
>
> "Guzun, Alex" <a> wrote in message
> news:%23tGvRWjvHHA.1168@.TK2MSFTNGP02.phx.gbl...
>
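One way to get the "run as 1, 2, 3" behaviour from a single wrapper job, roughly along the lines Uri suggests, is to start each job with sp_start_job and then poll msdb until it has stopped before starting the next. A sketch for SQL Server 2005 follows; the job names are hypothetical, and a production version would also check the outcome in sysjobhistory rather than only waiting for the job to stop.

EXEC msdb.dbo.sp_start_job @job_name = N'Maintenance Plan 2005' ;
-- Wait until that job is no longer running
WHILE EXISTS (SELECT 1
              FROM msdb.dbo.sysjobactivity a
              JOIN msdb.dbo.sysjobs j ON j.job_id = a.job_id
              WHERE j.name = N'Maintenance Plan 2005'
                AND a.start_execution_date IS NOT NULL
                AND a.stop_execution_date IS NULL)
BEGIN
    WAITFOR DELAY '00:00:30' ;
END
EXEC msdb.dbo.sp_start_job @job_name = N'Maintenance Plan 2000' ;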


JOB With SSIS Step Fails If Package Contains A Script Task

I have a Job Step defined to execute a SSIS Package. This SSIS package contains a Script Task. The Job fails with the message "Package execution failed. The step failed."

I am logging events in the package and when the packages gets to the Script Task the log reports "The script files failed to load". If I disable the Script Task from the package it executes fine.

Curiously, the package runs successfully with the Script Task enabled using dtexecui and dtexec from the command line.

Only if I include the Package in a job step with the Script Task enabled does it fail.

Any help would be appreciated.
I've no real idea about this, Steve, except to say: have you got the script code pre-compiled? If so, try it without (and vice versa).

-Jamie|||

Sorry for not posting this sooner, Jamie...Yes, setting the Script Task "PrecompileScriptIntoBinaryCode" Property to True resolved the issue.

-Steve

|||Hey - did you ever get this to work? I'm having the same issue - it's an SSIS pkg that ran on one box and I moved to another box (64-bit, if that makes a diff) - any info is greatly appreciated - thanks.|||

Juantana,

Yes, set your Script Task "PrecompileScriptIntoBinaryCode" Property=True and then open and close your script. Save, redeploy and it should work. Let me know if you have any questions.

-Steve

|||

Hi,

In spite of setting the PrecompileScriptIntoBinaryCode property = True, it is not working. Do you have any suggestions?

Thanks.

|||After setting Precompile to True, you need to open and close the script editor(s) to actually precompile the script in the task. Then re-deploy the package to the target system.|||Also remove all the breakpoints in the code; having breakpoints prevents recompilation. I guess this issue will be fixed by SP2.|||Just an FYI: I had all my scripts set to PrecompileScriptIntoBinaryCode=True, however when developing in 32-bit and deploying to 64-bit I will seemingly randomly get the "script failed to load" error. I just open the script, compile it again and save, and that usually does the trick.|||

Also, if the problem still persists and you keep getting the same error, I suspect you are not taking the dtsx file from the bin folder to execute. The file created by the designer (Business Intelligence Development Studio) is only useful for debug and development mode. After building the solution one should use the dtsx created in the bin folder. This solution worked for me.

Thanks

Mohit

|||NOTE: I got this same error message with Precompile option set to True.

The problem was that a variable name the script used was not passed in. Go figure.|||

I have about 30 .dtsx packages whose scripts I would like to recompile to solve this issue. I would like to avoid having to open each package, then open each script in the package, and save it to recompile.

Is there a way to recompile from the command line?

Thanks!

|||

Previously I said "randomly", but I think I have figured out when this error occurs:

If I am working on a package, open a different package, copy a script object, and paste it into the one I am working on, it will not work in 64-bit without a recompile (but it will work in 32-bit).

|||

Chris Honcoop wrote:

Previously I said "randomly", but I think I have figured out when this error occurs:

If I am working on a package, open a different package, copy a script object, and paste it into the one I am working on, it will not work in 64-bit without a recompile (but it will work in 32-bit).

Sounds like it might be a bug. Could you submit it at http://connect.microsoft.com?

-Jamie

|||

I too have a similar question to Paulino's: I have 50+ packages with 60+ script tasks. Is there any way I can compile them using a command at the command prompt? Any help would be highly appreciated.

My Regards

Job View History is disappearing

I've created new maintenance plans and while I can view the maintenance plan
history, the job view history is deleted somehow. Where is this controlled?
The maintenance plan is not deleting job history and I don't see any other
jobs or maintenance plans that are deleting history.
This is SQL2005 SE SP2. I remember in SQL2000 you could limit the size of
the job history, but I don't see that in 2005. Is there a default somewhere
I'm missing?
Thanks
Ron|||Ron,
In SSMS, right-click on SQL Agent, select Properties, go to the History
page. This is where you can set the size of history.
RLF
"Ron" <Ron@.discussions.microsoft.com> wrote in message
news:0EB1F324-BF2B-4F24-87AC-585FAB4071E3@.microsoft.com...
> I've created new maintenance plans and while I can view the maintenance
> plan
> history, the job view history is deleted somehow. Where is this
> controlled?
> The maintenance plan is not deleting job history and I don't see any
> other
> jobs or maintenance plans that are deleting history.
> This is SQL2005 SE SP2. I remember in SQL2000 you could limit the size of
> the job history, but I don't see that in 2005. Is there a default
> somewhere
> I'm missing?
> Thanks
> Ron|||Thanks - I'll bump it up and see if that's the issue.
"Russell Fields" wrote:
> Ron,
> In SSMS, right-click on SQL Agent, select Properties, go to the History
> page. This is where you can set the size of history.
> RLF
> "Ron" <Ron@.discussions.microsoft.com> wrote in message
> news:0EB1F324-BF2B-4F24-87AC-585FAB4071E3@.microsoft.com...
> > I've created new maintenance plans and while I can view the maintenance
> > plan
> > history, the job view history is deleted somehow. Where is this
> > controlled?
> > The maintenance plan is not deleting job history and I don't see any
> > other
> > jobs or maintenance plans that are deleting history.
> >
> > This is SQL2005 SE SP2. I remember in SQL2000 you could limit the size of
> > the job history, but I don't see that in 2005. Is there a default
> > somewhere
> > I'm missing?
> >
> > Thanks
> >
> > Ron
>
>
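
For readers who prefer a query to the Agent properties dialog, a minimal sketch for seeing how much job history msdb is actually keeping, and for clearing one job's history by hand (the tables and sp_purge_jobhistory are standard msdb objects; the job name is a placeholder):

-- How many history rows each job currently has in msdb
SELECT j.name, COUNT(*) AS history_rows
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs AS j
    ON j.job_id = h.job_id
GROUP BY j.name
ORDER BY history_rows DESC;

-- Manually purge the history of a single job (name is hypothetical)
EXEC msdb.dbo.sp_purge_jobhistory @job_name = N'My Maintenance Plan Job';

If the per-job row counts hover around the same number, the Agent history limits Russell points to are likely what is trimming the older rows.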

Job truncates text data type

I have a table with a text data type, and when I run my SQL manually,
everything works fine - the text column is completely filled with what
I need (it's HTML from an HTTP POST request via a stored procedure).
When I put the exact same SQL inside a job and kick the job off (or
let the agent pick it up on the schedule), it truncates the text column
to a width of 498, rendering my later job steps useless b/c the
expected data isn't there.
Why would the job cut off the text in this column?|||What is the syntax used in the proc? Are you issuing an UPDATE/INSERT
or a WRITETEXT/UPDATETEXT statement?
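
For anyone unfamiliar with the distinction the responder is drawing, here is a minimal sketch of the two ways a text column is usually written (the table, column, and values are hypothetical, not from the original proc):

-- Plain INSERT/UPDATE of a text column
CREATE TABLE PageCache (id int PRIMARY KEY, body text);
INSERT INTO PageCache (id, body) VALUES (1, 'initial html...');

-- WRITETEXT/UPDATETEXT path: write through the text pointer instead
DECLARE @ptr varbinary(16);
SELECT @ptr = TEXTPTR(body) FROM PageCache WHERE id = 1;
-- Replace the whole value (insert offset 0, delete length NULL = to end)
UPDATETEXT PageCache.body @ptr 0 NULL 'replacement html fetched by the proc';

-- Side note: if the proc SELECTs the text back, SET TEXTSIZE in that session
-- also caps how many characters are returned, which can look like truncation.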