Showing posts with label imports. Show all posts

Friday, March 30, 2012

Job/DTS Scheduled Execution

Hi all,
Just a quick question...
I have two DTS packages - one which imports a small amount of data (about 60
rows max), and another which imports about 20,000 rows (and will increase
over time).
I had a job running the smaller import every two minutes (as it reports real
time data) - I've added the other import to this job, but it would seem that
the 20,000 rows will take more than 2 minutes.
I was wondering if anyone could advise as to what is likely to happen when
the job takes more than 2 minutes to run - I'm "guessing" that 2 minutes
after it started, it'll start again and begin overwriting data that's only
just been written by the last import (and round and round it goes). Because
of the nature of this, wouldn't it cause the server to overload a bit, i.e.,
it would be constantly starting new executions of the job whilst the
previous execution was still running - or does SQL Server have anything
built in to prevent this?
I'm thinking that my best bet would be to split these apart - have the
smaller import in one job running every two minutes, and the other in a
separate job that runs, perhaps every 5 (will have to time it I guess) ...
Any info on this would be appreciated.
Regards
Rob

Agent will not start a job if that job is currently executing. I.e., you will
not have several instances of the same job executing at the same time. Does
that answer your question?
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
Blog: http://solidqualitylearning.com/blogs/tibor/
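Tibor's "no overlapping instances" rule can be sketched with a toy simulation (Python, purely illustrative; the function name and the minute-tick model are my own, not anything from SQL Agent):

```python
def simulate_agent(schedule_ticks, run_minutes):
    """Toy model of SQL Agent's rule: a scheduled start is skipped
    if the previous run of the same job is still executing."""
    busy_until = 0
    starts = []
    for tick in schedule_ticks:
        if tick >= busy_until:          # job is idle: Agent starts a run
            starts.append(tick)
            busy_until = tick + run_minutes
        # otherwise the tick is simply skipped - no second instance spawns
    return starts

# A job scheduled every 2 minutes whose run takes 3 minutes just fires
# on every other tick instead of piling up concurrent executions.
print(simulate_agent(range(0, 12, 2), 3))  # [0, 4, 8]
```

So the worst case is missed runs, not an overloaded server - though splitting the two imports into separate jobs, as Rob suggests, still keeps the 2-minute feed timely.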
Hi Rob,
What about calling the second DTS from the first one (as its last task, of
course)? That way you would keep just one job.
"Tibor Karaszi" wrote ...

> Agent will not start a job if that job is currently executing. I.e., you
> will not have several instances of the same job executing at the same
> time. Does that answer your question?
Hi Tibor,
Yes it does - thank you :o)
Rob

"Enric" wrote ...

> What about calling the second DTS from the first one (as its last task, of
> course)? That way you would keep just one job.
I could do - but of course then I'd have an even longer wait for the first
lot of info...
Cheers though.
Rob

Monday, March 26, 2012

job schedule, dts

I have made a DTS package which takes a *.txt file and imports it into a
table in my database. It gets the *.txt file from another server via a
mapping o\\server\folder, and it works fine, but when I schedule the package
to run every night it fails. I can browse to the file in Explorer, and I
have used the same account for the SQL Server service and the SQL Agent
service!
This is the error I get:

Executed as user: oden\admincrm. ...art: Delete from Table
[Salesmaker].[dbo].[salesmakerkund] Step
DTSRun OnFinish: Delete from Table [Salesmaker].[dbo].[salesmakerkund] Step
DTSRun OnStart: Copy Data from ESMKND to [Salesmaker].[dbo].[salesmakerkund] Step
DTSRun OnError: Copy Data from ESMKND to [Salesmaker].[dbo].[salesmakerkund]
Step, Error = -2147467259 (80004005)
Error string: Error opening datafile: The system cannot find the path specified.
Error source: Microsoft Data Transformation Services Flat File Rowset Provider
Help file: DTSFFile.hlp   Help context: 0
Error Detail Records:
Error: 3 (3); Provider Error: 3 (3)
Error string: Error opening datafile: The system cannot find the path specified.
Error source: Microsoft Data Transformation Services Flat File Rowset Provider
Help file: DTSFFile.hlp   Help context: 0
DTSRun OnFinish: Copy Data from ESMKND to [Salesmaker].[dbo].[s...
Process Exit Code 1. The step failed.

Hi,
Did you map the drive when you were logged in with the SQL Server service
account? It looks like the mapping exists for your user and not for the SQL
Server service account.
Why don't you just use a UNC path? It is more portable and works:
\\servername\sharename\filename
Regards
--
Mike Epprecht, Microsoft SQL Server MVP
Zurich, Switzerland
IM: mike@.epprecht.net
MVP Program: http://www.microsoft.com/mvp
Blog: http://www.msmvps.com/epprecht/
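The drive-letter problem Mike describes can be illustrated with a small path-rewriting sketch (Python; the mapping table and share names below are hypothetical examples, not taken from the original poster's setup):

```python
import ntpath  # Windows path semantics, usable on any platform

def to_unc(path, mappings):
    """Rewrite a mapped-drive path using a {drive-letter: UNC root} table.
    Drive mappings live in a user's logon session, so a service running
    under a different account cannot see them; UNC paths work for everyone."""
    drive, rest = ntpath.splitdrive(path)
    root = mappings.get(drive.upper())
    return ntpath.join(root, rest.lstrip("\\")) if root else path

# Hypothetical mapping, for illustration only
mappings = {"O:": r"\\servername\sharename"}
print(to_unc(r"O:\esmknd\data.txt", mappings))
# \\servername\sharename\esmknd\data.txt
```

Pointing the package's flat-file source at the UNC form sidesteps the per-session mapping entirely, which is why the scheduled run under the Agent account then finds the file.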
LeSurfer,
Check out:
http://support.microsoft.com/defaul...kb;en-us;269074
HTH
Jerry