importing to a linked back end

Guest

Hi,
We have a split database. A daily process we need to undertake is to import
data into the back end, using a function (i.e. a macro and import spec) in the
front end.
However, we seem to get an error. I'd like to keep the front end free of all
data, but can't see how to import the data (a text file) into the back end.

Can anyone help? Appreciate your time.
Cheers
Mike
 
Rick Brandt


If you run the import function in the front end and specify a linked table
as the target, then you are importing the data into the back end.

What is the error?
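For reference, a minimal VBA equivalent of such a macro action might look like the sketch below. The spec name, table name and file path here are placeholders, not values from this thread; because the named table is a link, Jet follows the link and the rows land in the back-end file.

' Sketch: import a delimited text file into a table that is linked in the
' front end; the data is written to the back end the link points at.
' SpecificationName, TableName and FileName are placeholder values.
Public Sub ImportDailyFile()
    DoCmd.TransferText _
        TransferType:=acImportDelim, _
        SpecificationName:="Daily Import Spec", _
        TableName:="tbl_LinkedTarget", _
        FileName:="C:\Imports\daily.txt", _
        HasFieldNames:=False
End Sub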
 
Guest

Yes, I have specified the linked table (tbl_FICash). It's a TransferText function:
transfer type: Import Delimited
spec: a specification is set up to format the data
table name: tbl_FICash
file name: path included
field names: no

and it yields the error "Numeric field overflow".
It works fine if I import it into an identically named table in the front end
and then copy the contents to the back end, but that's not ideal for
obvious reasons.

Does that make sense?
Thanks for coming back so quickly, any help appreciated... do I need to put
anything different in the table name field?
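A "Numeric field overflow" typically means a value in the file does not fit the numeric type of the column it is being written to, or that non-numeric text is landing in a numeric column. If the local front-end table was itself created by an earlier import, its numeric types may be more permissive than the back-end table's, which could explain the difference. One hedged diagnostic is to list the linked table's field types with standard DAO objects, as in this sketch:

' Sketch: print the name, type constant and size of each field in the linked
' target table, to spot a numeric column that may be too small for the data
' (or a numeric column that is receiving text, such as a header row).
Public Sub ListTargetFields()
    Dim db As DAO.Database
    Dim fld As DAO.Field
    Set db = CurrentDb
    For Each fld In db.TableDefs("tbl_FICash").Fields
        Debug.Print fld.Name, fld.Type, fld.Size
    Next fld
End Sub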
 
Rick Brandt


I can't think of any reason for that to happen. I would try deleting and then
re-creating the link in the front end in case the link definition is outdated or
corrupted in any way.
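If re-creating the link turns out to be needed regularly, it can also be scripted. A rough sketch, in which the back-end path is a placeholder (deleting a linked table only removes the link, not the back-end table):

' Sketch: drop and re-create the link to the back-end table so the front end
' picks up a fresh link definition. The back-end path is a placeholder.
Public Sub RelinkFICash()
    On Error Resume Next
    DoCmd.DeleteObject acTable, "tbl_FICash"   ' remove the stale link only
    On Error GoTo 0
    DoCmd.TransferDatabase acLink, "Microsoft Access", _
        "C:\Data\BackEnd.mdb", acTable, "tbl_FICash", "tbl_FICash"
End Sub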
 
Guest

Hi Rick,

Tried re-creating the link - that didn't quite do it, but I spotted the problem and
remembered that we usually get import errors where the column headers are
non-numeric. The imported file comes from a mainframe system; I opened the text
file, removed the column headers manually, and it imported into the back end with
no problem. I guess you just need to be a little more careful about the integrity
of the data being imported than I was at first.

Thanks for your help with this. I guess my next task is being able to
automatically remove the top three lines of the mainframe file every day, as
audit wouldn't like us manually editing them as routine... any help on that
obviously appreciated if you can think of it!
Cheers
Mike
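Removing a fixed number of header lines before the import can be automated. One possible approach, sketched below, rewrites the file without its first three lines; the file paths are placeholders and it assumes a plain text file processed line by line.

' Sketch: copy the mainframe extract to a new file, skipping the first
' three (header) lines, so the import spec only ever sees data rows.
' Paths are placeholders.
Public Sub StripHeaderLines()
    Dim inFile As Integer, outFile As Integer
    Dim lineText As String
    Dim lineNum As Long

    inFile = FreeFile
    Open "C:\Imports\ficash_raw.txt" For Input As #inFile
    outFile = FreeFile
    Open "C:\Imports\ficash.txt" For Output As #outFile

    Do While Not EOF(inFile)
        Line Input #inFile, lineText
        lineNum = lineNum + 1
        If lineNum > 3 Then Print #outFile, lineText
    Loop

    Close #outFile
    Close #inFile
End Sub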
 
Rick Brandt


When importing from non-Jet sources it is actually quite common to first import
into a very generic staging table that will accept just about anything without
protest, and then use queries and code to move the data into the final table.
That allows you to massage the data and run checks on it so that errors are
avoided in the final move.

I would not consider that a practice to be avoided at all, but rather just the
smartest way to do it.
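A rough outline of that pattern might look like the sketch below, assuming a staging table (here called tbl_FICash_Staging) with text columns for every field and a separate import spec for it; all table, spec, path and field names are placeholders:

' Sketch: import the raw file into a forgiving all-Text staging table, then
' append only the rows that pass basic checks into the real back-end table.
' Table, spec, path and field names are placeholders.
Public Sub ImportViaStaging()
    ' Clear out yesterday's staging rows.
    CurrentDb.Execute "DELETE FROM tbl_FICash_Staging", dbFailOnError

    ' Land the raw text in the staging table (all Text fields, so nothing overflows).
    DoCmd.TransferText acImportDelim, "FICash Staging Spec", _
        "tbl_FICash_Staging", "C:\Imports\ficash.txt", False

    ' Move only rows that look like data (e.g. a numeric amount) into the
    ' linked back-end table, converting types on the way.
    CurrentDb.Execute _
        "INSERT INTO tbl_FICash (TranDate, Amount) " & _
        "SELECT CDate(TranDate), CDbl(Amount) " & _
        "FROM tbl_FICash_Staging WHERE IsNumeric(Amount)", dbFailOnError
End Sub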
 
Guest

Thanks Rick, I hadn't actually thought of that. I was editing the raw *.txt file
before the import, but will take your suggestion of importing into a mule
table and then processing the data.
Thanks very much, saved a great deal of time.
Cheers
Mike
 
