From: | "Zeugswetter Andreas SB SD" <ZeugswetterA(at)spardat(dot)at> |
---|---|
To: | "Thomas Swan" <tswan(at)olemiss(dot)edu>, "Lee Kindness" <lkindness(at)csl(dot)co(dot)uk> |
Cc: | "Tom Lane" <tgl(at)sss(dot)pgh(dot)pa(dot)us>, <pgsql-hackers(at)postgresql(dot)org> |
Subject: | Re: Bulkloading using COPY - ignore duplicates? |
Date: | 2001-10-01 14:39:36 |
Message-ID: | 46C15C39FEB2C44BA555E356FBCD6FA41EB3A0@m0114.s-mxs.net |
Lists: pgsql-hackers
> IMHO, you should copy into a temporary table and then do a select
> distinct from it into the table that you want.
Which would be way too slow for normal operation :-(
We are talking about an as-fast-as-possible data load from a flat file
that may contain duplicates (or even data errors, but that is another issue).
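
For reference, a minimal sketch of the staging-table approach quoted above,
assuming a hypothetical target table, columns, and file path (none of these
appear in the original thread):

    -- Hypothetical target table whose primary key the flat file may violate.
    CREATE TABLE target (id integer PRIMARY KEY, val text);

    -- Stage the raw flat file in a temporary table with no constraints,
    -- so COPY cannot fail on duplicate keys.
    CREATE TEMP TABLE staging (id integer, val text);
    COPY staging FROM '/tmp/data.txt';

    -- Move only distinct rows into the real table. SELECT DISTINCT drops
    -- fully identical rows only; duplicate keys with differing payloads
    -- would still abort the INSERT and need different handling.
    INSERT INTO target SELECT DISTINCT * FROM staging;

The extra write of the staging table plus the sort behind DISTINCT is
exactly the overhead objected to above.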
Andreas