From: postgres(at)countup(dot)de
To: pgsql-performance(at)postgresql(dot)org
Subject: Re: Select-Insert-Query
Date: 2004-03-02 00:49:03
Message-ID: 20040302004903.DA8D55A816E@kserver17.erfurt12.de
Lists: pgsql-performance

Hi,

nobody has an idea? :-(

-----Original Message-----
From: pgsql-performance-owner(at)postgresql(dot)org [mailto:pgsql-performance-owner(at)postgresql(dot)org] On behalf of postgres(at)countup(dot)de
Sent: Friday, 27 February 2004 17:53
To: pgsql-performance(at)postgresql(dot)org
Subject: [PERFORM] Select-Insert-Query

Hi,

what is the most performant way to select, for example, the first 99 rows of a table and insert them into another table?

At the moment I do this:

for userrecord in
    select * from table
    where account_id = a_account_id and counter_id = userrecord.counter_id and visitortable_id between a_minid and a_maxid
    limit 99
loop
    insert into lastusers (account_id, counter_id, date, ip, hostname)
    values (a_account_id, userrecord.counter_id, userrecord.date, userrecord.ip, userrecord.hostname);
end loop;

I think "limit" is a performance killer, is that right? But what to do instead?

Thanks,
bye
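
The fragment above presumably lives inside a PL/pgSQL function that supplies a_account_id, a_minid and a_maxid. A minimal sketch of that context follows; the function name copy_last_users, the source table name visitors (the message only says "table"), the integer types and the parameter a_counter_id are all assumptions, since the posted WHERE clause refers to the loop variable (userrecord.counter_id) inside its own defining query and cannot run as written:

-- Hypothetical context for the posted loop; names and types are assumptions.
CREATE OR REPLACE FUNCTION copy_last_users(a_account_id integer,
                                           a_counter_id integer,
                                           a_minid integer,
                                           a_maxid integer)
RETURNS void AS $$
DECLARE
    userrecord RECORD;
BEGIN
    FOR userrecord IN
        SELECT * FROM visitors
        WHERE account_id = a_account_id
          AND counter_id = a_counter_id          -- stands in for userrecord.counter_id
          AND visitortable_id BETWEEN a_minid AND a_maxid
        LIMIT 99
    LOOP
        -- one single-row INSERT per iteration, up to 99 times
        INSERT INTO lastusers (account_id, counter_id, date, ip, hostname)
        VALUES (a_account_id, userrecord.counter_id, userrecord.date,
                userrecord.ip, userrecord.hostname);
    END LOOP;
END;
$$ LANGUAGE plpgsql;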
From: Oliver Elphick <olly(at)lfix(dot)co(dot)uk>
To: postgres(at)countup(dot)de
Cc: pgsql-performance(at)postgresql(dot)org
Subject: Re: Select-Insert-Query
Date: 2004-03-02 11:48:06
Message-ID: 1078228086.13842.60.camel@cerberus.lfix.co.uk
Lists: pgsql-performance

On Tue, 2004-03-02 at 00:49, postgres(at)countup(dot)de wrote:
> what is the most performant way to select for example the first 99
> rows of a table and insert them into another table...
>
> at the moment i do this:
>
> for userrecord in select * from table where account_id = a_account_id
> and counter_id = userrecord.counter_id and visitortable_id between
> a_minid and a_maxid limit 99 loop

Using LIMIT without ORDER BY will give a selection that is dependent on
the physical location of rows in the table; this will change whenever
one of them is UPDATEd.

> insert into lastusers (account_id, counter_id, date, ip, hostname)
> values(a_account_id,userrecord.counter_id,userrecord.date
> ,userrecord.ip,userrecord.hostname);
> end loop;
>
> i think "limit" is a performance killer, is that right? but what to do
> instead

I'm sure it is the loop that is the killer. Use a query in the INSERT
statement:

INSERT INTO lastusers (account_id, counter_id, date, ip, hostname)
SELECT * FROM table
WHERE account_id = a_account_id AND
counter_id = userrecord.counter_id AND
visitortable_id between a_minid and a_maxid
ORDER BY date DESC
LIMIT 99;
--
Oliver Elphick <olly(at)lfix(dot)co(dot)uk>
LFIX Ltd
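
The suggested statement may not run exactly as posted: the INSERT names five columns while the SELECT supplies every column of the source table, and counter_id = userrecord.counter_id still refers to the loop variable, which no longer exists once the loop is gone. A runnable sketch of the same idea, reusing the hypothetical names from the function sketched after the first message (copy_last_users, visitors and a_counter_id are assumptions, not from the thread):

-- Set-based rewrite of the hypothetical function above; names, types and
-- the ORDER BY column are assumptions.
CREATE OR REPLACE FUNCTION copy_last_users(a_account_id integer,
                                           a_counter_id integer,
                                           a_minid integer,
                                           a_maxid integer)
RETURNS void AS $$
BEGIN
    -- one statement instead of up to 99 single-row INSERTs
    INSERT INTO lastusers (account_id, counter_id, date, ip, hostname)
    SELECT account_id, counter_id, date, ip, hostname
    FROM visitors
    WHERE account_id = a_account_id
      AND counter_id = a_counter_id
      AND visitortable_id BETWEEN a_minid AND a_maxid
    ORDER BY date DESC          -- pins down which 99 rows, per the note about LIMIT
    LIMIT 99;
END;
$$ LANGUAGE plpgsql;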