From:       Jasen Betts <jasen(at)xnet(dot)co(dot)nz>
To:         pgsql-php(at)postgresql(dot)org
Subject:    Re: large resultset
Date:       2010-06-15 12:21:20
Message-ID: hv7r80$ogq@reversiblemaps.ath.cx
Lists:      pgsql-php
On 2010-06-15, Andrew McMillan <andrew(at)morphoss(dot)com> wrote:
> Fundamentally sending 2million of anything can get problematic pretty
> darn quickly, unless the 'thing' is less than 100 bytes.
>
> My personal favourite would be to write a record somewhere saying 'so
> and so wants these 2 million records', and give the user a URL where
> they can fetch them from. Or e-mail them to the user, or... just about
> anything, except try and generate them in-line with the page, in a
> reasonable time for their browser to not give up, or their proxy to not
> give up, or their ISP's transparent proxy to not give up.
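The "record somewhere that so-and-so wants these records" idea can be
sketched roughly like this (Python rather than PHP for brevity; the
`export_jobs` table, its columns, and the URL scheme are all made up
for illustration, not anything from the list):

```python
import secrets


def queue_export(db, user_id, query_params):
    """Record a pending export request and hand back a URL the user
    can poll later, instead of generating 2 million rows in-line
    with the page. (Schema and URL path are illustrative only.)"""
    token = secrets.token_urlsafe(16)
    db.execute(
        "INSERT INTO export_jobs (token, user_id, params, status)"
        " VALUES (%s, %s, %s, 'pending')",
        (token, user_id, query_params),
    )
    # A background worker would pick the job up, write the file,
    # and flip status to 'done'; the page just shows this link.
    return "/exports/%s" % token
```

A cron job or queue worker then does the slow part at its own pace,
with no browser, proxy, or transparent proxy waiting on it.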
Email often fails for attachments over 10 MB.
> Why do they want 2 million record anyway? 2 million of what? Will
> another user drop by 10 seconds later and also want 2 million records?
> The same 2 million? Why does the user want 2 million records? Is there
> something that can be done to the 2 million records to make them a
> smaller but more useful set of information?
/Nobody/ wants a web page with 2 million lines on it.
(Scrolling gets tricky when each pixel covers 2000 lines of data,
and most browsers aren't designed to handle pages that size well.)
Still, if it's served with "Content-Disposition: attachment",
they'll be offered it as a download instead.
(Unless they use IE and you use cookies and SSL, in which case it
doesn't work.)
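A minimal sketch of the attachment approach (Python rather than PHP,
and the row source here is a plain iterable standing in for a
server-side cursor; function and file names are made up):

```python
import csv
import io


def download_headers(filename):
    """HTTP headers that make the browser offer a download instead
    of trying to render millions of lines inline."""
    return {
        "Content-Type": "text/csv",
        "Content-Disposition": 'attachment; filename="%s"' % filename,
    }


def csv_chunks(rows, header):
    """Yield the result set as CSV one chunk at a time, so the
    whole 2 million rows never sit in memory at once."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)
    yield buf.getvalue()
    for row in rows:
        buf.seek(0)
        buf.truncate()
        writer.writerow(row)
        yield buf.getvalue()
```

In practice `rows` would be a named (server-side) cursor so the
driver fetches in batches rather than pulling everything client-side
up front.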