From: John DeSoi <jdesoi(at)gmail(dot)com>
To: tigran2-postgres(at)riatest(dot)com
Cc: pgsql-php(at)postgresql(dot)org
Subject: Re: Storing large files in multiple schemas: BLOB or BYTEA
Date: 2012-10-11 12:59:13
Message-ID: EFC71856-826E-4F87-834B-F624BFD8C5B9@gmail.com
Lists: pgsql-php
On Oct 10, 2012, at 6:12 AM, tigran2-postgres(at)riatest(dot)com wrote:
> 2. BYTEA. These are correctly stored per schema, so pg_dump -n works correctly; however, I cannot seem to find a way to stream the data. This means that there is no way to access the data from PHP if it is larger than the memory limit.
You can get the octet length and then use the substring function to grab large columns in chunks. See
http://www.postgresql.org/docs/current/interactive/functions-binarystring.html
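To make the chunking arithmetic concrete, here is a sketch of the approach John describes. It is not from the thread: the table name `files`, column `data`, and the driver-agnostic `fetch_substring` callback are illustrative assumptions. The key detail is that SQL `substring()` offsets are 1-based; the same two queries (`octet_length(data)` to size the loop, then repeated `substring(data FROM ... FOR ...)`) work identically from PHP with `pg_query_params`.

```python
# Sketch: read a large bytea column in fixed-size chunks rather than all at
# once. Table/column names ("files", "data") are hypothetical examples.

def chunk_plan(total_bytes, chunk_size):
    """Return the 1-based (offset, length) pairs to pass to SQL substring().

    Matching SQL per chunk:
      SELECT substring(data FROM <offset> FOR <length>) FROM files WHERE id = $1
    """
    plan = []
    offset = 1  # SQL substring offsets start at 1, not 0
    while offset <= total_bytes:
        length = min(chunk_size, total_bytes - offset + 1)
        plan.append((offset, length))
        offset += chunk_size
    return plan

def read_chunks(fetch_substring, total_bytes, chunk_size=1024 * 1024):
    """Yield the column's bytes chunk by chunk.

    fetch_substring(offset, length) is your driver call: it should run the
    substring query above (e.g. via psycopg2's cursor.execute, or PHP's
    pg_query_params) and return the resulting bytes for that slice.
    total_bytes comes from: SELECT octet_length(data) FROM files WHERE id = $1
    """
    for offset, length in chunk_plan(total_bytes, chunk_size):
        yield fetch_substring(offset, length)
```

Each iteration round-trips to the server but only ever holds one chunk in memory, which is what makes the bytea approach workable under PHP's memory limit.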
John DeSoi, Ph.D.
| | From | Date | Subject |
|---|---|---|---|
| Next Message | John DeSoi | 2012-10-11 13:05:25 | Re: Storing large files in multiple schemas: BLOB or BYTEA |
| Previous Message | tigran2-postgres | 2012-10-10 10:12:05 | Storing large files in multiple schemas: BLOB or BYTEA |