PostgreSQL: is it possible to write directly to CSV?
There is a database with a log table (logging of user actions). Data is written to it from many places (server software, triggers, and stored procedures), so the table is noticeably large.
Because this table is read extremely rarely (only during a "debriefing") and speed is not important, the idea arose to write simply to text files (CSV). I remembered the CSV storage engine in the MySQL DBMS. Unfortunately, there are no pluggable storage engines in PG.
Tried options:
1. FOREIGN DATA WRAPPER — for CSV files (file_fdw) it works only for reading.
2. Using a FOREIGN DATA WRAPPER, make a "bridge" to MySQL and save to CSV there. But this requires a MySQL server, which is not very convenient.
3. Using a scheduler and the COPY TO command, regularly export data to CSV files (possibly with subsequent archiving). This feels like a crutch and adds another point of failure.
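For reference, option 3 can be sketched as a single cron entry invoking psql's `\copy` (which writes the file on the client side, so no superuser is needed). The database name, table name (`user_log`), column name (`created_at`), and output path here are all assumptions, not from the original setup:

```shell
# Hypothetical crontab entry: every night at 00:10, export yesterday's
# rows from user_log to a dated CSV file, then compress it.
# Note: % must be escaped as \% inside a crontab line.
10 0 * * * psql -d mydb -c "\copy (SELECT * FROM user_log WHERE created_at >= date_trunc('day', now()) - interval '1 day' AND created_at < date_trunc('day', now())) TO '/var/log/app/user_log_$(date +\%F).csv' WITH (FORMAT csv, HEADER)" && gzip "/var/log/app/user_log_$(date +\%F).csv"
```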
Maybe there are some other options?
For example, in PHP there is file_put_contents() for this: write directly to the file, bypassing the database.
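The same idea as a minimal Python sketch (the file path and the set of fields are assumptions): append each log record straight to a CSV file instead of inserting it into the database. Using the `csv` module rather than raw string concatenation keeps commas, quotes, and newlines inside field values properly escaped.

```python
import csv
from datetime import datetime, timezone

def log_action(path, user_id, action, details=""):
    """Append one log record as a CSV row, bypassing the database.

    Opens the file in append mode, so each call adds one row;
    csv.writer handles quoting/escaping of commas, quotes and newlines.
    """
    with open(path, "a", newline="", encoding="utf-8") as f:
        csv.writer(f).writerow(
            [datetime.now(timezone.utc).isoformat(), user_id, action, details]
        )

# Usage (path is hypothetical):
# log_action("/var/log/app/actions.csv", 42, "login", "ok")
```

One caveat with this approach: if several processes write to the same file concurrently, rows may interleave, so per-process files or file locking may be needed.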