Another quick one from the IRC channel. The problem: import a JSON data file (an array of objects) that lives on the client machine into PostgreSQL, without resorting to external scripting or Unix tools. Because the whole file is read into a single value, this approach only works for files under 1GB.
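For reference, the script expects the file to contain a single JSON array of objects whose key names match the column list given to json_to_recordset. A minimal, hypothetical myfile.json matching the column list used below:
[
  {"a": 1, "b": true, "c": "first row"},
  {"a": 2, "b": false, "c": "second row"}
]
The psql script: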
begin;
-- import the json to a lo object and note the OID
\lo_import :filename
\set obj :LASTOID
-- change this to an insert or whatever you need
-- (see the INSERT variant sketched after the script).
-- 'UTF8' is the encoding the json data is assumed to already be in;
-- it will be converted to the database encoding if that differs.
-- the "as r(a integer, ...)" part should be edited to match the json
-- object key names (remember to "quote" names that aren't all-lowercase)
-- and the desired column types.
create table testtab as
select *
from json_to_recordset(convert_from(lo_get(:'obj'),'UTF8')::json)
as r(a integer, b boolean, c text);
-- clean up the object
\lo_unlink :obj
commit;
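As noted in the comments, the "create table ... as" statement can be replaced by an insert into an existing table. A minimal sketch, assuming a pre-existing table testtab(a integer, b boolean, c text); it goes in the same place inside the transaction, so :obj is still set:
insert into testtab (a, b, c)
select *
from json_to_recordset(convert_from(lo_get(:'obj'),'UTF8')::json)
as r(a integer, b boolean, c text);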
Put the above in a file, e.g. import.sql, and run it as:
psql -v filename="myfile.json" -d mydb -f import.sql
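If the import succeeds, the newly created table can be checked straight away, for example:
psql -d mydb -c 'select count(*) from testtab;'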
For this purpose I wrote pgimportdoc: https://github.com/okbob/pgimportdoc
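A hedged usage sketch (the -f, -c and -t options are assumed from the project's README; check pgimportdoc --help for the exact syntax of your version). The $1 placeholder receives the imported document, and the target table here is hypothetical:
pgimportdoc mydb -f myfile.json -c 'insert into raw_docs(doc) values($1)' -t TEXT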