My suggestion is to use an intermediate table. I have been using this approach for some time and - although somewhat verbose - it works well. So here it is.
- Create an intermediate table with all fields as text. Here is an example with three fields.
create table intermediate_t (field_a text, field_b text, field_c text);
- Populate the intermediate table using copy. You may need extra parameters such as encoding or null. A foreign table as the intermediate is another viable option; see the sketches after the sample data below.
copy intermediate_t from 'c:\temp\test_data.txt'
with (
format 'csv', delimiter ',', quote '"'
);
The sample data file c:\temp\test_data.txt is:
Onion soup, 12.30, false
Creme caramel, "10,50", true
Fish and chips, 15.00, false
Creme brulee, "6,20", true
Mercimek corbasi, 12.00, false
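If the file's encoding or NULL representation differs from the defaults, copy accepts those as options too. The encoding name and NULL marker below are assumptions for illustration, not taken from the sample file:

copy intermediate_t from 'c:\temp\test_data.txt'
with (
    format 'csv', delimiter ',', quote '"',
    encoding 'WIN1252',   -- assumption: the file is Windows-1252 encoded
    null 'NULL'           -- assumption: missing values appear as the literal string NULL
);

And a minimal sketch of the foreign-table variant using the file_fdw extension; the server and table names are made up for the example:

create extension if not exists file_fdw;
create server csv_files foreign data wrapper file_fdw;
create foreign table intermediate_ft (field_a text, field_b text, field_c text)
    server csv_files
    options (filename 'c:\temp\test_data.txt', format 'csv', delimiter ',', quote '"');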
- Select from the intermediate table and format/cast the column values as needed. You have the full power of SQL at your disposal here, no matter how complex the transformation.
select field_a,
case when field_b ~ ',' then replace(field_b,',','.') else field_b end::numeric field_b,
field_c::boolean
from intermediate_t;
You may also create your target table, or a view, using the above query (a view sketch follows the sample output below).
create table target_t as
select field_a,
case when field_b ~ ',' then replace(field_b,',','.') else field_b end::numeric field_b,
field_c::boolean
from intermediate_t;
 field_a          | field_b | field_c
------------------+---------+---------
 Onion soup       |   12.30 | false
 Creme caramel    |   10.50 | true
 Fish and chips   |   15.00 | false
 Creme brulee     |    6.20 | true
 Mercimek corbasi |   12.00 | false
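For a view instead of a table, the same query works unchanged; the view name target_v below is just an example:

create view target_v as
select field_a,
       case when field_b ~ ',' then replace(field_b,',','.') else field_b end::numeric field_b,
       field_c::boolean
from intermediate_t;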
An alternative is to set lc_numeric to a locale that recognizes , as the decimal separator and then use to_number().
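A minimal sketch of that alternative, assuming a comma-decimal locale such as de_DE.UTF-8 is installed on the server:

set lc_numeric = 'de_DE.UTF-8';        -- assumption: this locale exists on the server
select to_number('10,50', '999D99');   -- D follows lc_numeric, so this yields 10.50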