Importing CSV with timestamps in milliseconds

I am trying to import a CSV file into an existing table. The timestamps in the CSV file are in milliseconds. If I try to import using the REST API or the web interface, I get an error on all timestamp columns.

This is what my files look like:

id,exchange,symbol,date,price,amount,sell
829895171,bw,suiusdt_perp,1741219201694,2.6313,0.9,true

And this is the schema of my table:

CREATE TABLE crypto_trades_binance_btcusd_perp (
    id LONG,
    exchange SYMBOL,
    symbol SYMBOL,
    date TIMESTAMP,
    price DOUBLE,
    amount DOUBLE,
    sell BOOLEAN
) TIMESTAMP(date) PARTITION BY MONTH

I know that when inserting with SQL statements, I can convert the values myself, e.g.

INSERT INTO crypto_trades_binance_btcusd_perp
VALUES (829895171, 'bw', 'suiusdt_perp', cast(1741219201694 * 1000 as TIMESTAMP), 2.6313, 0.9, true)

Is it possible to cast the timestamp values when importing from a CSV with the REST API or web interface as well?

Hi Soren,

I don’t think there is a flag that can handle this right now, but I can raise an issue for it.

Here are some options:

  1. Pre-process the data set to convert the timestamps from millis to micros.
  2. Import the column as DATE and then copy it into a TIMESTAMP column (e.g. from a staging table into the final table).
  3. Read the CSV line by line, convert the timestamps, and send the rows via the ILP client.
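For option 1, a minimal sketch of the pre-processing step in Python, using only the standard library (the file paths and the `date` column name are assumptions based on your sample row):

```python
import csv

def millis_to_micros(in_path, out_path, ts_col="date"):
    """Rewrite a CSV, converting the timestamp column from milliseconds
    to the microseconds that QuestDB expects for TIMESTAMP columns."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            # 1741219201694 (ms) -> 1741219201694000 (us)
            row[ts_col] = str(int(row[ts_col]) * 1000)
            writer.writerow(row)
```

Run this over the file before uploading it via the REST API or the web interface, and the timestamps should parse as-is.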

Got it, pre-processing makes the most sense for my use case. Thanks!

Sounds good!

FWIW, if you intend to import a lot of data, one option is to create Parquet files, copy them into the QuestDB import folder, and read them with read_parquet('file.parquet'). You can then use an INSERT INTO SELECT to copy the data into the final table.

Might be handy in the future, plus Parquet supports timestamps in millis, micros, and nanos.
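The copy step might look roughly like this, assuming the Parquet file's timestamp column already has a proper timestamp type (the file name here is just a placeholder):

```sql
INSERT INTO crypto_trades_binance_btcusd_perp
SELECT id, exchange, symbol, date, price, amount, sell
FROM read_parquet('trades.parquet');
```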