Upload a file into your service using Tiger Cloud Console

Tiger Cloud (Performance, Scale, Enterprise, Free) | Self-hosted products | MST

You can upload files into your service using Tiger Cloud Console. This page explains how to upload CSV, Parquet, and text files from your local machine or from an S3 bucket.
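The structure of the file determines the schema of the table your data lands in. As a purely hypothetical reference, suppose you are uploading a CSV of sensor readings with time, device_id, and temperature columns. A matching table, with the optional conversion to a hypertable for time-based partitioning, could look like the following sketch; the table and column names are assumptions for illustration, not values this page prescribes:

-- Hypothetical target schema for a CSV with columns: time, device_id, temperature
CREATE TABLE sensor_readings (
    time        TIMESTAMPTZ NOT NULL,
    device_id   TEXT        NOT NULL,
    temperature DOUBLE PRECISION
);

-- Optional: turn the table into a hypertable, partitioned by the time column
SELECT create_hypertable('sensor_readings', 'time');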

And that is it: you have imported your data into your Tiger Cloud service.
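To confirm the upload worked, run a quick query against the new table from psql or any SQL client connected to your service. The table name below is the hypothetical sensor_readings example from the sketch above; substitute the table your import created:

-- Check how many rows arrived and preview the most recent ones
SELECT count(*) FROM sensor_readings;
SELECT * FROM sensor_readings ORDER BY time DESC LIMIT 10;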

Keywords

import

Related Content

Sync, import, and migrate your data to Tiger
In Tiger, you can easily import individual files, migrate from other databases, or sync directly so that data from another source is continuously updated.
Upload a file into your service using the terminal
You can upload CSV, MySQL, and Parquet files into your service using the terminal.
Live migration
Migrate your entire database to Tiger Cloud with low downtime.
Sync data from S3 to your service
Synchronize data from S3 to a Tiger Cloud service in real time.
FAQ and troubleshooting
Troubleshoot known issues in database migrations.
Stream data from Kafka into your service
Stream data from Kafka into a Tiger Cloud service to store, query, and analyze your Kafka events efficiently.