ℹ️ Skipped - page is already crawled
| Filter | Status | Condition | Details |
|---|---|---|---|
| HTTP status | PASS | download_http_code = 200 | HTTP 200 |
| Age cutoff | PASS | download_stamp > now() - 6 MONTH | 0.7 months ago |
| History drop | PASS | isNull(history_drop_reason) | No drop reason |
| Spam/ban | PASS | fh_dont_index != 1 AND ml_spam_score = 0 | ml_spam_score=0 |
| Canonical | PASS | meta_canonical IS NULL OR = '' OR = src_unparsed | Not set |
| Property | Value |
|---|---|
| URL | https://clickhouse.com/docs/use-cases/time-series/basic-operations |
| Last Crawled | 2026-04-04 03:37:42 (19 days ago) |
| First Indexed | 2025-04-26 20:09:09 (12 months ago) |
| HTTP Status Code | 200 |
| Content | |
| Meta Title | Basic operations - time-series | ClickHouse Docs |
| Meta Description | Basic time-series operations in ClickHouse. |
| Meta Canonical | null |
| Boilerpipe Text | ClickHouse provides several methods for working with time series data, allowing you to aggregate, group, and analyze data points across different time periods.
This section covers the fundamental operations commonly used when working with time-based data.
Common operations include grouping data by time intervals, handling gaps in time series data, and calculating changes between time periods.
These operations can be performed using standard SQL syntax combined with ClickHouse's built-in time functions.
We're going to explore ClickHouse time-series querying capabilities with the Wikistat (Wikipedia pageviews data) dataset:
CREATE TABLE wikistat
(
`time` DateTime,
`project` String,
`subproject` String,
`path` String,
`hits` UInt64
)
ENGINE = MergeTree
ORDER BY (time);
Let's populate this table with 1 billion records:
INSERT INTO wikistat
SELECT *
FROM s3('https://ClickHouse-public-datasets.s3.amazonaws.com/wikistat/partitioned/wikistat*.native.zst')
LIMIT 1e9;
Aggregating by time bucket
The most common requirement is to aggregate data by period, e.g., to get the total number of hits for each day:
SELECT
toDate(time) AS date,
sum(hits) AS hits
FROM wikistat
GROUP BY ALL
ORDER BY date ASC
LIMIT 5;
┌───────date─┬─────hits─┐
│ 2015-05-01 │ 25524369 │
│ 2015-05-02 │ 25608105 │
│ 2015-05-03 │ 28567101 │
│ 2015-05-04 │ 29229944 │
│ 2015-05-05 │ 29383573 │
└────────────┴──────────┘
We've used the toDate() function here, which converts the specified time to a date type. Alternatively, we can bucket by hour and filter on a specific date:
SELECT
toStartOfHour(time) AS hour,
sum(hits) AS hits
FROM wikistat
WHERE date(time) = '2015-07-01'
GROUP BY ALL
ORDER BY hour ASC
LIMIT 5;
┌────────────────hour─┬───hits─┐
│ 2015-07-01 00:00:00 │ 656676 │
│ 2015-07-01 01:00:00 │ 768837 │
│ 2015-07-01 02:00:00 │ 862311 │
│ 2015-07-01 03:00:00 │ 829261 │
│ 2015-07-01 04:00:00 │ 749365 │
└─────────────────────┴────────┘
The toStartOfHour() function used here converts the given time to the start of the hour. You can also group by year, quarter, month, or day.
Custom grouping intervals
We can even group by arbitrary intervals, e.g., 5 minutes, using the toStartOfInterval() function.
Let's say we want to group by 4-hour intervals. We can specify the grouping interval using the INTERVAL clause:
SELECT
toStartOfInterval(time, INTERVAL 4 HOUR) AS interval,
sum(hits) AS hits
FROM wikistat
WHERE date(time) = '2015-07-01'
GROUP BY ALL
ORDER BY interval ASC
LIMIT 6;
Or we can use the toIntervalHour() function:
SELECT
toStartOfInterval(time, toIntervalHour(4)) AS interval,
sum(hits) AS hits
FROM wikistat
WHERE date(time) = '2015-07-01'
GROUP BY ALL
ORDER BY interval ASC
LIMIT 6;
Either way, we get the following results:
┌────────────interval─┬────hits─┐
│ 2015-07-01 00:00:00 │ 3117085 │
│ 2015-07-01 04:00:00 │ 2928396 │
│ 2015-07-01 08:00:00 │ 2679775 │
│ 2015-07-01 12:00:00 │ 2461324 │
│ 2015-07-01 16:00:00 │ 2823199 │
│ 2015-07-01 20:00:00 │ 2984758 │
└─────────────────────┴─────────┘
Filling empty groups
We often deal with sparse data in which some intervals are absent, resulting in empty buckets. In the following example, we group data by 1-hour intervals; the output is missing values for several hours:
SELECT
toStartOfHour(time) AS hour,
sum(hits)
FROM wikistat
WHERE (project = 'ast') AND (subproject = 'm') AND (date(time) = '2015-07-01')
GROUP BY ALL
ORDER BY hour ASC;
┌────────────────hour─┬─sum(hits)─┐
│ 2015-07-01 00:00:00 │         3 │ <- missing values
│ 2015-07-01 02:00:00 │         1 │ <- missing values
│ 2015-07-01 04:00:00 │         1 │
│ 2015-07-01 05:00:00 │         2 │
│ 2015-07-01 06:00:00 │         1 │
│ 2015-07-01 07:00:00 │         1 │
│ 2015-07-01 08:00:00 │         3 │
│ 2015-07-01 09:00:00 │         2 │ <- missing values
│ 2015-07-01 12:00:00 │         2 │
│ 2015-07-01 13:00:00 │         4 │
│ 2015-07-01 14:00:00 │         2 │
│ 2015-07-01 15:00:00 │         2 │
│ 2015-07-01 16:00:00 │         2 │
│ 2015-07-01 17:00:00 │         1 │
│ 2015-07-01 18:00:00 │         5 │
│ 2015-07-01 19:00:00 │         5 │
│ 2015-07-01 20:00:00 │         4 │
│ 2015-07-01 21:00:00 │         4 │
│ 2015-07-01 22:00:00 │         2 │
│ 2015-07-01 23:00:00 │         2 │
└─────────────────────┴───────────┘
ClickHouse provides the WITH FILL modifier to address this. It fills all the empty hours with zeros, so we can better understand the distribution over time:
SELECT
toStartOfHour(time) AS hour,
sum(hits)
FROM wikistat
WHERE (project = 'ast') AND (subproject = 'm') AND (date(time) = '2015-07-01')
GROUP BY ALL
ORDER BY hour ASC WITH FILL STEP toIntervalHour(1);
┌────────────────hour─┬─sum(hits)─┐
│ 2015-07-01 00:00:00 │         3 │
│ 2015-07-01 01:00:00 │         0 │ <- new value
│ 2015-07-01 02:00:00 │         1 │
│ 2015-07-01 03:00:00 │         0 │ <- new value
│ 2015-07-01 04:00:00 │         1 │
│ 2015-07-01 05:00:00 │         2 │
│ 2015-07-01 06:00:00 │         1 │
│ 2015-07-01 07:00:00 │         1 │
│ 2015-07-01 08:00:00 │         3 │
│ 2015-07-01 09:00:00 │         2 │
│ 2015-07-01 10:00:00 │         0 │ <- new value
│ 2015-07-01 11:00:00 │         0 │ <- new value
│ 2015-07-01 12:00:00 │         2 │
│ 2015-07-01 13:00:00 │         4 │
│ 2015-07-01 14:00:00 │         2 │
│ 2015-07-01 15:00:00 │         2 │
│ 2015-07-01 16:00:00 │         2 │
│ 2015-07-01 17:00:00 │         1 │
│ 2015-07-01 18:00:00 │         5 │
│ 2015-07-01 19:00:00 │         5 │
│ 2015-07-01 20:00:00 │         4 │
│ 2015-07-01 21:00:00 │         4 │
│ 2015-07-01 22:00:00 │         2 │
│ 2015-07-01 23:00:00 │         2 │
└─────────────────────┴───────────┘
Rolling time windows
Sometimes we don't want to bucket by the start of an interval (like the start of a day or an hour) but by sliding window intervals. Let's say we want the total hits for windows based not on calendar days but on 24-hour periods offset from 6 pm.
We can use the dateDiff() function to calculate the difference between a reference time and each record's time. In this case, the day column will represent the difference in days (e.g., 1 day ago, 2 days ago, etc.):
SELECT
dateDiff('day', toDateTime('2015-05-01 18:00:00'), time) AS day,
sum(hits)
FROM wikistat
GROUP BY ALL
ORDER BY day ASC
LIMIT 5;
┌─day─┬─sum(hits)─┐
│   0 │  25524369 │
│   1 │  25608105 │
│   2 │  28567101 │
│   3 │  29229944 │
│   4 │  29383573 │
└─────┴───────────┘ |
| Markdown | # Basic time-series operations
ClickHouse provides several methods for working with time series data, allowing you to aggregate, group, and analyze data points across different time periods. This section covers the fundamental operations commonly used when working with time-based data.
Common operations include grouping data by time intervals, handling gaps in time series data, and calculating changes between time periods. These operations can be performed using standard SQL syntax combined with ClickHouse's built-in time functions.
We're going to explore ClickHouse time-series querying capabilities with the Wikistat (Wikipedia pageviews data) dataset:
```
CREATE TABLE wikistat
(
`time` DateTime,
`project` String,
`subproject` String,
`path` String,
`hits` UInt64
)
ENGINE = MergeTree
ORDER BY (time);
```
Let's populate this table with 1 billion records:
```
INSERT INTO wikistat
SELECT *
FROM s3('https://ClickHouse-public-datasets.s3.amazonaws.com/wikistat/partitioned/wikistat*.native.zst')
LIMIT 1e9;
```
## Aggregating by time bucket[β](https://clickhouse.com/docs/use-cases/time-series/basic-operations#time-series-aggregating-time-bucket "Direct link to Aggregating by time bucket")
The most common requirement is to aggregate data by period, e.g., to get the total number of hits for each day:
```
SELECT
toDate(time) AS date,
sum(hits) AS hits
FROM wikistat
GROUP BY ALL
ORDER BY date ASC
LIMIT 5;
```
```
┌───────date─┬─────hits─┐
│ 2015-05-01 │ 25524369 │
│ 2015-05-02 │ 25608105 │
│ 2015-05-03 │ 28567101 │
│ 2015-05-04 │ 29229944 │
│ 2015-05-05 │ 29383573 │
└────────────┴──────────┘
```
We've used the [`toDate()`](https://clickhouse.com/docs/sql-reference/functions/type-conversion-functions#toDate) function here, which converts the specified time to a date type. Alternatively, we can bucket by hour and filter on a specific date:
```
SELECT
toStartOfHour(time) AS hour,
sum(hits) AS hits
FROM wikistat
WHERE date(time) = '2015-07-01'
GROUP BY ALL
ORDER BY hour ASC
LIMIT 5;
```
```
┌────────────────hour─┬───hits─┐
│ 2015-07-01 00:00:00 │ 656676 │
│ 2015-07-01 01:00:00 │ 768837 │
│ 2015-07-01 02:00:00 │ 862311 │
│ 2015-07-01 03:00:00 │ 829261 │
│ 2015-07-01 04:00:00 │ 749365 │
└─────────────────────┴────────┘
```
The [`toStartOfHour()`](https://clickhouse.com/docs/sql-reference/functions/date-time-functions#toStartOfHour) function used here converts the given time to the start of the hour. You can also group by year, quarter, month, or day.
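For instance, a sketch of the coarser monthly grouping, assuming the same `wikistat` table from above (swap in `toStartOfQuarter()` or `toStartOfYear()` for the other granularities; output omitted since it depends on the loaded data):

```sql
-- Monthly totals: toStartOfMonth() truncates each timestamp
-- to the first day of its month.
SELECT
    toStartOfMonth(time) AS month,
    sum(hits) AS hits
FROM wikistat
GROUP BY ALL
ORDER BY month ASC;
```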
## Custom grouping intervals[β](https://clickhouse.com/docs/use-cases/time-series/basic-operations#time-series-custom-grouping-intervals "Direct link to Custom grouping intervals")
We can even group by arbitrary intervals, e.g., 5 minutes, using the [`toStartOfInterval()`](https://clickhouse.com/docs/sql-reference/functions/date-time-functions#toStartOfInterval) function.
Let's say we want to group by 4-hour intervals. We can specify the grouping interval using the [`INTERVAL`](https://clickhouse.com/docs/sql-reference/data-types/special-data-types/interval) clause:
```
SELECT
toStartOfInterval(time, INTERVAL 4 HOUR) AS interval,
sum(hits) AS hits
FROM wikistat
WHERE date(time) = '2015-07-01'
GROUP BY ALL
ORDER BY interval ASC
LIMIT 6;
```
Or we can use the [`toIntervalHour()`](https://clickhouse.com/docs/sql-reference/functions/type-conversion-functions#toIntervalHour) function:
```
SELECT
toStartOfInterval(time, toIntervalHour(4)) AS interval,
sum(hits) AS hits
FROM wikistat
WHERE date(time) = '2015-07-01'
GROUP BY ALL
ORDER BY interval ASC
LIMIT 6;
```
Either way, we get the following results:
```
┌────────────interval─┬────hits─┐
│ 2015-07-01 00:00:00 │ 3117085 │
│ 2015-07-01 04:00:00 │ 2928396 │
│ 2015-07-01 08:00:00 │ 2679775 │
│ 2015-07-01 12:00:00 │ 2461324 │
│ 2015-07-01 16:00:00 │ 2823199 │
│ 2015-07-01 20:00:00 │ 2984758 │
└─────────────────────┴─────────┘
```
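The 5-minute grouping mentioned earlier follows the same pattern; a sketch against the same `wikistat` table (the hour chosen is an arbitrary illustration; output omitted):

```sql
-- Hits in 5-minute buckets for one hour of one day
SELECT
    toStartOfInterval(time, INTERVAL 5 MINUTE) AS interval,
    sum(hits) AS hits
FROM wikistat
WHERE toStartOfHour(time) = toDateTime('2015-07-01 00:00:00')
GROUP BY ALL
ORDER BY interval ASC;
```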
## Filling empty groups[β](https://clickhouse.com/docs/use-cases/time-series/basic-operations#time-series-filling-empty-groups "Direct link to Filling empty groups")
We often deal with sparse data in which some intervals are absent, resulting in empty buckets. In the following example, we group data by 1-hour intervals; the output is missing values for several hours:
```
SELECT
toStartOfHour(time) AS hour,
sum(hits)
FROM wikistat
WHERE (project = 'ast') AND (subproject = 'm') AND (date(time) = '2015-07-01')
GROUP BY ALL
ORDER BY hour ASC;
```
```
┌────────────────hour─┬─sum(hits)─┐
│ 2015-07-01 00:00:00 │         3 │ <- missing values
│ 2015-07-01 02:00:00 │         1 │ <- missing values
│ 2015-07-01 04:00:00 │         1 │
│ 2015-07-01 05:00:00 │         2 │
│ 2015-07-01 06:00:00 │         1 │
│ 2015-07-01 07:00:00 │         1 │
│ 2015-07-01 08:00:00 │         3 │
│ 2015-07-01 09:00:00 │         2 │ <- missing values
│ 2015-07-01 12:00:00 │         2 │
│ 2015-07-01 13:00:00 │         4 │
│ 2015-07-01 14:00:00 │         2 │
│ 2015-07-01 15:00:00 │         2 │
│ 2015-07-01 16:00:00 │         2 │
│ 2015-07-01 17:00:00 │         1 │
│ 2015-07-01 18:00:00 │         5 │
│ 2015-07-01 19:00:00 │         5 │
│ 2015-07-01 20:00:00 │         4 │
│ 2015-07-01 21:00:00 │         4 │
│ 2015-07-01 22:00:00 │         2 │
│ 2015-07-01 23:00:00 │         2 │
└─────────────────────┴───────────┘
```
ClickHouse provides the [`WITH FILL`](https://clickhouse.com/docs/guides/developer/time-series-filling-gaps#with-fill) modifier to address this. It fills all the empty hours with zeros, so we can better understand the distribution over time:
```
SELECT
toStartOfHour(time) AS hour,
sum(hits)
FROM wikistat
WHERE (project = 'ast') AND (subproject = 'm') AND (date(time) = '2015-07-01')
GROUP BY ALL
ORDER BY hour ASC WITH FILL STEP toIntervalHour(1);
```
```
┌────────────────hour─┬─sum(hits)─┐
│ 2015-07-01 00:00:00 │         3 │
│ 2015-07-01 01:00:00 │         0 │ <- new value
│ 2015-07-01 02:00:00 │         1 │
│ 2015-07-01 03:00:00 │         0 │ <- new value
│ 2015-07-01 04:00:00 │         1 │
│ 2015-07-01 05:00:00 │         2 │
│ 2015-07-01 06:00:00 │         1 │
│ 2015-07-01 07:00:00 │         1 │
│ 2015-07-01 08:00:00 │         3 │
│ 2015-07-01 09:00:00 │         2 │
│ 2015-07-01 10:00:00 │         0 │ <- new value
│ 2015-07-01 11:00:00 │         0 │ <- new value
│ 2015-07-01 12:00:00 │         2 │
│ 2015-07-01 13:00:00 │         4 │
│ 2015-07-01 14:00:00 │         2 │
│ 2015-07-01 15:00:00 │         2 │
│ 2015-07-01 16:00:00 │         2 │
│ 2015-07-01 17:00:00 │         1 │
│ 2015-07-01 18:00:00 │         5 │
│ 2015-07-01 19:00:00 │         5 │
│ 2015-07-01 20:00:00 │         4 │
│ 2015-07-01 21:00:00 │         4 │
│ 2015-07-01 22:00:00 │         2 │
│ 2015-07-01 23:00:00 │         2 │
└─────────────────────┴───────────┘
```
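`WITH FILL` also accepts optional `FROM` and `TO` bounds, which helps when the very first or last buckets are absent from the data; a sketch of the same query with an explicit day range (the bounds shown are illustrative assumptions):

```sql
-- FROM is inclusive and TO is exclusive, so this fills the
-- full day of 2015-07-01 even if the edge hours have no rows.
SELECT
    toStartOfHour(time) AS hour,
    sum(hits)
FROM wikistat
WHERE (project = 'ast') AND (subproject = 'm') AND (date(time) = '2015-07-01')
GROUP BY ALL
ORDER BY hour ASC
    WITH FILL
        FROM toDateTime('2015-07-01 00:00:00')
        TO toDateTime('2015-07-02 00:00:00')
        STEP toIntervalHour(1);
```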
## Rolling time windows[β](https://clickhouse.com/docs/use-cases/time-series/basic-operations#time-series-rolling-time-windows "Direct link to Rolling time windows")
Sometimes we don't want to bucket by the start of an interval (like the start of a day or an hour) but by sliding window intervals. Let's say we want the total hits for windows based not on calendar days but on 24-hour periods offset from 6 pm.
We can use the [`date_diff()`](https://clickhouse.com/docs/sql-reference/functions/date-time-functions#timeDiff) function to calculate the difference between a reference time and each record's time. In this case, the `day` column will represent the difference in days (e.g., 1 day ago, 2 days ago, etc.):
```
SELECT
dateDiff('day', toDateTime('2015-05-01 18:00:00'), time) AS day,
    sum(hits)
FROM wikistat
GROUP BY ALL
ORDER BY day ASC
LIMIT 5;
```
```
┌─day─┬─sum(hits)─┐
│   0 │  25524369 │
│   1 │  25608105 │
│   2 │  28567101 │
│   3 │  29229944 │
│   4 │  29383573 │
└─────┴───────────┘
```
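Note that `dateDiff('day', ...)` counts crossed calendar-day boundaries rather than elapsed 24-hour spans. If strict 24-hour windows anchored at the 18:00 reference are needed, one alternative (a sketch under that assumption; output omitted) is to count elapsed hours and divide:

```sql
-- dateDiff('hour', ...) counts crossed hour boundaries; because the
-- reference falls exactly on the hour, intDiv(..., 24) yields true
-- 24-hour windows anchored at 18:00.
SELECT
    intDiv(dateDiff('hour', toDateTime('2015-05-01 18:00:00'), time), 24) AS window,
    sum(hits) AS hits
FROM wikistat
WHERE time >= toDateTime('2015-05-01 18:00:00')
GROUP BY ALL
ORDER BY window ASC
LIMIT 5;
```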
 |
| ML Classification | |
| ML Categories | null |
| ML Page Types | null |
| ML Intent Types | null |
| Content Metadata | |
| Language | en |
| Author | null |
| Publish Time | not set |
| Original Publish Time | 2025-04-26 20:09:09 (12 months ago) |
| Republished | No |
| Word Count (Total) | 1,431 |
| Word Count (Content) | 994 |
| Links | |
| External Links | 5 |
| Internal Links | 92 |
| Technical SEO | |
| Meta Nofollow | No |
| Meta Noarchive | No |
| JS Rendered | No |
| Redirect Target | null |
| Performance | |
| Download Time (ms) | 204 |
| TTFB (ms) | 203 |
| Download Size (bytes) | 13,398 |
| Shard | 89 (laksa) |
| Root Hash | 12633450985039531489 |
| Unparsed URL | com,clickhouse!/docs/use-cases/time-series/basic-operations s443 |