πŸ•·οΈ Crawler Inspector

URL Lookup

Direct Parameter Lookup

Raw Queries and Responses

1. Shard Calculation

Query:
Response:
Calculated Shard: 89 (from laksa168)
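The inspector reports only the result, not the formula. A common sharding scheme hashes the key and takes the remainder; the sketch below is hypothetical — the `NUM_SHARDS = 100` constant and the choice of md5 are assumptions, not taken from this tool:

```python
import hashlib

NUM_SHARDS = 100  # assumption: shard ids 0-99; the inspector shows shard 89

def shard_for_key(key: str, num_shards: int = NUM_SHARDS) -> int:
    """Hash the shard key (here 'laksa168') and take the remainder.

    md5 is used only as a stable, language-independent hash; the real
    crawler's hash function is not shown by the inspector.
    """
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_shards

print(shard_for_key("laksa168"))
```

Any stable hash works here; the only requirement is that every crawler node computes the same shard for the same key.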

2. Crawled Status Check

Query:
Response:

3. Robots.txt Check

Query:
Response:

4. Spam/Ban Check

Query:
Response:

5. Seen Status Check

ℹ️ Skipped - page is already crawled

πŸ“„ INDEXABLE
βœ… CRAWLED (19 days ago)
πŸ€– ROBOTS ALLOWED

Page Info Filters

| Filter | Status | Condition | Details |
| --- | --- | --- | --- |
| HTTP status | PASS | `download_http_code = 200` | HTTP 200 |
| Age cutoff | PASS | `download_stamp > now() - 6 MONTH` | 0.7 months ago |
| History drop | PASS | `isNull(history_drop_reason)` | No drop reason |
| Spam/ban | PASS | `fh_dont_index != 1 AND ml_spam_score = 0` | ml_spam_score=0 |
| Canonical | PASS | `meta_canonical IS NULL OR = '' OR = src_unparsed` | Not set |
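Taken together, the five filters are a simple conjunction. A minimal Python sketch of that gate, with field names copied from the filter conditions; the function itself and the 183-day approximation of "6 MONTH" are illustrative, not the crawler's code:

```python
from datetime import datetime, timedelta

def passes_page_info_filters(page: dict, now: datetime) -> bool:
    """AND together the five page-info checks."""
    canonical = page.get("meta_canonical")
    return (
        page["download_http_code"] == 200                       # HTTP status
        and page["download_stamp"] > now - timedelta(days=183)  # age cutoff, ~6 months
        and page.get("history_drop_reason") is None             # history drop
        and page.get("fh_dont_index") != 1                      # ban flag
        and page.get("ml_spam_score") == 0                      # spam score
        and (canonical in (None, "") or canonical == page["src_unparsed"])
    )

page = {
    "download_http_code": 200,
    "download_stamp": datetime(2026, 4, 4, 3, 37, 42),
    "history_drop_reason": None,
    "fh_dont_index": 0,
    "ml_spam_score": 0,
    "meta_canonical": None,
    "src_unparsed": "https://clickhouse.com/docs/use-cases/time-series/basic-operations",
}
print(passes_page_info_filters(page, now=datetime(2026, 4, 23)))  # True for this page
```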

Page Details

| Property | Value |
| --- | --- |
| URL | https://clickhouse.com/docs/use-cases/time-series/basic-operations |
| Last Crawled | 2026-04-04 03:37:42 (19 days ago) |
| First Indexed | 2025-04-26 20:09:09 (12 months ago) |
| HTTP Status Code | 200 |
| Meta Title | Basic operations - time-series \| ClickHouse Docs |
| Meta Description | Basic time-series operations in ClickHouse. |
| Meta Canonical | null |
Boilerpipe Text
ClickHouse provides several methods for working with time series data, allowing you to aggregate, group, and analyze data points across different time periods. This section covers the fundamental operations commonly used when working with time-based data. Common operations include grouping data by time intervals, handling gaps in time series data, and calculating changes between time periods. These operations can be performed using standard SQL syntax combined with ClickHouse's built-in time functions. We're going to explore ClickHouse time-series querying capabilities with the Wikistat (Wikipedia pageviews data) dataset: CREATE TABLE wikistat ( `time` DateTime, `project` String, `subproject` String, `path` String, `hits` UInt64 ) ENGINE = MergeTree ORDER BY (time); Let's populate this table with 1 billion records: INSERT INTO wikistat SELECT * FROM s3('https://ClickHouse-public-datasets.s3.amazonaws.com/wikistat/partitioned/wikistat*.native.zst') LIMIT 1e9; Aggregating by time bucket ​ The most popular requirement is to aggregate data based on periods, e.g. get the total amount of hits for each day: SELECT toDate(time) AS date, sum(hits) AS hits FROM wikistat GROUP BY ALL ORDER BY date ASC LIMIT 5; β”Œβ”€β”€β”€β”€β”€β”€β”€date─┬─────hits─┐ β”‚ 2015-05-01 β”‚ 25524369 β”‚ β”‚ 2015-05-02 β”‚ 25608105 β”‚ β”‚ 2015-05-03 β”‚ 28567101 β”‚ β”‚ 2015-05-04 β”‚ 29229944 β”‚ β”‚ 2015-05-05 β”‚ 29383573 β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ We've used the toDate() function here, which converts the specified time to a date type. 
Alternatively, we can batch by an hour and filter on the specific date: SELECT toStartOfHour(time) AS hour, sum(hits) AS hits FROM wikistat WHERE date(time) = '2015-07-01' GROUP BY ALL ORDER BY hour ASC LIMIT 5; β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€hour─┬───hits─┐ β”‚ 2015-07-01 00:00:00 β”‚ 656676 β”‚ β”‚ 2015-07-01 01:00:00 β”‚ 768837 β”‚ β”‚ 2015-07-01 02:00:00 β”‚ 862311 β”‚ β”‚ 2015-07-01 03:00:00 β”‚ 829261 β”‚ β”‚ 2015-07-01 04:00:00 β”‚ 749365 β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”˜ The toStartOfHour() function used here converts the given time to the start of the hour. You can also group by year, quarter, month, or day. Custom grouping intervals ​ We can even group by arbitrary intervals, e.g., 5 minutes using the toStartOfInterval() function. Let's say we want to group by 4-hour intervals. We can specify the grouping interval using the INTERVAL clause: SELECT toStartOfInterval(time, INTERVAL 4 HOUR) AS interval, sum(hits) AS hits FROM wikistat WHERE date(time) = '2015-07-01' GROUP BY ALL ORDER BY interval ASC LIMIT 6; Or we can use the toIntervalHour() function SELECT toStartOfInterval(time, toIntervalHour(4)) AS interval, sum(hits) AS hits FROM wikistat WHERE date(time) = '2015-07-01' GROUP BY ALL ORDER BY interval ASC LIMIT 6; Either way, we get the following results: β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€interval─┬────hits─┐ β”‚ 2015-07-01 00:00:00 β”‚ 3117085 β”‚ β”‚ 2015-07-01 04:00:00 β”‚ 2928396 β”‚ β”‚ 2015-07-01 08:00:00 β”‚ 2679775 β”‚ β”‚ 2015-07-01 12:00:00 β”‚ 2461324 β”‚ β”‚ 2015-07-01 16:00:00 β”‚ 2823199 β”‚ β”‚ 2015-07-01 20:00:00 β”‚ 2984758 β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ Filling empty groups ​ In a lot of cases we deal with sparse data with some absent intervals. This results in empty buckets. Let's take the following example where we group data by 1-hour intervals. 
This will output the following stats with some hours missing values: SELECT toStartOfHour(time) AS hour, sum(hits) FROM wikistat WHERE (project = 'ast') AND (subproject = 'm') AND (date(time) = '2015-07-01') GROUP BY ALL ORDER BY hour ASC; β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€hour─┬─sum(hits)─┐ β”‚ 2015-07-01 00:00:00 β”‚ 3 β”‚ <- missing values β”‚ 2015-07-01 02:00:00 β”‚ 1 β”‚ <- missing values β”‚ 2015-07-01 04:00:00 β”‚ 1 β”‚ β”‚ 2015-07-01 05:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 06:00:00 β”‚ 1 β”‚ β”‚ 2015-07-01 07:00:00 β”‚ 1 β”‚ β”‚ 2015-07-01 08:00:00 β”‚ 3 β”‚ β”‚ 2015-07-01 09:00:00 β”‚ 2 β”‚ <- missing values β”‚ 2015-07-01 12:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 13:00:00 β”‚ 4 β”‚ β”‚ 2015-07-01 14:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 15:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 16:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 17:00:00 β”‚ 1 β”‚ β”‚ 2015-07-01 18:00:00 β”‚ 5 β”‚ β”‚ 2015-07-01 19:00:00 β”‚ 5 β”‚ β”‚ 2015-07-01 20:00:00 β”‚ 4 β”‚ β”‚ 2015-07-01 21:00:00 β”‚ 4 β”‚ β”‚ 2015-07-01 22:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 23:00:00 β”‚ 2 β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ ClickHouse provides the WITH FILL modifier to address this. 
This will fill out all the empty hours with zeros, so we can better understand the distribution over time: SELECT toStartOfHour(time) AS hour, sum(hits) FROM wikistat WHERE (project = 'ast') AND (subproject = 'm') AND (date(time) = '2015-07-01') GROUP BY ALL ORDER BY hour ASC WITH FILL STEP toIntervalHour(1); β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€hour─┬─sum(hits)─┐ β”‚ 2015-07-01 00:00:00 β”‚ 3 β”‚ β”‚ 2015-07-01 01:00:00 β”‚ 0 β”‚ <- new value β”‚ 2015-07-01 02:00:00 β”‚ 1 β”‚ β”‚ 2015-07-01 03:00:00 β”‚ 0 β”‚ <- new value β”‚ 2015-07-01 04:00:00 β”‚ 1 β”‚ β”‚ 2015-07-01 05:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 06:00:00 β”‚ 1 β”‚ β”‚ 2015-07-01 07:00:00 β”‚ 1 β”‚ β”‚ 2015-07-01 08:00:00 β”‚ 3 β”‚ β”‚ 2015-07-01 09:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 10:00:00 β”‚ 0 β”‚ <- new value β”‚ 2015-07-01 11:00:00 β”‚ 0 β”‚ <- new value β”‚ 2015-07-01 12:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 13:00:00 β”‚ 4 β”‚ β”‚ 2015-07-01 14:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 15:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 16:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 17:00:00 β”‚ 1 β”‚ β”‚ 2015-07-01 18:00:00 β”‚ 5 β”‚ β”‚ 2015-07-01 19:00:00 β”‚ 5 β”‚ β”‚ 2015-07-01 20:00:00 β”‚ 4 β”‚ β”‚ 2015-07-01 21:00:00 β”‚ 4 β”‚ β”‚ 2015-07-01 22:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 23:00:00 β”‚ 2 β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ Rolling time windows ​ Sometimes, we don't want to deal with the start of intervals (like the start of a day or an hour) but window intervals. Let's say we want to understand the total hits for a window, not based on days but on a 24-hour period offset from 6 pm. We can use the date_diff() function to calculate the difference between a reference time and each record's time. 
In this case, the day column will represent the difference in days (e.g., 1 day ago, 2 days ago, etc.): SELECT dateDiff('day', toDateTime('2015-05-01 18:00:00'), time) AS day, sum(hits) FROM wikistat GROUP BY ALL ORDER BY day ASC LIMIT 5; β”Œβ”€day─┬─sum(hits)─┐ β”‚ 0 β”‚ 25524369 β”‚ β”‚ 1 β”‚ 25608105 β”‚ β”‚ 2 β”‚ 28567101 β”‚ β”‚ 3 β”‚ 29229944 β”‚ β”‚ 4 β”‚ 29383573 β”‚ β””β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
Markdown
# Basic time-series operations

ClickHouse provides several methods for working with time series data, allowing you to aggregate, group, and analyze data points across different time periods. This section covers the fundamental operations commonly used when working with time-based data. Common operations include grouping data by time intervals, handling gaps in time series data, and calculating changes between time periods. These operations can be performed using standard SQL syntax combined with ClickHouse's built-in time functions. 
We're going to explore ClickHouse time-series querying capabilities with the Wikistat (Wikipedia pageviews data) dataset: ``` CREATE TABLE wikistat ( `time` DateTime, `project` String, `subproject` String, `path` String, `hits` UInt64 ) ENGINE = MergeTree ORDER BY (time); ``` Let's populate this table with 1 billion records: ``` INSERT INTO wikistat SELECT * FROM s3('https://ClickHouse-public-datasets.s3.amazonaws.com/wikistat/partitioned/wikistat*.native.zst') LIMIT 1e9; ``` ## Aggregating by time bucket[​](https://clickhouse.com/docs/use-cases/time-series/basic-operations#time-series-aggregating-time-bucket "Direct link to Aggregating by time bucket") The most popular requirement is to aggregate data based on periods, e.g. get the total amount of hits for each day: ``` SELECT toDate(time) AS date, sum(hits) AS hits FROM wikistat GROUP BY ALL ORDER BY date ASC LIMIT 5; ``` ``` β”Œβ”€β”€β”€β”€β”€β”€β”€date─┬─────hits─┐ β”‚ 2015-05-01 β”‚ 25524369 β”‚ β”‚ 2015-05-02 β”‚ 25608105 β”‚ β”‚ 2015-05-03 β”‚ 28567101 β”‚ β”‚ 2015-05-04 β”‚ 29229944 β”‚ β”‚ 2015-05-05 β”‚ 29383573 β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ ``` We've used the [`toDate()`](https://clickhouse.com/docs/sql-reference/functions/type-conversion-functions#toDate) function here, which converts the specified time to a date type. 
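For readers following along without a ClickHouse server, the bucket-and-sum shape of the daily-aggregation query can be sketched in plain Python (the toy rows below stand in for wikistat data):

```python
from collections import defaultdict
from datetime import datetime

rows = [  # (time, hits) -- toy stand-ins for wikistat rows
    (datetime(2015, 5, 1, 9, 30), 10),
    (datetime(2015, 5, 1, 23, 5), 5),
    (datetime(2015, 5, 2, 0, 1), 7),
]

daily = defaultdict(int)
for time, hits in rows:
    daily[time.date()] += hits  # the toDate(time) bucket

for day in sorted(daily):
    print(day, daily[day])
```

Swapping `time.date()` for `time.replace(minute=0, second=0)` gives the hourly variant in the next query.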
Alternatively, we can batch by an hour and filter on the specific date: ``` SELECT toStartOfHour(time) AS hour, sum(hits) AS hits FROM wikistat WHERE date(time) = '2015-07-01' GROUP BY ALL ORDER BY hour ASC LIMIT 5; ``` ``` β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€hour─┬───hits─┐ β”‚ 2015-07-01 00:00:00 β”‚ 656676 β”‚ β”‚ 2015-07-01 01:00:00 β”‚ 768837 β”‚ β”‚ 2015-07-01 02:00:00 β”‚ 862311 β”‚ β”‚ 2015-07-01 03:00:00 β”‚ 829261 β”‚ β”‚ 2015-07-01 04:00:00 β”‚ 749365 β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”˜ ``` The [`toStartOfHour()`](https://clickhouse.com/docs/sql-reference/functions/date-time-functions#toStartOfHour) function used here converts the given time to the start of the hour. You can also group by year, quarter, month, or day. ## Custom grouping intervals[​](https://clickhouse.com/docs/use-cases/time-series/basic-operations#time-series-custom-grouping-intervals "Direct link to Custom grouping intervals") We can even group by arbitrary intervals, e.g., 5 minutes using the [`toStartOfInterval()`](https://clickhouse.com/docs/sql-reference/functions/date-time-functions#toStartOfInterval) function. Let's say we want to group by 4-hour intervals. 
We can specify the grouping interval using the [`INTERVAL`](https://clickhouse.com/docs/sql-reference/data-types/special-data-types/interval) clause: ``` SELECT toStartOfInterval(time, INTERVAL 4 HOUR) AS interval, sum(hits) AS hits FROM wikistat WHERE date(time) = '2015-07-01' GROUP BY ALL ORDER BY interval ASC LIMIT 6; ``` Or we can use the [`toIntervalHour()`](https://clickhouse.com/docs/sql-reference/functions/type-conversion-functions#toIntervalHour) function ``` SELECT toStartOfInterval(time, toIntervalHour(4)) AS interval, sum(hits) AS hits FROM wikistat WHERE date(time) = '2015-07-01' GROUP BY ALL ORDER BY interval ASC LIMIT 6; ``` Either way, we get the following results: ``` β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€interval─┬────hits─┐ β”‚ 2015-07-01 00:00:00 β”‚ 3117085 β”‚ β”‚ 2015-07-01 04:00:00 β”‚ 2928396 β”‚ β”‚ 2015-07-01 08:00:00 β”‚ 2679775 β”‚ β”‚ 2015-07-01 12:00:00 β”‚ 2461324 β”‚ β”‚ 2015-07-01 16:00:00 β”‚ 2823199 β”‚ β”‚ 2015-07-01 20:00:00 β”‚ 2984758 β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ ``` ## Filling empty groups[​](https://clickhouse.com/docs/use-cases/time-series/basic-operations#time-series-filling-empty-groups "Direct link to Filling empty groups") In a lot of cases we deal with sparse data with some absent intervals. This results in empty buckets. Let's take the following example where we group data by 1-hour intervals. 
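The 4-hour bucketing performed by `toStartOfInterval` can be sketched in Python. This version floors the hour relative to midnight, which matches the results above for interval lengths that divide 24; ClickHouse's alignment rules for other intervals are more involved:

```python
from datetime import datetime

def to_start_of_hours_interval(ts: datetime, hours: int) -> datetime:
    """Round ts down to the start of its N-hour bucket, midnight-aligned."""
    return ts.replace(hour=ts.hour - ts.hour % hours,
                      minute=0, second=0, microsecond=0)

print(to_start_of_hours_interval(datetime(2015, 7, 1, 5, 30), 4))  # 2015-07-01 04:00:00
```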
This will output the following stats with some hours missing values: ``` SELECT toStartOfHour(time) AS hour, sum(hits) FROM wikistat WHERE (project = 'ast') AND (subproject = 'm') AND (date(time) = '2015-07-01') GROUP BY ALL ORDER BY hour ASC; ``` ``` β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€hour─┬─sum(hits)─┐ β”‚ 2015-07-01 00:00:00 β”‚ 3 β”‚ <- missing values β”‚ 2015-07-01 02:00:00 β”‚ 1 β”‚ <- missing values β”‚ 2015-07-01 04:00:00 β”‚ 1 β”‚ β”‚ 2015-07-01 05:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 06:00:00 β”‚ 1 β”‚ β”‚ 2015-07-01 07:00:00 β”‚ 1 β”‚ β”‚ 2015-07-01 08:00:00 β”‚ 3 β”‚ β”‚ 2015-07-01 09:00:00 β”‚ 2 β”‚ <- missing values β”‚ 2015-07-01 12:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 13:00:00 β”‚ 4 β”‚ β”‚ 2015-07-01 14:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 15:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 16:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 17:00:00 β”‚ 1 β”‚ β”‚ 2015-07-01 18:00:00 β”‚ 5 β”‚ β”‚ 2015-07-01 19:00:00 β”‚ 5 β”‚ β”‚ 2015-07-01 20:00:00 β”‚ 4 β”‚ β”‚ 2015-07-01 21:00:00 β”‚ 4 β”‚ β”‚ 2015-07-01 22:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 23:00:00 β”‚ 2 β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ ``` ClickHouse provides the [`WITH FILL`](https://clickhouse.com/docs/guides/developer/time-series-filling-gaps#with-fill) modifier to address this. 
This will fill out all the empty hours with zeros, so we can better understand the distribution over time: ``` SELECT toStartOfHour(time) AS hour, sum(hits) FROM wikistat WHERE (project = 'ast') AND (subproject = 'm') AND (date(time) = '2015-07-01') GROUP BY ALL ORDER BY hour ASC WITH FILL STEP toIntervalHour(1); ``` ``` β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€hour─┬─sum(hits)─┐ β”‚ 2015-07-01 00:00:00 β”‚ 3 β”‚ β”‚ 2015-07-01 01:00:00 β”‚ 0 β”‚ <- new value β”‚ 2015-07-01 02:00:00 β”‚ 1 β”‚ β”‚ 2015-07-01 03:00:00 β”‚ 0 β”‚ <- new value β”‚ 2015-07-01 04:00:00 β”‚ 1 β”‚ β”‚ 2015-07-01 05:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 06:00:00 β”‚ 1 β”‚ β”‚ 2015-07-01 07:00:00 β”‚ 1 β”‚ β”‚ 2015-07-01 08:00:00 β”‚ 3 β”‚ β”‚ 2015-07-01 09:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 10:00:00 β”‚ 0 β”‚ <- new value β”‚ 2015-07-01 11:00:00 β”‚ 0 β”‚ <- new value β”‚ 2015-07-01 12:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 13:00:00 β”‚ 4 β”‚ β”‚ 2015-07-01 14:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 15:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 16:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 17:00:00 β”‚ 1 β”‚ β”‚ 2015-07-01 18:00:00 β”‚ 5 β”‚ β”‚ 2015-07-01 19:00:00 β”‚ 5 β”‚ β”‚ 2015-07-01 20:00:00 β”‚ 4 β”‚ β”‚ 2015-07-01 21:00:00 β”‚ 4 β”‚ β”‚ 2015-07-01 22:00:00 β”‚ 2 β”‚ β”‚ 2015-07-01 23:00:00 β”‚ 2 β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ ``` ## Rolling time windows[​](https://clickhouse.com/docs/use-cases/time-series/basic-operations#time-series-rolling-time-windows "Direct link to Rolling time windows") Sometimes, we don't want to deal with the start of intervals (like the start of a day or an hour) but window intervals. Let's say we want to understand the total hits for a window, not based on days but on a 24-hour period offset from 6 pm. We can use the [`date_diff()`](https://clickhouse.com/docs/sql-reference/functions/date-time-functions#timeDiff) function to calculate the difference between a reference time and each record's time. 
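The effect of `WITH FILL` can be mimicked by walking the full range of buckets and defaulting absent ones to zero; a small Python sketch of that idea:

```python
from datetime import datetime, timedelta

sparse = {  # hour -> hits, with gaps, as in the un-filled result
    datetime(2015, 7, 1, 0): 3,
    datetime(2015, 7, 1, 2): 1,
    datetime(2015, 7, 1, 4): 1,
}

start, stop, step = datetime(2015, 7, 1, 0), datetime(2015, 7, 1, 4), timedelta(hours=1)
filled = []
t = start
while t <= stop:
    filled.append((t, sparse.get(t, 0)))  # missing buckets become 0
    t += step

print(filled[1])  # the 01:00 gap, now present with a 0
```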
In this case, the `day` column will represent the difference in days (e.g., 1 day ago, 2 days ago, etc.): ``` SELECT dateDiff('day', toDateTime('2015-05-01 18:00:00'), time) AS day, sum(hits) FROM wikistat GROUP BY ALL ORDER BY day ASC LIMIT 5; ``` ``` β”Œβ”€day─┬─sum(hits)─┐ β”‚ 0 β”‚ 25524369 β”‚ β”‚ 1 β”‚ 25608105 β”‚ β”‚ 2 β”‚ 28567101 β”‚ β”‚ 3 β”‚ 29229944 β”‚ β”‚ 4 β”‚ 29383573 β”‚ β””β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ ```
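The rolling 24-hour windows can be sketched the same way: subtract the 6 pm reference and floor-divide by one day. Note that ClickHouse's `dateDiff` has its own unit-boundary semantics, so this conveys the idea rather than an exact reimplementation:

```python
from collections import defaultdict
from datetime import datetime, timedelta

reference = datetime(2015, 5, 1, 18, 0)
rows = [  # (time, hits) -- toy stand-ins for wikistat rows
    (datetime(2015, 5, 1, 19, 0), 10),
    (datetime(2015, 5, 2, 17, 0), 4),
    (datetime(2015, 5, 2, 19, 0), 7),
]

windows = defaultdict(int)
for time, hits in rows:
    day = (time - reference) // timedelta(days=1)  # whole 24h periods since 6 pm
    windows[day] += hits

print(dict(windows))  # {0: 14, 1: 7}
```

Here 17:00 on May 2 still falls in window 0, because fewer than 24 hours have elapsed since the 6 pm reference.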
Readable Markdown
ClickHouse provides several methods for working with time series data, allowing you to aggregate, group, and analyze data points across different time periods. This section covers the fundamental operations commonly used when working with time-based data. Common operations include grouping data by time intervals, handling gaps in time series data, and calculating changes between time periods. These operations can be performed using standard SQL syntax combined with ClickHouse's built-in time functions. We're going to explore ClickHouse time-series querying capabilities with the Wikistat (Wikipedia pageviews data) dataset: ``` CREATE TABLE wikistat ( `time` DateTime, `project` String, `subproject` String, `path` String, `hits` UInt64 ) ENGINE = MergeTree ORDER BY (time); ``` Let's populate this table with 1 billion records: ``` INSERT INTO wikistat SELECT * FROM s3('https://ClickHouse-public-datasets.s3.amazonaws.com/wikistat/partitioned/wikistat*.native.zst') LIMIT 1e9; ``` ## Aggregating by time bucket[​](https://clickhouse.com/docs/use-cases/time-series/basic-operations#time-series-aggregating-time-bucket "Direct link to Aggregating by time bucket") The most popular requirement is to aggregate data based on periods, e.g. get the total amount of hits for each day: ``` SELECT toDate(time) AS date, sum(hits) AS hits FROM wikistat GROUP BY ALL ORDER BY date ASC LIMIT 5; ``` ``` β”Œβ”€β”€β”€β”€β”€β”€β”€date─┬─────hits─┐ β”‚ 2015-05-01 β”‚ 25524369 β”‚ β”‚ 2015-05-02 β”‚ 25608105 β”‚ β”‚ 2015-05-03 β”‚ 28567101 β”‚ β”‚ 2015-05-04 β”‚ 29229944 β”‚ β”‚ 2015-05-05 β”‚ 29383573 β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ ``` We've used the [`toDate()`](https://clickhouse.com/docs/sql-reference/functions/type-conversion-functions#toDate) function here, which converts the specified time to a date type. 
Alternatively, we can batch by an hour and filter on the specific date: ``` SELECT toStartOfHour(time) AS hour, sum(hits) AS hits FROM wikistat WHERE date(time) = '2015-07-01' GROUP BY ALL ORDER BY hour ASC LIMIT 5; ``` ``` β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€hour─┬───hits─┐ β”‚ 2015-07-01 00:00:00 β”‚ 656676 β”‚ β”‚ 2015-07-01 01:00:00 β”‚ 768837 β”‚ β”‚ 2015-07-01 02:00:00 β”‚ 862311 β”‚ β”‚ 2015-07-01 03:00:00 β”‚ 829261 β”‚ β”‚ 2015-07-01 04:00:00 β”‚ 749365 β”‚ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”˜ ``` The [`toStartOfHour()`](https://clickhouse.com/docs/sql-reference/functions/date-time-functions#toStartOfHour) function used here converts the given time to the start of the hour. You can also group by year, quarter, month, or day. ## Custom grouping intervals[​](https://clickhouse.com/docs/use-cases/time-series/basic-operations#time-series-custom-grouping-intervals "Direct link to Custom grouping intervals") We can even group by arbitrary intervals, e.g., 5 minutes using the [`toStartOfInterval()`](https://clickhouse.com/docs/sql-reference/functions/date-time-functions#toStartOfInterval) function. Let's say we want to group by 4-hour intervals. 
We can specify the grouping interval using the [`INTERVAL`](https://clickhouse.com/docs/sql-reference/data-types/special-data-types/interval) clause:

```
SELECT
    toStartOfInterval(time, INTERVAL 4 HOUR) AS interval,
    sum(hits) AS hits
FROM wikistat
WHERE date(time) = '2015-07-01'
GROUP BY ALL
ORDER BY interval ASC
LIMIT 6;
```

Or we can use the [`toIntervalHour()`](https://clickhouse.com/docs/sql-reference/functions/type-conversion-functions#toIntervalHour) function:

```
SELECT
    toStartOfInterval(time, toIntervalHour(4)) AS interval,
    sum(hits) AS hits
FROM wikistat
WHERE date(time) = '2015-07-01'
GROUP BY ALL
ORDER BY interval ASC
LIMIT 6;
```

Either way, we get the following results:

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€interval─┬────hits─┐
β”‚ 2015-07-01 00:00:00 β”‚ 3117085 β”‚
β”‚ 2015-07-01 04:00:00 β”‚ 2928396 β”‚
β”‚ 2015-07-01 08:00:00 β”‚ 2679775 β”‚
β”‚ 2015-07-01 12:00:00 β”‚ 2461324 β”‚
β”‚ 2015-07-01 16:00:00 β”‚ 2823199 β”‚
β”‚ 2015-07-01 20:00:00 β”‚ 2984758 β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
```

## Filling empty groups

In many cases we deal with sparse data that has absent intervals, which produces empty buckets. Let's take the following example, where we group data by 1-hour intervals.
This outputs the following stats, with values missing for some hours:

```
SELECT
    toStartOfHour(time) AS hour,
    sum(hits)
FROM wikistat
WHERE (project = 'ast') AND (subproject = 'm') AND (date(time) = '2015-07-01')
GROUP BY ALL
ORDER BY hour ASC;
```

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€hour─┬─sum(hits)─┐
β”‚ 2015-07-01 00:00:00 β”‚         3 β”‚ <- missing values
β”‚ 2015-07-01 02:00:00 β”‚         1 β”‚ <- missing values
β”‚ 2015-07-01 04:00:00 β”‚         1 β”‚
β”‚ 2015-07-01 05:00:00 β”‚         2 β”‚
β”‚ 2015-07-01 06:00:00 β”‚         1 β”‚
β”‚ 2015-07-01 07:00:00 β”‚         1 β”‚
β”‚ 2015-07-01 08:00:00 β”‚         3 β”‚
β”‚ 2015-07-01 09:00:00 β”‚         2 β”‚ <- missing values
β”‚ 2015-07-01 12:00:00 β”‚         2 β”‚
β”‚ 2015-07-01 13:00:00 β”‚         4 β”‚
β”‚ 2015-07-01 14:00:00 β”‚         2 β”‚
β”‚ 2015-07-01 15:00:00 β”‚         2 β”‚
β”‚ 2015-07-01 16:00:00 β”‚         2 β”‚
β”‚ 2015-07-01 17:00:00 β”‚         1 β”‚
β”‚ 2015-07-01 18:00:00 β”‚         5 β”‚
β”‚ 2015-07-01 19:00:00 β”‚         5 β”‚
β”‚ 2015-07-01 20:00:00 β”‚         4 β”‚
β”‚ 2015-07-01 21:00:00 β”‚         4 β”‚
β”‚ 2015-07-01 22:00:00 β”‚         2 β”‚
β”‚ 2015-07-01 23:00:00 β”‚         2 β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
```

ClickHouse provides the [`WITH FILL`](https://clickhouse.com/docs/guides/developer/time-series-filling-gaps#with-fill) modifier to address this.
This fills all the empty hours with zeros, so we can better understand the distribution over time:

```
SELECT
    toStartOfHour(time) AS hour,
    sum(hits)
FROM wikistat
WHERE (project = 'ast') AND (subproject = 'm') AND (date(time) = '2015-07-01')
GROUP BY ALL
ORDER BY hour ASC WITH FILL STEP toIntervalHour(1);
```

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€hour─┬─sum(hits)─┐
β”‚ 2015-07-01 00:00:00 β”‚         3 β”‚
β”‚ 2015-07-01 01:00:00 β”‚         0 β”‚ <- new value
β”‚ 2015-07-01 02:00:00 β”‚         1 β”‚
β”‚ 2015-07-01 03:00:00 β”‚         0 β”‚ <- new value
β”‚ 2015-07-01 04:00:00 β”‚         1 β”‚
β”‚ 2015-07-01 05:00:00 β”‚         2 β”‚
β”‚ 2015-07-01 06:00:00 β”‚         1 β”‚
β”‚ 2015-07-01 07:00:00 β”‚         1 β”‚
β”‚ 2015-07-01 08:00:00 β”‚         3 β”‚
β”‚ 2015-07-01 09:00:00 β”‚         2 β”‚
β”‚ 2015-07-01 10:00:00 β”‚         0 β”‚ <- new value
β”‚ 2015-07-01 11:00:00 β”‚         0 β”‚ <- new value
β”‚ 2015-07-01 12:00:00 β”‚         2 β”‚
β”‚ 2015-07-01 13:00:00 β”‚         4 β”‚
β”‚ 2015-07-01 14:00:00 β”‚         2 β”‚
β”‚ 2015-07-01 15:00:00 β”‚         2 β”‚
β”‚ 2015-07-01 16:00:00 β”‚         2 β”‚
β”‚ 2015-07-01 17:00:00 β”‚         1 β”‚
β”‚ 2015-07-01 18:00:00 β”‚         5 β”‚
β”‚ 2015-07-01 19:00:00 β”‚         5 β”‚
β”‚ 2015-07-01 20:00:00 β”‚         4 β”‚
β”‚ 2015-07-01 21:00:00 β”‚         4 β”‚
β”‚ 2015-07-01 22:00:00 β”‚         2 β”‚
β”‚ 2015-07-01 23:00:00 β”‚         2 β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
```

## Rolling time windows

Sometimes we don't want to deal with fixed interval boundaries (like the start of a day or an hour) but with rolling windows. Let's say we want to understand the total hits for 24-hour windows offset to start at 6 pm rather than at midnight. We can use the [`dateDiff()`](https://clickhouse.com/docs/sql-reference/functions/date-time-functions#dateDiff) function to calculate the difference between a reference time and each record's time.
In this case, the `day` column represents the number of days between the reference time and each record's time (0 for the first 24-hour window, 1 for the next, and so on):

```
SELECT
    dateDiff('day', toDateTime('2015-05-01 18:00:00'), time) AS day,
    sum(hits)
FROM wikistat
GROUP BY ALL
ORDER BY day ASC
LIMIT 5;
```

```
β”Œβ”€day─┬─sum(hits)─┐
β”‚   0 β”‚  25524369 β”‚
β”‚   1 β”‚  25608105 β”‚
β”‚   2 β”‚  28567101 β”‚
β”‚   3 β”‚  29229944 β”‚
β”‚   4 β”‚  29383573 β”‚
β””β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
```
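The introduction also mentioned calculating changes between time periods. As a hedged sketch (this query is illustrative and not part of the original page), day-over-day deltas can be computed over the daily totals with ClickHouse's `lagInFrame()` window function:

```
-- Illustrative sketch: day-over-day change in hits.
-- lagInFrame() returns a value from a previous row within the window frame;
-- the third argument is the default used when no previous row exists.
SELECT
    date,
    hits,
    hits - lagInFrame(hits, 1, hits)
        OVER (ORDER BY date ASC ROWS BETWEEN 1 PRECEDING AND CURRENT ROW) AS change
FROM
(
    SELECT
        toDate(time) AS date,
        sum(hits) AS hits
    FROM wikistat
    GROUP BY date
)
ORDER BY date ASC
LIMIT 5;
```

With the daily totals shown earlier, the first row's `change` would be 0 (the default kicks in) and the second would be 25608105 - 25524369 = 83736.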