Decoding the SQLs of Snowflake Account Admin Usage
In the Admin area of the Snowsight UI, the Usage page provides detailed information about Snowflake consumption across all relevant services, in either credits or the appropriate currency. If you have ever wondered how the numbers displayed on the Usage page are derived, or you have already given it a try but could not match your derived usage with what the page shows, then you are in the right place to decode the SQLs facilitating the Admin Usage page.
On the Admin Usage page, the usage data can be filtered further by:
=> Date range
=> Account
=> Type of usage:
# All Usage Types — Displays results as currency.
# Compute — Displays usage in credits used.
# Storage — Displays usage in bytes stored.
# Data Transfer — Displays usage in bytes transferred.
The default view is shown below, with a date range of Last 7 days, all Snowflake accounts under the organization, and all Usage Types.
For a comparison of usage, let's look at Snowflake's overall cost contributors: Compute, Storage, and Data Transfer. The usage metrics are understandably derived from SNOWFLAKE.ACCOUNT_USAGE views, and for reference, the results of the SQL queries below are transformed into a visualization dashboard (mirroring Admin Usage) using charts for easy pattern comparison.
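As a quick aside before the three contributors: the currency view behind All Usage Types is not served by the per-service ACCOUNT_USAGE views used below. Here is a minimal sketch of one way it could be approximated, assuming your role can read the SNOWFLAKE.ORGANIZATION_USAGE.USAGE_IN_CURRENCY_DAILY view and its USAGE_TYPE, CURRENCY, and USAGE_IN_CURRENCY columns:
SELECT
    USAGE_DATE,
    USAGE_TYPE,     -- compute, storage, data transfer, etc.
    CURRENCY,
    ROUND (SUM (USAGE_IN_CURRENCY), 2) AS USAGE_IN_CURRENCY
FROM SNOWFLAKE.ORGANIZATION_USAGE.USAGE_IN_CURRENCY_DAILY
WHERE USAGE_DATE >= CURRENT_DATE() - 7
GROUP BY 1, 2, 3
ORDER BY 1 ASC;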
1) Compute
For compute, the usage is further filtered to a single virtual warehouse, 'COMPUTE_WH', and on Jul 26, 2023 the WAREHOUSE_METERING value shown is 0.6.
In the dashboard built from the SQL, for the same date, Jul 26, 2023, and the same warehouse, the credits value is 0.6.
SQL for Compute —
SELECT
    -- Snowsight appears to roll usage recorded after 10 PM into the next day,
    -- so shift END_TIME accordingly before truncating it to a date
    DATE (CASE WHEN EXTRACT (HOUR FROM END_TIME) >= 22 THEN
        DATEADD (DAY, 1, END_TIME) ELSE END_TIME END) AS CREDITS_DATE,
    ROUND (SUM (CREDITS_USED), 1) AS CREDITS
FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY
WHERE WAREHOUSE_NAME = 'COMPUTE_WH'
GROUP BY 1
ORDER BY 1 ASC;
Observation for Compute — In Snowflake Admin Usage, the cut-off time for attributing credits to a particular day is 10 PM. Note that this cut-off time was determined empirically by running multiple test cases and is not specifically documented by Snowflake.
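To see this cut-off for yourself, here is a minimal verification sketch (assuming the same 'COMPUTE_WH' warehouse) that puts the raw calendar date of END_TIME next to the 10 PM-shifted date, so credits recorded after 10 PM visibly move to the following day:
SELECT
    DATE (END_TIME) AS RAW_DATE,
    DATE (CASE WHEN EXTRACT (HOUR FROM END_TIME) >= 22 THEN
        DATEADD (DAY, 1, END_TIME) ELSE END_TIME END) AS SHIFTED_DATE,
    ROUND (SUM (CREDITS_USED), 1) AS CREDITS
FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY
WHERE WAREHOUSE_NAME = 'COMPUTE_WH'
GROUP BY 1, 2
ORDER BY 1 ASC;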
2) Storage
For Storage on Jul 25, 2023 — STAGE is 2.4GB, while DATABASE and FAILSAFE are only a few KB each, resulting in a Total Storage of 2.4GB.
In the dashboard built from the SQL, for the same date, Jul 25, 2023, the Total Storage and STAGE Storage are 2.4GB. The DATABASE and FAILSAFE storage show as 0GB, which are the rounded-off values of the MB-scale data present.
SQL for Storage —
SELECT
    -- shift by one day so the date lines up with the Admin Usage page
    USAGE_DATE + 1 AS "STORAGE DATE",
    -- daily averages, converted from bytes to GB
    ROUND (AVG (STAGE_BYTES) / POWER (1024, 3), 1) AS "STAGE STORAGE IN GB",
    ROUND (AVG (STORAGE_BYTES) / POWER (1024, 3), 1) AS "DATABASE STORAGE IN GB",
    ROUND (AVG (FAILSAFE_BYTES) / POWER (1024, 3), 1) AS "FAILSAFE STORAGE IN GB",
    ROUND (AVG (STORAGE_BYTES + STAGE_BYTES + FAILSAFE_BYTES) /
        POWER (1024, 3), 1) AS "TOTAL STORAGE IN GB"
FROM SNOWFLAKE.ACCOUNT_USAGE.STORAGE_USAGE
WHERE USAGE_DATE >= CURRENT_DATE() - 7
GROUP BY 1
ORDER BY 1 ASC;
Observation for Storage — In Snowflake Admin Usage, there is a two-day lag behind the current date, whereas building your own dashboard gives you information up to the latest available date.
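If the account-level totals are not granular enough, the same pattern can be taken one level down. Here is a minimal per-database sketch, assuming the SNOWFLAKE.ACCOUNT_USAGE.DATABASE_STORAGE_USAGE_HISTORY view with its AVERAGE_DATABASE_BYTES and AVERAGE_FAILSAFE_BYTES columns (stage storage is not broken down per database in this view):
SELECT
    USAGE_DATE + 1 AS "STORAGE DATE",     -- same one-day shift as above
    DATABASE_NAME,
    ROUND (AVG (AVERAGE_DATABASE_BYTES) / POWER (1024, 3), 1) AS "DATABASE STORAGE IN GB",
    ROUND (AVG (AVERAGE_FAILSAFE_BYTES) / POWER (1024, 3), 1) AS "FAILSAFE STORAGE IN GB"
FROM SNOWFLAKE.ACCOUNT_USAGE.DATABASE_STORAGE_USAGE_HISTORY
WHERE USAGE_DATE >= CURRENT_DATE() - 7
GROUP BY 1, 2
ORDER BY 1 ASC, 2;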
3) Data Transfer
For Data Transfer on Jul 28, 2023 there is a total of 10249291776 bytes of data transferred.
In the query result of the SQL, for the same date, Jul 28, 2023, the same number of total bytes transferred shows up.
SQL for Data Transfer —
SELECT
    -- same 10 PM cut-off shift as observed for compute
    DATE (CASE WHEN EXTRACT (HOUR FROM END_TIME) >= 22 THEN
        DATEADD (DAY, 1, END_TIME) ELSE END_TIME END) AS TRANSFER_DATE,
    SOURCE_CLOUD AS PROVIDER,
    TARGET_REGION,
    SUM (BYTES_TRANSFERRED) AS TOTAL_BYTES_TRANSFERRED,
    -- break the total down by transfer type
    SUM (CASE WHEN TRANSFER_TYPE = 'COPY' THEN BYTES_TRANSFERRED END)
        AS COPY_BYTES_TRANSFERRED,
    SUM (CASE WHEN TRANSFER_TYPE = 'REPLICATION' THEN BYTES_TRANSFERRED END)
        AS REPLICATION_BYTES_TRANSFERRED,
    SUM (CASE WHEN TRANSFER_TYPE = 'EXTERNAL_FUNCTION' THEN BYTES_TRANSFERRED END)
        AS EXTERNAL_FUNCTION_BYTES_TRANSFERRED
FROM SNOWFLAKE.ACCOUNT_USAGE.DATA_TRANSFER_HISTORY
GROUP BY ALL
ORDER BY 1 ASC;
Observation for Data Transfer — In Snowflake Admin Usage, all the bytes of data transferred are shown, and not all of them result in data transfer cost. Ex: If data is unloaded from the aws ap-southeast-1 region to the aws ap-southeast-1 region, there is no data transfer cost associated, since the transfer stays within the same cloud service provider region.
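Building on that observation, here is a minimal sketch to separate same-region traffic from potentially billable traffic. The assumption is that transfers whose source and target cloud and region match are the free ones, using the SOURCE_REGION and TARGET_CLOUD columns of DATA_TRANSFER_HISTORY:
SELECT
    DATE (CASE WHEN EXTRACT (HOUR FROM END_TIME) >= 22 THEN
        DATEADD (DAY, 1, END_TIME) ELSE END_TIME END) AS TRANSFER_DATE,
    -- bytes moved within one cloud provider region: assumed free
    SUM (CASE WHEN SOURCE_CLOUD = TARGET_CLOUD AND SOURCE_REGION = TARGET_REGION
        THEN BYTES_TRANSFERRED ELSE 0 END) AS SAME_REGION_BYTES,
    -- bytes crossing a region or cloud boundary: potentially billable
    SUM (CASE WHEN SOURCE_CLOUD <> TARGET_CLOUD OR SOURCE_REGION <> TARGET_REGION
        THEN BYTES_TRANSFERRED ELSE 0 END) AS CROSS_REGION_BYTES
FROM SNOWFLAKE.ACCOUNT_USAGE.DATA_TRANSFER_HISTORY
GROUP BY 1
ORDER BY 1 ASC;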
You can further customize the SQLs derived from the SNOWFLAKE.ACCOUNT_USAGE views mentioned above to meet your use cases and requirements, and build your own dashboards by referring to https://medium.com/@snowflakewiki/building-dashboard-for-snowflake-account-monitoring-349b93c7ea39
A sample overall dashboard can look as below:
References:
https://docs.snowflake.com/en/user-guide/ui-snowsight-dashboards.html
https://docs.snowflake.com/en/user-guide/ui-snowsight-admin
Follow and clap if you like the content, and feel free to ask any questions in the comments. I will be more than happy to assist and guide you.