err:user:unprocessable_entity:query_memory_limit_exceeded
Example
{
  "meta": {
    "requestId": "req_4dgzrNP3Je5mU1tD"
  },
  "error": {
    "detail": "Query exceeded the maximum memory limit of 2GB",
    "status": 422,
    "title": "Unprocessable Entity",
    "type": "https://unkey.com/docs/errors/user/unprocessable_entity/query_memory_limit_exceeded"
  }
}

What Happened?

Your query tried to use more than 2GB of memory! We limit memory usage to keep the analytics service stable and fast for everyone. This typically happens when you’re selecting too many rows, using large GROUP BY operations, or performing complex JOINs without enough filtering.
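If you query the analytics API programmatically, you can detect this specific error from the response body and back off by narrowing the query instead of retrying it as-is. A minimal sketch in Python; the helper names are illustrative (not part of any Unkey SDK), and only the error shape (`status`, `type`) comes from the example above:

```python
# Detect the memory-limit error from a parsed error response body and
# compute a narrower time window to retry with. Helper names are
# hypothetical; the error "status" and "type" fields match the docs.

MEMORY_LIMIT_TYPE = (
    "https://unkey.com/docs/errors/user/unprocessable_entity/"
    "query_memory_limit_exceeded"
)

def is_memory_limit_error(body: dict) -> bool:
    """True if the parsed response body is the 2GB memory-limit error."""
    error = body.get("error", {})
    return error.get("status") == 422 and error.get("type") == MEMORY_LIMIT_TYPE

def narrowed_interval(days: int) -> int:
    """Halve the queried time window (minimum one day) before retrying."""
    return max(1, days // 2)
```

Checking `error.type` rather than only the 422 status code avoids confusing this error with other unprocessable-entity responses.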

How to Fix It

1. Use Aggregations Instead of Raw Data

Instead of fetching all rows, aggregate the data:
SELECT toStartOfDay(time) AS day, COUNT(*) AS total
FROM key_verifications_v1
WHERE time >= now() - INTERVAL 7 DAY
GROUP BY day

2. Add More Filters

Reduce the amount of data the query needs to process, for example by narrowing the time window and scoping to a single API:
SELECT api_id, key_id, outcome, time
FROM key_verifications_v1
WHERE time >= now() - INTERVAL 7 DAY
  AND api_id = 'api_your_api_id'

3. Limit Result Size

Add a LIMIT to cap the number of rows:
SELECT api_id, key_id, outcome, time
FROM key_verifications_v1
WHERE time >= now() - INTERVAL 7 DAY
ORDER BY time DESC
LIMIT 10000
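If you construct these queries in code, you can enforce a row cap on every query before it is sent. A minimal sketch, assuming you build the SQL client-side; the `bounded_query` helper and the 10,000-row cap (taken from the example above) are illustrative, not part of the Unkey API:

```python
def bounded_query(days: int = 7, max_rows: int = 10_000) -> str:
    """Build the bounded verification query from the example above,
    clamping the row cap so no query is ever sent without a LIMIT."""
    max_rows = min(max_rows, 10_000)  # illustrative cap from the example
    return (
        "SELECT api_id, key_id, outcome, time\n"
        "FROM key_verifications_v1\n"
        f"WHERE time >= now() - INTERVAL {int(days)} DAY\n"
        "ORDER BY time DESC\n"
        f"LIMIT {int(max_rows)}"
    )
```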

4. Avoid Large GROUP BY Cardinality

GROUP BY on high-cardinality columns (like key_id) uses a lot of memory. Instead, group by lower-cardinality columns such as outcome:
SELECT outcome, COUNT(*) AS total
FROM key_verifications_v1
WHERE time >= now() - INTERVAL 30 DAY
GROUP BY outcome

Need More Memory?

Have a legitimate need for higher memory limits? Reach out to support and tell us:
  • What you’re trying to analyze
  • Why the query needs more than 2GB of memory
  • An example of the query you’re running
We’ll review your use case and see if we can accommodate your needs.