28k records, 117 queries => >90M records read?!?

Hi there

I’m on the free plan, using Workers with a D1 database. It currently has fewer than 30k records. Today the metrics reported >90M rows read per day (>600M over the last 7 days)! But only 117 read queries… how is that even possible?

At the same time I have about 10k write requests. Could those account for the reads, too? A typical insert looks like this:

INSERT INTO table (id, created, lastseen, data)
VALUES (?, DATETIME(), DATETIME(), json(?))
ON CONFLICT(id) DO UPDATE SET lastseen = DATETIME(), data = json(?);

Pretty simple. “id” is the primary key, so it shouldn’t have to scan the full table to find the record, right?
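For reference, D1 is SQLite under the hood, so EXPLAIN QUERY PLAN can confirm whether a primary-key lookup uses the index. A minimal sketch, assuming the schema implied by the insert above (the table and column names are just placeholders):

-- Hypothetical schema matching the insert above.
CREATE TABLE mytable (
  id       TEXT PRIMARY KEY,
  created  TEXT,
  lastseen TEXT,
  data     TEXT
);

-- Ask SQLite how it would locate one record by its primary key.
EXPLAIN QUERY PLAN SELECT * FROM mytable WHERE id = ?;
-- Expected output along the lines of:
--   SEARCH mytable USING INDEX sqlite_autoindex_mytable_1 (id=?)
-- i.e. an index search, not a full-table SCAN.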

The documentation says the free plan is limited to 5M rows read per day, yet all my bills have been zero so far. I’ve been using this setup for about two months now.

You can see your top queries by rows read with this command:

wrangler d1 insights dbname --sortBy reads --timePeriod 7d

That should show you what is causing all the reads.


Thanks a lot! That’s indeed helpful: I guess it’s my use of json_* functions in some of those queries that causes the high numbers. One of them blows up the 29k records to 1.39M rows processed… There’s some optimisation to do there.
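To illustrate for anyone else who hits this (table, column, and JSON path below are made up): json_each() emits one virtual row per element of a JSON array, so a query like the one below reads roughly records × average array length rows, not one row per record. 29k records with ~48 elements each already comes out to about 1.39M rows.

-- Hypothetical example of a query that inflates rows read:
-- json_each() produces one row per element of the JSON array,
-- so the engine processes ~29k × ~48 ≈ 1.39M rows, not 29k.
SELECT t.id, j.value
FROM mytable AS t, json_each(t.data, '$.items') AS j
WHERE j.value = ?;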

