How to use D1 in a Workers Script

I am using a simple script to fetch and display JSON data using KV right now.

If I want to use D1, what is the code to fetch data from the tables, and to create and delete it?

I am looking for something similar to what I have in KV.

E.g. in KV:

const EMAILS = KV_EMAILS;                         // KV namespace binding
const kv_password = await EMAILS.get(user_email); // read a value
await EMAILS.put(user_email, new_password);       // write a value

Something similar to that. As I am not much of a developer, any suggestion will be helpful.

I would check out the developer documentation, which shows how to use the API and also lists community projects.

There is no documentation on binding D1 to Workers.

We do 🙂

The Get Started guide is all about making a Worker and running D1 in it:

If you have any questions let me know!

I am not using Wrangler as I don’t know TypeScript. I want to use JS to work directly in the Workers script editor.

Can you provide information on how to bind it? In Settings there are bindings for R2 and KV, but no option to bind D1.

You cannot use D1 through the quick editor today. You will need to use Wrangler (note that you can continue to use JS there; TS isn’t a requirement).

Oh, that’s good to know. Also, how can I increase the limit from 100 MB to something higher?

I know it’s in alpha, but is there a way to get higher limits?

Not during the alpha, no.

FWIW, Wrangler doesn’t force you to use TypeScript; it just also works with TypeScript.

Continue to use normal JS if you wish, with Wrangler.

Hey, your wrangler.toml file would have this in it:

[[d1_databases]]
binding = "db"
database_name = "YOUR_DB_NAME..."
database_id = "YOUR_DB_ID..."

Then something like this would do the trick:

export default {
	async fetch(request, env, ctx) {
		try {
			// env.db matches the `binding = "db"` name in wrangler.toml
			const { results } = await env.db.prepare('SELECT id, name, email, country FROM users LIMIT 10').all()
			return Response.json(results)
		} catch (e) {
			console.log({
				message: e.message,
				cause: e.cause?.message,
			})
			// `results` is not in scope here, so return an error response instead
			return Response.json({ error: e.message }, { status: 500 })
		}
	},
}
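
And to mirror the KV get/put calls from your first post: reads and writes both go through prepared statements with bind(). The users table with email and password columns below is just a made-up example, so swap in your own schema:

// Hypothetical schema: users(email TEXT PRIMARY KEY, password TEXT)
const password = await env.db
	.prepare('SELECT password FROM users WHERE email = ?')
	.bind(user_email)
	.first('password')    // like EMAILS.get(user_email); null if no row matches

await env.db
	.prepare('INSERT INTO users (email, password) VALUES (?, ?) ON CONFLICT (email) DO UPDATE SET password = excluded.password')
	.bind(user_email, new_password)
	.run()                // like EMAILS.put(user_email, new_password)

await env.db
	.prepare('DELETE FROM users WHERE email = ?')
	.bind(user_email)
	.run()                // like EMAILS.delete(user_email)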

OMG, just add a binding GUI in the Dashboard already, like for KV etc. I HATE using Wrangler. I am much more efficient using my own web-based worker editor with immediate save/deploy functionality. In case you don’t know: Wrangler sucks!

Hey,

This cannot be added right now; it requires Wrangler, since we need to bundle a shim in.
Why do you hate Wrangler? What does it not provide that the editor does?
I will pass the feedback along to the team.

Maybe it’s just me, but I don’t like it when magical things happen that I don’t fully understand or have control over. Some other reasons:

  • All the config files etc. are overkill.
  • While I get that some people prefer a local IDE, with my setup I can edit my worker on any machine, any time I want; plus, after hitting CTRL+S my worker is saved and ready to be tested. No need to switch to Wrangler, publish, and wait for it to do its magic.
  • I don’t like installing additional things, which somehow always fail or cause incompatibility issues (old Wrangler version → need to update → oh, your Node is too old, need to update… oh, your npm is too old, and so on…).

I tried to publish a worker with a D1 binding using Wrangler just now. I can see the shim in the code in the “Quick edit” editor. Another surprise: when I use my own editor (which downloads the worker code using the API), there are now some additional lines at the beginning and end of the script (they look like a hash or something) which are not visible in the quick editor… like… what is going on?

If there is some extra code necessary for D1 to work, why not wrap it in a dedicated API and make it simply available in the worker? KV is excellent in that regard. D1 could work just the same.

Is there a plan to do it this way, or do I have to create my own D1 API and use it via a service binding?
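
To be concrete, what I have in mind is a tiny proxy Worker, deployed once with Wrangler, that any editor-managed Worker could call over a service binding. The names here (DB, D1_PROXY, the /query path) are placeholders I made up, and passing raw SQL through like this would obviously need to be locked down in practice:

// d1-proxy Worker, deployed with Wrangler, with its [[d1_databases]] binding named "DB"
export default {
	async fetch(request, env) {
		if (request.method !== 'POST') {
			return new Response('POST only', { status: 405 })
		}
		const { sql, params = [] } = await request.json()
		const { results } = await env.DB.prepare(sql).bind(...params).all()
		return Response.json(results)
	},
}

// ...and in the Worker I edit in the browser, via a service binding named D1_PROXY:
// const res = await env.D1_PROXY.fetch('https://d1-proxy/query', {
//	method: 'POST',
//	body: JSON.stringify({ sql: 'SELECT * FROM users WHERE email = ?', params: [user_email] }),
// })
// const rows = await res.json()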

As I’m using vanilla JavaScript as well, have you found a way to use env.<BINDING_NAME> inside the worker?
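
For reference, this is the shape I mean, assuming the module syntax and a D1 binding literally named BINDING_NAME under [[d1_databases]] in wrangler.toml:

export default {
	async fetch(request, env) {
		// env.BINDING_NAME only exists because of the [[d1_databases]] entry in wrangler.toml
		const { results } = await env.BINDING_NAME.prepare('SELECT 1 AS ok').all()
		return Response.json(results)
	},
}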

I totally agree. I know you guys want to use Node, npm, GitHub, this API, that endpoint, this shell or that shell, but nothing beats a plain old change-and-save. I love keeping it simple.

Click-ops is fine when you’re playing with and getting to know things, or for small/personal projects, but there are many reasons why Infrastructure as Code (IaC) and automated code deployment are the predominant approach when working with cloud service providers like Cloudflare, AWS, Azure, GCP, OCI, etc.

When you’re running production systems of any scale or importance, it’s pretty much essential for your deployments to be fully automated, and to have in place the controls, security, repeatability, and isolation of environments that CI/CD pipelines bring.

It’s also important that your code and infrastructure changes are kept in sync, with everything in source control and, when working as part of a team, peer/code reviewed; none of which is an option if any of the work is being done manually via a web-based dashboard.

As a developer, certainly at any company which is ISO 27001 or 27701 accredited, it’s highly unlikely you’d be allowed direct access to Production environments (and probably nothing beyond the QA environment, if you even have access to that) so you’d not be able to work on these in the Cloudflare Dashboard anyway.

This is why for most businesses, Wrangler, or tools like it, are pretty much essential when it comes to creating and deploying solutions to cloud services like the Cloudflare Workers platform; and even for those small/personal projects, I find the benefits far outweigh the little bit of additional effort that is required up-front when starting a new project.
