I love Workers, but we need a proper official debugging solution

I started working on an API with Workers and Fauna and it’s just a pain. Using try/catch gives you the final error, but in more complex scenarios you need the console to figure out what’s going on. For example, when making multiple queries to the database, you want to see each response at every step.
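For illustration, here is a tiny helper (hypothetical, mine, not from any library or from this post) that runs each query in sequence and logs every intermediate response, which is exactly the visibility console.log() would give:

```javascript
// Hypothetical helper: run named async steps in sequence and log each
// intermediate result, so a failure can be pinpointed to a single step.
async function runSteps(steps) {
  const results = []
  for (const [name, step] of steps) {
    try {
      const value = await step()
      console.log(`step "${name}" ->`, value)
      results.push(value)
    } catch (err) {
      console.error(`step "${name}" failed:`, err)
      throw err
    }
  }
  return results
}

// Usage sketch: in a real handler each entry would be a Fauna query.
runSteps([
  ['getUser', async () => ({ id: 1 })],
  ['getPosts', async () => [{ title: 'hello' }]],
])
```

Without visible console output, each intermediate `console.log` above is lost, which is the problem described here.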

Personally, I’ve found that not being able to run the worker locally isn’t much of an issue, since publishing is quite fast. What’s really slowing me down is not having console.log().

Is Cloudflare working on something to alleviate this problem?

I don’t think the current flow is going to change; they recently added wrangler dev to be able to debug locally.


Thanks @thomas4!

Jesus I completely missed wrangler dev.

How come it’s not included in the debugging tips?


Oh, BTW, I got an ECONNREFUSED error in Node and had to use http://[::1]:8787 instead.

I think they just haven’t gotten around to updating the docs yet; Wrangler 1.8 was only just released.

Keep in mind that dev mode is still very buggy and it will crash; just restart it if it does.


This is only slightly off-topic, but I would love to see what you’re doing with Workers and Fauna. I don’t know if it’s something you’re allowed to share, but if you are, I’m interested!

Hey @sklabnik !

Just a regular REST API but using Fauna instead of PG or Mongo, nothing special really.

I’m also storing sessions in KV for fast access instead of using JWTs.
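For context, KV-backed sessions can be sketched roughly like this (the function names are mine and the `kv` argument stands in for a Workers KV binding; this is an illustration, not pier’s actual code):

```javascript
// Sketch of KV-backed sessions. `kv` is any object implementing the
// Workers KV get/put interface; in a Worker it would be a KV binding.
async function createSession(kv, token, user, ttlSeconds = 86400) {
  // Workers KV supports expirationTtl, so sessions expire on their own.
  await kv.put(token, JSON.stringify(user), { expirationTtl: ttlSeconds })
}

async function getSession(kv, token) {
  const raw = await kv.get(token)
  return raw ? JSON.parse(raw) : null
}
```

A session lookup is then a single KV read per request, which is the fast-access property mentioned above.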

Just getting started but once I have a couple of endpoints I could share some code privately. Give me a couple of days and I’ll DM you!


@pier check out Webpack DevServer :slight_smile: https://webpack.js.org/configuration/dev-server You can start a local Node.js server with your code, with hot module reloading as well. In my project, for example, I just do:

if (module.hot) {
  module.hot.dispose(() => server.stop())
}

My package.json commands are as follows:

"scripts": {
  "build:dev": "webpack --config webpack.dev.js",
  "start": "node dist/dev"
}

It sounds like you jumped right in and assumed the worst, but you should treat developing with Cloudflare Workers the same as you would with any other platform (AWS, Heroku, etc.). You should invest some time into thinking about how you can work on your project locally (a development environment).

And like @thomas4 said… there’s wrangler dev now that supports KV/secrets/env :slight_smile:

Cloudflare Workers is not a Node environment, and not even a browser environment; it’s service workers with custom APIs. Therefore, you can’t assume that things that work in local dev will work on actual Workers, because they won’t. I’ve run into at least 20 edge cases, most related to encryption, minification, and promises.


@dmitry that works?

Does webpack dev server trigger fetch events?

What about Request and Response? AFAIK those are not available in Node.

@pier Check out signalnerve’s example here: https://github.com/signalnerve/workers-graphql-server I think this will help you understand a bit better! https://github.com/signalnerve/workers-graphql-server/blob/master/webpack.config.js (This is the production webpack config; however, you can set up a development webpack config as well.)


Here’s what my dev webpack env file looks like! You’ll notice that I’m not using the apollo-server-cloudflare package https://github.com/apollographql/apollo-server/tree/master/packages/apollo-server-cloudflare, but rather the regular apollo-server that works with Node.js (for my dev environment)!

And just because Node.js does not work on Workers… does not mean you can’t run a Node.js process locally to run your code :stuck_out_tongue: (Obviously you need to make sure you aren’t including Node libraries in your code, because they won’t work on Workers :))


Thanks for the links and example!

BTW, are you sending the Fauna secret to your clients? Are you storing it client-side in localStorage?

Wouldn’t you need to send the secret to the client… in order to store it in localStorage? :stuck_out_tongue:

@pier this is my login mutation

It returns the secret to the user, along with their user data. Also, I don’t know how much experience you have with Docker, but they do have a public image that you can run locally: https://hub.docker.com/r/fauna/faunadb

Here’s a list of commands that I came up with to help me:

Pull the latest FaunaDB Docker image:
docker pull fauna/faunadb:latest

Stop all Docker containers (PowerShell):
docker stop $(docker ps -aq)

Start the FaunaDB Docker container in the background (persists to drive):
docker run -d --rm --name faunadb -p 8443:8443 -p 8084:8084 -v C:\Users\Dmitry\Documents\faunadb\lib:/storage/data -v C:\Users\Dmitry\Documents\faunadb\logs:/storage/log -v C:\Users\Dmitry\Documents\faunadb\faunadb.yml:/etc/faunadb.yml fauna/faunadb:latest --config /etc/faunadb.yml

Start a command shell in the FaunaDB container:
docker exec -it faunadb /bin/bash

Check the FaunaDB API endpoint status:
curl http://localhost:8443/ping

Create a new FaunaDB database:
fauna create-database test

List all FaunaDB databases:
fauna list-databases

Create a new FaunaDB key:
fauna create-key test [role=(admin|server|server-readonly|client)] --domain=localhost --port=8443 --scheme=http --secret=secret

Delete a FaunaDB database:
fauna delete-database test

List FaunaDB keys:
fauna list-keys

Delete a FaunaDB key:
fauna delete-key 200219702370238976

Add the localhost endpoint:
fauna add-endpoint http://localhost:8443

List all endpoints:
fauna list-endpoints

Add the FaunaDB cloud endpoint:
fauna cloud-login

Set the default FaunaDB endpoint:
fauna default-endpoint localhost

Delete an endpoint:
fauna delete-endpoint localhost

Start a FaunaDB shell into a database:
fauna shell test

Override FaunaDB connection parameters on any command:
fauna create-database test --domain=localhost --port=8443 --scheme=http --secret=secret

Evaluate a FaunaDB query:
fauna eval "Paginate(Collections())" --secret=fnADmpCfTGACAFraNl1-bxwkvcc-EF4KC1Wv4DH1

Run FaunaDB queries from a file:
fauna eval --file=./queries/init.fql --secret=fnADmpCfTGACAFraNl1-bxwkvcc-EF4KC1Wv4DH1

Import a GraphQL schema:
curl -u fnADmo2CFTACADJ-G3fOJVix6Q8lxaipP1J267zo: http://localhost:8084/import --data-binary "@schema.gql"

Yes, that’s what I imagined you were doing. It seems very risky, to be honest, but each use case is different.

they do have a public image that you can run locally

Thanks. So far I’m happy with the cloud DB but I guess once I get deeper into this thing I will end up using a local instance.

Huh? Very risky? I’m not returning the admin secret :stuck_out_tongue: This secret is what gives permissions to the user and determines which collections they can access, etc. It’s the secret that is returned when a user calls the Login FQL function :slight_smile:

I’d recommend you check out this article! https://docs.fauna.com/fauna/current/tutorials/authentication/user It will definitely make what I’m saying make more sense!
