Google Storage, BigQuery, logpush


I currently have Logpush loading data into a Google Cloud Storage bucket. Within this bucket, Logpush creates a subfolder for each day: 20211102/, 20211103/, 20211104/, and so on. Within each daily folder are .gz log files, added by the minute.

I am able to view the files and have verified the log data is there. Now I need to move the data into BigQuery.
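For context, the daily folders follow a YYYYMMDD naming pattern, so the prefixes to scan are easy to compute. A small sketch (the helper names are my own, purely for illustration):

```python
from datetime import date, timedelta

def day_prefix(d: date) -> str:
    """Folder name Logpush appears to use for a given day, e.g. '20211102/'."""
    return d.strftime("%Y%m%d") + "/"

def recent_prefixes(today: date, days: int = 3):
    """Prefixes for the last few daily folders, newest first."""
    return [day_prefix(today - timedelta(days=i)) for i in range(days)]

print(recent_prefixes(date(2021, 11, 4)))
# prints ['20211104/', '20211103/', '20211102/']
```

These prefixes can be passed to a GCS listing call (e.g. `list_blobs(bucket, prefix=...)`) to enumerate just one day's .gz objects.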

For that, I’ve set up a Cloud Function per: - it is set to run every minute.

I have verified that the function is deployed in Cloud Functions and runs without errors. However, I am not seeing any data in the destination dataset/table.
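For what it's worth, this is the kind of load call I'd expect such a function to make. This is a hedged sketch only: the function and parameter names are my own, not necessarily what the referenced script uses, and it assumes the Logpush files are gzipped newline-delimited JSON (which BigQuery decompresses automatically for JSON loads):

```python
def gcs_uri(bucket: str, blob_name: str) -> str:
    """Fully qualified GCS URI for one log object."""
    return f"gs://{bucket}/{blob_name}"

def load_logs_to_bq(bucket: str, blob_name: str, dataset: str, table: str):
    """Load one gzipped NDJSON Logpush file from GCS into a BigQuery table."""
    # Assumes google-cloud-bigquery is installed and credentials are configured.
    from google.cloud import bigquery

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,  # infer the schema from the JSON fields
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    table_id = f"{client.project}.{dataset}.{table}"
    job = client.load_table_from_uri(
        gcs_uri(bucket, blob_name), table_id, job_config=job_config
    )
    job.result()  # block until the load job completes, raising on failure
```

Calling `job.result()` is worth noting: if the function fires the load job without waiting on it, a failed load (bad schema, wrong format) can pass silently even though the function itself logs no errors.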

Looking at / I’ve set DIRECTORY="/". I am sure the bucket_name, dataset, and table values are correct. The bucket, BigQuery, and Compute Engine are all part of the same project, which I own, so I don’t think this is a permissions issue.
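One guess at where this could go wrong: I don't know exactly how the referenced script interprets DIRECTORY, but if it is used as a literal GCS object-name prefix, a value of "/" would match nothing, because GCS object names are flat strings with no leading slash (a file in the 20211102/ folder is named "20211102/file.log.gz"). A hypothetical illustration:

```python
def matches_directory(blob_name: str, directory: str) -> bool:
    """Hypothetical prefix check: does this object fall under DIRECTORY?

    Strips any leading slashes first, since GCS object names never
    start with '/'.
    """
    prefix = directory.lstrip("/")  # DIRECTORY="/" becomes the empty prefix ""
    return blob_name.startswith(prefix)

# A literal "/" prefix matches no GCS object name:
print("20211102/file.log.gz".startswith("/"))            # False
# With the leading slash stripped, everything in the bucket matches:
print(matches_directory("20211102/file.log.gz", "/"))    # True
```

If the script does the literal comparison, setting DIRECTORY to a specific daily folder (e.g. "20211102/") or handling the "/" case explicitly might be needed.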

Does anyone have an idea why I can’t get the data to load into BQ?

Thank you in advance,