Need help processing logpush logs in Azure

Hello all, I'm hoping someone out there can help me. This is more of a question about Azure Log Analytics (further referred to as ALA) / Azure Monitor Workspaces (further referred to as AMW), but I need help ingesting my CF Logpush logs. I know very little about Log Analytics and Azure Monitor beyond some basic things I've set up to monitor our Azure-native services (uptime checks and Public IP creation alarms); I've never worked with Log Analytics custom tables or Monitor Workspaces.

I've got my Logpush job set up, and files are flowing into the container I want in my blob storage account, thanks to the SAS URL.

I've filtered the Logpush job down to the fields I want from Cloudflare and validated that the JSON entries in the downloaded log.gz files are what I expect (funny thing: they aren't actually gzipped, each file is just newline-delimited JSON, one record per line... I also don't know much about JSON, LOL).
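For anyone hitting the same confusion: one quick way to tell whether a downloaded file is really gzipped is to check its first two bytes (gzip's magic number is 0x1f 0x8b); if it isn't, you can just read it line by line as newline-delimited JSON. A minimal stdlib-only sketch (the sample records below are made up, not real Logpush output):

```python
import gzip
import json

def read_logpush_records(raw: bytes):
    """Yield one JSON record per line, transparently handling gzip."""
    if raw[:2] == b"\x1f\x8b":  # gzip magic number
        raw = gzip.decompress(raw)
    for line in raw.splitlines():
        if line.strip():
            yield json.loads(line)

# Made-up sample: two newline-delimited Logpush-style records
sample = (
    b'{"EdgeStartTimestamp":"2024-01-01T00:00:00Z","ClientIP":"203.0.113.1"}\n'
    b'{"EdgeStartTimestamp":"2024-01-01T00:00:01Z","ClientIP":"203.0.113.2"}\n'
)

records = list(read_logpush_records(sample))
print(len(records), records[0]["ClientIP"])  # 2 203.0.113.1
```

The same function works unchanged if a file ever does come down gzipped, since it sniffs the magic bytes before parsing.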

I have a "Data Collection Endpoint" (DCE) in AMW and I'm using it to create a "Data Collection Rule" (DCR) while setting up a custom table in ALA. When I drag my sample JSON log file onto the import window, I get this:

TimeGenerated field is not found in the sample provided. Please use the transformation editor to populate TimeGenerated column in the destination table.

I found this

source
| extend TimeGenerated = todatetime(EdgeStartTimestamp)

and that worked to create a TimeGenerated column, but when I proceed I get a table and now I don't know how to get the logs automatically ingested into it. Google isn't helping me very much, and AI keeps giving me tidbits that seem helpful but are mostly related to Azure-native app monitoring. I feel like there must be some kind of Azure service (they only have a million, LOL) that I can point at the blob container and say: pick files up from here, push them into that table, and delete the file from the container.
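From what I've been able to piece together so far (and I may be wrong), the usual pattern seems to be an Azure Function with a Blob trigger that reads each new file and POSTs batches of records to the DCE via the Logs Ingestion API; the `azure-monitor-ingestion` Python package appears to have a `LogsIngestionClient.upload(rule_id=..., stream_name=..., logs=...)` for that part. The reshaping side seems simple enough. Here's my rough stdlib-only understanding of what one record would look like going in (field names are just from my own Logpush sample), in case it helps anyone point me in the right direction:

```python
import json

def to_dcr_record(record: dict) -> dict:
    """Copy EdgeStartTimestamp into TimeGenerated, mirroring the DCR's KQL
    transform (extend TimeGenerated = todatetime(EdgeStartTimestamp)).

    If the transform runs server-side in the DCR, uploading the raw record
    should be enough; this just shows the shape the table ends up with.
    """
    out = dict(record)
    out["TimeGenerated"] = record["EdgeStartTimestamp"]
    return out

# Hypothetical Logpush record
raw = '{"EdgeStartTimestamp":"2024-01-01T00:00:00Z","ClientRequestHost":"example.com"}'
shaped = to_dcr_record(json.loads(raw))
print(shaped["TimeGenerated"])  # 2024-01-01T00:00:00Z
```

My guess is the Function would collect a list of these records per blob, call `upload()` against the DCR's immutable ID and custom stream name, and then delete the blob on success, but I'd love confirmation from someone who has actually wired this up.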

Thanks in advance if anyone has any suggestions
