How to stream diagnostic log files stored in Azure Blob Storage in real time
I tried reading Azure diagnostic logs stored in storage accounts via Grafana Loki and Promtail, and it works for old logs that are already archived. But as soon as I try to scrape diagnostic logs as they are being written, only a couple of entries are read and the rest are ignored.
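For context, the Promtail scrape config I am using looks roughly like this (the mount path and labels are illustrative; this assumes the blob container is mounted into the local filesystem, e.g. via BlobFuse, since Promtail tails files from a path):

```yaml
scrape_configs:
  - job_name: azure-diagnostics
    static_configs:
      - targets: [localhost]
        labels:
          job: azure-diagnostics
          # illustrative mount point for the blob container
          __path__: /mnt/blobstorage/insights-logs-*/**/*.json
```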
After searching, I understand this may be because object stores are immutable, so new blobs (JSON files) are written with every change. This means the file I am scraping is being replaced during the process, rather than appended to, as would be the case on a normal file system.
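One direction I have considered is polling the container listing for newly modified blobs instead of tailing a single file. A rough sketch of the selection logic is below; the blob names and checkpoint handling are hypothetical, and a real implementation would get the listing (with `last_modified` timestamps) from the Azure SDK, e.g. `azure.storage.blob.ContainerClient.list_blobs()`:

```python
from datetime import datetime, timezone

def select_new_blobs(blobs, since):
    """Return (name, last_modified) pairs modified strictly after the
    checkpoint, oldest first, so they can be read and forwarded in order."""
    fresh = [b for b in blobs if b[1] > since]
    return sorted(fresh, key=lambda b: b[1])

# Illustrative listing; in practice these come from list_blobs().
listing = [
    ("logs/h10/PT1H.json", datetime(2024, 1, 1, 10, 0, tzinfo=timezone.utc)),
    ("logs/h11/PT1H.json", datetime(2024, 1, 1, 11, 0, tzinfo=timezone.utc)),
]
checkpoint = datetime(2024, 1, 1, 10, 30, tzinfo=timezone.utc)
new = select_new_blobs(listing, checkpoint)
```

This sidesteps the immutability issue because each poll sees whatever blob version currently exists, but it still re-reads a whole blob when it is replaced, which is part of what I am asking about.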
Is there a reasonable way to scrape log files while they are being written to Azure storage accounts? I am aware that streaming log entries from Event Hubs works better for this scenario, but I am specifically interested in the diagnostic logs written to storage accounts.