@sparklingrobots, I’ve followed the instructions at the #automate-downloading-logs link, which is essentially the same kind of script as the GitHub project I linked to earlier. Same results. (Although today, we’re up to 7 lines.)
So far today, I’ve spent more than an hour trying to get GoAccess running. < sigh > It still doesn’t work. I’m putting effort into it because most lines in the log don’t make sense to me. Here’s an example:
188.8.131.52 - - [23/Nov/2019:15:31:18 +0000] "\x04\x01\x00\x19\xBC}I\x1D\x00" 400 166 "-" "-" 0.102 "-"
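For what it’s worth, as far as I can tell the quoted “request” field in that line isn’t HTTP at all. It looks like a raw binary probe (the \x04\x01 prefix resembles a SOCKS handshake) that nginx answered with a 400. A rough pre-filter I’ve been sketching, assuming the logs stay in this nginx combined-style format (the function names here are my own, not from any tool), would keep only real HTTP request lines before handing the file to GoAccess:

```python
import re

# First quoted field of an nginx combined-style log line is the request.
REQUEST_RE = re.compile(r'"([^"]*)"')
# A real HTTP request looks like: METHOD /path HTTP/x.y
HTTP_REQ = re.compile(r'^[A-Z]+ \S+ HTTP/\d\.\d$')

def is_http_request(line: str) -> bool:
    """Return True if the log line's request field is a real HTTP request,
    False for raw binary probes like the \\x04\\x01... example above."""
    m = REQUEST_RE.search(line)
    if not m:
        return False
    return bool(HTTP_REQ.match(m.group(1)))

# The probe line from the log (nginx escapes binary bytes as literal \xNN text):
probe = '188.8.131.52 - - [23/Nov/2019:15:31:18 +0000] "\\x04\\x01\\x00\\x19\\xBC}I\\x1D\\x00" 400 166 "-" "-" 0.102 "-"'
# A normal request line for comparison (hypothetical):
real = '203.0.113.7 - - [23/Nov/2019:15:32:00 +0000] "GET /index.html HTTP/1.1" 200 512 "-" "Mozilla/5.0" 0.010 "-"'
```

Running the log through a filter like this at least separates the junk lines from the traffic I actually want to analyze.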
(Aside: I’ve lost track of how much time I’ve spent trying to follow the instructions provided to download all the logs and then understand the data in them, but I’d estimate 8-12 hours for three sites. That’s non-billable time. Frustrating doesn’t even begin to describe it.)
You ask: “Re: your site only showing 5 pages served in 60 days… do you have folks logging in regularly to add/edit content?”
The answer is no. Mine is the only user account, so the site serves almost entirely anonymous traffic. If that’s the case, and the CDN cache is serving virtually all pages, then how am I supposed to identify and block the bots (or whatever they might be) that I’d like to keep out?
Thanks so much for hearing us out. We do appreciate it!