opendns-fetchstats
I saw a post in the old forums about opendns-fetchstats. I looked up the URL, but having fairly limited experience with programming (basic VB scripts in MS Access), I don't quite know what to do with the code on the page. Can anyone help me? There are no explanations on the site. I'd like to download complete CSV files from my log, covering a week at a time.
-
The URL generated by the script is actually this: https://dashboard.opendns.com/stats/xxxxxxxxx/topdomains/2018-10-29to2018-10-30/page1.csv
Notice the "page1" at the end; the script then increments the page number for each GET. The URL works fine in a browser, but fails in the script. It fails because the GetUrlData function (below) returns a zero-byte result, which causes the script to write the generic "You can not access xxxxxx" message. If you comment out objHTTP.Option(6) = False, you get a different result (web page data), which throws the <!DOCTYPE error instead.
Set objHTTP = CreateObject("WinHttp.WinHttpRequest.5.1")
Function GetUrlData(strUrl, strMethod, strData)
    On Error Resume Next   ' so a failed Send is reported via the Err check below instead of aborting the script
    objHTTP.Open strMethod, strUrl
    If strMethod = "POST" Then
        objHTTP.setRequestHeader "Content-type", "application/x-www-form-urlencoded"
    End If
    objHTTP.Option(6) = False   ' WinHttpRequestOption_EnableRedirects: do not follow redirects automatically
    objHTTP.Send(strData)
    If Err.Number <> 0 Then
        GetUrlData = "ERROR: " & Err.Description & vbCrLf & Err.Source & " (" & Err.Number & ")"
    Else
        GetUrlData = objHTTP.ResponseText
    End If
End Function
-
Hi @rotblitz
I'm trying to use your tool in a bash script on a CentOS 7 server, to no avail. I get the error:
"You can not access xxxxxxxxx"
I opened an issue on @rcrowley's GitHub repository but still no answer:
https://github.com/rcrowley/opendns-fetchstats/issues/6
I'm a Vulnerability Analyst, so I can manage myself around IT pretty easily; could you at least point me in the right direction?
I will update my GitHub fork to make these changes and fixes available for everyone, and update the wiki with the %40 and .*login*. fixes.
My purpose is to use this to get logs for Splunk. I'm developing an app with built-in security dashboards for OpenDNS Home, and since there is no API key available for Home, this is my only solution.
Please, @rotblitz, could you help me figure this out?
Thanks in advance.
-
Why do you think I could help here? I'm not the author of the fetchstats tool. You must have misunderstood something...
Whatever, it looks like you must replace the @ in the e-mail address with %40, or maybe %%40.
-
Just managed to get this script running (thanks to all the contributions to this thread). The one piece of information I'm missing in the output (to "file.csv" in my case) is the date for each URL visit. I'm assuming it must be available in the data because the web dashboard filters by date. Is there a way to get the script to pull the date through into the output?
Thanks
-
You either name the output file as you want when you enter the parameters manually anyway, or you use my command script:
https://support.opendns.com/hc/en-us/community/posts/220022787/comments/224556047
And no, OpenDNS is not ever able to provide you with “URL visits”. They only know about your DNS lookups. This is a huge difference.
-
Thank you. It seems as though OpenDNS has some sort of server-side restriction on report requests. If I run the script more than once (to extract data for different days), the Command Prompt just hangs. When I visit the OpenDNS site there's an error message on the stats page telling me to request fewer than 20 reports in 2 minutes (I assume it is pro-rating the reports and estimating that the run rate is being exceeded). The reason I'm using FetchStats was partly to do with this error - it occurs if I page through the stats on the website too quickly.
-
A couple of further observations. Initial use has revealed that the script does not always return all results. I've run it twice for the same date and it returned different totals. I tried it over 3-4 dates and found more than one instance of this. I propose to continue testing to see whether it's related to the timeout on report running.
I also noticed that it extracts page HTML into the CSV file. I'm assuming it's more of a web scrape than access to the actual stats data?
-
I think the script is broken... I'm not an expert and quite prepared to accept I'm doing something wrong; however, inspecting the CSV output shows records 1201 to 1401 are missing, and in their place is what looks like a page's worth of HTML. I think there are 200 records per page, so it looks like a whole page failed. Other CSV files look just fine, and whilst I can filter out the junk, I can't recreate the missing records.
-
"Is it possible to export the CSV to a specific file path? Currently I'm using >file.csv which puts the CSV in the same directory as the script."
You simply specify the output path, for example: > C:\documents\OpenDNSstats\file.csv
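So the full command (using the placeholder syntax quoted elsewhere in this thread; the dates are just examples) would look something like:
cscript //NoLogo fetchstats.vbs <username> <network-id> 2018-10-05 2018-10-06 > C:\documents\OpenDNSstats\file.csv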
No idea about your other questions. I would have to use the script to investigate this, but I'm not motivated to do that.
-
Would it be possible for the script to be edited to add wait times in between pages?
After fetching 20-ish pages, OpenDNS stops processing requests until 2 minutes have elapsed. If the wait were set at 10 seconds, a report with 80 pages might take 13-ish minutes to complete, but it would download all the data available.
-
Something is up when I am running this:
cscript //NoLogo fetchstats.vbs <username> all 2014-03-10 2014-03-13
When I get the password prompt, I know I am typing the same username (email address) and password as I use to log on to OpenDNS via the browser, but I get:
Login failed. Check Username and Password
Any help would be appreciated
-
In that case I *think* it may be something to do with a mismatch in password rules between what works in the script and what works when keyed directly into the OpenDNS site. I wish I could recall more specific detail, but I do remember having to mess around and change my password so it was something that would work with both the site and the script. I think it has something to do with special characters - the site insists on one but is pedantic about which. Try using an asterisk in your password, along with upper and lower case letters and a number.
-
Thanks, I will try, but FYI I am using a 10-character password with an upper case letter, a two-digit number and the $ sign, so I'm confused as to what the issue may be. It seems like quite a vanilla combination. I would also like to confirm that I should be using the same email address and password as I use to access my OpenDNS account online?
-
It has been reported far back in this thread that "all" as the network ID parameter for the script does not work; it has to be the numeric network ID.
The $ character in the password should be fine. Other special characters reserved for HTTP encoding should not be used, or should be encoded. For example, the @ character in the e-mail address may need to be replaced by %40 (or %%40 in a Windows command script).
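For example, typed directly at the command prompt (the account name, network ID and dates here are made up):
cscript //NoLogo fetchstats.vbs myname%40example.com 1234567 2018-10-05 2018-10-06
Inside a .cmd/.bat file the percent sign itself has to be doubled, hence the %%40 form:
cscript //NoLogo fetchstats.vbs myname%%40example.com 1234567 2018-10-05 2018-10-06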
-
I have found that for the bash script, having a ! in your password causes the script to fail.
When I run the script normally, I get "Login failed. Check your username and/or password."
I manually populated all of the variables (or ran the appropriate curl commands to get them), then ran:
curl --insecure --cookie "$COOKIEJAR" --cookie-jar "$COOKIEJAR" \
--data "formtoken=$FORMTOKEN&username='myemail.goes@here'\
&password='MyPasswordHasA!'&sign_in_submit=foo" "$LOGINURL"
and got:
bash: !': event not found
Looking at the page curl returns, it indeed says "Login failed. Check your username and/or password.".
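That "event not found" message is bash history expansion firing on the ! inside the double-quoted --data string, not OpenDNS rejecting the login. Two possible workarounds, sketched with the same variable names as the snippet above (the address and password are placeholders, and the stray single quotes are dropped because inside double quotes they would be sent literally as part of the form data):
# option 1: switch history expansion off in the interactive shell before running curl
set +H

# option 2: keep the password in a single-quoted variable so the ! is never seen unquoted
PASSWORD='MyPasswordHasA!'
curl --insecure --cookie "$COOKIEJAR" --cookie-jar "$COOKIEJAR" \
  --data "formtoken=$FORMTOKEN&username=myemail.goes%40here&password=${PASSWORD}&sign_in_submit=foo" \
  "$LOGINURL"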
-
Is there a real, workable solution to get data from OpenDNS into a CSV file?
Running cscript in cmd asks for the password but ends with "login failed". My password has a ! mark and I changed it to %21. My username has an @ and I changed it to %40, and I tested with doubled %% too. Not working.
I tested the URL with my changes, of course.
https://dashboard.opendns.com/stats/1234567/topdomains/2018-10-05to2018-10-06.csv
That creates a CSV file, but only with the first 200 lines.
I would like to download a CSV file with all my DNS lines from OpenDNS between the given dates.
-
You need to add a time delay (2.5 minutes) to the script (see the Wscript.Sleep line below). It takes a long time for the script to run, but it does, at least, run*. It took me a while to get OpenDNS to realise their server throttled data volume before they conceded it did, and that it was a deliberate attempt to block server abuse. I have asked them to reconsider because it's completely contrary to the purpose of the interface (i.e. you cannot even move between pages on the website without it eventually failing!):
page = 1
Do While True
    Wscript.Sleep(150000)   ' time delay (2.5 minutes) between page requests
    data = GetUrlData(CSVURL & "/stats/" & Network & "/topdomains/" & DateRange & "/page" & page & ".csv", "GET", "")
    If page = 1 Then
        If LenB(data) = 0 Then
            WScript.StdErr.Write "You can not access " & Network & vbCrLf
            WScript.Quit 2

* I do still occasionally experience instances where the site fails to export all stats. I think the interface / server is quite fragile... Running the script again, sometimes later in the day, seems to solve the problem.
-
Pardon me for being stupid, but where do I add that script? My purpose was to create a simple .bat file which could be run manually when needed. It should create a file from "today's" DNS entries: just double-click, and after a short while the report is ready to open in Excel.
Are those script lines you wrote something that needs to be added after this line:
cscript //NoLogo fetchstats.vbs <username> <network-id> <YYYY-MM-DD> [<YYYY-MM-DD>]
Sincerely "completely lost" japjap