Page 2 of 11
Results 11 to 20 of 102
  1. #11
    Important User

    osiris4isis
    Join Date
    Apr 2012
    Location
    Farside of the Sun
    Posts
    2,645
    Rep Power
    637


    Quote Originally Posted by keesdewit View Post
    wget also does this over HTTP.
    As in:
    wget --recursive --level=inf --spider --server-response --no-directories --output-file="file.txt"
    You can then sum the file sizes listed under "Content-Length" in the logfile like this:

    #echo "Estimated size: $(grep -e "Content-Length" "$LOG" | awk '{sum+=$2} END {printf("%.0f", sum / 1024 / 1024)}') MB"

    The only problem I have is that only the first or the last file from the FOR loop makes it into the log file. I need them all there.
    Aren't you getting each file one by one? If so, why are you using the "--recursive" option?
    As I remember, the HTTP protocol allows requesting the header only. If so, you can first get the "Content-Length" without downloading the entire file. With the length for each file, you can then verify the file AFTER it's downloaded to make sure it's OK.
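That header-first-then-verify idea can be sketched roughly like this. The headers below are canned and the file name (and.2dc) is just an example, since the real URLs sit behind the forum link; in practice the header text would come from `wget --spider --server-response "$url" 2>&1` or `curl -sI "$url"`.

```shell
#!/bin/bash
# Sketch: read Content-Length from response headers, "download",
# then compare the expected size against the size on disk.
expected_size() {   # expected_size HEADER_TEXT -> bytes
    printf '%s\n' "$1" | awk 'tolower($1) == "content-length:" {print $2; exit}'
}

# Canned headers standing in for a real --spider / HEAD response.
headers='HTTP/1.1 200 OK
Content-Length: 11
Content-Type: application/octet-stream'

want=$(expected_size "$headers")
printf 'hello world' > /tmp/and.2dc    # stand-in for the real download
got=$(wc -c < /tmp/and.2dc)

if [ "$want" -eq "$got" ]; then
    echo "and.2dc OK ($got bytes)"
else
    echo "and.2dc truncated: expected $want, got $got" >&2
fi
```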



  3. #12
    Master
    Join Date
    Mar 2015
    Location
    127.0.0.1
    Posts
    159
    Rep Power
    52


    Yes, Osiris, that is exactly what happens here: by using the wget "--spider" and "--server-response" options we are only retrieving the headers, which are written to $log.
    So nothing is being downloaded at this stage.

    Thanks for your remark on the "--recursive" option. Using "--recursive" or not brings the same results, since wget loops through a number of files that are all in the same directory, $URL/$FOLDER/, so I removed the "--recursive" option.

    There still remains the issue of not everything being written to the logfile; just the first or the last file from the EXT for loop (EXT="2dc cam hmp lma ne0 ne1 ne2 nhs pak pnm poi").
    With grep and awk one should be able to calculate the sum of the Content-Length sizes collected in the logfile for each downloaded file, e.g. and.2dc, and.cam, and.hmp, and.lma, etc.
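For what it's worth, wget's `--output-file` (`-o`) truncates the log on every invocation, so a per-extension loop keeps only the last run's headers; `--append-output` (`-a`) keeps them all. A minimal simulation of the difference, with shell redirection standing in for the wget calls:

```shell
#!/bin/bash
# ">" truncates on each write, like --output-file (-o);
# ">>" appends, like --append-output (-a).
LOG=/tmp/append-demo.log
rm -f "$LOG"

for e in 2dc cam hmp; do
    # stand-in for: wget --spider --server-response -a "$LOG" "$URL/$FOLDER/and.$e"
    echo "  Content-Length: 100  (and.$e)" >> "$LOG"
done

wc -l < "$LOG"    # 3: one header line per extension survives
```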

  4. #13
    salsa
    Member +
    Join Date
    Mar 2015
    Location
    Italy
    Posts
    44
    Rep Power
    2


    Quote Originally Posted by keesdewit View Post
    Salsa, try this script: [Only registered and activated users can see links. ]
    It works out of the box.
    You forgot items 12 to 47; I added them and now it's working!

    A request: can you expand the script to let us download the whole base file?

    Thank you

    Edited the script; now it is running... and it works perfectly!

    Thank you again
    Last edited by salsa; 28th March 2015 at 16:05.

  5. #14
    Master
    Join Date
    Mar 2015
    Location
    127.0.0.1
    Posts
    159
    Rep Power
    52


    Hi Salsa, my bad: it was a limited test version. With this download you're good to go.
    [Only registered and activated users can see links. ]

    And please replace #!/opt/bin/bash with #!/bin/bash
    Last edited by keesdewit; 28th March 2015 at 15:41.

  6. #15
    Important User

    osiris4isis
    Join Date
    Apr 2012
    Location
    Farside of the Sun
    Posts
    2,645
    Rep Power
    637


    Quote Originally Posted by keesdewit View Post
    Yes, Osiris, that is exactly what happens here: by using the wget "--spider" and "--server-response" options we are only retrieving the headers, which are written to $log.
    So nothing is being downloaded at this stage.

    Thanks for your remark on the "--recursive" option. Using "--recursive" or not brings the same results, since wget loops through a number of files that are all in the same directory, $URL/$FOLDER/, so I removed the "--recursive" option.

    There still remains the issue of not everything being written to the logfile; just the first or the last file from the EXT for loop (EXT="2dc cam hmp lma ne0 ne1 ne2 nhs pak pnm poi").
    With grep and awk one should be able to calculate the sum of the Content-Length sizes collected in the logfile for each downloaded file, e.g. and.2dc, and.cam, and.hmp, and.lma, etc.
    Oh, OK. It's odd that you're only getting the first or last log output. Try this (for temporary testing): generate a logfile for each file processed, and call it <map filename>.log. Then check whether there is anything in them. If so, you have a buffer-flush or data-overwrite problem.
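One way to run that test, with a stub standing in for the real `wget --spider --server-response` call so the sketch works offline (the URL and the `and.*` names are placeholders, not from the actual script):

```shell
#!/bin/bash
# One log per map file, as suggested above. fetch_headers is a stub
# for the real wget call so this runs without a network connection.
fetch_headers() {   # fetch_headers URL LOGFILE
    echo "  Content-Length: 100  ($1)" > "$2"
}

DIR=/tmp/perfile-logs
rm -rf "$DIR" && mkdir -p "$DIR"

for e in 2dc cam hmp; do
    fetch_headers "http://example.com/maps/and.$e" "$DIR/and.$e.log"
done

# If every per-file log has content while the single shared log ends up
# with only one entry, the shared log is being overwritten, not lost.
ls "$DIR"/and.*.log
```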

  7. #16
    Master
    Join Date
    Mar 2015
    Location
    127.0.0.1
    Posts
    159
    Rep Power
    52


    Issue above solved. A new script version will be posted later this week (around Thursday).

  8. #17
    Master
    Join Date
    Mar 2015
    Location
    127.0.0.1
    Posts
    159
    Rep Power
    52


    Here is the Linux Mapdownloader for Europe Version 0.4

    [Only registered and activated users can see links. ]

    Changes:
    - Requests the download size of the maps
    - wget retries 5 times under bad connection conditions
    - Improved logic


    Next steps: adding more continents

  9. #18
    Master
    Join Date
    Mar 2015
    Location
    127.0.0.1
    Posts
    159
    Rep Power
    52


    Sometimes I make last-minute changes to the script in a Windows editor.
    Inevitably you'll then see the typical Windows CR LF line endings at the end of some lines, displayed as ^M.
    That is ugly, and it sometimes even breaks the script.

    You can remove them by opening the script in vi, pressing Esc and typing
    :%s/^V^M//g

    The ^V^M characters should be entered as Ctrl-V, Ctrl-M.
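If you'd rather fix the line endings non-interactively, a sed one-liner does the same job as the vi substitution (GNU sed shown; `tr -d '\r'` is a portable alternative):

```shell
#!/bin/bash
# Strip Windows CR (\r) line endings from a script, non-interactively.
# Create a file with CRLF endings to demonstrate on:
printf 'echo one\r\necho two\r\n' > /tmp/crlf-demo.sh

# GNU sed: delete the trailing CR on every line, in place.
sed -i 's/\r$//' /tmp/crlf-demo.sh

# Portable alternative: tr -d '\r' < in.sh > out.sh
grep -c "$(printf '\r')" /tmp/crlf-demo.sh || echo "no CR characters left"
```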

  10. #19
    Master
    Join Date
    Mar 2015
    Location
    127.0.0.1
    Posts
    159
    Rep Power
    52


    For all you Mac OS X lovers out there:
    the Linux mapdownloader is quite easy to use on a Mac OS X system.
    Just execute the script from a terminal and make sure the path to the required executables, bash and wget, is set properly.

    [Only registered and activated users can see links. ]
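Before running it, a quick check that both programs are actually on the PATH can save some head-scratching (the Homebrew hint is just a suggestion, not part of the script):

```shell
#!/bin/bash
# Verify the two prerequisites the mapdownloader needs: bash and wget.
for prog in bash wget; do
    if command -v "$prog" >/dev/null 2>&1; then
        echo "$prog found at $(command -v "$prog")"
    else
        echo "$prog not found - install it first (e.g. 'brew install $prog' on a Mac)" >&2
    fi
done
```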

  11. #20
    Master
    Join Date
    Mar 2015
    Location
    127.0.0.1
    Posts
    159
    Rep Power
    52


    First attempt to build a World Map for the Linux Sygic MapDownloader.
    Only the continents Africa and Europe are implemented so far.


    [Only registered and activated users can see links. ]

    Again, please change "#!/opt/bin/bash" to "#!/bin/bash"
    Last edited by keesdewit; 8th April 2015 at 12:59.

 

 